Cloud Data Engineer - Jersey City, New Jersey, USA
Email: [email protected]
From: Sanjeev Kumar Singh, Tek Inspirations LLC ([email protected])
Reply to: [email protected]

Hello,

I'm Sanjeev from TEK Inspirations. We have a requirement for you; the details are as follows. Please review the job specifications below and let me know if you are interested. Please confirm that you received this email, and also send me your DL, visa, last 4 digits of your SSN, education details, and your LinkedIn profile.

Job Description -
EC: Verisk
TITLE: Cloud Data Engineer
VISA: USC, H4-EAD, GC
MOI: Skype
LOCATION: Jersey City, NJ (local candidates prioritized)
LINKEDIN NEEDED

Requirements:
- 8+ years of total IT experience working as a Data Architect, with 7+ years in data warehouse, ETL, and BI projects.
- At least 4 end-to-end implementations of a cloud data warehouse (Redshift, Postgres, or Snowflake).
- Design and build scalable data pipelines using Spark, preferably PySpark, in Glue/Databricks/EMR.
- Design and develop ETL processes in AWS Glue to migrate data from external sources such as S3 (ORC/Parquet/text files) into AWS Redshift.
- Hands-on experience with PostgreSQL, Redshift, Redshift Spectrum, AWS Glue, Athena, Snowflake utilities, SnowSQL, Snowpipe, and AWS Lambda, using Python.
- Experience in data migration from an RDBMS to a Redshift/Snowflake cloud data warehouse.
- Expertise in data modeling, implementing complex stored procedures, and standard DWH and ETL/ELT concepts.
- Query and automate data with AWS Athena using Python.
- Expertise in advanced concepts such as setting up resource monitors, role-based access control (RBAC), virtual warehouse sizing, and query performance tuning, and understanding how to use these features.
- Expertise in deploying features such as data sharing, events, and lakehouse patterns.
- Deep understanding of relational and NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modeling).
- Create ETL/ELT pipelines that transform and process terabytes of structured and unstructured data in real time.
- Experience with data security, data access controls, and their design.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Resolve a wide range of complicated data-pipeline problems, both proactively and as issues surface.
- Able to troubleshoot problems across infrastructure, platform, and application domains.
- Experience with Agile development methodologies.
- Strong written communication skills; effective and persuasive in both written and oral communication.

Regards,
Sanjeev Kumar Singh
Technical Recruiter | IT Healthcare & Informatics
Email: [email protected]
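As a small illustration of the dimensional-modeling and ETL/ELT concepts this posting asks about (this is a sketch, not Verisk's actual pipeline: the table and column names are hypothetical, and an in-memory SQLite database stands in for a warehouse like Redshift or Snowflake), here is a minimal star-schema load and aggregation:

```python
import sqlite3

# In-memory SQLite stands in for a cloud warehouse (hypothetical schema).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Star schema: one dimension table, one fact table referencing it.
cur.execute("CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT)")
cur.execute("""CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES dim_customer(customer_id),
    amount REAL)""")

# "Extract/load" step: raw records as they might arrive from S3 files.
cur.executemany("INSERT INTO dim_customer VALUES (?, ?)", [(1, "NJ"), (2, "NY")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 1, 120.0), (2, 1, 80.0), (3, 2, 40.0)])

# "Transform/query" step: aggregate facts by a dimension attribute.
cur.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer d USING (customer_id)
    GROUP BY d.region ORDER BY d.region""")
print(cur.fetchall())  # [('NJ', 200.0), ('NY', 40.0)]
```

In a real Glue/PySpark pipeline, the fact and dimension tables would typically be built as DataFrames from S3 sources and written to Redshift, but the join-and-aggregate shape of the query is the same.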
[email protected] View all |
Thu Aug 17 02:56:00 UTC 2023