Urgent need: Data Engineer with AWS Data Lakes & AWS S3, Malvern, PA (ONSITE) at Malvern, Pennsylvania, USA
Email: [email protected] |
From: Surya Bhargav, KK Software Associates [email protected] | Reply to: [email protected]

Hi,

This is Surya from KK Software Associates. Note: 10+ years of experience and a passport (PP) number are a must. While sharing profiles, please provide all the documents (PP number, H1B & DL copies) in one email for any consultant matching the below JD, and I will reach out to you directly. I have an urgent requirement for an AWS Data Engineer. Please go through the below requirement and forward me your updated resume with contact details ASAP.

Role Description:
We are looking for strong AWS Data Engineers who are passionate about cloud technology. Your work will be to:
- Design and Develop Data Pipelines: Create robust pipelines to ingest, process, and transform data, ensuring it is ready for analytics and reporting.
- Implement ETL/ELT Processes: Develop Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) workflows to seamlessly move data from source systems to Data Warehouses, Data Lakes, and Lakehouses using open-source and AWS tools (see the illustrative sketch at the end of this description).
- Work with DBT (Data Build Tool) to perform transformations on Snowflake data.
- Extract, transform, and load data into Snowflake.
- Create complex queries to pull data from multiple source tables, and write testing scripts to verify data quality and accuracy in the target tables.
- Adopt DevOps Practices: Utilize DevOps methodologies and tools for continuous integration and deployment (CI/CD), infrastructure as code (IaC), and automation to streamline and enhance our data engineering processes.
- Design Data Solutions: Leverage your analytical skills to design innovative data solutions that address complex business requirements and drive decision-making.

Competencies:
Digital: Amazon Web Services (AWS) Cloud Computing, Data Build Tool

Experience (Years): 6-8

Essential Skills:
- AWS concepts (S3, Lambda, Glue ETL, CloudWatch)
- GCP concepts (BigQuery, Vertex AI, Cloud Storage, Composer, Pub/Sub)
- Python, shell scripting, Spark, Jupyter Notebook
- Programming Skills: Strong experience with modern programming languages such as Python, Java, and Scala

Desirable Skills:
- Expertise in Data Storage Technologies: In-depth knowledge of Data Warehouse, database technologies, and Big Data ecosystem technologies such as AWS Redshift, AWS RDS, and Hadoop.
- Experience with AWS Data Lakes: Proven experience working with AWS data lakes on AWS S3 to store and process both structured and unstructured data sets.
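Illustrative sketch: a minimal example of the kind of S3 data-lake ETL step the role description mentions, assuming PySpark is available and S3 access is configured. All bucket names, paths, and column names here are hypothetical placeholders, not details of the client's actual pipeline.

# Minimal sketch of an S3-based ETL step with PySpark.
# Buckets, paths, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-data-lake-etl-sketch").getOrCreate()

# Extract: read raw CSV files landed in an S3 "raw" zone.
raw_df = spark.read.option("header", True).csv("s3a://example-raw-bucket/orders/")

# Transform: basic typing, cleansing, and a partition column so the data is analytics-ready.
clean_df = (
    raw_df
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .dropDuplicates(["order_id"])
)

# Load: write partitioned Parquet to the curated zone of the data lake.
(
    clean_df
    .write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://example-curated-bucket/orders/")
)

In practice a step like this would typically run as an AWS Glue job or on EMR, with CI/CD and infrastructure-as-code around it, as the JD calls out.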
Mon Oct 14 22:13:00 UTC 2024 |