Ab Initio Developer AWS - Remote, USA
Email: [email protected]
From: Abdul, VDart.INC [email protected]
Reply to: [email protected]

Hi Team,

This is Abdul from VDart. We are a global IT services and workforce solutions firm based in Alpharetta, Georgia. I have the job description below and would like to discuss it further if it interests you.

Ab Initio Developer AWS - Contract

Required qualifications, capabilities, and skills:
- Formal training or certification in software engineering concepts and 3+ years of applied experience.
- Hands-on Ab Initio/ETL (Informatica) development experience.
- Willingness to gain expertise in the critical business processes of the Deposits ecosystem.
- Hands-on practical experience in system design, application development, testing, and operational stability.
- In-depth knowledge of Ab Initio/Informatica ETL programming (GDE/EME), Unix shell scripting, and Control-M/Autosys batch schedulers.
- In-depth knowledge of developing application and infrastructure architectures.
- Experience developing, debugging, and maintaining code in a large corporate environment and in RDBMS/querying languages.
- Solid understanding of agile methodologies and practices such as CI/CD, application resiliency, and security.
- Demonstrated knowledge of software applications and technical processes within a technical discipline (ETL processing, scheduling, operations).
- Proficiency in scripting with Python for data processing tasks and ETL workflows.
- Experience writing Splunk or CloudWatch queries and DataDog metrics.

Preferred qualifications, capabilities, and skills:
- Familiarity with a Java framework, modern front-end technologies, and exposure to cloud technologies.
- Practical cloud-native experience in AWS (EC2, S3, Glue, AWS Lambda, Athena, RDS, SNS); proficiency with Python, PySpark, and machine learning disciplines.
- Strong experience with distributed computing frameworks such as Apache Spark (specifically PySpark) and event-driven architecture using Kafka.
- Experience with distributed databases such as AWS DynamoDB.
- Working knowledge of AWS Glue services, including experience designing, building, and maintaining ETL jobs for diverse data sources.
- Familiarity with AWS Glue DynamicFrames to streamline ETL processes.
- Ability to troubleshoot common issues in PySpark and AWS Glue jobs, and to identify and address performance bottlenecks.

Keywords: continuous integration, continuous deployment, rlang, information technology
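The Python ETL scripting the posting asks for follows the same extract-transform-load shape whether it runs in a Glue job or a plain batch script. As a rough illustration only (hypothetical field names and a made-up validation rule, standard library only, not taken from the posting), a minimal sketch of that pattern:

```python
import csv
import io


def run_etl(source_csv: str) -> list[dict]:
    """Minimal extract-transform-load pass over CSV text.

    Extract: parse rows from the source.
    Transform: normalize types and drop records that fail validation.
    Load: here we simply return the cleaned rows; a real job would
    write them to a target table or an S3 path instead.
    """
    rows = csv.DictReader(io.StringIO(source_csv))  # extract
    cleaned = []
    for row in rows:
        try:
            amount = float(row["amount"])  # type normalization
        except (KeyError, ValueError):
            continue  # reject malformed records
        if amount <= 0:
            continue  # hypothetical business rule: positive amounts only
        cleaned.append({"account": row["account"].strip(), "amount": amount})
    return cleaned  # "load" stand-in


if __name__ == "__main__":
    sample = "account,amount\nA-1,100.5\nA-2,bad\nA-3,-4\n"
    print(run_etl(sample))
```

In PySpark or Glue the same steps would typically become DataFrame/DynamicFrame reads, column casts, and filter expressions rather than a Python loop, which is where the performance-bottleneck troubleshooting mentioned above usually happens.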
Fri Mar 08 23:56:00 UTC 2024