
From: Mike, Kanap Systems
Email: [email protected]
Reply to: [email protected]

Job Title: Data Engineer

Duration: Long term

Location: Remote, USA

Client: Optum

Visa: USC, GC, TN, or H4-EAD

Job Responsibilities:

Assesses requirements, use cases, and customer needs to make recommendations on data engineering, data quality, and data analytics-related technical tools and services as part of solutions and pilots
Serves as a key resource in developing key DataOps and data engineering pipelines, applications, and programs in support of broader strategy and customer needs
Solves complex problems as they arise in vetting new technology, and develops innovative solutions as applicable related to DataOps, integration with CI/CD, and automation
Focuses on quality of work and continuous improvement when developing scalable code using modern tools, applications, and services in the AWS cloud
Provides clear explanations and recommendations to others on complex issues, including working with vendors as applicable on debugging, troubleshooting, or research
Adheres to deadlines and delivers on assigned tasks, ensuring transparency about challenges, and proactively identifies mitigations and solutions to ensure tasks are completed on time
Reviews work performed by others and provides recommendations for improvement
Collaborates with organizational leadership to understand strategy and needs, ensuring recommendations and results align with expectations
Collects input from team members and from internal and external stakeholders to fold into implementations or to develop new ideas for delivering results

Job Qualifications:

10+ years of data engineering and data operations experience in AWS (preferably Azure as well), including building systems that collect, manage, and convert raw data into usable information for data scientists and business analysts to interpret
Experience with Spark-based data engineering tools such as Databricks, RStudio, Jupyter, and Amazon SageMaker, and with AWS serverless services such as Lambda, AWS Glue, and EMR
Experience automating DataOps jobs and data pipelines using orchestration tools such as Airflow, Databricks, or equivalent AWS services
Experience managing streaming data using Kafka
Experience with cloud-native data warehouses such as Snowflake, Amazon Redshift, Amazon RDS, and Delta Lake
Experience with data cataloging tools such as AWS Glue, Unity Catalog, Alation, or equivalent
Fluency in developing complex data engineering pipelines using languages such as Python, Bash, Spark, R, Scala, and Hive, including Lambda and integration with AWS services to monitor pipelines
Experience working in multiple cloud environments, especially AWS and Azure, and familiarity with cloud-native tools and services
Experience operationalizing data pipeline application log management, monitoring, debugging, and notification services
Experience with data access management, authentication, and authorization concepts and their implementation for data management tools such as Snowflake, Delta Lake, S3, and Amazon RDS; knowledge of Privacera and/or open-source Apache Ranger
Experience with team collaboration tools such as Jira, and with agile methodology
Hands-on knowledge of scripting and configuration languages such as Bash, YAML, and JSON
Experience with tools and processes for data profiling

Fluency in using REST API-based technologies as data sources

Must have working experience using technology and/or developing code to manage data quality
Experience with various data formats including Parquet, CSV, TSV, JSON, and PDF
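As an illustration of the data-quality requirement above, here is a minimal sketch of the kind of row-level validation a data engineer might write before loading an extract. It is a hypothetical example, not part of the posting: the column names (`member_id`, `claim_amount`) and rules are invented for illustration.

```python
import csv
import io

def _is_positive_number(value):
    """Return True if value parses as a number greater than zero."""
    try:
        return float(value) > 0
    except ValueError:
        return False

# Hypothetical schema: each rule maps a column name to a validator
# that returns True when the value passes the check.
RULES = {
    "member_id": lambda v: v.strip() != "",           # required, non-empty
    "claim_amount": _is_positive_number,              # numeric and > 0
}

def validate_rows(reader):
    """Split rows into (good, bad) lists; bad entries carry the failed columns."""
    good, bad = [], []
    for row in reader:
        failures = [col for col, check in RULES.items() if not check(row[col])]
        (bad if failures else good).append((row, failures))
    return good, bad

# Usage: validate an in-memory CSV extract with one clean and two bad rows.
raw = "member_id,claim_amount\nA100,12.50\n,9.99\nA101,-3\n"
good, bad = validate_rows(csv.DictReader(io.StringIO(raw)))
```

In a real pipeline the bad rows would typically be routed to a quarantine location and surfaced through the monitoring and notification services mentioned above, rather than silently dropped.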

Preferred Qualifications:

Experience with graph database technology, preferably TigerGraph or Neo4j
Experience working on federal government projects, including with CMS, FDA, HRSA, or other federal customers, especially in a cloud-hosted environment
Hands-on knowledge of COTS ETL tools such as Informatica or Talend
Experience with claims data, EHR data, and the FHIR and HL7 data standards
Experience with document management engines such as OpenText or IBM FileNet
Experience extracting content from PDFs and ingesting it into search engines such as Elasticsearch, Amazon Kendra, or equivalent

--

Thanks & regards,

Mike Foster 

Senior Talent Acquisition Specialist

1501 42nd Street, Suite 471, West Des Moines, IA - 50266

LinkedIn: https://www.linkedin.com/in/syed-abdul-hafeez-mike-foster-29589a1a3/

Email: [email protected]

Contact no: 515-316-8556 or 515-605-7915, India no: 9581662201 | http://kanapsystems.

Fri Jan 13 02:06:00 UTC 2023
