Data Engineer with Insurance Domain at Remote, Remote, USA
Email: [email protected]
From: Syeda Hajra, Absolute IT ([email protected])

Reply to: [email protected]

Data Engineer / Data Analyst

Contract (C2C / W2 / 1099)

Bethlehem, PA (local to PA candidates only)

Work authorization: USC, GC, or GC-EAD only

Must have experience with at least one of the below insurance clients:

("21st Century" OR "Acuity" OR Aflac OR "Allianz Life" OR "Allied" OR Allstate OR "American Automobile Association" OR AAA OR "American Family" OR "American Income Life Company" OR "American International Group" OR AIG OR "American National Company" OR "American Strategic" OR ASI OR "Ameriprise Auto & Home" OR "Ameriprise Financial" OR "Ameritas Life" OR "Amica Mutual" OR "Amtrust Financial Services" OR "Applied Underwriters" OR "Arbella Group" OR Assurant OR "Assurity Life Company" OR "Auto-Owners" OR "AXA Equitable Life Company" OR "Bankers Life and Casualty Company" OR "Berkshire Hathaway" OR "Brotherhood Mutual Company" OR CareSource OR "Chubb Corp" OR "Citizens Property Corporation" OR "CNA Financial" OR "CNO Financial Group" OR "Colonial Life & Accident Company" OR "Combined" OR "Commerce Group" OR "Country Financial" OR "Delta Dental" OR "Encompass Company" OR "Erie Group" OR "Esurance" OR "Evergreen USA RRG" OR "Farmers Group" OR "Federated Mutual Company" OR "First Company of Hawaii" OR "FM Global" OR "GAINSCO" OR GEICO OR "General Re" OR "Genworth Financial" OR "Gerber Life Company" OR "Globe Life And Accident Company" OR "GMAC" OR "Gracy Title Company" OR "Grange Mutual Casualty Company" OR "The Guardian Life Company of America" OR "GuideOne" OR "Hagerty Agency" OR "Hanover" OR "The Hartford" OR "HCC Holdings" OR "Hiscox Small Business" OR "Horace Mann Educators Corporation" OR "Ironshore" OR "K&K" OR "Kansas City Life Company" OR "Kemper Corporation" OR "Knights of Columbus" OR "Lemonade" OR "Liberty Mutual" OR "Lincoln National Corporation" OR "Manhattan Life Company" OR "Markel Corporation" OR "MassMutual" OR "Merchants Group" OR "Mercury Group" OR "MetLife" OR "Metromile" OR "Modern Woodmen of America" OR "Mutual of Omaha" OR "National Flood Program" OR "Nationwide Mutual Company" OR "New Jersey Manufacturers Company" OR "New York Life Company" OR "NJM Group" OR "The Norfolk & Dedham Group" OR "Northwestern Mutual" OR "Omega" OR "Oxford Health Plans" OR "Pacific Life" OR PEMCO OR "Penn Mutual" OR "Penn National Company" OR "Philadelphia Contributionship for the of Houses from Loss by Fire" OR "Philadelphia Companies" OR "Physicians Mutual" OR "Primerica" OR "Principal Financial Group" OR "Progressive" OR "ProSight Specialty" OR "Protective Life" OR "Prudential Financial OR "State Automobile Mutual Company" OR "State Farm" OR "Sun Life Financial" OR "Symetra" OR "The General" OR "The Travelers Companies" OR TIAA OR "Titan Company" OR "Transamerica Corporation" OR "Tricare" OR "Trupanion" OR "UPC" OR "Unum" OR "USAA" OR "West Coast Life" OR "Western & Southern Financial Group" OR "Western Mutual Group" OR "Westfield" OR "White Mountains Group" OR "XL Catlin" OR "Zurich Group")

Responsibilities:

Architect, build, and maintain scalable, reliable data pipelines, with robust data quality checks built into the pipeline, whose output can be consumed by the analytics and BI layers.

Design, develop, and implement low-latency, high-availability, performant data applications, and recommend and implement innovative engineering solutions.

Design, develop, test, and debug code in Python, SQL, PySpark, and Bash per coding standards.

Design and implement a data quality framework and apply it to critical data pipelines to make the data layer robust and trustworthy for downstream consumers.

Design and develop the orchestration layer for data pipelines written in SQL, Python, and PySpark.

Apply and provide guidance on software engineering techniques such as design patterns, code refactoring, framework design, code reusability, code versioning, performance optimization, and continuous integration and deployment (CI/CD) to make the data analytics team robust and efficient.

Perform all job functions consistent with policies and procedures, including those governing the handling of PHI and PII.

Work closely with various IT and business teams to understand system opportunities and constraints and to make full use of the enterprise data infrastructure.

Develop relationships with business team members by being proactive, demonstrating a growing understanding of business processes, and recommending innovative solutions.

Communicate project output in terms of customer value, business objectives, and product opportunity.

Required Skills:

5+ years of experience with a bachelor's or master's degree in Computer Science, Engineering, Applied Mathematics, or a related field.

Extensive hands-on development experience in Python, SQL and Bash.

Extensive experience in performance optimization of data pipelines.

Extensive hands-on experience working with cloud data warehouse and data lake platforms like Databricks, Redshift or Snowflake.

Familiarity with building and deploying scalable data pipelines and data solutions using Python, SQL, and PySpark.
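
For illustration only, a minimal sketch of the kind of PySpark batch pipeline described above; the bucket paths and column names are hypothetical placeholders, not a prescribed design:

    from pyspark.sql import SparkSession, functions as F

    # Minimal batch pipeline: read raw data, clean it, write a curated table.
    spark = SparkSession.builder.appName("claims_pipeline").getOrCreate()

    raw = spark.read.parquet("s3://example-bucket/raw/claims/")

    curated = (
        raw
        .filter(F.col("claim_amount") > 0)                      # drop invalid rows
        .withColumn("claim_year", F.year(F.col("claim_date")))  # derive a partition column
    )

    (curated.write
        .mode("overwrite")
        .partitionBy("claim_year")
        .parquet("s3://example-bucket/curated/claims/"))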

Extensive experience in all stages of software development and expertise in applying software engineering best practices.

Experience in developing and implementing a data quality framework, either homegrown or built on open-source frameworks like Great Expectations, Soda, or Deequ.
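
As a sketch of the "homegrown" end of that spectrum, a small quality gate might look like the following; column names and thresholds are hypothetical, and Great Expectations, Soda, and Deequ offer richer versions of the same idea:

    from pyspark.sql import DataFrame, functions as F

    def run_quality_checks(df: DataFrame) -> None:
        """Fail the pipeline if critical expectations are violated."""
        total = df.count()

        # Expectation 1: the primary key must never be null.
        null_keys = df.filter(F.col("policy_id").isNull()).count()
        if null_keys > 0:
            raise ValueError(f"{null_keys} rows have a null policy_id")

        # Expectation 2: at most 1% of rows may have a negative premium.
        bad = df.filter(F.col("premium") < 0).count()
        if total and bad / total > 0.01:
            raise ValueError(f"{bad} of {total} rows have a negative premium")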

Extensive experience in developing an end-to-end orchestration layer for data pipelines using frameworks like Apache Airflow, Prefect, or Databricks Workflows.
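
For example, a minimal Airflow DAG (Airflow 2.4+ syntax) chaining two pipeline steps; the dag_id and task bodies are hypothetical placeholders:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        ...  # hypothetical: pull raw data into the lake

    def transform():
        ...  # hypothetical: run the PySpark job on the raw data

    with DAG(
        dag_id="claims_daily",
        start_date=datetime(2023, 1, 1),
        schedule="@daily",   # "schedule_interval" on Airflow < 2.4
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        extract_task >> transform_task  # transform runs only after extract succeeds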

Familiarity with RESTful web services (REST APIs) for integrating with other services.
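
By way of a sketch, calling a REST endpoint from Python with the requests library; the URL and token are hypothetical:

    import requests

    # Fetch one resource from a hypothetical policy service.
    resp = requests.get(
        "https://api.example.com/v1/policies/123",
        headers={"Authorization": "Bearer <token>"},
        timeout=10,
    )
    resp.raise_for_status()  # surface 4xx/5xx errors instead of silently continuing
    policy = resp.json()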

Familiarity with API gateways like Apigee for securing web service endpoints.

Familiarity with concurrency and parallelism.
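
A small illustration of the distinction in Python: threads suit I/O-bound work, processes suit CPU-bound work. The loader task below is a hypothetical placeholder:

    from concurrent.futures import ThreadPoolExecutor

    def load_partition(path: str) -> str:
        ...  # hypothetical I/O-bound work, e.g. copying one partition
        return path

    paths = [f"s3://example-bucket/raw/day={d:02d}" for d in range(1, 11)]

    # Run the I/O-bound loads concurrently on a small thread pool;
    # for CPU-bound work, swap in ProcessPoolExecutor.
    with ThreadPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(load_partition, paths))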

Familiarity with data pipelines and the ML development cycle.

Experience in creating and configuring continuous integration/continuous deployment (CI/CD) pipelines to build and deploy applications across environments, following DevOps best practices to promote code to production.

Ability to investigate and repair application defects in any component (front end, business logic, middleware, or database) to improve code quality and consistency, reduce delays, and identify bottlenecks or gaps in the implementation.

Ability to write unit tests in Python using a test library such as pytest.
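
For instance, a minimal pytest module; the function under test, normalize_state, is a hypothetical pipeline helper:

    # test_transforms.py -- run with: pytest test_transforms.py

    def normalize_state(value: str) -> str:
        """Trim and upper-case a US state abbreviation."""
        return value.strip().upper()

    def test_strips_whitespace_and_uppercases():
        assert normalize_state("  pa ") == "PA"

    def test_leaves_clean_input_unchanged():
        assert normalize_state("PA") == "PA"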

Experience in using and implementing data observability platforms like Monte Carlo, Metaplane, Soda, Bigeye, or similar products.

Expertise in debugging issues in a cloud environment by monitoring logs on the VM or using AWS services like CloudWatch.

Experience with DevOps tooling such as Jenkins and Terraform.

Experience with observability concepts in software and with tools like Splunk, Zenoss, Datadog, or similar.

Ability to learn and adapt to new concepts and frameworks and to create proofs of concept using newer technologies.

Ability to use agile methodology throughout the development lifecycle, provide updates on a regular basis, and escalate issues or delays in a timely manner.

Posted: Sat Oct 21 00:31:00 UTC 2023
