ETL DATA ENGINEER - USC only - PA at Remote, Remote, USA
Email: [email protected]
From: JYOTI, KPG99 [email protected]
Reply to: [email protected]

ETL Data Engineer (USC only)

Data Engineer/Visualization opportunity for a pharmaceutical organization in the King of Prussia, PA area; however, this role can be mostly (95%+) remote. A Mid-Atlantic technology consulting firm and managed services provider located just outside Philadelphia is looking for a Data Engineer to join our team!

*Job Summary*
The Data Engineering Team is part of the Modeling and Data Sciences technology organization. This team supports key initiatives and improves the competitiveness and operational efficiency of our existing and next-generation offerings by developing digital solutions and technologies in condition monitoring, reliability/risk modeling, dynamic modeling, IIoT, and new technology development and evaluation. The team works in close collaboration with the Engineering, Technology, and Operations teams to deliver competitive, world-leading solutions. The position is on the technical ladder, where the incumbent will be provided with learning and advancement opportunities focused on long-term career growth and success.

*Responsibilities:*
* Apply data modeling along with strong analytics and problem-solving skills.
* Develop and manage effective working relationships with other departments, groups, and personnel with whom work must be coordinated or interfaced.
* Communicate efficiently with the ETL architect, drawing on requirements and business process knowledge, to transform data in a way that is geared toward the needs of end users.
* Assist with the overall ETL design architecture, and proactively provide input on designing, implementing, and automating data strategies.
* Investigate and mine data to identify potential issues within ETL pipelines, notify end users, and propose adequate solutions.
* Build data pipelines and data flows in and out of data storage using Azure Data Factory.
* Design and build data delivery mechanisms using Azure Synapse, Azure Storage, and Azure SQL Server databases.
* Document implementations and test cases, and build the deployment documents needed for CI/CD.
* Familiarity with data visualization tools (e.g., Power BI, Qlik Sense).
* Ability to identify data bottlenecks and provide solutions.

*Required Experience:*
* Bachelor's degree or equivalent work experience
* 5+ years of experience with data warehousing concepts, ETL concepts, and ETL/data quality tools on any ETL platform
* 5+ years of experience writing complex SQL queries and stored procedures, and performance tuning
* 5+ years of experience working with robust analytical models that are critical to business operations, in a hands-on role responsible for processing and preparing the data leveraged by those models
* Strong experience automating data processing jobs with Python
* Strong experience leveraging data processing tools, optimizing queries, and writing advanced SQL scripts
* Ability to leverage command-line utilities as needed and to work in both on-premises and cloud environments
* 5+ years of experience with Microsoft Azure cloud services and data warehouse environments
* 5+ years of experience building, deploying, and troubleshooting data extraction and loading (ETL) pipelines using Azure Data Factory (ADF)
* 5+ years of experience analyzing and executing data profiles
*Preferred Experience:*
* Azure certification in database and data engineering roles
* 1+ years of experience developing a data solution on Snowflake Data Warehouse using Snowflake continuous data pipelines with Snowpipe, Streams, and Tasks
* Familiarity with R programming and data science concepts
* Familiarity with data orchestration tools

Keywords: continuous integration, continuous deployment, business intelligence, rlang, Pennsylvania
Tue Feb 07 03:37:00 UTC 2023