
ETL PYTHON DEVELOPER - PHILADELPHIA - DAY ONE ONSITE at Philadelphia, Pennsylvania, USA
Email: [email protected]
Hi All,

ETL Python Developer

Responsibilities: 

1. Hands-on building of ETL pipelines using our internal framework written in Java and Python.

2. Hands-on solutioning of real-time REST APIs or other solutions for streaming data from Graph.

3. Modify existing application code or interfaces, or build new application components from detailed requirements.

4. Analysis of requirements; support of the design, development, testing, debugging, deployment, and maintenance of programs and interfaces. Documentation of the work is essential.

5. Participation in most aspects of programming and application development, including file design, update, storage, and retrieval.

6. Enhance processes to resolve operational problems and add new functions, taking into consideration schedule, resource constraints, process complexity, dependencies, assumptions, and application structure.

7. Ability to maintain the developed solution on an ongoing basis is essential.

8. Ability to follow the existing development methodology and coding standards, and ensure compliance with internal and external regulatory requirements.

9. Develop and implement databases, data collection systems, data analytics, and other strategies that optimize statistical efficiency and quality.

10. Acquire data from primary or secondary data sources and maintain databases/data systems.

11. Work with management to prioritize business and information needs.

12. Locate and define new process-improvement opportunities.

13. Document design and data flow for existing and new applications being built.

14. Coordinate with multiple teams (QA, Operations, and other development teams) within the organization.

15. Testing methods, including unit and integration testing (PyTest, PyUnit).

16. Ability to integrate with large teams, demonstrating strong verbal and written communication skills.

17. Utilization of software configuration management tools.

18. Code deployment and code versioning tools.

19. Excellent communication skills.
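The testing responsibility above mentions PyTest for unit and integration testing. As a minimal, hypothetical sketch (the `transform_record` helper below is illustrative only, not part of the internal framework):

```python
# Hypothetical ETL helper used to illustrate a PyTest-style unit test;
# not part of any actual internal framework.

def transform_record(record: dict) -> dict:
    """Normalize a raw record: cast the id to int and clean the country code."""
    return {
        "id": int(record["id"]),
        "country": record["country"].strip().upper(),
    }

def test_transform_record():
    # PyTest discovers functions named test_* and runs bare assert statements.
    raw = {"id": "42", "country": " us "}
    assert transform_record(raw) == {"id": 42, "country": "US"}
```

Running `pytest` against a file containing this code would collect and execute `test_transform_record` automatically.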

Qualifications: 

1. Bachelor's degree, preferably with a Computer Science background.

2. At least 5 years of experience implementing complex ETL pipelines, preferably with the Spark toolset.

3. At least 5 years of experience with Java, particularly within the data space.

4. Technical expertise regarding data models, database design and development, data mining, and segmentation techniques.

5. Good experience writing complex SQL and ETL processes.

6. Excellent coding and design skills, particularly in either Scala or Python.

7. Strong practical working experience with Unix scripting in at least one of Python, Perl, or Shell (either bash or zsh).

8. Experience with AWS technologies such as EC2, Redshift, CloudFormation, EMR, S3, and AWS Analytics is required.

9. Experience designing and implementing data pipelines in an on-prem/cloud environment is required.

10. Experience building/implementing data pipelines using Databricks, on-prem, or a similar cloud database.

11. Expert-level knowledge of using SQL to write complex, highly optimized queries across large volumes of data.

12. Hands-on object-oriented programming experience using Python is required.

13. Professional work experience building real-time data streams using Spark.

14. Knowledge of or experience with architectural best practices for building data lakes.

15. Develop and work with APIs.

16. Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity.

17. Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increase data accessibility, and foster data-driven decision making across the organization.

18. Implement processes and systems to monitor data quality, ensure production data accuracy, and ensure access for key stakeholders and business processes.

19. Write unit/integration tests and contribute to the engineering wiki and documentation.

20. Perform the data analysis required to troubleshoot data-related issues and assist in their resolution.

21. Experience developing data integrations and a data quality framework based on established requirements.

22. Experience with CI/CD processes and tools (e.g., Concourse, Jenkins).

23. Experience with test-driven development: writing unit tests and measuring test coverage using the PyTest, PyUnit, and pytest-cov libraries.

24. Experience working in an Agile environment.

25. Good understanding and usage of algorithms and data structures.

26. Good experience building reusable frameworks.

27. Experience working in an Agile team environment.

28. AWS certification is preferable: AWS Developer/Architect/DevOps/Big Data.

29. Excellent communication skills, both verbal and written.
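Several qualifications above call for writing complex, optimized SQL over large data volumes. A small, self-contained sketch of the pattern, using `sqlite3` from the Python standard library (the `orders` table and its data are illustrative assumptions, not from the role):

```python
# Illustrative sketch of a SQL aggregation query, using an in-memory SQLite
# database; the orders table here is a made-up example, not a real schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES ('a', 10.0), ('a', 5.0), ('b', 7.5);
""")

# Aggregate total spend per customer, highest first.
rows = conn.execute("""
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
    ORDER BY total DESC
""").fetchall()

print(rows)  # [('a', 15.0), ('b', 7.5)]
```

The same `GROUP BY`/`ORDER BY` shape carries over to Redshift or Databricks SQL, where optimization would additionally involve distribution keys or partitioning.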

Mohd Faisal

--

Thu Nov 23 00:22:00 UTC 2023


