Python Developer with ETL Informatica experience (PST time zone), Remote, USA |
Email: [email protected] |
From: Raj, VSII [email protected] Reply to: [email protected]

Python Developer with ETL Informatica experience (PST time zone), Remote

Summary
This position will be the key player in designing and developing a Python process that intakes a file and then generates a metadata file to be sent to our partner. The role will also support the ELT/ETL development, deployment, and initial operations of a new process that generates data files in various formats (e.g., Parquet) and moves them to our partner's S3. This includes the data files as well as audit/trace files and the metadata file mentioned above. Candidates should be hands-on and demonstrate a strong work ethic, sound engineering practice, and the ability to tackle the most difficult problems facing the team. This position will also require the engineer to perform on-call support during go-live and initial operations.

Responsibilities
- Design and engineer a Python program that follows best practices, delivers optimal performance, and can be integrated into the overall Informatica workflow.
- Collaborate with a team of engineers to deliver ETL, ELT, and Data Warehouse solutions that fulfill project requirements.
- Design and build a scalable, modern, high-performing solution.
- Interface closely with the partner and other project-related teams.
- Combine consultative skills with analytical knowledge to critically evaluate objectives and customer requirements and translate them into technical requirements.
- Deliver quality solutions that perform well and are easy to maintain.
- Provide off-hours production support for this solution during the go-live phase.
Requirements
- 3+ years of Python development experience
- 1-3 years of strong knowledge of Informatica (PowerCenter or Cloud) a major plus
- 3+ years of relevant experience architecting, building, and maintaining large-scale data systems
- 1-3 years of advanced SQL
- 1-3 years of experience developing cloud solutions and using relevant cloud services
- 3+ years of experience with scripting languages (e.g., Bash, PowerShell, Python, Ruby)
- 3 years of experience engineering large-scale, data-intensive systems
- Bachelor's degree in computer science or a related field; Master's degree a plus
- Experience with Amazon Redshift a plus
- Deep knowledge of various ETL/ELT tools and concepts, multidimensional data modeling, SQL, query performance optimization, OLTP, OLAP, and MDM
- Knowledge of MPP databases, architecture, and performance tuning a plus
- Programming experience a plus
- Experience with all phases of the software development life cycle (SDLC)
- Experience with schedulers and source-control software such as TFS
- Excellent written and verbal communication skills
- Banking/financial experience a plus
- Strong verbal and written presentation skills; actively listen to others to understand their perspective, and present complex topics and analysis in an easy-to-comprehend manner
- Ability to combine industry knowledge, experience, innovative skills, and user input to create a flexible, optimized DW infrastructure
- Ability to handle multiple projects simultaneously and meet deadlines, with a proven record of succeeding in a high-activity, fast-paced environment while managing the associated stress
- Outstanding ability to work independently, within a team, and across departments |
Wed Oct 19 21:16:00 UTC 2022 |