Informatica Cloud (IICS) Data Engineer - Austin, TX - Need Only Locals - No H1Bs - Austin, Texas, USA
Email: [email protected]
From: Bhanu Prakash, Intellectt [email protected] Reply to: [email protected]

Hi,

Hope you are doing well. This is Bhanu from Intellectt INC; we have an immediate opportunity with one of our clients. Please find the job description below, and if you are interested, please forward your resume to [email protected].

Role: Informatica Cloud (IICS) Data Engineer - Strong Snowflake Experience is Mandatory
Location: Austin, Texas - Remote
Duration: 12 Months with possible extension

Job Description

Performs advanced (senior-level) data pipeline development work with Informatica Cloud (IICS), including data integration, source-to-target data modeling, Extract-Transform-Load (ETL) development, consuming Oracle data connections, RESTful API application-based data connections, targeting Snowflake data connections, designing Snowflake target data lake databases, and data warehouse modeling with ELT. The candidate will use Informatica Cloud to build mass ingestion pipelines or other EL processes from Oracle transaction database(s) to our Snowflake data lake. Additional duties may include working within Snowflake to create other API-based data ingestion routines and/or setting up data sharing of the accumulated data. Dimensional modeling may be required in our data warehouse to accomplish other objectives such as data reporting and improved performance in data handling.

Responsibilities and required experience:
- Development of ETL/ELT data mappings and workflows for data pipeline development with Informatica Cloud Data Integration.
- Practical experience using and building Informatica mass ingestion pipelines.
- Demonstrated experience with Oracle Database as a data connector source.
- Expertise with Snowflake as a target database platform.
- Experience with the Snowflake platform and ecosystem.
- Knowledge of Snowflake data sharing and Snowpark is a plus.
- Knowledge of the advantages of, and previous experience working with, Informatica push-down optimization.
- Experience with Snowflake database creation, optimization, and architectural advantages.
- Practical experience with Snowflake SQL.
- Determines database requirements by analyzing business operations, applications, and programming; reviewing business objectives; and evaluating current systems.
- Obtains data model requirements; develops and implements data models for new projects; and maintains existing data models and data architectures.
- Creates graphics and other flow diagrams (including ERDs) to present complex database designs and data models more simply.
- Performs related work as assigned.
- Practical experience with one-time data loads as well as Change Data Capture (CDC) for bulk data movement.
- Creation of technical documentation for processes and interfaces is a key element of this role, as the team is working on release two of a multiple-release effort.
- Ability to review the work of others, troubleshoot, and provide feedback and guidance to meet tight deliverable deadlines is required.
- Ability to promote code from development environments to production.
- Familiarity with GitHub or equivalent version control systems.
- Experience working with state agencies, as well as security protocols and processes.

Candidate Skills and Qualifications
- Generating advanced SQL queries and using other data interrogation methods.
- Experience with Informatica products, with at least 4 years of direct experience with Informatica Cloud Data Integration.
- Reviewing, interpreting, and translating business requirements and design specifications into data mappings and data pipeline development using all major data integration patterns.
- Experience in relational database design concepts, including direct experience with Oracle RDBMS.
- Experience with data warehouse architectural patterns, including modeling facts and dimensions.
- Experience with static mapping and Change Data Capture (CDC) for bulk data movement.
- Experience with Informatica Mass Ingestion.
- Experience using Snowflake (creating, managing, and optimizing databases, as well as Time Travel and other technical advantages of cloud data warehousing).
- Any experience working with Snowpark, servicing the needs of Python programmers, or understanding Data Scientists' requirements against Snowflake databases will make the candidate more successful.
Tue Mar 19 00:16:00 UTC 2024