LOCAL TO ARIZONA: ETL Developer or Data Engineer (USC/GC) at Remote, Remote, USA
Email: [email protected]
From: Rachna, TheStaffed [email protected]
Reply to: [email protected]
Position: ETL Developer/Data Engineer (USC/GC)
Client: Republic Services
Type: Contract (C2C or W2)
Location: Remote (but candidates in the Phoenix, AZ area are preferred)
Hours / Time Zone: AZ time

This client PREFERS local candidates currently living in the Phoenix, AZ area. Even though this job is remote, the team holds outings and team-building events that they like consultants to participate in. A candidate willing to relocate will also work, and if the person is a rock star they would consider a fully remote arrangement.

The contractor will work on the Modernizers / EAM BI team. The role is to develop, enhance, and support data pipelines using Informatica PowerCenter, AWS, Snowflake, and MarkLogic.

RESPONSIBILITIES:
- Design and develop code and data pipelines to ingest from relational databases (Oracle, SQL Server, DB2, Aurora), file shares, and web services.
- Design and build Informatica PowerCenter mappings and workflows.
- Build streaming ingestion with Kinesis Streams, Kinesis Firehose, Kinesis Analytics, and Kafka (MSK).
- Build a data lake on AWS S3 with optimal performance considerations by partitioning and compressing data (see the brief sketch after the qualifications list).
- Perform data engineering and analytics using AWS Glue, Informatica, EMR, Spark, Athena, and Python.
- Perform data modeling and build the data warehouse using Snowflake.
- Design JavaScript modules and REST APIs utilizing MarkLogic to support complex searches.
- Develop MarkLogic integrations with existing enterprise platforms.
- Participate in requirements definition, system architecture design, and data architecture design.
- Participate in all aspects of the software life cycle using Agile development methodologies.

QUALIFICATIONS:
- Experience with AWS services for data and analytics (required).
- 8 years of experience in data ingestion, data extraction, and data integration (required).
- 8+ years of experience in enterprise information solution architecture, design, and development (required).
- 8+ years of experience with integration architectures such as SOA, microservices, ETL, or other integration technologies.
- 8+ years of experience working with content or knowledge management systems, search engines, relational databases, NoSQL databases, ETL tools, geospatial systems, or semantic technology.
- 3+ years of hands-on experience with the MarkLogic framework; DynamoDB preferred.
- 3+ years of hands-on experience with AWS services (S3, Kinesis, Lambda, Athena, Glue, EMR) (required).
- 3+ years of experience with analytics tools such as SAS, R, Python, and other advanced statistical software.
- 3+ years of web development experience with Angular, JavaScript, and Node.js.
- Experience with JSON or XML data modeling (required).
- Experience with Git/GitHub, branching, and other modern source code management methodologies (required).
- Domain knowledge of NoSQL or relational databases (required).
- Understanding of database architecture and performance implications (required).
- Experience with machine learning and artificial intelligence.
- Ability to work collaboratively as part of an Agile team.
- Extensive knowledge of and experience with Python, JavaScript, and Java.
- Excellent written and verbal communication skills, sense of ownership, urgency, and drive.
- Bachelor's degree in Computer Science, Computer Information Systems, Engineering, Statistics, or a closely related field (willing to accept foreign education equivalent) (required).

Keywords: javascript business intelligence sthree rlang green card wtwo Arizona
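As a point of reference for the data-lake responsibility above, here is a minimal PySpark sketch of a partitioned, compressed write to S3. The JDBC connection details, bucket, paths, and column names are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch: ingest from a relational source and land partitioned,
# compressed files in an S3 data lake with PySpark.
# All connection details, bucket names, and columns are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("s3-data-lake-sketch").getOrCreate()

# Read a source table over JDBC (driver jar and credentials assumed available).
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//example-host:1521/ORCL")
    .option("dbtable", "SALES.ORDERS")
    .option("user", "etl_user")
    .option("password", "***")
    .load()
)

# Partition by a date column and compress with Snappy so downstream
# Athena/Glue queries can prune partitions and scan less data.
# The s3:// scheme assumes an EMR/Glue runtime; plain Spark would use s3a://.
(
    orders.write.mode("append")
    .partitionBy("order_date")
    .option("compression", "snappy")
    .parquet("s3://example-data-lake/curated/orders/")
)
```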
Tue Aug 22 01:08:00 UTC 2023 |