Urgent Requirement || Lead Data Engineer with AWS & Snowflake || Saint Louis, MO (Onsite) || 10+ Years at Snowflake, Arizona, USA
Email: [email protected]
Hi,

Please find the attached resume.

Thank you,
Kranthi Kumar
469-898-6223

From: [email protected] on behalf of sumit kumar <[email protected]>
Sent: Friday, January 12, 2024 8:10:57 PM
To: [email protected]; [email protected]; [email protected]; [email protected]; [email protected]
Subject: Urgent Requirement || Lead Data Engineer with AWS & Snowflake || Saint Louis, MO (Onsite) || 10+ Years

Hello All,

I hope you are doing well. I have an urgent requirement for a Lead Data Engineer with AWS & Snowflake at the location Saint Louis, MO (Onsite). Please have a look and let me know if you are interested.

Position Overview

The Cloud Data Engineering Lead works within our Enterprise Data Fabric team, which leverages data as a service for acquiring, securing, cataloging, processing, and analyzing internal and external data sets. This role will have the opportunity to shape the product strategy, vision, and portfolio of our Data Fabric platform. Once you join the team, you will be responsible for designing and implementing large-scale distributed data processing systems using cutting-edge cloud-based, open-source, and proprietary big data technologies. In this role, you will implement a variety of solutions and frameworks to ingest data into, process data within, and expose data from the Data Fabric platform.

Contract Assignment: 12 months
Business Area: IT / Data Platforms

Responsibilities

- Provide technical and/or business application consultation to business partners and team members in the areas of functionality, architecture, operating systems, and databases for complex application systems.
- Develop data platform components in cloud and on-premises environments to ingest data and events from cloud and on-premises environments as well as third parties.
- Build automated pipelines and data services to validate, catalog, aggregate, and transform ingested data.
- Build automated data delivery pipelines and services to integrate data from the data lake into internal and external consuming applications and services.
- Build and deliver deployment and monitoring capabilities consistent with DevOps models.
- Work with architects to transform high-level architecture designs and assist in the technical delivery of large-scale enterprise projects, implementing optimized end-to-end solutions.
- Develop low-level technical specifications and detailed program specifications to promote a solid core application that can be reused across projects.
- Analyze existing systems and architectures and recommend improvements.
- Assist in troubleshooting production issues and new build deployments.
- Ensure code quality, perform code reviews, and mentor development team members.
- Ensure user expectations are met; when desired outcomes are not feasible, build understanding and provide alternative solutions to meet the objective(s).
- Design and develop software for new functionality, improvements, and system longevity.
- Ensure all documentation of technical architecture and systems is complete.
- Provide training and guidance to team members and users as required.
- Must be available to meet the schedules of a global operation, including off-hours meetings.

Requirements

Education and Experience (Required):

- Bachelor's degree in Computer Science, or equivalent education and experience
- 6+ years of experience in programming/systems analysis
- 5+ years of hands-on experience with big data and data-at-scale platform services
- 4+ years of experience using Python for big data processing
- 4+ years of experience developing cloud-native applications and deploying to a cloud environment (AWS)
- Experience with Snowflake

--

Keywords: javascript information technology California Missouri
Fri Jan 12 21:23:00 UTC 2024
Attached files: Sai_Harsha_Resume(1)_1705074795559.doc