
Urgent: Data Engineer GCP SQL Development (W2) at Remote, USA
From: Ramashankar, vyzeinc ([email protected])

Reply to: [email protected]

Job Description -

Data Engineer GCP SQL Development 

Ford Motor Co, Michigan 

Hybrid onsite (Dearborn, MI) or Remote

12-month contract.

Visa: USC / GC

Additional Information:

A high priority will be given to Dearborn-local candidates (or those willing to relocate). The expectation is in-office at least 1 day per week and at home the rest of the time, subject to change.

Full Stack Back End Java Software Engineer Senior #997179

Job Description:

       We're seeking a Data Engineer who has experience building data products on a cloud analytics platform.

       You will work on ingesting, transforming, and analyzing large datasets to support the Enterprise in the Data Factory on Google Cloud Platform (GCP).  

       Experience with large-scale solutions and the operationalization of data lakes, data warehouses, and analytics platforms on Google Cloud Platform or other cloud environments is a must.

       We are looking for candidates who have a broad set of technical skills across these areas.

You will: 

       Work in a collaborative environment that leverages pair programming

       Work on a small agile team to deliver curated data products 

       Work effectively with fellow data engineers, product owners, data champions and other technical experts 

       Demonstrate technical knowledge and communication skills with the ability to advocate for well-designed solutions

       Develop exceptional analytical data products using both streaming and batch ingestion patterns on Google Cloud Platform with solid data warehouse principles (a streaming-ingestion sketch follows this list)

       Be the Subject Matter Expert in Data Engineering with a focus on GCP native services and other well integrated third-party technologies
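
The streaming half of that ingestion pattern can be as simple as consuming events from a Pub/Sub subscription and appending them to a BigQuery table. Below is a minimal sketch, assuming the google-cloud-pubsub and google-cloud-bigquery Python clients; the project, subscription, and table names are hypothetical placeholders and not part of this posting.

    import json
    from google.cloud import bigquery, pubsub_v1

    # Hypothetical project, subscription, and table names used for illustration only.
    PROJECT = "my-analytics-project"
    TABLE = f"{PROJECT}.staging.orders_stream"

    bq = bigquery.Client(project=PROJECT)
    subscriber = pubsub_v1.SubscriberClient()
    subscription = subscriber.subscription_path(PROJECT, "orders-sub")

    def handle_message(message):
        """Append each event to BigQuery as it arrives (streaming inserts)."""
        row = json.loads(message.data)
        errors = bq.insert_rows_json(TABLE, [row])
        if not errors:
            message.ack()  # only acknowledge once the row has landed

    # Pull events continuously; the batch half of the pattern would instead run
    # as a scheduled load job (see the Airflow sketch under Experience Required).
    streaming_pull = subscriber.subscribe(subscription, callback=handle_message)
    streaming_pull.result()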

Primary Skills Required:

       Experience in working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment, and implementing methods to automate all parts of the pipeline to minimize labor in development and production

       Experience in analyzing complex data, organizing raw data, and integrating massive datasets from multiple data sources to build analytical domains and reusable data products

       Experience in working with architects to evaluate and productionalize data pipelines for data ingestion, curation, and consumption

       Experience in working with stakeholders to formulate business problems as technical data requirements, identify and implement technical solutions while ensuring key business drivers are captured in collaboration with product management

Additional Skills Preferred:

       Strong drive for results and ability to multi-task and work independently 

       Self-starter with proven innovation skills 

       Ability to communicate and work with cross-functional teams and all levels of management 

       Demonstrated commitment to quality and project timing 

       Demonstrated ability to document complex systems 

       Experience in creating and executing detailed test plans 

Experience Required:

       5+ years of SQL development experience

       5+ years of analytics/data product development experience required

       3+ years of Google Cloud experience with solutions designed and implemented at production scale

       Experience working with GCP-native (or equivalent) services like BigQuery, Google Cloud Storage, Pub/Sub, Dataflow, Dataproc, Cloud Build, etc.

       Experience migrating Teradata to GCP

       Experience working with Airflow for scheduling and orchestration of data pipelines (see the Airflow sketch after this list)

       Experience working with Terraform to provision Infrastructure as Code

       2+ years of professional development experience in Java or Python
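
As one concrete reading of the Airflow and BigQuery items above, the sketch below shows a daily DAG that loads Parquet files from Google Cloud Storage into a BigQuery staging table. It is only a sketch, assuming Airflow 2.x and the google-cloud-bigquery client; the bucket, project, dataset, and table names are hypothetical.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from google.cloud import bigquery

    def load_batch_to_bigquery():
        """Load one day's raw Parquet files from GCS into a BigQuery staging table."""
        client = bigquery.Client(project="my-analytics-project")  # hypothetical project
        job_config = bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.PARQUET,
            write_disposition="WRITE_APPEND",
        )
        load_job = client.load_table_from_uri(
            "gs://my-raw-bucket/orders/dt=2024-06-27/*.parquet",  # hypothetical path
            "my-analytics-project.staging.orders",
            job_config=job_config,
        )
        load_job.result()  # block until the load job finishes

    with DAG(
        dag_id="orders_daily_ingest",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="load_to_bigquery", python_callable=load_batch_to_bigquery)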

Additional Experience Preferred

       In-depth understanding of Google's product technology (or other cloud platforms) and underlying architectures

       Experience in working with DBT/Dataform

       Experience with DataPlex or other data catalogs is preferred

       Experience with development ecosystems such as Tekton, Git, and Jenkins for CI/CD pipelines

       Exceptional problem-solving and communication skills

       Experience in working with Agile and Lean methodologies

       Team player and attention to detail 

       Experience with performance tuning SQL queries (a query-tuning sketch follows this list)
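
On the query-tuning point, a common BigQuery pattern is to partition and cluster the table, then filter on the partition column with an explicit column list instead of SELECT *. The following is a minimal sketch, assuming the google-cloud-bigquery client; table and column names are hypothetical.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-analytics-project")  # hypothetical project

    # Partitioning by date and clustering by a frequent filter key keeps scans small.
    client.query(
        """
        CREATE TABLE IF NOT EXISTS analytics.orders_curated
        PARTITION BY DATE(order_ts)
        CLUSTER BY customer_id
        AS SELECT * FROM staging.orders
        """
    ).result()

    # Pruned query: partition filter plus an explicit column list, not SELECT *.
    rows = client.query(
        """
        SELECT customer_id, SUM(amount) AS total_amount
        FROM analytics.orders_curated
        WHERE DATE(order_ts) BETWEEN '2024-06-01' AND '2024-06-27'
        GROUP BY customer_id
        """
    ).result()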

Education Required:

       Bachelor's degree in computer science or a related scientific field

Additional Education Preferred

       GCP Professional Data Engineer Certified 

       Master's degree in computer science or a related field

       2+ years mentoring engineers

       In-depth software engineering knowledge

Posted: Thu Jun 27 00:08:00 UTC 2024




Location: Dearborn, Michigan