
Remote Position: GCP Data Engineer (W2 only) at Dearborn, Michigan, USA
Email: [email protected]
From: Sachin, Sspearhead Inc. ([email protected])
Reply to: [email protected]

Job Description:

Remote Position: GCP Data Engineer (W2 required)

Long Term

Location: Dearborn, MI (100% remote, though it may require relocation down the road)


- The MIS team is seeking a GCP Data Engineer to create, deliver, and support custom data products and to enhance and expand team capabilities.
- The engineer will analyze and manipulate large datasets across the enterprise, activating data assets to support Enabling Platforms and analytics.
- Google Cloud Data Engineers will be responsible for designing data transformation and modernization solutions on Google Cloud Platform.
- Experience with large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on Google Cloud Platform is a must.
- We are looking for candidates with a broad set of technology skills across these areas who can demonstrate the ability to design the right solutions, combining Google Cloud Platform and third-party technologies appropriately for deployment on GCP.

Skills Required:

- Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.
- Implement methods to automate all parts of the pipeline, minimizing labor in development and production; this includes designing and deploying a pipeline with automated data lineage.
- Identify, develop, evaluate, and summarize proofs of concept to prove out solutions.
- Test and compare competing solutions and report a point of view on the best one.
- Integrate GCP Data Catalog with Informatica EDC.
- Design and build production data engineering solutions that deliver our pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine (a sketch of one such pattern follows this list).
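For a flavor of the streaming pipeline pattern named above, here is a minimal sketch using Apache Beam in Python, assuming a hypothetical Pub/Sub topic and BigQuery table; the project, topic, table, and schema names are illustrative placeholders, not details from this posting.

    # Minimal Apache Beam sketch: read JSON events from Pub/Sub and stream
    # them into BigQuery. All identifiers below are hypothetical.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(streaming=True, project="example-project")

    with beam.Pipeline(options=options) as p:
        (
            p
            # Pull raw event messages (bytes) from the Pub/Sub topic.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/events")
            # Decode and parse each message into a dict matching the schema.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Append the parsed rows to a BigQuery table.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="event_id:STRING,payload:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )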

Skills Preferred:

- Strong drive for results and ability to multi-task and work independently
- Self-starter with proven innovation skills
- Ability to work with cross-functional teams and all levels of management
- Demonstrated commitment to quality and project timing
- Demonstrated ability to document complex systems
- Experience in creating and executing detailed test plans

Experience Required:

- In-depth understanding of Google's product technology and underlying architectures.
- 5+ years of application development experience required, including 3+ years of GCP experience.
- Experience working in GCP-based big data deployments (batch/real-time) leveraging BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc, Airflow, etc.
- 2+ years of coding skills in Java/Python.
- Work with the data team to analyze data, build models, and integrate massive datasets from multiple data sources for data modeling.
- Implement methods to automate all parts of the predictive pipeline, minimizing labor in development and production.
- Formulate business problems as technical data problems, ensuring key business drivers are captured in collaboration with product management.
- Extracting, loading, transforming, cleaning, and validating data; designing pipelines and architectures for data processing.
- Minimum of 1 year designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using Java, Python, etc. (an orchestration sketch follows this list).
- Hands-on GCP experience with a minimum of one solution designed and implemented at production scale.
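To make the ingestion-to-consumption idea concrete, here is a minimal sketch of a batch pipeline orchestrated with Airflow (the engine behind Cloud Composer): land files from Cloud Storage into a BigQuery staging table, then run a SQL transform into a reporting table. The bucket, dataset, table names, and SQL are hypothetical placeholders, not details from this posting.

    # Minimal Airflow DAG sketch of an ingest-then-transform batch pipeline
    # on GCP. All identifiers below are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    with DAG(
        dag_id="daily_sales_pipeline",
        start_date=datetime(2022, 11, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Ingest: load the day's CSV exports from GCS into a staging table.
        load_raw = GCSToBigQueryOperator(
            task_id="load_raw",
            bucket="example-landing-bucket",
            source_objects=["sales/{{ ds }}/*.csv"],
            destination_project_dataset_table="example-project.staging.sales",
            source_format="CSV",
            write_disposition="WRITE_TRUNCATE",
        )

        # Transform: aggregate staging rows into the consumption-layer table.
        transform = BigQueryInsertJobOperator(
            task_id="transform",
            configuration={
                "query": {
                    "query": """
                        INSERT INTO `example-project.reporting.daily_sales`
                        SELECT DATE(order_ts) AS day, SUM(amount) AS total
                        FROM `example-project.staging.sales`
                        GROUP BY day
                    """,
                    "useLegacySql": False,
                }
            },
        )

        load_raw >> transform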

Experience Preferred:

- Architecting and implementing next-generation data and analytics platforms on GCP.
- Experience building solution architectures, provisioning infrastructure, and delivering secure, reliable data-centric services and applications in GCP.
- Experience with Informatica EDC.
- Experience with development-ecosystem tools such as Git, Jenkins, and CI/CD.
- Exceptional problem-solving and communication skills.
- Experience working with Agile and Lean methodologies.
- Team player with attention to detail.
- Google Cloud Platform (GCP) certification preferred.

Education Required:

- Bachelor's degree in Computer Engineering, Information Systems, or a related field of study.

Regards,

Sachin

Sspearhead Inc.

www.sspearhead.com

Posted: Mon Nov 21 19:07:00 UTC 2022
