GCP Data Engineer in Detroit Metropolitan Area, MI (Hybrid) at Detroit, Michigan, USA
Email: [email protected]
From: Sarfaraz, Convex Tech Inc [email protected]
Reply to: [email protected]

Hi,

Hope you are doing well! Please let me know if you are interested in the position below.

Title: Data Engineer
Location: Hybrid (Detroit Metropolitan Area, MI)
Duration: 6-month contract
Visa Restrictions: No H1B/CPT

Job Description:

Responsibilities:
- Work on a small agile team to deliver curated data products
- Work effectively with fellow data engineers, product owners, data champions, and other technical experts
- Demonstrate technical knowledge and communication skills, with the ability to advocate for well-designed solutions
- Develop exceptional analytical data products using both streaming and batch ingestion patterns on Google Cloud Platform, applying solid data warehouse principles
- Serve as the subject matter expert in data engineering, with a focus on GCP native services and well-integrated third-party technologies
- Work on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment
- Implement methods to automate all parts of the pipeline, minimizing labor in development and production
- Analyze complex data, organize raw data, and integrate massive datasets from multiple data sources to build analytical domains and reusable data products
- Work with architects to evaluate and productionalize data pipelines for data ingestion, curation, and consumption
- Work with stakeholders to formulate business problems as technical data requirements, then identify and implement technical solutions while ensuring key business drivers are captured in collaboration with product management

Required Experience and Skills:
- 5+ years of SQL development experience
- 5+ years of analytics/data product development experience
- 3+ years of Google Cloud experience with solutions designed and implemented at production scale
- Experience with GCP native (or equivalent) services such as BigQuery, Google Cloud Storage, Pub/Sub, Dataflow, Dataproc, Cloud Build, etc.
- Experience migrating Teradata to GCP
- Experience with Airflow for scheduling and orchestration of data pipelines
- Experience with Terraform to provision Infrastructure as Code
- 2+ years of professional development experience in Java or Python
- Experience converting Microsoft SSRS, SSAS, and SSIS to BigQuery
- Experience converting mainframe JCL, COBOL, and CA7
- Experience converting SAS code to BigQuery
- Experience converting WebFOCUS reports to Qlik Sense
- In-depth understanding of Google's product technology (or another cloud platform) and its underlying architectures
- Experience with dbt/Dataform
- Experience with Dataplex or other data catalogs preferred
- Experience with development ecosystems such as Tekton, Git, and Jenkins for CI/CD pipelines
- Experience working with Agile and Lean methodologies
- Team player with attention to detail
- Experience with performance tuning SQL queries
- Bachelor's or Master's degree in computer science or a related scientific field
- GCP Professional Data Engineer certification
- 2+ years mentoring engineers

Thanks and regards,

Sarfaraz Khan
US IT Recruiter | Convex Tech Inc
Email: [email protected]
LinkedIn: https://www.linkedin.com/in/sarfaraz-khan-stellar/
Tue Jul 02 08:22:00 UTC 2024