Senior Data Engineer (GCP) :: Remote, USA :: Contract |
Email: [email protected] |
Hi,

Hope you are doing great. My name is Devendra Pratap Singh and I am a Lead Talent Acquisition Specialist at Amaze Systems Inc. I am reaching out to you about an exciting job opportunity with one of our clients. If you are interested, please share your updated resume along with submission details.

Job Title: Senior Data Engineer (GCP)
Job Location: Remote
Type of Hire: Contract

JD: This is a 100% hands-on coding role. The candidate is expected to code, should be an expert in SQL, and should know some programming languages to write the logic.

Interview Process:
Round 1: 1-hour video interview with a Quest Global technical person
Round 2: 1-hour technical interview with the customer
Round 3: 1-hour live Codility test in the presence of the customer
Round 4: 1-hour discussion with the customer's hiring authorities

If the candidate is fine with this process, I am interested in talking to them. This is a 6-month contract-to-hire role. The candidate should be independent and ready to join the customer full time after 6 months. There will be no absorption fee for the vendor. (At least 3 years of experience with GCP is required.)

Role Overview:
We are seeking a dynamic and highly skilled Senior Data Engineer who has extensive experience building enterprise-scale data platforms and can lead these foundational efforts. This role demands someone who not only possesses a profound understanding of the data engineering landscape but is also at the forefront of their game. The ideal candidate will contribute significantly to platform development with a diverse skill set while remaining very hands-on with coding and actively shaping the future of our data ecosystem.

What We Are Looking For:

Responsibilities:
As a senior/principal engineer, you will be responsible for the ideation, architecture, design, and development of a new enterprise data platform. You will collaborate with other cloud and security architects to ensure seamless alignment with our overarching technology strategy.
- Architect and design core components with a microservices architecture, abstracting platform and infrastructure intricacies.
- Create and maintain essential data platform SDKs and libraries, adhering to industry best practices.
- Design and develop connector frameworks and modern connectors to source data from disparate systems, both on-prem and cloud.
- Design and optimize data storage, processing, and querying performance for large-scale datasets using industry best practices while keeping costs in check.
- Architect and design the best security patterns and practices.
- Design and develop data quality frameworks and processes to ensure the accuracy and reliability of data.
- Collaborate with data scientists, analysts, and cross-functional teams to design data models, database schemas, and data storage solutions.
- Design and develop advanced analytics and machine learning capabilities on the data platform.
- Design and develop observability and data governance frameworks and practices.
- Stay up to date with the latest data engineering trends, technologies, and best practices.
- Drive the deployment and release cycles, ensuring a robust and scalable platform.

Requirements:
- 10+ years (senior) or 15+ years (principal) of proven experience in modern cloud data engineering, broad exposure to the wider data landscape, and solid software engineering experience.
- Prior experience architecting and building successful enterprise-scale data platforms in a greenfield environment is a must.
- Proficiency in building end-to-end data platforms and data services in GCP is a must.
- Proficiency in tools and technologies: BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, SQL, Python, Airflow, Pub/Sub.
- Experience with microservices architectures: Kubernetes, Docker, and Cloud Run.
- Experience building semantic layers.
- Proficiency in architecting, designing, and developing batch and real-time streaming infrastructure and workloads.
- Solid experience architecting and implementing metadata management, including data catalogs, data lineage, data quality, and data observability for big data workflows.
- Hands-on experience with the GCP ecosystem and data lakehouse architectures.
- Strong understanding of data modeling, data architecture, and data governance principles.
- Excellent experience with DataOps principles and test automation.
- Excellent experience with observability tooling: Grafana, Datadog.

Nice to have:
- Experience with Data Mesh architecture.
- Experience building semantic layers for data platforms.
- Experience building scalable IoT architectures.

Questions to ask every candidate:
- What is your experience with data platforms?
- What were your responsibilities and contributions?
- How hands-on are you?
- What was the most challenging data platform problem you personally solved, and how?
- How many years of production-level GCP experience do you have?
- What services are you most comfortable with?
- What is your experience with BigQuery?
- What were your individual contributions and responsibilities?
- What is your experience with data transformation, and what tools have you used?
- How much experience do you have with dbt or Dataform?
- What exactly did you develop?
- Do you have any experience with TypeScript or the Node.js framework?
- What is your experience with microservices?
- What microservices have you developed, and with what technologies?

Devendra Pratap Singh | Talent Acquisition Specialist
Amaze Systems Inc
USA: 8951 Cypress Waters Blvd, Suite 160, Dallas, TX 75019
Canada: 55 York Street, Suite 401, Toronto, ON M5J 1R7
D: +1 (469) 424-3431 | E: [email protected] | www.amaze-systems.com/
USA | Canada | UK | India

Amaze Systems is an Equal Opportunity Employer (EOE) and does not discriminate based on age, gender, religion, disability, marital status, or race, and adheres to laws relating to non-discrimination on the basis of national origin and citizenship status.
Posted: Mon Jul 29 23:56:00 UTC 2024