GCP Data Engineer at Remote, USA
Email: [email protected] |
Hello Associate,

Hope you are doing well. We have the below requirement open. Please send your genuine candidates to my email ID: [email protected].

Position: GCP Data Engineer
Location: Lansing, MI (onsite)
Duration: Long term
Visa: No CPT or H-1B
Primary Skills: Terraform, BigQuery, Bigtable

Job Responsibilities:
- At least 4 years of experience as a Data Engineer, with a strong focus on GCP or similar cloud providers such as Azure and AWS.
- Extensive hands-on experience with GCP services (or their Azure/AWS equivalents) and tools such as Terraform, BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc, Cloud Build, Airflow, Cloud Composer, Tekton, and Vertex AI.
- Proficiency in designing and implementing large-scale data solutions using GCP services, ensuring scalability, reliability, and performance.
- Strong knowledge of data integration, transformation, and processing techniques, leveraging GCP services and tools.
- Experience with infrastructure automation using Terraform for GCP resource provisioning and management.
- Solid understanding of CI/CD practices, with experience using Tekton and other relevant tools to build data engineering pipelines.
- In-depth knowledge of data storage and retrieval mechanisms using GCP services such as BigQuery, Bigtable, and Google Cloud Storage.
- Familiarity with data orchestration and workflow management using GCP services such as Dataproc, Cloud Build, and Airflow (see the orchestration sketch after this list).
- Strong proficiency in big data technologies, including HDFS, Hive, Sqoop, Spark, PySpark, Scala, and Python (see the PySpark sketch after this list).
- Proven experience building end-to-end machine learning pipelines and deploying ML models in production.
- Familiarity with ML frameworks such as TensorFlow, PyTorch, or Scikit-learn.
- Good Python programming skills.

Keywords: continuous integration, continuous deployment, artificial intelligence, machine learning, information technology, Idaho, Michigan
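To illustrate the kind of orchestration work the role involves, below is a minimal sketch of an Airflow DAG that runs a scheduled BigQuery transformation. The project, dataset, and table names are hypothetical placeholders, not part of the posting; this is an assumed example, not the employer's actual pipeline.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical identifiers for illustration only.
PROJECT_ID = "my-gcp-project"
DATASET = "analytics"

with DAG(
    dag_id="daily_bigquery_load",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Run a daily SQL aggregation in BigQuery and overwrite the result table.
    transform = BigQueryInsertJobOperator(
        task_id="transform_orders",
        configuration={
            "query": {
                "query": f"""
                    SELECT order_id, SUM(amount) AS total
                    FROM `{PROJECT_ID}.{DATASET}.orders`
                    GROUP BY order_id
                """,
                "destinationTable": {
                    "projectId": PROJECT_ID,
                    "datasetId": DATASET,
                    "tableId": "order_totals",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

On GCP this style of DAG would typically run on Cloud Composer, the managed Airflow service named in the skills list.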
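For the big data skills (Spark/PySpark on Dataproc feeding BigQuery), here is a minimal PySpark sketch under assumed names: the GCS bucket, staging bucket, and BigQuery table are hypothetical, and the BigQuery write assumes the spark-bigquery connector JAR is available on the cluster.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("gcs-to-bigquery").getOrCreate()

# Read raw Parquet files from Google Cloud Storage
# (Dataproc clusters ship with the GCS connector preinstalled).
events = spark.read.parquet("gs://my-raw-bucket/events/")

# Simple transformation: daily event counts per user.
daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("user_id", "event_date")
    .count()
)

# Write results to BigQuery via the spark-bigquery connector,
# staging intermediate files in a temporary GCS bucket.
(
    daily_counts.write.format("bigquery")
    .option("table", "my-gcp-project.analytics.daily_event_counts")
    .option("temporaryGcsBucket", "my-staging-bucket")
    .mode("overwrite")
    .save()
)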
[email protected] View all |
Posted: Thu Nov 02 20:24:00 UTC 2023