GCP Data Engineer at Remote, USA
Email: [email protected]
From: Muddu Krishna, RITWIK Infotech, Inc [email protected]
Reply to: [email protected]

Position: GCP Data Engineer
Duration: 6+ months
Location: MI (onsite)
Visa: No H1B
Primary Skills: Terraform, BigQuery, BigTable

Job Description:
- Minimum of 4+ years of experience as a Data Engineer, with a strong focus on GCP or similar cloud providers such as Azure and AWS.
- Extensive hands-on experience with GCP services (or their Azure/AWS equivalents) and tools such as Terraform, BigQuery, BigTable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc, Cloud Build, Airflow, Cloud Composer, Tekton, and Vertex AI.
- Proficiency in designing and implementing large-scale data solutions using GCP services, ensuring scalability, reliability, and performance.
- Strong knowledge of data integration, transformation, and processing techniques leveraging GCP services and tools.
- Experience with infrastructure automation using Terraform for GCP resource provisioning and management.
- Solid understanding of CI/CD practices and experience with Tekton and other relevant tools for building data engineering pipelines.
- In-depth knowledge of data storage and retrieval mechanisms using GCP services such as BigQuery, BigTable, and Google Cloud Storage.
- Familiarity with data orchestration and workflow management using GCP services like Dataproc, Cloud Build, and Airflow.
- Strong proficiency in big data technologies, including HDFS, Hive, Sqoop, Spark, PySpark, Scala, and Python.
- Proven experience building end-to-end machine learning pipelines and deploying ML models in production.
- Familiarity with ML frameworks such as TensorFlow, PyTorch, or Scikit-learn.
- Good Python programming skills.

Keywords: continuous integration, continuous deployment, artificial intelligence, machine learning, Michigan
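As an illustration of the Terraform-based GCP provisioning the role calls for, here is a minimal sketch that declares a BigQuery dataset. The project ID, dataset ID, and location are placeholder assumptions, not values from this posting:

```hcl
# Minimal Terraform sketch: provision a BigQuery dataset on GCP.
# Project ID, dataset ID, and location are placeholders for illustration.
terraform {
  required_providers {
    google = {
      source = "hashicorp/google"
    }
  }
}

provider "google" {
  project = "my-example-project"  # placeholder project ID
  region  = "us-central1"         # placeholder region
}

resource "google_bigquery_dataset" "analytics" {
  dataset_id  = "analytics_staging"  # placeholder dataset name
  location    = "US"
  description = "Example staging dataset managed by Terraform."
}
```

A candidate would typically run `terraform init`, `terraform plan`, and `terraform apply` against such a configuration, with state stored remotely (e.g., in a GCS bucket) for team use.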
Thu Nov 02 23:13:00 UTC 2023