Requirement for the Position: GCP Data Engineer at Washington, DC, USA
Email: [email protected]
Position: GCP Data Engineer
Location: Washington, DC (remote is fine; must travel when the client requires; hybrid working)
Experience Level: only 12+ years profiles
Rate: $78

Responsibilities:
- Design and implement AI/ML models and solutions using Google Cloud Platform technologies, including Vertex AI, TensorFlow, and PyTorch
- Develop a deep understanding of generative AI models and techniques such as text-to-speech
- Evaluate AI models for applications such as content creation and design automation, Cloud Vision AI, and Natural Language AI
- Write and maintain high-quality Python code that is efficient, scalable, and well documented
- Collaborate with other engineers, researchers, and product managers to define requirements, brainstorm solutions, and iterate on designs
- Contribute to the development of internal tools and infrastructure for AI/ML development and research
- Stay up to date on the latest advancements in AI/ML research and development

Required Understanding:
- AI chatbots and document search
- MLOps
- Supervised learning, unsupervised learning, video analytics, image analytics, NLP, and so on
- Exploratory data analysis, descriptive analytics, prescriptive analytics, predictive analytics, and so on

Good to Have:
- GCP data engineering understanding, such as BigQuery, Dataflow, Cloud Data Fusion, Cloud Composer, Dataproc, and Cloud Functions

Thanks & Regards,
Akhil Reddy
Talent Acquisition

Keywords: artificial intelligence, machine learning, information technology
Wed Feb 28 21:29:00 UTC 2024