Urgent Hiring: GCP Data Architect at Phoenix, Arizona, USA
Email: [email protected]
From: Vinni, IndusAiLabs [email protected]
Reply to: [email protected]

Hello,

Hope you are doing well. Please find the requirement below and reply with a suitable resume.

Position: GCP Data Architect
Location: Phoenix, AZ (Onsite)
Duration: 12+ months

Must Have: 12+ years of experience. Passport number and LinkedIn profile are mandatory for submission.

Minimum Qualifications:
- Bachelor's degree in Engineering, Computer Science, or equivalent, OR a Master's in Computer Applications or equivalent.
- Solid experience with and understanding of the considerations for large-scale architecting, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must.
- Create detailed target-state technical, security, data, and operational architecture and design blueprints incorporating modern data technologies and cloud data services, demonstrating the modernization value proposition.
- Minimum of 12 years of designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools: Spark, Hive, Cloud Dataproc, Cloud Dataflow, Apache Beam/Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and GitHub.
- Performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP cloud.
- Experience with data lake and data warehouse ETL build and design.
- Experience with Google Cloud services such as streaming + batch, Cloud Storage, Cloud Dataflow, Dataproc, DFunc, BigQuery, and Bigtable.
- Proven ability in one or more of the following programming or scripting languages: Python, JavaScript, Java.

--
Thanks & Regards,
Vinni | Technical Recruiter
Indus AiLabs
Phoenix, AZ, USA 85004
Email: [email protected]
Website: www.indusailabs.com
Thu Feb 22 23:28:00 UTC 2024