Urgent Need | GCP Data Architect Role | Warren, NJ, USA (Initial Remote)
Email: [email protected]
From: Satya Prakash, Envision Tech Sol. <[email protected]>
Reply-To: [email protected]

Hi,

Hope you are doing great! Please find the requirement below. If you are comfortable with it, please reply with your updated resume and I will get back to you, or give me a call at 315-275-5700.

Role: GCP Data Architect
Duration: Long-term contract
Location: Warren, NJ (initial remote)

Job Description

Primary Skills (expert level):
- GCP data architecture; designing streaming & batch pipelines
- Data processing services: Dataproc (PySpark) & Dataflow (Apache Beam)
- Cloud databases: Spanner, Cloud SQL, Memorystore, BigQuery
- Looker Studio & Operations Suite (Cloud Monitoring and Logging)
- Google Certified Data Engineer

Secondary Skills:
- Python, Flask, FastAPI development

Role Description:
- Develop data solutions in distributed microservices and full-stack systems
- Responsible and accountable for requirement gathering and for creating the architecture, high-level, and detailed technical designs using the GCP data-focused reference architecture
- Analyze the newest technologies and suggest the optimal solutions that best satisfy current requirements and simplify future modifications
- Provide design expertise in Master Data Management, Data Quality, and Metadata Management
- Design systems that provide complete observability and support continuous improvement in DevOps automation
- Design appropriate data models for use in transactional and big-data environments as input into machine-learning processing
- Design and build the infrastructure needed for optimal ETL from a variety of data sources onto GCP services
- Experience with modern SQL and NoSQL data stores
- Ensure clarity on NFRs and implement these requirements
- Develop data and semantic interoperability specifications
- Build relationships with client stakeholders to establish a high level of rapport and confidence
- Work with clients, local teams, and offshore resources to deliver modern data products
- Collaborate with several external vendors to support data acquisition
- Knowledge of the SAFe delivery model; groom requirements for the PI and sprint backlogs and identify all technical dependencies between teams

Thanks and Regards,

Satya Prakash
Email: [email protected]
Direct: 315-275-5700 Ext. 112
Sat Jan 14 02:09:00 UTC 2023