
GCP Data Architect at SFO, California, USA

From: SHASHI, Cloudingest ([email protected])

Reply to: [email protected]

GCP Data Architect

Location: SFO, CA (remote position)

Duration: 12+ months

Interview: Video

Client input:

Is a highly effective communicator who can work with both stakeholders and the data team. Someone who can elicit refined requirements and collaborate effectively with a global team in a fast-moving environment to execute on them.

Is a skilled programmer (Python preferred) who follows best practices and, importantly, is highly proficient in using Spark. Ideally, they should be able to mentor other engineers on the team in following best practices.

Has significant experience building scalable and automated data processes, including the underlying infrastructure that supports them.

Has data modeling and cloud experience. 

Must have: GCP cloud experience, solutions architect experience, Python, PySpark, and architecture/solutioning experience.

Responsibilities:

Collaborate and coordinate with stakeholders to ensure project delivery from requirements through user acceptance testing.

Establish scalable, efficient, automated processes for data analyses, model development, validation, and implementation.

Work closely with data scientists and analysts to create and deploy new features.

Write efficient and well-organized software to ship products in an iterative, continual-release environment.

Report key insight trends, using statistical rigor to simplify and inform the larger team of noteworthy storylines that impact the business.

Monitor and plan out core infrastructure enhancements.

Contribute to and promote good software engineering practices across the team.

Mentor and educate team members to adopt best practices in writing and maintaining production code.

Communicate clearly and effectively to technical and non-technical audiences.

Actively contribute to and re-use community best practices.

Minimum Qualifications:

University or advanced degree in engineering, computer science, mathematics, or a related field

Strong experience working with a variety of relational SQL and NoSQL databases.

Strong experience working with big data tools (Hadoop, Spark, Kafka, etc.).

Strong experience with object-oriented or functional scripting languages: Python, Java, C++, Scala, etc.

Ability to work on the Linux platform.

Strong knowledge of data pipeline and workflow management tools

Expertise in standard software engineering methodology, e.g. unit testing, code reviews, design documentation

Experience creating data pipelines that appropriately prepare data for ingestion and consumption.

Experience in setting up, maintaining, and optimizing databases/filesystems for production usage in reporting and analytics.

Experience with workflow orchestration (Airflow, Tivoli, etc.)

Working knowledge of GitHub and the Git toolkit.

Ability to work in a collaborative environment and interact effectively with technical and non-technical team members alike (good verbal and written English).

Relevant working experience with Containerization (Docker and Kubernetes) preferred.

Experience working with APIs (Data as a Service) preferred.

Experience with data visualization using Tableau, Power BI, Looker, or similar tools is a plus.

Posted: Thu Apr 11 01:00:00 UTC 2024



