
Sr. Cloud DataOps Engineer (GCP) || Remote || CTH || Visa: USC/GC at Remote, USA
Email: [email protected]
Hi,

Hope you are doing great! Please find the requirement below. If you find yourself comfortable with it, please reply with your updated resume and I will get back to you, or I would really appreciate it if you could give me a call at my contact number:

(302)-485-1559

Position: Sr. Cloud DataOps Engineer (GCP)

Location: Remote

Duration: 6 Months - CTH

Experience: 12 Years

Visa: USC/GC

Job Summary:

The Senior Cloud DataOps Engineer plays a pivotal role in designing, building, and maintaining scalable, reliable, and efficient data pipelines and infrastructure on the Google Cloud Platform (GCP). This expertise will be instrumental in ensuring seamless data flow, optimizing performance, and driving data-driven decision-making within our organization. The incumbent is required to work within cloud environments with complex data interdependencies, visibility constraints, privacy protections, and security protocols, and is expected to possess sufficient business, data, and cloud-environment expertise to handle complex, advanced data initiatives. The incumbent is self-motivated, able to work under limited supervision from management, can manage large and/or complex projects or multiple projects simultaneously, and is responsible for making connections and integrating work across teams and departments. The incumbent knows how to interact with the business, set up projects, define goals, set expectations, and communicate results in language the business understands.

Working closely with leaders of the Data and Analytics Department and the Network Technology and Operations Department, the incumbent ensures that work is prioritized in alignment with business needs and that resources are appropriately allocated. The incumbent will lead DataOps process-improvement efforts to enhance team effectiveness in leveraging advanced cloud computing and storage techniques in our Google Cloud environment.

Responsibilities:

Advanced Cloud Storage Technologies: Understand the various cloud data storage technologies and their appropriate use cases, including columnar (e.g., BigQuery), NoSQL (e.g., BigTable), and relational (e.g., AlloyDB) stores.

Data Transformations and Data Fusion: Explore and preprocess data from various sources, measure and ensure data quality and integrity, and develop operational data pipelines for ETL/ELT using Python, SQL, and other relevant cloud technologies, as in the sketch below.
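
By way of illustration, a minimal ELT sketch in Python that loads a raw file into a BigQuery staging table and then transforms it with SQL; the project, bucket, dataset, and table names are hypothetical placeholders:

```python
# Minimal ELT sketch: load raw CSV data into BigQuery, then transform in SQL.
# All project/bucket/dataset/table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project

# Extract/Load: ingest a raw file from Cloud Storage into a staging table.
load_job = client.load_table_from_uri(
    "gs://my-bucket/raw/orders.csv",       # hypothetical source file
    "my-gcp-project.staging.orders_raw",   # hypothetical staging table
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    ),
)
load_job.result()  # block until the load completes

# Transform: materialize a cleaned table with SQL inside BigQuery.
client.query(
    """
    CREATE OR REPLACE TABLE `my-gcp-project.analytics.orders` AS
    SELECT order_id, customer_id, CAST(amount AS NUMERIC) AS amount
    FROM `my-gcp-project.staging.orders_raw`
    WHERE order_id IS NOT NULL
    """
).result()
```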

Data Cleansing, De-Identification, Data Masking: Use appropriate DataOps processes to ensure data accuracy and consistency by identifying and correcting errors, protect sensitive information by removing or obfuscating personal identifiers, and secure data by replacing sensitive information with cryptographic hashes, as sketched below.
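
A minimal Python sketch of pseudonymization via keyed hashing (HMAC-SHA256); the key handling and field names are hypothetical, and in practice the key would come from a secret manager:

```python
# Sketch: replace direct identifiers with keyed cryptographic hashes.
# The key and field names are hypothetical; load the key from a secret
# manager in real pipelines, never hard-code it.
import hashlib
import hmac

SECRET_KEY = b"replace-with-key-from-secret-manager"  # hypothetical key

def pseudonymize(value: str) -> str:
    """Return a stable, non-reversible token for a sensitive value."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"patient_id": "A-12345", "zip": "19801", "amount": 42.50}
masked = {**record, "patient_id": pseudonymize(record["patient_id"])}
print(masked)  # patient_id is now an opaque 64-character hex digest
```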

Containerization: Leverage containers and containerization technologies such as Kubernetes (including GKE), Dataproc, or similar cloud compute orchestration.

Cloud Processing Composition: Use Cloud Composer and Apache Airflow, or similar cloud/cluster task-orchestration tooling, to compose processing workflows such as the DAG sketched below.
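
A minimal Apache Airflow DAG sketch with two dependent tasks; the DAG id and task callables are hypothetical:

```python
# Minimal Cloud Composer / Apache Airflow DAG: two dependent tasks.
# The dag_id and task callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from the source system")

def load():
    print("load transformed data into the warehouse")

with DAG(
    dag_id="example_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # extract must finish before load starts
```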

Infrastructure Management: Provision, configure, and manage GCP resources such as Compute Engine, Cloud SQL, and BigQuery to support data processing and analytics workloads (see the sketch below).
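
Provisioning is often done declaratively with IaC tools such as Terraform; as a lightweight illustration, here is a Python sketch that creates a BigQuery dataset idempotently (project and dataset ids are hypothetical):

```python
# Sketch: programmatically provision a BigQuery dataset if it is absent.
# Project and dataset ids are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project

dataset = bigquery.Dataset("my-gcp-project.analytics")
dataset.location = "US"
client.create_dataset(dataset, exists_ok=True)  # no-op if it already exists
print(f"Dataset ready: {dataset.dataset_id}")
```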

DevOps Practices: Implement and promote DevOps principles, including CI/CD pipelines, infrastructure as code (IaC), and monitoring and alerting.

Data Quality and Governance: Ensure data quality and integrity throughout the data lifecycle, implementing data validation, cleansing, and governance measures such as the validation gate sketched below.
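
A minimal Python sketch of a data-quality gate that fails the pipeline when validation queries report problems; the table name and checks are hypothetical:

```python
# Sketch: fail fast when data-quality checks report offending rows.
# The table name and check queries are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project

checks = {
    "null_order_ids": """SELECT COUNT(*) FROM `my-gcp-project.analytics.orders`
                         WHERE order_id IS NULL""",
    "negative_amounts": """SELECT COUNT(*) FROM `my-gcp-project.analytics.orders`
                           WHERE amount < 0""",
}

for name, sql in checks.items():
    bad_rows = next(iter(client.query(sql).result()))[0]  # single COUNT(*) value
    if bad_rows:
        raise ValueError(f"data-quality check {name!r} failed: {bad_rows} rows")
print("all data-quality checks passed")
```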

Performance Optimization: Analyze and optimize data pipeline performance, identifying bottlenecks and implementing strategies to improve efficiency.

Troubleshooting and Support: Diagnose and resolve data pipeline and infrastructure issues promptly, providing technical support to data engineers and data scientists.

Mentorship: Provide guidance and mentorship to data engineers and data scientists within the Data and Analytics team. Serve as a mentor and role model, imparting data workflow, transformation, and orchestration knowledge, experience, and skills to other staff at all levels, either individually or as a member of project teams.

Continuous Learning: Stay abreast of industry trends, emerging technologies, and best practices in cloud data operations.

Question, validate, and perform quality assurance on data to ensure integrity and consistency, supporting ongoing data quality assessment and improvement initiatives.

Understand all applicable data privacy and security laws, rules, regulations, and contractual restrictions, and follow all the company data governance and data usage rights policies and procedures.

Effectively communicate with and engage colleagues at all levels of the organization.

Effectively distribute responsibilities to the appropriate people and levels.

Develop internal and external networks of contacts and have a positive influence on those networks.

Play a key role in supporting corporate and senior leadership initiatives to realize company goals.

Qualifications:

Basic Requirements:

Bachelor's degree in Computer Science, Data Science, or another related field, or equivalent experience

5+ years of experience in DevOps roles, with a strong focus on GCP or other Cloud providers

Experience with DevOps practices, CI/CD pipelines, REST API integrations, and infrastructure as code (IaC)

Strong knowledge of data warehouse/data lake, ETL/ELT, and reporting-layer solutions

5+ years of experience in healthcare transaction data, including QA, testing, and reporting

Fluency in a relevant cloud programming language such as Python, Java, or JavaScript

Proficiency with SQL; relational, NoSQL, and columnar databases; and structured and unstructured data

Familiarity with data quality and governance concepts and implementation

Proficiency with cloud processing orchestration technologies such as Apache Airflow, Dataproc, and Kubernetes (e.g., GKE)

Knowledge of privacy laws and regulations around health data (HIPAA) and PII

Ability to work independently and as part of a team, with excellent collaboration and communication skills

Preferred Qualifications:

Master's degree in a computational field (e.g., Computer Science or Data Science)

Google Cloud environment data storage expertise (BigQuery, BigTable, and AlloyDB)

Health IT industry knowledge with a particular focus on e-prescribing

DataOps experience: Data Fusion with CDAP, ETL/ELT, SQL, Python

Apache Airflow for DataOps Orchestration

Certifications related to GCP Cloud DevOps Engineer, e.g., IaC with Terraform

Experience with containerization technologies (e.g., Docker, Kubernetes)

Knowledge of cloud security best practices

Experience with data visualization tools (e.g., Looker, Tableau)

Thanks and Regards,

Pankaj Chauhan

Cell: (302)-485-1559

[email protected]

https://www.linkedin.com/in/pankajschauhan/

Accroid Inc.

1007 Orange St., 4th Fl. #1651, Wilmington, DE 19801
