
GCP Data Engineer -- Dallas TX -- only locals at Dallas, Texas, USA
Email: [email protected]
From: Rakesh, Blue Ocean Ventures ([email protected])

Reply to: [email protected]

Hi,

GCP Data Engineer

Dallas, TX -- only locals

A Brief Overview:

Designs, builds, and maintains large-scale data infrastructure and data processing systems. Implements robust and scalable solutions to support data-driven applications, analytics, and business intelligence.
The role has two parts. First, migrating the historical data from Teradata into the new GCP system. Second, and more strategically, replacing the current pattern in which upstream data sources load data directly into Teradata: the engineer will build pipelines that take the data from the source teams and load it directly into BigQuery.
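The pipeline pattern described above can be sketched as a minimal extract-transform-load flow. This is a hedged illustration only: all names (`extract`, `transform`, `load`) are hypothetical, and the in-memory list stands in for the BigQuery destination, which a real pipeline would reach via the `google-cloud-bigquery` client or Dataflow.

```python
# Minimal ETL sketch (stdlib only; names are illustrative).
# A real pipeline would load the batch into BigQuery instead of a list.
import csv
import io
from typing import Iterator

def extract(raw_csv: str) -> Iterator[dict]:
    """Read rows from an upstream CSV export."""
    yield from csv.DictReader(io.StringIO(raw_csv))

def transform(row: dict) -> dict:
    """Normalize string fields to typed values before loading."""
    return {"id": int(row["id"]), "amount": float(row["amount"])}

def load(rows: Iterator[dict], sink: list) -> int:
    """Append transformed rows to the sink; return the row count."""
    batch = [transform(r) for r in rows]
    sink.extend(batch)
    return len(batch)

source = "id,amount\n1,9.99\n2,5.00\n"
table: list = []
loaded = load(extract(source), table)
```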

What you will do:

Ensures seamless integration of data from different sources, such as databases, application programming interfaces (APIs), or streaming platforms.

Optimizes data processing and query performance by fine-tuning data pipelines, database configurations, and data partitioning strategies.
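One common partitioning strategy on GCP is daily date partitioning, as with BigQuery's ingestion-time or column-based partitioned tables. The sketch below groups rows by a `YYYYMMDD` partition key; the field name `event_date` and the daily granularity are assumptions for illustration.

```python
# Sketch of date-based partitioning using BigQuery's YYYYMMDD
# suffix convention (field names are hypothetical).
from collections import defaultdict
from datetime import date

def partition_key(event_date: date) -> str:
    """Daily partition key in YYYYMMDD form."""
    return event_date.strftime("%Y%m%d")

def partition_rows(rows: list) -> dict:
    """Group rows into per-day partitions."""
    parts = defaultdict(list)
    for row in rows:
        parts[partition_key(row["event_date"])].append(row)
    return dict(parts)

rows = [
    {"event_date": date(2024, 8, 1), "v": 1},
    {"event_date": date(2024, 8, 1), "v": 2},
    {"event_date": date(2024, 8, 2), "v": 3},
]
parts = partition_rows(rows)
```

Pruning queries to only the partitions they need is what makes this strategy pay off for large historical tables.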

Establishes data quality checks and validations to identify and resolve data issues, ensuring high-quality and reliable data for downstream applications and analytics.
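Row-level quality checks of the kind described above typically cover completeness (required fields present and non-null) and validity (correct types). A minimal sketch, assuming a hypothetical schema with `member_id` and `claim_amount` fields:

```python
# Hedged sketch of row-level data quality checks; the schema
# and field names are illustrative, not from the actual system.
def check_row(row: dict, required: dict) -> list:
    """Return a list of issue strings; an empty list means the row passed."""
    issues = []
    for field, expected_type in required.items():
        if row.get(field) is None:
            issues.append(f"missing:{field}")
        elif not isinstance(row[field], expected_type):
            issues.append(f"bad_type:{field}")
    return issues

schema = {"member_id": int, "claim_amount": float}
good = check_row({"member_id": 42, "claim_amount": 12.5}, schema)
bad = check_row({"member_id": None, "claim_amount": "x"}, schema)
```

Rows that fail such checks are usually quarantined or routed to a dead-letter table rather than silently dropped.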

Implements security measures to protect sensitive data throughout the data lifecycle by working closely with security teams to ensure data encryption, access controls, and compliance with data protection regulations.

Collaborates with cross-functional teams, including data scientists, analysts, software engineers, and business stakeholders.

Designs and develops data infrastructure, including data warehouses, data lakes, and data pipelines.

Establishes auditing and monitoring mechanisms to track data access and maintain data governance standards.

Establishes monitoring and alerting mechanisms to identify bottlenecks, latency, or failures, and troubleshoots data-related problems, investigates root causes, and implements remediation measures.
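A simple form of the latency monitoring described above is flagging pipeline runs whose duration exceeds a threshold so an alert can be raised. The run structure and the 60-second threshold below are assumptions for illustration:

```python
# Sketch of a latency check over pipeline run records
# (record shape and threshold are hypothetical).
def find_slow_runs(runs: list, threshold_s: float) -> list:
    """Return the ids of runs whose duration exceeds the threshold."""
    return [r["id"] for r in runs if r["duration_s"] > threshold_s]

runs = [
    {"id": "run-1", "duration_s": 30.0},
    {"id": "run-2", "duration_s": 95.0},
]
slow = find_slow_runs(runs, threshold_s=60.0)
```

In practice the flagged ids would feed an alerting channel (e.g. Cloud Monitoring) rather than a list.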

Explores new frameworks, platforms, or cloud services that can enhance data processing capabilities.

Leverages DevSecOps practices.

Position Summary:

Understands the enterprise data systems and acquires knowledge of the relevant processes needed for project delivery.

Participates in the project estimation process and provides inputs to the Tech Lead.

Participates in Agile scrum activities and project status meetings on a regular basis.

Participates in user story grooming and design discussions with the technical lead.

Analyzes complex data structures from disparate data sources and designs large-scale data engineering pipelines.

Uses strong programming skills to build robust data pipelines for ETL (Extract/Transform/Load) processes, designs database systems, and develops tools for data processing.

Performs all data engineering activities: EDW/ETL project development, testing, and deployment.

Works closely with developers on ETL job/pipeline development.

Creates project process automation by integrating the involved components.

Documents data engineering processes, workflows, and systems for reference and knowledge-sharing purposes.

Implements data quality checks and validation processes to ensure the accuracy, completeness, and consistency of the data.

Is a team player and works with team members on business solutions and implementation.

Preferred Qualifications:

GCP experience: BigQuery, Cloud SQL, Python, Cloud Composer/Airflow, Cloud Storage, and Dataflow/Data Fusion.

Hands-on experience building and deploying data transformation and processing solutions using Teradata utilities (BTEQ, TPT, FastLoad) and SQL queries.

GCP - Data Engineer certification strongly preferred.

Ability to leverage multiple tools and programming languages to analyze and manipulate data sets from disparate data sources.

Strong problem-solving skills and critical thinking ability

Strong collaboration and communication skills within and across teams

Knowledge of Flask, Django, JavaScript, HTML, and CSS.

Knowledge of BI tools such as MicroStrategy and Tableau.

Must understand software development methodologies including waterfall and agile.

Health Care/PBM domain experience

Excellent communication and presentation skills.

Required Qualifications:

7+ years of experience building and executing data engineering pipelines.

6+ years of experience with Python.

7+ years of experience with SQL.

7+ years of hands-on experience with bash shell scripts, UNIX utilities, and UNIX commands.

5+ years of hands-on experience with a major cloud platform (GCP: BigQuery, Cloud SQL, etc.).

5+ years of experience with various databases (Teradata, DB2, Oracle, SQL Server, etc.).

Healthcare experience and PBM systems knowledge preferred.

Thu Aug 08 19:18:00 UTC 2024


