
GCP Data Engineer with 10+ years of experience || Irving, Texas - locals ONLY || NON JNTU & OSMANIA at Irving, Texas, USA
Email: [email protected]
Hello,

Hope you are doing well.

Role: GCP Data Engineer with 10+ years of experience

Location: Irving, Texas - locals ONLY

Onsite: 2 days a week, Tue and Thu

Client- Verizon

Skills: GCP, Teradata, analytical skills, BigQuery, GCP services, Composer, Dataflow, SQL - hands-on.

Overview

The GCP Teradata Engineer will be responsible for developing and supporting database applications that drive automated data collection, storage, visualization and transformation as per business needs. The candidate will uphold Prodapt's winning values and work in a way that contributes to the Company's vision.

Roles & Responsibilities:

Data Pipeline and Software Implementation

Write Teradata SQL and Oracle code according to established design patterns.

Analyze, design, code, and test complex ETL processes for data warehouses and operational data stores.

Implement data pipelines as per the design document.

Database design, Data Modelling and Mining

Consolidate data across multiple sources and databases to make it easier to locate and access.

Implement automated data collection and data storage systems.

Provide database support by coding utilities, and respond to and resolve user problems.

Cloud Enablement

Develop and deploy applications at the direction of leads, including large-scale data processing, computationally intensive statistical modeling, and advanced analytics.

Data Visualization and Presentation

Write complex SQL queries (T-SQL/PL-SQL) and stored procedures.

Should have GCP cloud experience.

Familiarity with the technology stack available in the industry for data management, data ingestion, capture, processing and curation.

Requirements:

8+ years of experience.

ETL development experience with a strong SQL background, analyzing huge data sets, trends and issues, and creating structured outputs.

Minimum of 5 years of experience with Teradata utilities such as BTEQ, FastLoad, MultiLoad and TPT (Teradata Parallel Transporter).

Experience with Hive queries pointing to external tables.

Experience in building high-performing data processing frameworks leveraging Google Cloud Platform and Teradata.

Oozie scheduling and GCP Airflow scheduling.

Experience in building data pipelines supporting both batch and real-time streams to enable data collection, storage, processing, transformation and aggregation.

Spark Streaming jobs.

Experience in utilizing GCP services such as BigQuery, Composer, Dataflow, Pub/Sub and Cloud Monitoring.

Experience creating custom GCP Dataflow templates.

Experience in performing ETL and data engineering work by leveraging multiple Google Cloud components, including Dataflow, Dataproc and BigQuery.

Experience with scheduling tools such as Airflow, Cloud Composer, etc.

Understand ETL application design, data sources, data targets, relationships, and business rules.

Experience with JIRA or other project management tools.

Experience with CI/CD automation pipelines facilitating automated deployment and testing.

Experience with bash shell scripts, UNIX utilities and UNIX commands.

Minimum of a Bachelor's degree.

Thu Oct 03 00:19:00 UTC 2024
