Data Engineer GCP Google BigQuery SQL at Dallas, Texas, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=1577888&uid=

From:

Rajeev,

Tek Inspirations LLC

[email protected]

Reply to:   [email protected]

Hello All,

I hope you are doing well.

I have a role of

Data Engineer (GCP / Google BigQuery / SQL)

Please let me know if you are comfortable with this role.

At the time of submission I will need a copy of the driver's license (DL) and visa.

Job Description -

Title: Data Engineer (GCP / Google BigQuery / SQL)

Visa: GC/USC/GC-EAD

Duration: 12+ months

Location: Dallas, TX or Little Rock, AR

MOI (Mode of Interview): In-person, 2nd round

Must have: visa, LinkedIn, driver's license (DL)

Local candidates only

This is for Simmons Bank in Dallas.

An in-person interview is required.

We had one candidate clear the first round and go in person; the client rejected him, saying he did not have strong enough SQL querying skills.

He had 6 years of GCP experience.

GCP is a must, but 6 years of GCP is not required; 2 years of solid GCP experience with any other cloud is also OK.

Experience with GCP and related services, strong SQL writing skills, and migration to GCP from SQL Server (DB) or any legacy DB or DW environment or on-prem data lake is preferred.
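Since strong SQL querying skill is the stated deciding factor, here is a minimal sketch of the kind of query an interview might probe (a window-function running total). The table, columns, and data are invented for illustration; SQLite is used only so the snippet runs locally, and BigQuery Standard SQL supports the same constructs with minor dialect differences.

```python
import sqlite3

# Hypothetical banking-style table for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (account_id INTEGER, txn_date TEXT, amount REAL);
    INSERT INTO transactions VALUES
        (1, '2024-01-01', 100.0),
        (1, '2024-01-02', 250.0),
        (2, '2024-01-01', 75.0),
        (2, '2024-01-03', 125.0);
""")

# Running balance per account, ordered by date -- a classic
# window-function exercise (SUM ... OVER with PARTITION BY).
rows = conn.execute("""
    SELECT account_id,
           txn_date,
           amount,
           SUM(amount) OVER (
               PARTITION BY account_id ORDER BY txn_date
           ) AS running_total
    FROM transactions
    ORDER BY account_id, txn_date
""").fetchall()

for row in rows:
    print(row)
```

The same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` query runs unchanged in the BigQuery console against a real table.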

About the job

Data Engineer (GCP / Google BigQuery / SQL)

    Location: Dallas, TX or Little Rock, AR

    Position is On-site 5 days/week.

    Client: Financial

    Contract: 12+ months

    Strong SQL query skills - required

    2 rounds of interviews - the 2nd round is in person in Dallas, TX 75225

    Candidates local to the Dallas area will be given preference.

The client is seeking a mid- to senior-level Data Engineer with experience in Google BigQuery, GCP, ETL pipelines, BI, cloud skills, and Microsoft SQL Server, with experience in both building and designing.

Job Description -

We are looking for a Google Cloud data engineer who wants to collaborate in an agile team of peers developing a cloud-based analytics platform that integrates data from a broad range of systems to enable next-generation analytical products.

The Data Engineering Google Cloud Platform (GCP) Engineer is responsible for developing and delivering effective cloud solutions for different business units. This position requires in-depth knowledge and expertise in GCP services, architecture, and best practices. They will collaborate with cross-functional teams to design, implement, and manage scalable and reliable cloud solutions. This position is also responsible for driving innovation and staying up to date with the latest GCP technologies and trends to provide industry-leading solutions.

Responsibilities:

You will work directly on the platform, based on Google BigQuery and other GCP services, to integrate new data sources and model the data up to the serving layer.

Contribute to this unique opportunity, as the program is set up to completely rethink reporting and analytics with cloud technology.

Collaborate with different business groups and users to understand their business requirements, and design and deliver the GCP architecture and data engineering scope of work.

You will work on a large-scale data transformation program with the goal to establish a scalable, efficient and future-proof data & analytics platform.

Develop and implement cloud strategies, best practices, and standards to ensure efficient and effective cloud utilization.

Work with cross-functional teams to design, implement, and manage scalable and reliable cloud solutions on GCP.

Provide technical guidance and mentorship to the team to develop their skills and expertise in GCP.

Contribute to the bank's multiyear data analytics modernization roadmap.

Stay up-to-date with the latest GCP technologies, trends, and best practices and assess their applicability to client solutions.

Qualifications:

What will help you succeed

Bachelor's degree in computer science/IT

Master's in Data Analytics/Information Technology/Management Information Systems (preferred)

Strong understanding of data fundamentals, knowledge of data engineering and familiarity with core cloud concepts

Must have good implementation experience with various GCP data storage and processing services such as BigQuery, Dataflow, Bigtable, Dataform, Data Fusion, Cloud Spanner, and Cloud SQL

Must have programming experience with SQL, Python, and Apache Spark

At least 3-5 years of professional experience in building data engineering capabilities for various analytics portfolios with at least 2 years in GCP/Cloud based platform.

Your expertise in one or more of the following areas is highly valued:

Google Cloud Platform, ideally with Google BigQuery, Cloud Composer, Cloud Data Fusion, Cloud Spanner, and Cloud SQL

Experience with legacy data warehouses (on SQL Server or any relational data warehouse platform)

Experience with our main tools: dbt (Data Build Tool), Terraform/Terragrunt, and Git (CI/CD)

Experience with a testing framework.

Experience with Business Intelligence tools like Power BI and/or Looker.

What sets you apart:

Experience in complex migrations from legacy data warehousing solutions or on-prem Data Lakes to GCP

Experience with building generic, re-usable capabilities and understanding of data governance and quality frameworks.

Experience in building real-time ingestion and processing frameworks on GCP.

Adaptability to learn new technologies and products as the job demands.

Multi-cloud & hybrid cloud experience

Any cloud certification (Preference to GCP Certifications)

Experience working in the financial and banking industry

Featured benefits

Medical insurance, Vision insurance, Dental insurance

Requirements added by the job poster

    Bachelor's Degree

    Working in an onsite setting

    4+ years of work experience with SQL

Regards,

Rajeev Kharwar

Sr. Technical Recruiter | DevOps Specialist

TEK Inspirations LLC

13573 Tabasco Cat Trail, Frisco, TX 75035

Desk #: 469-393-0216

E: [email protected]

WhatsApp: 7525894499

LinkedIn: https://www.linkedin.com/in/rajeev-kharwar-8869251b5/

Reach out if you have candidates for: Site Reliability, AWS/Azure, Cloud Computing, GCP, Infrastructure, ETL/Informatica, Data Warehouse, DataStage, Data Modeling, Python, Data Engineer, Data Scientist, ML, Data Architect, Hadoop, Big Data.

Disclaimer:

If you are not interested in receiving our e-mails, please reply with "REMOVE" in the subject line to [email protected], and list all the e-mail addresses to be removed, including any addresses that might be diverting the e-mails to you.

We are sorry for the inconvenience.

03:47 AM 19-Jul-24


