
DBT Developer with Snowflake at Snowflake, Arizona, USA
Email: [email protected]
From:

Sanjeev Kumar Singh,

Tek Inspirations LLC

[email protected]

Reply to:   [email protected]

Hello,

I'm Sanjeev from TEK Inspirations.

We have a requirement for you; the details are as follows.

Please review the job specifications below and let me know if you are interested.

Please confirm that you received this email, and also send me your DL, visa, last 4 digits of your SSN, education details, and LinkedIn profile.

Job Description -

DBT Developer with Snowflake

Remote

Duration: 6+ months

Responsibilities:
Hands-on working knowledge of Snowflake architecture and lead-level administration (access control, provisioning, etc.) and DBT (Data Build Tool).
Should be proficient in data transformation and processing using DBT.
SnowPro data engineering certification is a plus.
Teradata and Snowflake experience.
Professional experience with source control, merging strategies, and coding standards, specifically Bitbucket/Git and deployment through Jenkins pipelines.
Demonstrated experience developing in a continuous integration/continuous delivery (CI/CD) environment using tools such as Jenkins and CircleCI.
Demonstrated ability to maintain the build and deployment process through the use of build integration tools.
Experience designing instrumentation into code and integrating with software logging and analysis tools such as log4Python, New Relic, SignalFx, and/or Splunk.
Conduct knowledge-sharing sessions and publish case studies. Take accountability for maintaining program or project documents in a knowledge base repository.
Identify accelerators and innovations. Understand complex interdependencies to identify the right team composition for delivery.

Required Skills:
Hands-on experience with DBT.
Working experience communicating with business stakeholders and architects.
Industry experience developing big data/ETL data warehouse solutions and building cloud-native data pipelines.
Experience in ETL, PySpark, Scala, Java, and SQL; strong object-oriented and functional programming experience in Python.
Experience working with REST and SOAP-based APIs to extract data for data pipelines.
Extensive experience working with Hadoop and related processing frameworks such as Spark, Hive, Sqoop, etc.
Experience working in a public cloud environment, particularly AWS, is mandatory.
Ability to implement solutions with AWS Virtual Private Cloud, EC2, AWS Data Pipeline, AWS CloudFormation, Auto Scaling, Amazon S3 (Simple Storage Service), EMR, Hive, Athena, and other AWS products.
Experience working with real-time data streams and the Kafka platform.
Working knowledge of workflow orchestration tools such as Apache Airflow, including designing and deploying DAGs.
Hands-on experience with performance and scalability tuning.
Professional experience in Agile/Scrum application development using JIRA

Additional Info -

Regards,

Sanjeev Kumar Singh

Technical Recruiter

IT Healthcare & Informatics

Email: [email protected]


