
Direct Client requirement for Data Architect with Databricks, AWS, Python at Remote, USA
Email: [email protected]
From: Satin Harper, Cohesive Technologies

Reply to: [email protected]

Role: Data Architect (Databricks/AWS/Python)

Plano, TX

Long Term Contract

Roles & Responsibilities:

Perform analysis, design, development, and configuration functions as well as define technical requirements for assignments of intermediate complexity.

Participate with a team to perform analysis, assessment and resolution for defects and incidents of intermediate complexity and escalate appropriately.

Work within guidelines set by the team to independently tackle well-scoped problems.

Seek opportunities to expand technical knowledge and capabilities.

Stand up data platforms, build ETL pipelines, write custom code, interface with data stores, perform data ingestion, and build data models.

Oversee data ingestion into enterprise data mining solutions

Ability to take ownership when necessary, acting with urgency, putting customers first, and planning for the future

Solid understanding of cloud technologies, enterprise level Data Strategy and Data Governance concepts

Familiarity with data visualization tools and methodologies is a plus

Develop data pipelines in AWS using a variety of data sets, along with Redshift

Strong familiarity and hands-on experience with Databricks, Data Factory, StreamSets

Experience writing code in Python/Scala.

Experience & Qualifications:

10+ years of experience in the architecture, design, implementation, and analysis of solutions

Hands-on experience designing and developing applications using Databricks

Experience designing solutions on AWS

Experience migrating data from other platforms to AWS

In-depth understanding of Spark architecture, including Spark Core, Spark SQL, DataFrames, Spark Streaming, RDD caching, and Spark MLlib.

Programming experience with SQL, stored procedures, and Spark/Scala.

Strong understanding of data modeling, including defining conceptual, logical, and physical data models.

Ability to design and present system architecture across different environments

Ability to work with both on-premises and cloud systems

Data Engineering experience

Experience working with ETL tools such as Databricks and Redshift

Experience coding in Python, and PySpark

Experience with Python SDLC tools (flake8, commitizen, CircleCI)

Comfortable working with APIs

Cloud experience, specifically working with AWS (ECS, Redshift)

Experience working with relational databases and SQL scripts.

Satin Harper

Senior Talent Acquisition Partner

Phone: (470) 668-2233

www.cohetech.com

LinkedIn: https://www.linkedin.com/in/satin-harper-0b9393229/

Wed Jan 25 14:50:00 UTC 2023



