
Urgently Looking for a Data Architect in Atlanta, Georgia, USA
Email: [email protected]

From: Peter Edward, Tekgence.com ([email protected])

Reply to: [email protected]

Hi,

Job Title: Data Architect

Location: Atlanta, GA, US

Experience: 12 years

Work Location: Any state within the EST time zone

Role: Contractor

Robosoft is looking for a highly skilled Data Architect with 12 years of experience to join our dynamic team.

If you have a proven track record in data architecture, extensive experience with Databricks migration projects, expertise in PySpark, Airflow, SSIS, Azure Data Factory, SQL Server, and Python scripting, and a strong background in designing large-scale data lakes in the healthcare sector, we would like to hear from you.

Responsibilities:

Lead the design, development, and implementation of scalable and robust data architecture solutions.

Drive Databricks migration projects, ensuring a seamless transition and optimization of existing data workflows.

Architect and implement data pipelines and the data lake using the PySpark framework and Delta Live Tables (DLT); a minimal illustrative sketch follows this list.

Migrate existing workloads from SSIS, Azure Data Factory, Python, SQL Server, and other data processing and transformation tools without changing the existing functionality.

Design and optimize data models for efficient storage, retrieval, and analysis of large volumes of structured and unstructured data.

Develop and maintain Python scripts for data processing, transformation, and automation tasks.

Collaborate with cross-functional teams to gather and analyze data requirements and translate them into technical specifications.

Ensure data quality and integrity across the organization by implementing data governance policies and best practices.

Design and implement large-scale data lake solutions, leveraging cloud platforms such as AWS and Azure.

Provide technical leadership and mentorship to junior team members, fostering a culture of continuous learning and innovation.

Stay updated on emerging technologies and trends in data architecture, cloud computing, and the healthcare domain.
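
For illustration, the following is a minimal sketch of the kind of PySpark / Delta Live Tables (DLT) pipeline described above. The landing path and table names are hypothetical placeholders, and the code only runs inside a Databricks DLT pipeline, where the dlt module and the spark session are provided; it is a sketch of the pattern under those assumptions, not a definitive implementation.

import dlt
from pyspark.sql import functions as F

# Hypothetical landing zone for raw healthcare claims files.
RAW_CLAIMS_PATH = "/mnt/raw/claims/"

@dlt.table(comment="Raw claims ingested incrementally with Auto Loader.")
def claims_raw():
    # `spark` is provided by the DLT runtime; cloudFiles is Databricks Auto Loader.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load(RAW_CLAIMS_PATH)
    )

@dlt.table(comment="Cleansed claims with a basic quality expectation applied.")
@dlt.expect_or_drop("valid_claim_id", "claim_id IS NOT NULL")
def claims_clean():
    # Read the upstream DLT table as a stream, stamp ingestion time, and
    # drop duplicate claim IDs (stateful deduplication in streaming mode).
    return (
        dlt.read_stream("claims_raw")
        .withColumn("ingested_at", F.current_timestamp())
        .dropDuplicates(["claim_id"])
    )

In a migration scenario, each existing SSIS or Azure Data Factory transformation would be re-expressed as a table or view in this declarative style.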

Requirements:

Bachelor's degree in Computer Science, Engineering, or a related field.

12 years of experience in data architecture, with a focus on designing and implementing data solutions.

Proven experience leading Databricks migration projects and expertise in PySpark, Airflow, SSIS, Azure Data Factory, SQL Server, and Python scripting (a brief Airflow DAG sketch follows this list).

Strong background in designing and implementing large-scale data lake solutions.

Experience working in the healthcare domain, with a deep understanding of healthcare data standards and regulations (e.g., HIPAA, PHI).

Exposure to cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP).

Excellent communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and stakeholders.

Strong problem-solving and analytical skills, with a focus on delivering innovative and scalable data solutions.

Relevant certifications in data architecture, cloud computing, or related areas are a plus.
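
As a brief illustration of the Airflow orchestration experience mentioned above, here is a minimal DAG sketch; the DAG ID, schedule, and task are hypothetical placeholders, not part of the role's actual pipelines, and the schedule argument assumes Airflow 2.4 or later.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_claims():
    # Placeholder extraction step; a real task would pull data from
    # SQL Server or a landing zone before downstream processing.
    print("extracting claims batch")

with DAG(
    dag_id="claims_daily_load",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",           # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_claims",
        python_callable=extract_claims,
    )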

Posted: Thu Feb 01 00:43:00 UTC 2024
