
New Role: AWS Enterprise Data Architect at Denver, Colorado, USA

From: Amandeep Singh, Amber IT Staffing ([email protected])

Reply to: [email protected]

Job Title: AWS Enterprise Data Architect

Location: Denver, CO (On-Site)

Duration: 12 Months Contract

Start Date: ASAP

Job Description:

Job Duties and Responsibilities

Deploy enterprise-ready, secure, and compliant data-oriented solutions leveraging Data Warehouse, Big Data, and Machine Learning frameworks

Optimize data engineering and machine learning pipelines

Review architectural designs to ensure consistency and alignment with the defined target architecture and adherence to established architecture standards

Support data and cloud transformation initiatives

Contribute to our cloud strategy based on prior experience

Stay current with the latest technologies in a rapidly innovating marketplace

Work independently with stakeholders across the organization to deliver point and strategic solutions

Assist solution providers with the definition and implementation of technical and business strategies

Skills, Experience, and Requirements: a successful Architect will have the following:

Prior experience working as a Data Warehouse/Big Data Architect is required.

Advanced experience with the Apache Spark processing framework and Spark programming languages such as Scala, Python, or advanced Java, with sound knowledge of shell scripting.

Experience in both functional programming and Spark SQL programming, processing terabytes of data.

Specifically, this experience must include writing Big Data data engineering jobs for large-scale data integration in AWS. Prior experience writing Machine Learning data pipelines in a Spark programming language is an added advantage.

Advanced SQL experience, including SQL performance tuning, is a must.

Experience with other big data frameworks such as MapReduce, HDFS, Hive/Impala, and AWS Athena.

Experience in logical and physical table design in a Big Data environment to suit processing frameworks.

Knowledge of setting up, using, and tuning resource management frameworks such as YARN, Mesos, or standalone Spark.

Experience writing Spark Streaming jobs (producers/consumers) using Apache Kafka or AWS Kinesis is required.

Knowledge of a variety of data platforms such as Redshift, S3, Teradata, HBase, MySQL/PostgreSQL, and MongoDB.

Experience with AWS services such as EMR, Glue, S3, Athena, DynamoDB, IAM, Lambda, CloudWatch, and Data Pipeline.

Must have used these technologies to deploy specific solutions in the areas of Big Data and Machine Learning.

Experience in AWS cloud transformation projects is required.

Telecommunications experience is an added advantage.

Posted: Fri Aug 30 18:47:00 UTC 2024


