
Big Data Engineer with Azure (10+ Years Must) at Remote, Remote, USA
Email: [email protected]
From: Richardson, Vbeyond ([email protected])

Reply to: [email protected]

Greetings from Vbeyond!

I saw your resume online and would like to connect regarding the Big Data Engineer with Azure role in Franklin, TN.

Please review the job details below. If interested, please reply with your updated resume.

Role: Big Data Engineer with Azure

Location: Franklin, TN

Type: Contract

Mandatory:

10+ years of experience

Big Data with Azure

Scala

Kafka

Python

Summary:

This role will be part of the Data Exchange group and will report to the Software Engineering Senior Manager. It will require coordination across multiple teams. This role will be a key player in defining and implementing the Big Data strategy for the organization, along with driving implementation of IT solutions for the business. It will also provide direction on best practices to determine optimum solutions. Minimum education, licensure, and professional certification: Bachelor's degree or equivalent experience. Minimum experience required (number of years necessary to perform the role): 10+ years.

Required Skills/Qualifications:

Bachelor's degree or equivalent experience

Minimum 10 years of IT experience

Minimum 3 years implementing Big Data solutions

Proficiency in developing batch and streaming applications using PySpark/Scala and Kafka (a minimal sketch appears just before the Responsibilities section below)

At least 3 years of experience with cloud implementations required

At least 2 years of experience using the Azure Databricks platform and Databricks Delta

Candidate must be able to lead cross-functional solutions

Experience using a variety of data services on Azure

Proficient in database concepts and technologies, including MS SQL Server, DB2, Oracle, Cosmos DB, and NoSQL databases

Proficiency in file formats such as (but not limited to) Avro, Parquet, and JSON

Familiarity with data modeling, data architecture, and data governance concepts

Adept at designing and leveraging APIs, including integrations to drive dynamic content

Exposure to at least one of: Azure DevOps, AWS, or Google Cloud

Demonstrated problem-solving skills and the ability to work collaboratively with other stakeholders or team members to resolve issues

Excellent communication skills and the ability to collaborate effectively with remote teams, both onshore and offshore

Healthcare or financial background is a plus

Job Description Overview:

In this role, you will be responsible for full life cycle data solutions, from conception through deployment.

Improve coding quality and reliability by implementing good standards and processes. Increase productivity by implementing tools and processes. Serve as the technology go-to person for any technical questions.

Resolve complex technical issues. Ensure quality is maintained by following development patterns and standards

Prepare deployment and post-deployment plans to support the conversion and deployment of the solution

Interact with architects, technical project managers, and developers to ensure that solutions meet requirements and customer needs.

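As a concrete illustration of the streaming side of the stack named in the skills list above (PySpark with Kafka and Databricks Delta), here is a minimal sketch. It assumes a Spark environment with the Kafka connector and Delta Lake available; the broker address, topic name, schema, and paths are placeholders, not details from the posting.

```python
# Minimal PySpark Structured Streaming sketch: Kafka source -> Delta sink.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("kafka-to-delta").getOrCreate()

# Assumed schema for the incoming JSON events (illustrative only).
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read a stream of messages from a hypothetical Kafka topic.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "claims-events")              # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers the payload as bytes; parse it into typed columns.
events = (
    raw.selectExpr("CAST(value AS STRING) AS json")
    .select(from_json(col("json"), event_schema).alias("e"))
    .select("e.*")
)

# Append the parsed stream to a Delta table, checkpointing for recovery.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/claims")  # placeholder path
    .outputMode("append")
    .start("/tmp/delta/claims")                               # placeholder path
)
query.awaitTermination()
```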

Responsibilities

Build ETL processes that allow data to flow seamlessly from source to target using tools like Databricks, Azure Data Factory, and SSIS (see the sketch after this list).

Load and enhance dimensional data models.

Use code (e.g., Databricks, SQL, Scala, and Spark) to apply business rules, ensuring data is clean and interpreted consistently by all business stakeholders.

Perform peer code reviews and QA.

Fine-tune existing code to make processes more efficient. Maintain and create documentation describing our data management processes.

Drive development and delivery of key business initiatives for the Big Data platform in collaboration with other stakeholders.

Collaborate with business stakeholders to gather business requirements.

Perform POCs on the Big Data platform to determine the optimum solution.

Work with vendors to evaluate Big Data technologies and resolve technical issues.

Collaborate effectively with remote teams (onshore and offshore) on solution delivery.
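To make the ETL and business-rule responsibilities above concrete, here is a minimal batch sketch in the same PySpark style. It assumes a Databricks-style environment with Delta Lake available; the paths, column names, and the rule itself are illustrative placeholders, not details from the posting.

```python
# Minimal batch ETL sketch: read a raw Parquet source, apply a simple
# business rule, and load a curated Delta target.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-etl").getOrCreate()

# Extract: read the raw source table (placeholder path).
source = spark.read.parquet("/mnt/raw/members")

# Transform: apply business rules so every stakeholder reads the data the
# same way, e.g. normalize status codes and drop rows missing a key.
clean = (
    source
    .withColumn("status", F.upper(F.trim(F.col("status"))))  # hypothetical rule
    .filter(F.col("member_id").isNotNull())                  # hypothetical key check
)

# Load: overwrite the curated Delta table (placeholder path).
clean.write.format("delta").mode("overwrite").save("/mnt/curated/members")
```

In practice the same flow would typically be orchestrated by Azure Data Factory or a Databricks job rather than run as a standalone script.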

Regards,

Richardson

Vbeyond Corporation

[email protected]

linkedin.com/in/richard-son-5147a6183

Contact No: 862-343-6856




Location: Franklin, Tennessee