
Big data Developer : Salt Lake City, UT at Salt Lake City, Utah, USA
Email: [email protected]
Hello,

This is Deepak from Vlink Info. I have an urgent requirement for you; the complete job description is below. Kindly reply with your updated resume if you are comfortable with the job description.

Only H4 EAD consultants.

Job Title: Bigdata Developer

Location: Salt Lake City, UT

Duration: Long Term Contract

Position Description: Client is seeking a dynamic Hadoop, Spark, and Scala Developer to join our team supporting a core system for a large financial client in Cary, NC. We are looking for a hands-on Hadoop, Spark, and Scala developer with 6+ years of experience. The system has high visibility within the organization and is used by many prominent collaborators. This role offers the flexibility of joining a team that applies Agile methodologies to deliver high-quality software to our customers. You will also have the opportunity to work with cloud-based technologies on highly visible initiatives. This role can be performed in Cary, NC.

Responsibilities:

5+ years of demonstrated ability with Big Data tools and technologies including working in a Production environment of a Hadoop Project.

2+ years of experience with Spark, PySpark, SQL, Hive, Impala, Oozie, HDFS, Hue, Git, MapReduce and Sqoop.

2+ years of Programming experience in Scala Programming and Application Development.

Experience in Test Driven Development (TDD), and/or Continuous Integration/Continuous Deployment (CI/CD) is a plus

Big Data Development using Hadoop Ecosystem including Pig, Hive and other Cloudera tools.

Analytical and problem-solving skills, applied to a Big Data environment.

Experience with large-scale distributed applications.

Experience with Agile methodologies to iterate quickly on product changes, developing user stories and working through backlog.

Experience with Cloudera Hadoop distribution components and custom packages is preferred.

Traditional Data Warehouse/ETL experience.

Excellent planning, organization, communication and thought leadership skills.

Ability to learn and apply new concepts quickly.

Validated ability to mentor and coach junior team members.

Strong leadership, communication, and interpersonal skills.

Ability to adapt to constant changes. Sense of innovation, creativity, organization, autonomy, and quick adaptation to various technologies.

Capable and eager to work under minimal direction in fast-paced energetic environment.

Required qualifications to be successful in this role:

Strong hands-on Hadoop, Python, Mainframe, DB2, and Teradata experience.

Experience in analysis, design, development, support, and improvements in a data warehouse environment with Big Data technologies, with a minimum of 5+ years' experience in Hadoop, MapReduce, Sqoop, HDFS, Hive, Impala, Oozie, Hue, Kafka, and YARN.

2+ years of experience in PySpark and Spark.

2+ years of experience in Scala development.

Deepak Prajapati | Sr. US IT Recruiter

+1 (860) 640-4611 ext 168

United States | Canada | India | Indonesia

Fri Sep 29 03:10:00 UTC 2023



