
Hadoop Platform Engineer needed at Dallas, Texas, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2049361&uid=

Hello All,

Hope you are doing well.

Please go through the job description below.

Job Title: Hadoop Platform Engineer

Location: Dallas, TX (Onsite)

Duration: Long Term

End Client: Bank of America

Qualifications:

Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).

Strong experience in designing, implementing, and administering Hadoop clusters in a production environment.

Proficiency in Hadoop ecosystem components such as HDFS, YARN, MapReduce, Hive, Spark, and HBase.

Experience with cluster management tools like Apache Ambari or Cloudera Manager.

Solid understanding of Linux/Unix systems and networking concepts.

Strong scripting skills (e.g., Bash, Python) for automation and troubleshooting.

Knowledge of database concepts and SQL.

Experience with data ingestion tools like Apache Kafka or Apache NiFi.

Familiarity with data warehouse concepts and technologies.

Understanding of security principles and experience implementing security measures in Hadoop clusters.

Strong problem-solving and troubleshooting skills, with the ability to analyze and resolve complex issues.

Excellent communication and collaboration skills to work effectively with cross-functional teams.

Relevant certifications such as Cloudera Certified Administrator for Apache Hadoop (CCAH) or Hortonworks Certified Administrator (HCA) are a plus.

Technical Proficiency:

Experience with Hadoop and Big Data technologies, including Cloudera CDH/CDP, Databricks, HDInsight, etc.

Strong understanding of core Hadoop services such as HDFS, MapReduce, Kafka, Spark, Hive, Impala, HBase, Kudu, Sqoop, and Oozie.

Proficiency in RHEL Linux operating systems, databases, and hardware administration.

Operations and Design:

Operations, design, capacity planning, cluster setup, security, and performance tuning in large-scale Enterprise Hadoop environments.

Scripting and Automation:

Proficient in scripting (e.g., Bash, KSH) for automation.

Security Implementation:

Experience in setting up, configuring, and managing security for Hadoop clusters using Kerberos integrated with LDAP/AD.

Problem Solving and Troubleshooting:

Expertise in system administration and programming skills for storage capacity management, debugging, and performance tuning.

Collaboration and Communication:

Collaborate with cross-functional teams, including data engineers, data scientists, and DevOps teams.

--
