Hadoop Admin with Kubernetes - St Louis, MO - Day 1 Onsite
From: Basavaraj, Centraprise [email protected]
Reply to: [email protected]

Hi Team,

Greetings from Centraprise. We have an urgent requirement with one of our prime clients. Please go through the job details below and let me know your availability along with your updated resume and contact details.

Role: Hadoop Admin with Kubernetes
Location: St Louis, MO (Day 1 Onsite)

Project: What will they be working on
Separating compute and storage: moving from Hadoop to a Kubernetes-based cluster. The client needs someone to do research and analysis and to provide clear guidance and a roadmap for completing this. They want to start moving applications by the middle of next year. Many servers are nearing end of life and will be out of support by mid to late next year, so this person needs to already have experience doing this kind of migration. Very fast-tracked.

Technical skills (6-8 years of experience working on a project like this):
- Hadoop
- Kubernetes
- Kafka
- Object storage experience: MinIO or Ceph
- Apache Ozone
- Object storage technologies: S3
- Spark, Scala, Hive, HBase, Kudu

Job summary
The resource will work closely with the architects on the solution and development team, helping design a solution using Kubernetes and object storage such as MinIO, Ceph, or Apache Ozone. The goal is to create a pool of niche resources, build a solution based on open source, and continue to support and grow the team. Looking for someone who can be a heavy lifter.

Roles & Responsibilities
- 9-10 years of experience in Hadoop administration activities.
- Exposure to AWS big data services.
- Deploy and maintain Hadoop clusters: add/remove nodes using cluster monitoring tools, configure NameNode high availability, and keep track of all running Hadoop jobs.
- Implement, manage, and administer the overall Hadoop infrastructure.
- Take care of the day-to-day running of Hadoop clusters.
- Work closely with the database, network, BI, and application teams to make sure all big data applications are highly available and performing as expected.
- Capacity planning: estimate requirements for lowering or increasing Hadoop cluster capacity, and size the cluster based on the data to be stored in HDFS.
- Ensure the Hadoop cluster is up and running at all times; monitor cluster connectivity and performance.
- Manage and review Hadoop log files.
- Backup and recovery tasks.
- Resource and security management.
- Troubleshoot application errors and ensure they do not recur.

Thanks & Regards,
Basavaraj | Talent Acquisition Associate
Centraprise Corp
3 Ethel Rd, #304, Edison, NJ 08817
Desk: 848-209-8309
Email: [email protected]
Connect with me on LinkedIn: linkedin.com/in/basavaraj-n-methre-720b55223
Thu Oct 19 23:19:00 UTC 2023