Big Data Administrator - Cloudera at Remote, USA
Email: [email protected]
From: Mohammed Shaik, www.abidisolutions.com [email protected]
Reply to: [email protected]
Visa: No H1B/OPT; candidates must be local to Virginia (VA). Please share suitable resumes.

Role: Big Data Administrator - Cloudera
Contract Duration: 12 months with potential for extension
Location: Remote with potential for meetings in Reston, VA
Visa: No H1B/OPT

Top 3 Must-Have Skills:
- 5+ years of experience with all tasks involved in Cloudera administration, including Cloudera cluster build, infrastructure, and installation.
- 3+ years of experience in the role of a Team Lead (not looking for a manager; looking for someone with strong communication skills and the ability to mentor and act as a POC during meetings, etc.).
- SME-level experience with the Apache Solr search platform.

Job Description:
The Big Data Administrator will support an existing Cloudera/AWS data platform while the existing team supports an ongoing enterprise migration to AWS. They will be responsible for configuring, troubleshooting, and installing infrastructure systems, and for orchestrating, deploying, maintaining, and scaling cloud or on-premises infrastructure targeting big data and platform data management (e.g., data warehouses, data lakes), including data access APIs. They will prepare and manipulate data using Hadoop or an equivalent MapReduce platform, with emphasis on high availability, reliability, automation, and performance.

Responsibilities:
- Represent the team in all architectural and design discussions.
- Be knowledgeable in the end-to-end process and able to act as an SME, providing credible feedback and input in all impacted areas.
- Be involved in project tracking and task monitoring; ensure overall successful implementation, especially where team members are all working on multiple efforts at the same time.
- Lead the team to design, configure, implement, monitor, and manage all aspects of the Data Integration Framework.
- Define and develop Data Integration best practices for a data management environment of optimal performance and reliability.
- Plan, develop, and lead administrators on projects and efforts; achieve milestones and objectives.
- Oversee the delivery of engineering data initiatives and projects, including hands-on installation, configuration, automation scripting, and deployment.
- Develop and maintain infrastructure systems (e.g., data warehouses, data lakes), including data access APIs.
- Prepare and manipulate data using Hadoop or an equivalent MapReduce platform.
- Develop and implement techniques to prevent system problems, troubleshoot incidents to recover services, and support root cause analysis.
- Develop and follow standard operating procedures (SOPs) for common tasks to ensure quality of service.
- Manage customer and stakeholder needs, generate and develop requirements, and perform functional analysis.
- Fulfill business objectives by collaborating with network staff to ensure reliable software and systems.
- Enforce the implementation of best practices for data auditing, scalability, reliability, high availability, and application performance.
- Develop and apply data extraction, transformation, and loading techniques to connect large data sets from a variety of sources.
- Act as a mentor for junior and senior team members.
- Install, tune, upgrade, troubleshoot, and maintain all computer systems relevant to supported applications, including all tasks necessary for operating system administration, user account management, disaster recovery strategy, and networking configuration.
- Expand job knowledge of engineering and leading technologies by reviewing professional publications, establishing personal networks, benchmarking state-of-the-art practices, pursuing educational opportunities, and participating in professional societies.

Qualifications:
- Bachelor's degree in Information Technology, Computer Science, or a related discipline.
- At least 10 years of experience with all tasks involved in administering big data and Metadata Hub platforms such as Cloudera.
- 8 years of relevant engineering experience (an additional 4 years of relevant experience may be substituted in lieu of a degree).
- 5+ years of experience with all tasks involved in Cloudera administration, including experience with Cloudera cluster build, infrastructure, and installation.
- 3+ years of experience in the role of a Team Lead (not a manager; someone with strong communication skills and the ability to mentor and act as a POC during meetings, etc.).
- SME-level experience with the Apache Solr search platform.
- Advanced (expert preferred) experience administering and engineering relational databases (e.g., MySQL, PostgreSQL), big data systems (e.g., Cloudera Data Platform Private Cloud and Public Cloud), Apache Solr (as an SME), ETL (e.g., Ab Initio), BI (e.g., MicroStrategy), and automation tools (e.g., Ansible, Terraform, Bitbucket), as well as experience working with cloud solutions (specifically data products on AWS).
- Experience with Ab Initio, EMR, S3, DynamoDB, MongoDB, PostgreSQL, RDS, or DB2 is a plus.
- DevOps (CI/CD pipeline) experience is a plus.
- Advanced knowledge of UNIX and SQL.
- Experience managing the Metadata Hub (MDH) and Operational Console and troubleshooting environmental issues that affect these components.
- Prior experience with migration from on-premises to AWS Cloud is preferred.
- Represents the team in all architectural and design discussions.
- Knowledgeable in the end-to-end process and able to act as an SME, providing credible feedback and input in all impacted areas.
- Required to track and monitor projects and tasks as the lead.

Thanks & Regards,
Shaik Mohammed
Senior Recruiter
Email: [email protected]
Web: www.abidisolutions.com
Wed Nov 22 20:10:00 UTC 2023