Architect - Big Data - Irvine, CA, USA
Email: [email protected]

From: Utsav (IT Resource Manager), ChabezTech <[email protected]>
Reply to: [email protected]

**Job Title: Lead Big Data Developer/Architect**
**Location: Irvine, CA**

### Job Description:

We are seeking an experienced and dynamic Lead Big Data Developer/Architect to join our team at [Company Name]. The ideal candidate will have a strong background in Big Data technologies, Data Warehousing, and related tools. As a Lead Big Data Developer/Architect, you will play a crucial role in designing, developing, and implementing innovative solutions on a Hadoop-based platform.

### Key Responsibilities:

- Design, develop, and implement Big Data analytic solutions on a Hadoop-based platform.
- Refine data processing pipelines focused on unstructured and semi-structured data.
- Create custom analytic and data mining algorithms to extract knowledge and meaning from vast data stores.
- Configure data flows from different sources (relational databases, XML, JSON) and orchestrate them using NiFi.
- Develop Spark frameworks using PySpark and Java for the Raw and Analytical layers of the Big Data platform (a brief illustrative sketch follows this posting).
- Use Jenkins for Continuous Integration and Git for Version Control.
- Write shell scripts and job-management scripts to invoke and manage data ingestion steps.
- Design Hive tables for better performance and apply partitions where needed.
- Review HDFS data organization and provide mechanisms to support multi-tenant features.
- Work with AWS services such as S3, EMR, Lambda, Glue jobs, and Athena as part of the Open Data initiative.
- Build and maintain QlikView dashboards using data from various sources.
- Collaborate with cross-functional teams and stakeholders to gather requirements and provide technical expertise.
- Mentor and guide junior team members.

### Qualifications:

- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 15+ years of total IT experience with a focus on Big Data and Data Warehousing technologies.
- Strong proficiency in Python, Java, SQL, PL/SQL, and Unix shell scripting.
- Extensive hands-on development experience with Hadoop technologies, including Spark (Python and Java), Hive, Impala, Sqoop, and AWS EMR.
- In-depth knowledge of the Spark distributed framework, including RDDs and DataFrames, using Python, Java 8, and Scala.
- Experience with AWS services such as S3, EMR, Lambda, Glue, and Athena.
- Proficiency in building and maintaining QlikView dashboards.
- Excellent understanding of Data Warehousing concepts, dimensional data modeling, and ETL processes.
- Certification in relevant technologies, such as Oracle Certified Associate.

### Preferred Skills:

- Familiarity with Apache Solr and Elasticsearch for data indexing and search.
- Experience with Kafka for data streaming.
- Knowledge of Data Governance and Metadata Management.
- Previous exposure to Informatica PowerCenter or similar ETL tools.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.

Thanks & Regards,

Utsav
Manager, ChabezTech LLC
4 Lemoyne Dr #102, Lemoyne, PA 17043, USA
Email: [email protected] | www.chabeztech.com
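For context on the Spark-framework and Hive-partitioning responsibilities above, here is a minimal, illustrative PySpark sketch of a raw-to-analytical pipeline of the kind the posting describes. The S3 paths, column names (`event_ts`, `event_id`), and the `analytics.events` table are assumptions made purely for the example, not details taken from the role.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical locations, purely for illustration.
RAW_PATH = "s3://example-raw-bucket/events/"                # assumed raw (JSON) landing zone
ANALYTICAL_PATH = "s3://example-analytics-bucket/events/"   # assumed curated layer

spark = (
    SparkSession.builder
    .appName("raw-to-analytical-example")
    .enableHiveSupport()   # lets the job register the output as a Hive table
    .getOrCreate()
)

# Raw layer: ingest semi-structured JSON as-is.
raw_df = spark.read.json(RAW_PATH)

# Light standardisation before promoting records to the analytical layer.
analytical_df = (
    raw_df
    .withColumn("event_date", F.to_date("event_ts"))   # assumes an event_ts column exists
    .dropDuplicates(["event_id"])                       # assumes an event_id key exists
)

# Analytical layer: partitioned Parquet registered as a Hive table, so partition
# pruning keeps queries fast (the "apply partitions where needed" item).
# Assumes the "analytics" database already exists in the metastore.
(
    analytical_df.write
    .mode("overwrite")
    .partitionBy("event_date")
    .option("path", ANALYTICAL_PATH)
    .saveAsTable("analytics.events")
)
```

Partitioning the curated table by `event_date` is one common way to achieve the query-performance goal behind the Hive table-design bullet; the exact layering and naming would depend on the team's platform.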
| [email protected] View All |
Posted: 01:04 AM, 23-Feb-2024