Urgent Profile Needed - Big Data OR Spark Architect - Remote at Wayne, Pennsylvania, USA
Email: [email protected]
From: John Joseph, IDC Technologies [email protected]
Reply to: [email protected]

Hi Partners,

Greetings of the day! This is John, one of the Sr. Technical Recruiters at IDC Technologies.

Position: Big Data Architect
Experience: Min 10+ years
Location: Wayne, PA (Remote)
Contract: 12+ months with extension

Job Details:

Must-have skills:
- Architect experience (required)
- Spark, Databricks, Big Data

Nice-to-have skills:
- Spark
- Databricks

Detailed Job Description

Basic qualifications:
- Bachelor's degree or foreign equivalent required. Will also consider one year of relevant work experience in lieu of every year of education.
- Minimum 5+ years of experience in IT services and at least 2-3 years of experience in Big Data.
- 5+ years on Big Data / Databricks solutions and development on cloud platforms.
- Experience in prior Big Data projects.

Preferred qualifications:
- Hands-on experience implementing Lakehouse architecture using the Databricks Data Engineering platform, SQL Analytics, Delta Lake, and Unity Catalog.
- Hands-on experience implementing serverless real-time/near-real-time architecture using cloud-native services and Spark technologies, leveraging Databricks.
- Experience in Databricks engineering solutions on one of the cloud platforms.
- Experience in data architecture and delivery on one of the cloud platforms.
- Experience working in transformational projects leveraging ETL/ELT/CDC pipelines using Databricks.
- Experience building data management pipelines to populate data into data lakes on cloud-centric platforms.

Responsibilities:
- Work with clients to define their cloud data platform and implement cutting-edge data solutions to provide valuable business insights, including AI/ML.
- Drive analysis, design, governance, and development of data warehouse, data lake, and business intelligence solutions.
- Research, synthesize, recommend, and select technical approaches for solving difficult migration, development, and integration problems.
- Conduct and support workshops, design sessions, and project meetings as needed, playing a key role in client relations.
- Collaborate with application teams and business users to develop new pipelines using cloud data migration methodologies and processes.
- Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Databricks.

Minimum years of experience: 8-10 years
Certifications needed: No

Top 3 responsibilities you would expect the subcontractor to shoulder and execute:
1. Work with clients to define their cloud data platform and implement cutting-edge data solutions to provide valuable business insights, including AI/ML.
2. Drive analysis, design, governance, and development of data warehouse, data lake, and business intelligence solutions.
3. Research, synthesize, recommend, and select technical approaches for solving difficult migration, development, and integration problems.

If you are interested, please reply with the information below to proceed further:
1. Work authorization copy
2. DL copy
3. Updated resume

Candidate information:
- LinkedIn URL
- Candidate contact details
- SSN (last four digits)
- Skype ID
- Passport number (not required for citizens or GC holders)
- Current location
- Education details (stream, university & graduation year): Bachelor's, Master's

Thanks & Regards,
John Joseph M
Sr. Technical Recruiter
Direct: 408 465 4465
Secondary: 424 262 8133
920 Hillview Court, Suite 250, Milpitas, CA 95035
[email protected] | www.idctechnologies.com

Keywords: artificial intelligence machine learning information technology green card California Idaho Pennsylvania
Wed Feb 15 01:44:00 UTC 2023