Senior Data Architect | Remote, USA
Email: [email protected]
From: Vaishnavi, Xforia [email protected]
Reply to: [email protected]

Hello Professional,

Greetings from XFORIA. I found your resume on a job portal and noticed that you meet our client's mandate. If you are interested, please review the job description below and share your updated resume.

Job Title: Senior Data Architect
Location: Remote

Job Description

Responsibilities:
- Lead the architecture and implementation of Spark and Databricks-based ETL frameworks for large-scale enterprise systems.
- Design and develop high-throughput data pipelines using Spark Core and streaming technologies.
- Implement and enforce architectural standards and frameworks to ensure a flexible and scalable data environment.
- Collaborate with cross-functional teams to gather requirements, analyze data, and design effective solutions.
- Develop Python-based scripts and applications hands-on to support data processing and transformation.
- Use tools such as Apache Airflow, Azure Data Factory, and Change Data Capture (CDC) to orchestrate and manage data workflows.
- Play a key role in DevOps activities, including deployment of Spark jobs and infrastructure setup.
- Provide mentorship and technical guidance to junior team members.
- Stay current with the latest industry trends and technologies in data engineering and analytics.

Qualifications:
- Bachelor's or master's degree in computer science, information technology, or a related field.
- 5+ years of hands-on IT experience, with a strong focus on ETL and Python technologies.
- Proven expertise in designing and implementing data solutions using Spark and Databricks.
- Extensive experience with Spark Core and streaming development.
- Solid understanding of Python programming for data manipulation and transformation.
- Hands-on experience with Databricks Workflows, Delta Live Tables, and Unity Catalog.
- Proven ability to troubleshoot and optimize Spark queries for analytics and business intelligence use cases.
- Proficiency with Apache Airflow, Azure Data Factory, and Change Data Capture (CDC) for data orchestration.
- Basic knowledge of DevOps practices for managing Spark job deployments.
- Databricks certification in Advanced Data Engineering is a plus.
- Strong problem-solving skills and the ability to work effectively in a collaborative team environment.
- Excellent communication and interpersonal skills.
Posted: Mon Sep 18 22:34:00 UTC 2023