Databricks Engineer with Insurance Domain - Remote - GC and USC | Remote, USA
Email: [email protected]
From: Shivani Sharma, Abidi Solutions ([email protected])
Reply to: [email protected]

Hi, greetings! Please share a few profiles with insurance domain experience.

Job Title: Databricks Engineer
Location: North Carolina (Remote)
Rate: $58/hr on C2C
Work Authorization: GC and USC only

Job Summary:
We are seeking an experienced Databricks Engineer with a strong background in the insurance industry to join our team. The ideal candidate will have a proven track record in building, managing, and optimizing scalable data pipelines using Databricks, and an in-depth understanding of insurance data and processes. This role will play a critical part in delivering data solutions that support our clients' insurance operations.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using Databricks and Apache Spark to support analytics, reporting, and insurance-related business processes (an illustrative sketch follows this posting).
- Collaborate with insurance industry stakeholders to gather requirements and translate them into technical solutions.
- Build and optimize large-scale batch and real-time data processing workflows, ensuring high performance and reliability.
- Integrate Databricks with cloud-based data services (Azure, AWS, or GCP) to build efficient solutions.
- Implement best practices for data modeling, transformation, and architecture for complex insurance use cases (e.g., policy lifecycle, claims processing, underwriting).
- Perform in-depth data analysis and reporting using Databricks to generate actionable insights for insurance teams.
- Work with data scientists to enable machine learning models for predictive analytics in insurance (e.g., fraud detection, risk assessment).
- Ensure compliance with regulations such as GDPR, HIPAA, and PCI when handling sensitive insurance data.
- Monitor and troubleshoot performance, data quality, and system reliability across data pipelines and applications.

Required Qualifications:
- Bachelor's degree in Computer Science, Data Engineering, or a related field (or equivalent experience).
- 3+ years of experience working with Databricks and Apache Spark in a production environment.
- Strong understanding of the insurance industry, including familiarity with data from claims, policies, underwriting, and customer behavior.
- Proficiency in Python, SQL, and Scala for data manipulation and processing.
- Experience with cloud platforms (Azure, AWS, GCP) and their integration with Databricks.
- Knowledge of data warehousing, ETL pipelines, and big data technologies.
- Familiarity with insurance regulations and data compliance requirements.
- Strong analytical and problem-solving skills.

Preferred Qualifications:
- Experience with Delta Lake and other advanced Databricks features.
- Understanding of machine learning frameworks applied in the insurance domain (e.g., risk models, actuarial models).
- Certifications in Databricks or relevant cloud platforms.
- Hands-on experience with data visualization tools such as Power BI, Tableau, or Databricks visualizations.

Soft Skills:
- Excellent communication skills to collaborate effectively with both technical and non-technical stakeholders in the insurance industry.
- Team-oriented, with strong problem-solving abilities.
- High attention to detail, ensuring data accuracy and security.
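For context on the kind of pipeline work described above, here is a minimal, hypothetical sketch of a Databricks batch job using PySpark and Delta Lake over insurance claims data. The storage path, table names, and column names (claim_id, policy_id, claim_amount, claim_date) are illustrative assumptions, not details from this posting.

```python
# Illustrative sketch only: a minimal batch pipeline of the kind described in the
# responsibilities above. All paths, table names, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims_batch_pipeline").getOrCreate()

# Ingest raw claim records landed as JSON in cloud storage (hypothetical path).
raw_claims = spark.read.json("/mnt/raw/claims/")

# Basic cleansing and typing: drop records without a claim id, normalize dates,
# cast the monetary amount, and de-duplicate on the claim key.
clean_claims = (
    raw_claims
    .filter(F.col("claim_id").isNotNull())
    .withColumn("claim_date", F.to_date("claim_date", "yyyy-MM-dd"))
    .withColumn("claim_amount", F.col("claim_amount").cast("double"))
    .dropDuplicates(["claim_id"])
)

# Simple aggregate feeding a reporting layer: claim counts and totals
# per policy per month.
monthly_summary = (
    clean_claims
    .groupBy("policy_id", F.date_trunc("month", "claim_date").alias("claim_month"))
    .agg(
        F.count("claim_id").alias("claim_count"),
        F.sum("claim_amount").alias("total_claimed"),
        F.avg("claim_amount").alias("avg_claimed"),
    )
)

# Persist curated outputs as Delta tables (Delta Lake is listed as a preferred skill).
clean_claims.write.format("delta").mode("overwrite").saveAsTable("curated.claims")
monthly_summary.write.format("delta").mode("overwrite").saveAsTable("analytics.claims_monthly_summary")
```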
Posted: Thu Oct 24 01:33:00 UTC 2024