Looking for Data Architect with PySpark (PA Locals) at Remote, USA
Email: [email protected]
From: venkat, oceanblue <[email protected]>
Reply to: [email protected]

Title: Data Architect
Location: KOP (King of Prussia), PA
Visa: Any
Experience: Minimum 12-15+ years of experience
Interview Type: F2F

Job Description:
1. Expertise with Databricks
2. Hands-on coding skills in Python, SQL, and PySpark
3. Experience with Azure services, including Azure Data Factory, Azure Databricks, Azure Data Lake Gen2 (ADLS Gen2), Key Vault, and Azure DevOps
4. Knowledge of data warehouse design and data modeling
5. Hands-on experience with Tableau for data visualization and reporting
6. Proficiency in cloud-based data solutions, specifically Snowflake and AWS services (e.g., S3, Redshift, Glue, Athena)
7. Strong understanding of data integration patterns and technologies (ETL/ELT)
8. Familiarity with Big Data technologies and distributed computing frameworks (e.g., Hadoop, Spark) is a plus

Please ask candidates to fill in their years of experience in the skills matrix below.

Skills Matrix:
Skill                   Years of experience
Data Architect
Databricks
Python
SQL
PySpark
Azure Data Factory
Azure Databricks
Azure Data Lake Gen2
Azure Key Vault
Azure DevOps
Data Warehouse
Data Modeling
ETL

Keywords: sthree, Pennsylvania
Posted: Tue Nov 05 19:51:00 UTC 2024