| Databricks Engineer with PySpark & Python Exp - F2F interview at Remote, Remote, USA |
| Email: [email protected] |
|
http://bit.ly/4ey8w48 https://jobs.nvoids.com/job_details.jsp?id=893019&uid=

Databricks Engineer with PySpark & Python Exp - F2F interview
Location: NYC, NY or Iselin, NJ (Hybrid, 3 days work from office)
In-person interview is a must.

Must Have:
* 10+ years of IT experience
* Databricks, Python, and PySpark
* SQL Server

Job Description:
This position is for a Cloud Data / Reporting Engineer with a background in SQL and data warehousing for enterprise-level systems. The role calls for someone who is comfortable working with business users and who also brings business-analyst expertise.

Major Responsibilities:
* Design, develop, and deploy Databricks jobs to process and analyze large volumes of data.
* Collaborate with data engineers and data scientists to understand data requirements and implement appropriate data processing pipelines.
* Optimize Databricks jobs for performance and scalability to handle big-data workloads.
* Monitor and troubleshoot Databricks jobs; identify and resolve issues and bottlenecks.
* Implement best practices for data management, security, and governance within the Databricks environment.
* Design and develop Enterprise Data Warehouse solutions.
* Demonstrate proficiency with data analytics and data insights.
* Write SQL queries and programs, including stored procedures, and reverse-engineer existing processes.
* Leverage SQL, a programming language (Python or similar), and/or ETL tools (Azure Data Factory, Databricks, Talend, SnowSQL) to develop data pipeline solutions that ingest and exploit new and existing data sources.
* Perform code reviews to ensure fit to requirements, optimal execution patterns, and adherence to established standards.
Skills:
* 10+ years - Enterprise Data Management
* 10+ years - SQL Server-based development of large datasets
* 5+ years - Data Warehouse architecture, with hands-on experience on the Databricks platform
* Extensive experience in PySpark coding
* Snowflake experience is good to have
* 3+ years of Python (NumPy, pandas) coding experience
* 3+ years of experience in the finance/banking industry, with some understanding of securities and banking products and their data footprints
* Experience with Snowflake utilities such as SnowSQL and Snowpipe - good to have
* Experience in data warehousing - OLTP, OLAP, dimensions, facts, and data modeling
* Previous experience leading an enterprise-wide cloud data platform migration, with strong architectural and design skills
* Capable of discussing enterprise-level services independent of technology stack
* Experience with cloud-based data architectures, messaging, and analytics
* Superior communication skills
* Cloud certification(s)
* Any experience with regulatory reporting is a plus

Education:
* Minimum of a BA degree in an engineering or computer science discipline
* Master's degree strongly preferred

Warm Regards,
Bhupendra Kushwaha
Arkhya Tech Inc. | 722 Grant Street, Suite G, Herndon, VA 20170 USA
| [email protected] |
| 09:29 PM 29-Nov-23 |