Senior Azure Data Engineer - (Need EST or CST Candidates only) at Chicago, Illinois, USA
Email: [email protected]
From: Mark, Vrddhi Solutions LLC ([email protected])
Reply to: [email protected]

We are looking for senior profiles (10+ years) with data warehousing and data modeling experience.

Must-have skills: data warehousing, data modeling, core data concepts, SQL, Python, Spark SQL

Position: Senior Azure Data Engineer
Location: local to Chicago, IL or Beloit, WI (fully remote); EST or CST candidates only
Duration: 6 months+
Experience: 10+ years required
Visa: no H1B
Interview Process/Times: The first round will be a one-hour interview with the hiring manager, including a light coding exercise around SQL and Python. The second round will be an interview with the Director and may include coding, depending on the candidate's performance in the first interview.

Job Description:
The Data Engineer is expected to have deep knowledge of the EDW, data modeling, and integration patterns (ETL, ELT, etc.) and may work with one or a range of tools depending on project deliverables and team resourcing. The Data Engineer will also be expected to understand traditional relational database systems and be able to assist in administering these systems. Candidates must be interested in working in a collaborative environment, possess great communication skills, have experience working directly with all levels of a business, and be able to work both in a team environment and individually. Responsibilities range from batch application/client integration, aggregating data from multiple sources into a data warehouse, automating integration solution generation using reusable patterns/scripting, and prototyping integration solutions, to security.

QUALIFICATIONS
- A Bachelor's degree in Computer Science or a related field is required. A high school diploma and/or an equivalent combination of education and work experience may be substituted.
- A minimum of 7 years of relevant development experience in data warehousing and various ELT or ETL tools.
- Preferred experience in Databricks, Azure, and ADF (Azure Data Factory).
- A minimum of 5 years' experience building database tables and models.
- Must be able to fluently write complex SQL for DDL and DML operations.
- Must have hands-on experience with Python, PySpark, and Spark SQL.
- Strong understanding of enterprise integration patterns (EIP) and data warehouse modeling.
- Experience with development and data warehouse requirements gathering, analysis, and design.
- Possesses strong business acumen and consistently demonstrates forward thinking.
- Eagerly and proactively investigates new technologies.
- Able to work effectively with ambiguous or incomplete information.
- Must have a strong working knowledge of technical infrastructure, protocols, and networks.
- Must have strong experience across multiple hardware and software environments and be comfortable in heterogeneous systems.
- Must be able to routinely work with little or no supervision.
- Must be able to effectively and efficiently handle multiple and shifting priorities.

PRIMARY ACCOUNTABILITIES:
- Develops batch integration solutions, including traditional DW workloads and scheduled nightly large extracts.
- Designs and builds data models (star schema, snowflake).
- Creates ADF pipelines to bring new data in from various sources.
- Creates Databricks notebooks for data transformation (an illustrative sketch follows this list).
- Documents all solutions as needed using standard documentation.
- Plans, reviews, and performs the implementation of database changes for integration/DW work.
- Maintains integration documentation and audit tools, including developing/updating the integration dashboard.
- Works with the BI team and PO to build required tables and transform data for loading into Snowflake.
- Provides support for databases and database servers as a member of the Data Management team.
- Works with the project management and business analysis teams to provide estimates and ensure documentation of all requirements.
- Provides logical layers (database views) for end-user access to data in database systems (see the view sketch after this list).
- Partners with functional support and help desk teams to ensure communication, collaboration, and compliance with support process standards.
- Performs data management tasks as needed.
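The notebook-based transformation work described above might look roughly like the following PySpark sketch. It is illustrative only: the table, column, and mount-point names (edw.dim_customer, edw.fact_orders, /mnt/raw/sales/orders/) are assumptions for this example, not details from the posting.

# Hypothetical sketch of a Databricks-style transformation: land a raw extract,
# conform it, and append it to a star-schema fact table. All names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_fact_load").getOrCreate()

# Raw extract as it might arrive from an ADF copy activity (path is hypothetical).
raw_orders = spark.read.parquet("/mnt/raw/sales/orders/")

# Basic conformance: typed columns, a surrogate-friendly date key, de-duplication.
orders_clean = (
    raw_orders
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("date_key", F.date_format("order_ts", "yyyyMMdd").cast("int"))
    .withColumn("order_amount", F.col("order_amount").cast("decimal(18,2)"))
    .dropDuplicates(["order_id"])
)

# Join to a conformed customer dimension to pick up its surrogate key.
dim_customer = spark.table("edw.dim_customer")
fact_orders = (
    orders_clean
    .join(dim_customer, "customer_id", "left")
    .select("order_id", "date_key", "customer_key", "order_amount")
)

# Append into the fact table; the same load could also be written in Spark SQL.
fact_orders.write.mode("append").saveAsTable("edw.fact_orders")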
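Similarly, the "logical layer" accountability usually means publishing views so BI and ad hoc users never query the physical star schema directly. A minimal Spark SQL sketch, again with hypothetical object names:

# Hypothetical reporting view over the fact and dimension tables from the sketch above.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("reporting_views").getOrCreate()

spark.sql("""
    CREATE OR REPLACE VIEW edw.v_orders_by_customer AS
    SELECT
        c.customer_name,
        f.date_key,
        SUM(f.order_amount) AS total_order_amount
    FROM edw.fact_orders f
    LEFT JOIN edw.dim_customer c
        ON f.customer_key = c.customer_key
    GROUP BY c.customer_name, f.date_key
""")

# End users and BI tools query the view instead of the underlying tables.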
[email protected]
Mon Feb 06 20:45:00 UTC 2023