Data Engineer : USC and GC : Local to Indianapolis at Indianapolis, Indiana, USA
Email: [email protected]
From: Prakash Singh, RCI [email protected]
Reply to: [email protected]
Local DL to Indiana only.

Position: Data Engineer
Duration: 6+ months
Location: Hybrid - onsite at the client in Indianapolis 3 days per week, at home 2 days per week

Come join the Client family and be a part of our exciting growth story! We're on the lookout for fantastic leaders at every career stage to join our dynamic team. Together, we're all about unlocking potential, whether it's for our clients, colleagues, or the communities we serve. We achieve this through the delivery of cutting-edge, data-driven solutions, advanced technology platforms, and unparalleled expertise. Our approach centers around being more than just a service provider; we aspire to be genuine partners in success by working as a true extension of our clients and following proven yet adaptable processes while embodying our core values each day. These values are the foundation of both our client and employee experiences, and the biggest reason that Client has been recognized as a Best Place to Work year over year.

Job Description:
Consultants are at the forefront of designing, developing, and maintaining cutting-edge data platforms and solutions. As such, your role will encompass crafting platforms that house essential data sets supporting diverse business operations and fostering data-driven decisions. Additionally, you'll dive into analytical solutions, offering visibility and decision support through advanced data technologies. As a Data Engineer, you'll take charge of administering data tools, constructing robust ETL/ELT pipelines, crafting data integration solutions, and troubleshooting technical issues. Collaboration is at the heart of what we do: you'll closely partner with data scientists, business analysts, system administrators, and data architects to ensure our data platforms seamlessly meet business demands. Your expertise will pave the way for scalable solutions, incorporating essential monitoring and following best practices. If you're passionate about steering data-centric innovation and thrive in a dynamic, collaborative environment, this is the place for you. Join us and play a pivotal role in shaping the future of data engineering, where your contributions truly matter. Ready to make an impact? Apply now and be part of our journey to success!

Responsibilities:
- Develop ETL pipelines from various data repositories to load into various stages of the data platform.
- Develop data lake and warehouse architectures that support business objectives.
- Integrate the platform into the existing enterprise data warehouse and various operational systems.
- Develop administration processes to monitor pipeline performance, resource usage, and failures.
- Address performance and scalability issues in a large-scale data lake environment.
- Provide data platform support and issue resolution.

Requirements:
- Bachelor's degree in computer science, software engineering, or a closely related field; a master's degree is preferred.
- At least 5 years of experience with Azure solutions such as Functions, Synapse Data Warehousing, Data Factory, and Databricks.
- At least 5 years of experience developing batch and/or streaming ETL/ELT processes.
- At least 5 years of experience with data warehouse or data lake concepts.
- Proficiency in at least one programming language, such as Python or R.
- Experience with a BI reporting platform (e.g., Tableau, Power BI).
- Excellent communication, analytical, and problem-solving skills.
- At least 2 years of development experience in a project management or agile operation.
- Prior experience working in State, Health and Human Services is a plus.
- Strong communication and collaboration skills.
- Excellent ability to work under tight deadlines and remain flexible when faced with shifting priorities.

Keywords: business intelligence rlang information technology
Thu Mar 14 22:05:00 UTC 2024