Data Engineer at Remote, USA
Email: [email protected]
From: Kuldeep Sharma, VizonInc [email protected]
Reply to: [email protected]

Job Description - Data Engineer (LinkedIn)
Onsite - Columbus, OH (candidate local to Columbus, OH; Broken Arrow, OK; or Dallas)
MUST BE USC
Client: Flight Safety

- Expert in Databricks Autoloader, Delta Live Tables, Data Asset Bundles, and Unity Catalog
- Expert in Python and SQL
- Mid-level skill in data modeling: Data Vault 2.0, Snowflake, and dimensional modeling
- Structured and semi-structured (JSON and hierarchical) data
- Agile and DevOps experience
- Business domain knowledge in at least a couple of areas such as Finance, Sales, Customer Experience, HR, Safety, and Operational Excellence
- Data management concepts: data quality, governance, metadata management
- Azure cloud fundamentals
- Security and compliance
- Self-motivated

JOB DESCRIPTION BELOW:

Overview
The position plays a key role in developing and maintaining enterprise analytics deliverables, including but not limited to operational data stores, data integrations, and reports. The ideal candidate will work in our mixed-technology environment to deliver data products that provide decision support for the business and customers. As part of a highly collaborative team, the role will interact with technical and business resources within and outside of the IT organization. The ideal candidate is a committed, creative, self-motivated, and passionate technologist who is interested in practicing current skills and learning new ones.

Primary Duties and Responsibilities
This is a highly technical position responsible for delivering and maintaining data engineering components that meet business objectives. The following duties are essential to the successful and satisfactory performance of this job. Other duties may be assigned.

Design
- Partner with business stakeholders, business analysts, data engineers, and developers to design enterprise data warehouse components.
- Provide estimates, schedules, and regular, timely updates to project managers and senior management as needed.
- Improve operations by conducting system analysis and recommending changes in policies and procedures.

Develop/Validate
- Validate proposed designs for accuracy and completeness of business use cases.
- Develop data integration and transformation solutions to meet the input needs of the models (a minimal ingestion sketch follows the duties section below).
- Develop and support batch jobs.
- Perform unit and regression testing.
- Perform code/peer reviews to ensure adherence to established design and development standards.
- Collaborate with development and quality assurance teams for testing and product quality improvements as needed.
- Produce deployment scripts, checklists, playbooks, and operations runbooks in accordance with SDLC and change management requirements.

Deploy/Support
- Deploy database and data engineering components to higher environments, using CI/CD automation where applicable.
- Take measures to ensure adherence to committed service level agreements.
- Monitor scheduled jobs and platform performance for smooth operation.
- Independently, and with support from other developers, troubleshoot and fix issues that arise with data and/or processes.
- Serve as a techno-functional expert and assist developers and users in understanding data flow and application data.
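To illustrate the Databricks Autoloader and Delta skills the requirements list calls out, here is a minimal sketch of the kind of incremental file ingestion those duties describe. It assumes a Databricks runtime; the landing path, checkpoint/schema locations, file format, and table name are hypothetical placeholders, not details from this posting.

```python
# Minimal Autoloader -> Delta sketch; all paths and names below are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already provided in Databricks notebooks

# Autoloader ("cloudFiles") incrementally discovers new files in a landing folder.
raw = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")                            # semi-structured JSON input
    .option("cloudFiles.schemaLocation", "/mnt/chk/orders_schema")  # schema inference/evolution state
    .load("/mnt/landing/orders/")
)

# Append the stream into a bronze Delta table, processing all available files, then stop.
(
    raw.writeStream
    .option("checkpointLocation", "/mnt/chk/orders")
    .trigger(availableNow=True)
    .toTable("bronze.orders")
)
```

The same pattern is typically wrapped in Delta Live Tables or deployed through Data Asset Bundles in practice; this sketch only shows the core Autoloader read and Delta write.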
Required Experience, Education and Skills
- Bachelor's degree in a related field (CS major preferred)
- 10+ years of software development experience
- 5+ years of enterprise ETL, analytics, and reporting development experience in SSIS, SSAS, and SSRS
- 5+ years of experience in RDBMS design and development; must demonstrate clear mastery of logical and physical database design (for both transactional and data warehouse workloads) and data normalization concepts
- 5+ years of expert SQL ability, tool and platform usage, and troubleshooting skills; must have extensive hands-on development experience with T-SQL, UDFs, UDAFs, views, triggers, and stored procedures
- 3+ years of experience in Azure using ADLS, Azure Data Factory, and automation
- Experience working in a Visual Studio development environment and using DevOps platforms for code management and deployment with CI/CD techniques
- Familiarity with SDLC and agile methodologies and tools such as TFS, Git, or SCRUM
- Ability to take a project from scoping requirements through actual launch
- Experience communicating with users, other technical teams, and management to collect requirements, identify tasks, provide estimates, and meet production deadlines
- Experience with professional software engineering best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
- Ability to understand and work in an Agile development environment
- Experience designing and building BI reporting solutions, preferably using Power BI
- Experience with Apache Spark and PySpark
- Experience building data engineering platforms in Python for data science or data analytics groups
- Experience in Databricks, preferably on Azure
- Experience with NoSQL
- Experience consuming data from streaming platforms like Kafka (see the sketch after this list)
- System and networking fundamentals
- Knowledge/experience in the Education or Aviation industry
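For the Kafka requirement above, the sketch below shows one common way to consume a topic with Spark Structured Streaming and land it as Delta. The broker address, topic name, and paths are hypothetical placeholders, and it assumes the spark-sql-kafka connector and Delta Lake are available (as on Databricks).

```python
# Minimal Kafka -> Delta sketch with Structured Streaming; names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka_ingest").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
    .option("subscribe", "ops-events")                   # placeholder topic
    .option("startingOffsets", "latest")
    .load()
    # Kafka delivers key/value as binary; cast to strings for downstream parsing.
    .select(col("key").cast("string"), col("value").cast("string"), col("timestamp"))
)

# Continuously append the raw events to a bronze Delta location.
(
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/chk/ops_events")
    .start("/mnt/bronze/ops_events")
)
```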
Keywords: continuous integration, continuous deployment, business intelligence, information technology, Ohio, Data Engineer
[email protected]
Thu Nov 14 02:32:00 UTC 2024