Urgently Hiring :: Data Engineer :: Hybrid (Cincinnati, OH) at Cincinnati, Ohio, USA
Email: [email protected]
From: Priyabrata Pradhan, Vyze ([email protected])
Reply to: [email protected]

Title: Data Engineer, Level 2
Authorization: GC or USC only. Candidates should be eligible to work for any employer in the United States without sponsorship; a local driver's license and a copy of genuine work authorization are required.
MOI: F2F

Candidates MUST be local to Cincinnati (or relocate before their start date) and be able to interview onsite. Interviews will be onsite at the BTC, and the role follows a hybrid schedule, in the office 2-3 days/week.

Soft Skills Needed - Self-starter, good communicator, willing to take ownership.

Work Location (in office, hybrid, remote) - This person will need to report to the office 3 days/week and be based in Cincinnati.

Required Working Hours - Normal working hours, EST.

Interview Process and When It Will Start - Likely a single-step, 2-3 hour interview with the hiring manager, with potentially a couple of other people joining. Interviews will be done in person at the BTC, preferably on Mondays and Tuesdays.

Job Description
Analyze, design, and develop enterprise data and information architecture deliverables, focusing on data as an asset for the enterprise. Understand and follow reusable standards, design patterns, guidelines, and configurations to deliver valuable data and information across the enterprise, including direct collaboration with 84.51 where needed.

Required Skills and Experience
- Experience designing, building, and deploying scalable cloud-based solution architectures
- Expertise in creating lakehouses in Azure and building a real-time streaming analytics platform leveraging Azure services
- Experience working with public cloud platforms such as AWS and Azure
- Expertise in Python and SQL
- Experience building data lakes and data warehouses to support operational intelligence and business intelligence
- Experience working with serverless computing and Azure Functions
- Experience with Cosmos DB, Azure Data Explorer, Azure Synapse Analytics, Azure Data Lake, Azure Data Factory, Azure SQL, Azure Databricks, Azure Machine Learning, or equivalent tools and technologies
- Experience with Kafka and/or other event streaming mechanisms
- Experience with monitoring/alerting techniques and APM tools
- Experience automating infrastructure provisioning, DevOps, or continuous integration/delivery
- Knowledge of containerization and container orchestration technologies
- Expertise with object-oriented programming and design patterns
- Good understanding of and/or experience writing software in one or more languages, such as Java, Go, JavaScript, C++, or similar
- Familiarity with standard IT security practices such as identity and access management, SSO, data protection, encryption, and certificate and key management
- Familiarity and good understanding of infrastructure, storage, networking, etc.
- Strong analytical problem-solving skills
- Excellent written and verbal communication skills
- Self-starter; takes the initiative and works well under pressure
- Business-minded approach to time, costs, and milestones
- Ability to lead the analysis of the technology environment to detect critical deficiencies and recommend solutions for improvement
- Must be very organized, able to balance multiple priorities, and self-motivated
- Azure certifications at the Associate and Expert levels for Data Services preferred
- Familiarity or experience with building AI/ML solutions

Key Responsibilities
- Drive digital innovation by leveraging innovative new technologies and approaches to renovate, extend, and transform the existing core data assets, including SQL-based, NoSQL-based, and cloud-based data platforms
- Drive the development and communication of enterprise standards for data domains and data solutions, focusing on simplified integration and streamlined operational and analytical uses
- Design, develop, and implement integrations based on user feedback
- Collaborate with solution architects, cloud services, enterprise data management, and product managers to iterate on the design and implementation of the Fulfillment Services data strategy
- Design Azure-based technical architectures, migration approaches, and application optimizations that enable the product
- Draft and review architectural diagrams, interface specifications, and other design documents such as Architectural Decision Records
- Focus on the product's overall data quality and user experience
- Build standards, processes, and procedures to deliver the best results
- Create data pipelines for various types of streaming use cases
- Serve as a technical advisor and troubleshooter to resolve technical challenges with data product related infrastructure and applications
- Promote the reuse of data assets, including the management of the data catalog for reference
- Mentor team members in data principles, patterns, processes, and practices
- Implement automation tools and frameworks (CI/CD pipelines)
- Troubleshoot production issues and coordinate with the development team to streamline code deployment
- Author and publish data governance models, data lineage, and a data dictionary for the Fulfillment Services Data Catalog

Thanks & Regards,
Priyabrata Pradhan
Email: [email protected]

Keywords: C++, continuous integration, continuous deployment, artificial intelligence, machine learning, database, information technology, Golang, green card
Tue Oct 29 00:45:00 UTC 2024