Looking for Data Engineer | No H1, No OPT/CPT | Remote, USA
Email: [email protected]
From: Nitu, RCI [email protected]
Reply to: [email protected]
Role: Data Engineer
Duration: 6+ months
Location: Remote

Must have: SAP experience; 5-8 years of data engineering, including 3+ years of recent Azure data engineering (Databricks, Data Factory, Data Lakes, Pipelines).

Job Description:
The Full Stack Data Engineer will design, develop, and deliver reliable, scalable data management solutions (acquisition, integration, transformation) using DevOps methodologies and Microsoft Azure PaaS platforms, and will design and enable Finance Analytics solutions and services. This role is central to the Finance Capability Team's competitive advantage: delivering business outcomes through actionable analytics and decision intelligence, helping our businesses realize value, and shaping how we deliver analytics solutions going forward.

The Data Engineer is a multi-skilled engineer who helps estimate work, accepts stories into delivery increments, and completes tasks to deliver the work. The engineer is proficient in source data analysis, profiling, integration, and modeling, and has expertise in data management technologies such as Azure Data Factory, ADLS Gen2, Azure Databricks, Azure SQL DW, Azure AAS, and SAP Data Services; in programming languages such as Python, PySpark, Spark SQL, Scala, and SQL; and in open-source data management tools or equivalents.

Responsibilities:
- Designs, codes, and tests new data management solutions, including supporting applications and interfaces.
- Architects data structures to provision and enable Financial Analytics solutions.
- Works with DA&I and other capability teams to support cross-functional development on various DA&I and Connected Enterprise projects, for internal and external customers.
- Proactively monitors industry trends and identifies opportunities to implement new technologies.
- Manages the DevOps pipeline deployment model.
- Implements software in all environments.
- Leverages containerization models and works with other engineers and architects to keep the architecture current.
- Assists in the support and enhancement of applications.
- Writes high-quality code compliant with development guidelines and regulations.
- Collaborates with business systems analysts and product owners to define requirements.

Requirements:
- Bachelor's degree in computer science, software engineering, management information systems, or a related field.
- Experience with the systems development lifecycle.
- Experience with data management concepts and implementations.
- Experience with Agile development methodologies and system/process documentation.
- Experience with server-side architectures and containerization.
- Experience with Azure Data Factory, ADLS Gen2, Azure SQL DW, Azure AAS, Tabular models, and SAP Data Services.
- Proficiency in Python, PySpark, and Spark SQL.
- Proficiency in using notebooks with multi-language components.
- Working knowledge of multi-dimensional modeling with fact and dimension tables.
- High-level understanding of SAP extractors and tables and their usage is desirable.
- Working knowledge of Scala and Microsoft Synapse Analytics is an added advantage.
- Familiarity with business concepts and the impact of data on business processes.
- Experience managing multiple projects simultaneously.
- Excellent interpersonal, verbal, and written communication skills.
- Ability to adapt quickly to new technologies and changing business requirements.
- Solid problem-solving skills, attention to detail, and critical thinking abilities.
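As an illustration of the fact/dimension (star schema) modeling the requirements mention, here is a minimal sketch in plain Python. All table names, keys, and values are hypothetical, invented for illustration; in practice this kind of join-and-aggregate would run in PySpark or Spark SQL on Databricks.

```python
# Hypothetical star schema: a fact table of sales rows keyed into a
# dimension table of products. Names and values are illustrative only.

# Dimension table: product attributes, keyed by a surrogate key.
dim_product = {
    1: {"name": "Widget", "category": "Hardware"},
    2: {"name": "License", "category": "Software"},
}

# Fact table: one row per transaction, holding measures plus dimension keys.
fact_sales = [
    {"product_key": 1, "amount": 120.0},
    {"product_key": 2, "amount": 300.0},
    {"product_key": 1, "amount": 80.0},
]

def revenue_by_category(facts, dim):
    """Join facts to the dimension and aggregate a measure by an attribute."""
    totals = {}
    for row in facts:
        category = dim[row["product_key"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["amount"]
    return totals

print(revenue_by_category(fact_sales, dim_product))
# {'Hardware': 200.0, 'Software': 300.0}
```

The design point is the separation of concerns: the fact table stays narrow (keys and measures), while descriptive attributes live once in the dimension and are joined in at query time.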
Posted: Wed Jan 25 17:23:00 UTC 2023