Senior Lead Data Engineer | Remote, USA
Email: [email protected] |
From: Zara, TechRakers ([email protected]; reply to: [email protected])
100% Remote

Job Description:

Top Skills Details
1. Senior Lead Data Engineer with professional project experience in AWS services or Snowflake-related work.
2. Subject matter expertise and hands-on technical skills in data ingestion, data processing, and data transformation.
3. Strong skills with data/database programming tools and technologies: Python, Spark, SQL, Snowflake.
4. Senior-level skills in data quality for data engineering.

Job Description
OneMain Financial (OMF) is a lending-exclusive financial company specializing in personal lending, with over 1,500 branches in 44 states. OMF's typical customers are individuals with poor credit; through personalized lending, the company offers solutions to improve customers' financial lives, covering debt consolidation, medical expenses, household bills, home improvements, and auto purchases.

Marketing Technology (MarTech) is looking to hire a Senior Data Engineer. This data-intensive group supports the Marketing Communication business team. The marketing business aims to gain and retain customers through an omni-channel (email, web, mobile, SMS, social) personalized customer experience. This includes personalized marketing and communications to onboard, engage, and expand customer relationships, grow customer lifetime value, and increase revenue retention. This role will directly support that initiative through a data modernization effort, specifically modernizing the Customer Data Platform so that OMF can increase automation and the use of data analytics throughout the customer journey.
The candidate will be responsible for identifying relevant data and utilizing data tools, technologies, and processes to develop continuous, data-driven, and automated customer communications across marketing and servicing, in support of the omni-channel personalized customer experience vision and outcomes.

Technology Initiatives:
- Implementing Snowflake (advanced data platform)
- Data ingestion, processing, and transformation
- Automated data quality checks
- Data visualization
- Data distribution
- Cloud services (AWS)

Core Responsibilities:
- Create and manage cloud resources in AWS.
- Ingest data from different data sources that expose data via different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data based on various proprietary systems.
- Implement data ingestion and processing with the help of big data technologies.
- Process and transform data using technologies such as Spark and cloud services; understand your part of the business logic and implement it in the language supported by the base data platform.
- Develop automated data quality checks to make sure the right data enters the platform and to verify the results of calculations.
- Develop infrastructure to collect, transform, combine, and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights, and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete, and flexible.
- Identify and interpret trends and patterns in complex data sets.
- Construct a framework using data visualization tools and techniques to present consolidated, actionable analytical results to relevant stakeholders.
- Be a key participant in regular Scrum ceremonies with the agile teams.
- Be proficient at developing queries, writing reports, and presenting findings.
- Mentor junior team members and bring industry best practices.

Qualifications
- 5-7+ years of experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales).
- Strong background in math, statistics, computer science, data science, or a related discipline.
- Advanced knowledge of at least one language: Java, Scala, Python, C#.
- Production experience with HDFS, YARN, Hive, Spark, Kafka, Oozie/Airflow, Amazon Web Services (AWS), Docker/Kubernetes, and Snowflake.
- Proficiency with data mining/programming tools (e.g., SAS, SQL, R, Python), database technologies (e.g., PostgreSQL, Redshift, Snowflake, and Greenplum), and data visualization tools (e.g., Tableau, Looker, MicroStrategy).
- Comfortable learning about and deploying new technologies and tools.
- Organizational skills and the ability to handle multiple projects and priorities simultaneously while meeting established deadlines.
- Good written and oral communication skills and the ability to present results to non-technical audiences.
- Knowledge of business intelligence and analytical tools, technologies, and techniques.

Familiarity and experience with the following is a plus:
- AWS certification
- Spark Streaming
- Kafka Streams / Kafka Connect
- ELK Stack
- Cassandra / MongoDB
- CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools
Mon Oct 17 23:51:00 UTC 2022