
Kranthi Reddy - Senior Data Engineer
[email protected]
Location: Marshall, Michigan, USA
Relocation: current project is ending
Visa: H-1B

Senior Data Engineer with over 10 years of professional experience managing all phases of the software development life cycle. Extensive experience with Oracle Database and expertise in SQL, Python, and PL/SQL programming. Experienced in designing data flows and building data pipelines. Quick learner, adept at working in both team and individual settings.

WORK EXPERIENCE
META (SEP 2021 - PRESENT), REMOTE (MICHIGAN), DATA ENGINEER
Own the data domain of the 'Stars' product area. 'Stars' is a monetization program in which users show appreciation for a creator's content by sending Stars to the creator, who can then convert them to cash.

Partnered with leadership, engineers, program managers, and data scientists to understand data requirements.
Support feature development by defining data needs, owning logging quality, and building end-to-end data flows, from raw logging to automated experiment metrics and dashboards.
Deploy comprehensive data quality checks to ensure high data quality (see the sketch after this list).
Optimize and maintain all domain-related data pipelines.
Own the end-to-end data engineering component of the solution.
Cover on-call shifts as needed to support the team.
Design and develop new systems in partnership with software engineers to enable quick and easy consumption of data
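For illustration only, a minimal Python sketch of the kind of data quality check described in the list above; the 'stars_sent' rows and field names are hypothetical, not Meta's internal schema or tooling:

```python
"""Minimal data-quality check sketch (illustrative only).

Hypothetical daily 'stars_sent' rows; the real checks ran inside
internal pipeline tooling that is not shown here.
"""
from datetime import date


def check_stars_rows(rows: list[dict]) -> list[str]:
    """Return a list of human-readable data-quality failures."""
    failures = []
    if not rows:
        failures.append("row count is zero for the partition")
    for i, row in enumerate(rows):
        if row.get("sender_id") is None or row.get("creator_id") is None:
            failures.append(f"row {i}: missing sender_id/creator_id")
        if not isinstance(row.get("stars_count"), int) or row["stars_count"] <= 0:
            failures.append(f"row {i}: stars_count must be a positive integer")
        if row.get("ds") != date.today().isoformat():
            failures.append(f"row {i}: unexpected partition date {row.get('ds')}")
    return failures


if __name__ == "__main__":
    sample = [
        {"sender_id": 1, "creator_id": 9, "stars_count": 50, "ds": date.today().isoformat()},
        {"sender_id": 2, "creator_id": None, "stars_count": 0, "ds": "2021-01-01"},
    ]
    for failure in check_stars_rows(sample):
        print("DQ FAILURE:", failure)
```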

ADVANTIS GLOBAL (CONTRACT WITH APPLE) (MAY 2021 - SEP 2021), DATA ENGINEER, REMOTE (MICHIGAN)
Part of the Apple Amp reporting team, focused mainly on building reports:
Collaborated with multiple teams to understand new report requirements.
Created data pipelines using Python to build new datasets.
Automated reports by creating new shell scripts and Oracle packages.
Improved the efficiency of existing data pipelines.

SENIOR DATABASE DEVELOPER
EPSILON (FULL TIME DEC 2017 - MAY 2021, CONTRACT JUN 2011 - DEC 2017)
Worked on multiple projects with responsibilities including data loading, data processing, developing campaigns, creating reports, and automation.
ETL PROCESS:
Developed logical and physical data models that capture current state/future state data elements and data flows.
Worked with SQL*Loader and External Tables to load flat files into tables.
Developed procedures and packages to move data from staging tables to production tables.
Created custom triggers to automatically populate tables and to restrict the data being inserted into them.
Created data pipelines using Python (see the sketch after this list).
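For illustration only, a minimal sketch of the staging-to-production flow described above, assuming the python-oracledb driver; the file layout, table, and package names are hypothetical placeholders rather than actual project objects:

```python
"""Sketch of a flat-file -> staging -> production load (illustrative only).

Assumes the python-oracledb driver; the file layout, table, and package
names are hypothetical placeholders, not the actual project objects.
"""
import csv

import oracledb


def load_daily_file(path: str, dsn: str, user: str, password: str) -> None:
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        cur = conn.cursor()
        # Stage the delimited file (larger volumes went through SQL*Loader
        # and external tables instead of row-by-row inserts).
        with open(path, newline="") as fh:
            rows = [(r["customer_id"], r["name"], r["email"]) for r in csv.DictReader(fh)]
        cur.execute("TRUNCATE TABLE stg_customers")
        cur.executemany(
            "INSERT INTO stg_customers (customer_id, name, email) VALUES (:1, :2, :3)",
            rows,
        )
        # Hand off to a PL/SQL package that applies business rules and
        # merges staging rows into the production table.
        cur.callproc("etl_pkg.merge_customers")
        conn.commit()


if __name__ == "__main__":
    load_daily_file("daily_customers.csv", "dbhost/ORCLPDB1", "etl_user", "secret")
```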

CONTACTS

Detroit, MI 813-319-4244
[email protected]



CORE SKILLS

# SQL  # Python  # PL/SQL
# Shell Script  # SQL*Loader  # Oracle Data Pump
# SQL*Plus  # Airflow
# UNIX/Linux

BIG DATA
#Hadoop #Hive #Presto #Spark #Pig #Kafka #Sqoop





CAMPAIGNS:
Developed PL/SQL packages, procedures, and functions that apply business logic to support the functionality of various modules.
Created processes to segment customer data and identify in-market households within the dealer's database.
Developed a process to identify the current owner of a vehicle based on business rules.
Worked on multiple campaigns such as Loyalty, Winback, and PMA.
Identified customer address changes using the NCOA and CASS processes.
Used the UTL_FILE package to export all mailing-eligible customers' information into a delimited text file (a Python analogue is sketched after this list).
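The export itself was built in PL/SQL with UTL_FILE; as a rough Python analogue of the same flow (query mailing-eligible customers, write a delimited file), assuming python-oracledb and hypothetical table and column names:

```python
"""Python analogue of the UTL_FILE-based export (illustrative only).

The original export used PL/SQL and UTL_FILE; table, column, and file
names here are hypothetical placeholders.
"""
import csv

import oracledb


def export_mailing_list(dsn: str, user: str, password: str, out_path: str) -> int:
    sql = """
        SELECT customer_id, first_name, last_name, address, city, state, zip
        FROM   customers
        WHERE  mail_eligible_flag = 'Y'
    """
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        cur = conn.cursor()
        cur.execute(sql)
        with open(out_path, "w", newline="") as fh:
            writer = csv.writer(fh, delimiter="|")
            writer.writerow([col[0] for col in cur.description])  # header row
            count = 0
            for row in cur:
                writer.writerow(row)
                count += 1
    return count


if __name__ == "__main__":
    n = export_mailing_list("dbhost/ORCLPDB1", "campaign_user", "secret", "mail_file.txt")
    print(f"exported {n} mailing-eligible customers")
```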
PERFORMANCE TUNING:
Analyzed the performance of SQL queries using EXPLAIN PLAN and SQL Trace (see the plan-review sketch after this list).
Improved query response times by creating appropriate indexes and partitioning tables.
Analyzed PL/SQL code using the PL/SQL Profiler.
Used Oracle hints to improve performance.
Gathered and analyzed table statistics for query optimization.
Rewrote complex queries to reduce run time.
Created materialized views and used BULK COLLECT in procedures.
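For illustration only, a minimal sketch of pulling an execution plan for tuning review with EXPLAIN PLAN and DBMS_XPLAN, run here through python-oracledb; the query, hint, and connection details are placeholders:

```python
"""Sketch of pulling an execution plan for tuning review (illustrative only).

Uses EXPLAIN PLAN plus DBMS_XPLAN.DISPLAY through python-oracledb; the
query, hint, and connection details are hypothetical placeholders.
"""
import oracledb


def show_plan(dsn: str, user: str, password: str, query: str) -> None:
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        cur = conn.cursor()
        # Populate PLAN_TABLE for the statement under review.
        cur.execute("EXPLAIN PLAN FOR " + query)
        # Read the formatted plan back out via DBMS_XPLAN.
        cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
        for (line,) in cur:
            print(line)


if __name__ == "__main__":
    show_plan(
        "dbhost/ORCLPDB1",
        "report_user",
        "secret",
        "SELECT /*+ INDEX(c cust_state_idx) */ * FROM customers c WHERE c.state = 'MI'",
    )
```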

REPORTING:
Created a dashboard report for GM showing communication volumes, response rates, dealer ROI metrics, key program headquarters statistics, revenue generated, services purchased, etc.
Developed a weekly COVID impact report for General Motors by analyzing data from various sources.
Developed a process to produce response rates for different campaigns.
Worked closely with the analytics team to create multiple reports and to develop various data models based on requirements.
AUTOMATION:
Developed multiple shell scripts to automate the loading of daily files into Oracle tables.
Used cron jobs and DBMS_SCHEDULER to schedule jobs (see the sketch after this list).
Automated the Quickstrike campaign process.
Developed a GUI tool in Python to run Sales and Service Match reports.
Developed a dealer enrollment application using Oracle APEX.
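For illustration only, a minimal sketch of registering a nightly load with DBMS_SCHEDULER from Python; the job, package, and schedule names are hypothetical, and an equivalent cron entry is shown in a comment:

```python
"""Sketch of registering a nightly job with DBMS_SCHEDULER (illustrative only).

Job, package, and schedule names are hypothetical placeholders; the real
jobs were scheduled with cron and DBMS_SCHEDULER as described above.
"""
import oracledb

CREATE_JOB_BLOCK = """
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'NIGHTLY_FILE_LOAD',
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'BEGIN etl_pkg.load_daily_files; END;',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY; BYHOUR=2; BYMINUTE=0',
    enabled         => TRUE
  );
END;
"""


def register_nightly_job(dsn: str, user: str, password: str) -> None:
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        conn.cursor().execute(CREATE_JOB_BLOCK)


if __name__ == "__main__":
    register_nightly_job("dbhost/ORCLPDB1", "etl_user", "secret")
    # An equivalent cron entry for a shell-based loader might look like:
    #   0 2 * * * /opt/etl/load_daily_files.sh >> /var/log/etl/load.log 2>&1
```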
APIs:
Created a Python process that calls the GM API to retrieve current promotions for each brand in JSON format, then parses the JSON and loads the data into a table (see the sketch after this list).
Created a procedure using the UTL_HTTP package to call a third-party API and parse the XML response to retrieve the after-service survey link.
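For illustration only, a minimal sketch of the promotions-feed pattern described above; the endpoint URL, JSON fields, brands, and table name are hypothetical placeholders, not the actual GM API or project schema:

```python
"""Sketch of calling a promotions API and loading the JSON into Oracle.

Illustrative only: the endpoint URL, JSON fields, brands, and table name
are hypothetical placeholders, not the actual GM API or project schema.
"""
import oracledb
import requests


def load_promotions(api_url: str, brand: str, dsn: str, user: str, password: str) -> int:
    # Call the (hypothetical) promotions endpoint for one brand.
    resp = requests.get(api_url, params={"brand": brand}, timeout=30)
    resp.raise_for_status()
    promotions = resp.json().get("promotions", [])

    rows = [
        (brand, p.get("promo_id"), p.get("description"), p.get("start_date"), p.get("end_date"))
        for p in promotions
    ]
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        cur = conn.cursor()
        if rows:
            cur.executemany(
                """INSERT INTO brand_promotions
                   (brand, promo_id, description, start_date, end_date)
                   VALUES (:1, :2, :3, TO_DATE(:4, 'YYYY-MM-DD'), TO_DATE(:5, 'YYYY-MM-DD'))""",
                rows,
            )
            conn.commit()
    return len(rows)


if __name__ == "__main__":
    for b in ("Chevrolet", "Buick", "GMC", "Cadillac"):  # example brands only
        n = load_promotions("https://api.example.com/promotions", b,
                            "dbhost/ORCLPDB1", "app_user", "secret")
        print(f"{b}: loaded {n} promotions")
```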

UNIVERSITY OF TOLEDO (AUG 2008 - MAR 2011), TOLEDO, OHIO
Created a database for storing data using SQL.
Analyzed radon data using SQL.
Developed a Computer-Aided Design (CAD) tool in MATLAB for analyzing electrical circuits.
Developed novel image-analysis algorithms for determining the physical characteristics of particulate matter by improving existing algorithms and customizing them for particle characterization.
EDUCATION
The University of Toledo, Toledo, OH
Master of Science, Electrical Engineering. GPA: 3.6/4.0.

CERTIFICATION
Oracle Database 11g: Introduction to Oracle SQL (1Z0-051)
Oracle Database 11g: Program with PL/SQL (1Z0-144)