
Naga Gopal Danduboina
Data Analyst
+1 (216) 543-1043
[email protected]
Location: Lakeville, Minnesota, USA
Relocation: Yes
Visa: H1B

www.linkedin.com/in/naga-gopal-danduboina-1b9025296
SUMMARY
Data Analyst with over 12 years of experience in coding, implementation, testing, design, analysis, maintenance, and support as a Programmer/Analyst/Engineer, with deep experience in Retail, Marketing, and Analytics. Certified Base SAS Programmer for SAS 9.
Specific Expertise:
Led projects spanning multiple teams across Marketing, Operations, Pricing and Promotion, Merchants, Product, Digital, Stores, and Data Sciences.
Performed customer data analytics, providing actionable insights and solutions to support stakeholders' business needs.
Extensive knowledge of customer and sales data for report building and segmentation.
Extensive experience creating Tableau and DOMO visualizations.
Reported on Key Performance Indicator (KPI) results and developed weekly action plans from them.
Played a key role in building 15 different capabilities used by 1,000 unique users for strategic planning of billion-dollar brands.
Participated in Agile design sprints with various teams, with the goal of answering critical business questions through design, prototyping, and testing ideas with users.
Managed a Scrum team as Product Owner across multiple sprints while migrating data from DB2 to the Teradata platform; gathered requirements and worked user stories.
Supported data needs for report building in Tableau and MicroStrategy.
Experience with various demographics and segmentations of customer data.
Experience across data areas such as Clearance, Competitor Shop Results, Promotions, and Price Change History for building the DataLab user interface (UI).
Created a Frequency Risk Model for Product Safety and Quality Assurance using Aqua Data Studio and SAS, applying the provided business rules to ensure consistent product quality and to review products for potential defects.
Experience automating processes using Oozie, crontab, and similar schedulers.
Wrote queries and extracted large volumes of data from DB2, Teradata, Oracle, and Big Data databases for ad-hoc reports using SAS, SQL, and PROC SQL.
Extensive experience with Teradata SQL Assistant, Teradata Data Labs, and Teradata Viewpoint.
Experience developing SAS procedures, macros, report formatting, data loading and exporting, and batch processing.
Experience with various methods of extracting, transforming, and loading (ETL) data into a warehouse.
Experience with encrypted files and transferring data to third-party vendors via FTP, SFTP, WinSCP, and MFT processes.
Completed hands-on Cloud Computing training (online event) focused on AWS and Google Cloud Platform using Terraform and Docker.
Excellent problem-solving skills for delivering useful, compact solutions.
EDUCATION
Cleveland State University, Cleveland, OH
Master of Science in Engineering, Dec 2010
TECHNICAL SKILLS
Databases: DB2, Teradata, Big Data/Hive, SQL Server, MySQL, Presto, Oracle.
Programming Languages: SQL, Unix Shell Scripting, HQL, Oozie, Spark SQL, PySpark, Scala, Python 3.
Statistical Software: SAS/BASE, SAS/MACROS, SAS/Enterprise Guide, SAS/GRAPH, SAS/SQL, SAS/ACCESS, SAS/ETL.
Tools: Teradata SQL Assistant, Hadoop/Hue, DOMO, Tableau, MicroStrategy, Aqua Data Studio, Smartsheet, PuTTY, crontab, WinSCP, GitHub, Atom, Cyberduck, Jira, DataMiner, Terminal, Microsoft Excel, PowerPoint, Access, and Word.
PROFESSIONAL EXPERIENCE
Target Corporation, Minneapolis, MN
Lead Data Analyst/Engineer, 06/2020 to 12/2023
Project: Guest Analytics Big Data Migration
The purpose of the Guest Analytics team is to establish operating models and appropriate resourcing to support Guest Insights, and to streamline and simplify Guest reporting in alignment with Guest Partner teams. Responsibilities included:
Built and scaled Guest Insights as a one-stop shop for all Guest data needs; researched data assets, applied various technologies to them, and developed metrics, reports, and statistics.
Sourced and provided the highest-quality data for the business domain through BI&A tools such as DataMiner, Honeycomb, Automation Portal, and DOMO.
Played an active role with Big Data Engineers and Product Owners to ensure critical data was available and reliable on the proper platforms, with clear SLAs.
Performed data profiling of source data to identify data-quality issues and anomalies, surface business knowledge embedded in the data, and gather natural keys and metadata (see the sketch after this list).
Built key capabilities such as Guest Drivers, Walker, Repeat Guest, Brand Loyalty, and Guest Overlap from scratch and upgraded them based on user feedback.
Maintained 100% of the Guest Analytics engineering needs, access requirements, data feeds, and DOMO reports.
Developed and maintained visualizations and ad-hoc analyses for vendors, and educated business users on decision-making with data, reports, and dashboards.
Documented methodologies, best practices, and Python queries on GitHub for peer review and reuse.
Onboarded junior Data Analysts, training them on the data, tools, and business rules.
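The data-profiling work above can be illustrated with a minimal SAS sketch; the work.guest_txn table and its columns (guest_id, txn_amt, channel, store_format) are hypothetical placeholders, not actual data assets.

    /* Completeness and key checks on the assumed guest_txn table:
       row counts, distinct keys, missing IDs, out-of-range amounts. */
    proc sql;
      select count(*)                 as total_rows,
             count(distinct guest_id) as distinct_guests,
             sum(missing(guest_id))   as missing_guest_ids,
             sum(txn_amt < 0)         as negative_amounts
      from work.guest_txn;
    quit;

    /* Frequency tables surface unexpected or miscoded category values. */
    proc freq data=work.guest_txn;
      tables channel store_format / missing;
    run;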
Business Analyst, 11/2016 to 05/2020
Project: Data Science and Analytics Guest IQ DOMO Dashboards
The purpose of the Guest IQ team is to identify and learn guest and basket behaviors, design guest-centric strategies to drive category growth, and design solutions to prominent questions from Strategy Planning and Category Planning business insight leaders. As part of the migration from Teradata to Big Data, I was tasked with re-creating many MicroStrategy reports in DOMO. My responsibilities included, but were not limited to:
Supported data needs for report building in DOMO and Greenfield.
Performed data trending analysis and reached solutions through optimization.
Automated actions to run on daily, weekly, monthly, and quarterly schedules to support critical reports such as Sales Driver and Cherry Picking for the Enterprise Data Analytics and Business Intelligence teams.
Created and maintained databases, tables, and views in the Hive/Big Data environment for use by various teams (see the sketch after this list).
Built visualizations and published DOMO dashboards; created various DataFusions and Dataflows using Amazon Redshift and Magic ETL.
Performed admin services for various DOMO pages.
Worked with Buyers and Category Managers to resolve data-quality issues in DOMO cards and modified cards as needed for better understanding.
Performed data mapping and gap analysis of production data for the Teradata-to-Big Data database migration.
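One way to maintain Hive objects from SAS is explicit SQL pass-through, sketched below; this assumes SAS/ACCESS Interface to Hadoop is available, and the server, schema, and object names are invented for illustration.

    /* Illustrative pass-through from SAS to Hive; all names are placeholders. */
    proc sql;
      connect to hadoop (server="hive-gateway.example.com" schema=analytics);
      execute (
        create view if not exists analytics.weekly_sales_v as
        select sls_wk, sum(sale_amt) as total_sales
        from analytics.daily_sales
        group by sls_wk
      ) by hadoop;
      disconnect from hadoop;
    quit;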
SAS Analyst, 01/2013 to 10/2016
Project: Guest IQ/Tableau/Microstrategy
The objective was to support data needs for the Guest IQ landing page, visualizing data in both Tableau and MicroStrategy. My responsibilities included:
Worked extensively in SAS to extract needed data from production tables and upload it to Teradata Data Labs for MicroStrategy reports (see the sketch after this list).
Used Teradata SQL Assistant to run EXPLAIN on queries to understand runtimes, performance, and complexity.
Used Teradata Viewpoint to monitor queries and track database traffic.
Applied load and export utilities such as Teradata FastLoad, MultiLoad, and FastExport.
Automated SAS jobs to run on daily, weekly, monthly, and quarterly schedules to support data needs in Tableau.
Built visualizations and dashboards in Tableau.
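A hedged sketch of the SAS-to-Data Labs upload pattern referenced above; the server, lab database, table names, and the &td_user/&td_pass macro variables are placeholders, not real values.

    /* Assign a libref to a personal Teradata Data Lab (names assumed). */
    libname dlab teradata server="tdprod" database=p_guest_lab
            user=&td_user password="&td_pass";

    /* Bulk-load a prepared SAS dataset via the FastLoad utility. */
    data dlab.guest_iq_weekly (fastload=yes);
      set work.guest_iq_weekly;   /* dataset built earlier in the job */
    run;

    libname dlab clear;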
SAS Programmer, 07/2011 to 12/2012
Project: ADW Guest Innovation/Vendor Requests
My job was to assist with Analytical Data Warehouse (ADW) and Insight Center report testing, understand the database, own bulk data transfers to vendor partners, work with engineering teams to address data-quality concerns in production data, document best practices for data transfers, and help prototype MicroStrategy reports. My responsibilities included:
Extracted data from flat files, Excel spreadsheets, and external RDBMS tables using the LIBNAME and SQL pass-through facilities in SAS.
Manipulated data with large record counts using SAS/BASE, SAS/SQL, SAS/ACCESS, and PROC SQL.
Wrote SAS macros extensively to reduce code redundancy and automate SAS programs (see the sketch after this list).
Created a new method for users to insert their updated Multiple Item Codes and Multiple Store Codes using MS Access and SAS.
Encrypted data with PGP in UNIX for secure transfer between servers using various file transfer protocols.
Used crontab in UNIX to maintain and automate SAS jobs.
Worked with End-On-Hand (EOH) and Out-of-Stock (OOS) metrics for inventory data.
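A minimal sketch of the macro-driven extraction pattern described above, assuming a DB2 libref with hypothetical connection values, table names, and columns; the crontab entry in the comment is likewise only an example.

    /* Jobs like this were scheduled from UNIX crontab, e.g. a
       hypothetical entry: 0 6 * * 1 /opt/sas/sas /jobs/pull_tables.sas */

    libname db2lib db2 database=PRODDB user=&db2_user password="&db2_pass";

    %macro pull_table(table=, keep=);
      /* One reusable step replaces a copy-pasted DATA step per table. */
      data work.&table;
        set db2lib.&table (keep=&keep);
      run;
    %mend pull_table;

    %pull_table(table=item_sales,  keep=item_cd store_cd sale_amt);
    %pull_table(table=store_codes, keep=store_cd region_cd);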