
Suneeth Molugu - Sr.Data Warehouse Architect
[email protected]
Location: Alpharetta, Georgia, USA
Relocation:
Visa: GC


SUMMARY
14+ years of experience in Data Analytics, designing and developing end-to-end BI and data solutions following standard OLAP and ETL methodologies, including data engineering for structured and unstructured data using Azure Databricks on the PySpark framework, AWS Glue, Redshift, Snowflake, Qlik Sense, QlikView, the Informatica suite, Oracle Data Integrator (ODI), Oracle Business Intelligence (OBIEE), Tableau Server, MS Power BI, graph databases, and open-source tools, with 50+ full life cycle implementations and an outstanding record of accomplishment as an Architect, Lead, Developer, and Administrator.
Expertise in implementing end-to-end BI solutions for the Federal, State Human Services, Human Resources, Telecom, Health Care, Retail & Manufacturing, Financial, and Engineering sectors across various business processes.
Experience in multiple full life cycle (SDLC) implementations using Waterfall and Agile methodologies with Oracle Business Intelligence, Tableau, Power BI, Informatica, ODI, and open-source tools such as Python and Scala, covering requirement gathering, design, development, testing, and code migration in Linux and Windows environments, on premise and in the cloud.
Worked with cutting-edge data engineering technologies including Databricks, AWS Glue, and custom-developed tools to ingest, process/transform, and load near-real-time data of up to 1 billion records a day into data lakes for downstream analytical reporting solutions.
Strong experience in Extraction, Transformation, and Loading (ETL) of data from various sources into target Data Warehouses and Data Marts using Informatica PowerCenter 10.x/9.x/7.x/6.x.
Led multiple infrastructure builds and commissioned and configured various data warehouse architectural setup - both on premise and cloud implementation of software applications on Windows/Linux platforms.
Extensive experience in design, development, testing and production support for Data Warehouse Applications. Well versed in data warehousing concepts, dimensional modeling, designing Star and Snowflake schemas.
Well versed with Python programming language using Object Oriented Programming, classes, methods and various data analysis libraries like NumPy, Pandas, SciPy, Matplotlib, Seaborn, Stats models, Plotly.
Extensive experience in data modeling techniques, data analysis and data mining to produce highly effective analytical results sets, POCs and insights using ETL methodologies, SQLs
Expertise in implementing Oracle BI Apps in all phases for implementing Out of the Box prebuilt mappings, Designing ETL, metadata management using DAC.
Designed and developed Extract, Transform, Load (ETL) and data integration solutions for complex functional and technical requirements using Informatica PowerCenter 10.x.
Extensive experience using a wide variety of source systems and formats (flat files, Oracle, MS SQL Server, Vertica, JSON, and XML) for data manipulation and analysis, producing business-oriented measures and results.
Extensive experience in creating, formulating and refining raw data into user friendly data sets and visualizations, and analytical solutions using custom made ETLs.
Creative and aggressive self-starter with integrative thinking skills, capable of forming and maintaining positive and productive working relationships in internal, external, independent, and team environments
EDUCATION
Master of Science in Mechanical Engineering, Villanova University December 2008
Specialization: Systems Engineering, Design & Analysis
Bachelor of Technology in Industrial Engineering, JNTU May 2006
Major: Industrial Engineering, Production Planning & Manufacturing

TECHNICAL SKILLS

Cloud Technologies: AWS: Amazon Elasticsearch Service, AWS Lambda, AWS OpsWorks Stacks, Elastic Load Balancing, VM Import/Export, Amazon Kinesis Analytics, Amazon Kinesis Firehose, Amazon Kinesis Streams, Amazon Machine Learning, Amazon QuickSight, Amazon Redshift, Amazon RDS, Amazon SimpleDB; Azure: Databricks, Data Lake, Delta Lake, Synapse, with PySpark and SQL frameworks
Messaging and brokers: RabbitMQ, Apache Kafka, SQS, Kinesis
Big Data: HDFS, Apache Flume, Apache Spark, Hive, Pig, Databricks
Architectures: Event-driven and message-driven architectures
BI: Qlik Sense, QlikView, Microsoft Power BI, OBIEE 12c/11g (Administration, Analytics, Mobile App Designer, Interactive Dashboards, Disconnected Analytics, BI Delivers, Web Catalog Administration, Analytics Server Administration), Oracle BI Publisher Reporting, Tableau Server & Tableau Desktop 9.0/9.2/9.3, Tableau 10.1/10.2/10.3.16/10.5/2018.3/2019.3/2020.2/2020.4/2021.1 (Dashboards, Analytics, and Mobile Analytics), DAC, OBI Apps 7.9.6.4
ETL: Informatica PowerCenter 10.x, SQL*Loader, DAC 7.9.6.2/7.9.6.1, ODI 10.1.3.5.0, Tableau Prep
Databases: Oracle 12c/11g/10g/9i/8i, Neo4j (graph database), MySQL, Amazon Redshift, InfluxDB, MariaDB, MongoDB, Cassandra, MS SQL Server 2000/2005/2008, DB2, MS Access, Teradata
OS: Linux (Red Hat x86-64), UNIX (AIX/Solaris), Windows Server 2008/2003
Tools & Utilities: TOAD 11.x, SQL Developer, MS Active Directory setup, SQL*Plus, SPUFI, QMF, File-AID, ChangeMan, MS Office, Erwin, IBM Utilities, SDF II, PuTTY, Xming, SecureCRT 6.2
Programming Languages: Python, C, C++, COBOL, JCL, Easytrieve, SQL, PL/SQL

PROFESSIONAL EXPERIENCE:
Sr. Data Analytics Lead
Novelis, Dec 2022 - Present

Responsibilities:

Designed and developed a complete end-to-end analytics solution for the HR Technology domains (Talent Acquisition, Core HR products, Learning Platform, Finance, and Payroll systems) using Azure Databricks, Power BI, and OBIEE on structured and unstructured data
Designed and developed data models and source-to-target mappings (STTM), and designed star and snowflake schemas for multiple data warehouse systems using Erwin
Designed and developed data pipelines for structured and unstructured data using Azure Databricks Delta Lake, PySpark, and the Spark SQL framework for multiple business domains at Novelis
Worked as a Solution Architect to design Security Architecture, Secure Access roles, Data Encryption, and Masking, Data Governance, Data Quality Framework, Data Catalog for the HR Technology Analytics Solution.
Designed the system in compliance with GDPR, LGPD, HIPAA, and PII rules for the entire Analytics Platform
Worked with business, legal, and compliance teams on the OneTrust questionnaire required to proceed with work on highly sensitive data
Extensively worked on HR Technologies Business functions NPS reporting, Talent Acquisition, Headcount, Attrition, Survey Data, Finance and Payroll Systems using Power BI and Tableau.
Worked extensively in Agile Methodology used Jira for creating stories, epics and spikes. Participated actively in Sprint planning, Retro and Quarterly planning for the Data Analytics Projects
Designed and developed Power BI reports and dashboards for compelling storytelling of KPIs and scorecards using a wide variety of visualization techniques; created DAX formulas, calculated columns, and measures, and built visualizations such as donut charts, bar graphs, and trending line graphs using slicers
Implemented Row Level security in Power BI, which provided data access for over 100 users
Created and Managed Power BI Workspace, setup Gateway connections, designed and developed datasets
Worked on data science projects to develop a system network, child-parent-caseworker-collateral network, chance of payment use cases implementation on Python, ODI and Oracle databases.
Worked on designing solutions to predict/formulate the dataset based on the existing datasets using Python, Informatica PWC and presented the solutions using visualization tools Power BI, Tableau and OBIEE
Worked with multiple teams local and International to provide solutioning globally on Novelis Products
Designed and developed Analytics Maturity Model, for modernizing HR Technology and lead Advanced Analytics solutioning
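The Delta Lake pipelines above rely on merge (upsert) semantics when landing changed records into curated tables. A minimal pure-Python sketch of that pattern is below; in Databricks this would be a single `MERGE INTO` against a Delta table via Spark SQL or `DeltaTable.merge`, and the table name, key, and rows here are hypothetical.

```python
# Pure-Python illustration of Delta Lake MERGE (upsert) semantics:
# match incoming rows to the target on a business key, update matches,
# insert the rest. All data below is made up for illustration.

def merge_upsert(target, updates, key):
    """Upsert `updates` into `target`, matching rows on `key`."""
    by_key = {row[key]: dict(row) for row in target}
    for row in updates:
        if row[key] in by_key:
            by_key[row[key]].update(row)   # WHEN MATCHED THEN UPDATE
        else:
            by_key[row[key]] = dict(row)   # WHEN NOT MATCHED THEN INSERT
    return list(by_key.values())

employees = [{"emp_id": 1, "dept": "HR"}, {"emp_id": 2, "dept": "Finance"}]
changes   = [{"emp_id": 2, "dept": "Payroll"}, {"emp_id": 3, "dept": "Legal"}]
result = merge_upsert(employees, changes, "emp_id")
```

The same keyed-update-or-insert logic is what the Spark SQL `MERGE INTO ... WHEN MATCHED ... WHEN NOT MATCHED` statement expresses declaratively at scale.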


Environment: Azure Databricks, Power BI, OBIEE 12c/11.1.1.9.x, ODI 12c, Jira, Erwin, GitHub, Smartsheet, MS Office Suite, R, Python, Scala, JavaScript

Sr. Data Analytics Consultant
Verizon Wireless, Mar 2021 - Dec 2022
Responsibilities:

Worked on design and development of complete End To End solution on the AWS, custom Data Engineering tools for building Verizon Smart Family, Video business Analytics in the Data Engineering and Analytics Team.
Worked with cutting-edge data engineering technologies including AWS Glue, custom-developed tools such as Snapshoter and SALO, and ingested Kafka topics to ingest, process/transform, and load near-real-time data of up to 1 billion records a day into the data lakes for downstream analytical reporting solutions.
Designed and developed over 200 Informatica workflows & worklets and mappings for wide variety of business needs to extract, transform and load data with multiple strategies and business logic to create data sets for Data Analytics projects
Designed and developed ELT jobs and loaded data into Snowflake. Created Snowflake Stages with S3 storage containers for data access. Used Copy, Merge commands to insert data and maintain state in Snowflake
Designed and developed fact table and dimension table mappings in Informatica 10.x using various transformations including Filter, Lookup, Router, Union, and Expression transformations, which feed data into Power BI, Tableau Desktop, and OBIEE
Worked on developing analytical dashboards for Stream box Analytics and Smart Family Analytics using Qlik Sense and Tableau Server editions
Created data modeling, Source To Target Mapping (STTM) for data models, design and development of the workflows
Worked on architecture, sizing, installation and implementation of Data Warehouse Application Implementation for Informatica 10.x
Design, Develop and Implement autosys jobs to trigger UNIX Shell scripts for importing the data from source system and bringing data into HDFS through AWS S3 storage.
Involved in the analysis and functional requirements gathering process by meeting up with the Business users & Business analysts ensuring needs and delivery feasibility.
Worked on setting up DAC with Informatica mappings to automate the workflow process for various subject areas; completed materialized view analysis and development and optimized the source SQL.
Extensive experience in creating, formulating and refining raw data into user friendly data sets and visualizations, and analytical solutions using custom made ETLs.
Worked on migrating the on-premise Informatica and OBIEE tools to AWS using DMS and Transfer Family
Worked on installations, patching, maintenance of Informatica 10.x in DEV, SIT and Prod Environments in the Redhat Linux OS
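The Snowflake ELT pattern described above (an S3-backed stage, a bulk COPY load, then a MERGE to maintain state in the target) can be sketched as SQL generated from Python. The stage, table, and column names below are illustrative placeholders, not the actual Verizon objects.

```python
# Hedged sketch: build the COPY INTO and MERGE statements used in a typical
# stage -> load -> reconcile Snowflake flow. Object names are hypothetical.

def copy_statement(table, stage, file_format="(TYPE = PARQUET)"):
    """Bulk-load staged files into a table."""
    return f"COPY INTO {table} FROM @{stage} FILE_FORMAT = {file_format};"

def merge_statement(target, source, key, cols):
    """Upsert from a staging table into the target, matched on `key`."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols)
    col_list = ", ".join([key] + cols)
    src_list = ", ".join(f"s.{c}" for c in [key] + cols)
    return (
        f"MERGE INTO {target} t USING {source} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({src_list});"
    )

copy_sql = copy_statement("STG_FAMILY_EVENTS", "S3_EVENTS_STAGE")
merge_sql = merge_statement("FAMILY_EVENTS", "STG_FAMILY_EVENTS",
                            "EVENT_ID", ["STATUS", "TS"])
```

In practice these statements would be executed through the Snowflake connector or an orchestration tool; the point of the sketch is the two-step COPY-then-MERGE state maintenance.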

Environment: AWS Glue, Redshift, S3, Snowflake, Qlik Sense, Tableau 10.1/10.2/10.3.16/10.5/2018.3/2019.3/2020.2/2020.4/2021.1, Tableau Server, OBIEE 12c/11.1.1.9.x, Redhat Linux 6, PuTTY, SecureCRT 6.2, TOAD 11.x, Python, R, JavaScript, Shell Script, WinSCP, Microsoft Visio, Xming tools, Windows 7/XP

Sr. Data Analytics Consultant
Wabtec, Dec 2021 - Sep 2022
Responsibilities:

Worked on design and development of complete End To End solution on the custom OBIA Data Warehouse for Wabtec Corporation
Designed and Developed OBIA Finance system to calculate AP, AR, COGS, Inventory Analysis using OBIA, Informatica and visualizations on Qlik Dashboards
Worked as Lead Data Engineer on Analysis, design and development of various features on Analytics reports and Dashboard for Wabtec
Led team of offshore developers in solutioning the AP product for Wabtec
Designed and developed Qlikview Dashboards, created Apps for multiple requirements like Hierarchy, Product roll ups, COGS etc
Created data modeling, Source To Target Mapping (STTM) for data models, design and development of the workflows
Worked on custom solutioning for OBIEE to support the users on AP Dashboards
Involved in the analysis and functional requirements gathering process by meeting up with the Business users & Business analysts ensuring needs and delivery feasibility.
Designed and developed fact table and dimension table mappings in Informatica 10.x/9.6.x using various transformations including Filter, Lookup, Router, Union, and Expression transformations
Developed metrics, attributes, filters, reports, dashboards, visualizations and complex calculations to manipulate the data.
Created and scheduled full and incremental extract based on requirement.
Implemented row level security in Tableau, which provided data for over 500 users
Worked on wide variety of source systems Flatfiles, Oracle, MS Server, Vertica, Json, and XML formats for data manipulation, analysis and produced business oriented measures and results.
Extensive experience in creating, formulating and refining raw data into user friendly data sets and visualizations, and analytical solutions using custom made ETLs.

Environment: Qlik Sense, OBIEE 12c/11.1.1.9.x, OBIA, Redhat Linux 6, PuTTY, SecureCRT 6.2, TOAD 11.x, Python, R, JavaScript, Shell Script, WinSCP, Microsoft Visio, Xming tools, Windows 7/XP


Sr. Data Warehouse Developer/Consultant
CTIS, June 2021 - Jan 2022

Responsibilities:

Worked on design and development of complete End To End solution on the custom CTIS Environment by developing Core Data Repository (CDR) for CTEP Project
Designed and Developed Data Warehouse Solution for CDR on CTEP Project to provide solution on Adverse Events, and Adverse Events Reporting Systems (CTEP AERS)
Designed and developed Data Model for CDR to create Data Warehouse using custom SQL Packages, and worked on the CTIS custom software solution.
Worked as Lead Data Engineer on Analysis, design and development of various features on Analytics reports and Dashboard for CTIS
Created POC on various CDR Projects to socialize the Data Warehouse solution for business.
Extensive experience in creating, formulating and refining raw data into user friendly data sets and visualizations, and analytical solutions using custom made ETLs.

Environment: Oracle 11g/10g, Oracle PL/SQL, Jira, Git, Redhat Linux 6, PuTTY, SecureCRT 6.2, TOAD 11.x, Python, R, JavaScript, Shell Script, WinSCP, Microsoft Visio, Xming tools, Windows 7/XP

Sr. Data Warehouse Developer/Consultant
Department of Human Services, State of Georgia, Jan 2013 - Nov 2021

Responsibilities:

Worked on the complete SDLC and Agile implementation of the application using OBIEE 12c/11g, Tableau Server editions, Informatica 10.x, MS Power BI, Oracle 12c, AWS, and Kafka.
Involved in the analysis and functional requirements gathering process by meeting up with the Business users & Business analysts ensuring needs and delivery feasibility.
Worked on architecture, sizing, installation and implementation of Data Warehouse Application Implementation for OBIEE 11.1.1.9.x/OBIEE 12.2.1.4 including Oracle HTTP Server 11g/12c
Led the team in implementing Tableau Server and Desktop Version on the Windows Server, Setup Mobile Analytics for Tableau Server, created a new site and published the code to production. Integrated the code to portal for SSO and user access.
Set up wildcard URLs and network security for the Tableau Server and OBIEE applications
Worked on the end to end upgrade and installation of Informatica 10.5.1 and DAC for Data Warehouse solution
Setup SSO, SAML Authentication for the application by integrating BI Publisher with Analytics on the multiple Environments.
Worked on deploying setting up high availability, multi thread environments, performance tuning, metadata development and front end development on OBIEE projects
Worked on data science projects to develop a system network, child-parent-caseworker-collateral network, chance of payment use cases implementation on Python, Informatica and Oracle databases.
Worked on designing solutions to predict/formulate the dataset based on the existing datasets using Python, Informatica PWC and presented the solutions using visualization tools Power BI, Tableau and OBIEE
Designed and developed over 200 Informatica workflows & worklets and mappings for wide variety of business needs to extract, transform and load data with multiple strategies and business logic to create data sets for Data Analytics projects
Designed and developed fact table and dimension table mappings in Informatica 10.x/9.6.x using various transformations including Filter, Lookup, Router, Union, and Expression transformations, which feed data into Tableau Desktop and OBIEE
Created different views using Tableau Desktop like doughnut chart, bar graphs and trending line graphs using filters and actions.
Combined visualizations into Interactive Tableau Dashboards and published them to the Tableau Server.
Developed actions, parameters, Filter (Local, Global) and calculated sets for preparing dashboards and worksheets using Tableau Desktop.
Developed metrics, attributes, filters, reports, dashboards, visualizations and complex calculations to manipulate the data.
Used multiple Measures while creating Individual Axes and Dual Axes.
Scheduled data refresh on Tableau Server for daily and monthly increments to ensure that the views and dashboards were displaying the changed data accurately.
Created and scheduled full and incremental extract based on requirement.
Worked with single-table as well as multi-table data sources in Tableau Desktop.
Implemented row level security in Tableau, which provided data for over 500 users
Worked on wide variety of source systems Flatfiles, Oracle, MS Server, Vertica, Json, and XML formats for data manipulation, analysis and produced business oriented measures and results.
Demonstrate in-depth understanding of Data Warehousing (DWH) and ETL concepts, ETL loading strategy, Data archiving, Data reconciliation, ETL Error handling, Error logging mechanism, standards and best practices
Extensively used pmcmd commands at the command prompt and executed Unix shell scripts to automate workflows and populate parameter files.
Worked extensively to develop facts and dimensions, NoSQL data systems, and graph databases from raw data to help business leaders make informed decisions.
Extensively worked on HR Analytics, including the HRIS module turnover analysis, employee demographics Analytics, using Informatica, Power BI and OBIEE.
Worked on complete Infrastructure setup from the Server sizing to Application Installation, Security, Oracle Fusion Middleware Web Servers, network and performance tuning of the Data Warehouse
Used Amazon Kinesis as a platform for streaming data on AWS.
Migrated the on-premise database structure to the Amazon Redshift data warehouse
Design, Develop and Implement autosys jobs to trigger UNIX Shell scripts for importing the data from source system and bringing data into HDFS through AWS S3 storage.
Worked on setting up DAC with Informatica mappings to automate the workflow process for various subject areas; completed materialized view analysis and development and optimized the source SQL.
Extensive experience in creating, formulating and refining raw data into user friendly data sets and visualizations, and analytical solutions using custom made ETLs.
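The pmcmd-based workflow automation above is typically driven from shell scripts; a thin Python wrapper that assembles the standard `pmcmd startworkflow` invocation is sketched below. The service, domain, folder, and workflow names are placeholders, and credentials would normally come from a secured parameter file rather than literals.

```python
# Hedged sketch: build a pmcmd startworkflow command line for an Informatica
# workflow. Names and credentials below are hypothetical examples.

def pmcmd_startworkflow(service, domain, user, pwd, folder, workflow, wait=True):
    """Return the pmcmd argv list; run it with subprocess on the Informatica host."""
    cmd = ["pmcmd", "startworkflow",
           "-sv", service,      # Integration Service name
           "-d", domain,        # Informatica domain
           "-u", user, "-p", pwd,
           "-f", folder]        # repository folder
    if wait:
        cmd.append("-wait")     # block until the workflow finishes
    cmd.append(workflow)
    return cmd

cmd = pmcmd_startworkflow("INT_SVC", "DOM_DEV", "etl_user", "***",
                          "DHS_DW", "wf_LOAD_CASE_FACTS")
# On the server: subprocess.run(cmd, check=True)
```

Returning the argv list (rather than a shell string) avoids quoting problems when workflow or folder names contain spaces.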

Environment: Tableau 10.1/10.2/10.3.16/10.5/2018.3/2019.3/2020.2/2020.4/2021.1, Tableau Server, OBIEE 12c/11.1.1.9.x, Oracle BI Publisher, AWS Redshift, Kafka, Power BI, Oracle 11g/10g, Informatica 10.x/9.x, Redhat Linux 6, PuTTY, SecureCRT 6.2, TOAD 11.x, Python, R, JavaScript, Shell Script, WinSCP, Microsoft Visio, Xming tools, Windows 7/XP



Systems Analyst Nov 2011-Dec 2012
Syniverse Technologies, FL/ 3A Soft, Inc. Piscataway, NJ

Worked on different projects from requirement gathering to prepare the BRD, LOE, Design, Development, Testing, Release and Production support.
Involved in creating POCs (proofs of concept) for the repositories and business functionalities with the architect team and business users
Involved in creating major Dashboards for the clients within timeline specifications such as NRSC, OPTIX, CSAT.
Worked with Onsite and Offshore and managed a team of 4-6 members. Conducted Project planning and Work estimations
Imported tables, added/changed physical joins/keys, created complex joins/keys, set levels to support drill-down and aggregation/sum over columns, and created the Presentation layer according to the client's naming conventions.
Created Data/row level security to manage complex business requirements as users requested to see separate data.
Created Presentation variables and guided Navigation reports to match complex business requirements. Used Session, Repository and Global variables in the reports.
Worked with OBIEE Answers, Intelligence Dashboards. Involved in creation of Prompts, Views, Aggregates, Pivot tables, Measures, Customization, Conditional formatting of reports to satisfy the end users.
Created and maintained catalog of future needs such as data integrity or reconcilement issues in existing reports, new data source requirements and business description metadata issues that are discovered through the conversion process.
Configured Intelligence dashboards by adding included and embedded content such as links/images, HTML objects and folders.
Reviewed Functional Design Document and Technical Design Document.
Created Database Fragmentation & Partition in databases and metadata, designed and customized Star and Snowflake schemas by extending Fact and Dimension tables.
Trained end users on the Navigation of Reports and creating supporting documents for the same.
Performed Unit testing, creation of Test cases for the UAT and also involved in the Post production support.

Fidelity, Providence, RI, Jan 2010 - Oct 2011

Sr. OBIEE Developer
Fidelity is a world-leading financial services company, advising clients in all aspects of Investments around the clock. The trading sector project involved Profit/loss, forecasts, and is a critical component of the strategic architecture under the Reporting System.

Responsibilities:
Involved in the analysis and functional requirements gathering process by meeting up with the Business users & Business analysts ensuring needs and delivery feasibility.
Experience in Preparing Design Specification documents based on functional requirements and also involved in the preparation of Technical Design Documents
Designed, implemented the Metadata, Reports and Dashboards for the Views and Disclosures, DART and Inventory subject areas using the OBIEE tool.
Created the technical design documents and BOW documents. Good working knowledge of Trader, Broker, and Mutual fund tables.
Developed on demand, canned Reports (70+) and Dashboards (10) for the various subject areas like Inventory, Views and Disclosures and Independent Price Verification.
Used union Reports, Drilldown, Presentation variables and Prompts in the creation of answers and dashboard reports.
Created time series calculations (for example, percent change in a measure compared to same period in previous year).
Created application roles in the Enterprise Manager and added users to their respective roles.
Implemented different caching techniques to improve the performance.
Developed a Business Intelligence (BI) environment for the client product and delivered business information to end users via web-based tools (OBIEE) in the form of standard reports, dashboards/KPI scorecards, and ad hoc query capability.
Performed Unit Testing and Functional Testing for the various reports and dashboards.
Involved in Performance Tuning of reports by identifying the indexes required on the backend tables and also from the data available in the Cache
Extensive experience in writing the queries using PL/SQL.
Involved in creating mappings using different Transformations like Rank, Sequence Generator, SQL, Sorter, Router, HTTP, Application Source Qualifier, Aggregator, Stored Procedure and Union Transformations.
Involved in Root Cause Analysis of issues and day-to-day Production support and on call on weekly basis.
Trained end users on the Navigation of Reports and creating supporting documents for the same.
Performed Unit testing, creation of Test cases for the UAT and also involved in the Post production support.
Managed a team of 5 as part of the on-site and offshore co-ordination.
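The time-series calculation mentioned above (percent change of a measure versus the same period in the prior year) is built in OBIEE with time-series functions such as Ago(); the arithmetic itself can be sketched in plain Python. The revenue figures below are made up for illustration.

```python
# Hedged sketch of a year-over-year percent-change calculation, the same
# measure OBIEE computes with Ago()-style time-series functions.

def yoy_pct_change(series, year, period):
    """`series` maps (year, period) -> measure; returns % change vs prior year."""
    current = series[(year, period)]
    prior = series[(year - 1, period)]
    return (current - prior) / prior * 100.0

revenue = {(2010, "Q1"): 110.0, (2011, "Q1"): 132.0}  # hypothetical values
change = yoy_pct_change(revenue, 2011, "Q1")          # (132-110)/110*100 = 20.0
```

In a report this becomes one calculated column per measure, with the period grain (month, quarter) chosen to match the dashboard's time dimension.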

Environment: OBIEE 11.1.1.3.0/10.1.3.4.1, Informatica 8.x/7.x, Discoverer 10g/4i/3i, Unix, Putty, PLSQL Developer, Shell Script, Win SCP, Microsoft VISIO, Windows XP

Gamajet Cleaning Systems, Inc.
Systems Engineer, June 2008 - November 2009


Developed highly effective and lucid Engineering proposals for CIP systems with the complete concept evolution, component sourcing, design validation and scheduling
Worked on multiple projects for Root cause analysis for failure of critical components in a product.
Involved in the analysis and functional requirements gathering process by meeting up with the Business users & Business analysts ensuring needs and delivery feasibility.
Participated in the installation and configuration of BI tools and MS Access database on Windows Servers.
Maintained the various subject areas by providing technical support as a part of the global implementation.
Involved in Root Cause Analysis of issues and day-to-day Production support and on call on weekly basis.
