
Lakshmi - Senior Data Analyst
soumya2405p@gmail.com
Location: Edison, New Jersey, USA
Relocation:
Visa: H1B
Name: Lakshmi Soumya
Senior Data Analyst
Email: Psoumya2408@gmail.com
Ph#: +1 (551) 233-8633

Professional Experience:
Senior Data Analyst with 7+ years of experience in Data Modeling, Data Mining, and Data Warehousing.
Experienced in all phases of the Software Development Life Cycle (SDLC), including analysis, requirements gathering, and design, with expertise in documenting requirement specifications, functional specifications, test plans, source-to-target mappings, and SQL joins.
Strong Data Warehousing, Data Marts, Data Analysis, Data Organization, Metadata and Data Modeling experience on RDBMS databases.
Extensive knowledge in Designing, Developing and implementation of the Data marts, Data Structures using Stored Procedures, Functions, Data warehouse tables, views, Materialized Views, Indexes at Database level using PL/SQL, Oracle.
Proficient in designing and optimizing ETL workflows utilizing Informatica PowerCenter, AWS Glue, and Azure Data Factory, guaranteeing seamless data extraction, transformation, and loading processes.
Experienced Data Modeler with strong conceptual, Logical and Physical Data Modeling skills, Data Profiling skills, Maintaining Data Quality, creating data mapping documents, writing functional specifications, queries.
Experience in Dimensional Data Modeling, Star/Snowflake schema, FACT & Dimension tables.
Experienced in identifying entities, attributes, metrics, and relationships, as well as assigning keys and optimizing the model.
Hands-on experience with the AWS Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems, including loading nested JSON-formatted data into Snowflake tables (a minimal sketch follows this summary).
Extensive knowledge of Data Flow Modeling and Object Modeling, Case Analysis, and Functional Decomposition Analysis.
Experienced in implementing real-time data streaming solutions with technologies like Azure Stream Analytics and AWS Kinesis, ensuring swift responses to dynamic data patterns.
Experience in standardizing Tableau for shared service deployment.
Worked on AWS Data Pipeline to configure data loads from S3 into Redshift.
Worked on Data Migration from Teradata to AWS Snowflake Environment using Python and BI tools like Alteryx.
Extensive knowledge on producing tables, reports, graphs and listings using various procedures and handling large databases to perform complex data manipulations.
Extensive experience with Normalization (1NF, 2NF and 3NF) and De-normalization techniques for improved database performance in OLTP and Data Warehouse/Data Mart environments.
Good experience with data archival processes to SAS data sets and flat files.
Strong experience on Base SAS, SAS/Stat, SAS/Access, SAS/Graphs and SAS/Macros, SAS/ODS and SAS/SQL in Windows Environment.
Experience in Business Intelligence (BI) technologies like Microsoft Business Intelligence (SSIS, SSAS, and SSRS), Informatica, Business Objects, Power BI and OBIEE.
Extensive Data Warehousing experience using Informatica as an ETL tool on various databases like Oracle, SQL Server, Teradata and MS Access.
Experienced in implementing automated backup and disaster recovery solutions within Azure and AWS, ensuring data resilience and business continuity, and optimizing workflows with automation scripts.
Excellent experience in troubleshooting SQL queries, ETL queries, data warehouse /data mart/ data store models.
Experienced in performance tuning and optimization for increasing the efficiency of the scripts on large database for fast data access, conversion and delivery.
Well versed with the concepts of Forward Engineering and Reverse Engineering for the existing databases for Physical models using Erwin tool.
Experience creating design documentation related to system specifications including user interfaces, security and control, performance requirements and data conversion.
Extensively Worked in Agile delivery environments and all phases of Software Development Life Cycle (SDLC).
Team player, also able to work independently with minimal supervision; innovative and efficient, good at debugging, with a strong desire to keep pace with the latest technologies.
Excellent communication and presentation skills, along with good experience in communicating and working with various stakeholders.
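
Illustrative sketch: the nested-JSON-to-Snowflake loading mentioned above can be outlined in a few lines of Python with the snowflake-connector-python package. This is a minimal sketch under assumed names; the stage (my_s3_stage), table (raw_events), and JSON fields are hypothetical placeholders rather than details of any actual project.

    # Minimal sketch: load nested JSON from an S3 stage into a Snowflake
    # VARIANT column, then flatten the nested array into relational rows.
    # All object names are hypothetical.
    import snowflake.connector

    conn = snowflake.connector.connect(
        user="<user>", password="<password>", account="<account>",
        warehouse="<warehouse>", database="<database>", schema="<schema>",
    )
    cur = conn.cursor()

    # Land the raw JSON documents in a single VARIANT column.
    cur.execute("CREATE TABLE IF NOT EXISTS raw_events (payload VARIANT)")

    # COPY from an external S3 stage assumed to exist already.
    cur.execute("""
        COPY INTO raw_events
        FROM @my_s3_stage/events/
        FILE_FORMAT = (TYPE = 'JSON')
    """)

    # Flatten a nested array so each element becomes its own row.
    cur.execute("""
        SELECT payload:orderId::STRING AS order_id,
               item.value:sku::STRING  AS sku,
               item.value:qty::NUMBER  AS qty
        FROM raw_events,
             LATERAL FLATTEN(input => payload:items) item
    """)
    for row in cur.fetchall():
        print(row)

    cur.close()
    conn.close()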

Technical Skills
Analytical Tools: Tableau Desktop/Server (2019.2, 10/9.x/8.x/7.x), MicroStrategy, Business Objects, SSRS, SSAS, Google Analytics, Power BI, Python
Data Modeling: Erwin, ER Studio, MS Visio
ETL Tools: Informatica, SSIS, Alteryx
RDBMS and SQL: Teradata, Oracle, SQL Server, AWS Redshift, HANA, Toad, DB2, SQL Assistant, SQL and PL/SQL
Languages: Python, SQL, PL/SQL, Unix Shell Scripts
Cloud: Amazon Web Services (AWS), Microsoft Azure
Operating Systems: Windows, Unix, Sun Solaris, Linux

Professional Experience

Client: Pfizer, New Jersey, NJ. April 2023 - Present
Role: Senior Data Analyst
Responsibilities:
Participated in requirement gathering sessions with business users and sponsors to understand and document the business requirements.
Built data visualizations and cross-functional reporting that convey key performance metrics, significant trends, and relationships across multiple data sources.
Worked with data analysis tools (SAS Enterprise Miner, SAS Enterprise Guide, Tableau 9 and above) to analyze and visualize data to solve data analysis problems.
Worked on profiling source data to determine the key consistency, data type, size, etc.
Designed SSIS packages to export and import data to/from SQL server from/to other sources/destinations after processing.
Performed data analysis and data profiling using complex SQL on various sources systems including Oracle and Teradata.
Designed and optimized ETL workflows on AWS Glue, ensuring streamlined data processing and accuracy.
Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
Created DDL scripts for implementing Data Modeling changes. Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ERWIN.
Utilized Data visualization tools to create accessible and user-friendly interfaces, enabling end-users to swiftly identify key themes within their data.
Published the data model in the model mart. Skilled in System Analysis, E-R/Dimensional Data Modeling, Database Design, and implementing RDBMS-specific features.
Utilized all Tableau tools including Tableau Desktop, Tableau Server, Tableau Reader, and Tableau Public.
Worked with DBA to create the physical model and database objects.
Reported to the manager once a week, providing a detailed report and visualization of lost or damaged packages using Tableau and MS Excel.
Explored data in a variety of ways and across multiple visualizations using Power BI; scheduled automatic refreshes in the Power BI service.
Created Source-to-target (S2T) mapping document as part of Data Analysis.
Involved in data migration from staging to integration.
Assisted in mining data from the SQL database that was used in several significant presentations.
Assisted in offering support to other personnel who were required to access and analyze the SQL database.
Responsible for development of workflow analysis, requirement gathering, data governance, data management and data loading.
Created views for reporting purposes involving complex SQL queries with sub-queries, inline views, multi-table joins, the WITH clause, and outer joins, as per the functional needs in the Business Requirements Document (BRD); see the sketch following this list.
Responsible for generating financial business reports using SAS Business Intelligence tools (SAS/BI) and developed ad-hoc reports using SAS Enterprise Guide.
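
Illustrative sketch: the reporting views described above (WITH clause, inline views, outer joins) can be outlined as follows. This is a minimal Python sketch assuming a SQL Server backend reached through pyodbc; the view, tables, columns, and DSN are all hypothetical placeholders.

    # Minimal sketch: create a reporting view that combines a CTE (WITH
    # clause), an inline view, and a LEFT OUTER JOIN. All table, column,
    # and DSN names are hypothetical.
    import pyodbc

    VIEW_DDL = """
    CREATE VIEW v_shipment_summary AS
    WITH monthly_totals AS (          -- CTE: shipments per customer/month
        SELECT customer_id,
               DATEFROMPARTS(YEAR(ship_date), MONTH(ship_date), 1) AS ship_month,
               COUNT(*) AS shipments
        FROM shipments
        GROUP BY customer_id, YEAR(ship_date), MONTH(ship_date)
    )
    SELECT c.customer_name,
           mt.ship_month,
           mt.shipments,
           dmg.damaged_count
    FROM monthly_totals mt
    JOIN customers c
      ON c.customer_id = mt.customer_id
    LEFT OUTER JOIN (                 -- inline view: damaged packages only
        SELECT customer_id, COUNT(*) AS damaged_count
        FROM claims
        WHERE claim_type = 'DAMAGED'
        GROUP BY customer_id
    ) dmg
      ON dmg.customer_id = mt.customer_id
    """

    conn = pyodbc.connect("DSN=reporting_db")  # placeholder DSN
    conn.cursor().execute(VIEW_DDL)
    conn.commit()
    conn.close()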

Environment: AWS, Agile, Teradata, Erwin, Snowflake, Power BI, Tableau, SAS, PROC SQL, Multiload, Oracle, Unix Shell Scripts, SQL Server, MS Office Tools, MS Project, MS Access, Pivot Tables, Windows XP.

Client: Federated Insurance, USA (Remote). Nov 2022 - Mar 2023
Role: Sr. Data Analyst
Responsibilities:
Performed daily validation of business data reports by querying databases and reran missing business events before the close of the business day.
Worked on claims data and extracted data from various sources such as flat files, Oracle and Mainframes.
Gathered Business requirements by interacting with the business users, defined subject areas for analytical data requirements.
Created many TDEs (Tableau Data Extracts) for various projects and scheduled refreshes on Tableau Server as per business needs.
Created stories and manually tested the data for every requirement.
Optimized complex queries for data retrieval from huge databases.
Performed root-cause analysis of data discrepancies between different business systems by examining business rules and data models, and provided the analysis to the development/bug-fix team.
Led the data correction and validation process, using data utilities to fix mismatches between different shared business operating systems.
Wrote calculated columns and measure queries in Power BI Desktop for data analysis. Gave weekly presentations to business users about the Power BI reports and their changes as required.
Worked on all kinds of reports, such as yearly, quarterly, monthly, and daily, in Power BI.
Conducted downstream analysis of the tables involved in data discrepancies and arrived at solutions to resolve them.
Performed extensive data mining of attributes involved in business tables and provided consolidated analysis reports and resolutions on an ongoing basis.
Created complex SQL scripts to build and store logical data in snowflake database for data analyses and quality checks.
Reviewed Stored Procedures for reports and wrote test queries against the source system (SQL Server) to match the results with the actual report against the Data mart.
Executed a number of queries against Teradata and created logical and physical models using the Erwin tool to provide data analysis and verification.
Converted SAS scripts to be Snowflake-compatible and migrated data from Teradata to the Snowflake database for multiple business lines.
Wrote packages to fetch complex data from different tables in remote databases using joins and sub-queries.
Validated data to check for proper conversion, cleansed data to identify bad records, and profiled data for accuracy, completeness, and consistency (a minimal sketch of such checks follows this list).
Reviewed all the systems design by assuring adherence to defined requirements.
Met with user groups to analyze requirements and proposed changes in design and specifications.
Handled flat-file conversion from the data warehouse.
Created Static and Dynamic Parameters at the report level.
Involved in Data Reconciliation Process while testing loaded data with user reports.
Documented all custom and system modifications.
Worked with offshore and other environment teams to support their activities.
Responsible for deployment on test environments and supporting business users during User Acceptance testing (UAT).
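
Illustrative sketch: the validation, cleansing, and profiling checks described above can be outlined in pandas. This is a minimal sketch; the file paths, column names, and domain values are hypothetical placeholders.

    # Minimal sketch of profiling/reconciliation checks: completeness
    # (nulls), consistency (domain values), and a source-vs-target
    # reconciliation. Paths and column names are hypothetical.
    import pandas as pd

    source = pd.read_csv("source_extract.csv")  # placeholder path
    target = pd.read_csv("target_load.csv")     # placeholder path

    # Completeness: percentage of nulls per column.
    null_pct = source.isna().mean().mul(100).round(2)
    print("Null % by column:")
    print(null_pct)

    # Consistency: flag values outside an agreed domain.
    valid_status = {"OPEN", "CLOSED", "PENDING"}
    bad_status = source[~source["claim_status"].isin(valid_status)]
    print(len(bad_status), "rows with out-of-domain claim_status")

    # Reconciliation: compare row counts and keyed amount totals.
    print("Row counts match:", len(source) == len(target))
    src_sum = source.groupby("policy_id")["claim_amount"].sum()
    tgt_sum = target.groupby("policy_id")["claim_amount"].sum()
    diffs = src_sum.subtract(tgt_sum, fill_value=0)
    print("Policies with amount mismatches:", int((diffs.abs() > 0.01).sum()))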

Environment: DataStage, Power BI, Oracle 10g, DB2, Sybase, TOAD, Cognos, SQL Server, TSYS Mainframe, SAS PROC SQL, SQL, PL/SQL, ALM/Quality Center, QTP, UNIX, Shell Scripting, XML, XSLT.

Client: Value Labs, India. Jan 2020 - Jul 2022
Role: Data Analyst
Responsibilities:
Met with business partners to understand their business and reporting needs, including the identification of critical metrics and KPIs.
Responsible for gathering data migration requirements.
Connected Tableau server to publish dashboard to a central location for portal integration.
Actively involved in the creation of users, groups, projects, workbooks and the appropriate permission sets for Tableau server logins.
Created and optimized ETL workflows with Azure Data Factory, ensuring data accuracy and streamlined processes.
Identified problematic areas and conducted research to determine the best course of action to correct the data.
Implemented real-time data streaming with Azure Stream Analytics, enabling prompt responses to changing data patterns.
Analyzed reports of data duplicates and other errors to support appropriate ongoing inter-departmental communication and monthly or daily data reports.
Monitored select data elements for timely and accurate completion.
Monitored data dictionary statistics.
Involved in analyzing and adding new Oracle 10g features, such as DBMS_SCHEDULER, CREATE DIRECTORY, Data Pump, and CONNECT_BY_ROOT, to the existing Oracle 9i application (a sketch of a DBMS_SCHEDULER job follows this list).
Archived the old data by converting it into SAS data sets and flat files.
Created DAX queries to generate computed columns in Power BI.
Generated computed tables in Power BI using DAX.
Designed and developed Power BI graphical and visualization solutions with business requirement documents and plans for creating interactive dashboards.
Extensively used Erwin tool in Forward and reverse engineering, following the Corporate Standards in Naming Conventions, using Conformed dimensions whenever possible.
Enabled a smooth transition from the legacy system to the newer system through the change management process.
Trained team members in PL/SQL and provided Knowledge Transfer sessions on Finance and Banking domains.
Planned project activities for the team based on project timelines using Work Breakdown Structure.
Created Technical Design Documents, Unit Test Cases.
Involved in test case/data preparation, execution, and verification of the test results.
Created user guidance documentation.
Created reconciliation report for validating migrated data.
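
Illustrative sketch: the DBMS_SCHEDULER feature mentioned above can be exercised from Python with the python-oracledb driver by submitting an anonymous PL/SQL block. The credentials, job name, and job body (archive_pkg.run) are hypothetical placeholders.

    # Minimal sketch: register a nightly job with Oracle's DBMS_SCHEDULER.
    # Credentials, job name, and the called package are hypothetical.
    import oracledb

    conn = oracledb.connect(user="<user>", password="<password>",
                            dsn="<host>/<service>")
    cur = conn.cursor()
    cur.execute("""
    BEGIN
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'NIGHTLY_ARCHIVE_JOB',
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'BEGIN archive_pkg.run; END;',
        start_date      => SYSTIMESTAMP,
        repeat_interval => 'FREQ=DAILY;BYHOUR=2',
        enabled         => TRUE
      );
    END;""")
    conn.commit()
    cur.close()
    conn.close()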

Environment: Azure, UNIX, Power BI, Tableau, Shell Scripting, XML Files, XSD, XML, SAS, PL/SQL, Oracle, Teradata, Sybase, ERWIN, Toad, Autosys.

Client: Evoke Technologies, India. Nov 2017 - Dec 2019
Role: Data Analyst
Responsibilities:
Analyzed problem and solved issues with current and planned systems as they relate to the integration and management of order data.
Evaluated applications, records, and documents to gather information about eligibility and liability issues.
Prepared reports of activities, evaluations, recommendations, and decisions.
Compiled, cleaned, and manipulated data for proper handling.
Developed polished visualizations to share results of data analysis.
Identified compliance issues that required follow-up and investigation.
Provided assistance to internal and external auditors in compliance reviews.
Assisted with new hire orientation and employee training.
Verified documentation, implementation and communication of firm and regulatory policies and procedures.
Met deadlines while maintaining high-quality deliverables.
Participated in ongoing training to enhance own job skills and knowledge.
Checked data flow with source-to-target mapping of the data (a small mapping-check sketch follows this list).
Created a data matrix for mapping the data to the business requirements.
Performed data profiling to cleanse the data in the database and raised the data issues found.
Created and reviewed mapping documents based on data requirements.
Engaged in logical and physical design and transformed logical models into physical models through forward engineering with the Erwin tool.
Performed small enhancements (data cleansing/data quality).
Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
Involved in data mapping and data clean up.
Enabled a smooth transition from the legacy system to the newer system through the change management process.
Created datasets from flat files using SAS PROC SQL.
Extracted, transformed and loaded the data into databases using Base SAS.
Involved in Test case/data preparation, execution and verification of the test results.
Reviewed PL/SQL migration scripts.
Coded PL/SQL packages to perform Application Security and batch job scheduling.
Created user guidance documentation.
Created reconciliation report for validating migrated data.
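
Illustrative sketch: the source-to-target mapping checks above can be outlined as a small Python script that walks a simplified mapping document and compares source and target columns. The mapping entries and file paths are invented examples.

    # Minimal sketch of a source-to-target (S2T) mapping check: for each
    # mapping entry, verify the target column exists and compare distinct
    # value counts. Mapping entries and paths are hypothetical.
    import pandas as pd

    # Simplified mapping document: source column -> target column.
    s2t_mapping = {
        "CUST_NM": "customer_name",
        "ORD_DT":  "order_date",
        "ORD_AMT": "order_amount",
    }

    source = pd.read_csv("legacy_extract.csv")    # placeholder path
    target = pd.read_csv("warehouse_load.csv")    # placeholder path

    for src_col, tgt_col in s2t_mapping.items():
        if tgt_col not in target.columns:
            print(f"MISSING: {src_col} -> {tgt_col} not found in target")
            continue
        src_distinct = source[src_col].nunique(dropna=True)
        tgt_distinct = target[tgt_col].nunique(dropna=True)
        status = "OK" if src_distinct == tgt_distinct else "CHECK"
        print(f"{status}: {src_col} -> {tgt_col} "
              f"(distinct {src_distinct} vs {tgt_distinct})")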

Environment: UNIX, Shell