
Sri Garimella - Sr. Business Data Analyst
[email protected]
567-698-7202
Dallas, Texas, USA
Visa: H1B
https://www.linkedin.com/in/sddg2508



Sr. Business Data Analyst / Operations Engineer with 12+ years of specialized experience managing end-to-end, large-scale, complex data migration, integration, and conversion projects: gathering requirements; building data mappings, dashboards, and BRDs; and performing Business Analysis, Data Analysis, Data Modeling, Data Mining, dashboard development, and testing while migrating and integrating data from disparate source systems using SQL, Python, SSMS, GCP, AWS, Azure, Snowflake, Databricks, DBT, MS Power BI, Tableau, and other BI tools.
Skilled in the design, development, and implementation of relational and non-relational database schema models and in building semantic layers for downstream consumers; experienced in creating and prioritizing user stories and tracking burn-up and burn-down charts in JIRA / Azure DevOps within the Agile SCRUM framework, collaborating effectively with cross-functional teams, stakeholders, and executive leadership to deliver large-scale data projects under both Agile and Waterfall project management methodologies.

Certifications & Awards
Microsoft Certified: Power BI Data Analyst Associate (05/24/2024).
Received the Customer Focus Award from Cognizant on the KeyBank project for building a Python automation tool to compare complex business data.

Technical Proficiencies
Operating Systems: Windows, UNIX, Linux, macOS
Scripting Languages: T-SQL, MS-SQL, SnowSQL, BigSQL, BigQuery SQL, Python, Jinja2, Unix Shell
Databases: Microsoft SQL Server 2008/2012/2014, Oracle, Teradata, Big Data platforms
Reporting Tools: SQL Server Reporting Services (SSRS), Excel, Tableau, Power BI
Transformation Tools: Azure Data Factory, AWS Glue, DBT, Alteryx
Microsoft Suite: MS Visio, MS Word, MS Excel, MS Outlook, MS PowerPoint
Data Warehouses / Data Lakes: SSMS, Snowflake, Azure Synapse, AWS Redshift, GCP BigQuery
Cloud Technology / Ecosystem: Azure, AWS, Google Cloud Platform (GCP)

Education
Master of Computer Information Systems, GPA 3.7/4.0 (2017),
California University of Management & Sciences, Arlington, VA.
Bachelor of Civil Engineering (2012),
JNTUH, Telangana, India.



Summary
Project Execution: Proven track record of managing full project lifecycles, including business requirement elicitation, source-to-target mapping (STTM) documentation, analysis, coding, testing, deployment, and ongoing support, ensuring alignment with business goals while working closely with Data Architects to build Enterprise Data Warehouses for data migration and integration.
Business Data Analysis: Performed business analysis to identify gaps, conducted exploratory data analysis over statistical, quantitative, and financial data, and used Root Cause Analysis (RCA) to triage and remediate potential data quality issues during User Acceptance Testing. Queried large datasets to identify patterns, anomalies, and actionable insights while performing data mining, profiling, cleansing, and massaging using SQL and Python.
Business Operations and Data Engineering: Successfully led complex data migration and integration projects from conceptualization to global delivery by building data pipelines and optimizing data warehouses such as Snowflake, employing advanced ETL/ELT practices with Azure Data Factory, AWS Glue, DBT, and MS SSIS for seamless integration across systems, backed by domain knowledge of financial, banking, and investment data spanning funds trading, wealth, and asset management.
Schema Modeling and Data Architecture: Adept at designing efficient star and snowflake schema models and creating semantic layers that enhance data accessibility and usability for business intelligence and analytics applications such as Power BI.
Coding: Proficient in developing and optimizing advanced, complex SQL queries, SQL stored procedures, UDFs, Python, and Jinja2 templates for data transformation, automation, and integration across multiple platforms, ensuring high performance and data accuracy across relational and NoSQL databases (see the sketch after this list).
DevOps and CI/CD: Expert in setting up automated data migration tasks, building platform features, and maintaining backlogs using ALM, JIRA, ServiceNow, and Azure DevOps. Built and maintained Azure DevOps and GitHub repositories for robust CI/CD pipelines, ensuring data integrity and governance and deploying reusable code with custom solutions.
Advanced Analytics and Visualization: Designed dashboards and parameterized reports using Power BI, Tableau, and other BI tools for advanced data visualization, enabling strategic insights and data-driven decision-making through quantitative statistics, forecasting, and predictive analysis.
Agile Methodology and Documentation: Key contributor in transitioning projects from waterfall to agile; skilled in data migration, transformation, and extraction, using JIRA, ServiceNow, ALM, Confluence, and SharePoint for efficient defect tracking and documentation.
ESG Framework Knowledge: Exposure to Environmental, Social, and Governance (ESG) frameworks, contributing to strategic investment initiatives and growth opportunities within the industry.
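
A minimal sketch of the Jinja2-templated SQL pattern referenced in the Coding bullet above; the table, column, and parameter names are hypothetical placeholders, not project specifics:

```python
# Rendering a parameterized SQL transformation with Jinja2.
# All identifiers below are hypothetical placeholders.
from jinja2 import Template

SQL_TEMPLATE = Template("""
SELECT
    {{ key_column }},
    SUM(amount) AS total_amount
FROM {{ source_table }}
WHERE load_date = '{{ load_date }}'
GROUP BY {{ key_column }}
""")

rendered_sql = SQL_TEMPLATE.render(
    key_column="portfolio_id",
    source_table="staging.positions",
    load_date="2024-01-31",
)
print(rendered_sql)  # hand the rendered SQL to the target warehouse
```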


Professional Experience

Intech Investments Mgmt., West Palm Beach, Florida. June 2023 - Present
Sr. Business Data Analyst / Operations Engineer

Ecosystem: Snowflake, Databricks, Python 3.8, Advanced SQL, Snowpark, JIRA, Confluence / Wiki, GCP, AWS, SharePoint, Oracle DB, Sybase, MS Office Tools, Power BI.

Responsibilities:
Project Scope - Data Platform Development and Integration: Developed and re-engineered a robust data platform as a single source for investment management operations, overseeing the integration of complex investment data from disparate source systems (SimCorp, FactSet, Bloomberg, MSCI, etc.) into Snowflake as the Enterprise Data Warehouse.
Data Architecture Design and Optimization: Led the design, development, and optimization of data pipelines using DBT and SQL/Python, enabling efficient extraction, loading, and transformation (ELT) of upstream investment data. Integrated AWS S3 for data ingestion into Snowflake staging, implementing advanced SQL queries and bitemporal / Slowly Changing Dimension (SCD) methodologies within DBT models to enhance accuracy and performance metrics for Portfolio Management, while collaborating closely with executive leadership to translate business requirements into technical solutions.
Business Data Analysis: Gathered requirements, documented source-to-target mappings with the required SQL transformation logic, and performed gap analysis between the scheduling systems, documenting the gap items. Designed and implemented star schema data models for the Research, Trading, and Operations teams, ensuring alignment with business goals and adherence to stringent data governance protocols. Conducted data cleansing and standardization to ensure data accuracy, consistency, and integrity, employing SQL to query and extract relevant data so that clean, accurate datasets were available for analysis. Conducted JAD sessions, meetings, and workshops to gather requirements from executive leadership and SMEs, and performed thorough data validation and quality assurance checks to verify the accuracy, completeness, and consistency of data inputs and outputs through aggregation and statistical analysis of data from multiple sources into centralized databases and data warehouses via ELT (Extract, Load, Transform) processes.
Data Engineering (ELT): Applied deep expertise in DBT and Snowflake to deploy star schema data models into production environments, leveraging Jinja2 templating, SQL scripting, and Python programming. Implemented and configured DBT utilities, including the project evaluator and expectations packages, to enforce rigorous data quality checks across all models.
DevOps & CI/CD: Implemented robust CI/CD pipelines using GitHub repositories for automated testing, validation, and deployment of DBT data models. Ensured continuous integration and delivery of data solutions, improving efficiency and reducing time-to-production.
Automated Data Migration and Analytics: Automated the migration of investment data between source systems, ingesting data via AWS S3 into the Snowflake Data Warehouse and MS SQL Server using DBT, SQL, and Python automation scripts, enabling seamless integration and accessibility for downstream analytics (see the sketch below). Orchestrated automated data pipelines from leading financial vendors (FactSet, S&P Global) to deliver timely, accurate daily, monthly, and quarterly reports on portfolio holdings and returns, critical for managing Assets under Management. Provided proactive data support, conducting Root Cause Analysis (RCA) to swiftly resolve data quality issues. Managed agile project tasks in JIRA, maintained version control with GitHub repositories, and documented processes on Confluence/Wiki, ensuring transparent project management and collaboration.
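
A minimal sketch of the S3-to-Snowflake ingestion pattern described above, using snowflake-connector-python; the account, credentials, stage, and table names are hypothetical placeholders:

```python
# Load vendor files from an external S3 stage into a Snowflake staging table.
# All identifiers and credentials below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="etl_user",        # placeholder
    password="***",         # use a secrets manager in practice
    warehouse="LOAD_WH",
    database="INVESTMENTS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    cur.execute("""
        COPY INTO staging.positions_raw
        FROM @vendor_s3_stage/positions/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
finally:
    conn.close()
```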

MFS Investment Management, Boston, Massachusetts. June 2022 - June 2023
Sr. Business Data Analyst / Operations Engineer

Ecosystem: Snowflake, DBT, Python 3.8, Advanced SQL, Snowpark, Jinja2, SSMS, JIRA, Confluence / Wiki, Azure DevOps, SharePoint, Oracle DB, Sybase, MS Office Tools, Power BI.

Responsibilities:
Project Scope - Unified Data Platform / Semantic Layer Development: Collaborated with stakeholders, SMEs, and Executive Leadership Team to define and implement the semantic layer for Investment Data, ensuring alignment with business objectives and downstream analytics requirements.
Business Data Analysis - Transformation and SQL Development: Documented business requirements and source-to-target data mappings with business rules for transformation, and developed advanced, complex SQL queries to optimize data transformation processes for structured and unstructured data within the Snowflake Data Warehouse, enhancing performance and scalability.
DBT and Snowflake Integration: Worked closely with the solution architect to deliver a Proof of Concept (PoC) on Snowflake-DBT integration, building an end-to-end data integration platform, including load testing with JMeter (a reconciliation-check sketch follows this list). Implemented DBT for data warehousing in Snowflake, including setting up environments, defining databases, and configuring CI/CD pipelines in Azure DevOps. Developed advanced SQL queries and custom DBT macros for optimized data transformations and modeling.
Reporting and Business Analytics: Developed complex SQL queries and integrated Power BI to generate comprehensive reports on Portfolio, Sub portfolio, and Benchmark data, meeting business expectations for dynamic reporting and analytics.
Agile Project Management: Owned and maintained user stories in JIRA, managed Kanban boards, and participated in PI planning sessions to deliver technical requirements effectively, ensuring alignment with project milestones and acceptance criteria.
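
An illustrative reconciliation check of the kind used when validating the Snowflake/DBT PoC above, comparing row counts between a staging table and its transformed model; all identifiers and credentials are hypothetical placeholders:

```python
# Compare row counts between a staging table and the transformed DBT model.
# Identifiers and credentials below are hypothetical placeholders.
import snowflake.connector

def row_count(cursor, table: str) -> int:
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]

conn = snowflake.connector.connect(
    account="my_account", user="qa_user", password="***"
)
try:
    cur = conn.cursor()
    source_rows = row_count(cur, "staging.portfolio_raw")
    target_rows = row_count(cur, "analytics.portfolio")
    assert source_rows == target_rows, (
        f"Row count mismatch: staging={source_rows}, model={target_rows}"
    )
finally:
    conn.close()
```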

Capital Group, Los Angeles, California. May 2021 - June 2022
Sr. Business Data Analyst / Operations Engineer
Ecosystem: Python 3.8, Spark, MS SQL, Oracle DB, Toad, SSMS, Alteryx Designer, ServiceNow, JIRA, Confluence, AWS, Redshift, Snowflake, Glue, Athena, Azure, Databricks, MS Office Tools, Tableau, Power BI.
Responsibilities:
Data Quality Management and Incident Resolution: As part of the Production Data Support team, collaborated on data quality issues affecting client delivery materials and quarterly reports. Managed SLA impacts, incident stories, and remediation progress in ServiceNow.
Cross-Functional Analysis and Collaboration: Conducted deep dive and GAP analysis across multiple platform teams. Engaged with backend developers, investment operations, and data automation teams.
Investigative Analytics and Data Workflow Optimization: Utilized MS SQL, Snowflake, Oracle, and Athena for deep data analysis and root cause determination within AWS Glue and Alteryx ETL data pipelines and workflows. Created JIRA user stories for long-term fixes and enhancements addressing data quality.
Agile Project Management and Reporting: Owned and maintained JIRA user stories using Kanban for technical delivery. Built Power BI dashboards to monitor progress and present remediation updates.
Automated Reporting and Data Integration: Developed automated monthly and quarterly client operations reports using Python, and automated the extraction, transformation, and loading of index data into MS SQL production environments (sketched below).
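
A minimal sketch of the automated index-data load into MS SQL described above, using pandas and SQLAlchemy; the file path, connection string, and table name are hypothetical placeholders:

```python
# Extract-transform-load of vendor index data into MS SQL.
# File path, connection string, and table name are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

# Extract: read the vendor index file.
df = pd.read_csv("index_data.csv", parse_dates=["as_of_date"])

# Transform: basic cleansing before load.
df = df.dropna(subset=["index_id"]).drop_duplicates()

# Load: append into the production MS SQL table.
engine = create_engine(
    "mssql+pyodbc://user:***@prod-sql/IndexDB?driver=ODBC+Driver+17+for+SQL+Server"
)
df.to_sql("index_returns", engine, schema="dbo", if_exists="append", index=False)
```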

Key Bank, Brooklyn, Ohio. April 2019 - May 2021
Sr. Business Data Analyst
Ecosystem: Python 3.7, BIGSQL, GCP, Snowflake, Pandas, NumPy, Sci-Kit Learn, Hadoop, MS Office Tools, Tableau, Teradata, ALM, SAS, JIRA, SharePoint.
Responsibilities:
Regulatory Reporting and Data Integration Leadership: Collaborated with stakeholders to develop financial regulatory reports (Commitment Supertype, SALT, WA, etc.) for federal and state departments in the USA. Led data mapping strategies for the migration and integration of mortgage, consumer, student, and solar loan data from Mainframe DB2 and Teradata to Snowflake, Oracle Cloud ERP, and GCP.
Project Management and Stakeholder Engagement: Facilitated meetings and workshops with stakeholders and vendors to identify requirements, risks, and roadblocks for credit risk reporting and regulatory projects. Acted as a liaison for requirement elicitation and scope definition between business owners, product owners, consultants, architects, and technology teams.
Data Mapping and ETL Architecture Design / Build: Created detailed source-to-target data mapping documents and conducted gap analysis between legacy Mainframe systems and the target Snowflake, Oracle Cloud ERP, and GCP systems. Developed SQL procedures and transformation logic for handling large volumes of structured and unstructured data in Teradata and Hadoop environments.
Testing and Quality Assurance Coordination: Supported test data planning, system testing, and user acceptance testing with the Line of Business and Quality Assurance teams. Utilized JIRA dashboards to track project requirements and monitor the progress of QA and development teams.
Analytical Insights and Reporting Solutions: Conducted investigative analysis to uncover hidden patterns and trends in large datasets using SQL and Python against the Snowflake data warehouse, and generated interactive data reports and dashboards in Tableau for downstream analytics and stakeholders (see the sketch below).
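
An illustrative investigative query against Snowflake feeding a Tableau extract, per the bullet above; the query, schema, and identifiers are assumed for illustration:

```python
# Pull aggregated loan trends from Snowflake into a DataFrame for Tableau.
# Query, schema, and credentials below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="analyst", password="***"
)
try:
    cur = conn.cursor()
    cur.execute("""
        SELECT loan_type,
               origination_month,
               COUNT(*)     AS loans,
               AVG(balance) AS avg_balance
        FROM loans.consolidated
        GROUP BY loan_type, origination_month
        ORDER BY origination_month
    """)
    df = cur.fetch_pandas_all()  # requires snowflake-connector-python[pandas]
    df.to_csv("loan_trends.csv", index=False)  # hand off as a Tableau extract source
finally:
    conn.close()
```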




Morgan Stanley, Baltimore, Maryland. Oct 2018 - March 2019
Data Scientist
Ecosystem: Spark, Hadoop, Big Data, Snowflake, Python 3.4, Pandas, NLTK, Impala, HUE, MS Office Tools, Tableau.
Responsibilities:
Data Mining and Investigative Analysis for AML: Utilized Zeppelin (Spark/Pandas) and Hue (Impala SQL) for data mining and investigative analysis to detect fraud anomalies in historic data for an Anti-Money Laundering (AML) project. Conducted data collection, cleaning, validation, and visualization to identify the underlying factors contributing to anomalies.
Data Preprocessing and Enhancement: Employed Pandas and Scikit-Learn in the preprocessing phase to handle missing values, outliers, errors, and feature selection, applying scaling and distribution adjustments to enhance data quality and relevance for analysis (see the preprocessing sketch below).
Regulatory Compliance and Reporting: Developed Tableau visuals to review regulatory alignment with internal policies, standards, and processes, facilitating presentations to Morgan Stanley leadership. Generated BAU reports (Daily, Monthly, Quarterly) building a semantic layer from Snowflake Datawarehouse, leveraging remote server connections for seamless reporting.
Stakeholder Collaboration and Data Strategy Development: Collaborated with stakeholders to drive complex business initiatives, focusing on developing data strategies for internal fraud investigative analysis. Led requirements gathering, analysis, and reporting for senior management, contributing to deep dive data analysis and project management.
Quantitative Analysis and Process Enhancement: Supported quantitative team in analyzing equity factor strategies and assessing investment opportunities and risks for portfolio optimization tools. Analyzed KYC files to ensure compliance with management's policies, focusing on high-risk files and adverse client information.
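
A minimal sketch of the preprocessing steps described in the Data Preprocessing bullet above (imputation, outlier handling, scaling) using pandas and scikit-learn; the dataset and column names are hypothetical:

```python
# Preprocess transaction features: impute, clip outliers, standardize.
# Dataset and column names below are hypothetical placeholders.
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("transactions.csv")
features = ["amount", "txn_count", "days_since_last_txn"]

# Impute missing values with the median.
df[features] = SimpleImputer(strategy="median").fit_transform(df[features])

# Clip extreme outliers to the 1st/99th percentiles.
for col in features:
    lo, hi = df[col].quantile([0.01, 0.99])
    df[col] = df[col].clip(lo, hi)

# Standardize features before anomaly scoring.
df[features] = StandardScaler().fit_transform(df[features])
```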

Prudential, Scranton, Pennsylvania. Dec 2017 - Sep 2018
Sr. Business Data Analyst
Responsibilities:
Worked across all stages of the SDLC: business, functional, and technical requirements gathering, design, documentation, development, and testing.
Developed various complex data mappings using Data Mapping Designer, working with Aggregator, Lookup (connected and unconnected), Filter, Router, Joiner, Source Qualifier (ERP), Expression, Stored Procedure, Sorter, and Sequence Generator transformations.
Created, tested, and debugged stored procedures, functions, packages, cursors, and triggers using PL/SQL. Strong experience in business and data analysis, data profiling, data migration, data integration, and metadata management services.
Thoroughly analyzed large, complex datasets with advanced data mining techniques, uncovering trends and patterns that enabled accurate forecasting (see the trend sketch below). Worked on SQL query optimization and job schedulers to minimize abends and improve data quality in a VMware Horizon Client virtual desktop environment.
Environment: Teradata 15/14, Microsoft SQL Server, UNIX, FastLoad, MS Office Tools, Tableau.
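
An illustrative rolling-average trend computation of the kind that supports the forecasting work described above; the dataset and column names are assumptions, not project artifacts:

```python
# Smooth a monthly metric and flag inflection points to support forecasting.
# Dataset and column names below are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("monthly_metrics.csv", parse_dates=["month"]).sort_values("month")

# 3-month rolling average smooths noise and exposes the underlying trend.
df["rolling_avg"] = df["value"].rolling(window=3).mean()

# Month-over-month change highlights inflection points.
df["mom_change"] = df["value"].pct_change()

print(df.tail())
```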


TruConnect, Dallas, Texas. June 2016 - Nov 2017
Sr. Business Data Analyst
Responsibilities:
Involved in requirements gathering, business analysis, design, development, testing, and implementation of business rules. Efficiently identified patterns within supplied raw data using advanced analytics techniques such as clustering, classification, and regression modeling (see the clustering sketch below).
Actively participated in the code migration process to higher environments via VMware Horizon Client virtual desktop, and created documentation for the same. Used data analysis techniques to validate business rules and identify low-quality and missing data in the existing Amgen enterprise data warehouse (EDW).
Reviewed SQL for missing joins, join constraints, data format issues, mismatched aliases, and casting errors.
Environment: Teradata 15, Teradata SQL Assistant, UNIX
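
A minimal clustering sketch matching the pattern-identification bullet above, using scikit-learn's KMeans; the feature columns are hypothetical placeholders:

```python
# Segment subscribers by usage patterns with KMeans clustering.
# Dataset and feature columns below are hypothetical placeholders.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("subscriber_usage.csv")
X = StandardScaler().fit_transform(
    df[["monthly_minutes", "data_gb", "tenure_months"]]
)

kmeans = KMeans(n_clusters=4, n_init=10, random_state=42)
df["segment"] = kmeans.fit_predict(X)

# Profile each segment to interpret the discovered patterns.
print(df.groupby("segment").mean(numeric_only=True))
```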

Google Maps India Pvt. Ltd., Hyderabad, India. Sep 2012 - May 2015
Sr. Analyst
Responsibilities:
Performed query optimization (explain plans, statistics collection, primary and secondary indexes).
Prepared use case models based on gathered business requirements documentation.
Extensively tested business reports by running SQL queries against the database and reviewing the report requirements documentation.
Performed data analysis on the existing data warehouses of AFS, ACBS, and InfoLease.
Environment: SSMS, Microsoft products.