Sravani Veeraneni - Data Analyst
Location: Minneapolis, Minnesota, USA
Relocation: Yes
Visa: OPT EAD
Email: [email protected] | PH: +1 (507) 366-8398 | LinkedIn: https://www.linkedin.com/in/veeranenisravani66

PROFESSIONAL SUMMARY:
- Around 4 years of professional experience as a Business Data Analyst, with a strong understanding of large-scale data and analytics solutions across multiple databases, including data collection, cleaning, visualization, manipulation, and interpretation.
- Expertise in data analysis, including data profiling, data mapping, and data validation, to ensure data integrity and accuracy.
- Experience in data analysis and data visualization using Tableau, Microsoft Power BI, and Visio; model building with machine learning algorithms for prediction and forecasting; statistical tools such as SAS and R; and data mining using Python, SQL, Hadoop, Spark, Hive, etc.
- Experience in Data Extraction/Transformation/Loading (ETL), data conversion, and data migration using Microsoft SQL Server Integration Services (SSIS) and Informatica.
- Experienced in conducting user acceptance testing (UAT) and supporting end-users in adopting new systems and processes.
- Managed the integration of employee data from SAP SuccessFactors to other business intelligence systems for seamless reporting and analysis.
- Collected, managed, and analyzed healthcare data from multiple sources such as Electronic Health Records (EHRs), patient management systems, and clinical databases.
- Extensive experience working with JSON, XML, T-SQL, and Python schema design.
- Developed use cases and user stories that capture functional and non-functional requirements.
- Experience performing gap analysis, root cause analysis, risk analysis, and impact analysis to define boundaries, identify issues, minimize costs, and optimize solutions.
- Analyzed data from CVS Health's retail operations to optimize the integration of health services with store operations, improving access to care for customers.
- Employed various JAD techniques, such as brainstorming, prototyping, and group discussions, to foster collaboration and promote a shared understanding of business goals and data needs.
- Experienced in conducting sprint planning sessions and backlog grooming activities using Jira.
- Proficient in creating documentation deliverables such as business requirements documents (BRDs) and functional requirements documents (FRDs).
- Strong knowledge of insurance regulations and compliance requirements specific to the P&C insurance domain, ensuring adherence to legal and industry standards.
- Skilled in SDLC methodologies such as Agile (Scrum) and Waterfall to organize project tasks, facilitate iterative development, and ensure timely delivery.
- Built data pipelines that enable faster, better, data-informed decision-making within the business.
- Proficient in utilizing Microsoft Visio for creating and modifying diagrams, flowcharts, organizational charts, and other visual representations.
- Worked with project managers to prioritize data analysis tasks based on business needs and project timelines, ensuring that critical analyses were completed on time.
- Expert in writing and optimizing SQL queries in Oracle, Teradata, and SQL Server.
- Experience in designing SQL queries using joins, sub-queries, functions, indexes, views, materialized views, set operators, GROUP BY, and OLAP functions (see the sketch at the end of this summary).
- Conducted outcome analysis to assess the efficacy of treatments, healthcare interventions, and patient care programs.
- Experience working with databases such as MongoDB, MySQL, and Cassandra.
- Created databases, schemas, and tables using a snowflake schema design and migrated multi-state data from SQL Server to Snowflake.
- Maintained knowledge of designing star and snowflake schemas for data modeling and created user stories using the JIRA tool.
- Solid experience in creating cloud-based applications using Amazon Web Services (including Amazon RDS) and Microsoft Azure.
- Worked closely with the project manager to identify data-related dependencies and constraints, ensuring seamless integration of data analysis tasks within the Waterfall framework.
- Worked on complex ad-hoc SQL queries using joins, DML, DDL, pivots, views, constraints, and functions in Oracle, MS SQL Server, and MySQL.
- Experienced in working with JIRA software, including creating and managing projects, workflows, custom fields, dashboards, and reports.
- Implemented data audits and quality checks to maintain high data integrity across healthcare systems.
- Conducted data profiling and data quality assessments using SQL on DB2, identifying data anomalies, inconsistencies, and data integrity issues that required resolution.
- Extensive experience with Informatica PowerCenter and PowerExchange for designing, developing, and maintaining ETL workflows for data integration and migration between various data sources and targets.
- Advanced working knowledge of MS Project, MS Word, MS Excel, MS Visio, and MS PowerPoint to create test plans and timeframes for deliverables.
- Prepared and distributed JAD session agendas, meeting materials, and follow-up documentation to ensure all stakeholders were well-informed and engaged throughout the project lifecycle.
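The SQL-design bullet above is the kind of work a short example makes concrete. Below is a minimal, self-contained sketch of an analytical query combining a join, GROUP BY, and an OLAP-style window function. It runs against an in-memory SQLite database (3.25+ for window-function support) with hypothetical tables and values; actual project work was against Oracle, Teradata, and SQL Server, which support the same general pattern.

import sqlite3

# In-memory database with two hypothetical tables, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (
        order_id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(customer_id),
        amount REAL
    );
    INSERT INTO customers VALUES (1, 'East'), (2, 'West'), (3, 'East');
    INSERT INTO orders VALUES (10, 1, 120.0), (11, 2, 80.0),
                              (12, 3, 200.0), (13, 1, 50.0);
""")

# Join + GROUP BY + an OLAP-style window function (RANK over the aggregate).
query = """
    SELECT c.region,
           SUM(o.amount) AS region_total,
           RANK() OVER (ORDER BY SUM(o.amount) DESC) AS revenue_rank
    FROM orders AS o
    JOIN customers AS c ON c.customer_id = o.customer_id
    GROUP BY c.region
"""
for region, total, rank in conn.execute(query):
    print(region, total, rank)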
Technical Skills:
Languages: Python, R, PL/SQL, SQL, C, C++
Statistical Analysis: R, Python, SAS E-Miner, SAS programming
Databases: SQL Server, MS Access, Oracle, Teradata, big data, Hadoop
Cloud: AWS and Azure
DWH / BI Tools: Tableau, Power BI, SSIS, SSRS, SSAS, Business Intelligence Development Studio (BIDS), Visual Studio, Crystal Reports, Informatica 6.1
Database Design Tools and Data Modeling: MS Visio, ERwin 4.5/4.0, star schema/snowflake schema modeling, fact & dimension tables, physical & logical data modeling, normalization and de-normalization techniques, Kimball & Inmon methodologies, Informatica
Tools and Utilities: SQL Server Management Studio, SQL Server Enterprise Manager, SQL Server Profiler, Import & Export Wizard, Microsoft Management Console, Visual SourceSafe 6.0, DTS, Crystal Reports, ProClarity, Microsoft Office, Excel, Excel Data Explorer, Tableau, JIRA, Spark MLlib (see the sketch below), MDM, IDQ
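As a brief illustration of the Spark MLlib entry above (and the clustering-style data mining mentioned in the summary), here is a minimal PySpark sketch. It assumes a local PySpark installation; the dataset, column names, and choice of k are hypothetical.

from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.clustering import KMeans

spark = SparkSession.builder.appName("clustering-sketch").getOrCreate()

# Hypothetical customer features, purely for illustration.
df = spark.createDataFrame(
    [(1, 10.0, 2.0), (2, 12.0, 2.5), (3, 55.0, 9.0), (4, 60.0, 8.5)],
    ["customer_id", "monthly_spend", "visits"],
)

# Assemble raw columns into the single vector column MLlib expects.
features = VectorAssembler(
    inputCols=["monthly_spend", "visits"], outputCol="features"
).transform(df)

# Fit k-means with two clusters and attach a cluster label to each row.
model = KMeans(k=2, seed=42).fit(features)
model.transform(features).select("customer_id", "prediction").show()

spark.stop()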
PROFESSIONAL EXPERIENCE

Client: Walmart, Arkansas (March 2023 - Present)
Role: Data Analyst
Responsibilities:
- Analyzed data and extracted actionable insights to support business decision-making.
- Designed and implemented effective analytics solutions and models with Snowflake.
- Collaborated with business stakeholders to understand their data analysis requirements and developed custom dashboards and visualizations using Databricks notebooks and Power BI.
- Monitored and maintained data integrity within SuccessFactors by performing regular data audits, ensuring compliance with company policies and HR regulations.
- Provided insightful recommendations based on data analysis and project outcomes, driving informed decision-making.
- Demonstrated ability to work effectively in a global delivery environment, collaborating with diverse teams across different time zones and cultures.
- Developed custom reports and dashboards within SAP SuccessFactors using built-in analytics tools to track key HR metrics such as turnover, recruitment success, and performance management.
- Utilized advanced data mining techniques, such as clustering, classification, and association rule mining, to discover patterns, trends, and insights from large datasets, facilitating informed business decision-making.
- Conducted ad-hoc data analysis requests and provided insights to address specific business challenges.
- Supported Managed Care Contract Management/CMS, quality, and reporting.
- Provided recommendations and insights to drive business decisions and strategies, keeping up with industry trends.
- Ensured accurate synchronization of HRIS data across multiple platforms, improving data reliability for decision-makers.
- Designed and implemented end-to-end data pipelines to ensure smooth data flow from source systems to analytics platforms.
- Addressed and resolved complex technical challenges related to integration and system compatibility.
- Developed compelling visualizations and effectively communicated data insights to stakeholders.
- Identified Key Performance Indicators (KPIs) and developed reports and dashboards for monitoring and tracking performance.
- Reported on the impact of data normalization on analysis outcomes, ensuring stakeholders understood the significance of clean, normalized data.
- Worked on moving data from AWS S3 to Snowflake and vice versa.
- Developed and executed complex SQL queries on DB2 databases to retrieve, analyze, and manipulate large volumes of business data, providing valuable insights for decision-making.
- Used QlikView, a data discovery and visualization tool, to explore data and create interactive dashboards and reports.
- Utilized Jira reporting and metrics capabilities to generate insightful reports, enabling data-driven decision-making and project performance evaluation.
- Queried datasets from Snowflake and Athena for data analysis and visualization reporting.
- Created and maintained comprehensive documentation (BRDs) for data analysis methodologies, processes, and findings to facilitate knowledge sharing.
- Utilized the Agile method of daily scrum meetings to discuss project-related information.
- Involved in the complete SDLC of big data projects, from requirement analysis to production.
- Conducted User Acceptance Testing (UAT) to validate data-driven solutions and applications.
- Integrated data from multiple sources into Excel, performing data consolidation and validation.
- Utilized advanced Excel functions and pivot tables for data analysis and reporting.
- Used SUMIFS and LOOKUP features in MS Excel for tracking model construction.
- Developed VBA, forms, reports, and queries for MS Access databases.
- Ensured adherence to data governance rules and standards for consistent business element names.
- Integrated Databricks with cloud providers (Google Cloud, Azure) for efficient data storage and retrieval.
- Worked with data lakes (Microsoft Azure), leveraging their storage and processing capabilities to handle large-scale datasets for advanced analytics and reporting.
- Assisted in the development and implementation of technology solutions, such as CRM and enterprise resource planning (ERP) systems, to optimize business processes.
- Applied min-max scaling or z-score normalization to numerical features for uniformity (see the sketch at the end of this section).
- Performed data profiling activities to understand the structure, quality, and content of source data.
- Identified data inconsistencies or anomalies that could impact the mapping process.
- Experienced in managing data model repositories and creating comprehensive documentation in metadata portals using industry-leading tools such as Erwin.
- Used ETL (Extract, Transform, Load) tools to automate and streamline data flow across different environments.
- Utilized SQL and Python to extract, manipulate, and analyze large and complex datasets from diverse sources, including PostgreSQL databases, APIs, and structured/unstructured data.
- Worked with data ingestion from multiple sources into the Azure Data Lake and worked on data loads using Azure Data Factory through an external-table approach.
- Wrote and maintained T-SQL and PL/SQL scripts to extract, transform, and load data from various databases to Databricks.
- Designed and deployed scalable, highly available, and fault-tolerant systems on Azure.
- Utilized GitLab, Docker, and Kubernetes for CI/CD on microservices and deployed them to the Azure cloud.
- Developed purging scripts and routines to manage data on Azure SQL Server and Azure Blob Storage.
- Utilized the Talend ETL tool to create XML and JSON scripts for data extraction and storage in Oracle DB.
- Worked with various data integration and transformation technologies (Spark, Hive, Pig) to implement data processing workflows.
- Linked data lineage to data quality and business glossary work within the overall data governance program.

Environment: Azure, Azure Data Lake, MS Excel, MS Access, Azure Data Factory, Power BI, Agile, Snowflake, ETL tools, XML, JSON, SQL, CI/CD, Azure Cloud, T-SQL, PL/SQL, PostgreSQL, Erwin, AWS S3, QlikView, Databricks.
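A minimal pandas sketch of the min-max scaling and z-score normalization referenced in this role's responsibilities; the column name and values are hypothetical stand-ins for source-extract features.

import pandas as pd

# Hypothetical numeric feature; in practice the values came from source extracts.
df = pd.DataFrame({"sales": [120.0, 80.0, 200.0, 50.0]})
col = df["sales"]

# Min-max scaling: rescale values into the [0, 1] range.
df["sales_minmax"] = (col - col.min()) / (col.max() - col.min())

# Z-score normalization: center on the mean, scale by standard deviation.
df["sales_zscore"] = (col - col.mean()) / col.std()

print(df)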
Company: Capgemini | Client: CVS Health (July 2020 - January 2022)
Role: Data Analyst
Responsibilities:
- Defined the customer's critical-to-quality (CTQ) issues and requirements for the current defect management process.
- Used the Collibra tool to manage and store rationalized metadata and created automated process workflows for data validation by chief data stewards.
- Measured performance of the as-is process and collected metrics such as defect resolution rate and cycle time.
- Worked with ETL tools like Talend, Alteryx, and SSIS to automate data normalization in larger data pipelines.
- Mapped and transformed claims data from various formats into FHIR-compatible formats to ensure consistency and accuracy in data reporting and analysis (see the sketch at the end of this section).
- Involved in performance measurement to develop measurable indicators that can be systematically tracked to assess progress toward predetermined goals.
- Worked on building data warehouse structures, creating fact, dimension, and aggregate tables through dimensional modeling with star and snowflake schemas.
- Proficient in utilizing Quantum Metric to analyze user behavior, identify digital experience issues, and optimize website and application performance.
- Responsible for ETL design (identifying source systems, designing source-to-target relationships, data cleansing, data quality, creating source specifications, and ETL design documents) and ETL development following Velocity best practices.
- Led data migration projects from legacy systems to SAP SuccessFactors, ensuring smooth data transitions, minimal downtime, and high data accuracy.
- Ensured data security and compliance by implementing Azure's security features, such as data encryption, role-based access control (RBAC), and auditing, to safeguard sensitive data during analysis.
- Experienced in leveraging Quantum Metric's platform to gain insights into user journeys, interactions, and pain points across digital touchpoints.
- Leveraged Dynatrace's APM capabilities to monitor application performance, detect anomalies, and troubleshoot issues in real time.
- Created business requirements and high-level design documentation and validated multiple web services used by the client, both REST and SOAP.
- Used Quantum Metric's session replay feature to visualize user sessions, understand user behavior, and diagnose usability issues.
- Gained end-to-end visibility into application and infrastructure performance with Dynatrace's full-stack observability platform, spanning the frontend, backend, and infrastructure layers.
- Experienced in using Adobe Analytics to analyze and interpret data, track key performance indicators (KPIs), and generate actionable insights for digital marketing strategies.
- Implemented Power BI Power Query to extract data from external sources and modify it to generate reports; designed and developed complex KPIs.
- Performed report validation by writing SQL queries using TOAD on an Oracle database.
- Worked on real-time data flows using tools like Apache Kafka and AWS Kinesis for real-time analytics.
- Familiar with using Dynatrace to monitor cloud-native environments, such as Kubernetes clusters, AWS, Azure, and Google Cloud Platform (GCP), to ensure optimal performance and scalability.
- Set up and configured real-time monitoring dashboards in Quantum Metric to track key metrics, detect anomalies, and respond to issues promptly.
- Proficient in creating customized reports and dashboards in Adobe Analytics to monitor website traffic, user behavior, and campaign performance.
- Worked on creating documents in a MongoDB database and creating various types of indexes on different collections.
- Created business requirements, technical specification documents, use case documents, and test cases, and managed project work cycles as a member of the ServiceNow development team.
- Cerner Managed Care Contract Management/CMS - Build and Load.
- Used Adobe Analytics to track conversion funnels, analyze conversion rates, and identify opportunities for optimization.
- Designed and orchestrated end-to-end data pipelines using Azure Data Factory, seamlessly integrating data from various sources, performing transformations, and loading data into target destinations; streamlined data movement and ensured data accuracy for analysis.
- Integrated Adobe Analytics with other Adobe Marketing Cloud solutions, such as Adobe Experience Manager (AEM) and Adobe Campaign, for seamless data sharing and campaign management.
- Wrote Snowflake SQL queries and developed scripts (Unix, Python, etc.) to extract, load, and transform data.
- Conducted user interviews and one-to-one sessions with managers and critical users, and arranged JAD sessions.
- Performed data mining using complex SQL queries to discover patterns, and used extensive SQL data profiling/analysis to guide building the data model.
- Utilized Informatica to design, develop, and execute data transformation and ETL (Extract, Transform, Load) processes.
- Utilized Azure Databricks to analyze and process large-scale datasets, leveraging distributed computing capabilities to perform complex data transformations, machine learning, and advanced analytics tasks.
- Managed and created JIRA epics for confidential projects and translated functional design requirements into JIRA user stories.
- Experienced in end-to-end data analysis, from data cleaning, data manipulation, data mapping, data mining, and database (Oracle) testing to developing controls using R and Python and reporting using LaTeX.
- Acquired appropriate business rules in coordination with users to understand the business functionality.
- Performed data mapping and logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data within the Oracle database.
- Demonstrated proficiency in utilizing Azure Synapse Analytics (formerly SQL Data Warehouse) to create scalable data warehousing solutions.
- Worked as a Databricks engineer with a focus on data warehousing ETL development in Azure Databricks with FHIR, Database, and ADLS (Python).
- Used MS Visio to develop end-to-end entity relationship diagrams.
- Utilized Python to conduct statistical analysis and hypothesis testing, applying libraries like SciPy and statsmodels to validate assumptions, draw conclusions, and make data-driven recommendations.
- Demonstrated expertise in using Python to handle complex data manipulation tasks, such as merging datasets, reshaping data structures, and handling missing values, ensuring data integrity and accuracy.
- Responsible for the analysis and design of data models, data mapping, and migration.
- Continuously worked with upper-level management on project planning, scheduling, and budgeting using JIRA.
- Designed and implemented data integration workflows in Informatica, mapping source data to target structures and ensuring accurate data migration, consolidation, and synchronization.
- Wrote stored procedures, packages, and PL/SQL scripts to incorporate new validations and enhance existing provisioning logic.
- Performed UAT and obtained sign-off from the application owner before taking the application live in AWS.
- Implemented data collection and transformation in the AWS cloud computing platform using S3, Athena, Glue, Redshift, PostgreSQL, and QuickSight.

Environment: GCP, Hadoop, Kafka, Tableau, MS Excel, MS Access, Spark, Python, Data Lake, AWS, Hive, REST API, Agile methodology, Snowflake, PL/SQL, Redshift, PostgreSQL, QuickSight, JIRA, S3, Athena, Glue, TOAD, Oracle, Power BI, REST, SOAP.
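An illustrative sketch of the claims-to-FHIR mapping referenced in this section. The source field names, values, and mapping are hypothetical, not the actual client mapping; the point is the transformation pattern from a flat record to a FHIR-shaped Claim resource.

# Illustrative mapping of a flat claims record into a FHIR-style Claim resource.
def claim_to_fhir(record: dict) -> dict:
    return {
        "resourceType": "Claim",
        "id": record["claim_id"],
        "status": "active",
        "use": "claim",
        "patient": {"reference": f"Patient/{record['member_id']}"},
        "created": record["service_date"],
        "total": {"value": record["billed_amount"], "currency": "USD"},
    }

source_record = {
    "claim_id": "C1001",            # hypothetical source fields
    "member_id": "M42",
    "service_date": "2021-03-15",
    "billed_amount": 125.50,
}
print(claim_to_fhir(source_record))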
Company: Capgemini | Client: CVS Health (August 2019 - July 2020)
Role: Intern Data Analyst
Responsibilities:
- Performed initial analysis to root out and identify erroneous/misaligned data.
- Worked on data verification to review and confirm the results of data analysis, pinpointed data needing cleaning, corrected erroneous/misaligned data, and tested and confirmed data quality.
- Worked with management to prioritize business and information needs.
- Provided a standard process for analyzing data across multiple systems, identified erroneous/misaligned data, recommended resolutions, and interpreted data and analyzed results using statistical techniques.
- Extensively used SQL queries to analyze data and validate data correctness and completeness.
- Worked with cloud data warehouses (Azure SQL Data Warehouse, Snowflake), Tableau, and Informatica Cloud Data Integration solutions to augment performance, productivity, and connectivity to cloud and on-premises sources.
- Performed data analysis and SQL queries for testing and troubleshooting the data warehouse.
- Ensured data security and compliance by implementing AWS's security features, such as encryption, identity and access management (IAM), and auditing, to safeguard sensitive data during analysis.
- Led BA activities, including managing multiple development projects, development groups, and/or application support functions for major business segments for Case Management and PBM Cerner models, as well as Athena and eCW.
- Managed Care Contract Management/CMS Build and Load.
- Performed detailed data analysis (DDA), data quality analysis (DQA), and data profiling on source data.
- Tested complex ETL mappings and sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Evaluated Snowflake design considerations for any change in the application; built logical and physical data models for Snowflake as per the required changes.
- Defined virtual warehouse sizing in Snowflake for different types of workloads.
- Worked on Electronic Health Records (EHR) systems, integrating and maintaining patient data with attention to privacy (HIPAA) and compliance standards.
- Utilized AWS Lambda to build serverless data processing pipelines, performing transformations and calculations on demand and orchestrating data analysis tasks within the AWS ecosystem (see the sketch at the end of this section).
- Collaborated with pharmacy and clinical teams to ensure the integrity of health data, including patient records, claims data, and prescriptions.
- Ensured that all data analysis and reporting adhered to HIPAA and other healthcare regulations regarding patient data privacy and security; implemented data governance best practices to safeguard sensitive patient information.
- Designed and developed insights reports on AWS QuickSight as part of client deliverables.
- Involved in testing XML files and checking whether data was parsed and loaded to staging tables.
- Designed ER diagrams, the logical model (relationships, cardinality, attributes, and candidate keys), and the physical database (capacity planning, object creation, and aggregation strategies) as per business requirements.
- Filtered and cleaned data by reviewing data, computer reports, printouts, internet searches, and performance indicators.
- Identified and recommended new ways to save money by streamlining business processes.
- Designed and developed customized Tableau dashboards tailored to specific business needs, presenting key performance indicators (KPIs) and metrics in a visually appealing and user-friendly manner.
- Experience in Salesforce customization, security access, workflow approvals, data validation, data utilities, analytics, sales, marketing, customer service, and support administration.
- Designed and implemented ETL processes to extract, transform, and load data from various healthcare systems into CVS's data warehouses.
- Created and updated users, reports, and dashboards to track pipeline/stages for management visibility, while integrating Apex applications with Salesforce accounts such as Conga Merge and Outlook.
- Created pivot tables and ran VLOOKUPs in Excel as part of data validation.
- Defined roles and responsibilities related to data governance, ensuring clear accountability for stewardship of the company's principal information assets.
- Responsible for loading, extracting, and validating client data.
- Analyzed data using relational database products such as MS Access, Teradata, and SQL Server.

Environment: UNIX, Erwin, SQL, AWS QuickSight, Talend, star & snowflake schemas, MS Excel, MS Access, SQL queries and ad-hoc reporting, VLOOKUPs, XML.
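A minimal sketch of an AWS Lambda handler for the kind of serverless data processing referenced in this section. It assumes an S3 put-event trigger; the bucket names and the transformation itself are hypothetical placeholders, not the actual pipeline.

import json
import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    """Triggered by an S3 upload: read the new object, apply a simple
    transformation, and write the result to a (hypothetical) target bucket."""
    record = event["Records"][0]["s3"]          # standard S3 event shape
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    rows = [line.strip().upper() for line in body.splitlines()]  # placeholder transform

    s3.put_object(
        Bucket="processed-data-bucket",          # hypothetical target bucket
        Key=key,
        Body="\n".join(rows).encode("utf-8"),
    )
    return {"statusCode": 200, "body": json.dumps({"rows_processed": len(rows)})}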
EDUCATION

Concordia University - St. Paul (August 2022 - December 2023)
Master's in Information Technology Management, GPA 3.3
My master's in Information Technology Management from Concordia University has broadened my expertise across software engineering, AI, data science, cybersecurity, and cloud computing. The program's blend of theory and practical application has equipped me with robust problem-solving skills. Hands-on projects and collaborative research have kept me updated on emerging technologies, fostering a holistic problem-solving approach. Networking opportunities and mentorship have allowed me to build valuable connections with industry professionals. As I approach the program's conclusion, I am confident that the knowledge and skills acquired will form a strong basis for my future endeavors in computer technology.

Ellenki College of Engineering and Technology, Hyderabad, Telangana (August 2016 - June 2020)
Bachelor's in Computer Science, CGPA 6.87
Completed a bachelor's degree in computer science with a focus on software development and algorithms. Courses included programming languages such as Java, Python, and C++, as well as classes in data structures, algorithms, and software engineering principles. Achieved a strong understanding of computer systems, networking, and operating systems through coursework and project-based learning experiences.

CERTIFICATIONS
- CITI Program - Social & Behavioral Research - Basic/Refresher
- Career Essentials in Data Analysis by Microsoft and LinkedIn