Shiva Nagendra Babu Kore
Data Analyst
Edison, New Jersey | [email protected] | +1(203)-290-9550 | linkedin.com/shivanagendra
Professional Summary
Over 10 years of experience as a Data Analyst in the healthcare, retail, finance, and insurance industries
Strong technical skills in data analysis, SQL, and programming with Python, R, PowerShell, .NET, and Angular
Skilled in documenting process flows and developing recommendations for optimization
Proficient with data visualization tools including OBIEE, Tableau, Power BI, and Looker
Experienced in collaborating with data and AI teams to explore opportunities for automation
Worked extensively with Azure Blob Storage and AWS S3 for data extraction, transformation, and storage, streamlining cloud-based data operations.
Proficient in analyzing program data to identify trends, patterns, and areas for improvement
Created and maintained Define.xml files to describe the structure and content of SDTM datasets.
Strong communication and interpersonal skills, with experience collaborating with offshore teams and client stakeholders
Skilled in designing and implementing data visualizations using Tableau, Looker, and other tools to drive actionable insights.
Experienced in interpreting complex data sets and defining KPIs to guide strategic decision-making.
Demonstrated ability to lead projects independently, from conception to delivery, in dynamic team environments.
Strong communication skills, with a proven track record of presenting complex data findings to non-technical stakeholders.
Conducted ad-hoc analysis using SQL and Excel to provide timely insights and recommendations to management.
Contributed to decision-making by providing real-time insights using Power BI and Excel, enabling the business to make data-driven strategic choices.
Engaged with cross-functional teams to ensure efficient data processing and presented technical solutions to end-users in a clear and understandable manner.
Experienced in designing and developing complex reports and dashboards using tools like SQL Server Reporting Services (SSRS), Tableau, and Power BI
Proficient in leveraging big data tools like Hadoop, Spark, Hive, and Pig for advanced data analysis.
Experienced in version control and collaboration tools like Git and Sourcetree.
Highly experienced in creating complex Informatica mappings and workflows working with major transformations.
Expertise in Microsoft Integration Services like SSIS (Control Flow tasks, Data Flow tasks, Transformations, Database administration tasks)
Expertise in creating reports in Power BI preview portal utilizing the SSAS Tabular via Analysis connector.
Skilled in building and deploying various machine learning models including logistic regression, linear regression, neural networks, random forest, and more using Python libraries like scikit-learn.
Proficient in time series analysis, forecasting models, and building monitoring dashboards using Python and its data visualization libraries.
Experienced in building data pipelines, REST APIs, and containerizing applications using Python, Flask, Kafka, Docker, and Ansible
Skilled in test-driven development, agile methodologies, and SCRUM processes, with the ability to transform business requirements into analytical models and algorithms.

Technical Skills
Programming Languages: Python, Java, R, SQL, .NET, Angular, PowerShell
Databases: MySQL, PostgreSQL, SQL Server, MongoDB, Snowflake, AWS Redshift
Business Intelligence Tools: Excel, Tableau, Power BI, Shiny, Teradata, SQL Server Reporting Services (SSRS), Crystal Reports, Looker, Business Objects, MicroStrategy
Cloud Platforms: AWS (Glue, Athena), Azure (Data Factory, SQL), Google Cloud Platform (GCP)
Libraries: Python (Pandas, NumPy, SciPy, Scikit-learn, Statsmodels, NLTK, Plotly, Matplotlib, Seaborn), R (plyr, dplyr, data.table, sqldf, tidyr, forecast, reshape2, caret), Web Scraping (BeautifulSoup, Scrapy), TensorFlow
Frameworks: Django, Flask, Spring Boot, Spring, Angular
Big Data Technologies: Apache Hadoop, Hive, Spark, Snowflake
Data Warehousing: SQL Server Integration Services (SSIS), Informatica (PowerCenter/PowerMart), Data Mining, Data Mart, OLAP, OLTP, Data Profiler, IBM InfoSphere
Statistical Models: Decision Trees, Naive Bayes classification, Logistic Regression, Linear Regression, Neural Networks, Support Vector Machines, Clustering Algorithms and PCA
Reporting Tools: Tableau, SAS, Power BI, SQL Server Reporting Services (SSRS), Alteryx, Adobe Analytics
ETL Tools: Azure Data Factory, AWS Glue, dbt (data build tool)
Operating Systems & Scripting: Windows, Linux, UNIX, macOS; PowerShell, UNIX shell scripting
IDE Tools: PyCharm, IntelliJ IDEA, Anaconda, Jupyter Notebook
Web Development: HTML, CSS & JavaScript

Education:
Sacred Heart University | Master's in Computer Science | Fairfield, CT | 2015 - 2016
GPA - 3.51/4.0
Coursework: Digital Marketing, Machine Learning, Deep Learning, Text-Based Analysis, etc.
Lovely Professional University | Bachelor's in Computer Science | Punjab, India | 2008 - 2012

Professional Experience

Truist, Atlanta, GA Jan 2022 - Present
Senior Data Analyst
Project: Finance Analysis

Responsibilities:
Implemented and followed Agile development methodology within the cross-functional team and acted as a liaison between the business user group and the technical team.
Designed and implemented scalable data pipelines using Dataiku, enabling efficient data processing and analysis across multiple data sources
Led and participated in collaborative data science projects, leveraging Dataiku's integrated environment to coordinate tasks across data preparation, model building, and deployment
Developed large-scale data structures and pipelines to organize, collect, and standardize data using SQL and Python, helping generate insights and address reporting needs.
Performed feature engineering, trained algorithms, back-tested models, and compared model performance using Python packages such as Pandas, NumPy, and SciPy.
Wrote ETL (Extract/Transform/Load) processes, designed database systems, and developed tools for real-time and offline analytic processing.
Gathered data from various sources, including databases and spreadsheets, and reviewed the data for accuracy and completeness.
Cleaned data to eliminate duplicates, corrected errors, filled in missing values, and standardized formats for integration into a data lake.
Designed and implemented scalable data pipelines using Azure Data Factory and SQL for data processing and analysis across multiple sources.
Collaborated with engineers, analysts, and developers to understand both upstream and downstream data uses.
Utilized Power BI to create interactive and intuitive visualizations, communicating insights to non-technical audiences.
Worked extensively with Azure and AWS S3 to extract, load, and store data.
Applied SQL for advanced data querying and data manipulation, optimizing database operations and ensuring data accuracy.

Chubb Inc., Warren, New Jersey May 2018 - Dec 2021
Senior Data Analyst
Project: Insurance Service Analysis

Responsibilities:

Developed and deployed predictive models based on historical data to forecast customer experience.
Built high-performance data workflows and algorithms using Tableau Prep, Python, and R
Performed Exploratory Data Analysis using R and generated various graphs and charts for analyzing the data using Python libraries
Designed data profiles for processing, including running SQL, and using Python and R for Data Acquisition and Data Integrity which consists of Datasets Comparing and Dataset schema checks.
Deployed ML solutions (e.g., new bidding models) into production, spanning supervised, unsupervised, reinforcement, and deep learning.
Developed predictive models based on historical data using Python and SQL, leading to improved customer experience.
Performed data cleaning and data visualization using Python libraries like Pandas, NumPy, and Matplotlib.
Collaborated with business owners to prepare comprehensive data mapping documents and present data findings through Power BI dashboards.
Designed and maintained data processing pipelines using Azure Data Factory and performed real-time data extraction and transformation using AWS S3.
Automated workflows using SQL scripts and shell scripting for data ingestion, ensuring consistent data quality.
Created Looks and Dashboards on Looker and OBIEE, including measures and dimensions based on data and business requirements
Contributed to the experimental design, execution, testing, and critical evaluation of methods as applied to translational data science research projects.
Built the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS cloud native technologies.
Updated Python scripts to match training data with our database stored in Azure Cloud Search, so that each document could be assigned a response label for further classification.
Used a flexible, analytical approach to extract optimal value from biomedical data.
Contributed to the design and conduct of continuous validation plans for production systems that incorporate models and algorithms, providing guidelines and support for large-scale implementation.
Performed Exploratory Data Analysis and Data Visualization using R and Tableau.
Contributed to the creation, adoption, and adherence to best-practice methodologies for performing data analysis and predictive modeling experiments.
Worked on Digital Engineering tasks involving Natural Language Processing (NLP), Artificial Intelligence (AI), and Machine Learning (ML) methods, techniques, and tools, including multiple concurrent projects in a matrix environment while creating new programs and starting new projects.

WestRock, Atlanta, GA Feb 2016 - Mar 2018
Data Analyst
Project: Packaging Analysis

Responsibilities:
Created and updated SQL tables, databases, stored procedures, and queries to modify and/or create reports for respective business units, and used MongoDB to create queries.
Created user-defined functions in Python to automate repetitive tasks and increase the efficiency of data pipeline development; used Angular for the frontend.
Built high-performance data workflows and algorithms using OBIEE, Tableau Prep, Python, and R
Performed Data visualization and Designed dashboards with KIBANA, and generated complex reports, including Charts, Summaries, and Graphs to communicate the findings to the team and stakeholders.
Developed a data extraction program to pull potentially useful information from web pages.
Performed Data Cleaning, Data Visualization, Information retrieval, Feature Engineering using Python libraries such as Pandas, NumPy, Scikit-learn, Matplotlib and Seaborn.
Engineered features from raw data through imputation, normalization, and scaling of the data frame as required; converted categorical variables to numerical values using a label encoder for EDA and readability by the machine learning models.
Performed univariate, bivariate, and multivariate analysis to check how the features were related in conjunction to each other and the risk factor.
Applied PCA to reduce the correlation between features and high dimensionality of the standardized data so that maximum variance is preserved along with relevant features.
Built machine learning models for Regressions based on Decision Trees, Support Vector Machine and Random Forest to predict the different risk levels of applicants and used Grid Search to improve the accuracy over the cleaned data.
Proactively identified opportunities to automate time and resource intensive procedures associated with data validation and transformation using Python and Azure Data Factory.

Logica, Hyderabad, India Jun 2012 - Dec 2014
Java Developer

Responsibilities:
Involved in the software development life cycle (SDLC), including requirement gathering, design, coding, and testing.
Developed project-specific Java APIs for new requirements with effective use of data structures, algorithms, Core Java, and OOP concepts.
Developed web services for web store components with RESTful APIs using Java and Spring.
Developed web-based applications using CSS, HTML, and JavaScript.
Experience in object-oriented design, systems analysis, and software & web application development.
Developed business modules using Hibernate & Spring framework technologies.
Responsible for analysis, design, development, and integration of backend components using J2EE technologies such as Spring, Spring JDBC, and EJBs.
Used XML, XSD and JSON messages for Data transfer, and JMS for communication between the applications and MQ for communicating with the third-party applications.
Implemented Business logic in the middle-tier using Java classes, Java beans.
Created and modified Complex SQL Commands.
Designed databases and tables, and created master data in the database.
Wrote build files with Maven.
Used Sonar for maintaining the code quality and JUnit code coverage.
Writing complex SQL queries, using object-relational mapping libraries (Hibernate) and MVC frameworks (Spring MVC) as well as building and consuming SOAP and REST services.