
Jayanth Devineni
Business Intelligence / Power BI / Azure / SQL Server / Tableau Developer
Phone: (703) 689-9945 EXT: 142
Email: [email protected]
Location: Fairfax, VA
Relocation: Yes
Visa: H1B

Summary:
Process-oriented professional with around 9 years of experience in roles such as BI Consultant, Data Analyst, Data Engineer, and Ops Engineer, with a solid understanding of Data Modeling, evaluating Data Sources, Data Warehouse/Data Mart design, ETL, BI, OLAP, and Client/Server applications.
Explored data in a variety of ways and across multiple visualizations using Power BI; responsible for creating SQL datasets for Power BI and ad-hoc reports.
Demonstrated strong communication skills in collaboration with business partners throughout the data life cycle; an initiative-driven thinker and team player dedicated to generating analytical insights and strategic direction from data.
Helped various businesses make insightful, data-driven decisions using Python, R, SQL, Data Engineering, Data Cleaning, Data Visualization, SAS, Exploratory Data Analysis, Machine Learning algorithms, Google/Adobe Analytics, Tableau, and other web technologies.
Experience working with large datasets of structured and unstructured data, data visualization, data acquisition, predictive modeling, and data validation.
Experience in developing statistical, machine learning, text analytics, and data mining solutions to various business problems and generating data visualizations using Python, R, SAS, SQL, and Tableau.
Created different visualizations in Power BI according to the requirements.
Created reports in the Power BI portal utilizing SSAS Tabular models via the Analysis Services connector.
Performed data cleaning, data validation, and analysis using Data Analysis Expressions (DAX), Power Query, and Power Pivot.
Expertise in transforming business requirements into models, algorithms, data mining, and reporting solutions that scale across massive volumes of structured and unstructured data.
Expertise in applying advanced MS Excel techniques such as VLOOKUPs, Pivot Tables, and Macros for analysis and reporting.
Experience in designing compelling visualizations using Tableau and publishing and presenting dashboards on web and desktop platforms.
Extensive experience in data extraction and in-depth analysis from different databases using SQL queries, sub-queries, and joins.
Excellent knowledge in creating databases, tables, stored procedures, DDL/DML triggers, views, user-defined data types, and functions.
Highly skilled in using visualization tools like Tableau, Matplotlib for creating dashboards.
Extensive working experience with Python, including Scikit-learn, Pandas, NumPy, SciPy, and Scrapy.
Well experienced in Normalization & De-Normalization techniques for optimum performance in relational and dimensional database environments.
Worked and extracted data from various database sources like Oracle, SQL Server and Teradata.
Facilitated and helped translate complex quantitative methods into simplified solutions for users.
Knowledge of working with proofs of concept and gap analysis; gathered necessary data for analysis from different sources and prepared data for exploration using data munging.
Highly experienced and knowledgeable in developing analytics and statistical models per organizational requirements, with the ability to produce alternate cost-effective and efficient models.

Technical Skills:
Database: SQL Server, MS Access, Teradata, Oracle, DB2, Amazon Redshift, PostgreSQL
NoSQL Databases: HBase, MongoDB, Cassandra, DynamoDB, CouchDB
Programming Languages: Proficient in Python and R; familiarity with SAS Base, SAS Enterprise Miner, SQL Server Management Studio (SSMS), Bash scripting; SQL (Oracle, SQL Server, PostgreSQL)
Packages & Tools: Pandas, NumPy, SciPy, Scikit-Learn, NLTK, Matplotlib, Seaborn, ggplot2, dplyr, data.table, tidyr, Spark, PySpark, Keras, TensorFlow, Talend
Statistical Modeling: Linear Regression, Logistic Regression, Multinomial Logistic Regression, Regularization (Ridge Regression, Lasso Regression), Time Series Forecasting
Text Mining: Text Pre-Processing, Information Retrieval, Text Classification, Topic Modeling (Latent Dirichlet Allocation (LDA), Non-negative Matrix Factorization (NMF)), Text Clustering, Sentiment Analysis
Operating Systems: UNIX, Linux, Windows
Reporting & Visualization: Tableau, Matplotlib, Seaborn, Plotly, QlikView, SSRS, Cognos, Shiny
Big Data: Hadoop, Hive, Apache Spark, Pig, PySpark
Education:
George Mason University, Master's in Data Analytics Engineering, Fairfax, VA
Gitam University, Bachelor's in Computer Science, India

Professional Experience

Power BI/ Data Engineer, March 2022 - Present
Optum (Remote)
Description: Optum, Inc. is an American pharmacy benefit manager and health care provider. It has been a subsidiary of UnitedHealth Group since 2011. UHG formed Optum by merging its existing pharmacy and care delivery services into the single Optum brand, comprising three main businesses: OptumHealth, OptumInsight and OptumRx.
Responsibilities
Developed an operational metrics dashboard using Tableau to monitor metric performance and give insights to the business.
Developed and maintained project plans including written specifications of tasks, expected timelines, and deliverables consistent with the team deadlines.
Collaborated with the team on appropriate data sources and analytic approaches for identifying key trend drivers and evaluating potential interventions to impact trends.
Recommended and implemented new or modified reporting methods and procedures to improve report content and completeness of information.
Troubleshot data integrity issues, analyzed data for completeness to meet business needs, and coordinated resolutions for all issues related to reports supported by the business.
Converted complex data from multiple sources into meaningful, professional, and easy to understand formats for various audiences as defined by the business guidelines.
Developed visual reports, dashboards, and KPI scorecards using Power BI Desktop.
Designed and developed Power BI graphical and visualization solutions with business requirement documents and plans for creating interactive dashboards.
Imported data from various sources such as SQL Server, SQL Server Analysis Services (Tabular Model), Multi-Dimensional Model, and MS Excel to generate reports.
Created DAX queries to generate calculated columns in Power BI.
Developed SQL queries using common table expressions (CTEs), views, and temporary tables to support the Power BI reports.
Wrote calculated columns and measures in Power BI Desktop to apply good data analysis techniques.
Utilized Power Query for data cleansing and data transformation.
Worked on all types of transformations available in the Power BI Query Editor.
Created database schemas and a number of database objects such as tables, user-defined functions, and views using SQL Server Management Studio.
Loaded data into Snowflake tables from the internal stage using SnowSQL.
Used the FLATTEN table function to produce a lateral view of VARIANT, OBJECT, and ARRAY columns (see the sketch after this section).
Developed Snowflake stored procedures for branching and looping.
Worked with several Python packages such as NumPy, PyTables, and SciPy.
Solved complex problems and developed innovative solutions.
Performed all other related duties as assigned.
Developed reports to track ongoing performance of the business programs.
Communicated progress, findings, and analysis in written, oral and graphical form.
Supported reporting and analysis on all business initiative projects.
Environment: Tableau, Power BI, R, Python, numpy, pytables, scipy, Agile, Snowflake, SQL Server, SQL Server Management Studio (SSMS), Data Governance, MS Office Suite Excel (Pivot, VLOOKUP), MS Word, DB2, Hadoop, Azure, Data Quality, Google Analytics.
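The Snowflake flattening mentioned above was done in SnowSQL; as a minimal, hedged sketch, the same LATERAL FLATTEN query can also be issued from Python with the snowflake-connector-python package. The connection parameters, the raw_events table, and its payload column below are hypothetical placeholders rather than the actual project objects.

    # Minimal sketch: flattening a VARIANT column in Snowflake from Python.
    # Assumes snowflake-connector-python; account, table, and column names are hypothetical.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",
        user="my_user",
        password="my_password",
        warehouse="ANALYTICS_WH",
        database="ANALYTICS",
        schema="PUBLIC",
    )

    # LATERAL FLATTEN emits one row per element of the VARIANT/ARRAY payload.
    sql = """
        SELECT t.event_id,
               f.value:item_id::STRING AS item_id,
               f.value:amount::NUMBER  AS amount
        FROM raw_events t,
             LATERAL FLATTEN(input => t.payload:items) f
    """

    try:
        cur = conn.cursor()
        for event_id, item_id, amount in cur.execute(sql):
            print(event_id, item_id, amount)
    finally:
        conn.close()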

Power BI/ Data Analyst, June 2021 - February 2022
AT&T, Dallas, TX
Description: AT&T Inc. is an American multinational conglomerate holding company that is Delaware-registered but headquartered at Whitacre Tower in Downtown Dallas, Texas. It is the world's largest telecommunications company and the largest provider of mobile telephone services in the U.S.
Responsibilities
Conducted full lifecycle analysis, including requirements, activities, and design.
Responsible for developing analysis and reporting capabilities for daily data trends.
Responsible for weekly and monthly statistics reports.
Prepared performance dashboards, job failure statistics, and data volume checks using SSRS, SSIS, SQL, and Excel, and delivered daily PowerPoint presentations on these to the client.
Created and managed schema objects such as tables, views, indexes, and referential integrity constraints based on user requirements.
Built pipelines to copy data from source to destination in Azure Data Factory (see the sketch after this section).
Worked on creating dependencies between activities in Azure Data Factory.
Created stored procedures and scheduled them in the Azure environment.
Monitored produced and consumed datasets in ADF.
Created Data Factories in Azure Data Factory.
Created the Linked Services for both the source and destination servers.
Developed and deployed SSIS packages and configuration files, and scheduled jobs to run the packages and generate data in CSV files.
Utilized Power BI for Office 365 to amplify data visualizations with collaboration and data management.
Mapped millions of records over Bing maps from Excel data model and discovered actionable insights for strategic planning utilizing Power BI (Power Map).
Generated ad-hoc reports in Excel Power Pivot and shared them via Power BI with decision makers for strategic planning.
Created and managed report subscriptions and schedules for SSRS reports.
Designed and implemented a variety of SSRS reports such as parameterized, drilldown, ad hoc, and sub-reports using Report Designer and Report Builder based on the requirements.
Troubleshot report issues and ETL job failures, and optimized query performance.
Designed and implemented ETL processes using SSIS, which involved collecting data from sources such as SQL Server.
Developed SSIS, SSRS, and T-SQL code and scheduled jobs for job monitoring automation.
Created, maintained, and scheduled various reports in Power BI, such as tabular reports.
Explored data in a variety of ways and across multiple visualizations using Power BI.
Created effective reports using visualizations such as Bar chart, Clustered Column Chart, Waterfall Chart, Gauge, Pie Chart, Tree map etc. in Power BI.
Extensively used the Erwin tool for forward and reverse engineering, following corporate standards in naming conventions and using conformed dimensions whenever possible.
Involved in data mapping and data cleanup.
Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
Planned project activities for the team based on project timelines using Work Breakdown Structures.
Created user guidance documentations.
Environment: MS SQL Server 2019/2016, MS SQL Server Reporting Services (SSRS), MS SQL Server Integration Services (SSIS), MS SQL Server Analysis Services (SSAS), DAX, Agile, C#.NET, T-SQL, SQL Profiler, XML, Team Foundation Server (TFS), MS Excel, MS Access, Windows 8.
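The Azure Data Factory pipelines referenced above were authored in the Azure portal; as a minimal, hedged sketch, an existing pipeline run can be triggered and monitored from Python with the azure-mgmt-datafactory SDK. The subscription, resource group, factory, and pipeline names below are hypothetical placeholders.

    # Minimal sketch: triggering and polling an existing ADF pipeline run.
    # Assumes azure-identity and azure-mgmt-datafactory; all names are hypothetical.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    subscription_id = "00000000-0000-0000-0000-000000000000"  # placeholder
    resource_group = "rg-bi-analytics"                        # hypothetical
    factory_name = "adf-copy-factory"                         # hypothetical
    pipeline_name = "CopySourceToDestination"                 # hypothetical

    client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

    # Kick off the copy pipeline, then check its run status.
    run = client.pipelines.create_run(
        resource_group, factory_name, pipeline_name, parameters={}
    )
    status = client.pipeline_runs.get(resource_group, factory_name, run.run_id)
    print(run.run_id, status.status)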

Data Analyst/Data Engineer, February 2020 - May 2021
Intel corporation, Santa Clara, CA
Description: Intel Corp. engages in the design, manufacture, and sale of computer products and technologies. It delivers computer, networking, data storage, and communications platforms.
Responsibilities
Worked with Business Systems Analysts to fine tune specifications by ensuring that all database constraints are satisfied and perform impact analysis of changes.
Involved in the logical and physical data modeling of databases, prepared test cases for existing and new solutions to validate that the solutions met requirements, and prepared documentation for the same.
Created complex Stored Procedures, Triggers, Functions, Indexes, Tables, Views, SQL joins and other T-SQL code to implement business rules (used CTE, temporary tables, user defined table types in codes developed).
Performed T-SQL tuning and optimization of queries for reports that take longer execution time using MS SQL Profiler, Index Tuning Wizard and SQL Query Analyzer in MS SQL Server.
Experience in Power BI and with different BI components such as Oracle, SQL, SSIS, SSRS, and SSAS.
Managed, monitored, and improved the Power BI environment, including ongoing changes and an active product backlog.
Imported data from Oracle DB, SQL Server DB and Azure SQL DB to Power BI to generate reports.
Cleaned up datasets using merge queries, append queries, and sorted data columns to properly fit our reports (an analogous pandas sketch follows this section).
Created DAX queries to generate calculated columns in Power BI, and applied DAX calculated measures and columns.
Installed and configured the enterprise gateway in the Power BI service.
Scheduled automatic refresh in the Power BI service.
Experienced in publishing and scheduling reports and dashboards to meet business requirements.
Developed visual reports, dashboards, and KPI scorecards using Power BI desktop.
Skilled in writing calculated columns and measures in Power BI Desktop to support data analysis, and generated reports using Power BI modeling and DAX.
Worked on all kinds of reports, such as yearly, quarterly, monthly, and daily.
Implemented row-level security (RLS) with Power BI.
Created and deployed compute on Azure Cloud Services.
Worked with Azure Analysis Services, Azure Data Factory, Azure Blob Storage, and Data Lake.
Designed, implemented, and supported a platform providing ad-hoc access to large databases.
Environment: SQL Server 2019/2016/2012, SQL BI Suite (SSIS, SSRS), T-SQL, Power BI, SQL Profiler, MS Office, Erwin, Windows Server 2016.
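The dataset cleanup described above was done with merge and append queries in Power Query; under that assumption, a minimal sketch of the same merge/append/sort steps in pandas is shown below. The file names and column names are hypothetical placeholders.

    # Minimal sketch of Power Query-style cleanup (merge, append, sort) in pandas.
    # File and column names are hypothetical placeholders.
    import pandas as pd

    # Two extracts with the same schema -> "append queries" (union of rows).
    jan = pd.read_csv("sales_jan.csv")
    feb = pd.read_csv("sales_feb.csv")
    sales = pd.concat([jan, feb], ignore_index=True)

    # A lookup table joined on a key -> "merge queries" (left join).
    regions = pd.read_csv("regions.csv")
    sales = sales.merge(regions, on="region_id", how="left")

    # Sort and de-duplicate so the result fits the report layout.
    sales = (
        sales.drop_duplicates()
             .sort_values(["region_name", "order_date"])
             .reset_index(drop=True)
    )
    print(sales.head())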

MSBI Developer, November 2018 - January 2020
The American Society for Engineering Education, Washington, DC
Description: The American Society for Engineering Education (ASEE) is a non-profit member association, founded in 1893, dedicated to promoting and improving engineering and engineering technology education. The purpose of ASEE is the advancement of education in all its functions which pertain to engineering and allied branches of science and technology, including the processes of teaching and learning, counseling, research, extension services and public relations.
Responsibilities:
Developed a machine learning model to identify rogue trading behavior using decision trees (random forest); this analysis helps banks identify rogue traders, and the decision tree algorithm was used to identify factors associated with a higher risk of illegal trading behavior.
Working directly with client stakeholders to understand and define analysis objectives and then translate these into actionable results.
Obtaining data from multiple, disparate data sources including structured, semi-structured and unstructured data.
Performed database design and implementation using SQL Server and data modeling tools.
Parameter optimization to make better predictions using TensorFlow.
Working with data integration developers to assess data quality and define data processing business rules for cleansing, aggregation, enhancement, etc., to support analysis and predictive modeling activities.
Presenting complex analysis results tailored to different audiences (e.g., technical, manager, executive) in a highly consumable and actionable form including the use of data visualizations (Tableau, Python).
Imported the customer data into R using dplyr and found patterns and correlations in the data.
Built predictive models using Logistic Regression, Random Forest, and Decision Tree in Python to predict the probability of customer churn (see the sketch after this section).
Improved model reliability and reduced errors caused by variance, bias, and overfitting using ensemble techniques such as Bagging and Boosting in Python (Spyder).
Reported business recommendations on customer churn and documented persistent issues in Jupyter Notebook using interactive visualizations from seaborn in Python.
Performed dynamic aggregation in DAX formulas, handled multiple datetime columns in M code, and cleaned and transformed tables in Power Query.
Created sub-reports, drilldown reports, summary reports, and parameterized reports using Power BI.
Used Power BI features to build and configure Hierarchical dimensions, Measure, Calculated Columns, Filter and Aggregations.
Generated multiple tables and fields using Direct Query to SQL Server in Power BI.
Created and developed Power BI visualizations for the Key Performance Indicator (KPI) and trend analysis.
Established Logical and Physical database models to design source to target process.
Used Common Table Expressions (CTE), Constraints, and Complex SQL queries to fetch the data from different servers.
Ingested data from different sources to the target using SQL Server Integration Services (SSIS) and applied various transformations such as Derived Column, Union All, Merge Join, Data Conversion, Conditional Split etc.
Environment: SQL Server, SQL Server Management Studio (SSMS), VBA, MS Office Suite Excel (Pivot, VLOOKUP), DB2, R, Python, Agile, Azure, Data Quality, Adobe Analytics, Google Analytics, Excel, MS Word, R Shiny, Tableau, Java Script & HTML5.
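The churn models mentioned above were built in Python; the following is a minimal, hedged sketch of that kind of classifier comparison with scikit-learn, where gradient boosting stands in for the boosting step. The churn.csv file and its columns are hypothetical placeholders, and the features are assumed to be numeric (categorical columns would need encoding first).

    # Minimal sketch: comparing churn classifiers (logistic regression, random forest,
    # boosting) with scikit-learn. Dataset and column names are hypothetical.
    import pandas as pd
    from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("churn.csv")            # hypothetical customer dataset
    X = df.drop(columns=["churned"])         # numeric feature columns
    y = df["churned"]                        # 1 = churned, 0 = retained

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, stratify=y, random_state=42
    )

    models = {
        "logistic_regression": LogisticRegression(max_iter=1000),
        "random_forest": RandomForestClassifier(n_estimators=300, random_state=42),
        "gradient_boosting": GradientBoostingClassifier(random_state=42),
    }

    # Fit each model and compare held-out AUC; the probabilities feed churn reporting.
    for name, model in models.items():
        model.fit(X_train, y_train)
        proba = model.predict_proba(X_test)[:, 1]
        print(name, round(roc_auc_score(y_test, proba), 3))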

Business Intelligence Developer, June 2017 - October 2018
DHL Express, Plantation, FL
Description: DHL Express (USA), Inc., doing business as DHL Express, provides mail services. The Company offers addressed letters, parcels, air and ocean freight, contract logistics, warehousing and distribution, and packages services. DHL Express operates worldwide.
Responsibilities:
Worked as a liaison between multiple teams to gather and document requirements, and developed a data science platform designed to cover the end-to-end machine learning workflow: manage data; train, evaluate, and deploy models; make predictions; and monitor predictions, using machine learning methodologies such as regression, Bayesian methods, decision trees, random forests, SVM and kernel SVM, clustering, instance-based methods, association rules, and dimensionality reduction.
Used algorithms and programming to efficiently go through large datasets and apply treatments, filters, and conditions as needed.
Created meaningful data visualizations to communicate findings and relate them back to how they create business impact.
Developed user documentation for all the application modules. Also responsible for writing test plan documents and unit testing for the application modules.
Built up strong knowledge of customer systems, processes, and infrastructure, including VBA coding.
Performed data profiling and analysis on the different source systems required for the Customer Master (see the profiling sketch after this section).
Used T-SQL queries to pull the data from disparate systems and Data warehouse in different environments, worked closely with the Data Governance Office team in assessing the source systems for project deliverables.
Presented DQ analysis reports and scorecards on all validated data elements to the business teams and stakeholders; used data quality validation techniques to validate Critical Data Elements (CDEs) and identified various anomalies.
Extensively used open-source tools, R Studio (R) and Spyder (Python), for statistical analysis and building machine learning models.
Involved in defining source-to-target business rules, data mappings, and data definitions.
Interacting with the Business teams and Project Managers to clearly articulate the anomalies, issues, findings during data validation.
Extracting data from different databases as per the business requirements using SQL Server Management Studio.
Interacting with the BI teams to understand / support on various ongoing projects extensively using MS Excel for data validation.
Generating weekly, monthly reports for various business users according to the business requirements. Manipulating/mining data from database tables (Redshift, Oracle, and Data Warehouse).
Providing analytical network support to improve quality and standard work results.
Created statistical models using distributed and standalone approaches to build various diagnostic, predictive, and prescriptive solutions.
Utilized a broad variety of statistical packages such as Python, SAS, R, MLlib, and others.
Provided input and recommendations on technical issues to Business and Data Analysts, BI Engineers, and Data Scientists.
Collected and transformed data from different data sources to make some Inter-organizational Analysis.
Attended to various events of academic institutions to understand research requirements and analysis needs.
Generated reports and analysis using reporting and visualization tools such as Power BI and Tableau.
Performed data analysis, data cleaning, data transformation, and data aggregation using SQL.
Designed and developed various dynamic dashboards and reports using Tableau and Power BI visualizations such as bar graphs, scatter plots, pie charts, maps, funnel charts, lollipop charts, donuts, and bubbles, making use of actions and other local and global filters.
Performed exploratory data analysis, then extracted and manipulated datasets for predictive analysis.
Environment: Data Governance, SQL Server, SQL Server Management Studio (SSMS), MS Office Suite - Excel (Pivot, VLOOKUP), DB2, R, Python, Visio, HP ALM, Agile, Azure, MDM, Share point, VBA, Data Quality, Tableau and Reference Data Management, Linux/Unix.
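The data profiling for the Customer Master described above was done against the source systems with SQL and Excel; as a minimal, hedged sketch of the same kind of checks in pandas, the customer_master.csv file and the list of critical data elements (CDEs) below are hypothetical placeholders.

    # Minimal sketch: profiling a customer source extract and checking critical
    # data elements (CDEs). File, column, and CDE names are hypothetical.
    import pandas as pd

    df = pd.read_csv("customer_master.csv")
    critical_data_elements = ["customer_id", "email", "country_code"]  # hypothetical CDEs

    # Column-level profile: data type, percent missing, and distinct counts.
    profile = pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "null_pct": (df.isna().mean() * 100).round(1),
        "distinct": df.nunique(),
    })
    print(profile)

    # Simple anomaly checks on the critical data elements.
    duplicates = df.duplicated(subset=["customer_id"]).sum()
    missing_cde = df[critical_data_elements].isna().any(axis=1).sum()
    print(f"duplicate customer_id rows: {duplicates}")
    print(f"rows missing at least one CDE: {missing_cde}")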

MSBI Developer, March 2016 - May 2017
General Motors, Detroit, MI
Description: General Motors Company designs, builds, and sells cars, trucks, crossovers, and automobile parts. The Company offers vehicle protection, parts, accessories, maintenance, satellite radio, and automotive financing services.
Responsibilities:
Responsible for the Study/Creation of SAS Code, SQL Queries, Analysis enhancements and documentation of the system.
Used R, Python, SAS, and SQL to manipulate data, and develop and validate quantitative models.
Led brainstorming sessions and proposed hypotheses, approaches, and techniques.
Created and optimized processes in the data warehouse to import, retrieve, and analyze data from the Cyber Life database.
Developed and modified VBA code for Microsoft products.
Analyzed data collected in stores (JCL jobs, stored procedures, and queries) and provided reports to the Business team by storing the data in Excel/SPSS/SAS file.
Created database objects like tables, stored procedures, views, triggers, rules, defaults, user defined data types and functions in SQL Server.
Performed Analysis and interpretation of the reports on various findings.
Prepared Test documents for zap before and after changes in Model, Test, and Production regions.
Responsible for production support, abend resolution, and other production support activities, and compared seasonal trends in the data using Excel.
Used advanced Microsoft Excel functions such as pivot tables and VLOOKUP to analyze the data and prepare programs.
Successfully implemented migration of client's requirement application from Test/DSS/Model regions to Production.
Prepared SQL scripts for ODBC and Teradata servers for analysis and modeling, and provided thorough analysis of trends in the financial time series data (see the sketch after this section).
Performed various statistical tests to give the client a clear understanding, and implemented procedures for extracting Excel sheet data into the mainframe environment by connecting to the database using SQL.
Environment: Python, R/R Studio, SQL Enterprise Manager, SAS, VBA, SQL Server Management Studio (SSMS), Microsoft Excel, Microsoft Access and MS Outlook.
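The Teradata/ODBC time-series trend work above was done with SQL scripts and Excel; purely as a hedged sketch of an equivalent pull-and-summarize step in Python, the DSN, table, and column names below are hypothetical placeholders.

    # Minimal sketch: pulling financial time-series data over ODBC and summarizing
    # monthly and year-over-year trends with pandas. All names are hypothetical.
    import pandas as pd
    import pyodbc

    conn = pyodbc.connect("DSN=TERADATA_DSN;UID=analyst;PWD=secret")  # hypothetical DSN

    sql = """
        SELECT txn_date, amount
        FROM finance.daily_transactions
        WHERE txn_date >= DATE '2016-01-01'
    """
    df = pd.read_sql(sql, conn, parse_dates=["txn_date"])
    conn.close()

    # Monthly totals and a year-over-year change to surface seasonal trends.
    monthly = df.set_index("txn_date")["amount"].resample("M").sum()
    yoy = monthly.pct_change(12).round(3)
    print(monthly.tail(12))
    print(yoy.tail(12))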

MSBI Developer, June 2015 - February 2016
HCL Technologies, India
Description: HCL Technologies provides IT software development and related engineering services. The Group utilizes a variety of technologies, including Internet and e-commerce, networking, internet telephony, embedded software, testing, satellite and wireless communications, and component-based object technologies. The company also operates in different sectors such as healthcare, finance, aerospace, retail, telecom, and logistics & travel, among others.
Project: Enhancement of e-commerce application for client Newegg Inc.
Responsibilities:
Involved in complete Software Development Life Cycle (SDLC) process by analyzing business requirements and understanding the functional workflow of information from source systems to destination systems.
Gathered user requirements, analyzed, and designed software solution based on the requirements.
Collected, analyzed, and validated data to allow for accurate Tableau-based reporting; created action filters, parameters, and calculated sets for preparing dashboards and worksheets using Tableau.
Developed ETL packages with different data sources (SQL Server, flat files, Excel source files, XML files) and loaded data into target tables by performing different kinds of transformations using SQL Server Integration Services (SSIS); see the sketch after this section.
Used SSRS Report Manager to assign roles, permissions and to create report schedules.
Maintained and updated MS Access databases, macros, and VBA modules.
Worked in the Tableau environment to create dashboards such as yearly and monthly reports using Tableau Desktop, published them to the server, and developed formatted, complex reusable formula reports.
Restricted data for users using Row level security and User filters.
Designed the architecture of Tableau security by customizing the access levels and creating various user groups by assigning row and column level security at folder level and user level.
Worked on creating interactive, real-time dashboards using drilldowns and customizations.
Published Tableau dashboard on Tableau Server and embedded them into portal.
Designed and Developed SSIS jobs to extract data from heterogeneous sources, applied transform logics to extracted data and loaded into data warehouse databases using SSIS.
Involved in creating database objects like tables, views, procedures, triggers, functions using T-SQL to provide definition, structure and to maintain data efficiently.
Writing SQL queries for data validation by using join conditions.
Provided support for Tableau developed objects and understand tool administration.
Involved in creating debugging documentation for Tableau reports.
Migrated reports from standard reporting tools such as Excel to Tableau.
Provided 24/7 production support for Tableau users.
Environment: Tableau (Desktop/Server), Oracle, MS SQL Server, VBA, SQL Server Management Studio (SSMS), SQL BI Suite (SSIS, SSRS), SharePoint, MS Visio, Microsoft Office Suite (Word, Excel, PowerPoint).
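The ETL packages above were built in SSIS, which is configured in Visual Studio rather than code; purely as a hedged illustration of the same extract-transform-load step, here is a pandas/SQLAlchemy sketch. The file names, connection string, and staging table names are hypothetical placeholders.

    # Minimal sketch of an SSIS-style extract-transform-load flow in pandas/SQLAlchemy.
    # Roughly mirrors Union All, Data Conversion, and a Conditional Split, then a load.
    # File, DSN, and table names are hypothetical placeholders.
    import pandas as pd
    from sqlalchemy import create_engine

    # Extract from heterogeneous sources (flat file and Excel workbook).
    orders_csv = pd.read_csv("orders.csv")
    orders_xlsx = pd.read_excel("orders_backlog.xlsx")

    # Transform: union the sources, convert types, and split on a condition.
    orders = pd.concat([orders_csv, orders_xlsx], ignore_index=True)
    orders["order_date"] = pd.to_datetime(orders["order_date"])
    domestic = orders[orders["country"] == "US"]
    international = orders[orders["country"] != "US"]

    # Load into SQL Server staging tables.
    engine = create_engine("mssql+pyodbc://user:password@SQLSERVER_DSN")  # hypothetical DSN
    domestic.to_sql("stg_orders_domestic", engine, if_exists="append", index=False)
    international.to_sql("stg_orders_international", engine, if_exists="append", index=False)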