
Amit Raj - Data Analyst
[email protected]
Location: Herndon, Virginia, USA
Relocation: Yes
Visa: H1B
PROFESSIONAL SUMMARY:
Seasoned Data Analyst with around 7 years of industry experience, adept at evaluating data sources, designing Data Warehouses/Data Marts, and implementing BI solutions.
Results-oriented IT Professional with a strong background in all phases of data modeling, data cleaning, and interpreting data to drive informed business decisions.
Expertise in writing SQL queries and optimizing performance for Oracle and SQL Server databases.
Extensive experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Export using various ETL tools.
Proficient in the Software Development Life Cycle (SDLC) with a comprehensive understanding of testing methodologies and disciplines.
Skilled in using Excel and MS Access for data manipulation and analysis tailored to business needs.
Demonstrated ability in data mining using complex SQL queries to uncover patterns and insights, particularly in healthcare claims data (an illustrative sketch follows this summary).
Proficient in creating interactive TABLEAU dashboards for margin analysis and business insights across different funnel stages.
Collaborated with global teams to deliver ad-hoc reports and analysis.
Used the Python Matplotlib package to visualize and graphically analyze data.
Involved in testing XML files, verifying that data was parsed and loaded correctly into staging tables.
Experienced in creating UNIX scripts for file transfer and manipulation.
Strong coder proficient in multiple languages including Python, R, and SQL, with a focus on delivering actionable insights to management and technical teams.
Extensive hands-on experience with ETL & Reporting tools such as SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS).
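
Below is a minimal sketch of the kind of claims-pattern SQL analysis described above, run from Python with pandas. The connection string, table, and column names (claims, provider_id, paid_amount) are hypothetical placeholders, not details from any actual engagement.

    import pandas as pd
    from sqlalchemy import create_engine

    # Hypothetical Oracle connection; credentials and service name are placeholders.
    engine = create_engine("oracle+oracledb://user:pwd@host:1521/?service_name=CLAIMSDB")

    # Aggregate claim counts and paid amounts per provider over the last 12 months
    # to surface unusual billing patterns.
    query = """
        SELECT provider_id,
               COUNT(*)         AS claim_count,
               SUM(paid_amount) AS total_paid
        FROM   claims
        WHERE  service_date >= ADD_MONTHS(SYSDATE, -12)
        GROUP  BY provider_id
    """
    claims = pd.read_sql(query, engine)

    # Flag providers whose total paid amount is more than 3 standard deviations above the mean.
    cutoff = claims["total_paid"].mean() + 3 * claims["total_paid"].std()
    print(claims[claims["total_paid"] > cutoff])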

TECHNICAL SKILLS:

Data Visualization Tools: Tableau (Desktop, Server, Prep), Power BI, ThoughtSpot, Excel, Jupyter, Google Analytics
ETL Tools: Informatica PowerCenter, Alteryx
Database Tools: Oracle 12c, PostgreSQL, DBeaver 6.3, MS Access, MS SQL Server, IBM DB2
Scripting Languages: Python, R
Reporting Tools: Business Objects 6.5/XI R3, Cognos 8 Suite
Programming: SQL, PL/SQL, UNIX Shell Scripting, VB Script
Bug Tracking & Testing Tools: Jira, Bugzilla, JUnit, gdb
Data Modeling: Star schema, Snowflake schema, Fact and Dimension tables, Pivot, Erwin
Other Tools: UNIX

PROFESSIONAL EXPERIENCE:

Client: Capital One, Richmond, VA Jan 2023 – Present
Role: Data Analyst

Responsibilities
Generated various reports using SQL Server Reporting Services (SSRS) for business analysts and the management team.
Provided continued maintenance and bug fixes for existing and new Power BI reports.
Monitored, triaged, tracked, and resolved issues with Excel and Power BI reports.
Installed and configured the on-premises data gateway in the Power BI Service.
Migrated 20+ ThoughtSpot dashboards to Tableau accurately and on schedule; independently resolved production interruptions in a fast-paced operations environment.
Checked daily inbound and outbound data activity and updated the relevant EMR data on the back end of the production server.
Developed business process models under the Waterfall methodology to document existing and future business processes.
Imported claims data into Python using the pandas library and performed various data analyses.
Programmed a utility in Python using multiple packages, including NumPy and pandas.
Used PySpark to apply Python code to user data stored in Hive (see the sketch after this list).
Updated and managed data and queries using PL/SQL, retrieving data for equipment performance and quality reports.
Created logical and physical data models for relational (OLTP) and dimensional (OLAP) star schemas, covering fact and dimension tables, using Erwin.
Assisted in creating an anomaly/outlier detection algorithm for abnormal events in month-to-month sales of medical devices to distributors.
Accelerated Tableau dashboards by optimizing Snowflake data sources via SQL, cutting load times from 7 seconds or more to 1 second; created custom Tableau calculations from the ground up.
Developed and implemented scalable ETL pipelines using Databricks Notebooks and Spark SQL, ingesting over 67 TB of data daily with a 99.9% success rate.
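
A minimal sketch of the PySpark/Spark SQL ETL pattern referenced above. The database and table names (raw_db.user_events, curated_db.user_daily) and columns are hypothetical; in a Databricks notebook the SparkSession is already provided as spark.

    from pyspark.sql import SparkSession, functions as F

    # Hypothetical app name; enableHiveSupport lets Spark read Hive tables.
    spark = SparkSession.builder.appName("daily_user_etl").enableHiveSupport().getOrCreate()

    # Read the raw Hive table, normalize types, and drop malformed rows.
    raw = spark.table("raw_db.user_events")
    clean = (
        raw.withColumn("event_ts", F.to_timestamp("event_ts"))
           .withColumn("amount", F.col("amount").cast("double"))
           .dropna(subset=["user_id", "event_ts"])
    )

    # Aggregate per user per day and write back as a partitioned curated table.
    daily = (
        clean.groupBy("user_id", F.to_date("event_ts").alias("event_date"))
             .agg(F.count("*").alias("events"), F.sum("amount").alias("total_amount"))
    )
    daily.write.mode("overwrite").partitionBy("event_date").saveAsTable("curated_db.user_daily")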

Environment: Oracle, SQL, Teradata, IBM Cognos, NumPy, SciPy, Matplotlib, SSAS, MDM, XML, SDLC, PL/SQL, Informatica IDQ, CDQ Mapping, Erwin

Client: Cloud Nova Technologies LLC, Reston, VA Nov 2020 – Jan 2023
Role: Data Analysis Engineer

Responsibilities
Deployed isolation forest anomaly detection models on AWS for security logs, reducing mean time to detect (MTTD) unauthorized access and maintaining accuracy above 85% in identifying intrusions (see the sketch after this list).
Managed a PostgreSQL database containing data from various AWS security logs such as CloudTrail, VPC Flow Logs, and S3 access logs, ensuring consistency and accuracy for the security dashboard.
Configured 17+ tags for AWS resource utilization dashboards, enabling cost allocation by department, type of service, and AWS cloud regions.
Developed complex Alteryx workflows, analytic applications, iterative and batch macros, and chained apps using Alteryx Designer.
Served as a Clinical Data Analyst and Epic Clarity Report Writer, focused on population health, real-time analysis (RTA), and predictive modeling to improve delivery of care across the quality spectrum.
Excellent knowledge of Epic Build, Epic Clinical Workflows, Epic Implementation, Revenue Cycle, Provider, Professional, Facility, COB, Medicare/Medicaid Claims, HL7 Interface Messages, Split Claims, TA1, 820 EDI Transactions, EOB, EOP, Claim Adjudication, FACETS & NASCO User Interface.
Expertise in working with ETL Tools like Informatica. Performed ETL and data staging from multiple data sources using Alteryx.
Good SQL reporting skills with the ability to create SQL views.
Performed data cleaning, feature scaling, and feature engineering using the pandas and NumPy packages in Python.
Responsible for creating SQL datasets for Power BI and Ad-hoc reports.
Expertise in data manipulation, curation, quality control, and anomaly detection using functional programming in Python and R.
Operated efficiently in a dynamic startup environment, adept at thriving in ambiguity and managing multiple projects with tight deadlines.
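
A minimal sketch of the isolation forest approach described in the first bullet, using scikit-learn. The file path and feature columns are hypothetical stand-ins for features engineered from CloudTrail/VPC Flow Log records.

    import pandas as pd
    from sklearn.ensemble import IsolationForest

    # Hypothetical per-session features derived from AWS security logs.
    logs = pd.read_parquet("security_log_features.parquet")
    features = logs[["api_calls_per_min", "distinct_src_ips", "failed_auth_count", "bytes_out"]]

    # contamination encodes the expected fraction of anomalous sessions.
    model = IsolationForest(n_estimators=200, contamination=0.01, random_state=42)
    model.fit(features)

    # predict() returns -1 for anomalies; route flagged sessions to the security dashboard.
    logs["anomaly"] = model.predict(features)
    print(logs[logs["anomaly"] == -1].head())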

Environment: RStudio, SQL, Python, APIs, OLAP, OLTP, PL/SQL, Oracle, Teradata, Power BI, ETL, Redshift, pandas, NumPy, Seaborn, Matplotlib, scikit-learn, SciPy, NLTK, XGBoost, Tableau, Power Query, Snowflake, AWS, S3, RDS, Erwin.

Client: Accenture Federal Services (Internship) May 2020 – Aug 2020
Role: Data Analyst

Responsibilities
Implemented an XGBoost model with 84% accuracy to identify assets eligible for retirement within Accenture's asset retirement cycle (see the sketch after this list).
Created filterable dashboards on Tableau, enabling users to visualize data from multiple perspectives and improving cycle efficiency.
Connected Google Sheets data to Tableau and created business reports.
Performed data analysis, data interrogation, validation, and defined data pruning rules.
Streamlined quality reporting processes for medication adherence.
Integrated customer-created Tableau workbooks and dashboards with Epic's EMR.
Experienced in creating prototypes, wireframes, and mock-up screens to visualize the GUI.
Served as a technical expert, providing input to application design regarding production usage and administration requirements.
Used techniques such as data blending, which brings data from multiple sources into a single view in Tableau.
Created reports with Tableau to measure retail performance.
Performed data auditing for various business network data.
Proactively analyzed data using SQL, Python, R, and Microsoft Excel to answer key client questions and monitor product sales.
Involved in extensive data validation using SQL queries and back-end testing.
Well-versed with all stages of Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
Obtained input from the business team on retail data management issues.
Regularly analyzed the Business plans and financial health of the project to track the progress and prevent deviation from the set targets.
Applied various machine learning models such as Lasso, Ridge, PLS, Random Forest, and linear and logistic regression in R, and performed data visualizations using ggplot2.
Used the Python Matplotlib package to visualize and graphically analyze data.
Good knowledge of SQL queries and of creating database objects such as stored procedures, triggers, packages, and functions using SQL and PL/SQL to implement business logic.
Created custom SQL queries connecting multiple data sources for a Tableau Dashboard, selectively extracting essential data to optimize loading speed.
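
A minimal sketch of the XGBoost asset-retirement classifier mentioned in the first bullet. The file, feature columns, and label are hypothetical placeholders, and the hyperparameters are illustrative rather than tuned values.

    import pandas as pd
    from xgboost import XGBClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    # Hypothetical asset inventory with a binary retire_eligible label.
    assets = pd.read_csv("asset_inventory.csv")
    X = assets[["age_months", "ticket_count", "utilization_pct", "warranty_months_left"]]
    y = assets["retire_eligible"]

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    # Gradient-boosted trees; evaluate held-out accuracy as in the bullet above.
    model = XGBClassifier(n_estimators=300, max_depth=4, learning_rate=0.1, eval_metric="logloss")
    model.fit(X_train, y_train)
    print("accuracy:", accuracy_score(y_test, model.predict(X_test)))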

Environment: Python, SQL, Tableau, NumPy, MS Office Suite, Visio, PowerShell, Windows XP/7/10

Client: Impetus, Indore, India Nov 2016 – June 2019
Role: Data Scientist

Responsibilities
Developed a product recommendation system using Spark Streaming and Kafka with Spark MLlib, analyzing real-time clickstream data and predicting customer preferences with 84% accuracy (see the sketch after this list).
Used XGBoost, a gradient boosting algorithm, to identify underperforming regions in marketplace data, obtaining an AUC-ROC of 0.87 and an F1 score of 0.79.
Partnered with Business Owners to develop key business questions and to build datasets that answer those questions.
Assisted in developing user test plans, cases, and scenarios to understand business requirements, technical specifications, and product knowledge.
Preprocessed the collected data, imputed missing values, and applied business rules for report building.
Processed and updated third party data sources.
Compiled and analyzed customer complaints data in the TrackWise tool to identify metrics and trends associated with the complaint-handling process.
Created a script using VBA and macros in Excel 2010 to automate merging tables across worksheets for comparison of various metrics in the complaints data.
Implemented VBA macro code to automate the creation of pivot tables and perform VLOOKUPs in Excel.
Created an interactive dashboard application using R and Shiny to evaluate quality measures with six-sigma charts built from the complaint data metrics.
Successfully automated data collection from TrackWise to the backend MS SQL Server database.
Developed ETL functions for extracting, transforming, cleaning, and loading data from various sources to different destinations.
Used Python, Tableau, and Excel to analyze the number of products per customer and sales in a category for sales optimization.
Visualized data by using advanced tableau functions like action filters, context filters, and Level-of-Detail (LOD) expressions.
Conducted feature importance analysis to understand factors influencing customer churn and store performance.
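
A minimal sketch of the clickstream ingestion behind the recommendation system in the first bullet, using Spark Structured Streaming with Kafka. The broker address, topic name, and event schema are hypothetical, the spark-sql-kafka connector is assumed to be on the classpath, and the MLlib scoring step is omitted.

    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import StructType, StructField, StringType, TimestampType

    spark = SparkSession.builder.appName("clickstream_ingest").getOrCreate()

    # Hypothetical JSON schema for clickstream events.
    schema = StructType([
        StructField("user_id", StringType()),
        StructField("product_id", StringType()),
        StructField("event_ts", TimestampType()),
    ])

    # Subscribe to the Kafka topic and parse the JSON payload of each message.
    clicks = (
        spark.readStream.format("kafka")
             .option("kafka.bootstrap.servers", "broker:9092")
             .option("subscribe", "clickstream")
             .load()
             .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
             .select("e.*")
    )

    # Count product views in 5-minute windows as input features for the recommender.
    counts = clicks.groupBy(F.window("event_ts", "5 minutes"), "product_id").count()
    counts.writeStream.outputMode("complete").format("console").start().awaitTermination()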

Environment: R, R Shiny, VBA, Advanced Excel, Jupyter, NumPy, Tableau, Clustering, SQL, Visio, PowerPoint, MS Access, MS Word.