Piyush - SQL/SSIS/ETL Data Warehouse Database Developer
[email protected]
Location: Jersey City, New Jersey, USA
Relocation:
Visa: H1B
Over 9 years of experience in the design, development, and implementation of data warehousing projects with SQL Server, SSIS, and Informatica PowerCenter in high-transaction, multi-server production environments. Strong self-learning ability and leadership qualities.
Professional Summary:
- Excellent knowledge of the Software Development Life Cycle (SDLC), RDBMS, data warehouse, and data modeling concepts.
- Strong experience creating T-SQL objects and working with DDL/DML; performs most SQL Server Enterprise Manager and Management Studio functionality using T-SQL scripts and batches, user-defined functions, relational database models, and data integrity constraints that enforce business rules.
- Proficient in creating indexed views, complex stored procedures, effective functions, and appropriate triggers to facilitate efficient data manipulation and data consistency (see the T-SQL sketch below).
- Extensive experience in database maintenance and performance monitoring and tuning using tools such as the Index Tuning Wizard and SQL Profiler, as well as troubleshooting and replication.
- Handled large volumes of financial data; expert-level query optimization and performance tuning experience, including writing distributed queries between SQL Servers.
- Worked with the Snowflake cloud database, SQL Server, Oracle, and SAP HANA.
- Worked with heterogeneous source systems such as Epic, Clarity, Caboodle, OPH, Covid Registry, Dentrix, Cerner, PeopleSoft, and Workbench.
- SQL report writing with Epic and other applications, including development against the Clarity and Caboodle EHR databases; good understanding of the Epic Clarity and Caboodle databases.
- Good understanding of functional programming in Python.
- Strong experience with the ETL tool SSIS and the reporting tool SSRS alongside SQL Server and QlikView.
- Proficient in using T-SQL to develop complex stored procedures, triggers, tables, views, user-defined functions, user profiles, SQL joins, and queries, as well as indexes and indexed views that observe business rules.
- Extensive experience in data extraction, transformation, and loading (ETL) using tools such as SQL Server Integration Services (SSIS), Informatica PowerCenter, DTS, BULK INSERT, and the Bulk Copy Program (BCP) utility.
- Expert in designing SSIS packages that integrate data from heterogeneous sources (Excel, CSV, Oracle, DB2, flat files) over OLE DB connections using the transformations provided by SSIS.
- Hands-on experience creating jobs and alerts and setting up SQL Server mail agent notifications for SSIS packages.
- Experience with error and event handling in SSIS: precedence constraints, breakpoints, checkpoints, and logging.
- Experience modeling data requirements to create graphs and metrics for end users in Qlik Sense and QlikView.
- Experience designing, building, testing, and debugging QlikView and Qlik Sense solutions from specified requirements; experience with the QlikView Management Console, with components such as QVW, QVD, and QVF files and distribution services, and with creating and designing QlikView Publisher jobs.
- Expert in generating Visual Studio 2012/2010/2008 client-side reports, tabular reports, matrix reports, drill-down reports, parameterized reports, ad-hoc reports, and styled reports using SQL Server Reporting Services (SSRS), Crystal Reports, Power suite, and Tableau.
- Experienced in logical and physical database design and development, normalization, and data modeling using Erwin and SQL Server Enterprise Manager.
- Good exposure to data modeling and data analysis; strong data warehousing skills in dimensional modeling using star and snowflake schemas.
- Experience writing Python scripts to automate ETL processes.
- Strong troubleshooting and problem-solving skills.
- Excellent verbal and written communication skills; proactive, self-managing, and able to perform well both independently and as a team player.
- Well-organized, goal-oriented, and highly motivated team member with excellent analytical skills.
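For illustration, a minimal T-SQL sketch of the indexed-view pattern mentioned above. The dbo.Orders table and its columns are hypothetical, and OrderTotal is assumed NOT NULL (a requirement for SUM in an indexed view):

    -- Schema-bound view over a hypothetical orders table; indexed views
    -- with GROUP BY must include COUNT_BIG(*).
    CREATE VIEW dbo.vw_DailyOrderTotals
    WITH SCHEMABINDING
    AS
    SELECT o.OrderDate,
           COUNT_BIG(*)      AS OrderCount,
           SUM(o.OrderTotal) AS DailyTotal
    FROM dbo.Orders AS o
    GROUP BY o.OrderDate;
    GO
    -- The unique clustered index materializes the view, letting the
    -- optimizer serve daily-total queries without scanning dbo.Orders.
    CREATE UNIQUE CLUSTERED INDEX IX_vw_DailyOrderTotals
        ON dbo.vw_DailyOrderTotals (OrderDate);

Materializing an aggregate this way trades extra write cost on the base table for much cheaper reads of the summarized data.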
Education:
Master's in Computer Science, San Francisco Bay University, CA, USA, 2016
Bachelor of Engineering, North Maharashtra University, India, 2013

Technical Skills
SQL Databases: MS SQL Server 2016/2014/2012/2008 R2, MySQL, DB2, Oracle, TFS, Snowflake SQL.
Tools: Microsoft Visual Studio 2008/2010/2012, MS Office, Crystal Reports, Microsoft SQL Server Management Studio, Informatica PowerCenter, Informatica Enterprise Data Catalog, Alteryx, Tableau, Erwin, SQL Server Query Analyzer, SQL Server mail service, DBCC, BCP, SQL Server Profiler, MS Excel, Git, Data Visualizer for SQL, Microsoft Azure, Agile Application, TrackWise Quality Management System, Snowflake, JIRA, HP ALM, Microsoft Business Intelligence Suite, ServiceNow.
Languages: Python, T-SQL, Oracle, MySQL, C#, HTML, .NET, XML, Unix/Linux.
Business Intelligence: Qlik Sense, QlikView, Tableau.
Data Governance: Informatica Data Catalog, Informatica Data Quality.

PROFESSIONAL EXPERIENCE

SQL ETL Data Warehouse Developer/Data Analyst
New York Health + Hospital, NYC (April 2020 - Present)
Currently building a cloud enterprise data warehouse (EDW) for New York Health and Hospital. The goal of this project is to build an integrated Snowflake data warehouse by bringing together data from multiple sources such as Oracle, PeopleSoft, SQL Server, and custom databases. Also worked on multiple clinical core projects, including 300x300, Covid Business Intelligence, and on-premises Cost Accounting.
- Worked with heterogeneous source systems such as Epic, Clarity, Caboodle, OPH, Covid Registry, Dentrix, Cerner, PeopleSoft, and Workbench.
- Gathered requirements from business analysts, information analysts, and stakeholders to ensure data quality and usability and to convert business requirements into technical specifications.
- Worked with product owners and stakeholders to define requirements and develop data marts.
- Strong experience in Epic report writing.
- Implemented data ecosystems, including data management and the integration of structured and unstructured data, to generate insights leveraging the cloud-based Snowflake data warehouse.
- Collaborated with teams of business analysts, data engineers, and data warehouse architects to design and architect the Stage, Curated, and EDW layers on the cloud-based Snowflake data warehouse.
- Performed data cleansing and gap analysis for data assets and coordinated with source system owners on mitigation plans.
- Provided detailed source-to-target mappings (STM), transformation rules, and integration rules for the data warehouse.
- Coordinated with other teams to ensure availability of needed source data; monitored Snowflake across all environments.
- Built a Python and Apache Spark based configurable framework to connect to common data sources such as MySQL, Oracle, and SQL Server.
- Wrote SQL scripts to support reporting, API integration, and business intelligence development.
- Developed the Epic, Covid, and Clinical mart layers used to build various decision-making dashboards.
- Worked on database migration from the Epic Clarity and Caboodle pseudo-databases to the Snowflake data warehouse.
- Wrote complex SQL queries involving multiple joins, subqueries, indexes, and aggregate/window functions to design data marts and ensure their stability, reliability, and performance (see the T-SQL sketch at the end of this section).
- Queried the Clarity and Caboodle databases using complex SQL; familiar with the Epic Clarity and Caboodle data models.
- Optimized query performance by modifying T-SQL, removing unnecessary columns, eliminating redundant and inconsistent data, normalizing tables, establishing joins, and creating indexes where necessary.
- Involved in designing the data warehouse and core database for the system, including tables, stored procedures, indexes, UDFs, and triggers.
- Created SQL objects such as schemas, tables, indexes, views, sequences, constraints, and synonyms, and altered databases, across all SDLC environments.
- Created complex stored procedures and views for the application; used temporary tables, table variables, and common table expressions (CTEs) depending on the needs of the logic.
- Designed and developed an enterprise data warehouse and data marts covering business areas such as Patient, Provider, Procedure, Labs, Diagnosis, Visits, Organizations, Vitals, Imaging, and Immunization.
- Analyzed the ETL design document, worked with the DBA and ETL architect, and coordinated with the application team to develop ETL mappings.
- Responsible for ETL operations that extract and transform data into the data warehouse using SQL Server Integration Services (SSIS).
- Developed and monitored ETL packages with different data sources (SQL Server, Oracle, flat files, Excel files, etc.) and loaded the data into target tables using a variety of SSIS transformations.
- Wrote stored procedures, user-defined types, triggers, table-valued functions, and aggregates using .NET managed languages.
- Used ETL to implement Slowly Changing Dimension transformations to maintain historical data in the data warehouse.
- Used SSIS tasks such as Conditional Split, Derived Column, Lookup, Script, Merge Join, Sort, and Execute SQL Task for data scrubbing and data validation checks during staging, before loading data into the data warehouse.
- Designed and developed mappings, sessions, and workflows to extract, validate, and transform data according to business rules using Informatica PowerCenter and SSIS.
- Developed and maintained data pipelines using SQL and SSIS to load data from systems such as Epic Clarity, Caboodle, Dentrix, and the OPH database.
- Used SSIS and Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems such as Epic Clarity, Caboodle, Dentrix, and flat files into the cloud-based Snowflake data warehouse.
- Involved in extracting data from flat files and relational databases into the staging area.
- Created complex SSIS packages that loaded multiple files using Foreach Loop containers and a wide variety of data transformations, including Conditional Split, Data Conversion, Merge Join, Script Component, Lookup, and Derived Column.
- Implemented event handlers and error handling in SSIS packages and notified various users and business units of process results.
- Worked on an end-to-end map of a major dental payor into the 835 format for NYCHH; completed and tested an 835 ERA from a positional flat file by creating multiple test scenarios.
- Involved in fixing various issues related to data quality, data availability, and data stability.
- Worked on high-criticality EIM production support tasks.
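As referenced above, a simplified sketch of the CTE-plus-window-function pattern used for mart queries; the table and column names are hypothetical stand-ins, not actual Clarity or Caboodle objects:

    -- Latest visit and one-year visit count per patient.
    WITH RankedVisits AS (
        SELECT v.PatientId,
               v.VisitDate,
               ROW_NUMBER() OVER (PARTITION BY v.PatientId
                                  ORDER BY v.VisitDate DESC) AS VisitRank,
               COUNT(*)     OVER (PARTITION BY v.PatientId)  AS VisitCount
        FROM dbo.PatientVisits AS v
        WHERE v.VisitDate >= DATEADD(YEAR, -1, GETDATE())
    )
    SELECT p.PatientId,
           p.PatientName,
           rv.VisitDate  AS MostRecentVisit,
           rv.VisitCount AS VisitsLastYear
    FROM RankedVisits AS rv
    JOIN dbo.Patients AS p
        ON p.PatientId = rv.PatientId
    WHERE rv.VisitRank = 1;  -- keep only each patient's latest visit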
SQL/BI Developer
Intuitive Surgical, California (November 2018 - March 2020)
Intuitive Surgical Inc. is an American corporation that develops, manufactures, and markets robotic products designed to improve clinical outcomes of patients through minimally invasive surgery, most notably the da Vinci Surgical System. The project goal was to provide complete details of da Vinci Surgery products, parts, and related documents, and to produce a validated source of Agile data in the enterprise data warehouse, from the back-end Oracle DB of the Agile application to SQL Server, meeting business and regulatory requirements. Also worked on EDW data quality to ensure data was correct and properly formed.
- Developed system standards, architecture, scenarios, and detailed screen specifications, and documented the logical and physical data models.
- Designed and implemented table functions, stored procedures, and triggers in SQL Server 2014/2016.
- Wrote SQL code with complex queries involving multiple joins, subqueries, indexes, and aggregate functions to design the database and ensure its stability, reliability, and performance.
- Created SQL objects such as schemas, tables, indexes, views, sequences, constraints, synonyms, and partitions (range and list), and altered databases, across all SDLC environments.
- Created complex stored procedures and queries for the application team and for end users based on business requirements; used temporary tables, table variables, and common table expressions (CTEs) as required.
- Utilized JSON parsing to improve system performance when loading large volumes of data (see the T-SQL sketch at the end of this section).
- Performed data cleansing, data profiling, and in-depth analysis using SQL queries and stored procedures, along with performance tuning and query optimization.
- Analyzed the ETL design document, worked with the DBA and ETL architect, and coordinated with the application team to develop ETL mappings.
- Developed solutions, including complex code and applications, using T-SQL, Azure data warehouse, SSIS, and BODS.
- Responsible for ETL operations that extract and transform data into the data warehouse using SQL Server Integration Services (SSIS).
- Developed and monitored ETL packages with different data sources (SQL Server, Oracle, flat files, Excel files, etc.) and loaded the data into target tables using a variety of SSIS transformations.
- Wrote stored procedures, user-defined types, triggers, table-valued functions, and aggregates using .NET managed languages.
- Used ETL to implement Slowly Changing Dimension transformations to maintain historical data in the data warehouse.
- Used SSIS tasks such as Conditional Split, Derived Column, Lookup, Script, Merge Join, Sort, and Execute SQL Task for data scrubbing and data validation checks during staging, before loading data into the data warehouse.
- Involved in creating proxy accounts and scheduling jobs to execute SSIS packages.
- Prepared documentation covering database design, object security, and rules; documented SSIS packages, including source, archive, and stage path mappings.
- Developed exception handling and audit processes for each ETL package.
- Created logical/physical diagrams to smooth the mapping of data between multiple data marts and the data warehouse.
- Good exposure to data modeling and data analysis in multidimensional modeling using star/snowflake schemas; experienced in designing relational databases using entity-relationship diagrams and in data migration from Oracle to SQL Server.
- Prepared code for all modules according to the requirement specification and integrated the system with the existing EDW.
- Maintained server permissions for Active Directory groups across the team.
- Performed various testing, including integration testing, regression testing, and user acceptance testing, to validate the service code for the data service using the ETL tool.
- Performed high-quality unit testing using HP ALM for a GxP project and tracked tasks and bugs in HP ALM during unit testing and QA.
- Created test scripts and strategy with the Quality Assurance (QA) team for regression automation.
- Coordinated with different teams to perform integration testing across internal Intuitive dependent applications.
- Enhanced existing modules using T-SQL and Azure to accommodate changing business requirements, tracking changes through version control tools such as GitHub and SVN.
- Delivered tasks using the Agile Scrum methodology with daily standups.
Environment: Microsoft SQL Server 2016, SSMS, SSIS, Microsoft Visual Studio 2016, .NET, Git, ServiceNow, MS SQL Profiler, SQL Query Analyzer, Oracle SQL Developer, Agile, HP ALM.
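A sketch of the set-based JSON loading approach mentioned in this role (OPENJSON requires SQL Server 2016 or later; the payload and staging tables are hypothetical):

    -- Shred a JSON document column into a staging table in one
    -- set-based pass instead of parsing row by row.
    INSERT INTO stage.AgileParts (PartNumber, Description, RevisionCode)
    SELECT j.PartNumber, j.Description, j.RevisionCode
    FROM dbo.InboundPayloads AS p
    CROSS APPLY OPENJSON(p.PayloadJson, '$.parts')
        WITH (PartNumber   varchar(50)   '$.partNumber',
              Description  nvarchar(200) '$.description',
              RevisionCode varchar(10)   '$.revision') AS j
    WHERE p.ProcessedFlag = 0;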
SQL/BI Developer
Accenture/PECO-Exelon Corporation, PA (April 2018 - October 2018)
PECO is an electric and natural gas utility subsidiary of Exelon Corporation, the nation's largest competitive energy provider, and the largest electric and natural gas utility in Pennsylvania. Working with Exelon, the project goal was to design and develop data mart architecture and business-related data marts and integration packages, and to load the data warehouse using ETL/SSIS and MS SQL code.
- Identified opportunities for business automation and drove process improvements.
- Responsible for converting business requirements into functional and technical documents.
- Installed and configured SQL Server 2012 and SQL Server Integration Services.
- Helped create ETL database modules for design, development, and analytical/reporting purposes.
- Rewrote existing code, queries, and stored procedures for long-running queries by replacing cursors with joins, common table expressions (CTEs), and indexing, applying Microsoft best practices for query tuning (see the T-SQL sketch at the end of this section).
- Developed and optimized database structures, stored procedures, indexes, triggers, and user-defined functions.
- Used Aggregate, Conditional Split, Union All, Lookup, Derived Column, Data Conversion, Multicast, and Merge Join data flow transformations in SSIS.
- Responsible for implementing data viewers, SSIS logging, and error configurations for error handling in the packages.
- Created SSIS packages to load data from the OLTP system to the data mart for analysis.
- Involved in daily loads (full and incremental) into the staging and ODS areas, and in troubleshooting processes, issues, and errors using SQL Server Integration Services 2012.
- Used SQL Server 2012, SSIS, VS.NET 2012 Data Tools, VB.NET 2013, and regular expressions to create SSIS container components and packages to process large files and perform data cleansing and seeding for data transformation and loading.
- Performed Conditional Split and Sort transformations and captured data about the environment using the Audit transformation.
- Created SSIS packages using Foreach Loop and Sequence containers to load multiple files from the same source folder and from different FTP locations and to perform grouped tasks.
- Migrated packages from SQL Server 2012 to SQL Server 2016 as per business requirements.
- Built and implemented drill-through, drill-down, parameterized, and real-time BI reports using SSRS.
- Developed and rendered monthly, weekly, and daily reports per requirements and assigned roles and user access within Report Manager security.
- Experienced in generating reports and deploying them to the server using SSRS (SQL Server Reporting Services).
- Created simple query filters, prompted filters, and simple filters to restrict data in reports.
- Responsible for scheduling and publishing reports using SSRS 2014, deploying reports to Report Manager, and troubleshooting any errors that occurred during execution.
- Documented development work and production releases for future reference.
- Delivered tasks using the Agile Scrum methodology with daily standups.
- Experience supporting multiple production database servers.
- Participated in testing during UAT; mentored and monitored team development and status.
Environment: Microsoft SQL Server 2012/2014/2016, Microsoft SQL Server 2008 R2, SSMS, SSIS, SSRS, Microsoft Visual Studio 2008/2013, .NET, SVN, ALM, Windows Server 2008 R2, MS SQL Profiler, SQL Query Analyzer.
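A before/after sketch of the cursor-replacement work described above; the account and invoice tables are hypothetical:

    -- A row-by-row cursor that flagged delinquent accounts is replaced
    -- by a single set-based UPDATE driven by a CTE.
    WITH Delinquent AS (
        SELECT i.AccountId
        FROM dbo.Invoices AS i
        WHERE i.DueDate < GETDATE()
        GROUP BY i.AccountId
        HAVING SUM(i.AmountDue - i.AmountPaid) > 0
    )
    UPDATE a
    SET a.IsDelinquent = 1
    FROM dbo.Accounts AS a
    JOIN Delinquent AS d
        ON d.AccountId = a.AccountId;

The set-based form lets the optimizer use joins and indexes rather than executing per-row logic, which is the main source of the speedup.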
SQL/BI Developer
Streams Inc., USA (Dec 2016 - April 2018)
Streams Inc. is a technical, consulting, and staffing service provider to various clients. The project goal was to design and develop data mart architecture, load the data warehouse using ETL/SSIS and MS SQL code, and create daily and monthly reports using SSRS.
- Involved in preparing the technical specification document based on the BRD and interacted with the business analyst to analyze and understand business needs.
- Designed the architecture of the business intelligence system.
- Worked closely with the project manager and development team to maintain a detailed and accurate project plan and ensure key deliverables were produced on time.
- Created various SQL objects such as schemas, tables, field definitions and mappings, stored procedures, CTEs, triggers, views, and indexes/keys.
- Worked on data marts for Deposits and Loans data.
- Optimized query performance by modifying T-SQL, removing unnecessary columns, eliminating redundant and inconsistent data, normalizing tables, establishing joins, and creating indexes where necessary.
- Involved in designing the data warehouse and core database for the system, including tables, stored procedures, indexes, UDFs, and triggers.
- Designed SSIS packages in SQL Server 2012 to handle data migration (ETL processes) from the transaction system to the dimensional model (see the T-SQL sketch at the end of this section).
- Implemented SSIS (ETL) to extract and transform data from DB2, Oracle, RDBMS, flat files/CSV, and SQL Server 2012 instances, and to load it into staging and then into the DW for further data analysis and reporting, using SSIS transformations such as Data Conversion, Conditional Split, Bulk Insert, Merge, Union All, Lookup, Fuzzy Lookup, Fuzzy Grouping, Merge Join, and Derived Column.
- Created SSIS packages to perform filtering operations and to import data daily from the OLTP system into SQL Server.
- Responsible for implementing data viewers, SSIS logging, and error configurations for error handling in the packages.
- Developed and deployed data transfer packages using SSIS and maintained the jobs running on the servers.
- Generated complex reports using cascading parameters, snapshot reports, drill-down reports, drill-through reports, parameterized reports, report models, and ad hoc reports with SQL Server Reporting Services (SSRS), based on the business requirements document.
- Designed and created report templates, bar graphs, and pie charts based on financial data.
- Involved in scheduled weekly reports, generated every Friday, containing the details of weekly sales for our location to assess the business for that week.
- Developed a high-level daily summary report (Flash Report) and a monthly business summary report (Monthly Management Report) and distributed them across different business units via SSRS subscriptions.
- Scheduled reports to run on a daily and weekly basis in Report Manager and emailed them to directors and analysts for review in Excel.
- Responsible for scheduling and publishing reports using SSRS 2014, deploying reports to Report Manager, and troubleshooting any errors that occurred during execution.
- Created Crystal Reports with multiple sub-reports, including running totals and summaries for calculated fields.
- Developed Power BI dashboards displaying key KPIs for business analysis and decision making.
- Designed and created data extracts supporting SSRS, Power BI, Tableau, and other visualization and reporting applications.
- Used TFS for version control and work item tracking, and published test results in TFS to share with team members.
Environment: Microsoft SQL Server 2014/2012, SSMS, SSIS, SSRS, BIDS, Microsoft Visual Studio 2010, MS SQL Profiler, SQL Query Analyzer, Power BI, Tableau, Qlik Sense.
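A minimal sketch of the transaction-system-to-dimension load pattern noted above, written here as a Type 1 MERGE upsert; the staging and dimension tables are hypothetical:

    -- Upsert changed customer rows from staging into the dimension.
    MERGE dbo.DimCustomer AS tgt
    USING stage.Customer AS src
        ON tgt.CustomerId = src.CustomerId
    WHEN MATCHED AND (tgt.CustomerName <> src.CustomerName
                   OR tgt.Segment      <> src.Segment) THEN
        UPDATE SET tgt.CustomerName = src.CustomerName,
                   tgt.Segment      = src.Segment
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerId, CustomerName, Segment)
        VALUES (src.CustomerId, src.CustomerName, src.Segment);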
SQL Automation Developer
Praxis Technologies, Pune, India (June 2013 - April 2015)
Praxis Technologies is an information technology and enterprise solutions company serving businesses in all sectors with end-to-end, enterprise-level solutions. The project ran through a series of phases to migrate the existing system to a Microsoft SQL Server database.
- Planned, defined, and designed the database based on business requirements and provided documentation.
- Performed activities such as creating and indexing tables.
- Created views to facilitate easy user interface implementation, and triggers on them to facilitate consistent data entry into the database.
- Coordinated with the DBA in creating and managing tables, indexes, triggers, database links, and privileges.
- Gathered data and documented it for further reference.
- Fine-tuned SQL queries for maximum efficiency and performance using SQL Profiler and the Database Engine Tuning Advisor.
- Created SSIS packages to load data from the staging environment to the production environment using Lookup, Fuzzy Lookup, Derived Column, Conditional Split, Term Extraction, Aggregate, Pivot, and Slowly Changing Dimension transformations.
- Used SSIS to create ETL packages to validate, extract, transform, and load data into data warehouse and data mart databases.
- Created SSIS packages using different sources: MySQL, OLE DB, flat files, and Excel files.
- Modified existing SSIS packages to meet changes specified by users.
- Scheduled jobs with SQL Server Agent to execute the stored SSIS packages, which were developed to update the database daily.
- Set up a logging environment to track the status of events such as errors raised and task failures.
- Responsible for ETL using SSIS: an extraction layer reads data from various sources into a staging layer, business logic transforms it in a transformation layer, and data is presented through data virtualization in a presentation layer.
References Available Upon Request