PRAVS - ETL DEVELOPER
[email protected]
Location: Detroit, Michigan, USA
Relocation:
Visa: GC
Employer: [email protected]; [email protected]; (609) 778-4215 ext 1000
PROFESSIONAL SUMMARY
- 11 years of IT experience in the analysis, design, and development of Data Warehouse & Business Intelligence applications (work and education projects) using CA Erwin, Cognos products, and Oracle.
- Extensive experience with ETL tools (Informatica, ODI) and BI tools: BOBJ, Cognos 10/Cognos 8 (Framework Manager, Report Studio, Query Studio, Event Studio, Cognos Connection), OBIEE (Answers/BI Admin Tool), QlikView, and Tableau.
- Experience in data modeling, design, and data analysis with conceptual, logical, and physical modeling for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP).
- Extensive experience in data warehouse/data mart design, development, data modeling, and star schema design; solid understanding of data warehousing, OLTP, and OLAP concepts.
- 6+ years of EDW/BI architecture and 15 years of ETL/BI experience, including extensive performance tuning and production support (supported batch jobs throughout career).
- Well versed in normalization to 3NF and de-normalization techniques for optimum performance in relational and dimensional database environments.
- Experience in data warehousing, data modeling, data profiling, and data analysis; effective with CASE tools such as Erwin and ER/Studio.
- Experience developing data warehouses and data marts using both Bill Inmon and Ralph Kimball approaches, as dictated by enterprise/business needs; developed data warehouse designs, working with staging areas, star schemas, and dimensional reporting.
- Experience with PeopleSoft and Salesforce ERP systems, and with data sources ranging from flat files and Excel to Oracle and SQL Server.
- Experience in design review and validation of final data models.
- Extensive experience writing views, materialized views, and SQL & PL/SQL scripts for Oracle DWH environments.
- Experience with multiple relational databases, including Oracle, Teradata, MySQL, and MS SQL Server.
- Experience in SQL, PL/SQL, stored procedures, functions, packages, triggers, and scripting languages.
- Experience leading onsite and offshore developers and driving timely resolution of issues.
- Responsible for interacting with business partners to identify information needs and business requirements for reports.
- Involved in production/customer support, deployment, development, and integration.
- Strong communication and analytical skills, with a desire to keep up with advancements in the IT industry.

CERTIFICATIONS
- IBM Certified Solution Expert - Cognos BI (IBM Cognos 8 BI Author, IBM Cognos 8 BI Metadata Model Developer, and IBM Cognos 8 BI Administrator)
- Oracle Data Integrator 11g Certified Implementation Specialist
- Databricks Data Engineer Associate (2022)

TECHNICAL SKILLS
Data Modeling / Data Warehousing Tools: Informatica PowerMart/PowerCenter, IBM InfoSphere DataStage 8.5, ODI 10g/11g/12c
BI / OLAP Tools: Cognos 10/8, PowerPlay, Cognos Query, OBIEE, Cognos Administration Console 8.4, Transformer, Tableau Server, Tableau Desktop
Databases: SQL Server 2008/2005/2000/7.0, Oracle 12c/11.x/10.x/9i/8i, Exadata, MySQL, Snowflake, Redshift
CASE Tools: Erwin
Operating Systems: Windows XP/7/10, UNIX, MS-DOS
Languages: SQL, PL/SQL, T-SQL, Shell Script, JavaScript, HTML, VB, ASP, ASP.NET, XML
Others: SQL Query Analyzer, SQL Enterprise Manager, Oracle GoldenGate (OGG), ODBC, Office 2000, Excel, Toad, Control-M, CA ESP, IDQ

PROFESSIONAL EXPERIENCE

Client: Trinity Health, MI    Nov 2016 - Current
Sr. ETL Developer
- Studied and understood the business scenarios of the existing systems, translating business requirements into ETL & BI designs.
- Worked on all activities related to the development, implementation, and support of ETL processes for large-scale data warehouses using PowerCenter.
- Developed Informatica technical design documentation to load data from legacy systems into staging and data warehouse tables.
- Worked extensively on performance tuning of programs, ETL procedures, and processes; used the Debugger to troubleshoot logical errors.
- Provided production support (as well as enhancements and new development) for 4000+ ETL jobs built in Informatica, Oracle PL/SQL (code originally developed in the early 2000s), MySQL procedures, and shell scripts, scheduled via cron/Control-M and loading to Oracle & Teradata.
- Built Informatica mappings and workflows to process data into the different dimension and fact tables.
- Used Informatica Designer to create source and target definitions, mappings, and sessions to extract, transform, and load data into staging tables from various sources.
- Developed complex mappings using Informatica PowerCenter Designer to transform and load data from source systems such as flat files, XML, and Oracle into the Oracle target database.
- Used the Debugger wizard to remove bottlenecks at the source, transformation, and target levels for optimum use of sources, transformations, and target loads.
- Worked extensively on stored procedures, triggers, views, and indexes using SQL*Plus and PL/SQL in SQL Server and Oracle 12c/18c.
- Used Oracle performance tuning techniques to optimize SQL queries used in Informatica and in PL/SQL code.
- Involved in migrating Oracle DB procedure code to Teradata, along with the accompanying data model changes.
- Extensively used Teradata TPT via Informatica to load data into ODS tables; wrote dynamic SQL in Teradata to build Type-I and Type-II history for source data loaded to the ODS.
- Used BTEQ and FastLoad scripts via shell to load data into Teradata.
- Made recommendations on Teradata PI, NUPI, PPI, and USI choices based on the joins in the ETL code and the BI layer.
- Performed address, email, and phone number validation using IDQ.
Environment: Informatica 10.2, OBIEE 12c, Oracle 12c/18c, Teradata 16.2, MySQL, Informatica MDM 9.0, SQL Developer 4.1.3, Control-M, SQL Server 2016, EPIC Clarity, IDQ.

Client: Raymond James, Florida    April 2016 - Nov 2016
Sr. DWH Developer
Raymond James Financial is an American diversified holding company providing financial services to individuals, corporations, and municipalities through its subsidiary companies, which engage primarily in investment and financial planning, in addition to investment banking and asset management. The existing data warehouse in Netezza was migrated to Oracle Exadata; the work involved architecting the DWH solution and making changes to the existing data warehouse.
- Participated in JAD sessions to understand the source Netezza data and provided the logical and physical data models, using Erwin, to load to the ODS.
- Developed the data dictionary for the asset & liability project, covering the standard data definitions related to data analytics.
- Created ODI interfaces/packages/procedures to load multiple flat files of the same structure in a serial fashion, avoiding temp space issues while loading from staging to target.
- Performed fit-gap analysis and loaded data from a different source into the existing DWH tables.
- Developed ODI interfaces/packages/procedures to load security breaks information to identify the turnover rate of an asset, and modified existing mappings to handle nightly data load failures caused by duplicates.
- BI: Created Cognos ad hoc reports to identify missing account information after the Netezza data load and assisted the report developers in understanding the data; troubleshot and fixed existing Cognos reports per issued tickets.
- Developed Windows batch scripts to extract sales data from the NCR Teradata database to flat files, and scripts to load flat file data into Oracle staging tables and from staging tables into DWH facts.
- Created many dashboards and KPI reports using relevant Marks and Highlight functionality; designed and deployed reports with drill-down, drill-through, and drop-down menu options, as well as parameterized and linked reports, using Tableau.
- Responsible for creating calculated fields, combined fields, bins, sets, geocoding, and hierarchies using Tableau Desktop.
Environment: ODI 12.1.3 (Designer, Operator, Topology, Security), Cognos 10.1, Oracle Exadata 12c, Erwin, GoldenGate, Teradata 13, Tableau v8 (Desktop), SQL Developer 4.0.0.12, Control-M BMC.

Client: Verizon Wireless, Dallas, NY    Oct 2013 - April 2016
Sr. DWH/Cognos Architect & Lead (2nd engagement with the same client)
Verizon Wireless offers quality products and services on the nation's largest 4G LTE network and largest, most reliable 3G network, delivering industry-leading wireless technology. The projects included developing reporting solutions for Training Analytics & Completions. Developed the data marts using SQL and PL/SQL, and used Oracle Data Integrator (ODI) to load the metrics (sales data) into the DWH as part of the Verizon Enterprise HR/ELM Data Warehouse team. Also used ODI to extract data from the Salesforce cloud (Change Management, Incident Management), populating data marts and developing reports for better analysis.
Responsibilities:
- Reviewed business requirements and analyzed data sources from Excel, Oracle, and SQL Server for the design, development, testing, and production rollout of DWH & reporting solutions.
- Developed data models and ER diagrams using Erwin.
- Responsible for extracting data from different sources such as Oracle, SQL Server, flat files, and XML.
- Created logical & physical data models for relational (OLTP) and dimensional (OLAP) systems, designing star schemas with fact & dimension tables using CA Erwin; validated data using normalization to 3NF.
- Brainstormed ideas on what to include in views/dashboards from the data source and documented on-the-fly business requirements.
- Created database objects such as tables, views, procedures, triggers, and functions using Oracle SQL to provide definition and structure and to maintain data efficiently.
- Designed data models and analyzed data for online transaction processing (OLTP) and online analytical processing (OLAP) systems.
- Involved in data flow analysis, data modeling, physical database design, forms design and development, data conversion, and performance analysis and tuning.
- Reduced the run time of enterprise-wide Cognos burst reports (VP/DIR/AD levels; 90 ADs) from 10+ hours to 40 minutes.
- Performance-tuned a summary report built off another team's Cognos package from 2 hours to 17 minutes, a 4+ hour report to 20 minutes, and a seemingly never-finishing yearly report to 45 minutes.
- Identified and solved performance issues in reports developed by other team members, as well as framework issues that caused report errors and performance problems, and guided the team accordingly.
- Upgraded the ETL processes/code for the upgraded PeopleSoft ELM 9.2 version and the custom-built PeopleSoft application (ITAT), verified that the Cognos framework changes were in sync, and updated all the relevant Cognos reports per the new query item name changes.
Environment: ODI 11gR2 (Designer, Topology, Operator, Security), Oracle 11g, CA Erwin Data Modeler, CA ESP, SQL Server, PL/SQL, TOAD, Cognos 10.1/10.2, Oracle GoldenGate (OGG).

Client: Kansas City Power & Light, Kansas City, Missouri    Apr 2013 - Oct 2013
Sr. ETL (PL/SQL, ODI) and Business Intelligence Architect
Kansas City Power & Light delivers power to more than 800,000 customers in 47 northwestern Missouri and eastern Kansas counties. As part of the Customer Tier 1 project, converted the existing data warehouse scripts (SQL, PL/SQL, and SQL*Loader scripts) to ODI processes and converted OBIEE 11g solutions to Cognos 8.4 solutions. As part of the SPP project, converted 5 QlikView reports and developed new reports in Cognos. Involved in DWH and BI solutions production support.
Responsibilities:
- Involved in the SPP project, which loads data from PCI GSMS, Endur, Settlements, PJM, and MISO into the enterprise data warehouse using third-party vendor tools.
- Wrote stored procedures that automatically calculate the rows inserted into the DWH from the source system daily.
- Developed data models/architecture (LDM/PDM) between the source systems & DWH, converting the existing reports and performing fit-gap analysis.
- Wrote SQL scripts to find duplicates in the data warehouse and to find missing records in fact tables at the year, month, day, hour, and 5-minute interval levels.
- Developed & modified QlikView reports (Dispatch Instructions, Settlements, PJM, MISO, and Multi Market Price) originally implemented by a third-party DWH vendor.
- As part of the One Mobile analytics project, performed fit-gap analysis between PeopleSoft Financials 8.4 and 9.1 (limited to the tables related to BI solutions) and updated the BI solutions to use the real-time PS Financials 9.1 tables.
- Developed the Yearly Trial Balance dashboard and modified the Monthly Trial Balance dashboard per user requirements.
- Modeled conformed and degenerate dimensions such as Ledger, Base Currency, Currency Code, Budget Period, Budget Reference, and Scenario (source: PS Financials 9.1).
- Fixed the calculations for facts that were computed incorrectly (Ledger Beginning Balance Current Period, Ledger Ending Balance Current Period, Budget Beginning Balance Current Period, and Budget Ending Balance Current Period).
- Set up OBIEE sandbox environments; troubleshot and fixed issues in the cloned OBIEE sandboxes.
- Worked extensively with the Teradata team in all phases of the project, including requirements, coding, development, and UAT testing.
- Fixed Oracle BI Apps DAC setup issues in the DEV and TEST environments; migrated Informatica projects from DEV to TEST.
- As part of the Customer Tier 1 project, architected the data flow from multiple source systems (KCPL & GMOC) into the DWH and participated in JAD sessions with the stakeholders.
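The duplicate and missing-interval checks described above might be sketched in Oracle-style SQL along the following lines (the table and column names are illustrative placeholders, not the actual project schema):

```sql
-- Illustrative only: FACT_SETTLEMENT, SETTLEMENT_ID, and INTERVAL_TS are
-- hypothetical names standing in for a real fact table and its keys.

-- Find duplicate natural keys in the fact table.
SELECT settlement_id, interval_ts, COUNT(*) AS dup_count
FROM   fact_settlement
GROUP  BY settlement_id, interval_ts
HAVING COUNT(*) > 1;

-- Find missing 5-minute intervals for one day (288 intervals per day)
-- by generating the expected timestamps and anti-joining the loaded data.
SELECT c.expected_ts
FROM   (SELECT DATE '2013-06-01' + (LEVEL - 1) * 5 / 1440 AS expected_ts
        FROM   dual
        CONNECT BY LEVEL <= 288) c
LEFT   JOIN fact_settlement f
       ON f.interval_ts = c.expected_ts
WHERE  f.interval_ts IS NULL;
```

The same anti-join pattern extends to the year/month/day/hour levels by changing the generated calendar grain.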
- Prepared the Technical Requirements Document based on the Functional Requirements Document; developed SDLC documents, development standards, and framework documents; created Visio diagrams depicting the data flow as part of the Technical Requirements Documents.
- Used ODI Designer to create projects and release scenarios; created data servers for specific technologies and mapped them to physical and logical architectures.
- Used the RKM in ODI to reverse-engineer the metadata of the source database.
- Created various source-to-target mappings using the Topology and Designer navigators.
- Extensively used ODI Knowledge Modules (Loading, Integration, and Journalizing Knowledge Modules) for loading data into staging and target areas and maintaining data consistency.
- Created new users and granted access to various contexts in Security Manager.
- Implemented the Change Data Capture (CDC) feature of ODI to minimize the data load.
- Scheduled scenarios in Operator in DEV/QA/Prod; handled errors by monitoring job execution in Operator and fixing the code.
- Tested the integrations developed, created SIT test scripts, validated test results, uploaded the SIT scripts to the SharePoint site, and participated in UAT sessions.
- As part of the Enterprise DWH & BI team, involved in 24/7 production support; assigned work and organized JAD sessions and status calls with 3 offshore & 2 onsite developers.
Environment: ODI 11.1.1.6.3 (Designer, Topology, Operator, Security), Cognos 8.4.2 (Framework Manager, Report/Query/Analysis Studios, Cognos Connection), BI Apps 7.9.6.3, Informatica 9.0.1, DAC 10.1.3, OBIEE 11.1.1.6.2, Oracle 11gR2, Teradata, Oracle SQL Developer 3.0.04, PL/SQL Developer 9.0.6, Windows 7 Enterprise/Server 2008 R2, Microsoft Outlook 2010.

Client: Golden Living, Fort Smith, Arkansas    Jul 2012 - Mar 2013
Sr. PL/SQL (ETL IBM DataStage) & Cognos Architect
Golden Living is a family of healthcare companies committed to enhancing the lives of its residents and patients. The scope of the project was to maintain the data warehouse for the corporate Finance & HR module, which supports the functions of other Golden Living divisions and assists with their day-to-day activities. The existing data warehouse offers reports on expenses, taxation, project cost, HR information, etc. Migrated the existing Cognos 8.4.1 reports to Cognos 10.1; the report inventory consisted of around 1,200 reports, 32 Framework Manager models, and PowerPlay cubes. Also made changes to the SQL and PL/SQL used in the IBM DataStage jobs that load the data warehouse tables.
Responsibilities:
- Involved in the requirements analysis for new reports and enhancements to existing reports.
- Managed a team of five offshore developers, handled intergroup communication and liaison (Infrastructure Services/QA/SME/Business), and conducted classroom & WebEx training sessions for business users on Query Studio, Analysis Studio, and PowerPlay Studio.
- Analyzed and documented the existing data sources feeding the Cognos Data Manager catalogs and the tables loaded by the various Data Manager JobStreams, fact builds, and dimension builds.
- Re-designed and re-architected the DWH data flow, BI data models, and reports for performance.
- Created analysis reports using Analysis Studio and ad hoc reports using Query Studio.
- Developed Windows batch scripts to extract sales data from the NCR Teradata database to flat files, and scripts to load flat file data into Oracle staging tables and from staging tables into DWH facts.
- Wrote SQL to predict future budget data based on historical budget data, convert row information of one table into columns of another (pivoting), split a single row of a table into multiple rows based on the admit and discharge date columns (so there is one row per day of a stay), and return a NULL value when no data is returned by a SELECT statement.
- Developed multi-tab HR dashboards (containing lists, crosstabs, charts, and prompts) that communicate with each other via a global prompt value, include drill-through definitions, and offer an option to insert comments on the dashboard, built off a Cognos cube using Report Studio.
- Wrote several PL/SQL subprograms (stored procedures, functions, and packages) using PL/SQL records, PL/SQL tables, and global variables, with IN and OUT parameters declared via %TYPE, %ROWTYPE, PL/SQL tables, and PL/SQL records.
- Wrote stored procedures/functions/packages in various schemas per business requirements, and was involved in tuning, performance optimization of queries, and standardization of the code.
- Used database triggers to keep a history of inserts, updates, and deletes, and for audit routines of all kinds.
- Automated the dimensional and relational reports and dashboards to run for the current year, month, and year-to-date month using prompt tokens and macros.
- In Report Studio, worked with detail reports, drill-through, prompts, complex calculations, formatting, conditional formatting, crosstabs, multi-query reports, and charts.
- Performed bursting of reports to send different segments of the report to different users.
- Performed comprehensive unit testing by comparing the Cognos reports against the database using SQL in Toad, and documented the results.
- Analyzed the source DB2 UDB stored procedures written to extract the DWH source data files, identified the data discrepancies, and worked with the mainframe team to resolve them.
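The one-row-per-day split described above (expanding a stay record across its admit-to-discharge range) could be sketched in Oracle SQL as follows (PATIENT_STAY and its columns are hypothetical names used only for illustration):

```sql
-- Illustrative only: PATIENT_STAY, STAY_ID, ADMIT_DATE, and DISCHARGE_DATE
-- are placeholder names, not the actual project schema.
-- Expand each stay into one row per day between admit and discharge
-- using Oracle's CONNECT BY row generator as a calendar of day offsets.
SELECT s.stay_id,
       s.admit_date + (d.day_offset - 1) AS stay_day
FROM   patient_stay s
JOIN   (SELECT LEVEL AS day_offset
        FROM   dual
        CONNECT BY LEVEL <= 366) d   -- caps expansion at a year-long stay
  ON   d.day_offset <= s.discharge_date - s.admit_date + 1
ORDER  BY s.stay_id, stay_day;
```

A one-day stay (admit_date = discharge_date) yields a single row; each additional day of the stay adds one more row.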
- Developed crosstabs showing characters as the measure, and built drill-through from cube view to relational reports, dimensional to dimensional reports, and dimensional to relational reports.
- Installed Tableau and developed POC reports in it to better understand its functionality and limitations relative to Cognos.
- Involved in admin tasks such as maintaining the portal, assigning security, and scheduling reports; implemented object-, data-, and package-level security.
- Enhanced the existing scorecards in Metric Studio.
- Expertise with the HP Quality Center tool in creating CRs (change requests) and general tickets.
Environment: IBM InfoSphere DataStage 8.5, Information Server Manager 8.5, JD Edwards (HR), CA ESP, Cognos 10.2/8.4.1 (Framework Manager, Report/Query/Analysis Studios, Cognos Connection, Event Studio), Oracle 9i, Teradata 13, SQL, PL/SQL, Toad 10.6.1, SQL Server 2005/2008, DB2 UDB 8.1, Windows XP/Server 2008, Tableau 7.0, Microsoft Outlook 2003.

Texas A&M University    2006 - 2007
As part of the project team, developed a Smart Bill Payment system using Java and HTML, and developed analysis and reports off an MS Access DB to validate the functionality and data.

EDUCATION
Bachelor of Technology, Electrical and Electronics Engineering (2002 - 2006)
Master's in Computer Science, Texas A&M University, 2006 - 2007