Mudassar - Senior Informatica/IICS Developer
[email protected]
Location: Chicago, Illinois, USA
Relocation:
Visa: H1B
Mudassar Ali
Email id: [email protected]
Phone number: (630) 303-9142

Professional Summary:
- 10+ years of strong experience in the complete Software Development Life Cycle (SDLC), including business requirements gathering, system analysis, design, development, and implementation of data warehouses.
- About 9 years of experience in data warehousing using Informatica PowerCenter 10.5/10.4.1/10.4.0/10.2/9.6.1/9.5.1/9.1/8.6/8.5/8.1 (Repository Manager, Mapping Designer, Workflow Manager, Workflow Monitor).
- Worked on migrating Informatica PowerCenter workflows to IDMC (October 2023 release).
- Supported all ETL inbound and outbound flows of TDM for data masking techniques; performed data masking and data subsetting using Informatica ILM-TDM.
- Experience working with Informatica Intelligent Cloud Services (IICS) / Intelligent Data Management Cloud (IDMC) to load data between Salesforce, Snowflake, and on-prem systems.
- Expertise in T-SQL programming; created SQL objects such as stored procedures, triggers, user-defined functions, tables, views, cursors, and constraints, and wrote complex DML and DDL commands.
- Expert in developing, deploying, debugging, scheduling, and maintaining SSIS packages, and in optimizing them using SQL Server best practices.
- Working knowledge of Informatica PowerExchange to support data integration requirements.
- Strong experience in installing and configuring Informatica PowerCenter 10.4.
- Experience using Oracle 19c/10g/9i/8i, Netezza, MS SQL Server 2005/2012/2016, Azure SQL Data Warehouse, Teradata v15, Snowflake, SQL, T-SQL, and PL/SQL.
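The ILM-TDM masking work above can be illustrated with a minimal sketch of deterministic substitution masking, where the same input always yields the same masked value so referential integrity survives across tables. The `mask_ssn` helper and its seed are illustrative assumptions, not TDM's actual API (TDM configures such rules declaratively):

```python
import hashlib

def mask_ssn(ssn: str, seed: str = "demo-seed") -> str:
    """Deterministic substitution masking: hash the value with a project
    seed and derive replacement digits, preserving the SSN format."""
    digest = hashlib.sha256((seed + ssn).encode()).hexdigest()
    digits = "".join(str(int(c, 16) % 10) for c in digest[:9])
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:9]}"

masked = mask_ssn("123-45-6789")
```

Because the mapping is seeded and repeatable, the same customer masks identically in every table, which is what makes subsetted test data usable for joins.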
- Created SSIS packages with error handling, including complex packages using various transformations and tasks such as Sequence Container, Script Task, For Loop, and Foreach Loop.
- Worked with dimensional data warehouses in star and snowflake schemas; handled slowly changing dimensions and created slowly growing target mappings and Type 1/2/3 dimension mappings.
- Proficient in transforming data from various sources (flat files, XML, Oracle) to the data warehouse using ETL tools.
- Extensively worked on transformations such as Source Qualifier, Joiner, Filter, Router, Expression, Lookup, Aggregator, Sorter, Normalizer, Update Strategy, Sequence Generator, and Stored Procedure.
- Extensive experience developing Informatica mappings/mapplets using various transformations for extraction, transformation, and loading of data from multiple sources to the data warehouse, creating workflows with worklets and tasks, and scheduling the workflows.
- Extensively involved in resolving Informatica and database performance issues; expertise in error handling, debugging, and problem fixing in Informatica.
- Experience in logical and physical data modeling using E/R Studio.
- Expertise in using SQL*Loader to load data from external files into Oracle databases.
- Extensive experience writing PL/SQL program units such as triggers, procedures, functions, and packages in UNIX and Windows environments.
- Exposure to onshore and offshore support activities.
- Excellent verbal and written communication skills; clear understanding of business procedures and the ability to work individually or as part of a team.

Technical Skills:
ETL Tools: Informatica PowerCenter 10.4/10.2/9.6.1/9.1/8.6/8.5/8.1, Informatica PowerExchange, IDMC (October 2023 release), Informatica Developer, Informatica TDM
Data Modeling: Erwin 2020 R2, Toad, Oracle Designer, PL/SQL Developer 5.1.4
Databases: Oracle 19c/11g/10g, MS SQL Server 2012/2008/2005, DB2, PL/SQL, SQL*Plus, SQL*Loader, Developer 2000
Programming: SQL, PL/SQL, T-SQL, SQL*Plus, HTML, Batch, UNIX shell scripting
Environment: UNIX, Windows
Scripting: Shell scripting using environment variables, Linux commands, Perl scripts, PL/SQL procedures, and SharePoint

Professional Experience:

Corewell Health, Grand Rapids, MI (January 2021 - Present)
Sr. ETL Informatica Developer
Responsibilities:
- Worked on the Informatica TDM ILM workbench for data masking, using different masking techniques to mask sensitive data from systems such as Oracle.
- Developed code using Teradata scripts and Informatica mappings, workflows, and sessions per ETL specifications, and implemented CDC (Change Data Capture) scripts where required.
- Worked closely with the data architects and ETL architect to implement ETL solutions for the organization using ETL tools, create technical specification documents, and create test plans.
- Created B2B solutions for incoming vendor files; set up the profiles, endpoints, and DX workflows to ensure incoming files are processed.
- Debugged mappings and identified errors and error rows so they could be corrected and reloaded into the target system.
- Strong experience developing in Informatica PowerCenter, Informatica Cloud (IICS), and Informatica PowerExchange.
- Performed performance tuning of mappings, processes, and load routines.
- Ensured compliance with HIPAA regulations and requirements.
- Performed installation and upgrade support of Informatica PowerCenter and DVO products, including troubleshooting, issue analysis, coding, testing, implementing software enhancements, and applying patches.
- Designed, developed, and implemented ETL processes using IICS Data Integration.
- Extensively used performance tuning techniques while loading data into Azure Synapse using IICS.
- Developed strategies such as CDC (Change Data Capture), batch processing, auditing, and recovery.
- Involved in data model changes and contributed new ideas to the data model design using the Erwin tool; created new tables and added new columns to existing tables.
- Led the strategic initiative to transition a critical B2B integration process from a Perl-based script to a robust Informatica PowerCenter solution, enhancing efficiency and scalability for handling partner files.
- Analyzed and deconstructed the existing Perl script functionality to ensure a comprehensive understanding of the business logic, data transformations, and end-to-end processing requirements.
- Contributed to the successful execution of PI planning sessions, facilitating cross-functional team collaboration to align on objectives, identify dependencies, and establish clear roadmaps for upcoming program increments.
- Played a key role in iteration planning meetings by defining iteration goals, breaking features down into user stories, and ensuring a balanced workload distribution across the team to deliver on sprint commitments.

Environment: Informatica PowerCenter 9.x-10.5.1, IDMC, TDM, B2B, DVO, Oracle 19c, SQL Server, DB2, Azure SQL DW, Snowflake, Salesforce, Quality Center, ClearCase, Bitbucket, Jira, UNIX/Linux, Oracle APEX, XML, TOAD, SQL, T-SQL, PL/SQL, Batch, Control-M

Herbalife, Torrance, CA (March 2019 - December 2020)
ETL Informatica Developer
Responsibilities:
- Developed code using Teradata scripts and Informatica mappings, workflows, and sessions per ETL specifications, and implemented CDC (Change Data Capture) scripts where required.
- Implemented CDC by tracking changes in the critical fields required by the user.
- Optimized and tuned Azure SQL DW to improve batch load performance.
- Loaded data into Azure SQL DW from on-premise systems and flat files using complex Informatica mappings.
- Worked with IICS (Informatica Intelligent Cloud Services) to load data between Salesforce and Snowflake.
- Worked with Oracle, SQL Server, Azure SQL DW, Salesforce, and flat file sources.
- Developed standard and reusable mappings and mapplets using transformations such as Expression, Aggregator, Joiner, Router, Lookup (connected and unconnected), and Filter.
- Identified performance issues in existing sources, targets, and mappings by analyzing the data flow and evaluating transformations, and tuned them for better performance.
- Performed data extraction, transformation, and loading (ETL) between systems using Informatica PowerCenter and Informatica Cloud (IICS).
- Created packages and stored procedures with error handling techniques.
- Set up the batches, configured the sessions, and scheduled the loads per requirements using the Control-M scheduler.
- Expertise in data warehousing, data analysis, reporting, ETL, data modeling, development, maintenance, testing, and documentation.
- Prepared documentation for the mappings and workflows.
- Used post-session success and post-session failure commands in the Session task to execute scripts needed for cleanup and update purposes.
- Performed test cases and various test scenarios to validate the ETL load.
- Developed PL/SQL and batch scripts for scheduling the sessions in Informatica.
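The field-level CDC described above can be sketched as a checksum comparison over only the critical columns, so changes elsewhere do not register as deltas. The function names and sample rows here are illustrative assumptions, not the actual project code:

```python
import hashlib

def row_checksum(row: dict, critical_fields: list) -> str:
    # Hash only the fields the business cares about; edits to
    # other columns do not produce a new checksum.
    payload = "|".join(str(row[f]) for f in critical_fields)
    return hashlib.md5(payload.encode()).hexdigest()

def detect_changes(source_rows, target_checksums, key, critical_fields):
    # New keys and rows whose checksum differs from the stored
    # value both count as change-capture deltas.
    return [row[key] for row in source_rows
            if target_checksums.get(row[key]) != row_checksum(row, critical_fields)]

source = [
    {"id": 1, "plan": "PPO", "amount": 100, "note": "x"},
    {"id": 2, "plan": "HMO", "amount": 200, "note": "y"},
]
stored = {1: row_checksum({"id": 1, "plan": "PPO", "amount": 100}, ["plan", "amount"])}
deltas = detect_changes(source, stored, "id", ["plan", "amount"])
```

In a PowerCenter mapping the same idea is typically an Expression transformation computing the checksum feeding a Lookup against the target's stored checksums.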
- Involved in unit testing, integration testing, and UAT by creating test cases and test plans, and helped the Informatica administrator deploy code across development, test, and production repositories.
- Identified and modified Key Performance Indicators (KPIs) and measures in accordance with the requirements.
- Performed incremental loads while transferring data from OLTP to the data warehouse using different data flows in Informatica.
- Actively involved in production support and transferred knowledge to other team members.
- Coordinated between different teams across the circle and organization to resolve release-related issues.

Environment: Informatica PowerCenter 10.2/10.1 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), PowerExchange, SQL Server 2012, Azure SQL DW, Snowflake, Salesforce, XML, Oracle 11g, TOAD, SQL, T-SQL, PL/SQL, Batch, Control-M

Geisinger Health Plan, Danville, PA (January 2018 - February 2019)
ETL Informatica Developer
Responsibilities:
- Analyzed functional and technical specification documents and designed use case documents.
- Created sessions and workflows to load data from the different source systems into the database, which in turn feeds the downstream applications, using Informatica PowerCenter.
- Built ETL Informatica applications for receiving and processing data to build the data warehouse/data mart environment for reporting and analytics, using Informatica PowerCenter, SQL, and scripting.
- Copied subsets of secured data from production databases to development and testing environments using Test Data Management (TDM).
- Wrote complex SQL override scripts at the source qualifier level to avoid Informatica Joiners and Lookups and improve performance.
- Developed mapping logic using transformations such as Expression, Lookup (connected and unconnected), Joiner, Filter, Sorter, Aggregator, Update Strategy, and Sequence Generator.
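The source-qualifier override technique above pushes the join down to the database instead of using a Joiner or Lookup inside the mapping. A minimal sketch against an in-memory SQLite database (the table and column names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE claims (claim_id INTEGER, member_id INTEGER, amount REAL);
CREATE TABLE members (member_id INTEGER, plan TEXT);
INSERT INTO claims VALUES (1, 10, 250.0), (2, 11, 90.0);
INSERT INTO members VALUES (10, 'PPO'), (11, 'HMO');
""")

# One joined source query replaces a Joiner/Lookup in the mapping:
# the database performs the join where the data already lives,
# so only the combined rows cross the wire.
override = """
SELECT c.claim_id, c.amount, m.plan
FROM claims c
JOIN members m ON m.member_id = c.member_id
ORDER BY c.claim_id
"""
rows = cur.execute(override).fetchall()
```

The payoff grows with data volume: a Lookup caches the whole reference table on the Integration Service, while the override lets the database use its own indexes.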
- Modified existing ETL code and scripts per user requirements.
- Developed test plans covering test scope and environments across different test phases and methodologies.
- Tested the applications using manual and automated tools and managed defects with defect tracking tools.
- Responsible for production turnover: moving ETL Informatica objects (workflows/sessions, mappings), database objects (functions, stored procedures, DDL queries), and scripts from Test to QA to Prod.
- Served as the point of contact between developers and administrators for communications pertaining to successful job execution.
- Resolved issues causing production job failures by analyzing the ETL code and the log files created by failed jobs on the Informatica server.
- Provided backup support for databases and ETL application environments.
- Participated in daily and weekly meetings to discuss data quality, performance issues, ways to improve data accuracy, new requirements, etc.
- Created test plans and performed unit testing for the Informatica BDE mappings and stored procedures.
- Actively involved in production support and transferred knowledge to other team members.
- Coordinated between different teams across the circle and organization to resolve release-related issues.

Environment: Informatica PowerCenter 10.1/9.6.1 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Informatica TDM, PowerExchange, SQL Server 2008, T-SQL, Azure SQL DW, Netezza, Teradata, DB2 8.1, XML, Autosys, Oracle 11g, TOAD, SQL, PL/SQL, UNIX, Control-M

Blue Cross and Blue Shield of Nebraska, Omaha, Nebraska (March 2015 - December 2017)
Informatica Developer
Responsibilities:
- Worked closely with the client to understand business requirements, analyze data, and deliver on client expectations.
- Used Informatica PowerCenter 9.6/8.6.1/8.5 and all its features extensively in migrating data from OLTP to the enterprise data warehouse.
- Extensively used Erwin for logical and physical data modeling and designed star schemas.
- Extracted data from different sources such as Oracle, flat files, XML, DB2, and SQL Server and loaded it into the DWH.
- Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Stored Procedure transformations.
- Developed mappings, reusable objects, and mapplets using the Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter 9.6/8.6.1/8.5.
- Worked with PowerExchange to create data maps, pull data from the mainframe, and transfer it into the staging area.
- Handled slowly changing dimensions (Type I, Type II) based on business requirements.
- Automated daily jobs through the Maestro scheduler while maintaining data validations.
- Worked directly with the chief strategic officer to create multiple T-SQL queries for ad-hoc reports.
- Optimized query performance by modifying T-SQL queries, removing unnecessary columns, eliminating redundant and inconsistent data, normalizing tables, establishing joins, and creating indexes where necessary; rewrote some queries as needed.
- Created folders, users, repositories, and deployment groups using Repository Manager.
- Developed PL/SQL and UNIX shell scripts for scheduling the sessions in Informatica.
- Scheduled the Informatica workflows using Autosys.
- Migrated mappings, sessions, and workflows from development to testing and then to production environments.
- Performed performance tuning for sources, targets, mappings, and sessions.
- Performed unit testing on the Informatica code by running it in the Debugger and writing simple test scripts in the database, tuning it by identifying and eliminating bottlenecks for optimum performance.
- Wrote PL/SQL stored procedures, triggers, and cursors to implement business rules and transformations.
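The Type II slowly-changing-dimension handling above keeps full history by expiring the current row and inserting a new version. A minimal sketch using SQLite; the table, column names, and dates are illustrative assumptions, not the actual warehouse schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE dim_customer (
    sk INTEGER PRIMARY KEY AUTOINCREMENT,
    cust_id TEXT, city TEXT,
    eff_from TEXT, eff_to TEXT, is_current INTEGER)""")

def scd2_upsert(cust_id, city, load_date):
    row = cur.execute(
        "SELECT sk, city FROM dim_customer WHERE cust_id=? AND is_current=1",
        (cust_id,)).fetchone()
    if row and row[1] == city:
        return  # no change in tracked attributes: nothing to do
    if row:
        # Expire the current version instead of overwriting it.
        cur.execute("UPDATE dim_customer SET eff_to=?, is_current=0 WHERE sk=?",
                    (load_date, row[0]))
    # Insert the new version as the current row.
    cur.execute("INSERT INTO dim_customer (cust_id, city, eff_from, eff_to, is_current) "
                "VALUES (?, ?, ?, '9999-12-31', 1)", (cust_id, city, load_date))

scd2_upsert("C1", "Chicago", "2024-01-01")
scd2_upsert("C1", "Denver", "2024-06-01")
rows = cur.execute("SELECT city, is_current FROM dim_customer ORDER BY sk").fetchall()
```

In PowerCenter the same pattern is the Update Strategy transformation routing rows to DD_UPDATE (expire) and DD_INSERT (new version).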
- Worked extensively with different caches such as index cache, data cache, and lookup cache (static, dynamic, persistent, and shared).
- Created deployment groups and migrated the code into different environments.
- Worked closely with the reporting team to generate various reports.

Environment: Informatica PowerCenter 9.6/8.6.1/8.5 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), SQL Server 2005, T-SQL, XML, Autosys

Tufts Health Plan, Boston, MA (May 2013 - February 2015)
ETL Consultant
Responsibilities:
- Studied the existing environment, validated requirements, and gathered source data by interacting with clients on various aspects.
- Worked on the complete DW SDLC, from extraction and transformation through loading of data.
- Involved in designing metadata tables and the ETL process.
- Designed the functional requirements and mapping technical specifications on the basis of the functional requirements.
- Created sessions and workflows to load data from Oracle databases hosted on UNIX servers.
- Developed complex mappings using Lookup (connected and unconnected), Rank, Sorter, Joiner, Aggregator, Filter, and Router transformations to transform the data per target requirements.
- Used most of the transformations, such as Aggregator, Filter, Router, Sequence Generator, Update Strategy, Rank, Expression, and Lookup (connected and unconnected), while transforming the Salesforce data according to the business logic.
- Wrote complex SQL override scripts at the source qualifier level to avoid Informatica Joiners and Lookups and improve performance, as the data volume was heavy.
- Executed jobs in the CA7 scheduler (mainframe scheduler).
- Modified existing ETL code (mappings, sessions, and workflows) and shell scripts per user requirements.
- Monitored workflows and mappings in Informatica for successful execution.
- Involved in the physical and logical design of the banking analysis system using Erwin.
- Worked in Informatica Designer to develop mappings and mapplets to extract and load data from flat files, Oracle, SQL Server, and Sybase.
- Used the Debugger to validate mappings and gather troubleshooting information.
- Worked extensively on creating complex PL/SQL stored procedures and functions and optimizing them for maximum performance.
- Scheduled workflows and sessions using pmcmd.
- Extensively used FTP and the Command task to pull data from mainframes.
- Involved in creating reports using Business Objects.
- Involved in documenting the complete life cycle of the project.
- Involved in preparing a handbook of standards and documented standards for Informatica code development.

Environment: Informatica PowerCenter 8.6/8.1.1, Informatica DT Studio 8.6, Erwin 3.5, Business Objects 5.1, Oracle 10g, Oracle Toad, Sybase, SQL*Loader, Harvest, CA7 Scheduler

Zensar Technologies, Pune, India (July 2012 - March 2013)
PL/SQL Developer
Responsibilities:
- Interacted with end users to gather requirements.
- Performed database tuning, monitoring, loading, and backup.
- Created prototype reporting models, specifications, diagrams, and charts to provide direction to system programmers.
- Developed procedures and functions using PL/SQL.
- Created a number of database triggers according to business rules using PL/SQL.
- Developed SQL for loading metadata from Excel spreadsheets into the database using SQL*Loader.
- Extensively used PL/SQL to implement cursors, triggers, and packages.
- Developed SQL scripts for loading data from existing MS Access tables into Oracle.
- Created record groups for data manipulation; performed unit and system integration testing.
- Involved in database design and development of the database, tables, and views.
- Involved in application testing, deployment, and production support.

Environment: Oracle 8i, PL/SQL, TOAD, SQL*Loader, MS Access, Excel, Windows NT
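The SQL*Loader work mentioned above revolves around a control file describing the input layout. A minimal illustrative config fragment (the file, table, and column names are assumptions, not the actual project objects):

```
-- Load customer rows from a comma-delimited file into a staging table
LOAD DATA
INFILE 'customers.dat'
APPEND INTO TABLE stg_customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(cust_id, cust_name, city, signup_date DATE 'YYYY-MM-DD')
```

It would be invoked along the lines of `sqlldr userid=<user>/<password> control=customers.ctl log=customers.log`, with rejected rows written to a bad file for review.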
Education: Bachelor's in Computer Science, 2012
References: Upon request only