Yaqoob Mazher
Sr. ETL Informatica Consultant
[email protected] | (630) 296-6872
Location: Chicago, Illinois, USA
Relocation: Yes | Visa: GC EAD

PROFESSIONAL SUMMARY:
Around 9 years of experience in the IT industry across data analysis and data modeling, using ETL tools such as Informatica PowerCenter 10.2/9.x/8.x (Source Analyzer, Mapping Designer, Mapplet Designer, Transformation Developer, Warehouse Designer, Repository Manager, and Workflow Manager), Informatica PowerExchange 9.6.2/9.5.1, and Informatica Intelligent Cloud Services (IICS).
Expert in all phases of the software development life cycle (SDLC): project analysis, requirements, design documentation, development, unit testing, user acceptance testing, implementation, post-implementation support, and maintenance.
Worked with non-relational sources such as flat files, XML files, and mainframe files, and with relational sources such as Oracle, SQL Server, Azure SQL Data Warehouse, Teradata, Salesforce, Snowflake, and DB2.
Worked extensively with data migration, data cleansing, and extraction, transformation, and loading of data from multiple sources to the data warehouse, effectively using PowerCenter, PowerExchange, IICS, and Informatica Developer (IDQ).
Instrumental in setting up ETL naming standards and best practices throughout the ETL process (transformations, sessions, mappings, workflow names, log files, bad files, and input, variable, and output ports).
Worked on Informatica performance tuning issues at the source, target, mapping, transformation, and session levels, and fine-tuned transformations to make them more efficient in terms of session performance.
Experience in implementing complex business rules by creating reusable transformations, developing complex mapplets and mappings, and writing PL/SQL stored procedures and triggers.
Experience in seamlessly integrating Power BI within the Informatica environment to deliver comprehensive reporting solutions.
Expertise in RDBMS concepts and database normalization and denormalization principles.
Database experience using advanced Oracle concepts such as stored procedures, functions, packages, and complex SQL relating to data integration.
Experience in creating ETL design documents; strong experience with complex PL/SQL packages, functions, cursors, indexes, views, and materialized views.
Excellent communication, presentation, and project management skills; a very good team player and self-starter with the ability to work independently and as part of a team.
Extensive experience in UNIX Shell Scripting, AWK and file manipulation techniques.
Demonstrated ability in defining project goals and objectives, prioritizing tasks, developing project plans and providing framework for effective communication while maximizing responsiveness to change.
Possess leadership and problem-solving abilities, good analytical skills, excellent verbal and written communication, and good interpersonal skills; a committed, motivated team player and proven self-starter with strong organizing and decision-making skills. Experienced in working on concurrent projects in demanding, high-pressure situations.
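As an illustration of the UNIX shell and AWK file-manipulation work mentioned above, here is a minimal sketch of a pre-load flat-file validation; the file names, layout, and rules are hypothetical, not from any specific project:

```shell
#!/bin/sh
# Hypothetical example: split a pipe-delimited flat file into good/bad
# record files, the kind of critical-data-element check an ETL load might do.
cat > /tmp/claims_src.dat <<'EOF'
1001|ACME|2023-01-05|250.00
1002||2023-01-06|99.50
1003|GLOBEX|bad-date|10.00
1004|INITECH|2023-01-07|75.25
EOF

# A row is "good" when every field is non-empty and the date field looks
# like YYYY-MM-DD; everything else is routed to the bad file.
awk -F'|' '
$1 != "" && $2 != "" && $4 != "" && $3 ~ /^[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9]$/ { print > "/tmp/claims_good.dat"; next }
{ print > "/tmp/claims_bad.dat" }
' /tmp/claims_src.dat

echo "good: $(wc -l < /tmp/claims_good.dat)"
echo "bad: $(wc -l < /tmp/claims_bad.dat)"
```

In practice the bad-record count would drive the notification step (e-mail the users, reject the file) before anything is loaded to the database.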

TECHNICAL SUMMARY:

1. Data Warehousing:
Star Schema & Snowflake Schema
Data Warehouse & Data Marts
2. Informatica Tool Set:
Informatica PowerCenter 10.2, 9.5.1, 9.1.1, 8.6.1
Informatica Data Quality 9.6.1, 9.5.1
Informatica PowerExchange 9.6.2, 9.5.1
Informatica Intelligent Cloud Services (IICS)
3. Reporting Tools:
MicroStrategy
Power BI
SAP BusinessObjects
4. Databases:
Oracle 12c, 11g, 10g, 9i, 8i
IBM UDB DB2
Microsoft SQL Server 2008/2012/2016
Teradata V13
Azure SQL Data Warehouse
Salesforce
Snowflake
5. Operating Systems:
Windows
UNIX
6. Job Schedulers:
Tivoli (TWS)
Tidal Enterprise Scheduler
Control M
Informatica Scheduler
7. Scripting:
UNIX shell, Perl, Batch
8. Software Development Methodologies:
Agile & Waterfall

PROFESSIONAL EXPERIENCE:

Eaton, Cleveland, OH Apr 2020 – Present
Sr. Informatica Developer
Project Description:
The goal was to create new application systems from existing systems that needed to be retired, migrated, or cloned, to improve enterprise sales and servicing. Two approaches were used to create the new systems: 1) a Clone & Go approach and 2) a Remediation approach.
Informatica PowerCenter is used to develop ETL mappings for loading the Oracle enterprise data warehouse (EDW) from different source systems, including Oracle, SQL Server, Salesforce, Snowflake, and mainframe files.

Responsibilities:
Collaborating with Business Analyst team to gather and study requirements, providing design solutions as per the end user requirements.
Developed a standard ETL framework to enable the reusability of similar logic across the board. Involved in System Documentation of Dataflow and methodology.
Building application software, preparing unit test plans, and creating test case scenarios and documenting test results for all the development tasks.
Applied performance tuning skills such as identifying poorly performing SQL, functions, and procedures; recommending improvements to the stored procedures; and, upon approval, making modifications, testing results, and validating performance improvements.
Worked on complex Oracle SQL involving DDL, DML, Oracle functions, PL/SQL blocks, and more.
Worked on different IICS API Processes Input formats like FIELDS and WHOLE PAYLOAD.
Extracted existing sales data from Salesforce objects and integrated the data into the Oracle warehouse using PowerCenter.
Demonstrated proficiency in addressing data-related challenges within Tableau Dashboards, ensuring data accuracy and reliability for impactful business intelligence, all facilitated by IDMC.
Experience working with Custom Built Query to load dimensions and Facts in IICS.
Experience working with IICS monitoring, administrator concepts.
Integrated warehouse data from the Oracle semantic layer into downstream systems in the form of extracts/files, and directly integrated into Salesforce to update SFDC objects with information from the warehouse.
Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse.
Implemented a comprehensive data quality monitoring system that showcases data quality scores and statistics through Power BI dashboards, utilizing Power BI's visualization capabilities.
Exported mapplets from IDQ into Informatica PowerCenter for use in various mappings to implement Address Doctor.
Hands-on experience profiling data using IDQ.
Created reusable transformations and mapplets in the Designer using the Transformation Developer and Mapplet Designer tools, including a reusable Expression transformation to clean the mainframe files sourced into the ODS layer via the Informatica PowerExchange process.
Expertise in providing end-to-end business intelligent solution by using OBIEE.
Used Slowly Changing Dimensions (SCD Type II and Type III) as per the requirement analysis.
Worked on IICS API Process Objects to perform process lookup for Input payloads.
Extensively used Router, Aggregator, Lookup, Source Qualifier, Joiner, Expression, and Sequence Generator transformations in extracting data in compliance with the business logic developed.
Solid experience in debugging and troubleshooting Sessions using the Debugger and Monitor.
Implemented various loads such as daily, weekly, and quarterly loads using an incremental loading strategy.
Fixed defects in the project, provided solutions to be executed by the developers, and created migration documents for releases.
Documented handbook of standards for Informatica code development.
Reviewed Informatica ETL mappings/workflows and SQL that populates EDW and data mart Dimension and Fact tables to ensure accuracy of business requirements.
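The incremental loads mentioned above typically rely on a persisted watermark to pick up only records changed since the last successful run. A minimal shell sketch of that control step follows; the paths and the timestamp format are hypothetical:

```shell
#!/bin/sh
# Hypothetical watermark file driving an incremental extract.
WATERMARK=/tmp/last_load.txt
[ -f "$WATERMARK" ] || echo "1970-01-01 00:00:00" > "$WATERMARK"
LAST=$(cat "$WATERMARK")
echo "extracting rows changed since: $LAST"

# A real extract would apply $LAST in the source filter, e.g. (Oracle):
#   WHERE updated_at > TO_DATE('$LAST', 'YYYY-MM-DD HH24:MI:SS')

# On success, advance the watermark so the next run starts from here.
date '+%Y-%m-%d %H:%M:%S' > "$WATERMARK"
```

The same idea applies whether the watermark lives in a file, a control table, or an Informatica mapping variable.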

Trinity Health, Livonia, MI Nov 2018 – Mar 2020
Sr. Informatica Developer
Project Description:
The project was to process files relating to health claim payments and claims into the data warehouse. EDI files were processed and parsed using Informatica DT and the B2B DX process and were loaded into the warehouse using PowerCenter. Multiple ETL processes were created to source the files from multiple clients and produce reports based on financial users' requests.

Responsibilities:
Developed a file loading process for the Claim Detail file, which sources data from the flat file and goes through ETL validation (a critical data element check ensuring that certain data element fields in the file contain valid values). Once the file passes the critical data element check, it is loaded to the database and a series of notification emails is sent out informing the users of the status of the file load.
Developed Informatica mappings, re-usable transformations, re-usable mappings and Mapplets for data load to data warehouse.
Developed code in MDM, Entity 360, IDD, IDQ, and Power BI as needed for the scrum team.
Worked on 835 and 837 file formats for business processing.
Created sessions, database connections and batches using Informatica Workflow Manager.
Extracted data from flat files, Excel files, and SQL Server, and used complex Joiner, Expression, Aggregator, Lookup, Stored Procedure, Filter, Router, and Update Strategy transformations to load data into the target systems.
Migrated Informatica objects and database objects to the integration environment and scheduled them using Tidal Enterprise Scheduler.
Monitored the ETL jobs/schedules and fixed bugs.
Experience working with Key Range Partitioning in IICS, handling file loads with the file list option, creating fixed-width file formats, file listeners, and more.
Experience integrating data using IICS for reporting needs.
Hands-on experience with tools such as the JIRA ticketing system and TortoiseSVN for maintaining Subversion versions of the code.
Set up batches for large volumes of data and created sessions to schedule the loads at the required frequency using the PowerCenter Workflow Manager.
Handled slowly changing dimensions of Type 2 to populate current and historical data to dimension and fact tables in the data warehouse, using Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator, Joiner, and Stored Procedure transformations as the logic required.
Extensively used various transformations such as Source Qualifier, Expression, Lookup, Sequence Generator, Aggregator, Update Strategy, and Joiner in migrating data from heterogeneous sources such as Oracle, OWB, DB2, XML, and flat files to Oracle.

Farmers Insurance, Woodland Hills, CA Jan 2016 – Oct 2018
Sr. Informatica Developer
Project Description:
Worked on an MDM implementation project that involved data integration from various source systems into the MDM database using Informatica PowerCenter, with Informatica Data Quality used for address validation.

Responsibilities:
Developed a standard ETL framework to enable the reusability of similar logic across the board. Involved in System Documentation of Dataflow and methodology.
Extensively developed Low level Designs (Mapping Documents) by understanding different source systems.
Designed complex mappings, sessions and workflows in Informatica PowerCenter to interact with MDM and EDW.
Designed and developed mappings to implement full/incremental loads from source systems.
Designed and developed mappings to implement Type 1/Type 2 loads.
Responsible for ETL requirement gathering and development with end-to-end support.
Responsible for coordinating the DB changes required for ETL code development.
Responsible for migrating ETL code, DB code changes, and scripting changes to higher environments.
Responsible for supporting the code in the production and QA environments.
Developed complex IDQ rules usable in both Batch and Online modes.
Extensively used transformations such as Router, Lookup, Source Qualifier, Joiner, Expression, Sorter, XML, Update Strategy, Union, Aggregator, Normalizer, and Sequence Generator.
Created reusable mapplets, reusable transformations and performed Unit tests over Informatica code.
Responsible for providing daily status reports for all Informatica applications to the customer, and for monitoring and tracking critical daily applications and code migration during deployments.
Responsible for reloading Informatica application data in production and closing user tickets and incidents.
Identified performance issues and bottlenecks.

Verizon Wireless, Folsom, CA Aug 2013 – Dec 2015
Informatica Developer
Project Description:
This project captures data for the annual Budget Plan cycle and the monthly/quarterly forecasting cycle. The Oracle Hyperion Planning platform requires data from various company source systems to be extracted, cleansed, validated, and transformed into the required Hyperion Planning model. It involves creating ETL code to read data from the various source systems and load it into Oracle, which is then consumed by the Oracle Hyperion Planning team for reporting.

Responsibilities:
Participated in gathering and evaluating requirements, working with application / Data Warehouse team and project managers to provide solutions to end users.
Developed technical design and reporting solutions to influence business results, and oversaw project performance throughout the life cycle, from initiation to completion.
Proficient in translating users' statements of needed system behavior and functionality into business and functional requirements.
Involved in data modeling using ER, star schema, and dimensional modeling. Excellent understanding of OLTP/OLAP system study and analysis and of developing database schemas such as star and snowflake. Exposure to the reporting tools OBIEE and BI Publisher.
Developing ETL mappings from the given requirements and unit testing them accordingly.
Creating Technical Design Documents from Business Requirements.
Worked with various RDBMSs, including Oracle, SQL Server 2008, and DB2.
Created batch scripts for different project requirements, such as file validation, moving files from SharePoint to the Informatica server, and archiving files with date-time stamps.
Performance tuned mappings, sources, targets, and transformations by optimizing caches for Lookup, Joiner, Rank, Aggregator, and Sorter transformations. Tuned Informatica session performance for data files by increasing the buffer block size, data cache size, and sequence buffer length, and used an optimized target-based commit interval and pipeline partitioning to speed up mapping execution time.
Reviewed Informatica ETL mappings/workflows and the SQL that populates data warehouse and data mart dimension and fact tables to ensure accuracy against business requirements.
Created Informatica source and target instances and maintained shared folders so that shortcuts are used across the project.
Responsible for Unit Testing and Integration testing of mappings and workflows.
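The archiving task described in the batch-scripting bullet above can be sketched roughly as follows; the directory and file names are hypothetical placeholders:

```shell
#!/bin/sh
# Hypothetical archiving step: move processed files into an archive
# directory with a date-time stamp appended to each file name.
SRC_DIR=/tmp/etl_inbound
ARC_DIR=/tmp/etl_archive
mkdir -p "$SRC_DIR" "$ARC_DIR"

# Sample inbound files standing in for real extracts.
touch "$SRC_DIR/budget_plan.csv" "$SRC_DIR/forecast.csv"

STAMP=$(date +%Y%m%d_%H%M%S)
for f in "$SRC_DIR"/*.csv; do
  base=$(basename "$f" .csv)
  mv "$f" "$ARC_DIR/${base}_${STAMP}.csv"
done
```

Stamping the archive copy keeps every delivery of the same feed distinct, which makes reprocessing or auditing an old load straightforward.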

Education: Bachelor's in Computer Science, Osmania University, India – 2012