
Mohammed Baig
ETL IICS Developer
Email: [email protected]
Ph. No.: 773-681-4947
Location: Farmington Hills, Michigan, USA
Relocation: No


Summary:
Over 11 years of SDLC experience spanning analysis, requirements gathering, ETL design, development, testing, and documentation using Informatica tools. Proficient in Informatica PowerCenter, Informatica Power Exchange, Informatica IICS/IDMC, Azure Data Factory, and Azure Databricks. Actively worked on an Informatica upgrade project from 9.x to 10.x and on the migration from Informatica PowerCenter to Informatica Cloud IICS/IDMC. Skilled in Agile/Scrum methodologies, JIRA, and Azure DevOps. Experienced in data integration with various databases and file formats including delimited, fixed-width, JSON, XML, and Parquet. Strong grasp of data warehouse concepts, modelling techniques, and best practices. Proficient with automation scheduling tools and in mentoring junior developers. Effective communicator with critical thinking skills.

Technical Skills:


ETL (on-prem): Informatica PowerCenter 9.x/10.x/10.1.0/10.4.0, Informatica Power Exchange 9.5.1/10.4, Informatica Developer 9.6.1, Informatica B2B Data Exchange
ETL (cloud): Informatica IICS/IDMC, Azure Databricks, Azure Data Factory
DBMS/RDBMS: Oracle 11g/10g/9i/8i, SQL Server 2012/2014/2016, Azure Synapse (SQL DW), AWS Redshift, Snowflake, DB2
Operating Systems: UNIX, Linux, Windows XP, Windows 7, Windows 10
Languages: SQL, PL/SQL, T-SQL, UNIX shell scripting, Windows batch
File-Based Storage: Azure Data Lake Store Gen2, Azure Blob Storage, Amazon S3, network drives, SharePoint
Version Control: Tortoise SVN (Subversion), TFS (Team Foundation Server), Azure DevOps Repos, GitLab
Job Scheduling: Control-M, TWS, Informatica Scheduler
GUI Tools: Toad, SQL Developer, PL/SQL Developer, WinSQL, SharePoint, TFS, Visual Basic, Postman, SoapUI, Microsoft Visio
Web Services: REST API, SOAP API
Methodologies: Agile/Scrum, Waterfall, Data Warehousing and Data Modelling
Cloud Platforms: Microsoft Azure, AWS

Professional Experience:

Charles Schwab & Co. (Remote) March 2023 - Present
Sr. ETL Developer

Project: State Street Data Modernization/ Project Birdie

Responsibilities:
Developed Informatica ETL pipelines for Enterprise Data Warehouses, Data Marts, and Operational Data Stores (ODS).
Performed data cleansing, standardization, transformation, merging, change data capture, and reconciliation according to business requirements.
Extracted data from flat files, processed it according to business logic, and loaded it into databases such as SQL Server and Eagle.
Extracted data from XML and JSON files, transformed it according to business logic, and loaded it into the Marty database.
Extracted data from REST API calls using Informatica with API-key authentication.
Extracted data from Salesforce using the Informatica Salesforce connection and integrated it with the target Snowflake database.
Extensively created Informatica code to load data from source systems such as Oracle, Salesforce, APIs, files, and SQL Server into the Snowflake database.
Used Router, Joiner, Lookup, Rank, Filter, Transaction Control, Update Strategy, SQL, Expression, Sequence Generator, and Stored Procedure transformations to develop complex mappings.
Extensively used the concept of Input and In/Out parameters in IICS/IDMC.
Hands-on experience with all the Data Integration components, including parsers, UDFs, fixed-width formats, mapplets, mappings, mapping tasks, and taskflows.
Hands-on experience with the IICS/IDMC Mass Ingestion domain for both real-time database ingestion and file-based ingestion.
In parallel, worked on converting legacy Informatica PowerCenter mappings into Informatica Cloud mappings with the help of the Informatica Data Migration Factory team.
Unit and functional tested all assets migrated to Informatica Cloud IICS and was responsible for deploying the new cloud assets to higher environments.
Developed mappings with Pushdown Optimization in mind for better performance, utilizing Snowflake credits instead of Informatica server capacity.
Extensively used Informatica partitioning to improve source read performance.
Built Type 2 ODS tables and dimensions, and built incremental and append-load facts using last-modified datetime columns.
Gained strong exposure to performance tuning existing mappings, improving solutions and changing transformations as needed.
Extensively created mappings for the full pipeline of incoming data: from flat files into a Snowflake staging warehouse, onward to a prepared warehouse, then to a final database, and finally into the semantic warehouse in Snowflake for reporting and analytics.
Used Informatica PWX to capture real-time changes from Oracle tables and Informatica PowerCenter to read the condense files and load the data into the Snowflake database.
Created unit tests for development and testing activities.
Provided support to the QA and UAT teams for unit and SAT testing.
Worked extensively on SQL queries such as creating and altering tables, indexes, views, and partitions, and worked with PL/SQL stored procedures; queried various tables to produce result sets per business requirements.
Extensively used Control-M for scheduling the jobs created.
Working experience with the Lakehouse/Medallion architecture, organizing datasets into bronze, silver, and gold layers.
Used Azure Databricks to ingest data into the data lake in Parquet format as the bronze layer, then pushed the data to the silver layer as Delta tables, queryable from the Databricks workspace through Unity Catalog (see the sketch after this list).
Good knowledge of Databricks concepts such as clusters, notebooks, workflows, Parquet, Delta tables, and Unity Catalog.
Coordinated with teams such as the Release, DBA, and UNIX teams to support smooth releases of completed development.
Involved in the Development, Unit Testing, QA, UAT, and release phases of the project.
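
A minimal PySpark sketch of the bronze-to-silver promotion described above, assuming a Databricks cluster with Unity Catalog enabled; the storage path, catalog, schema, table names, and business key are hypothetical placeholders, not the project's actual objects.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land the raw Parquet files from the lake as-is (path is assumed).
bronze_df = spark.read.parquet("abfss://[email protected]/landing/orders/")
bronze_df.write.mode("append").format("delta").saveAsTable("main.bronze.orders_raw")

# Silver: light cleansing, then persist as a Delta table that is
# queryable from the Databricks workspace through Unity Catalog.
silver_df = (
    spark.table("main.bronze.orders_raw")
    .dropDuplicates(["order_id"])                  # assumed business key
    .withColumn("load_ts", F.current_timestamp())  # audit column
)
silver_df.write.mode("overwrite").format("delta").saveAsTable("main.silver.orders")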
Environment: Informatica PowerCenter 10.4.0, Informatica Power Exchange 10.4, IICS/IDMC, Control-M 9.0.20.200, Microsoft SQL Server 2016/2014, Azure Databricks, Azure Data Lake Store Gen2, SSMS, XML files, JSON files, API, Bitbucket, JIRA, Microsoft Visual Studio 2022.

Alliant Energy - WI (Remote) Sept 2022 - March 2023
Sr. ETL Developer/Tester

Responsibilities:
Analysed existing ETL Data Warehouse processes and ERP/NON-ERP Application interfaces to design new Azure Synapse Data Warehouse and Data Lake Store solutions.
Created ETL and Data Warehouse standards documents including Naming Standards, ETL methodologies, data cleansing, and preprocessing strategies.
Generated mapping documents with detailed source-to-target transformation logic.
Designed, Developed, and Implemented ETL processes using IICS Data Integration.
Configured IICS connections using various cloud connectors in the IICS Administrator console.
Implemented performance-tuning techniques for loading data into Azure Synapse using IICS Pushdown Optimization with the ODBC connector.
Effectively used the IICS Azure SQL Data Warehouse V2 native connector to stage data into Azure Synapse.
Utilized various cloud transformations including Aggregator, Expression, Joiner, Lookup, Rank, Router, Sequence Generator, Sorter, and Union Transformations.
Employed cloud connectors such as Azure Synapse (SQL DW), Azure Data Lake Store V3, Azure Blob Storage, Oracle, Oracle CDC, and SQL Server.
Integrated data from the staging layer to the ODS layer and then to the business layer consisting of dimensions and facts.
Built various templates for the ODS layer in IDMC to enable reusability across multiple source systems.
Devised and planned the production deployment to switch jobs from PowerCenter to IDMC.
Developed parameterized mapping templates for Stage, Dimension (SCD Type 1, SCD Type 2, CDC, and incremental load), and Fact load processes (see the sketch after this list).
Utilized Parameters, Expression Macros, and Source Partitioning for optimization.
Created testing scenarios and unit test cases for the data load.
Tested the data load between the two databases using the KNIME GUI tool.
Loaded data into Snowflake instances using Snowflake connector in IDMC for analytics and insights.
Created complex Informatica Cloud Task flows with multiple mapping tasks.
Involved in Development, Unit Testing, System Integration Testing (SIT), and User Acceptance Testing (UAT) phases of projects.
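
A minimal sketch of the SCD Type 2 pattern named above, expressed as two generic T-SQL statements run through pyodbc rather than as an IICS mapping template; the DSN, table, and column names are hypothetical placeholders.

import pyodbc

# Connect through an assumed ODBC DSN for the warehouse.
conn = pyodbc.connect("DSN=synapse_dw")
cur = conn.cursor()

# Step 1: expire the current dimension rows whose tracked attributes changed.
cur.execute("""
    UPDATE dim_customer
    SET end_date = GETDATE(), is_current = 0
    WHERE is_current = 1
      AND EXISTS (
          SELECT 1 FROM stg_customer s
          WHERE s.customer_id = dim_customer.customer_id
            AND (s.name <> dim_customer.name OR s.segment <> dim_customer.segment)
      );
""")

# Step 2: insert a fresh current row for the keys expired in step 1
# and for brand-new keys that have no current row yet.
cur.execute("""
    INSERT INTO dim_customer (customer_id, name, segment, start_date, end_date, is_current)
    SELECT s.customer_id, s.name, s.segment, GETDATE(), NULL, 1
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current = 1
    WHERE d.customer_id IS NULL;
""")
conn.commit()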
Environment: Informatica Intelligent Cloud Services IICS/IDMC, Informatica PowerCenter 10.2, Informatica Power Exchange 10.2, SSIS, Windows Secure Agent, Azure Synapse (Azure SQL DW), SQL Database, Azure Data Lake Store, Azure Databricks

The Hartford (Remote) Jun 2021 - Aug 2022
Sr. ETL Developer
Responsibilities:
Collaborated with business users to define process metrics and key dimensions, overseeing the full project lifecycle.
Designed and implemented Informatica B2B architecture and created/tested B2B mappings.
Assisted architects in developing STG/ODS/Hub/dimensional warehouse in Azure SQL Data Warehouse.
Worked with various non-relational and relational databases including Flat files, XML files, Oracle, SQL Server, Azure SQL Data Warehouse, Teradata, Salesforce, Snowflake, and DB2.
Developed RESTful APIs using Python and wrote Python scripts for parsing XML and CSV documents (see the sketch after this list).
Experience working with the Salesforce connector to read data from Salesforce objects into the cloud warehouse using IICS.
Experience working with IICS Monitor and Administrator concepts.
Experience working with Data Integration concepts including, but not limited to, mappings, mapping configuration tasks, taskflows, deployment using Git automation, schedules, connections, and API integration.
Experience working with Key Range Partitioning in IICS, handling file loads with the file-list option, creating fixed-width file formats, and using file listeners.
Experience integrating data using IICS for reporting needs.
Defined ETL specifications based on business requirements and created ETL mapping documents.
Established modelling and naming standards for data models and DDLs/DMLs.
Created Informatica PowerCenter mappings, B2B DX objects, and transformation projects for X12 file processing.
Conducted root cause analysis and provided quick fixes for production issues related to Informatica tools.
Utilized Informatica PowerCenter tools including Mapping Designer, Workflow Manager, and Repository Manager.
Managed projects using JIRA ticketing system and version control with Tortoise SVN.
Worked on complex Oracle SQL queries, DDLs, DMLs, functions, and PL/SQL blocks.
Designed and developed ETL processes using Informatica PowerCenter based on business rules.
Experienced with PWX concepts such as registration maps, logger, listener, and condense files.
Provided development and technical support for post-production issues.
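
A minimal sketch of the kind of XML-to-CSV parsing script mentioned above, using only the Python standard library; the file names and XML layout are hypothetical placeholders.

import csv
import xml.etree.ElementTree as ET

# Parse the source XML (file name and element names are assumed).
tree = ET.parse("policies.xml")
rows = [
    (p.findtext("id"), p.findtext("holder"), p.findtext("premium"))
    for p in tree.getroot().iter("policy")
]

# Write the extracted fields out as CSV.
with open("policies.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "holder", "premium"])
    writer.writerows(rows)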
Environment: Informatica PowerCenter, B2B, Informatica Developer, Informatica Data Transformation Studio, Oracle 11g, GitHub

Health Axis Group, Dallas, TX May 2018 - May 2021
Sr. ETL Informatica Developer

Responsibilities:
Interacted with business users to identify process metrics and various key dimensions and facts, and was involved in the full life cycle of the project.
Assisted the architect in developing the STG/ODS/Hub/dimensional warehouse in Azure SQL Data Warehouse.
Assisted in defining logical and physical database models for building a new enterprise data warehouse in the cloud to replace the existing on-premises warehouse.
Identified ETL specifications based on business requirements and created ETL mapping documents and high-level documentation for product owners and data managers.
Defined modelling and naming standards and best practices for the modelling team to use in data models as well as in DDLs and DMLs when creating new data elements and adding attributes.
Imported and exported databases using SQL Server Integration Services (SSIS) and Data Transformation Services (DTS packages).
Worked on Informatica B2B Data Transformation to parse files using the parsers and HIPAA libraries (see the sketch after this list).
Effectively used the IICS Data Integration console to create mapping templates bringing data into the staging layer from different source systems such as SQL Server, Oracle, Teradata, Salesforce, flat files, Excel files, and PWX CDC.
Proficient in designing techniques like Snowflake schemas, Star schema, fact and dimension tables, logical and physical modeling and used ERWIN to design Physical and Logical Data Modeling.
Involved in all the phases of Migration from DTS to SSIS packages.
Experienced working with IICS transformations such as Expression, Joiner, Union, Lookup, Sorter, Filter, and Normalizer, and with concepts such as macro fields to templatize column logic, smart match fields, and bulk field renaming.
Experience working with Informatica Power Exchange integration with IICS to read data from condense files and load it into the Azure SQL Data Warehouse environment.
Experience with PWX concepts including registration maps, logger, listener, and condensing files.
Experience working with IICS full Pushdown Optimization to push data from staging to ODS at scale using data integration templates.
Imported the PowerCenter workflows to B2B data exchange console to invoke the PowerCenter workflows.
Production deployment of ETL Informatica objects i.e., Workflows, Sessions, Mappings, DT Services, B2B DX Profiles and Windows Scripts.
Experience working with custom-built queries to load dimensions and facts in IICS.
Experience working with various Azure SQL Data Warehouse connectors, including the V2 and V3 connectors.
Experience working with Microsoft ODBC drivers to enable full PDO for loads within SQL Data Warehouse.
Experience building the semantic layer after the fact loads so that reporting tools can connect to the data warehouse.
Responsible for deployments in Higher Environments and prod support for warranty period before turning over to manage services.
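
A toy Python illustration of the segment/element structure the B2B parsers above operate on when processing X12 files; the sample content is invented and heavily truncated.

# X12 files are segment-delimited (here '~') with '*' separating elements.
sample = "ISA*00*SENDER*RECEIVER~GS*HC*SUB1*SUB2~ST*837*0001~SE*3*0001~"

for seg in filter(None, sample.split("~")):
    elements = seg.split("*")
    print(elements[0], elements[1:])  # segment ID, then its elements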
Environment: Informatica IICS, Oracle 11g, SQL Server 2016, B2B, SQL Server Integration Services (SSIS), Azure SQL Data Warehouse, Teradata v15, flat files, Excel files, Salesforce, Cognos Reporting, batch and Python scripting

Sammons Financial Group, West Des Moines, IA Apr 2016 - Apr 2018
ETL Informatica Developer

Responsibilities:
Analyzed business requirements, framed the business logic for the ETL process, and maintained the ETL process using Informatica PowerCenter.
Worked extensively on various transformations such as Normalizer, Expression, Union, Joiner, Filter, Aggregator, Router, Update Strategy, Lookup, Stored Procedure, and Sequence Generator.
Developed ETL Informatica mappings to load data into the staging area; extracted data from mainframe files, flat files, and SQL Server and loaded it into the Oracle 11g target database.
Created workflows and worklets for Informatica mappings.
Wrote stored procedures and functions for data transformations and integrated them with Informatica programs and the existing applications.
Worked on SQL overrides for the generated SQL queries in Informatica.
Involved in Unit testing for the validity of the data from different data sources.
Developed workflows for dimension loads, fact loads based on daily/monthly runs.
Developed code to archive monthly data into history tables and made effective use of the history tables to load data for a particular past month back into the system.
Developed audit tables to track ETL metrics for each individual run.
Experience working with the Audit Balance Control concept to create parameter files dynamically for each workflow before its run (see the sketch after this list).
Designed and developed PL/SQL packages, stored procedures, tables, views, indexes, and functions; experience dealing with partitioned tables and automating partition drop and create in the Oracle database.
Involved in migrating the ETL application from development environment to testing environment.
Perform data validation in the target tables using complex SQLs to make sure all the modules are integrated correctly.
Developed Informatica SCD Type I and Type II mappings, extensively using Informatica transformations including complex Lookups, Stored Procedures, Update Strategy, mapplets, and others.
Implemented update strategies, incremental loads, change data capture, and incremental aggregation.
Involved in performance tuning for a better data migration process.
Analyzed session log files to resolve mapping errors, identified bottlenecks, and tuned them for optimal performance.
Created UNIX shell scripts for Informatica pre/post-session operations to transfer files between servers and archive files.
Automated the jobs using the CA7 scheduler.
Documented and presented the production/support documents for the components developed when handing the application over to the production support team.
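
A minimal sketch of dynamically generating a PowerCenter parameter file before a workflow run, per the Audit Balance Control step above; the folder, workflow, and parameter names are hypothetical, and real values would come from the audit tables.

from datetime import date

# Values would normally be looked up from the audit/balance/control tables.
params = {"$$LOAD_DATE": date.today().isoformat(), "$$BATCH_ID": "1001"}

# PowerCenter parameter files use an INI-like [folder.WF:workflow] header.
with open("wf_daily_load.par", "w") as f:
    f.write("[FIN_DW.WF:wf_daily_load]\n")  # hypothetical folder/workflow
    for name, value in params.items():
        f.write(f"{name}={value}\n")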
Environment: Informatica PowerCenter 9.6, Informatica Power Exchange 9.6, CA7 Scheduler, Oracle 10g, SQL Server 2012, MS Visio, MS Project, UNIX /LINUX Shell Scripting, PERL, Putty, Toad, OBIEE.

Independent Health, Buffalo NY Jul 2013 - Mar 2016
ETL Informatica Developer/Data Analyst

Responsibilities:
Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems into the target database.
Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
Involved in extracting the data from the Flat Files and Relational databases into staging area.
Migrated mappings, sessions, and workflows from the Development environment to Test and then to UAT.
Developed Informatica Mappings and Reusable Transformations to facilitate timely Loading of Data of a star schema.
Created events and various tasks in the workflows using workflow manager.
Tuned ETL procedures to optimize load and query Performance.
Set up batches and sessions to schedule the loads at the required frequency using Informatica Workflow Manager and an external scheduler. Extensive data modeling experience using dimensional data modeling, star schema modeling, and fact and dimension tables.
Evaluated the mapplets and mappings as per Quality and Analysis standards before moving to production environment.
Wrote shell scripts for file transfers and file renaming, and several other database scripts to be executed from UNIX (see the sketch after this list).
Migrated Informatica Objects using Deployment groups.
Troubleshot issues in Test and Prod, performed impact analysis, and fixed the issues.
Developed Informatica mappings using Aggregators, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.
Created sessions, extracted data from various sources, transformed it according to requirements, and loaded it into the data warehouse.
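
The file-transfer and renaming scripts above were UNIX shell; this is an equivalent rendering kept in Python to match the other sketches, with hypothetical directory names and file patterns.

import shutil
from datetime import datetime
from pathlib import Path

# Archive inbound data files with a timestamp suffix (paths are assumed).
stamp = datetime.now().strftime("%Y%m%d%H%M%S")
for src in Path("/data/inbound").glob("*.dat"):
    target = Path("/data/archive") / f"{src.stem}_{stamp}{src.suffix}"
    shutil.move(str(src), str(target))  # move and rename in one step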
Environment: Informatica PowerCenter 9.6/9.1, Workflow Manager/Monitor, Erwin 4.0, Oracle 10g/9i, SQL, PL/SQL, TOAD, SQL*Loader, UNIX/LINUX shell scripting, DB2, Microsoft SQL Server 2012, XML, IDQ, OBIEE.