Faraz Mohammed
Informatica ETL / TDM / IDMC / B2B Developer
+1 (224) 216 - 2498
[email protected]
https://www.linkedin.com/in/faraz-md-pmp%C2%AE-0b6236b0/

Visa: Permanent Resident
Availability: Available to join in 15 days
Current Location: Chicago, Illinois (CST)

CAREER HIGHLIGHTS
1. Project Management Professional (PMP) certified, with over ten years of experience in data management, specializing in the implementation of the Informatica suite for data integration, governance, migration, and masking.
2. Skilled in designing and developing enterprise data warehouse systems and BI applications, utilizing Informatica PowerCenter, IDMC, B2B, SQL/PLSQL, AWS S3, and other leading ETL/BI tools.
3. Experienced across diverse fields including banking, insurance, and marine energy, with responsibilities encompassing development, deployment, testing, maintenance, and release management.
4. Proven expertise in leading both onsite and offshore teams, adept at overseeing project lifecycle activities from development to maintenance, ensuring high-quality delivery on time and within budget.
5. Innovated data processing and governance practices by transitioning from Perl to Informatica PowerCenter and pioneering the use of IDMC for enhanced PHI data protection, aligning with HIPAA standards.
6. Proficient in training and leading teams for new projects, emphasizing knowledge sharing, skill development, and the adoption of best practices to drive project success and team growth.

Informatica Administration:
7. Enhanced data quality and accuracy across multiple Informatica Data Management Cloud (IDMC) modules, including Metadata Command Center, Data Governance Catalog (DGC), and Data Quality, by refining regex and Spark SQL queries for precise data classification. Comprehensive knowledge of data cataloging and classification led to over 80% improvement in data segmentation accuracy and bolstered overall data analysis reliability.
8. Expertise in the installation and administration of ILM TDM 10.5, including data masking and subsetting.
9. Strong knowledge of Spark architecture and components; efficient in working with Spark Core, Spark SQL, Spark Structured Streaming, and Spark Streaming.
10. Performed data quality checks and validation during the ETL process using SQL (a minimal sketch follows this list).
11. Proactively monitored overall system health, executing daily health-check reports (job execution statistics, volume processed, execution times, I/O times, etc.) and taking appropriate actions to improve the overall health of the system.
12. Performed day-to-day administration activities: managing users, groups, privileges, roles, and permissions; folder creation and management; domain management tasks (including backup and restore); and configuring database connections.
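A minimal sketch of the SQL data quality checks described in item 10; the staging and warehouse tables (stg_member, dw_member) and their columns are hypothetical placeholders, not an actual client schema:

    -- Null check on a required key column after a load.
    SELECT COUNT(*) AS null_member_ids
      FROM stg_member
     WHERE member_id IS NULL;

    -- Duplicate check on the natural key.
    SELECT member_id, COUNT(*) AS dup_count
      FROM stg_member
     GROUP BY member_id
    HAVING COUNT(*) > 1;

    -- Source-to-target row count reconciliation (expect 0).
    SELECT (SELECT COUNT(*) FROM stg_member) -
           (SELECT COUNT(*) FROM dw_member) AS row_count_delta
      FROM dual;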

Informatica Product Expertise (IDMC, PowerCenter, TDM, B2B):
13. PowerCenter: Led the transition of data processing workflows from Perl scripts to Informatica PowerCenter, streamlining partner data integration, cleansing, and loading into databases. Innovated by incorporating Unix scripts/echo commands within PowerCenter workflow manager for enhanced logging and reporting in B2B console event logs.
14. IDMC (formerly IICS): Spearheaded the creation of data catalogs and the establishment of new data and metadata classification rules in IDMC. These initiatives focused on optimizing data governance for Priority Health member data, ensuring compliance with HIPAA guidelines through precise PHI identification and application of masking techniques.
15. TDM: Utilized Informatica TDM for sophisticated data profiling, creating rules that significantly improved data quality and governance. Led efforts in data masking, subsetting, and synthetic data generation, aligning with industry best practices and regulatory requirements.
16. B2B: Demonstrated proficiency in utilizing Informatica B2B for effective data exchange and transformation, facilitating seamless data flows and integration between business partners and internal systems.
17. Data Validation Option (DVO): Leveraged Informatica DVO to ensure the integrity of data after masking processes conducted via TDM. Developed and managed multiple table pairs to validate that data was masked accurately, ensuring its readiness for use in development and testing environments. This critical validation process maintained data integrity and compliance, enabling secure and efficient development workflows outside of production environments.
18. Expertise in designing, coding, configuring, developing, testing, deploying, and maintaining ETL programs supporting data transformations, using Informatica PowerCenter 10.5, PowerConnect, PowerPlug, PowerAnalyzer, PowerExchange, and OLAP/ROLAP/MOLAP/OLTP concepts.
19. Worked on the Informatica TDM/ILM Workbench for data masking, applying different masking techniques to sensitive data from systems such as Oracle, MS SQL Server, and DB2.

Database:
20. Extensive experience using Oracle 19c/11g/10g, MS SQL Server 2012/2008/2005, DB2, PL/SQL, SQL*Plus, SQL*Loader, and Developer 2000. Hands-on working knowledge of Oracle and PL/SQL, writing stored procedures, packages, functions, and triggers (a minimal sketch follows this list).
21. Crafted complex PL/SQL queries for diverse operational needs.
22. Worked with Oracle stored procedures and experienced in loading data into data warehouses/data marts using Informatica and SQL*Loader. Extensive expertise with error handling and batch processing.
23. Experience with TOAD, AQT, SQL Developer, and Data Studio tools to test, modify, and analyze data, create indexes, and compare data from different schemas.
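A minimal Oracle PL/SQL sketch of the stored-procedure and error-handling work in item 20; the procedure, tables, and columns are hypothetical placeholders:

    CREATE OR REPLACE PROCEDURE load_stage_errors (p_batch_id IN NUMBER) AS
      v_rows PLS_INTEGER := 0;
    BEGIN
      -- Move rejected records for the batch into a reusable error table.
      INSERT INTO etl_error_log (batch_id, record_key, error_msg, logged_at)
      SELECT p_batch_id, s.record_key, s.reject_reason, SYSTIMESTAMP
        FROM stage_rejects s
       WHERE s.batch_id = p_batch_id;

      v_rows := SQL%ROWCOUNT;
      COMMIT;
      DBMS_OUTPUT.PUT_LINE(v_rows || ' error rows logged for batch ' || p_batch_id);
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        RAISE_APPLICATION_ERROR(-20001, 'load_stage_errors failed: ' || SQLERRM);
    END load_stage_errors;
    /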


Scripting:
24. Experienced in shell scripting using environment variables and UNIX/Linux commands, as well as Perl scripts, PL/SQL procedures, JSON, and SharePoint.


EDUCATION

Bachelor of Engineering (ECE).
Master of Science Information Systems and Security.

CERTIFICATIONS

Project Management Professional (PMP) (PMP Number: 3234222).

TECHNICAL SKILLS

Job Function: Requirement Gathering, Documentation, Design, Analysis, Development, Testing, UAT Facilitation, On-Call Support, Project Management, Team Leadership, Solution Analysis
Operating Systems: UNIX/Linux, Windows 7/Vista/XP/2003/2000/95
Databases: Oracle 19c/11g/10g, MS SQL Server 2012/2008/2005, DB2, AWS S3 (cloud data warehousing and storage)
ETL: Informatica PowerCenter 10.5, Informatica Data Management Cloud (IDMC), Informatica TDM (Test Data Management), Informatica B2B, Data Validation Option (DVO)
Applications: MS Office Suite (Word, Excel, Access), SharePoint, Visio
Reporting Tools: Power BI, Tableau, Oracle APEX, WebFOCUS
Data Modeling Tools: ER Studio, ERwin 7, Informatica Data Profiler
Database Tools: TOAD, SQL Developer, Data Studio, SQL Server Management Studio
Version Control & Issue Tracking: Git, Tortoise SVN, Bitbucket, JIRA

Previous Clients & Roles:

Company Name | Location | Designation | Domain | From | To
YS Technologies / Priority Health | Michigan, US | Informatica ETL/TDM Developer | Healthcare | Sept 2022 | Till Date
Priority Health | Michigan, US | Informatica ETL/TDM Developer | Healthcare | March 2018 | Sept 2022
Kemper Insurance | Jacksonville, Florida | ETL Consultant | Insurance | Jan 2016 | March 2018
AIM Specialty Health | Deerfield, IL | Data Engineer | Healthcare | Dec 2014 | Dec 2015
Inforick Corp | New York, NY | SQL Developer/Data Analyst | Consulting | Dec 2012 | Dec 2013


Current Project: Priority Health Data Masking (MI, USA) Mar 2018 - Till Date
Priority Health - a healthcare provider of health insurance plans for Michigan individuals and families, small and large employers, and Medicare and Medicaid members.
Enterprise Test Data System: The project performs a masking assessment specific to in-scope enterprise systems and analyzes the data flows and data models of the production and non-production environments to identify the source of truth across enterprise systems, then performs subsetting and masking on all identified and loaded PHI elements.

Role: Informatica ETL/TDM Developer

1. Supported Informatica TDM 10.4.0 installation and configuration, including PowerCenter and Test Data Warehouse service configuration. Installed the DVO 10.4.0 tool, configured the Developer client, and performed ETL testing using DVO.
2. Performed data masking, subsetting, and synthetic data generation of FACETS, PHDB, and other healthcare application data per HIPAA guidelines. Completed installations from scratch, including preparing prerequisites and analyzing server configurations and database setup.
3. Profiled data patterns to create new rules and generate scorecards from Informatica Analyst; created data quality rules to perform standardization, parsing, consolidation, and match operations. Worked on EDI 834I/837P, keyword files, CSV files, XML files, NACHA files, and FACETS data.
4. Prepared the masking process plan for implementation; developed rules, data domains, policy packs, policies, and dictionary files using different masking techniques to deliver a consistent masking solution based on the business requirements, and monitored the performance of the masking process. Developed test data management strategies and plans to provide an optimal, reusable, and secure test data solution.
5. Defined the Test Data Management process for the initial implementation as well as ongoing support. Evaluated feasible solutions and implemented them using Informatica PowerCenter, TDM, and DVO.
6. Expertise in data subsetting, synthetic data creation, and data profiling; developed detailed solutions for database platforms such as SQL Server and Oracle.
7. Led the transition from Perl scripts to Informatica PowerCenter for processing and cleansing files from partners, enhancing data management efficiency. Utilized Unix scripts within PowerCenter workflows for detailed logging, ensuring transparency and traceability in B2B data transactions.
8. Implemented complex mappings in PowerCenter to replicate and improve upon existing Perl script functionality, resulting in streamlined processes and improved data integrity.
9. Analyzed the applications' existing database schemas and defined dictionaries for the masking process. Coordinated and led development of the TDM data masking POC solution, and defined the overall POC TDM data masking solution and architecture.
10. Led the establishment and management of comprehensive data catalogs within Informatica Data Management Cloud (IDMC), enhancing data organization, searchability, and governance across large-scale health data ecosystems.
11. Spearheaded the design and implementation of advanced data classification and profiling strategies using Spark SQL and regular expressions, achieving precise identification and categorization of Protected Health Information (PHI) and ensuring compliance with HIPAA regulations (see the Spark SQL sketch after this list).
12. Played a key role in automating the detection and classification of sensitive PHI fields, such as names, addresses, SSNs, and member IDs, streamlining data privacy measures and significantly reducing manual oversight.
13. Engineered robust data masking processes for production data in lower-tier environments (DEV, TEST, etc.), employing sophisticated techniques to obfuscate sensitive information prior to use, aligning with best practices for data security and regulatory compliance.
14. Collaborated effectively with data governance and IT security teams to establish and enforce PHI data handling policies, optimizing data profiling and masking efforts to mitigate the risk of data breaches and ensure legal and regulatory compliance.
15. Drove continuous improvement initiatives in data governance operations, monitoring the performance of data profiling and masking executions to identify and implement enhancements, increasing efficiency and accuracy in data management practices.
16. Worked on FACETS claims processing and performed selection, archiving, purging, and restoration of claims data using Cognizant TriZetto.
17. Configured and managed users, groups, and roles in PowerCenter; configured security domains and scheduled synchronization of users/groups with the LDAP directory service where an LDAP security domain is used (periodic rather than daily activities). Defined primary keys and entities, and applied masking rules, groups, templates, and policies either manually or through a profile.
18. Pioneered the creation of data cataloging and classification in IDMC, setting new standards for managing Priority Health member data in alignment with HIPAA guidelines.
19. Developed and applied new metadata classification rules, significantly enhancing PHI identification and protection through sophisticated profiling and masking techniques.
20. Involved in identifying PHI and PII fields and profiling them against data domains to validate the PHI fields. Prepared data assessments, data analysis, masking strategy, masking design, and test cases for post-masking validation.
21. Utilized WebFOCUS to design and execute complex ETL processes, ensuring efficient data flow and integration within Priority Health's data masking project, enhancing data accuracy and timeliness.
22. Developed comprehensive dashboards and reports using Power BI, providing actionable insights into data masking efficiency, compliance levels, and operational performance, contributing to strategic decision-making at Priority Health.
23. Copied test data from production, then modeled, subset, and masked it using the tool. Published test data to Development and Testing using project parameters. Copied data between Development and Testing and edited the data. Used the meta-model to validate deployment and test databases.
24. Managed gold copies with TDM: copied the gold-copy database onto the TDM server and made it available for cloning by testers. Strong experience on multiple-database integration projects.
25. Strong experience in TDM strategy, concepts, and solutions for data profiling, data masking, and data integration. Experienced in handling production data and multiple database environments such as DB2, SQL Server, and Oracle.
26. Created the new schemas required for the repositories and TDM services; configured new services, created users, and assigned them native and LDAP user accounts.
27. Managed deployments (labels, folders, deployment groups) and conflict resolution, and configured and managed different types of nodes while installing PowerCenter.
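A minimal Spark SQL sketch of the regex-based PHI classification described in item 11; the view name, columns, and patterns below are illustrative assumptions, not the project's actual classification rules:

    -- Classify profiled column samples by PHI pattern (Spark SQL).
    SELECT column_name,
           CASE
             WHEN sample_value RLIKE '^\\d{3}-\\d{2}-\\d{4}$'        THEN 'PHI: SSN'
             WHEN sample_value RLIKE '^[A-Z]{3}\\d{9}$'              THEN 'PHI: Member ID'
             WHEN sample_value RLIKE '^[\\w.+-]+@[\\w-]+\\.[\\w.]+$' THEN 'PHI: Email'
             ELSE 'Unclassified'
           END AS classification
      FROM profiled_columns;  -- hypothetical view of profiled column samples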

Previous Project: Kemper Insurance (Jacksonville, Florida) Jan 2016 - Mar 2018
Kemper Insurance - a leading U.S. insurer holding assets of $13 billion and serving customers through auto and life insurance products. It manages over 4.9 million policies via 23,700 agents and employs about 8,100 people.
Unified Data Enhancement Platform (UDEP): The project is to design and implement a comprehensive data integration and management platform using Informatica, aimed at consolidating disparate data sources into a unified data warehouse. The initiative involves developing Informatica workflows based on mapping specifications, guiding both onshore and offshore teams on ETL best practices, and ensuring data accuracy and integrity through rigorous testing and validation processes.

Role: ETL Consultant

1. Aided developers regarding the use of ETL development standards as necessary.
2. Developed Informatica workflows to collect data from multiple data sources based on mapping specifications provided to the developer.
3. Assisted the project team with unit testing. Developed system test plans and performed full system testing prior to implementation.
4. Provided guidance to onshore, offshore, and nearshore development service providers as necessary.
5. Worked closely with technical leads, analysts, and developers to design and implement Oracle solutions within a structured development process.
6. Assisted in continuous improvement efforts, enhancing project team methodology and performance.
7. Contributed to the design and development of the Informatica master data management components of the solution in support of both member and provider master indexes.
8. Modified existing software to correct errors, adapt it to new hardware, upgrade interfaces, and improve performance.
9. Worked as part of the team to deliver critical projects and work items.
10. Understood, translated, and created mappings using provided ETL specifications.
11. Identified problems, developed ideas, and proposed solutions in situations requiring analytical, evaluative, or constructive thinking.
12. Designed, developed, and modified Informatica mappings and workflows.
13. Demonstrated ability to translate functional/high-level designs into detailed technical designs.
14. Strong knowledge of database SQL and stored procedures.
15. Strong analytical and debugging skills, with quick turnaround on reported issues.
16. Worked with various tasks such as Session, Email, and Command within workflows.
17. Developed various incremental-load mappings along with SCD Type 1 and SCD Type 2 strategies (a minimal SQL sketch follows this list).
18. Worked on various performance tuning methods such as pushdown optimization and Informatica partitioning.
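A minimal Oracle SQL sketch of the SCD Type 2 strategy in item 17; the dimension and staging tables (dim_member, stg_member) and their columns are hypothetical placeholders:

    -- Step 1: close out the current version of any row whose tracked attributes changed.
    UPDATE dim_member d
       SET d.end_date = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_member s
                    WHERE s.member_id = d.member_id
                      AND (s.address <> d.address OR s.plan_code <> d.plan_code));

    -- Step 2: insert a new current version for changed or brand-new members.
    INSERT INTO dim_member (member_id, address, plan_code, eff_date, end_date, current_flag)
    SELECT s.member_id, s.address, s.plan_code,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_member s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_member d
                        WHERE d.member_id = s.member_id
                          AND d.current_flag = 'Y');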

Previous Project: AIM Specialty Health (Deerfield, IL) Dec 2014 - Dec 2015

AIM Specialty Health, a pioneer in health management, emphasizes optimizing healthcare outcomes through innovative data solutions. Leveraging advanced data engineering techniques, the focus was on streamlining data flow and enhancing decision-making processes within the healthcare industry.

Integrated Health Data System (IHDS): The project aimed to revolutionize health management by integrating requirement analysis, technical mapping, and automated business processes. Utilizing Infor Process Automation and sophisticated data modeling, IHDS enhanced data accuracy and efficiency, ensuring robust health management solutions.

Role: Data Engineer

1. Designed and created detailed technical mapping documents with information on the implementation of business logic.
2. Gathered requirements from the business unit and end users of the proposed product.
3. Designed and built automated business processes using the Infor Process Automation (IPA) framework.
4. Performed IPA administration and technical configuration management of core Infor Lawson and third-party products.
5. Troubleshot Infor Lawson problems and worked within Infor/Lawson and ERP team standards and guidelines.
6. Used ERwin to design physical and logical data models.
7. Designed ETL mappings for CDC (change data capture).
8. Worked on various transformations such as Filter, Router, Sequence Generator, Lookup, Update Strategy, Joiner, Source Qualifier, Expression, Sorter, and Aggregator.
9. Used mapping variables, mapping parameters, and parameter files for capturing delta loads.
10. Worked on slowly changing dimensions, Type 1 and Type 2.
11. Performance-tuned the process at the mapping, session, source, and target levels.
12. Worked with the project team to formulate and implement a flexible system design that met functional requirements.
13. Worked with various tasks such as Session, Email, and Command within workflows.
14. Worked with the Informatica Scheduler for scheduling the delta loads and master loads.
15. Worked extensively with aggregate functions such as MIN, MAX, FIRST, LAST, and COUNT in the Aggregator transformation.
16. Extensively used SQL Override, Sorter, and Filter in the Source Qualifier transformation (a minimal SQL sketch follows this list).
17. Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner transformation.
18. Worked with various reusable tasks, workflows, Mapplets, and reusable transformations.
19. Employed various lookup caches: Static, Dynamic, Persistent, and Shared.
20. Used session logs, the Informatica Debugger, and performance logs for error handling when workflows and sessions failed.
21. Responsible for promoting mappings, workflows, and other Informatica objects (sources, targets, transformations) between environments such as DEV, TEST, and PRD.
22. Extensively used the Informatica Scheduler to test scheduling in the development environment.
23. Worked with a third party to automate job processing using the Autosys scheduler, establishing automatic email notifications to the concerned persons by creating email tasks in Workflow Manager.
24. Worked extensively with versioning of objects in Informatica using check-in and check-out options.
25. Worked extensively with the business intelligence team to incorporate changes they needed in the delivered files.
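A minimal sketch of a Source Qualifier SQL Override of the kind described in item 16, in Oracle syntax; the tables and columns are hypothetical, and $$LAST_RUN_DATE stands in for an Informatica mapping parameter used for delta loads:

    -- Override the generated SQL to join, filter, pre-aggregate, and sort in the database.
    SELECT c.claim_id,
           c.member_id,
           MIN(l.service_date) AS first_service_date,
           MAX(l.service_date) AS last_service_date,
           COUNT(*)            AS line_count
      FROM claims c
      JOIN claim_lines l
        ON l.claim_id = c.claim_id
     WHERE c.load_date >= TO_DATE('$$LAST_RUN_DATE', 'YYYY-MM-DD')  -- delta-load filter
     GROUP BY c.claim_id, c.member_id   -- pre-aggregating reduces rows sent to the session
     ORDER BY c.claim_id                -- sorted output for downstream sorted joins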

Previous Project: Inforick Corp (New York, NY) Dec 2012 - Dec 2013
Inforick Corp - a rapidly expanding IT management company located in Staten Island, New York, known for its prowess in consulting, technology, web development, and outsourcing services. The company strives to deliver high-quality, value-added solutions across industries such as telecom, financial services, and healthcare, aiming to establish strong relationships with Fortune 500 clients.
Dynamic Data Analysis System (DDAS): As a SQL Developer/Data Analyst at Inforick Corp, the focus was on optimizing data utilization and analysis for client projects. The role encompassed data trend analysis, development of SQL queries for insightful data extraction, and ensuring data integrity. Through comprehensive data management practices, the project DDAS aimed to enhance decision-making and strategic planning for clients, leveraging extensive SQL and data analysis expertise to provide actionable insights and support client objectives effectively.

Role: SQL Developer/Data Analyst


1. Analyzed data trends and developed SQL queries for data extraction.
2. Ensured data integrity across databases and systems.
3. Collaborated with teams to gather requirements and translate them into technical specifications.
4. Designed and implemented database schemas and structures.
5. Optimized queries and database performance (a minimal tuning sketch follows this list).
6. Created reports and visualizations for data analysis.
7. Managed data migrations and integrations.
8. Developed and maintained documentation for database systems.
9. Troubleshot and resolved database issues.
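A minimal sketch of the query-tuning workflow in item 5, using Oracle EXPLAIN PLAN; the table, bind variable, and index names are hypothetical:

    -- Inspect the execution plan for a slow query.
    EXPLAIN PLAN FOR
    SELECT order_id, order_total
      FROM orders
     WHERE customer_id = :cust_id
       AND order_date >= ADD_MONTHS(SYSDATE, -12);

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- If the plan shows a full table scan on a large table, a composite index
    -- matching the predicate columns typically converts it to a range scan.
    CREATE INDEX ix_orders_cust_date ON orders (customer_id, order_date);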

Summary of duties performed over the last 10 years:

1. Designed, coded, configured, tested, debugged, deployed, documented, and maintained programs.
2. Gathered business requirements and translated them into detailed technical specifications from which programs were written or configured.
3. Led teams as an onshore/offshore coordinator; involved in effort estimation, assigned tasks to team members, provided knowledge transfer to new members, and peer-reviewed code.
4. Validated that proposed applications aligned with architectural design and business needs.
5. Performed installation and upgrade support of Informatica PowerCenter and DVO products, including troubleshooting, issue analysis, coding, testing, implementing software enhancements, and applying patches.
6. Developed complex mappings and workflows; created connections, folders, and deployment groups, and assigned user privileges. Performed installation, support, monitoring, debugging, and fixes.
7. Performance-tuned long-running jobs; scheduled and monitored Informatica jobs using MJS and the monitor.
8. Worked on different sources in Informatica: flat files, COBOL, XML, EDI, HL7, and CSV files.
9. Developed strategies such as CDC (Change Data Capture), batch processing, auditing, and recovery.
10. Involved in data model changes; brought new ideas to the data model design using the ERwin tool, created new tables, and added new columns.
11. Performed data masking, match and merge, data cleansing, data profiling, and data loading (a minimal post-masking validation sketch follows this list).
12. Led the strategic initiative to transition a critical B2B integration process from a Perl-based script to a robust Informatica PowerCenter solution, enhancing efficiency and scalability for handling EYEMED partner files.
13. Meticulously analyzed and deconstructed the existing Perl script functionality, ensuring a comprehensive understanding of business logic, data transformations, and end-to-end processing requirements.
14. Designed and implemented complex Informatica PowerCenter mappings using a variety of transformations, including Expression, Aggregator, Filter, SQL, Router, Source Qualifier, Stored Procedure, and Lookup, to replicate the precise functionality of the original Perl script.
15. Used the Normalizer transformation heavily to convert flat-file data into RDBMS structures.
16. Used Mapplets, Parameters, and Variables to facilitate the reusability of code.
17. Updated the B2B profile and its parameters within the Informatica environment to seamlessly integrate the new PowerCenter workflow, maintaining uninterrupted and accurate file processing from EYEMED.
18. Spearheaded the development and testing phases within the DEV environment, applying rigorous validation techniques to ensure the new solution met all functional and performance criteria.
19. Managed the code migration process through to the TEST environment, followed by the creation and execution of Change Requests (CR) for deployment to production, ensuring a smooth transition and minimizing operational risk.
20. Collaborated closely with cross-functional teams, including IT, business stakeholders, and external partners, to communicate project progress, resolve challenges, and align on outcomes, contributing to the successful project delivery and enhanced partner integration capabilities.
21. Led the initiative to secure the storage and handling of Electronic Data Interchange (EDI) formatted Electronic Funds Transfer (EFT) files on AWS S3, ensuring compliance with NACHA regulations for the secure transfer of financial data to partner banks such as PNC and Fifth Third.
22. Designed and implemented secure AWS S3 bucket configurations to store sensitive Electronic Funds Transfer (EFT) files, ensuring robust data protection and compliance with NACHA standards for the health insurance sector.
23. Developed and maintained JSON-based security policies for S3 buckets post-creation, significantly enhancing data security and access controls while ensuring alignment with industry best practices and compliance requirements.
24. Leveraged expertise in AWS IAM (Identity and Access Management) to fine-tune permission sets and minimize the risk of unauthorized access, achieving a high level of security for sensitive financial transactions.
25. Worked on different tools and technologies: Informatica PowerCenter 9.x to 10.5.1, IDMC, TDM, B2B and DVO, Oracle 19c, SQL Server, DB2, Quality Center, ClearCase, Bitbucket, Jira, Unix/Linux, and Oracle APEX.
26. Effectively communicated with business stakeholders and technical leadership groups and prepared technical design specification documents.
27. Coordinated with vendors and product teams, including Informatica, Oracle, and TriZetto support, for patches, tool limitations, and issues.
28. Generated and automated business reports using Tableau and Oracle APEX.
29. Actively engaged in all phases of the Agile and SAFe framework, demonstrating a deep understanding of its methodologies through consistent participation in Program Increment (PI) planning, iteration planning, Inspect & Adapt (I/A) workshops, and retrospective meetings.
30. Contributed to the successful execution of PI planning sessions, facilitating cross-functional team collaboration to align on objectives, identify dependencies, and establish clear roadmaps for upcoming program increments.
31. Played a key role in Iteration Planning meetings by defining iteration goals, breaking down features into user stories, and ensuring a balanced workload distribution across the team to maximize efficiency and deliver on sprint commitments.
32. Led and contributed to engaging and productive retrospective meetings, fostering an environment of continuous improvement by identifying successes, challenges, and actionable insights to enhance team performance and workflow in future iterations.
33. Actively participated in Inspect & Adapt (I/A) workshops, leveraging critical analysis and feedback to refine processes, address impediments, and implement strategic improvements to both the product and the working environment.
34. Demonstrated a strong commitment to Agile principles and SAFe practices, embodying flexibility, responsiveness to change, and a continuous learning mindset, contributing to the overall agility and success of the team and organization.
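A minimal Oracle SQL sketch of post-masking validation of the kind performed with DVO table pairs (item 11 above); the database links, tables, and columns are hypothetical placeholders:

    -- 1) Row counts must match between the source and the masked copy.
    SELECT (SELECT COUNT(*) FROM member@prod_src) AS src_rows,
           (SELECT COUNT(*) FROM member@test_tgt) AS tgt_rows
      FROM dual;

    -- 2) Keys must still join while PHI columns must differ after masking.
    SELECT COUNT(*) AS unmasked_rows   -- expect 0 if masking succeeded
      FROM member@prod_src s
      JOIN member@test_tgt t
        ON t.member_id = s.member_id
     WHERE t.ssn = s.ssn
        OR t.last_name = s.last_name;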