Location: Dallas, Texas, USA
Murali Krishna
ETL Informatica/Informatica Cloud Developer (IICS)


PROFESSIONAL SUMMARY
14 years of IT experience in the design, development, and implementation of data warehouses and their applications.
Experience in enterprise data integration, requirement analysis, scoping, development, debugging, testing, and documentation across the phases of the project life cycle for client/server applications.
10+ years of industry experience in data warehousing with Informatica PowerCenter 8.x, 9.x & 10.x.
5+ years of implementation experience in Informatica Intelligent Cloud Services (CAI/CDI).
Proficient in using Data Synchronization tasks, Replication tasks, Masking tasks, Mass Ingestion tasks, and taskflows in IICS/IDMC.
Developed SOAP-, REST-, and event-based APIs to integrate JDE, Oracle ERP, and banking systems.
Created Intelligent Structure Models from Excel, CSV, PDF, and TXT files.
Installed the Secure Agent and created connections for flat files, Oracle, SQL Server, Salesforce, Workday, Snowflake, Advanced FTP/SFTP, and Google Sheets.
Extensively worked on extracting, transforming, and loading data from various sources such as SQL Server, Oracle 9i/10g, DB2, Teradata, Salesforce, XML files, AWS S3, and Redshift.
Extracted the data from JSON/XML files using Hierarchy Parser Transformation in Informatica Cloud Mapping Designer.
Proficient in using Informatica Designer, Workflow manager, Workflow Monitor, Repository Manager to create, schedule and control workflows, tasks, and sessions.
Extensive experience in performance tuning of existing workflows using pushdown optimization, session partitioning, etc.
Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Union, Rank, Normalizer, Update Strategy, and Router transformations to populate target tables efficiently.
Extensive experience using the Informatica Debugger utility to find errors in mappings and make the changes needed to produce the required results.
Developed ETL audits and controls to ensure data quality meets or exceeds the standards and thresholds defined by the business customer.
Experience in identifying, researching, and resolving ETL issues and producing root cause analysis documentation.
Well versed in writing UNIX shell scripts for running Informatica workflows via pmcmd, file manipulation, housekeeping functions, and FTP transfers (see the sketch after this summary).
Extensive working experience with Teradata utilities such as TPump, BTEQ, FastExport, FastLoad, and MultiLoad.
Well versed in developing complex SQL queries with unions and multi-table joins; experienced with views, procedures, triggers, functions, and PL/SQL scripts.
Experienced in building ETL for relational and dimensional databases, with a good understanding of data warehouse best practices and methodologies (Kimball & Inmon) and logical/physical modeling.
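Illustrative sketch of the kind of pmcmd wrapper script referenced above; the integration service, domain, folder, workflow, and log path names are placeholders, not actual project values.

    #!/bin/bash
    # Hypothetical names: replace the service, domain, folder, and workflow with real values.
    INFA_USER="etl_user"
    INFA_PWD="********"

    # Start the workflow and wait for it to finish; pmcmd returns a non-zero exit code on failure.
    pmcmd startworkflow -sv INT_SVC_DEV -d Domain_Dev -u "$INFA_USER" -p "$INFA_PWD" \
      -f FINANCE -wait wf_daily_sales_load

    if [ $? -ne 0 ]; then
      echo "wf_daily_sales_load failed at $(date)" >> /var/log/etl/wf_daily_sales_load.log
      exit 1
    fi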

TECHNICAL SKILLS:
Data Warehousing / ETL: Informatica PowerCenter 10.x/9.x/8.x/7.x, DW Designer, Mapping Designer, Workflow Manager, Metadata Reporter, Workflow Monitor, Mapplets, Transformations, SSIS, Informatica BDM 10.2.1, Informatica Intelligent Cloud Services (IICS/IDMC)
Data Modeling: Dimensional data modeling (star schema, snowflake schema, facts, dimensions), physical and logical data modeling, entities, attributes, cardinality, ER diagrams, ERwin 4.5/4.0, DB Designer
Reporting & BI: SQL Server Reporting Services, Tableau, Power BI
Job Scheduling: Autosys, Control-M, CA Workstation, cron jobs
Programming: SQL, PL/SQL, Transact-SQL, UNIX shell scripting, Python, Java, JSP, HTML, C#, C++
System Design & Development: Requirements gathering and analysis, data analysis, ETL design, development and testing, UAT, implementation
Databases: Redshift, Athena, Oracle 10g/9i/8i, MS SQL Server 2000/2005/2008, Teradata, Hive, Impala, MS Access, DB2, Snowflake
Environment: RHEL (Red Hat Enterprise Linux) 4.0, UNIX (Sun Solaris 2.7/2.6, HP-UX 10.20/9.0, IBM AIX 4.3/4.2), Windows 10/7/2003/2000/XP/98, Windows NT, Linux, MS-DOS

PROJECT EXPERIENCE:
NJR, Dallas, Texas Dec 2020 - Present
Sr. Informatica (IDMC) Cloud Developer (CDI/CAI)
Roles and Responsibilities
Extensive experience developing all CDI components, such as mappings, taskflows, mass ingestion tasks, and synchronization tasks.
Created process objects, app connections, and service connectors to invoke existing system functionality and operations.
Developed APIs (REST, SOAP, Bulk, etc.) using CAI in Informatica Cloud.
Provided production support for business-critical applications, including debugging, issue analysis, resolution, and reprocessing of failed integrations.
Performed IICS Secure Agent server health checks, upgrades, and maintenance during major releases.
Deployed and published processes on both the Secure Agent and the cloud.
Experienced in analyzing request and response structures using tools such as Postman and SoapUI; extracted data from various APIs in CDI and CAI processes (see the curl sketch after this list).
Created Swagger files for the APIs, along with REST V2 connections and business processes for them.
Performed real-time and near-real-time data integration using Informatica IICS for financial report analysis.
Created dynamic ETL taskflows in CDI to sync tables from the source system to the reporting DB using dynamic mappings.
Worked on the PowerCenter to IICS migration; designed and developed a new ETL framework for IICS in CDI.
Mentored junior developers and testers in the development and test phases of the projects.
Debugged custom BPEL processes using the advanced view in the Application Integration console.
Experienced in handling exceptions raised by sub-processes.
Parameterized URLs, header information, and other connection parameters in service connectors to make deployments to higher environments error-free.
Installed Secure Agent groups and created connections for flat files, Oracle, SQL Server, Salesforce, Workday, Snowflake, Advanced FTP/SFTP, and Google Sheets.
Well versed in performance tuning of slow-running jobs using partitioning, pushdown optimization, etc.
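A minimal curl sketch of how a published CAI REST endpoint can be smoke-tested outside Postman/SoapUI; the pod host, org ID, process name, credentials, and payload are hypothetical placeholders, not actual project values.

    # Hypothetical endpoint and payload; substitute real pod, org ID, and process name.
    curl -s -X POST \
      -u "$IICS_USER:$IICS_PWD" \
      -H "Content-Type: application/json" \
      -d '{"invoiceId": "INV-1001", "action": "reprocess"}' \
      "https://<pod>.ai.dm-us.informaticacloud.com/active-bpel/public/rt/<orgId>/InvoiceReprocess"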
Environment: Informatica Cloud (IICS/IDMC), Oracle Cloud, Google Sheets, UNIX shell scripts, Informatica PowerCenter, Snowflake, Oracle ERP, bash scripts, Control-M, etc.

WEX Inc., Portland, Maine June 2017 - Dec 2020
Sr. Informatica Cloud Developer (IICS/PC)
Responsibilities:
Extracted data from Workday using the Informatica Cloud connector to build a data lake.
Created connections in IICS to read data from flat files, Oracle, SQL Server, Salesforce, Workday, and Snowflake, and loaded data in CSV, Parquet, Avro, and ORC formats to an AWS S3 bucket.
Extensively used mass ingestion, synchronization, and replication tasks to load structured and semi-structured data into the data lake for data analytics.
Developed data integration processes by constructing mappings, tasks, taskflows, schedules, and parameter files.
Extracted data from Workday custom reports by creating web services API calls.
Generated Swagger files in IICS, extracted data from Workday custom reports, and loaded it to the S3 location.
Created Glue crawlers and ETL jobs to migrate relational databases to Amazon S3 using AWS Glue (see the AWS CLI sketch after this list).
Loaded data into the formatted zone in AWS S3 with the ATLAS engine using full-load, incremental, and full-compare methods.
Increased query performance by leveraging S3 and Redshift partitioning, and created physical tables for faster data retrieval for visualization purposes.
Exposed dimension and fact views through JBoss Data Virtualization to downstream business intelligence platforms such as BusinessObjects and Tableau.
Served as deployment contact for the data integration track; created milestones for deployment events and coordinated to ensure smooth implementation.
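A sketch, with placeholder crawler, role, database, and S3 path names, of how a Glue crawler of the kind mentioned above can be created and run from the AWS CLI.

    # Placeholder names: crawler, IAM role, catalog database, and S3 path are illustrative only.
    aws glue create-crawler \
      --name workday_landing_crawler \
      --role GlueServiceRole \
      --database-name datalake_raw \
      --targets '{"S3Targets":[{"Path":"s3://example-datalake/landing/workday/"}]}'

    # Run the crawler to catalog the landing zone for Athena/Redshift Spectrum queries.
    aws glue start-crawler --name workday_landing_crawler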
Environment: Informatica Cloud (IICS/IDMC), Oracle 12.5, Google Sheets, DBeaver, Workday, UNIX shell scripts, Informatica Scheduler, AWS S3, AWS Redshift, Redshift Spectrum, VDM, AWS Glue crawlers, etc.

Manhattan Associates, Atlanta, GA March 2013 - May 2017
Sr. Informatica Developer
Responsibilities:
Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.
Created connections in IICS to read data from flat files, Oracle, SQL Server, Salesforce, Workday, Snowflake, etc.
Involved in hybrid integrations of ServiceNow, Salesforce, and Workday with the Hadoop ecosystem using IICS.
Created warehouses, databases, tables, file stages, etc. in the Snowflake cloud data warehouse (see the SnowSQL sketch after this list).
Ingested files in CSV, JSON, and Parquet formats into the Snowflake data warehouse.
Used Filter, Sorter, Expression, Aggregator, Joiner, Router, Normalizer, Union, Lookup, and other transformations to convert complex business logic into ETL code.
Experience with iPaaS integration technologies and complete implementation experience of Informatica Cloud Application and Data Integration with the Salesforce platform and other major cloud-based platforms.
Experience with Oracle SQL and PL/SQL programming and database utilities such as TOAD and SQL Navigator.
Designed workflows with many sessions using Decision, Assignment, Event Wait, and Event Raise tasks, and used the Informatica scheduler to schedule jobs.
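A minimal SnowSQL sketch of staging and loading a file into Snowflake, as described above; the connection profile, stage, table, and file names are placeholders.

    # Placeholder connection profile, stage, table, and file path.
    snowsql -c dev_conn -q "
      PUT file:///data/extracts/orders_20170301.csv @ods.public.orders_stage AUTO_COMPRESS=TRUE;
      COPY INTO ods.public.orders
        FROM @ods.public.orders_stage/orders_20170301.csv.gz
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
    "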
Environment: PowerCenter 10.2.1, Oracle 12.5, flat files, SQL, SQL Developer, Salesforce, UNIX shell scripts, Informatica Scheduler, Java, JSP, UrbanCode, Jenkins, SourceTree.

Nationwide, Columbus, OH November 2010 - March 2013
ETL Informatica Developer
Responsibilities:
Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.
Worked with data architects to identify automation opportunities in areas such as data migration from legacy systems.
Worked with subject matter experts and the project team to identify, define, collate, document, and communicate data migration requirements.
Developed RTC stories and documentation by translating business requirements into system requirements for an Agile global delivery model.
Partnered with DBAs to transform logical data models into physical database designs while optimizing the performance and maintainability of the physical database.
Created complex mappings involving slowly changing dimensions and implemented business logic using Informatica PowerCenter.
Used Filter, Sorter, Expression, Aggregator, Joiner, Router, Normalizer, Union, Lookup, and other transformations to convert complex business logic into ETL code.
Designed workflows with many sessions using Decision, Assignment, Event Wait, and Event Raise tasks, and used the Informatica scheduler to schedule jobs.
Practical experience with query tools such as Teradata SQL Assistant.
Extensive working experience with Teradata utilities such as TPump, BTEQ, FastExport, FastLoad, and MultiLoad (see the BTEQ sketch after this list).
Involved in performance analysis and SQL query tuning using EXPLAIN plans, COLLECT STATISTICS, and Teradata Viewpoint.
Involved in automating daily tasks using SQL, PL/SQL, and UNIX shell scripting so they complete reliably without failures.
Performed performance tuning at the source, target, mapping, and session levels.
Examples include query tuning, Informatica partitioning, redesigning mappings to remove bottlenecks, and caching warehouse and mart data before processing the actual source file data.
Involved in performance tuning of Informatica mappings, stored procedures, and SQL queries.
Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the Informatica Mappings.
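A BTEQ sketch of the statistics collection and row-count checks described above; the TDPID, credentials, and table names are placeholders.

    # Placeholder TDPID, credentials, and table names.
    bteq <<EOF
    .LOGON tdprod/etl_user,********
    COLLECT STATISTICS ON edw.claim_fact COLUMN (claim_id);
    SELECT COUNT(*) FROM edw.claim_fact WHERE load_dt = CURRENT_DATE;
    .LOGOFF
    .QUIT
    EOF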
Environment: Informatica Power Center 10.1/9.6 (Informatica Server, Informatica Repository Server, Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Oracle 10g, Teradata, PL/SQL, Flat files, XML Files, ERwin 4.0, TOAD 12.8

Lam Research, Fremont, CA March 2009 - November 2010
Informatica Developer
Responsibilities:
Involved in setting up Hadoop configurations (Hadoop cluster, Hive, Spark, and Blaze connections) using Informatica BDM.
Created and deployed BDM applications and ran the applications, workflows, and mappings.
Used ReactJS to create controllers to handle events triggered by the client and send requests to the server.
Worked with React for component development, Redux state management, and React Router for single-page applications.
Designed and developed BDM mappings in Hive mode to extract large volumes of data from the DWH to the data lake.
Involved in designing Logical/Physical Data Model for IDQ custom metadata.
Configured sessions and workflows for recovery and high availability.
Developed BDM mappings using Informatica Developer and created HDFS files in the Hadoop system.
Implemented SCD Type 2 mappings using BDE and loaded the data into Hadoop Hive tables using pushdown mode.
Involved in designing the physical data model for Hadoop Hive tables.
Wrote HiveQL queries to validate HDFS files and Hive table data against requirements, and developed Hive and Impala tables and queries (see the Hive sketch after this list).
Created complex mappings involving slowly changing dimensions and implemented business logic using Informatica PowerCenter.
Used Filter, Sorter, Expression, Aggregator, Joiner, Router, Normalizer, Union, Lookup, and other transformations to convert complex business logic into ETL code.
Designed workflows with many sessions using Decision, Assignment, Event Wait, and Event Raise tasks, and used the Informatica scheduler to schedule jobs.
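A sketch of the kind of HiveQL reconciliation query used to validate loads, run here through the Hive CLI; the database, table, and load-date values are placeholders.

    # Placeholder database, table, and load date; in practice the date comes from the scheduler.
    LOAD_DT="2010-06-30"
    hive -e "
      SELECT load_dt, COUNT(*) AS row_cnt
      FROM datalake.sales_orders
      WHERE load_dt = '${LOAD_DT}'
      GROUP BY load_dt;
    "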
Environment: Informatica BDM 10.2.1, PowerCenter 10.2.1, Hadoop 2.7.1, Hive 2.3.2, Oracle 12c, flat files, SQL, Omniture, SQL Developer, Windows XP, UNIX shell scripts, cron jobs.