
Mallikarjn - ETL/Informatica Developer
[email protected]
Location: Houston, Texas, USA
Relocation: Yes
Visa: H-1B
Career Goals
To build a career with an esteemed organization of committed and dedicated people, where I can fully explore and realize my potential. Adept at prioritizing, tracking, and completing tasks to accomplish project goals. History of mining, warehousing, and analyzing data at the company-wide level. Knowledgeable about the principles and implementation of machine learning and deep learning. Results-oriented and proactive, with top-notch skills in project management and communication.

Summary
10+ years of professional experience in various phases of projects, plus 2 years in academics completing a Master of Science in Computer Science.
9+ years of experience developing, enhancing, and maintaining applications in industry verticals such as Retail, Banking/Financial, and Oil & Gas, on cloud and on-prem platforms: Informatica Intelligent Data Management Cloud (IDMC) and Informatica PowerCenter.
9 years of experience in all phases of the data warehouse life cycle, including requirements gathering/analysis, design, development, validation, and testing of data warehouses using ETL tools and various databases.
3 years of team lead experience managing work for onsite/offshore teams.
Experience in development and admin activities with the cloud-based ETL tool Informatica Intelligent Cloud Services (IICS), AWS Glue, and the reporting tool Tableau.
In-depth knowledge of Snowflake architecture: warehouses, databases, schemas, tables, and views. Worked with features such as Time Travel and Snowpipe (see the sketch after this summary).
Hands-on experience writing and tuning SQL queries against Snowflake.
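
A minimal sketch of a Snowflake Time Travel query issued from Python with the snowflake-connector-python package; the account, credentials, warehouse, and table names below are placeholders, not values from the projects described here.

    import snowflake.connector

    # Connect with placeholder credentials (hypothetical values).
    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="...",
        warehouse="ETL_WH",
        database="EDW",
        schema="STAGE",
    )
    try:
        cur = conn.cursor()
        # Time Travel: read CUSTOMER_DIM as it existed two hours ago.
        cur.execute(
            "SELECT COUNT(*) FROM CUSTOMER_DIM AT(OFFSET => -60 * 60 * 2)"
        )
        print(cur.fetchone()[0])
    finally:
        conn.close()
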
Skills
ETL Tools: Informatica PowerCenter 10.x/9.x/8.x, IDMC, AWS Glue.
Big Data Tools: Hive, Hue 2.6.1-2.
Databases: MS SQL Server, Oracle (11g/10g), IBM Netezza, Snowflake, MongoDB. Database Client: Toad.
Source Code Editor: Visual Studio 2022.
Scheduling Tools: Control-M, Autosys, and Tidal.
CI/CD Tools: GitHub, Jenkins, Team Foundation Server (TFS), XL Deploy.
Programming Languages: Python, SQL, PL/SQL, Shell Scripting
File Transfer Tools: FTP, PuTTY, WinSCP.
Operating Systems: Linux, Windows Family.
Reporting Tool: Tableau.

Experience

Charles Schwab | Remote / Denver | CO.
Data Engineer / Sr. Informatica Cloud Developer 05/2022 - Present.

Project: State Street Modernization. The project reinvests savings from the Fund Admin, Accounting, and Custody vendor to modernize the acquisition, cleansing, and consumption of data into applications and business reports.
Worked with architects to onboard new domain requirements and the mechanisms to stub records and resolve securities for mass ingestion with IICS.
Implemented end-to-end data flow from source files to DW tables, including creating XML files and loading data into staging, dimension, and fact tables using IICS.
Used Visual Studio as the source code editor to create and modify database objects, including tables, views, stored procedures, functions, and UDTs.
Developed shell/Python scripts to handle incremental loads (the high-watermark pattern is sketched at the end of this section).
Parsed IICS workflows down to a grain of one session per workflow using Python scripts and manipulated the workflow definitions programmatically.
Loaded data via IICS jobs and stored procedures; extensively created SPs to batch-load data and process it into ODS tables.
Implemented regular expressions in Snowflake for pattern matching and data extraction tasks.
Developed and implemented Snowflake scripting solutions to automate critical data pipelines, ETL processes, and data transformations.
Developed Cloud integration parameterized mapping templates (DB and table object parameterization) for Stage, Dimension (SCD Type 1, SCD Type 2, CDC, and incremental load), and Fact load processes using IICS.
Created IICS connections using various cloud connectors in IICS administrator.
Created unit test cases and Techcerts to perform unit testing with multiple test conditions covering all scenarios.
Provided production support for business-critical applications: debugging, issue analysis, resolution, and reprocessing of failed integrations.
Used Bitbucket and Bamboo CI/CD tools to deploy code to higher environments.
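
A minimal sketch of the high-watermark pattern behind those incremental-load scripts. It uses the stdlib sqlite3 module so it runs standalone; in the actual project the connections were to the warehouse databases, and the table and column names here are hypothetical.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    cur.executescript("""
        CREATE TABLE src_orders (order_id INT, updated_at TEXT);
        CREATE TABLE stg_orders (order_id INT, updated_at TEXT);
        CREATE TABLE etl_control (table_name TEXT, watermark TEXT);
        INSERT INTO etl_control VALUES ('orders', '2022-01-01 00:00:00');
        INSERT INTO src_orders VALUES (1, '2021-12-31 23:00:00');
        INSERT INTO src_orders VALUES (2, '2022-01-02 08:30:00');
    """)

    # 1. Read the last successful watermark for this feed.
    cur.execute("SELECT watermark FROM etl_control WHERE table_name='orders'")
    watermark = cur.fetchone()[0]

    # 2. Pull only the rows changed since the watermark into staging.
    cur.execute("INSERT INTO stg_orders SELECT order_id, updated_at "
                "FROM src_orders WHERE updated_at > ?", (watermark,))

    # 3. Advance the watermark to the newest timestamp just loaded.
    cur.execute("UPDATE etl_control SET watermark = "
                "(SELECT MAX(updated_at) FROM stg_orders) "
                "WHERE table_name = 'orders'")
    conn.commit()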

National Oilwell Varco | Houston | Texas.
ETL Lead / Sr. Informatica Cloud Developer 09/2019 - 04/2022.

Project: SQL Server Conversion and Oracle Data Migration.
Involved in all phases of the SDLC: requirements gathering, design, development, testing, production rollout, user training, and support for the production environment.
Led an onsite/offshore team, coordinated the project work with the client, and tracked and reported project status to management.
Worked with relational databases (RDBMS) such as Snowflake, MySQL, and PostgreSQL, as well as NoSQL databases.
Performed data transformations and created data pipelines using AWS Glue with PySpark (a job skeleton is sketched at the end of this section).
Migrated the SQL Server and reporting environments to Amazon Web Services (AWS) EC2.
Monitored the status of jobs and metrics using AWS CloudWatch.
Created connections in IICS to read data from flat files, Oracle, SQL Server, Salesforce, Workday, and Snowflake, and loaded CSV, Parquet, Avro, and ORC files to AWS S3 buckets.
Created tables on S3 data using the Glue interface and accessed S3 data through AWS Athena for analysis.
Created pipelines in IICS using linked services to extract, transform, and load data from multiple sources such as Snowflake and Azure SQL Data Warehouse.
Gained good exposure to Informatica MDM, performing data cleansing, de-duplication, and address correction.
Involved in Informatica MDM Hub configuration, IDQ cleanse function implementation, and hierarchy configuration in the MDM Hub.
Worked extensively on Oracle and SQL Server databases for data validation and to build stored procedures, views, functions, and triggers.
Involved in the development of PL/SQL stored procedures, functions, and packages to process business data in the OLTP system.
Visualized transformed data in Tableau Desktop dashboards containing histograms, trend lines, pie charts, and statistics.
Exposed dimension and fact views through JBoss Data Virtualization to downstream business intelligence platforms such as Business Objects and Tableau.
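
A minimal skeleton of an AWS Glue PySpark job of the kind described above: read a cataloged source, filter it with Spark, and write Parquet to S3. It only runs inside a Glue job environment, and the catalog database, table, filter column, and S3 path are placeholders.

    import sys
    from awsglue.utils import getResolvedOptions
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext.getOrCreate())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read from the Glue Data Catalog (hypothetical database/table names).
    orders = glue_context.create_dynamic_frame.from_catalog(
        database="sales_db", table_name="orders")

    # Transform with plain Spark, then write Parquet to S3.
    df = orders.toDF().filter("order_status = 'SHIPPED'")
    df.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")

    job.commit()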

Capital Group | San Antonio | Texas.
Sr. ETL Informatica Developer 01/2019 - 08/2019.

Project: rMail (Regulatory Mailing) & Informatica Upgrades.
Worked on two projects: rMail (Regulatory Mailing) and an Informatica upgrade from version 9.6 to 10.2.
Developed mappings using Informatica PowerCenter to extract, transform, and load data. Built Type 1 and Type 2 mappings and workflows using Workflow Manager and monitored the jobs using Workflow Monitor.
Involved in migrating objects from Teradata to Snowflake using Informatica PowerCenter.
Development efforts included moving data from the DST source system (Power Select) to rMail tables in an Oracle database.
Involved in writing shell scripts for file transfers and file attachments, plus several other database scripts executed from UNIX (a Python equivalent of the transfer script is sketched at the end of this section).
Involved in system integration testing; developed, executed, and documented unit and integration test plans and validated the results with the business analyst.
Used Control-M to schedule the Informatica workflows with pre- and post-conditions and timing constraints.
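
A minimal Python equivalent of the kind of UNIX file-transfer script described above, using the paramiko library for SFTP; the host, credentials, and file paths are placeholders.

    import paramiko

    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect("sftp.example.com", username="etl_user", password="...")
    try:
        sftp = ssh.open_sftp()
        # Push the outbound extract and pull the acknowledgement file.
        sftp.put("/data/out/rmail_extract.dat", "/inbound/rmail_extract.dat")
        sftp.get("/outbound/rmail_ack.dat", "/data/in/rmail_ack.dat")
        sftp.close()
    finally:
        ssh.close()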

Bloomin' Brands | Tampa | Florida.
Sr. ETL Informatica Developer 06/2017 - 12/2018.

Project: Point of Sale, Door Side.
Participated in system analysis and data modeling, which included creating tables, views, indexes, synonyms, triggers, functions, procedures, cursors and packages.
Created mappings, mapping configuration tasks, and taskflows with Informatica Intelligent Cloud Services (IICS) and Informatica PowerCenter (10.1.1 and 9.6.1).
Converted complex workflows from Informatica PowerCenter to IICS while maintaining standards and without affecting real-time runtime behavior.
Built mappings using Informatica Data Quality and exported them as mapplets to Informatica PowerCenter to read JSON-format files.
Worked in production support as well as QA/test environments, using the Quality Center tool for projects, work orders, maintenance requests, bug fixes, enhancements, and data changes. Monitored the production workflows using Workflow Monitor.
Developed UNIX Shell scripts to automate repetitive database processes and maintained shell scripts for data conversion.
Involved in hybrid integrations of ServiceNow, Salesforce, and Workday with the Hadoop ecosystem using IICS.
Created warehouses, databases, tables, file stages, etc. in the Snowflake cloud data warehouse.
Ingested CSV, JSON, and Parquet files into the Snowflake data warehouse (the PUT/COPY pattern is sketched at the end of this section).
Created various workflows to read JSON files and load them into staging tables and then into the respective dimension and fact tables.
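
A minimal sketch of the Snowflake file-ingestion pattern referenced above: stage a local JSON file with PUT, then COPY it into a staging table. The connection parameters, stage name, and table name are placeholders.

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="...",
        warehouse="ETL_WH", database="EDW", schema="STAGE")
    cur = conn.cursor()

    # Upload the local file into an internal named stage (compressed).
    cur.execute("PUT file:///data/orders.json @json_stage AUTO_COMPRESS=TRUE")

    # Copy the staged file into a VARIANT staging table.
    cur.execute("""
        COPY INTO stg_orders_raw (payload)
        FROM @json_stage/orders.json.gz
        FILE_FORMAT = (TYPE = 'JSON')
    """)
    conn.close()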

Catalina Marketing Corporation | St. Petersburg | Florida.
ETL/Informatica Developer 12/2016 - 05/2017.

Project: Legacy to New DW. The main goal of this project was to build the new DW, known as CEDW4x Big Data, and match the data between the legacy and new warehouses. I worked on the VTBD (Vault to Big Data) sub-project.
Used Informatica 9.6.1 to load the data from legacy tables to Netezza tables.
Built data integration components using Informatica PowerCenter, following the local DI framework and recipes (the ETL Cookbook), against Netezza, Oracle, and SQL Server on Unix/Linux and Windows operating systems.
Proficient in using Informatica Designer, Workflow Manager, Workflow Monitor, and Repository Manager to create, schedule, and control workflows, tasks, and sessions.
Extracted data from our cloud sources (Hive via Hue).
Used the Hue 2.6.1-2 web interface to query the data.
Used Aginity AMP to generate natural keys from combinations of multiple foreign keys (the pattern is sketched at the end of this section).
Wrote and tuned complex SQL queries and shell scripts.
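
A minimal sketch of the natural-key pattern mentioned above: derive a deterministic key from the ordered combination of several foreign keys. The column values are hypothetical, and Aginity AMP performs the equivalent inside the database rather than in Python.

    import hashlib

    def natural_key(*foreign_keys, sep="|"):
        """Hash the ordered, delimited FK values into a stable key."""
        raw = sep.join(str(k) for k in foreign_keys)
        return hashlib.md5(raw.encode("utf-8")).hexdigest()

    # The same inputs always yield the same key, so reloads stay idempotent.
    print(natural_key(10045, "STORE-22", "2017-03-01"))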

PetSmart | Phoenix | Arizona.
ETL/Informatica Developer 04/2016 - 11/2016.

Project: Digital Reporting, a child project of the Digital Integration Platform at PetSmart.
We used Informatica for the ETL loads and MicroStrategy for reporting.
Used Informatica 9.5 to load data from Oracle to Netezza tables.
Played a key role in creating the templates for the different status pages and in adding and tracking our tasks in Jira.
Prepared the design documents covering source-to-target data flow, mapping all columns required for reporting.
Handled complex mappings by modifying some of the core tables containing PetSmart customer data, as well as the sales tables involved in the batch load.
Created DDL and DML scripts defining the structure of new tables and modifications to existing tables.
Built mappings, worklets, and workflows to load data into the staging area and then into DW tables.
Created complex mappings involving Slowly Changing Dimensions and implemented business logic using Informatica PowerCenter (the Type 2 logic is sketched at the end of this section).
Used Filter, Sorter, Expression, Aggregator, Joiner, Router, Normalizer, Union, Lookup, and other transformations to convert complex business logic into ETL code.
Designed workflows with many sessions, using Decision, Assignment, Event Wait, and Event Raise tasks, and used the Informatica scheduler to schedule jobs.
Used Pushdown Optimization to improve performance.
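
A minimal pandas sketch of the SCD Type 2 logic behind those mappings: when a tracked attribute changes, expire the current dimension row and append a new current version. The column names are hypothetical; in the project this logic lived in PowerCenter mappings, not Python.

    import pandas as pd

    # Existing dimension with one current row per customer.
    dim = pd.DataFrame([
        {"cust_id": 1, "city": "Phoenix", "eff_from": "2016-01-01",
         "eff_to": "9999-12-31", "is_current": True},
    ])
    incoming = pd.DataFrame([{"cust_id": 1, "city": "Tempe"}])
    load_date = "2016-07-15"

    for _, row in incoming.iterrows():
        active = (dim["cust_id"] == row["cust_id"]) & dim["is_current"]
        changed = active & (dim["city"] != row["city"])
        if changed.any():
            # Expire the changed current version...
            dim.loc[changed, ["eff_to", "is_current"]] = [load_date, False]
            # ...and append the new version as the current row.
            dim.loc[len(dim)] = [row["cust_id"], row["city"], load_date,
                                 "9999-12-31", True]

    print(dim)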

Wells Fargo | Charlotte | NC.
ETL/Informatica Developer 12/2014 - 03/2016.

Project: Financial Accounting Control System, which required interpreting and analyzing business requirements to assist with the design, development, and delivery of a technology solution.
Extracted high volumes of data sets from Salesforce.com (SFDC) using Informatica ETL mappings and SQL/PLSQL scripts and loaded them into the data warehouse.
Worked on SQL Server, Oracle, and Sybase databases.
Designed workflows that use multiple sessions and Command tasks (used to run the UNIX scripts).
Created UNIX scripts to FTP and SFTP files to different servers.
Used Informatica file watch events to poll the FTP sites for the external mainframe files (a polling sketch follows this section).
Documented and reported defects within the established process and tracking systems using ALM, and shared the results with business users in different phases of the project.
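
A minimal Python sketch of the file-watch behavior described above: poll an FTP site until the expected mainframe extract appears. The host, credentials, and file name are placeholders.

    import time
    from ftplib import FTP

    def wait_for_file(host, user, password, filename,
                      interval=60, tries=30):
        """Poll the FTP site until `filename` lands or we give up."""
        for _ in range(tries):
            with FTP(host) as ftp:
                ftp.login(user, password)
                if filename in ftp.nlst():
                    return True   # file has landed; the load can start
            time.sleep(interval)  # not there yet; poll again
        return False

    # Hypothetical usage:
    # wait_for_file("ftp.example.com", "etl_user", "...", "GL_EXTRACT.DAT")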

Education
Texas A&M University - Kingsville | Kingsville, TX
Master of Science in Computer Science - 2014.
Certifications
Informatica Cloud Certified Professional, Informatica - 2023.
Oracle Certified SQL Expert.