
Srinivasa Reddy - Sr. Informatica Developer / ETL Developer
[email protected]
Location: Petersburg, Pennsylvania, USA
Relocation: YES
Visa: H1B
Srinivas Palle
ETL Developer/Lead

Summary:

12+ years of IT experience in Data Analysis and Software Development for various Business applications in Business Intelligence and Data warehousing solutions.
Strong knowledge of the Software Development Life Cycle and highly proficient in Waterfall and Agile methodologies.
Experience in Data Analysis, Data Cleansing, Data Validation and Verification, Data Conversion, Data Migrations.
Strong Knowledge and Extensive experience in data transformations, data loading, and performance tuning.
Expertise in using various Informatica transformations to develop feasible solutions, with good knowledge of performance tuning.
Experienced in using Informatica PowerCenter and pushdown optimization (PDO) for loading data from flat files to database tables and from database tables to extracts using different loading techniques.
Able to understand and analyze Ab Initio graphs and designs.
Expert-level skills in writing SQL queries against Teradata databases.
Solid experience with database query tools such as Teradata SQL Assistant for faster design and development.
Strong Teradata SQL coding skills. Knowledge and experience with Teradata database features such as BTEQ, views, tables, and joins.
Experience creating tables, views, indexes, collect statistics, and BTEQ scripts in Teradata.
Expert in writing complex SQL queries for performance tuning and application development.
Experience with UNIX commands and working knowledge of shell scripting.
Experience using UNIX shell scripts, with extensive use of the BTEQ, FastLoad, MultiLoad, and FastExport utilities to load target databases.
Proficiency in using Control-M/Autosys/CA7 Scheduler.
Good exposure to creating complex jobs, scheduling, alerts, and notifications using Control-M 6.1/8.
Created several solutions for complex problems, performed performance tuning, and built complex schedules using the Control-M scheduler.
Strong experience in preparing documentation, preparing test environments, and executing and analyzing the results.
Expertise in writing, executing, and maintaining design plans and execution methodologies.
Involved in preparation of Daily and Weekly Status Reports.
Experience with Informatica Cloud Data Integration (IICS) on a parallel project for the current working application.
Cloud data integration by connecting to multiple databases.
Experienced in interacting with Clients, Business Analysts, UAT Users and Testers.
Excellent interpersonal, communication, documentation, and presentation skills.
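
The Teradata and UNIX skills above are typically combined in shell wrappers around BTEQ. A minimal sketch of such a wrapper is shown below; all object names (tdprod, stg_db.acct_stg, edw_db.acct) and credentials are illustrative placeholders, not objects from any of the projects listed here, and the generated script is only printed so the sketch stays self-contained.

```shell
#!/bin/sh
# Sketch of a BTEQ load wrapper: generate a BTEQ script that loads a
# staging table into a target table, checks the error code, and then
# refreshes optimizer statistics. All names are placeholders.
BTEQ_SCRIPT=/tmp/load_acct.bteq

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_pwd

/* Stage-to-target load with an error check after the step */
INSERT INTO edw_db.acct (acct_id, acct_type, open_dt)
SELECT acct_id, acct_type, open_dt
FROM   stg_db.acct_stg;
.IF ERRORCODE <> 0 THEN .QUIT 8

/* Refresh optimizer statistics after the load */
COLLECT STATISTICS ON edw_db.acct COLUMN (acct_id);

.LOGOFF
.QUIT 0
EOF

# On a real Teradata node this would be run as: bteq < "$BTEQ_SCRIPT"
# Here the generated script is just printed.
cat "$BTEQ_SCRIPT"
```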



Technical Skills


ETL Tools: Informatica PowerCenter 10.x, Ab Initio, and Informatica Cloud (Data Integration)
DBMS Tools: Teradata SQL Assistant
Operating Systems: Microsoft Windows
Databases: Teradata, Oracle, AWS Redshift
Languages: SQL, PL/SQL, UNIX Shell Scripting
Version Control Tools: SVN, SharePoint, GitHub, uDeploy
Other Tools: PuTTY, WinSCP, HP ALM
Schedulers: Control-M 6.1/8.0, Autosys, and CA7

Professional Experience:

Client: FNB, USA
Project Name: Insured Deposit Engine (IDE)
Role: Senior ETL Developer/Lead Jun 20 - Till Date

Description:
The project generates and submits regulatory reports to the federal government under the FDIC rule. Data is extracted from the EDW for multiple source systems and transformed per business logic, and reports are generated based on requirements.
QFC data and QLD Legal Agreements reports are generated and provided to the FDIC on request. In case of a system failure, a writeback process is enabled to handle the situation and ensure a hold is placed on uninsured amounts.


Responsibilities:
Perform the requirements analysis and document the system requirements
Conduct data analysis of source systems and create the data mapping or business rules spreadsheets
Liaise with the FNB Enterprise Architecture team on the dimensional data model
Prepare the ETL design documents describing the details of the data movement components
Develop the ETL components: Informatica workflows, ETL mappings, UNIX shell scripts, and CA7 jobs
Conduct unit testing and prepare unit testing documents
Provide support for acceptance testing by FNB business teams
Provide the production deployment support
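
As a rough illustration, a scheduler job step of the kind described above often drives an Informatica workflow through a small shell wrapper around pmcmd. The service, folder, and workflow names below are hypothetical placeholders, and pmcmd is only invoked when it is actually available on the PATH, so the sketch runs anywhere.

```shell
#!/bin/sh
# Sketch of a shell wrapper a scheduler job could call to start an
# Informatica workflow. All names below are hypothetical placeholders.
INFA_SERVICE=int_svc_prod
INFA_FOLDER=IDE
WORKFLOW=wf_ide_daily_load

PMCMD_ARGS="startworkflow -sv $INFA_SERVICE -f $INFA_FOLDER -wait $WORKFLOW"

if command -v pmcmd >/dev/null 2>&1; then
    # Real run on a node where the Informatica client is installed
    pmcmd $PMCMD_ARGS
else
    # Outside an Informatica environment, just show what would run
    echo "WOULD RUN: pmcmd $PMCMD_ARGS"
fi
```

The -wait flag keeps the wrapper blocked until the workflow finishes, so the scheduler sees the workflow's success or failure as the job's exit status.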

Environment: Informatica, Teradata, Oracle, PL/SQL, Axiom SL, CA7 and IICS (Data Integration)

Client: FNB, USA
Project Name: Single Counterparty Credit Limit (SCCL)
Role: Senior ETL Developer/Lead Apr 19 - May 20

Description:
The project generates and submits regulatory reports to the federal government under the SCCL rule: a national bank's total outstanding loans and extensions of credit to one borrower may not exceed 15 percent of the bank's capital stock and surplus.
Also, net credit exposure to any counterparty must be monitored daily and cannot exceed 25% of FNB's Tier-1 Capital.

Responsibilities:
Perform the requirements analysis and document the system requirements
Conduct data analysis of source systems and create the data mapping or business rules spreadsheets
Liaise with the FNB Enterprise Architecture team on the dimensional data model
Prepare the ETL design documents describing the details of the data movement components
Develop the ETL components: Informatica workflows, ETL mappings, UNIX shell scripts, and CA7 jobs
Conduct unit testing and prepare unit testing documents
Provide support for acceptance testing by FNB business teams
Provide the production deployment support

Environment: Informatica, Teradata, Oracle, PL/SQL, Axiom SL, CA7




Client: Federal Home Loan Mortgage Corporation, USA
Project Name: DM Rationalization platform extensions (SPP, CBP DMs)
Role: Senior ETL Developer Mar 17 - Mar 19

Description:
The project updates the scorecard metrics to ensure the metrics are pertinent and effective for measuring servicers' performance, serve as an effective tool to identify emerging issues, and serve as a common medium between the GSEs to gauge servicer performance. The scorecard also addresses existing technical and operational deficiencies and improves data governance and management for better metric calculations and common data use within Single Family.

Responsibilities:
Perform the requirements analysis and document the system requirements
Conduct data analysis of source systems and create the data mapping or business rules spreadsheets
Liaise with the Freddie Mac Enterprise Architecture team on the dimensional data model
Prepare the ETL design documents describing the details of the data movement components
Develop the ETL components: Informatica workflows, ETL mappings, UNIX shell scripts, and Autosys jobs; the Data Movement Controls (DMC) framework is used to perform data quality checks for both history and incremental loads
Conduct unit testing and prepare unit testing documents
Provide support for acceptance testing by Freddie Mac business teams
Provide the production deployment support
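
A DMC-style quality check of the kind mentioned above often reduces to reconciling a row count captured at extract time against the count loaded into the target. The sketch below illustrates that pattern only; the file paths and counts are invented for the example and do not reflect the project's actual framework.

```shell
#!/bin/sh
# Sketch of a row-count reconciliation check: compare the count
# captured by the source extract with the count captured after the
# target load, and flag the result. Paths and values are illustrative.
echo 1250 > /tmp/src_count.txt   # rows written by the source extract
echo 1250 > /tmp/tgt_count.txt   # rows loaded into the target table

SRC=$(cat /tmp/src_count.txt)
TGT=$(cat /tmp/tgt_count.txt)

if [ "$SRC" -eq "$TGT" ]; then
    DMC_STATUS="DMC PASS: source=$SRC target=$TGT"
else
    DMC_STATUS="DMC FAIL: source=$SRC target=$TGT"
fi
echo "$DMC_STATUS"
```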

Environment: Informatica, Vertica, Oracle, DB2, ParAccel, MS SQL Server, Autosys



Client: JP MORGAN CHASE BANK, USA
Project Name: MB BCBS
Role: ETL Developer/Lead Nov 14 - Feb 17

Description:

The project is a migration of an existing data warehouse maintained with Ab Initio as the ETL tool and DB2 as the primary database to Informatica as the ETL tool and Teradata as the primary database. The migration was carried out to provide better analytical capabilities and deeper drill-down of information for the end user. Also involved in the CDS process using HDFS: converted Informatica ETL mappings to Ab Initio graphs using the CDS process, storing and maintaining the data in HDFS.

Responsibilities:
ETL design and implementation of the migration from Ab Initio/DB2/INFO PROD to Informatica/Teradata.
Work with business analysts to analyze the functional requirements and data models and convert the functional requirements into business cases.
Create technical design documents and build the code per the requirements.
Provide estimates of modules to fit into the Scrum method for Agile execution.
Also worked actively on the same project's migration from ICDW to the Hadoop environment, converting the ICDW system to the Hadoop ecosystem.

Environment: Informatica, Ab Initio, Hadoop, Teradata, UNIX, PuTTY, WinSCP, Control-M 6.1/8







Client: JP MORGAN CHASE BANK, Hyderabad, India
Project Name: Basel Committee on Banking Supervision
Role: ETL Developer/Lead May 14 - Oct 14

Description:
The Basel Committee on Banking Supervision's (BCBS) Principles for Effective Risk Data Aggregation and Risk Reporting document outlines what large financial institutions need to do to meet the risk data aggregation (BCBS) objectives at a bank-wide level, helping the financial entity avoid overexposure risks that could destabilize the bank and, ultimately, the global financial system.

Responsibilities:
Worked with Business Analysts to analyze the data models that originate from production environment associated with BCBS standards.
Analyzed the Requirements, scripted complex SQL queries for data quality checks based on functional requirements, system specifications and technical requirements.
Understand the user requirements and build the reports based on the business requirement to meet the Basel standards.
Developed schedules using Control-M 8; analyzed the existing Control-M jobs and their dependencies and developed a more feasible scheduling approach.
Led Daily Status Reporting Meetings with Business and Development team extensively.
Generated Unit Test case metrics and project steering reports for the Business team and higher management
Reviewed the test cases written from the Change Request document; testing was performed based on Change Requests and Defect Requests.
Developed UNIX Shell Scripts to analyze the type of data.
Interacted with BA team to decide on the various dimensions and facts to test the application.

Environment: Informatica, Teradata, Ab Initio, UNIX, PuTTY, WinSCP, Control-M 6.1/8


Client: JP MORGAN CHASE BANK, Hyderabad, India
Project Name: Wave-2 Enterprise Releases
Role: Developer Mar 13 - Apr 14

Description:

JPMC is building an integrated customer data warehouse that is high-performance, scalable, and extensible and provides a 360-degree view of the customer, which is critical to support future growth. JPMC is currently sunsetting the EDW and migrating the data marts to the ICDW Integration Layer, which is built in third normal form on IBM's reference data model, BDW.

There are currently 27 data marts, and users also report from landing pad attributes. The Integration Layer will contain consolidated usage attributes from the data marts and the landing pad. The scope comprises capturing data from source to the ICDW Integration area. Attributes from the landing pad and the data marts are identified based on usage patterns and discussions with the users to model the ICDW for three data marts based on the BDW reference model:

IVR - Interactive Voice Response Data Mart
DDM - Deposit Data Mart
TDM - Transaction Data Mart







Responsibilities:

The primary role of the Enterprise Release team is to support change requests arising from enterprise-level changes and to support the existing ICDW system.
Detailed analysis of the present functionality of the integrated customer data warehouse and appending new features to the system.
Functional documents provide a clear description of the changes at the enterprise level and specify the changes to be carried out in the ICDW region to keep the functionality of both environments in sync.
Enterprise releases involve code fixes in the ETL and structural changes to the Teradata tables, with the addition or deletion of attributes.
Understand the problem, fix the code, test it properly, and migrate the code to the production area with Service Management approval to meet the SLA.
The LM is a replica of the EDW staging layer; building the equivalent layer in the ICDW environment for downstream use involves understanding the functionality of the EDW staging layer, modeling based on that design, implementing the ICDW LM staging tables, and then testing and code migration.

Environment: Informatica, Teradata, Ab Initio, UNIX, PuTTY, WinSCP, Control-M 6.1/8

Client: JP MORGAN CHASE BANK, Hyderabad, India
Project Name: Wave2-EDW to ICDW Migration
Role: ETL Developer Sep 11 - Feb 13

Description:

This project is a migration of an existing data warehouse maintained with Ab Initio as the ETL tool and DB2 as the primary database to Informatica as the ETL tool and Teradata as the primary database. The migration was carried out to provide better analytical capabilities and deeper drill-down of information for the end user by creating different layers, including staging, integration, and semantic layers for end-user queries.

Responsibilities:
Worked with Business Analysts to analyze the data models that originate from production environment associated with Customer banking policies.
Involved in the Software Development Life Cycle (SDLC) from analysis, design, development, and testing through implementation in a diverse range of software applications.
Analyzed the Requirements, scripted complex SQL queries for data quality checks based on functional requirements, system specifications and technical requirements.
Facilitated and managed meeting sessions with committee of SMEs from various business areas including Customer Banking.
Worked with several disparate data sources (Teradata, DB2, flat files and vendor specific (EBCDIC) files) and their complex format conversions.
Developed mappings with various transformations of Informatica for data and control flow.
Wrote complex SQL queries for extracting data from multiple tables and multiple databases.
Handled various types of sources like flat files, XML and EBCDIC files.
Analyzed the graphs developed in Ab Initio and developed the same logic using Informatica and Teradata.
Validated complex Ab Initio graphs using SCD Type I and II, involving transformations such as Expression, Joiner, Aggregator, Lookup, Update Strategy, and Filter.
Created database objects such as tables, views, indexes, collect statistics, and volatile tables using Teradata tools.
Involved in extensive data validation using SQL queries and back-end testing.
Extensive querying using Teradata to run SQL queries and monitor quality & integrity of data.
Extensively developed Informatica ETL workflows using pushdown optimization (PDO) according to the data mapping requirements.
Worked with data validation, constraints, record counts, and source to target, row counts, random sampling and error processing.
Designed and developed UNIX shell scripts as part of the ETL process to automate loading and pulling data.
Created indexes to improve performance and rebuilt indexes on tables.
Worked on Data Verification and Data Quality checks to support real time scenarios.
Involved in validating claim amounts based on precondition limits set by the Business.
Extensively wrote manual SQL scripts per the data mapping document to perform source-to-target and target-to-source transformations, and conducted unit testing.
Developed several Informatica ETL mappings, ran them on UNIX for loading, and checked the log files.
Queried different databases using SQL for data verification and data quality checks.
Prepared test cases and test plans from the requirements for the mappings developed with the Informatica ETL tool.
Performed defect tracking and reporting with strong emphasis on root-cause analysis to determine where and why defects were introduced in the development process.
Led daily status report meetings involving the Business and Development teams extensively.
Generated test case metrics and project steering reports for the Business team and higher management.
Extensively used HP ALM as a defect management and tracking tool
Reviewed the test cases written from the Change Request document; testing was performed based on Change Requests and Defect Requests.
Provided the management with weekly QA documents like test metrics, reports, and schedules.
Extensively used the Teradata database to test data validity and integrity for updates, deletes, and inserts.
Interacted with the design team to decide on the various dimensions and facts to test the application.
Planned ahead to develop mapping parameters and variables by discussing with BAs.
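
The SCD Type II handling mentioned above usually boils down to an expire-and-insert pattern. The sketch below generates a rough Teradata-flavored version of that SQL; the dimension and staging tables (edw_db.cust_dim, stg_db.cust_stg) and their columns are hypothetical, and the SQL is only generated and printed here, not executed.

```shell
#!/bin/sh
# Sketch of the expire-and-insert SQL behind an SCD Type II load.
# All table and column names are hypothetical placeholders.
SCD2_SQL=/tmp/scd2_cust.sql

cat > "$SCD2_SQL" <<'EOF'
/* Step 1: close out the current version of any changed customer */
UPDATE edw_db.cust_dim
SET    eff_end_dt = CURRENT_DATE - 1,
       curr_ind   = 'N'
WHERE  curr_ind = 'Y'
AND    cust_id IN (SELECT cust_id FROM stg_db.cust_stg);

/* Step 2: insert the new version as the current row */
INSERT INTO edw_db.cust_dim
      (cust_id, cust_nm, eff_start_dt, eff_end_dt, curr_ind)
SELECT cust_id, cust_nm, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_db.cust_stg;
EOF

cat "$SCD2_SQL"
```

Keeping a curr_ind flag alongside the effective-date range lets downstream queries pick the current row cheaply while the date range preserves full history.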

Environment: Informatica, Teradata, Ab Initio, UNIX, PuTTY, WinSCP, Control-M 6.1/8