Sai Krishna - ETL Informatica
Phone: 469.299.4848
Email: [email protected]
Location: Phoenix, AZ
Relocation: Yes
Visa: H1B


Summary:
9 years of experience in the IT industry with an in-depth understanding of the SDLC and Agile methodologies.
Data Warehousing: strong data warehousing experience using Informatica PowerCenter 9.5/9.0.1/8.6 (Source Analyzer, Repository Manager, Mapping Designer, Mapplets, Transformations, Workflow Manager, Task Developer), PowerConnect for ETL, OLTP, OLAP, data mining, and scheduling tools. Worked with heterogeneous source systems such as flat files, RDBMS, and DWH, and with parameterized variables.
Data Modeling: knowledge of dimensional data modeling, star schema, snowflake schema, fact and dimension tables, physical and logical data modeling, and denormalization techniques.
Good knowledge of migrating data from Oracle Warehouse Builder to Informatica PowerCenter.
Databases: Oracle 11g/10g, SQL, PL/SQL, SQL*Plus, SQL Server 2008/2005. Proficient in writing PL/SQL stored procedures and triggers. Expertise in WinSQL and TOAD (Tool for Oracle Application Developers).
Hands on experience using query tools like SQL Developer, TOAD.
Well versed in data integration using ETL tools across various subsystems.
Experience working with high volumes of data in ELT environments.
Experience with Financial and Retail sectors.
Very strong in the analysis, design, development, testing, and implementation of data warehouse applications. Well versed in star and snowflake schemas for designing data marts.
Extensive experience in implementation of transformations, Stored Procedures and execution of test plans for loading the data successfully into the targets.
Experience in creating various transformations using Aggregator, Look Up, Update Strategy, Joiner, Filter, Sequence Generator, Normalizer, Sorter, Router, and Stored Procedure in Informatica Power Center Designer.
Experience with high volume datasets from various sources like Oracle, Flat files, SQL Server and XML.
Excellence in the implementation of Data Warehouse and Business Intelligence using ETL tools like Informatica Power center/PowerMart (Designer, Workflow Manager, Workflow Monitor, Repository Manager)
SCD management, including Types 1 and 2, denormalization, cleansing, conversion, aggregation, and performance optimization.
Hands on experience in writing, testing and implementation of stored procedures, Functions using PL/SQL.
Experience handling large sets of data using Informatica.
Strong experience in Performance Tuning of sources, targets, transformations and sessions.
Quick learner and adaptive to new and challenging technological environments.
There is never one exactly right tool or technology for a job, and I am always open to learning new ones.
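The SCD Type 1/2 handling mentioned above can be sketched as follows. This is a minimal Python illustration, with sqlite3 standing in for the warehouse database; the table and column names (customer_dim, cust_id, address) are hypothetical:

```python
import sqlite3

# Minimal SCD Type 2 sketch: expire the current row and insert a new
# version when a tracked attribute changes. Table and column names are
# hypothetical; sqlite3 stands in for the actual warehouse database.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE customer_dim (
    cust_id INTEGER, address TEXT,
    eff_date TEXT, end_date TEXT, current_flag TEXT)""")
cur.execute("INSERT INTO customer_dim VALUES "
            "(1, 'Phoenix, AZ', '2020-01-01', '9999-12-31', 'Y')")

def apply_scd2(cust_id, new_address, load_date):
    row = cur.execute(
        "SELECT address FROM customer_dim WHERE cust_id=? AND current_flag='Y'",
        (cust_id,)).fetchone()
    if row and row[0] != new_address:
        # Type 2: close out the old version but keep it for history
        cur.execute(
            "UPDATE customer_dim SET end_date=?, current_flag='N' "
            "WHERE cust_id=? AND current_flag='Y'", (load_date, cust_id))
        cur.execute("INSERT INTO customer_dim VALUES (?,?,?, '9999-12-31','Y')",
                    (cust_id, new_address, load_date))

apply_scd2(1, 'Tempe, AZ', '2021-06-01')
history = cur.execute(
    "SELECT address, current_flag FROM customer_dim "
    "WHERE cust_id=1 ORDER BY eff_date").fetchall()
# history now holds both the expired and the current row
```

A Type 1 change would instead be a plain UPDATE of the current row, overwriting history.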

TECHNICAL SKILLS:
Data Warehousing/ETL: Informatica PowerCenter 10.x, Informatica Data Quality, SSIS
Databases: Oracle 11g/10g/9i, SQL Server, Netezza
Programming: SQL, PL/SQL, MS SQL, UNIX Shell Scripting, Python
Environments: Windows 2000/2003, Windows XP, UNIX, Linux
Other Tools & Utilities: SQL*Plus, Toad for Oracle, SQL tools, SQL Server Management Studio, Tidal, MS Office, XML Spy, Einstein Analytics by Salesforce

EDUCATION:
Master's Degree, Texas A&M University, May 2014 - Major: Electrical Engineering
Bachelor's Degree, JNTU Hyderabad, April 2011 - Major: Electrical and Electronics Engineering


Client: Bank of the West/BMO, Tempe, AZ August 2020 - Present
Role: Sr. Software Engineer I (FTE)

Responsibilities:
Implementing complex ETL/Data warehouse projects.
Acting as primary liaison between IT and business groups during Analysis, requirements definition and design activities for enhancements and production tickets.
Working closely with the business on the front-door process and creating JIRA tickets for production issues in an agile framework.
Migrating data to the AWS cloud using Snowball and the Amazon SCT tool.
Working closely with product owners and platform managers on prioritizing the production issues.
Support other system analysts, ETL developers and testers providing technical assistance, troubleshooting and alternative delivery solutions.
Performance tuning and end to end life cycle experience with enterprise data warehouse.
Knowledge and experience with agile techniques: user stories, backlog grooming, continuous integration, continuous testing, pairing, automated testing and agile games.
Solving business related problems using data driven techniques, looking for order and patterns in data, as well as spotting trends that provide strategic advice for business processes and IT solutions.

Environment: Informatica PowerCenter 10.x, SoapUI, AWS SCT Tool, Informatica Data Integration Hub, Oracle 12c, Tidal, UNIX, HP ALM Quality Center, SQL Server, Python, WinSCP.


Client: Merck, Branchburg, NJ March 2020 - August 2020
Employer: Horizon Advanced Systems Inc.
Role: Software Engineer
In this project I worked mainly on employee payroll data for different countries, using XML and flat files as sources and loading them into tables, and using a cURL script to post XML data.

Responsibilities:
Worked extensively on XML files loading into multiple tables.
Monitoring jobs using Informatica data integration Hub.
Migrating ETL jobs between different repositories.
Prepare TDD documents according to the high-level requirements.
Leading the offshore team and splitting work between onsite and offshore.
Using Soap UI for working on XML files.
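The XML-to-table loads described above can be sketched with Python's standard library. This is a minimal illustration, not the project's actual code: the element and column names (employee, id, country, amount) are hypothetical, and sqlite3 stands in for the target database:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Sketch of loading an XML payroll extract into a relational staging
# table. Element and column names are hypothetical.
xml_doc = """<payroll>
  <employee><id>101</id><country>US</country><amount>5000.00</amount></employee>
  <employee><id>102</id><country>DE</country><amount>4200.50</amount></employee>
</payroll>"""

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE payroll_stg (emp_id INTEGER, country TEXT, amount REAL)")

# Parse each employee element into a row and bulk-insert
root = ET.fromstring(xml_doc)
rows = [(int(e.findtext("id")), e.findtext("country"), float(e.findtext("amount")))
        for e in root.iter("employee")]
cur.executemany("INSERT INTO payroll_stg VALUES (?,?,?)", rows)

loaded = cur.execute("SELECT COUNT(*), SUM(amount) FROM payroll_stg").fetchone()
```

In PowerCenter this role is played by the XML Source Qualifier against an XSD, but the shape of the work — flatten the hierarchy into rows, then load — is the same.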

Environment: Informatica PowerCenter 10.x, SoapUI, Informatica Data Integration Hub, Oracle 11g, Tidal, UNIX, HP ALM Quality Center, Python, WinSCP.

Client: Bank of the West, Tempe, AZ August 2018 - February 2020
Employer: Horizon Advanced Systems Inc.
Role: Software Engineer
Focused mainly on implementing a new approach for regulatory reporting using the AXIOM tool at the front end. AXIOM was implemented to support IHC regulatory reporting and is deployed for CIB, an internal team at Bank of the West. The final report has a set of MDRM numbers and child item numbers; each item number has specific logic used to calculate the amounts populated against it. The final report is generated in the AXIOM tool.

Responsibilities:
Prepare TDD documents according to the high-level requirements.
Leading the offshore team and splitting work between onsite and offshore.
Participate in requirement gathering session.
Use existing PL/SQL procedures to aggregate the GL amounts.
Extract the final data to a .csv file and place it in the landing zone using a shell script.
Assisting business users if they see a mismatch in the GL numbers.
Developing Health Checks between source and target tables for the amount comparison between the elements.
Initiating CFT process between two servers to transfer the files securely.
Participate in QA triage calls and discuss open defects.
Performance tuning of SQL queries using Oracle hints, creating indexes on tables, etc.
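A health check like the one described above can be sketched as a count-and-total comparison between source and target. This is a minimal Python illustration with hypothetical table names (gl_source, gl_target) and sqlite3 standing in for Oracle:

```python
import sqlite3

# Health-check sketch: compare row counts and amount totals between a
# source and a target table. Table names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE gl_source (acct TEXT, amount REAL)")
cur.execute("CREATE TABLE gl_target (acct TEXT, amount REAL)")
cur.executemany("INSERT INTO gl_source VALUES (?,?)",
                [("1000", 250.0), ("2000", 125.5)])
cur.executemany("INSERT INTO gl_target VALUES (?,?)",
                [("1000", 250.0), ("2000", 125.5)])

def health_check(src, tgt):
    # A mismatch in either count or total flags a reconciliation problem
    s_cnt, s_amt = cur.execute(f"SELECT COUNT(*), SUM(amount) FROM {src}").fetchone()
    t_cnt, t_amt = cur.execute(f"SELECT COUNT(*), SUM(amount) FROM {tgt}").fetchone()
    return s_cnt == t_cnt and s_amt == t_amt

result = health_check("gl_source", "gl_target")  # True when source and target match
```

In production such checks usually also compare per-element subtotals, not just grand totals, so a mismatch points at the offending GL element.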

Environment: Informatica PowerCenter 10, Informatica Data Quality, Oracle 11g, Tidal, UNIX, HP ALM Quality Center, WinSCP.

Client: Salesforce, San Francisco, CA April 2018 - August 2018
Employer: Horizon Advanced Systems Inc.
Role: Software Engineer
Worked mainly in a production support environment, and partly as a developer, on Einstein Analytics, a front-end reporting tool developed by Salesforce.

Responsibilities:
Develop reports on Einstein analytics tool according to the business user requirement.
Working on production failure tickets, finding the root cause, and assigning them back to the development team for the fix.
Scheduling Einstein analytics reports.
Checking for failed batches using Tidal and the metadata scripts that run on its back end to surface the latest failures.
Performance tuning of ETL jobs.

Environment: Informatica PowerCenter 9.x, Oracle 11g, Tidal, UNIX, Einstein Analytics, WinSCP.

Client: Bank of the West, San Ramon, CA August 2015 - April 2018
Employer: Horizon Advanced Systems Inc.
Role: Software Engineer
Banking companies participating in CCAR (Comprehensive Capital Analysis and Review) are required to submit their stress testing and capital plans. XML files are generated for various portfolios such as Corporate Loan, Info Lease, US Auto, US Other Consumer, First Lien, and Home Equity, and then submitted to the FRB for review. Bank of the West successfully passed the CCAR stress test in 2016 and received no objection to its 2017 capital plan. The stress test was conducted on data from both Bank of the West and First Hawaiian Bank, then a subsidiary of the bank.

Responsibilities:
Design ETL flows according to the business requirements.
Generate XML files using the XSD definition and copy them onto the local UNIX server.
Writing PL/SQL procedures and SQL queries and running them using the ETL batch framework used extensively in this project.
Creating Health checks between source and target tables to match the record counts.
Tuning long-running SQL queries by creating profiles with the help of the DBA and using the WITH clause and hints in SQL overrides and scripts.
Use of TIDAL tool for scheduling the ETL batches.
Worked closely with the Business Analysts for requirements gathering, Data analysis etc.
Implemented SCD type 2 dimensions to represent historical data.
Partitioning and creating unique indexes on tables to increase performance.
Building ETL mappings to source flat files and load them into relational tables.
Creating Impact analysis, Technical specification and Design Documentations.
Worked on the ALM Quality Center tool to update and track defects raised by QA/UAT resources.
Review the data model, DDL changes performed by data modelers and propose any additional required changes.
Guiding offshore team from requirements gathering till the QA phase.
Creating Informatica deployment groups to migrate the code to higher environments.
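The .csv extract step above can be sketched with Python's csv module. This is a minimal illustration only: the field names and values (mdrm, item_no, amount) are hypothetical, and an in-memory buffer stands in for the landing-zone file that the shell script would transfer:

```python
import csv
import io

# Sketch of extracting final report rows to a .csv for the landing zone.
# Field names and values are hypothetical, not actual regulatory data.
rows = [
    {"mdrm": "MDRM0001", "item_no": "1.a", "amount": "1250000"},
    {"mdrm": "MDRM0002", "item_no": "2.b", "amount": "480000"},
]

buf = io.StringIO()  # stands in for the landing-zone file
writer = csv.DictWriter(buf, fieldnames=["mdrm", "item_no", "amount"])
writer.writeheader()
writer.writerows(rows)

csv_text = buf.getvalue()
header = csv_text.splitlines()[0]
```

In the actual flow the file would be written to disk and handed to the shell script / CFT process for secure transfer between servers.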

Environment: Informatica PowerCenter 9.x, Oracle 11g, Tidal, Informatica Data Quality, UNIX, HP ALM Quality Center, WinSCP.

Employer: New Era Consulting Services April 2014 - December 2014
Role: ETL/Informatica Developer
Axiometrics provides access to the most current apartment research covering rents, concessions, and occupancy, with property, submarket, and market trends. It offers exclusive pipeline tracking that identifies new-development supply coming into the market. Its conventional apartment research is the timeliest in the industry, with all properties surveyed monthly and most daily-pricing properties surveyed weekly.

Responsibilities:
Set priorities and monitor quality and SLAs.
Created Joins, Macros and Secondary indexes for faster retrieval of data.
Owns and manages the Production support services engagement.
Analyze, design, develop, implement, and enhance data warehousing applications to meet users' changing needs, and train users in the applications as necessary.
Performance Optimization of Data Integration routines.
Interact with Client process owners/end users to seek clarifications on the open cases/tickets.
Establish, evaluate policies, methodologies and procedures.
Extensively used ETL processes to load data from source systems such as DB2, SQL Server, flat files, and XML files into the target Teradata system, applying business logic in transformation mappings to insert and update records on load.
Worked closely with the testing team to identify code defects raised by SIT/UAT resources.
Responsible for release planning and coordination.
Worked with Remedy and HP Service Manager to update and track the status of daily defects.
Review work done by peers before delivering to Client.
Ensure system/integration tests of various objects before delivery. Responsible for identifying and presenting opportunities for service improvement.
Closely worked with the database developer to ensure the database changes are implemented as per the requirement.
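The faster-retrieval indexing mentioned above can be sketched as follows. This is a minimal Python illustration with sqlite3 standing in for Teradata (where the equivalent would be a secondary index) and hypothetical table and column names:

```python
import sqlite3

# Sketch of adding an index so lookups on a non-key column become an
# index search instead of a full scan. Names are hypothetical; in
# Teradata this corresponds to a secondary index.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER, cust_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO orders VALUES (?,?,?)",
                [(i, i % 10, float(i)) for i in range(100)])
cur.execute("CREATE INDEX idx_orders_cust ON orders (cust_id)")

# EXPLAIN QUERY PLAN confirms the optimizer can now use the index
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE cust_id = 3").fetchall()
uses_index = any("idx_orders_cust" in str(row) for row in plan)
```

The same check — read the optimizer's plan before and after adding the index — is how index tuning is verified on any engine.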

Environment: Informatica Power Center 9.1.1, IDQ 9.1, Oracle 11i, UNIX, Flat Files, Autosys, TOAD