Venkateswara Reddy - SENIOR ETL INFORMATICA/TERADATA DEVELOPER
[email protected]
Location: Dallas, Texas, USA
Relocation: No
Visa: H1B
PROFESSIONAL SUMMARY:
Over 9 years of experience in Information Technology building and supporting Data Warehouses/Data Marts using Informatica PowerCenter 10.5/10.4.0/10.2/10.1.1/10.0.1/9.6.1/9.5.1/9.1.1/8.6.1.
Strong work experience across the full Data Warehouse lifecycle.
Involved in understanding business processes, grain identification, and identification of dimensions and measures (facts).
Extensive knowledge of data modeling (ER and dimensional modeling), data integration, and data migration.
Extensive experience working with different RDBMSs (Oracle, Teradata, MySQL, SQL Server, Azure SQL Server, DB2, Sybase) and with file-based sources (flat files and XML files).
Extensive experience in designing and developing complex mappings using transformations such as Lookup (connected and unconnected), Normalizer, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Java, and Update Strategy.
Expert in implementing Slowly Changing Dimensions Type 1, Type 2, and Type 3 for inserting and updating target tables to maintain history (a SQL sketch follows this summary).
Expert in implementing Change Data Capture (CDC) for handling incremental loads.
Experience using the Mapping Debugger to validate mappings and gather troubleshooting information about data and error conditions.
Experience using automation and scheduling tools such as Autosys, Control-M, Tivoli, and Maestro scripts.
Experience with pre-session and post-session shell scripts for tasks such as merging flat files after creation, deleting temporary files, and renaming files to reflect the generation date.
Extensively used Informatica mapping parameters and variables.
Extensively worked on Informatica performance tuning, identifying and eliminating bottlenecks.
Experience with Python OpenStack APIs.
Experience integrating Informatica with Teradata and using Teradata-specific features.
Extensive experience with Teradata utilities such as BTEQ, FastExport, FastLoad, MultiLoad, and TPump, as well as TPT.
Proficient in Teradata EXPLAIN plans, the COLLECT STATISTICS option, Primary Indexes (PI, NUPI), Secondary Indexes (USI, NUSI), Partitioned Primary Indexes (PPI), Join Indexes (JI), and volatile, global temporary, and derived tables.
Expertise in performance tuning and query optimization of Teradata SQL.
Experience with unit testing, working with QA teams on system testing, and participating in UAT.
Experience with ETL migrations and code deployments, including post-production validations.
Solid experience in writing SQL queries and stored procedures.
Experience building and supporting Data Warehouse/Data Mart solutions with Informatica PowerCenter and IICS (Cloud Data Integration).
Experienced in all stages of the software development lifecycle (Waterfall and Agile models) for building a data warehouse.
Well versed in Informatica PowerCenter 10.x/9.x/8.x/7.x Designer tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer), Workflow Manager tools (Task Developer, Worklet Designer, and Workflow Designer), Repository Manager, and the Admin Console.
Experience working with UNIX shell scripts for automatically running sessions, aborting sessions, and creating ad hoc parameter files; wrote a number of shell scripts to run various batch jobs.
Implemented data warehouse projects in both Agile and Waterfall methodologies, with a good understanding of the Scrum process.
Excellent interpersonal and communication skills; capable of driving DWH projects independently.
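A minimal Teradata SQL sketch of the Slowly Changing Dimension Type 2 pattern referenced above. The table and column names (CUSTOMER_DIM, STG_CUSTOMER, ADDRESS, STATUS) are hypothetical and used only for illustration.

    -- Step 1: expire the current dimension row when a tracked attribute changes
    UPDATE dim
    FROM CUSTOMER_DIM dim, STG_CUSTOMER stg
    SET END_DATE = CURRENT_DATE, CURRENT_FLAG = 'N'
    WHERE dim.CUSTOMER_ID = stg.CUSTOMER_ID
      AND dim.CURRENT_FLAG = 'Y'
      AND (dim.ADDRESS <> stg.ADDRESS OR dim.STATUS <> stg.STATUS);

    -- Step 2: insert a new current row for new customers and for customers expired in Step 1
    INSERT INTO CUSTOMER_DIM (CUSTOMER_ID, ADDRESS, STATUS, START_DATE, END_DATE, CURRENT_FLAG)
    SELECT stg.CUSTOMER_ID, stg.ADDRESS, stg.STATUS, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM STG_CUSTOMER stg
    LEFT JOIN CUSTOMER_DIM dim
      ON dim.CUSTOMER_ID = stg.CUSTOMER_ID
     AND dim.CURRENT_FLAG = 'Y'
    WHERE dim.CUSTOMER_ID IS NULL;

In PowerCenter the same logic is typically split across a Lookup on the dimension, an Expression that compares attributes, and Update Strategy transformations flagging DD_UPDATE and DD_INSERT rows.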





TECHNICAL SKILLS:
Operating Systems: Windows, UNIX, LINUX, MS-DOS
Modeling: Dimensional data modeling, star schema modeling, snowflake schema modeling, E-R modeling, Visio
RDBMS: Oracle 12c/11g/10g/9i, Teradata 15/14/13, SQL Server 2008/2012/2014/2016, DB2, MySQL, Sybase, Azure SQL Server
ETL Tools: Informatica PowerCenter 10.x/9.x/8.6.1, IICS, Informatica PowerExchange
Reporting Tools: Cognos, Business Objects, Tableau, Denodo
Scheduling Tools: Autosys, Control-M
Languages: XML, UNIX Shell Scripting, SQL, PL/SQL, Python, Powershell
Miscellaneous: GitHub, SVN
EXPERIENCE:
JAN 2022 - PRESENT
SENIOR ETL/ TERADATA DEVELOPER, WELLS FARGO BANK, SAN LEANDRO, CA
Developed internal and external interfaces to send data at regular intervals to data warehouse systems.
Extensively used PowerCenter to design multiple mappings with embedded business logic.
Involved in discussions of user and business requirements with the business team.
Performed data migration across different sites on a regular basis.
Created complex mappings using Unconnected Lookup, Sorter, Aggregator, and Router transformations to populate target tables efficiently.
Attended meetings with business integrators for in-depth analysis of design-level issues.
Involved in data design and modeling by specifying the physical infrastructure, system study, design, and development.
Used SSIS tools such as the Import and Export Wizard, package installation, and the SSIS Package Designer.
Extensively involved in performance tuning of Informatica ETL mappings by using caches, overriding SQL queries, and using parameter files.
Developed complex SQL queries for interfaces that extract data at regular intervals to meet business requirements, and extensively used Teradata utilities such as MultiLoad, FastLoad, TPT, BTEQ, and FastExport (a BTEQ sketch follows this section).
Analyzed session log files on session failures to resolve errors in mapping or session configuration.
Carried out Informatica PowerExchange data map deployments from pre-production to production manually.
Knowledge of PowerExchange connection creation and all PowerCenter connections.
Wrote various UNIX shell scripts for scheduling data cleansing scripts, running the loading process, and automating the execution of maps.
Created transformations such as Expression, Lookup, Joiner, Rank, Update Strategy, and Source Qualifier using the Informatica Designer.
Created mapplets and used them in different mappings.
Worked with flat files, XML, DB2, and Oracle as sources.
Wrote PL/SQL procedures and functions and was involved in the Change Data Capture (CDC) ETL process.
Implemented Slowly Changing Dimension Type 2 for different dimensions.
Involved in the Informatica, Teradata, and Oracle upgrade processes and tested the environment during the upgrades.
Created integrated test environments for ETL applications developed in Go using Docker and Python APIs.
Worked on optimization and memory management of ETL applications developed in Go and Python, and reused existing code blocks for better performance.
Worked extensively with Informatica version control.
Used SVN as version control for migrations.
Wrote unit test scripts to test the developed interfaces.
Used the Autosys scheduling tool to automate jobs.
Managed enhancements and coordinated every release of Informatica objects.
Provided support for the production department in handling the data warehouse.
Worked under Agile methodology and used the Rally tool to track tasks.
Wrote thorough design documents, unit test documentation, and installation and configuration guides.
Responsibilities included full SDLC management: designing, analyzing, developing, testing, implementation, and application support.
Performed bulk data imports and created stored procedures, functions, views, and queries.
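A minimal BTEQ sketch of the kind of stage-to-base Teradata load described above, assuming hypothetical STG_DB/EDW_DB databases and an ACCOUNT subject area; the logon would normally come from a secured run file rather than being typed inline.

    .LOGON tdpid/etl_user,password;
    .SET ERROROUT STDOUT;

    -- Insert only the accounts that are not yet in the base table
    INSERT INTO EDW_DB.ACCOUNT_BASE (ACCOUNT_ID, BALANCE_AMT, LOAD_DT)
    SELECT stg.ACCOUNT_ID, stg.BALANCE_AMT, CURRENT_DATE
    FROM STG_DB.ACCOUNT_STG stg
    LEFT JOIN EDW_DB.ACCOUNT_BASE base
      ON base.ACCOUNT_ID = stg.ACCOUNT_ID
    WHERE base.ACCOUNT_ID IS NULL;

    -- Fail the script (and the calling scheduler job) on any SQL error
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .QUIT 0;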

JUNE 2021 - DEC 2021
SENIOR ETL INFORMATICA / TERADATA DEVELOPER, CHARLES SCHWAB, WESTLAKE, TX

Worked with Business Analysts and Data Modelers to understand the BRD (Business Requirement Document), mapping document, and data model.
Extracted data from different source systems: Oracle, DB2, MySQL, flat files, and XML files.
Developed ETL programs using Informatica PowerCenter 10.4.0 to implement the business requirements.
Used SSIS to develop jobs for extracting, cleaning, transforming, and loading data into the data warehouse; prepared complete data mappings for all migrated jobs, extensively using transformations such as Lookup, Derived Column, Data Conversion, Aggregate, Conditional Split, SQL Task, Script Task, and Send Mail Task; used packages to transfer data from flat files to SQL Server using Business Intelligence Development Studio.
Involved in enhancements and maintenance activities of the Enterprise Data Warehouse.
Communicated with business customers to discuss the issues and requirements.
Used most of the transformations available in Informatica: Source Qualifier, Filter, Router, Lookup (connected and unconnected), Expression, Update Strategy, Transaction Control, and Sequence Generator.
Implemented Slowly Changing Dimensions Type 1 and Type 2 to maintain history in dimension tables.
Worked on Informatica PowerExchange for Oracle CDC and other Informatica tools such as Test Data Management and IDQ.
Worked with huge data sets to load fact tables.
Implemented Change Data Capture (CDC) for handling delta loads (a delta-extract SQL sketch follows this section).
Involved in the Informatica upgrade process and tested the entire existing Informatica flow in the new upgraded environment.
Developed, deployed, and monitored SSIS packages for new ETL processes and upgraded the existing DTS packages to SSIS for the ongoing ETL processes.
Experienced in performance tuning of Informatica objects: finding bottlenecks at the source, target, and mapping levels and eliminating them with tuning methods.
Used Informatica file watch events to poll the FTP sites for external files.
Involved in enhancing existing production Informatica objects for changed or additional requirements and pushing them back to production after successful QA testing.
Expertise in using the Teradata utilities BTEQ, MultiLoad, FastLoad, TPT, and FastExport in combination with Informatica for better loads into the Teradata warehouse.
Built several BTEQ scripts to load data from stage to base tables, applying several Teradata SQL performance techniques.
Worked under Agile methodology and used the Rally tool to track tasks.
Involved in the Teradata upgrade process from TD 12 to TD 14.
Provided production support to resolve ongoing issues and troubleshoot problems.
Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
Responsibilities included full SDLC management: designing, analyzing, developing, testing, implementation, and application support.
Effectively worked in a version-controlled Informatica environment and used deployment groups to migrate objects.
Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.
Used SSIS to download data from Google BigQuery.
Used pre- and post-session assignment variables to pass variable values from one session to another.
Designed workflows with many sessions using Decision, Assignment, Event Wait, and Event Raise tasks.
Used Control-M to schedule jobs.
Performed unit testing at various levels of the ETL and actively participated in team code reviews.
Implemented and followed organization-level Informatica best practices and procedures.
Involved in resolving trouble tickets raised by business users as part of the application support team.
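A minimal SQL sketch of a control-table-driven delta (CDC) extract of the kind described above, suitable for a Source Qualifier override. The LOAD_CONTROL table and the TRANSACTIONS source are hypothetical names used only for illustration.

    -- Pull only rows changed since the last successful extract
    SELECT t.TXN_ID, t.ACCOUNT_ID, t.TXN_AMT, t.UPDATE_TS
    FROM SRC_DB.TRANSACTIONS t
    WHERE t.UPDATE_TS > (SELECT LAST_EXTRACT_TS
                         FROM ETL_DB.LOAD_CONTROL
                         WHERE SUBJECT_AREA = 'TRANSACTIONS');

    -- After a successful load, advance the high-water mark
    -- (in practice this is set to the maximum UPDATE_TS actually processed, not the clock time)
    UPDATE ETL_DB.LOAD_CONTROL
    SET LAST_EXTRACT_TS = CURRENT_TIMESTAMP
    WHERE SUBJECT_AREA = 'TRANSACTIONS';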

JULY 2019 - JUNE 2021
ETL/ TERADATA DEVELOPER, WELLS FARGO BANK, SAN LEANDRO, CA
Developed internal and external interfaces to send data at regular intervals to data warehouse systems.
Extensively used PowerCenter to design multiple mappings with embedded business logic.
Involved in discussions of user and business requirements with the business team.
Performed data migration across different sites on a regular basis.
Created complex mappings using Unconnected Lookup, Sorter, Aggregator, and Router transformations to populate target tables efficiently.
Attended meetings with business integrators for in-depth analysis of design-level issues.
Involved in data design and modeling by specifying the physical infrastructure, system study, design, and development.
Used the Google BigQuery Data Flow components to synchronize with Google BigQuery tables and datasets.
Used SSIS and Google BigQuery SSIS components to connect and synchronize SQL Server with Google BigQuery data.
Extensively involved in performance tuning of Informatica ETL mappings by using caches, overriding SQL queries, and using parameter files.
Developed complex SQL queries for interfaces that extract data at regular intervals to meet business requirements, and extensively used Teradata utilities such as MultiLoad, FastLoad, TPT, BTEQ, and FastExport.
Analyzed session log files on session failures to resolve errors in mapping or session configuration.
Created data maps using PowerExchange to extract data from the mainframe using copybooks.
Worked on PowerExchange CDC to capture real-time data.
Wrote various UNIX shell scripts for scheduling data cleansing scripts, running the loading process, and automating the execution of maps.
Created transformations such as Expression, Lookup, Joiner, Rank, Update Strategy, and Source Qualifier using the Informatica Designer.
Created mapplets and used them in different mappings.
Worked with flat files, XML, DB2, and Oracle as sources.
Wrote PL/SQL procedures and functions and was involved in the Change Data Capture (CDC) ETL process.
Implemented Slowly Changing Dimension Type 2 for different dimensions.
Involved in the Informatica, Teradata, and Oracle upgrade processes and tested the environment during the upgrades.
Worked extensively with Informatica version control.
Used SVN as version control for migrations.
Wrote unit test scripts to test the developed interfaces.
Used the Autosys scheduling tool to automate jobs.
Managed enhancements and coordinated every release of Informatica objects.
Provided support for the production department in handling the data warehouse.
Worked under Agile methodology and used the Rally tool to track tasks.
Wrote thorough design documents, unit test documentation, and installation and configuration guides.
Performed bulk data imports and created stored procedures, functions, views, and queries.
JUN 2018 - JULY 2019
ETL DEVELOPER, UHG (OPTUM), EDEN PRAIRIE, MN

Collaborated with Lead Developers, System Analysts, Business Users, Architects, Test Analysts, Project Managers, and peer developers to analyze system requirements.
Worked with SQL and PL/SQL procedures, functions, stored procedures, and packages within the mappings.
Used SSIS to download data from Google BigQuery.
Used the Google BigQuery Data Flow components to synchronize with Google BigQuery tables and datasets.
Used SSIS and Google BigQuery SSIS components to connect and synchronize SQL Server with Google BigQuery data.
Created parameterized reports, Drill down and Drill through reports using SSRS.
Prepared the data mapping for all the migrated jobs using SSIS.
Involved in all activities related to the development, implementation, and support of ETL processes using Informatica PowerCenter 10.x.
Worked with most of the transformations such as the Source Qualifier, Expression, Aggregator and Connected & Unconnected lookups, Filter, Router, Sequence Generator, Sorter, Joiner, SQL and Update Strategy.
Developed complex stored procedures using input/output parameters, cursors, views, and triggers, and complex queries using temp tables and joins.
Developed scripts for loading data into tables using the FastLoad, MultiLoad, and BTEQ utilities of Teradata.
Used Control-M to schedule jobs.
Used a snowflake schema joined with the fact table (a join sketch follows this section).
Involved in requirements analysis, ETL design, and development for extracting data from source systems such as Salesforce, mainframe, DB2, Sybase, Oracle, and flat files.
Responsible for identifying bottlenecks and fixing them with performance tuning.
Extensively involved in analysis, design, and modeling; worked on snowflake schema, data modeling, data elements, issue/question resolution logs, source-to-target mappings, the interface matrix, and design elements.
Designed and developed logical and physical data models utilizing concepts such as star schema, snowflake schema, and slowly changing dimensions.
Worked on test-driven development and conducted unit testing, system testing, and user acceptance testing.
Created deployment packages to deploy the developed Informatica mappings, mapplets, worklets, and workflows into test and then into production.
Troubleshot deployment issues and coordinated deployment of the code into production on the target date.
Created dashboards for analyzing POC data and applied Filter Actions between different worksheets and dashboards.
Designed and developed various analytical reports from multiple data sources by blending data on a single worksheet in Tableau Desktop.
Participated in Agile daily stand-ups, sprint planning, sprint reviews, retrospectives, backlog refinement, and feature overview meetings.
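A minimal SQL sketch of joining a fact table to a snowflaked dimension, as mentioned above. The fact and dimension names (CLAIM_FACT, DATE_DIM, PRODUCT_DIM, PRODUCT_CATEGORY) are hypothetical.

    -- Monthly totals by product category, with the category snowflaked off the product dimension
    SELECT d.CALENDAR_MONTH,
           c.CATEGORY_NAME,
           SUM(f.CLAIM_AMT) AS TOTAL_CLAIM_AMT
    FROM CLAIM_FACT f
    JOIN DATE_DIM d          ON f.DATE_KEY = d.DATE_KEY
    JOIN PRODUCT_DIM p       ON f.PRODUCT_KEY = p.PRODUCT_KEY
    JOIN PRODUCT_CATEGORY c  ON p.CATEGORY_KEY = c.CATEGORY_KEY
    GROUP BY d.CALENDAR_MONTH, c.CATEGORY_NAME;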

JULY 2016 - JUN 2018
INFORMATICA/ TERADATA DEVELOPER, FREDDIE MAC, FAIRFAX, VA
Extensively worked with various Active transformations like Filter, Sorter, Aggregator, Router and Joiner transformations.
Extensively worked with various Passive transformations like Expression, Lookup, Sequence Generator, Mapplet Input and Mapplet Output transformations.
Worked with source databases like Oracle, SQL Server and Flat Files.
Worked on extracting data from SFDC.
Extensively worked with the Teradata utilities BTEQ, FastLoad, MultiLoad, and TPT to load data into the Teradata warehouse.
Created complex mappings using Unconnected and Connected lookup Transformations.
Responsible for the performance tuning of the ETL process at source level, target level, mapping level and session level.
Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, and the target-based commit interval.
Responsible for performance tuning of Teradata scripts using EXPLAIN plans, indexing, and statistics (a tuning sketch follows this section).
Implemented Slowly Changing Dimensions Type 1 and Type 2.
Worked with various lookup caches such as dynamic cache, static cache, persistent cache, re-cache from database, and shared cache.
Worked extensively with update strategy transformation for implementing inserts and updates.
Worked with various Informatica Power Center objects like Mappings, transformations, Mapplet, Workflows and Session Tasks.
Auditing is captured in the audit table, and an EOD snapshot of the daily entries is sent to the distribution list to analyze whether there are any abnormalities.
Per business requirements, implemented auditing and balancing on the transactional sources so that every record read is either captured in the maintenance tables or written to the target tables.
Extensively used Email tasks to deliver the generated reports to mailboxes and Command tasks to run pre-session and post-session commands.
Extensively used debugger to test the logic implemented in the mappings.
Performed error handling using session logs.
Involved in production support when required.
Monitored workflows and sessions using the PowerCenter Workflow Monitor.
Used the Informatica Scheduler to schedule workflows in dev for testing.
Provided 24x7 support for production environment jobs.
Monitored data extraction and loading processes and wrote UNIX shell scripts to automate jobs.
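A minimal Teradata SQL sketch of the statistics-and-EXPLAIN tuning workflow referenced above, using hypothetical EDW_DB tables (LOAN_FACT, ACCOUNT_DIM).

    -- Collect statistics on join and filter columns so the optimizer has accurate row estimates
    COLLECT STATISTICS ON EDW_DB.LOAN_FACT COLUMN (ACCOUNT_ID);
    COLLECT STATISTICS ON EDW_DB.LOAN_FACT COLUMN (LOAD_DT);

    -- Review the plan for redistribution steps, product joins, and estimated row counts before tuning
    EXPLAIN
    SELECT f.ACCOUNT_ID, SUM(f.PRINCIPAL_AMT)
    FROM EDW_DB.LOAN_FACT f
    JOIN EDW_DB.ACCOUNT_DIM a ON a.ACCOUNT_ID = f.ACCOUNT_ID
    WHERE f.LOAD_DT = CURRENT_DATE
    GROUP BY f.ACCOUNT_ID;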
APR 2015 - JULY 2016
INFORMATICA/ TERADATA DEVELOPER, TEXAS HEALTH RESOURCES, DALLAS, TEXAS

Analyzed the Business Requirement Documents (BRD) and laid out the steps for the data extraction, business logic implementation & loading into targets.
Created detailed Technical specifications for Data Warehouse and ETL processes.
Developed source-to-target mappings using Informatica PowerCenter Designer from Oracle and flat file sources to a Teradata database, implementing the business rules.
Modified BTEQ scripts to load data from the Teradata staging area to the Teradata warehouse.
Gathered requirements and discussed the design plan with the architect.
Installed required software and services for operational readiness, such as Connect:Direct, TWS, and Informatica.
Created a series of macros for various applications in Teradata SQL Assistant (a macro sketch follows this section).
Worked closely with architects and the lead on application assessments for the data masking team on the proxy server, and provided support for the databases and applications.
Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirements.
Extracted and transformed data from various sources such as flat files and Oracle 11g and transferred the data to the target Teradata data warehouse.
Responsible for building Teradata temporary tables, indexes, macros and BTEQ Scripts for loading/transforming data.
Developed scripts for loading the data into the base tables in EDW using FastLoad, MultiLoad and BTEQ utilities of Teradata.
Tested raw data and executed performance scripts.
Successfully upgraded Informatica from 9.1 to 9.5 and was responsible for validating objects in the new version of Informatica.
Identified all dependencies for creating job streams and batch schedules using TWS.
Managed post-production issues and delivered all assignments/projects within specified timelines.
Involved in Initial loads, Incremental loads and Daily loads to ensure that the data is loaded in the tables in a timely and appropriate manner.
Supported QA for each region's testing using Health Rules and Health Answers, and assisted the QA team.
Wrote the technical design document and application workbook and handed over applications to the production team.
Worked on the production support team.
Developed automated processes to install Control-M servers, agents, and fix packs.
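A minimal sketch of a parameterized Teradata macro of the kind that can be created and run from Teradata SQL Assistant, as mentioned above; the database, table, and parameter names are hypothetical.

    -- Create a macro that returns one day's encounters
    CREATE MACRO EDW_DB.GET_DAILY_ENCOUNTERS (run_dt DATE) AS (
      SELECT ENCOUNTER_ID, PATIENT_ID, FACILITY_ID, ENCOUNTER_DT
      FROM EDW_DB.ENCOUNTER
      WHERE ENCOUNTER_DT = :run_dt;
    );

    -- Execute it for a given business date
    EXEC EDW_DB.GET_DAILY_ENCOUNTERS (DATE '2016-01-15');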

MAY 2014 - FEB 2015
JR. ETL DEVELOPER, ADAPTIVE SOFTWARE SOLUTIONS PVT LTD, HYDERABAD, INDIA

Involved in the requirement gathering and Business Analysis of the specifications provided by the business analysts.
Designed the mappings between sources (external files and databases) to operational staging targets.
Experience with high volume datasets from sources like DB2, Oracle and Flat Files.
Loaded data from various sources using different transformations such as Source Qualifier, Joiner, Aggregator, Connected and Unconnected Lookup, Filter, Router, Expression, Rank, Union, Update Strategy, and Sequence Generator.
Experience in writing PL/SQL scripts, Stored Procedures and functions and debugging them.
Responsible for migrating stored procedures into Informatica mappings to improve performance.
Involved in Performance Tuning of application by identifying bottlenecks in SQL, thus providing inputs to the application programmer, thereby correcting and implementing the right components.
Created Session Task, Email and Workflow to execute the mappings. Used Workflow Monitor to monitor the jobs, reviewed error logs that were generated for each session, and rectified any cause of failure.
Experience in ETL testing; created unit test plans and integration test plans to verify that data extracted from different source systems was loaded into the target accurately, according to user requirements.
Set up the local Informatica environment on client machines, including connectivity and access to the data sources and configuring the relational connection variables in the Workflow Manager.
Used SQL overrides to perform certain tasks essential for the business (an override sketch follows this section).
Used mapplets and reusable transformations to avoid redundant transformation logic and improve modularity.
Defined Target Load Order Plan for loading data into Target Tables.
Involved in the documentation of the ETL process with information about the various mappings, the order of execution for them and the dependencies.
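A minimal sketch of the kind of Source Qualifier SQL override mentioned above, written against hypothetical Oracle source tables (ORDERS, CUSTOMERS); the column list must match the Source Qualifier ports in order.

    -- Push the join and the date filter down to the source database
    SELECT o.ORDER_ID,
           o.ORDER_DT,
           c.CUSTOMER_NAME,
           o.ORDER_AMT
    FROM ORDERS o
    JOIN CUSTOMERS c ON c.CUSTOMER_ID = o.CUSTOMER_ID
    WHERE o.ORDER_DT >= ADD_MONTHS(TRUNC(SYSDATE, 'MM'), -1)  -- from the start of the prior month
    ORDER BY o.ORDER_ID;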

EDUCATION:

B.E. - JNTUH, Electronics & Communication Engineering, Hyderabad, 2014
Master's - Engineering Management, Trine University, Indiana, 2015-2016