Jagan Nagireddi - ETL Developer
[email protected]
Location: Mechanicsburg, Pennsylvania, USA
Relocation: No
Visa: H1B
+1 (469)-459-6394
_____________________________________________________________________________________

Professional Summary:

12+ years of IT experience in software design, development, testing, maintenance, and deployment across the full SDLC.
Strong experience designing and implementing data warehouse applications, mainly transformation processes, using the ETL tools Informatica Power Center 10.5/10.1/9.6/9.1/8.6, SSIS, and StreamSets.
Involved in end-to-end project activities: gathering business needs, converting them into requirements, analysis, design, build, testing, reviews, and release management.
Extensively used ETL tools for extraction, transformation, and loading across various sources and targets, including flat files, Oracle 12c/11g/10g, SQL Server 2012, Salesforce, Google Analytics, mainframe, Netezza, and MongoDB Atlas.
Extensive hands-on expertise with Informatica Power Center, backed by strong domain knowledge of insurance, healthcare, and banking and financial services.
Extensively worked with the Informatica Power Center components Designer (Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, Transformation Developer), Repository Manager, Workflow Manager, and Workflow Monitor to create mappings, tasks, and workflows for extracting data from various source systems to targets.
Experience implementing complex business rules by creating reusable transformations and robust mappings/mapplets with transformations such as connected and unconnected Lookup, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Stored Procedure, and Normalizer.
Proficient in full life-cycle data warehouse development, ETL strategy, and reporting, with hands-on experience performance-tuning sources, targets, transformations, mappings, and sessions, including index usage, aggregate tables, load strategies, and commit intervals.
Extensive experience integrating data from heterogeneous sources such as relational databases (Oracle, SQL Server) and flat files (fixed-width and delimited) into data warehouses and data marts.
Extensively worked on Development, Enhancement, and migration projects.
Experience in PL/SQL Programming (Stored procedures, Triggers, Functions and Packages) and UNIX shell scripting.
Good experience documenting ETL process flows for easier maintenance and analysis.
Good knowledge of administration tasks, including importing/exporting Informatica objects and versioning to maintain object history.
Developed complex mappings, including SCD Type-I and Type-II mappings, in Informatica to load data from various sources using different transformations, and created mapplets for reuse (a minimal SQL sketch of the Type-II pattern follows this summary).
Strong in data warehousing concepts and dimensional modeling methodologies, including star schema and snowflake schema.
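
For illustration, a minimal Oracle-style sketch of the SCD Type-II pattern referenced above; the table and column names (customer_dim, customer_stg) are hypothetical, and surrogate keys and NULL-safe comparisons are omitted for brevity:

    -- Step 1: expire the current version of any customer whose tracked
    -- attributes changed in the staging extract.
    UPDATE customer_dim d
       SET d.eff_end_date = SYSDATE,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.status <> d.status));

    -- Step 2: insert a new current version for changed customers (expired
    -- above) and for brand-new customers, neither of which now has a
    -- current row.
    INSERT INTO customer_dim
           (customer_id, address, status, eff_start_date, eff_end_date, current_flag)
    SELECT s.customer_id, s.address, s.status, SYSDATE, NULL, 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');

A Type-I load is the same idea, except the UPDATE overwrites the attributes in place instead of expiring the row.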

Technical Skills:

ETL Tools: Informatica Power Center 10.x, Informatica Power Exchange, Informatica Data Quality, SSIS, StreamSets
Cloud Tools: StreamSets
BI Tools: Tableau 10.2, Cognos
Data Modeling: Dimensional data modeling (star schema, snowflake schema, fact and dimension tables), physical and logical data modeling, ERwin 4.0
Databases: Oracle 12c/9i, DB2 9.5, SQL Server 2012/2008, Netezza, and MongoDB Atlas
Languages: SQL, PL/SQL, UNIX shell script, PowerShell, C, C++
Development Tools: SQL*Plus, TOAD, SQL Developer, PuTTY, HP Quality Center, SQL Server Management Studio, and WinSCP
Scheduling Tools: Control-M, Informatica Scheduler, $Universe
Operating Systems: Windows XP/NT/2008, UNIX


PA Department of Human Services                                Oct 2016 - Present
Role: Senior ETL Developer


Job Responsibilities:

The Pennsylvania Department of Human Services is a state agency dedicated to providing care and support to vulnerable citizens.
Collaborate closely with program offices (OMAP, OCDEL and Early Learning) and source teams (Sandata, Gainwell and Deloitte) to understand their requirements clearly and ensure that the delivered code meets their expectations.
Analyze source data to determine what metadata should be included in the logical data model.
Perform data mapping to determine where identified source data belongs in the database.
Develop templates for the Source-to-Target document, ETL detail design document, unit test plan, code review checklist, migration checklist, and production handoff document.
Create mappings using transformations such as Source Qualifier, Aggregator, Lookup, Joiner, and Update Strategy for data integration.
Build StreamSets pipelines for real-time data processing using stages such as HTTP Client, Kafka, Expression, and Jython, with MongoDB as a destination.
Participate in peer reviews of the code to ensure all standards are implemented, and tune mappings to bring down run times, using Informatica partitioning where needed.
Perform thorough unit and integration testing using sample data from PROD; once the code is reviewed and meets the standards, promote it to higher environments (see the reconciliation-query sketch after this list).
Support UAT, fix reported defects, retest them thoroughly, and promote the fixes back to UAT for further testing.
Responsible for migrating and deploying folders, mappings, and sessions from the development to the QA environment.
Create deployment groups and submit COs to deploy code from lower to higher environments.
Constantly monitor cycle run times, investigate long-running jobs, and performance-tune them to meet SLAs.
Schedule and run extract-load processes and monitor tasks/workflows using the Workflow Manager and Workflow Monitor.
Troubleshoot problems by checking session/error logs in the Workflow Monitor and use the Debugger in Mapping Designer to debug complex mappings.
Develop Tableau visualizations and dashboards using Tableau Desktop.
Develop Tableau workbooks from multiple data sources using data blending.
Maintain standards and implement best practices in the development process.
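
A minimal sketch of the kind of source-to-target reconciliation queries used in the unit testing mentioned above; the table and column names (src_claims, tgt_claims) are hypothetical:

    -- Row counts should match for a straight load.
    SELECT (SELECT COUNT(*) FROM src_claims) AS src_cnt,
           (SELECT COUNT(*) FROM tgt_claims) AS tgt_cnt
      FROM dual;

    -- Column-level check: rows present in the source but missing or
    -- different in the target; an empty result means the load reconciles.
    SELECT claim_id, member_id, paid_amt FROM src_claims
    MINUS
    SELECT claim_id, member_id, paid_amt FROM tgt_claims;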

Environment: Informatica Power Center 10.5, SSIS, StreamSets, Oracle 12c, MongoDB Atlas, Tableau Public 10.2, ERwin, Informatica Scheduler, UNIX, WinSCP, Visual Studio 2019, and ServiceNow.


Syntel Pvt Ltd                                                 July 2011 - Sep 2016
Clients: Allstate, American Express, Humana, and CUNA Mutual Group
Role: Informatica Consultant


Responsibilities:

Extensively involved in all phases of the project life cycle, from requirements gathering through testing.
Coordinated with client SMEs and the offshore team to gather requirements and deliver code on time, with high quality and performance.
Developed templates for the Source-to-Target document, ETL detail design document, unit test plan, code review checklist, migration checklist, and production handoff document.
Developed both high-level (HLD) and low-level (LLD) designs for the proposed ETL solutions.
Constructed ETL logic for dimension tables using SCD Type 1 and Type 2 with delta loading.
Constructed ETL logic for fact and aggregate tables using incremental loading (see the incremental-load sketch after this list).
Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Rank, Sequence Generator, and Update Strategy for data integration.
Analyzed existing complex UNIX shell scripts and made the required changes per requirements.
Used the Power Exchange Navigator to create data maps and perform row tests.
Used pmcmd and shell scripts for workflow automation.
Participated in peer reviews of the code to ensure all standards were implemented, and tuned mappings to bring down run times, using Informatica partitioning where needed.
Completed documentation for detailed work plans, mapping documents, and high-level data models.
Performed thorough unit and integration testing using sample data from PROD; once the code was reviewed and met the standards, promoted it to higher environments.
Supported UAT, fixed reported defects, retested them thoroughly, and promoted the fixes back to UAT for further testing.
Responsible for migrating and deploying folders, mappings, and sessions from the development to the QA environment.
Constantly monitored cycle run times, investigated long-running jobs, and performance-tuned them to meet SLAs.
Scheduled and ran extract-load processes and monitored tasks/workflows using the Workflow Manager and Workflow Monitor.
Troubleshot problems by checking session/error logs in the Workflow Monitor and used the Debugger in Mapping Designer to debug complex mappings.
Created RFCs and worked with the BIDW admin team on migrations between the development, test, and production repositories.
Used the client tools (Ivalidator, dbView, and db compare) to validate Informatica objects and compare source and target data.
Monitored ETL loads using Control-M.
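
A minimal sketch of the incremental fact-load pattern referenced above, assuming a hypothetical etl_control table that records a last-extract watermark (all table and column names are illustrative):

    -- Load only the staging rows that arrived since the last run,
    -- resolving surrogate keys against the current dimension rows.
    INSERT INTO sales_fact
           (order_id, product_key, customer_key, sale_amt, load_dt)
    SELECT s.order_id, p.product_key, c.customer_key, s.sale_amt, SYSDATE
      FROM sales_stg s
      JOIN product_dim  p ON p.product_id  = s.product_id
      JOIN customer_dim c ON c.customer_id = s.customer_id
                         AND c.current_flag = 'Y'
     WHERE s.last_updt_ts > (SELECT last_extract_ts
                               FROM etl_control
                              WHERE job_name = 'SALES_FACT_LOAD');

    -- On success, advance the watermark; in practice, MAX(s.last_updt_ts)
    -- from the loaded batch is safer than SYSDATE against late arrivals.
    UPDATE etl_control
       SET last_extract_ts = SYSDATE
     WHERE job_name = 'SALES_FACT_LOAD';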

Environment: Informatica Power Center 8.6/9.1/9.6, Ab Initio, SQL Server 2005/2012, Salesforce, Google Analytics, SAP BusinessObjects 4.0, Cognos, mainframe, HP Quality Center, Control-M, UNIX, Oracle SQL, PL/SQL, TOAD