YOGANANDA SAI TALLAPU REDDY
Sr ETL / ODI Developer
[email protected]
813-725-3398

SUMMARY
9+ years of experience using ETL methodologies for supporting Data Extraction, Data Migration, and Data Transformation, and developing Master Data using Informatica PowerCenter/IDQ/MDM and Teradata.
Experience in building and managing various data warehouses and data marts using Informatica products such as PowerCenter, PowerMart, and PowerExchange.
Strong experience in performing ETL operations like Data Extraction, Data Transformation, and Data Loading with Informatica PowerCenter and Informatica PowerMart (Repository Manager, Designer, Server Manager, Workflow Manager, and Workflow Monitor).
Skilled in monitoring and managing AWS resources using CloudWatch, setting up custom metrics, alarms, and dashboards to ensure the health and performance of applications.
Involved in all aspects of the project and technologies (Informatica, UNIX scripts, and Business Objects reports) and worked with a team of onsite and offshore build and test resources.
Proficiency in SQL Server monitoring and diagnostic tools such as SQL Server Management Studio (SSMS), SQL Server Profiler, and Dynamic Management Views (DMVs).
Experience working with the ELT tool ODI against different databases: Oracle (9i, 10g), DB2, MS SQL Server (2005, 2008), and MS Access.
Experience as a SQL Server DBA (Database Administrator) in a production environment.
Strong troubleshooting skills using CloudWatch Logs and Metrics for real-time analysis
Worked extensively with complex interfaces using ODI.
Expertise in Redshift for data warehousing and analytics, including data modeling and optimization.
Extensive experience with ETL (Extract, Transform, Load) processes for efficient data integration and transformation.
Expertise in implementing complex business rules by creating ODI interfaces.
Strong ETL skills with multiple ETL tools (Informatica PowerCenter, Informatica Cloud, BDE, BDM, Talend BDM, DI, DP, MDM, SAP BODS, and Oracle Data Integrator) in Data Warehouse migration development.
Migrated data to the Microsoft Azure cloud platform, Azure SQL DB, and Hadoop data on the Azure HDInsight service using Informatica.
Hands-on experience in integrating AWS services with Step Functions and Lambda functions to automate workflows and streamline business processes effectively.
Strong knowledge of Azure Storage Accounts, Containers, Blob Storage, Azure Data Lake, Azure Data Factory, Azure SQL Data Warehouse, Stretch Database, Machine Learning, and Virtual Machines.
Strong knowledge of IDQ Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager, and Repository Manager.
Experience in designing and developing complex mappings from varied transformation logic like Unconnected and Connected lookups, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, etc.
Experienced in automation testing methodologies and tools for ensuring the quality and reliability of software products.
Implemented the Change Data Capture (CDC) feature of ODI to minimize the data load times.
Used ODI Designer to develop complex interfaces (mappings) to load the data from the various sources into dimensions and facts.
Hands-on experience dealing with ODI Knowledge Modules such as LKM, IKM, JKM, and CKM.
Experience in GCP (Google Cloud Platform) for migrating data from different on-prem databases like Oracle and SQL Server to the GCP cloud platform using SSIS and Striim.
Knowledge in OLTP/OLAP system study and E-R modelling, developing database schemas like Star schema and Snowflake schema used in relational, dimensional, and multidimensional modelling.
Proficient in SQL for data querying, manipulation, and analysis.
Enhanced performance for Informatica sessions by using physical pipeline partitions, DTM Session performance and Parameter Files.
Used Informatica BDM IDQ 10.1.1 (Big Data Management) to ingest data from the AWS S3 raw layer to the S3 refined layer, and from the refined layer to Redshift.
Have experience with IDQ, MDM with knowledge on Big Data Edition Integration with Hadoop and HDFS.
Worked with networking teams in configuring AWS Direct Connect to establish a dedicated connection between data centers and the AWS Cloud.
Extensive experience in using tools like SQL Plus, TOAD, SQL Developer, and SQL Loader.
Experience in data modelling using the design tool Erwin 4.1/4.0; worked with Oracle Enterprise Manager and TOAD.
Experience in Extracting data from Facets claims application to pull it into the Staging area.
Experience in using Facets claim application to open, add generations, enter, and save information
Knowledge in extracting data from various sources like Oracle, DB2, Flat file, SQL SERVER, XML files, Teradata and loaded into Teradata, Oracle database.
Strong understanding of Performance tuning in Informatica and Databases with Oracle Stored Procedures, Triggers, Index and experienced in loading data into Data Warehouse/Data Marts using Informatica.
Hands-on experience in optimizing the Mappings and implementing the complex business rules by creating re-usable transformations, Mapplets, and PL/SQL stored procedures.
Experience on data profiling & various data quality rules development using Informatica Data Quality (IDQ).
Experienced in developing applications in Oracle and writing Stored Procedures, Triggers, Functions, Views, and creating Partitions for better performance.
Extensive experience in Oracle SQL and PL/SQL programming; experienced in writing and designing SQL queries.
Experience in ETL testing and cloud data migration pipeline testing to Google Cloud Platform, along with real-time data processing in Striim.
Experience with Facets batches, HIPAA Gateway, and EDI processing.
Worked on IDQ tools for data profiling, data enrichment, and standardization.
Experience in development of mappings in IDQ to load the cleansed data into the target table using various IDQ transformations. Experience in data profiling and analysing the scorecards to design the data model.
Worked with SQL*Loader and SQL*Plus; developed PL/SQL and performed tuning using explain plans.
Setting up ODBC connections, batches, and sessions to schedule the loads at the required frequency using PowerCenter.
Experience in UNIX shell scripting; used Autosys and Control-M for scheduling the workflows.
Familiar with Agile development and waterfall methodologies.
Ability to work in teams as well as individually; a quick learner, able to meet deadlines.
TECHNICAL SKILLS
Operating Systems: Windows 7 Professional, Windows NT 4.0, Windows 2000 Server, Windows 2000 Advanced Server, Windows 2003 Server, Windows XP, Windows Vista, 7, UNIX and Mac OS
Software: C, Java, SQL, HTML, XML, Oracle 11g/10g, MS SQL 2008, Teradata 13, MS Access, MS Office 2010/2007
RDBMS: Oracle, MS SQL Server 7.0, 2000, 2008, DB2.
ETL Tools: Informatica PowerCenter 9.5.1/9.0.1/8.6.1/8.0.1/7.5/7.1, Informatica Cloud, Control-M, IDQ, MDM, Autosys, SharePoint, Erwin.
SAP Data Services BODS: Designer, Monitor, Admin Console & Repository.
Reporting Tools: SQL Server Reporting Services (SSRS), Tableau, Power View, SharePoint 2007
Data Modelling Tools: Erwin (Star schema/Snowflake)
Mark-up Languages: XML, HTML, DHTML.
Database Query Tools: SQL Server Execution Plan, MS SQL Server Query Analyzer, SQL Profiler, Red Gate SQL Data Compare, Red Gate SQL Data Generator, Red Gate SQL Search, Azure SQL Server, AWS S3, and AWS Redshift.
Version Control Tools: SVN, Team Foundation Server, VSS
Atlassian Tools: JIRA, Confluence

PROFESSIONAL EXPERIENCE
PWC, Tampa, FL May 2021 to Present
Sr ETL / ODI Developer
Responsibilities:
Requirements gathering and business analysis; project coordination and end-user meetings.
Development and design of ODI interface flow for Upload / Download files
Responsible for designing, developing, and testing of the ETL (Extract, Transformation and Load) strategy to populate the data from various source systems (Flat files, Oracle, SQL SERVER) feeds using ODI.
Consolidated data from different systems to load Constellation Planning Data Warehouse using ODI interfaces and procedures
Developed packages for the ODI objects and scheduled them as scenarios.
Defined ETL architecture for integrating data at real time and batch processing to populate the Warehouse and implemented it using Oracle Data Integrator Enterprise Edition 11g.
Supported full legacy-to-ODI data conversion and integration tasks.
Capable of monitoring and managing AWS resources effectively with CloudWatch
Translated business requirements into technical specifications to build the Enterprise Data Warehouse
Manage and administer SQL Server databases across multiple environments, including development, testing, and production, to ensure optimal performance, scalability, and reliability.
Install, configure, and upgrade SQL Server instances and related components, applying patches and updates as necessary to maintain system integrity and security.
Develop and implement database backup and recovery strategies to protect against data loss and ensure high availability and disaster recovery readiness.
Perform database administration tasks including installation, configuration, backup and recovery, monitoring, and troubleshooting of SQL Server databases.
Hands-on experience in designing and implementing ETL processes using tools such as Apache Airflow or AWS Glue to efficiently extract, transform, and load data across various sources and targets.
Involved in system study and analysis for the logical/physical data model, thereby defining the strategy for implementing a Star Schema with Fact and Dimension tables.
Responsible for Data Modeling. Created Logical and Physical models for staging, transition and Production Warehouses.
Installation and Configuration of Oracle Data Integrator (ODI)
Responsible for creating complex mappings according to business requirements, which are scheduled through the ODI Scheduler.
Responsible for configuration of Master and Work Repository on Oracle
Worked in ODI Designer to design interfaces, define data stores and packages, and modify the ODI Knowledge Modules (Reverse Engineering, Journalizing, Loading, Check, Integration, Service) to create interfaces that cleanse, load, and transform the data from source to target databases; created mappings and configured multiple agents as per specific project requirements.
Created ODI packages and jobs of various complexities and automated process data flows.
Used ODI Operator for debugging and viewing the execution details of interfaces and packages.
Environment: ODI (Oracle Data Integrator) 11g (11.1.1.7), SQL Developer, Oracle 11g/10g, OBIEE 11g, UNIX scripting, Windows NT

Anthem, Atlanta, GA Feb 2018 to Apr 2021
ETL / Informatica MDM Developer
Responsibilities:
Involved in leading and monitoring the team, assigning tasks, reviewing development activity, and attending status calls.
Produced detailed MDM design specifications consistent with the high-level MDM design specifications.
Coordinated with ETL team for performing the batch process to populate data from an external source system to landing tables in the hub.
Analysed the source systems' data for SNO based on profiling results, which helped determine the trust scores and validation rules.
Converted specifications to programs and data mappings in an Informatica Cloud ETL environment.
Automated/Scheduled the cloud jobs to run daily with email notifications for any failures.
Configured Landing, staging tables and Base Object tables.
Configured trust scores and validation rules in the hub.
Worked on integration of the external application with MDM Hub using SIF APIs.
Familiarity with automation testing frameworks such as Selenium or Appium for web and mobile application testing, ensuring the delivery of high-quality software products.
Design, document, and configure the Informatica MDM Hub to support initial data loads and incremental loads, cleansing.
Worked on Address Doctor for cleansing addresses using the Developer tool before feeding into landing tables.
Analysis and implementation of the existing claim adjudication process in FACETS.
Used SOAP UI to perform SIF API calls like clean tables etc.
Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
Defined Match rules in Match and Merge settings of the base tables by creating Match Path Components, Match Columns, and Rulesets.
Used filters, segment/segment all matching and non-equal matching.
Performed match /merge and ran match rules to check the effectiveness of MDM on data and fine-tuned the match rules.
Customized User Exits for different scenarios.
Used Hierarchy tool for configuring entity base objects, entity types, relationship base objects, relationship types, profiles.
Data integration with claim processing engine (Facets).
Closely worked with Data Steward Team for designing, documenting and configuring Informatica Data Director.
Used Native BPM for configuring workflows like One-step approval, merge, and unmerge.
Used Repository Manager/Change List for migrating incremental as well as bulk meta-data.
Environment: Multi-Domain MDM 9.7, IDD, Address Doctor, Oracle 11g, Oracle PL/SQL, SIF API, Windows Application Server, Native BPM.

AMEX, Phoenix, AZ Sep 2014 to Jan 2018
ETL / Informatica Developer
Responsibilities:
Worked closely with Development managers to evaluate the overall project timeline.
Interacted with the users and made changes to Informatica mappings according to the business requirements.
Developed the Informatica Mappings by the usage of Aggregator, SQL overrides usage in Lookups, source filter usage in Source qualifiers and data flow management into multiple targets using Router.
Worked on the design, development, and performance tuning of Informatica mappings and workflow batch processes for STG, DW, and DM, using worklets, pushdown optimization, drop/create index scripts, and parallelism to ensure the batch comprising 850 tables completed on time on delta runs; also tuned complex queries in Oracle.
Worked closely with Informatica admins and operations on code movement of Informatica and DB objects to SIT, UAT, and finally Production, ensuring the initial and delta loads were conducted promptly and resolving all issues in both smoothly.
Worked on resolution of Production Fixes on Delta run enhancement of the existing Informatica code.
Worked extensively on documentation for operations using deployment groups, the import/export procedure, and the run book for production support; handled knowledge transfer to ensure the smooth running of the batch.
Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Lookup, Router, and Aggregator to create robust mappings in the Informatica Power Center Designer.
Responsible for best practices like naming conventions, Performance tuning, and Error Handling
Responsible for Performance Tuning at the Source level, Target level, Mapping Level and Session Level.
Involved in standardization of Data like changing a reference data set to a new standard.
Checked data validated by the third party for accuracy (DQ) before providing it to the internal transformations.
Used Address validator transformation in IDQ.
Involved in massive data profiling using IDQ (Analyst tool) before data staging.
Created partitioned tables, partitioned indexes for manageability, and scalability of the application. Made use of Post-Session success and Post-Session failure commands in the session task to execute scripts needed for clean-up and update purposes.
Able to automate workflows and streamline processes using AWS services.
Extensively worked in ETL and data integration, developing ETL mappings and scripts using SSIS; worked on data transfer from a text file to SQL Server by using the Bulk Insert task in SSIS.
Extensively used the Business Objects functionality such as Master-Detail, Slice and Dice, Drill Down and Hierarchies for creating reports.
Implemented slowly changing dimensions Type 2 using the ETL Informatica tool.
Designed best practices on Process Sequence, Dictionaries, Data Quality Lifecycles, Naming Convention, and Version Control.
Created a Tableau worksheet that involved schema import and implementing the business logic through customization.
Capable of monitoring and managing AWS resources effectively with CloudWatch
Created Use-Case Documents to explain and outline data behaviour.
Working with Informatica Developer (IDQ) tool to ensure data quality to the consumers.
Used the Address Validator transformation for validating customer addresses from various countries by using the SOAP interface.
Created PL/SQL programs like procedures, functions, packages, and cursors to extract data from the target system.
Involved in the deployment of IDQ mappings to application and different environments.
Logged defects and submitted change requests using the defects module of Test Director.
Worked with different Informatica tuning issues and fine-tuned the transformations to make them more efficient in terms of performance.
Environment: Informatica PowerCenter 9.5/9.1, IDQ, SAP Data Services, SAS, Business Objects 3.1, Oracle 11g, UNIX, PL/SQL, SQL*Plus, SQL Server 2008 R2, TOAD, MS Excel 2007.