Sujana Nalluri
Sr. ETL/Informatica IDQ Developer
Phone: 512-768-9090
Email: [email protected]
Location: Remote, USA
Relocation: Yes
Visa: OPT-EAD

PROFESSIONAL SUMMARY:
8+ years of experience in Information Technology, including Data Warehouse/Data Mart development using ETL/Informatica Power Center, Teradata, and Informatica Power Exchange, with basic knowledge of Informatica Data Quality (IDQ) 10.1/9.6.1, across industries such as Healthcare, Banking, Insurance, Pharmaceutical, and Finance.
Experienced in leading the support and maintenance of ETL applications in production environments.
Extensive database experience using Oracle, MS SQL Server, and PostgreSQL with SQL and PL/SQL.
Strong data warehousing experience using Informatica Power Center client tools - Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
Conducted quantitative analyses on financial statements, internal risk ratings and briefed senior leadership about the findings along with the forecast information.
Created and delivered presentations to various line of business (LOB) teams on the status of, and key decisions about, their LOB client performance.
Worked on rescoring financial ratios & derivatives for credit risk model development & redesign.
Strong experience working with Enterprise Data Warehouse (EDW) frameworks and direct experience with BI or DW projects over the full development lifecycle.
Experience in handling Informatica Power Center for ETL extraction, transformation and loading into target Data warehouse.
Strong in data warehousing concepts: dimensional Star Schema and Snowflake Schema methodologies and Slowly Changing Dimensions (SCD Type 1/Type 2); a sketch of the SCD pattern appears at the end of this summary.
Extensive experience in creation of complex ETL mappings, mapplets and workflows using Informatica Power Center to move data from multiple sources into target area.
Used Mapplets and Reusable Transformations to prevent redundancy of transformation usage and maintainability.
Expertise in data modeling for Data Warehouse/Data Mart development and data analysis for Online Transaction Processing (OLTP), Data Warehousing (OLAP), and Business Intelligence (BI) applications.
Excellent data-analysis skills and ability to translate Transformation logic into Mappings for ETL Process and maintaining the Repositories and Servers.
Performed performance tuning to optimize session performance by eliminating bottlenecks and achieve acceptable ETL load times.
Scheduled jobs using Control-M and monitored them in Workflow Monitor.
Knowledge of Teradata utility scripts such as FastLoad and MultiLoad to load data from various source systems into Teradata.
Experience in Postproduction support, handling new enhancements and customer interaction.
Experience in End-to-End Data warehouse Development, Performance Tuning, Optimization of SQL, and Production Support.
Extracted data from the databases (Oracle, SQL Server, Flat files, and Teradata) using Informatica to load it into a single data warehouse repository.
Extensive experience in web development, ETL and reporting technologies.
Well versed in all phases of software development life cycle and involved in creating Technical Design Documents (TDD), Mapping Sheets and Test Case documents.
Expertise in SQL and performance tuning on large-scale Teradata environments.
Experienced in extracting data from multiple sources such as Teradata, Oracle, SAP BW, Mainframes, and flat files and performing the required transformations on the data using ETL tools - Informatica or Teradata utilities.
Communication: Convey problems, solutions, updates and project status to peers, customers, and management. Develop and maintain programs, systems, and user documentation.
Created estimates and delivery plans for projects.
Proficient in different project roles such as designer, developer, and tester.
Experience creating ETL design documentation and maintaining overall design documents and deliverables.
Educated team members on Informatica, SQL Server Reporting Services (SSRS) and MS SQL Server.
Work experience in Agile and Waterfall methodologies.
Expertise in designing and implementing data integration solutions using Informatica IICS.
Developed Informatica Cloud Data Integration mappings and task flows to extract and load data between on-premises systems, AWS RDS, Amazon S3, Redshift, Azure SQL Data Warehouse, and Azure Data Lake Store; created and configured all kinds of cloud connections and runtime environments with Informatica IICS.
Project management skills with the ability to manage multiple projects simultaneously and independently.
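The following is a minimal SQL sketch of the SCD Type 2 pattern referenced above, assuming a hypothetical CUSTOMER_DIM target keyed by a surrogate sequence and a STG_CUSTOMER staging table; in practice this logic was built as Informatica mappings (Lookup and Update Strategy transformations), not hand-written SQL.

    -- Expire the current version of rows whose tracked attribute changed (hypothetical names)
    UPDATE customer_dim d
    SET    eff_end_dt   = CURRENT_DATE - 1,
           current_flag = 'N'
    WHERE  d.current_flag = 'Y'
      AND  EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.address    <> d.address);

    -- Insert the new version with a fresh surrogate key and an open-ended effective date
    INSERT INTO customer_dim
           (customer_sk, customer_id, address, eff_start_dt, eff_end_dt, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   customer_dim d
                       WHERE  d.customer_id  = s.customer_id
                       AND    d.current_flag = 'Y'
                       AND    d.address      = s.address);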

SKILL SET:
ETL Tools: Informatica Power Center 10.x/9.x (Source Analyzer, Mapping Designer, Workflow Monitor, Workflow Manager, Mainframes), Power Exchange, Informatica Data Integrator, SAP BODS
Data Warehouse and Databases: PostgreSQL, MS SQL Server, Oracle, and Teradata
Scripting Languages: UNIX shell
Web Technologies: Web API (SOAP and REST)
Reporting Tools: SSRS, Tableau
Ticketing Tools: SVN and JIRA
Cloud Services: Informatica IICS, Amazon AWS services such as S3
3rd Party Tools: WinSCP, Putty
Scheduling Tool: Control-M
Comparison Tools: Beyond Compare and Notepad++
MS Office: Excel, Word, and PowerPoint

EDUCATION:
Completed Master's in Computer Science from the University of Southern Mississippi, MS.
Completed Bachelor's in Computer Science from Satyabhama University, Chennai.

WORK EXPERIENCE:
Truist, Dallas, TX Feb 2023 - Present
Sr. ETL/ Informatica Developer
Responsibilities:

Gathered requirements and analyzed, designed, coded, and tested highly efficient and scalable integration solutions using Informatica, Oracle, SQL, and the relevant source systems.
Involved in the technical analysis of data profiling, mappings, formats, data types, and development of data movement programs using Power Exchange and Informatica.
Developed ETL mapping document which includes implementing the data model, implementing the incremental/full load logic and the ETL methodology.
Involved in understanding the Requirements of the End Users/Business Analysts and Developed Strategies for SQL & reporting processes.
Performed comprehensive analysis on clients/investors statement data and their financial performance to bring more valued information to mitigate risk.
Analyzed overrides on credit risk model ratings to provide insights & trends on upgrades/downgrades on these risk ratings.
Derived financial ratios for all the redesigned models and published them as Teradata views in both UAT and Production environments for downstream teams; a sketch of this kind of view appears at the end of this section.
Developed mappings for extracting data from different types of source systems (flat files, XML files, relational files, etc.) into our data warehouse using Power Center.
Involved in daily business meeting and created mapping documents based on the business needs to perform ETL operations.
Experience working with IICS transformations like Expression, joiner, union, lookup, sorter, filter, normalizer, and various concepts like macro fields to templatize column logic, smart match fields, renaming bulk fields and more.
Used Excel pivot tables to manipulate large amounts of data for data analysis; the position involved extensive routine operational reporting, ad-hoc reporting, and data manipulation to produce routine metrics and dashboards for management.
Expertise in developing mappings and mapplets/transformations between source and target using Informatica Designer. Worked in Agile methodology based on an onshore-offshore model.
Created user stories and product backlogs and prioritized them alongside the Product Owner and Scrum Team through JIRA.
Experienced performing testing on changes in UAT environment before moving them into Production.
Kicked off batches for pulling down financial statements and running ETL processes to load data into targets.
Presented detailed reports about the meaning of gathered data to senior management and regulators and helped them identify scenarios utilizing modifications in the data.
Expertise in the concepts of building Fact Tables, Dimensional Tables, handling Slowly Changing Dimensions and usage of Surrogate Keys. Extensive experience in backup and recovery process of Informatica Repository.
Understood and documented the data sourcing and data flow using flow charts in Microsoft Office utilities such as Word and Visio.
Involved in business analysis and technical design sessions with business and technical staff to develop entity relationship/data models, requirements documents, and ETL specifications, and dug deep into complex T-SQL queries and stored procedures to identify items that could be converted to Informatica Cloud ISD.
Develop complex ETL mappings on Informatica 10.x platform as part of the Risk Data integration efforts.
Developing ETL procedures to ensure conformity, compliance with standards and lack of redundancy, translate business rules and functionality requirements into ETL procedures.
Experience with PROC SQL joins and PROC SQL set operators to combine tables horizontally and vertically; proficient in integrating various data sources such as SAS datasets and flat and CSV files.
Produced descriptive statistics on the wholesale portfolio using procedures such as PROC SUMMARY, PROC FREQ, PROC MEANS, and PROC UNIVARIATE.
Developed Dashboards using Tableau Desktop for data visualization and reporting and presented them to both internal and external stakeholders.
Extensive data cleansing and analysis, using pivot tables, formulas (v-lookup and others), data validation, conditional formatting, and graph and chart manipulation.
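A hedged illustration of the kind of Teradata view used to publish derived financial ratios to downstream teams; the database, view, table, and column names (uat_views, edw.financial_statement_fact, obligor_id, etc.) are assumptions for illustration, not the bank's actual objects.

    -- Publish derived financial ratios as a Teradata view for downstream consumers
    REPLACE VIEW uat_views.obligor_financial_ratios AS
    SELECT f.obligor_id,
           f.statement_dt,
           f.total_debt      / NULLIF(f.ebitda, 0)              AS leverage_ratio,
           f.current_assets  / NULLIF(f.current_liabilities, 0) AS current_ratio,
           f.net_income      / NULLIF(f.total_assets, 0)        AS return_on_assets
    FROM   edw.financial_statement_fact f;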

Environment: Informatica Power Center, Teradata, Tableau Desktop, Tableau Server, Unix, WinSCP, SQL Programming, T-SQL, SAS Enterprise Guide, MS Excel, JIRA.

Centene Tampa, FL Oct 2021 - Feb 2023
Sr. ETL/ Informatica Developer
Responsibilities:

Prepared detailed ETL design documents explaining the mapping logic in technical as well as Business terminology.
Involved in installation, configuration, and upgrade of Informatica 10.4.3 to 10.5.3.
End-to-end ETL development of Data Mart, Data Quality Analysis to determine cleansing requirements. Designed and developed Informatica mappings for data loads.
Responsible for Dimensional Data Modeling, Understand the business needs and implement the same into a functional database design.
Collaborate with the architects, leads for the design and data model framework.
Worked with various levels of software developers to load data into the data warehouse and identify potential problem areas in the source system by data profiling.
Made substantial contributions in simplifying the development and maintenance of ETL by creating re-usable Mapplets, worklets and Transformation objects.
Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.
Created workflows and Worklets with parallel and sequential sessions that extract, transform, and load data to one or more targets.
Prepared mapping design documents.
Prepared SQL scripts for assessments and extraction of data.
Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
Extensively used Control-M for scheduling the workflows and monitoring loads.
Monitored jobs in Control-M to verify whether they succeeded.
Used comprehensive monitoring statistics to quickly assess the health of Informatica assets.
In case of job failures, sent email alerts to the respective parties for issues such as structure changes at the source or target level or high database usage in the Informatica metadata schemas.
Wrote SQL queries with joins, subqueries, and stored procedures based on client requirements; a sample of this kind of query appears at the end of this section.
Performed performance tuning to optimize session performance by eliminating bottlenecks and achieve acceptable ETL load times.
Identified bottlenecks at the source, target, and mapping levels and carried out session tuning.
Created UNIX scripts based on the requirements.
Moved data from various source staging areas to the data warehouse and data marts.
Collected checksum reports using commands after logging into the servers with Putty and WinSCP.
Used Excel to export data and summarize it using pivot tables and graphs for reporting.
Generated dashboards and helped write data stories for client calls and executive deck meetings.
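A minimal sketch of the kind of SQL assessment query written during this engagement, assuming hypothetical STG_CLAIMS and DW_CLAIMS_FACT tables; it reconciles row counts per load date between staging and the warehouse target using joined derived tables.

    -- Reconcile staging vs. warehouse row counts by load date (illustrative names)
    SELECT s.load_dt,
           s.src_cnt,
           t.tgt_cnt,
           s.src_cnt - COALESCE(t.tgt_cnt, 0) AS diff
    FROM  (SELECT load_dt, COUNT(*) AS src_cnt FROM stg_claims    GROUP BY load_dt) s
    LEFT JOIN
          (SELECT load_dt, COUNT(*) AS tgt_cnt FROM dw_claims_fact GROUP BY load_dt) t
      ON  s.load_dt = t.load_dt
    ORDER BY s.load_dt;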

Environment: Informatica Power Center, ORACLE, Control M, UNIX, Tableau, WinSCP, Teradata.

WellCare, Tampa, FL Sept 2019 - Oct 2021
Sr. ETL/ Informatica Developer
Responsibilities:

Interacted with Data Modelers and Business users to understand the requirements and create impact analysis of the new ETL on INBOUND and OUTBOUND Jobs.
Responsible for converting Existing jobs from SAP BODS to Informatica with same functionality.
Developed mappings that can be enabled and run with PDO (pushdown optimization).
Created mapping templates instead of creating mappings from scratch.
Parsed high-level design specification to simple ETL coding and mapping standards.
Maintained warehouse metadata, naming standards, and warehouse standards for future application development.
Extracted the data from the flat files and other RDBMS databases and loaded it into staging area and populated onto Data warehouse.
Worked on Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
Worked extensively on PL/SQL as part of the process to develop several scripts to handle different scenarios.
Created landing and stage tables in the development environment and coordinated with the DBA to create the same tables in higher environments.
Implemented ETL programs using Informatica Power Center against the Oracle 11g Data Warehouse, applying Oracle database skills including PL/SQL, packages, procedures, indexing and query tuning.
Designing the technical design specifications and program specification documents for ETL jobs.
Created ETL mappings using Informatica Power Center to move Data from multiple sources like Flat files, Oracle into a common target area.
Used various transformations such as SQL, Expression, Lookup, Joiner, Router, Filter, Normalizer, Union, Update Strategy, and Aggregator transformations.
Strong experience in writing complex SQL queries and PostgreSQL stored procedures.
Proficient with the PostgreSQL procedural language (PL/pgSQL) as well as some Oracle PL/SQL and SQL Server T-SQL.
Implemented Slowly Changing Dimension Type 1 and Type 2 for inserting and updating target tables to maintain history; a PostgreSQL sketch of the Type 1 pattern appears at the end of this section.
Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.
Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
Developed the Informatica Mappings by usage of Aggregator, SQL overrides usage in Lookups, source filter usage in Source qualifiers, and data flow management into multiple targets using Router.
Developed Workflows using task developer, Worklet designer and designer in Workflow manager and monitored the results using workflow monitor.
Investigated job failure issues such as network issues (firewall or connectivity breakage), OS patching issues, and disk space issues.
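A short PostgreSQL sketch of the Type 1 (overwrite) pattern mentioned above, assuming a hypothetical MEMBER_DIM table with a unique constraint on member_id and a STG_MEMBER staging table; the Type 2 history handling was implemented through Informatica mappings rather than SQL like this.

    -- Type 1 upsert: overwrite changed attributes, no history kept (illustrative names)
    INSERT INTO member_dim (member_id, first_name, last_name, plan_code)
    SELECT member_id, first_name, last_name, plan_code
    FROM   stg_member
    ON CONFLICT (member_id)
    DO UPDATE SET first_name = EXCLUDED.first_name,
                  last_name  = EXCLUDED.last_name,
                  plan_code  = EXCLUDED.plan_code;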

Environment: Informatica Power Center, SAP BODS, ORACLE, PostgreSQL.

US Bank Minneapolis, MN May 2018 - Aug 2019
ETL/ Informatica Developer
Responsibilities:

Prepared detailed ETL design documents explaining the mapping logic in technical as well as Business terminology.
Responsible for dimensional data modeling; understood the business needs and implemented them in a functional database design.
Extensively used ETL and Informatica to load data from Oracle, CSV, flat files into the target database.
Implemented various Transformations like Joiner, Aggregator, Expression, Lookup, Filter, Update Strategy, Stored Procedures, and Router etc.
Worked on Informatica Power Center tool - Source Analyzer, Data warehousing designer, Mapping & Mapplet Designer and Transformation Designer.
Gathered and analyzed requirements by interacting with business analysts.
Developed Drill-through, Drill-down, sub-Reports, Charts, Matrix reports, Linked reports using SQL Server Reporting Services (SSRS).
Created T-SQL stored procedures, functions, triggers, cursors, and tables; a sketch of such a procedure appears at the end of this section.
Used existing UNIX shell scripts and modified them as needed to process SAS jobs, search strings, execute permissions over directories etc.
Responsible for requirement definition and analysis in support of Data warehousing efforts and worked on ETL Tool Informatica to load data from Flat Files to landing tables in SQL server.
Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for pre- and post-session performance management.
Designed and developed Dashboards using Tableau Service Documents for various products and services.
Worked with SSIS packages involved FTP tasks, Fuzzy Grouping, Merge, and Merge joining, Pivot and Unpivot Control Flow Transformations.
Worked with Metadata Manager which uses SSIS workflows to extract metadata from metadata sources and load it into a centralized metadata warehouse.
Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.
Updated the day-to-day project sheet in the timesheet application and shared it with the client.
Moved data from various source staging areas to the data warehouse and data marts.
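A hedged T-SQL sketch of the kind of stored procedure used to feed SSRS datasets; the procedure, table, and column names (dbo.usp_GetDailyLoadSummary, dbo.LoadAudit) are illustrative assumptions, not the bank's actual objects.

    -- Summarize rows loaded per batch for a given load date (illustrative names)
    CREATE PROCEDURE dbo.usp_GetDailyLoadSummary
        @LoadDate DATE
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT  b.BatchId,
                b.SourceSystem,
                COUNT(*)         AS RowsLoaded,
                MAX(b.LoadEndTs) AS CompletedAt
        FROM    dbo.LoadAudit b
        WHERE   CAST(b.LoadEndTs AS DATE) = @LoadDate
        GROUP BY b.BatchId, b.SourceSystem;
    END;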

Environment: Informatica Power Center, SSRS, SQL, PL/SQL, Tableau, UNIX, MS SQL Server, Toad Data Point.
Hyundai Motors Mar 2017 - Apr 2018
ETL/ Informatica Developer
Responsibilities:

Worked on multiple projects as an ETL designer and developer. Extensively used Informatica Power Center to develop mappings.
Interacted with the Business users and the requirements team to identify process metrics, various key dimensions, and measures.
Created Data mapping documents. Worked in Agile methodology.
Designed the ETL mappings between sources to operational staging targets, then to the data warehouse using Power center Designer.
Developed various mappings using Mapping Designer and worked with Aggregator, Lookup (connected and unconnected), Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter, and Sequence Generator transformations.
Worked with testing team to define a robust test plan and support them during the functional testing of the application.
Contribute to performance tuning and volume testing of the application.
Performed a detailed study and data profiling of all the underlying information security application systems, understood the information security data models, and identified and captured the right metadata from source systems.
Designed and developed the ETL solution using Informatica to implement Type 2 slowly changing dimensions to populate current and historical data into dimensions; the ETL solution validated incoming data and sent notifications when jobs were done (a sketch of the validation step appears at the end of this section).
Designed and developed Dashboards using Tableau Report Service Documents for various products and services in Insurance Service Group.
Generated ETL Scripts leveraging parallel load and unload utilities from Teradata.
Complete Software Development Lifecycle Experience (SDLC) from Business Analysis to Development, Testing, Deployment and Documentation.
Finalized Informatica data integration processes for the client system, with major responsibilities including Informatica defect fixes, unit testing, Informatica mapping and system performance tuning, and recreating Informatica workflows combined with UNIX shell scripts to automate ETL systems.
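An illustrative SQL sketch of the incoming-data validation step described above, assuming hypothetical STG_POLICY and ETL_REJECT_LOG tables; the actual checks were implemented inside the Informatica ETL solution, and these rule and column names are assumptions.

    -- Flag rows failing basic rules into a reject table used for downstream notification
    INSERT INTO etl_reject_log (src_table, pk_value, reject_reason, load_dt)
    SELECT 'STG_POLICY', s.policy_id, 'MISSING_EFFECTIVE_DATE', CURRENT_DATE
    FROM   stg_policy s
    WHERE  s.effective_dt IS NULL
    UNION ALL
    SELECT 'STG_POLICY', s.policy_id, 'NEGATIVE_PREMIUM', CURRENT_DATE
    FROM   stg_policy s
    WHERE  s.premium_amt < 0;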
Environment: Informatica Power Center, DB2, IICS, Teradata, MS SQL Server, PL/SQL, UNIX, Tableau.

Accenture May 2015 - Feb 2017
Informatica Developer
Responsibilities:

Gathered, analyzed, and formalized users' business requirements and processes, evaluated feasibility of implementation, and managed project scope.
Involved in Migration of Informatica workflows/mappings from development region to Test and Production Environments.
Created parameters and variables for effective incremental data loading using Informatica Workflow Manager; a sketch of the incremental extraction appears at the end of this section.
Handled application change requests by customizing Informatica workflows/mappings and Oracle views/stored procedures/materialized views.
Used different types of transformations in mappings such as Expression, Router, Update Strategy, Lookup, Filter, Sorter, and Joiner.
Developed Informatica Cloud Data Integration mappings and task flows to extract and load data between on-premises systems, AWS RDS, Amazon S3, Redshift, Azure SQL Data Warehouse, and Azure Data Lake Store; created and configured all kinds of cloud connections and runtime environments with Informatica IICS.
Created Salesforce connections, Implemented Salesforce business process with Informatica IICS data Integration.
Operated AWS console to configure services and configurations.
Developed Redshift queries and RDS queries to confirm data is loaded correctly.
Scheduled and monitored sessions using Informatica Workflow Manager.
Created reusable and non-reusable sessions.
Created tables, views, and stored procedures and wrote SQL queries with joins and subqueries based on client requirements.
Deployed the application from one server to another and fixed bugs in the UAT and LIVE environments.
Performed unit testing and debugging and resolved bugs.
Updated the day-to-day project sheet in the timesheet application and shared it with the client.
Fixed reported bugs and worked on enhancements and change requests.
Formatted SSRS reports using global variables and expressions.
Created well-formed, web-based reports for micro-finance related documents using SSRS.
Modified existing reports and procedures per client requirements.
Formatted reports for various output formats as requested by management using SSRS.
Created datasets using stored procedures and reports using multi-value parameters for SSRS.
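A minimal sketch of a source-qualifier SQL override illustrating the parameter-driven incremental extraction mentioned above; $$LAST_EXTRACT_TS stands for an Informatica mapping variable updated after each successful run, and the ORDERS table and its columns are assumptions for illustration.

    -- Pull only rows changed since the last successful extract (illustrative names)
    SELECT o.order_id,
           o.customer_id,
           o.order_amt,
           o.last_update_ts
    FROM   orders o
    WHERE  o.last_update_ts > TO_TIMESTAMP('$$LAST_EXTRACT_TS', 'YYYY-MM-DD HH24:MI:SS')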
Environment: Informatica Power Center 9.6.1, Business Intelligence (SSRS), Oracle, Toad Data point, MS Excel, JIRA, Autosys, Flat files, SQL *Loader, PL/SQL, XML, Unix Shell Scripts.