
SASANKA - Informatica Developer / ETL Developer
[email protected]
Location: Frisco, Texas, USA
Relocation: YES
Visa: H1B
PROFESSIONAL SUMMARY:
Over 10 years of experience in Information Technology as an Informatica Developer, with a strong background in ETL and data warehousing using Informatica PowerCenter.
Experience in software development life cycle (SDLC), business requirement analysis, design, programming, database design, data warehousing and business intelligence concepts, Star Schema and Snowflake Schema methodologies.
Strong experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, client/server, and mainframe applications.
Expertise in the design and implementation of Slowly Changing Dimensions (SCD) Type 1, Type 2, and Type 3.
Extensive experience in developing complex mappings using transformations such as Router, Filter, Sorter, connected and unconnected Lookups, Normalizer, Expression, Aggregator, Joiner, Union, Update Strategy, Stored Procedure, and Sequence Generator.
Experience using Informatica PowerCenter tools: Mapping Designer, Workflow Manager, Workflow Monitor, Repository Manager, and Metadata Manager.
Experience using Python to automate monthly report creation.
Experience in dimensional data modeling concepts such as Star Schema and Snowflake modeling, fact and dimension tables, and physical and logical data modeling.
Extensive experience in extracting, transforming, and loading (ETL) data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter tools (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager, and the Administration Console), PowerExchange, and PowerConnect on Oracle, DB2, and SQL Server databases.
Experienced in loading data, troubleshooting, and debugging mappings; performance tuning of Informatica sources, targets, mappings, and sessions; and fine-tuning transformations for better session performance.
Experience developing ETL solutions on AWS, including AWS Glue, AWS Data Pipeline, and AWS Step Functions, with proficiency in Python and SQL for building ETL workflows (a minimal Glue job sketch follows this summary).
Experience in ETL using Informatica PowerCenter 9.6.1 from sources such as Oracle and mainframe COBOL flat files.
Experience in developing complex SQL queries and implementing production data fixes using the SPUFI and QMF tools on IBM DB2.
Experience working in development, maintenance, and sub-production batch support of mainframe applications.
Design, Development, Testing, and Implementation of ETL processes using Informatica Cloud
Strong experience in working with large scale Data Warehouse implementations using Informatica PowerCenter 8.x/7.x/6.x, Oracle, DB2, SQL Server on UNIX and Windows platforms.
Excellent interpersonal and communication skills; experienced in working with senior-level managers, businesspeople, and developers across multiple disciplines.
Developed reports to visually explore data and created interactive reports in Tableau and Power BI.
Used UNIX shell scripts for automating tasks for BTEQ and other utilities.
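
The AWS ETL work referenced above can be illustrated with a minimal Glue job sketch in Python. This is not a client implementation: the bucket paths, column names, and staging layout below are placeholder assumptions, and the mappings only indicate the kind of column renaming and casting such a job performs.

# Hypothetical AWS Glue job: read a CSV extract from S3, apply simple
# column mappings, and write the result to a staging location.
# Bucket names, paths, and column names are placeholders.
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the daily extract from S3 as a DynamicFrame.
source = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-bucket/daily_extract/"]},
    format="csv",
    format_options={"withHeader": True},
)

# Rename and cast columns to match the staging table layout.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("acct_no", "string", "account_number", "string"),
        ("txn_amt", "string", "transaction_amount", "double"),
        ("txn_dt", "string", "transaction_date", "string"),
    ],
)

# Write the transformed data back to S3 in Parquet for downstream loading.
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/staging/"},
    format="parquet",
)
job.commit()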


TECHNICAL SKILLS:

Operating Systems: Windows, UNIX, MS-DOS
ETL Tools: Informatica PowerCenter 10.x/9.x/8.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager, Workflow Designer, and Informatica Services), Informatica PowerExchange, Informatica MDM 9.5, IDE 9.5.1, IDQ 9.6.1/9.5.1, Informatica Intelligent Cloud Services (IICS), IBM Mainframes
Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005, DB2 v8.1, Teradata, Vertica, Cloud (Palantir)
Big Data Technologies: Apache Hadoop, Spark
Data Modeling Tools: Erwin, MS Visio
BI Reporting Tools: Cognos, Tableau, Power BI, and Looker
Languages: SQL, PL/SQL, UNIX shell scripts, C++, Python
Cloud: AWS
Scheduling Tools: Autosys, Control-M, TWS (Tivoli Workload Scheduler)

PROFESSIONAL EXPERIENCE:
PROJECT NAME: Symphony SD1
Jan 2021 – Present
ETL Developer
Location: Dallas, TX
Implementation partner: Amdocs

MUFG Union Bank (US Bank)

Processing of the Commercial Credit Card (CCC) products' daily General Ledger and certain Settlement transactions. These transactions are delivered by the TSYS/CCX team to the Accounting team and loaded into the existing data staging table. The data is extracted from the staging table and transformed to the standard GBnn file format required by OVS. The prepared GBnn file is then delivered to OVS via SFTP. OVS processes these transactions and returns the GL transactions to the Oracle GL via the existing interface. The CCC GL transactions are included in the existing GL transaction files delivered daily to the Oracle GL.

Responsibilities:

Involved in understanding the requirements of the end users/business analysts and developed strategies for ETL processes.
Developed ETL procedure strategies and worked with business and data validation groups to provide assistance and guidance for system analysis, data integrity analysis, and data validation activities.
Developed mappings, reusable objects, transformations, and mapplets using the Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter.
Designed and developed ETL Mappings to extract data from flat files, MS Excel and Oracle to load the data into the target database.
Involved in Scrum meetings and Scrum calls.
Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
Developed Informatica mappings for complex business requirements using different transformations such as Normalizer, SQL Transformation, Expression, Aggregator, Joiner, Lookup, Sorter, Filter, and Router.
Involved in developing several complex mappings using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets, and parameter files in the Mapping Designer.
Worked with Informatica Data Quality (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ. Extensively worked on Informatica IDE/IDQ.
Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
Worked on IDQ tools for data profiling, data enrichment and standardization.
Used IDQ's standardized plans for address and name clean-ups.
Worked on IDQ file configuration on users' machines and resolved the issues.
Used IDQ to complete initial data profiling and removing duplicate data.
Worked on integrations using Informatica Cloud Data Integration (IICS-CDI Service)
Designed, Developed and Implemented ETL Processes using IICS Data Integration.
Created IICS connections using various cloud connections in IICS administrator.
Installed and configured windows secure agent register with IICS org.
Applied performance tuning techniques while loading data into cloud and on-premises platforms using IICS.
Deployed new MDM Hub for portals in conjunction with user interface on IDD application.
Strong background in creating Base objects, Staging tables, foreign key relationships, Mappings, lookups, queries, packages, query groups and custom cleanse functions in Informatica MDM.
Configured match rule set property by enabling search by rules in MDM according to Business Rules.
Worked in developing Mapplets and Re-usable Transformations for reusability and reducing effort.
Involved in the data analysis for source and target systems and good understanding of Data Warehousing concepts, staging tables, Dimensions, Facts and Star Schema, Snowflake Schema.
Involved in migration of large amount of data from OLTP to OLAP by using ETL Packages.
Used ETL to load data using PowerCenter/Power Connect from source systems like Flat Files and Excel Files into staging tables and load the data into the target database.
Developed complex mappings using multiple sources and targets in different databases, flat files and loading them into Teradata.
Used the QuerySurge test automation tool to document unit testing results.
Worked with Python scripts to automate monthly validation reports for the business (a minimal sketch follows this project).
Completed a POC to move legacy data to AWS Redshift and S3 buckets.
Developed complex SQL queries to develop the Interfaces to extract the data in regular intervals to meet the business requirements and extensively used Teradata.
Involved in performance tuning of the Informatica ETL mappings by using the caches and overriding the SQL queries and by using Parameter files.
Used various transformations like unconnected/connected Lookup, Aggregator, Expression, Joiner, Sequence Generator, Router, etc.
Responsible for the development of Informatica mappings and tuning for better performance.
Worked with UNIX scripts for automation of ETL Jobs using Autosys Scheduler and Involved in migration/conversion of ETL processes from development to production environment.
Created transformations like Expression, Lookup, Joiner, Rank, Update Strategy and Source Qualifier transformation using the Informatica designer.
Created mapplet and used them in different mappings.
Written PL/SQL Procedures and functions and involved in change data capture (CDC) ETL process.
Environment: Informatica PowerCenter, IDQ, Teradata, Oracle, MySQL, flat files, SQL Assistant, Autosys, PL/SQL, Erwin, UNIX shell scripting, Agile and Windows.
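
A minimal sketch of the Python-based monthly validation report referenced above, using pandas. File names and the transaction_amount column are hypothetical; it only illustrates the row-count and totals checks such a report typically covers.

# Hypothetical monthly validation report: compare row counts and amount
# totals between a source extract and the warehouse extract, then write
# a summary CSV for the business. File names and columns are placeholders.
import pandas as pd
from datetime import date

source = pd.read_csv("source_extract.csv")      # e.g. staging-table dump
target = pd.read_csv("warehouse_extract.csv")   # e.g. target-table dump

checks = {
    "source_rows": len(source),
    "target_rows": len(target),
    "source_amount_total": source["transaction_amount"].sum(),
    "target_amount_total": target["transaction_amount"].sum(),
}
checks["row_count_match"] = checks["source_rows"] == checks["target_rows"]
checks["amount_match"] = round(
    checks["source_amount_total"] - checks["target_amount_total"], 2
) == 0

# One-row summary, stamped with the run date, written out for distribution.
report = pd.DataFrame([checks])
report.insert(0, "run_date", date.today().isoformat())
report.to_csv(f"validation_report_{date.today():%Y_%m}.csv", index=False)
print(report.to_string(index=False))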


PROJECT NAME: AdvantEdge Analytics Group

Sep 2017 to Dec 2020
Informatica Developer
Location: Chicago, IL
Implementation partner: AtosSyntel


CUNA Mutual Group

AdvantEdge Analytics is an analytics fintech built and supported by the bright minds and deep resources of CUNA Mutual Group, one of the most experienced and influential partners in the credit union movement. As collaborators passionate about evolving the credit union business, the AdvantEdge Analytics teams co-create alongside credit union customers to accelerate their digital transformation journey. Analytics Consulting Services and cloud-native Analytics Platform solutions are configured to meet credit unions where they are in their journey, enabling them to deepen member relationships and deliver speed-to-value.

Responsibilities:
Studied the software requirements specifications, gathered the business requirements, and analyzed them from a functional perspective
Actively participated in sessions to review and document business processes and to produce high-quality deliverables
Team oriented role focused on supporting the buy-side capital markets group's clients wherever necessary
Involved in Scrum meetings and Scrum calls.
Converted legacy data from flat files to an Oracle database using SQL*Loader
Speak directly with customers about their trading strategy, potential market impact, live market color, counterparty feedback and execution guidance
Work closely with the broader global capital markets & strategy team to leverage resources
Help manage relationships and maintain protocol with our sell side capital markets sales partners
Used Exception Handling extensively for the ease of debugging and displaying the error messages in the application
Performance tuning of Informatica PowerCenter ETL jobs and Oracle/Netezza SQL queries
Developed/modified PL/SQL procedures and functions to enhance the reusability of code used later in various applications
Used the Infoworks (ingestion and analytics) tool to ingest data from sources such as Teradata, Oracle, SQL Server, DB2, and flat files, and built pipelines and workflows to transform the data and schedule it
Used Informatica PowerCenter tools to extract data from SFDC sources
Worked on Informatica PowerCenter tools: Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets & Transformations), Repository Manager, Workflow Manager, and Workflow Monitor
Built datasets and tables in BigQuery and loaded data from Cloud Storage
Converted and modified Hive queries for use in BigQuery and performed data cleaning on unstructured data using various tools
Created Airflow scheduling scripts using Python (a minimal DAG sketch follows this project)
Integrated Apache Airflow and wrote scripts to automate workflows in AWS Data Pipeline.
Expertise in scheduling Informatica jobs using the Informatica scheduler, Autosys, Airflow DAGs, and DAC schedulers.
Created DAGs for Spark jobs, determining batch dependencies and job triggers, using Airflow as the batch job automation tool.
Wrote shell scripts to run SQL jobs in the background and integrate data from flat files
Provided effective support in delivering process and product change improvement solutions
Ensured acceptable performance of the data warehouse processes by monitoring, researching and identifying the root causes of bottlenecks
Tableau Workbook design with filters, parameters, quick filters, sorting, groups, and hierarchies.
Shared reports/dashboards by creating content packs and created reports to view in Power BI Mobile.
Created dashboards, filter actions, URL actions, and drill-downs
Published common Tableau reports on Tableau server under appropriate projects.
High level testing of Reports and Dashboards.
Environment: Oracle 11g/10g, PL/SQL, SQL*Plus, Informatica PowerCenter 8.6, EDC, BigQuery, Tableau, TOAD 9.5 for Oracle, Erwin, UNIX shell scripting
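
A minimal sketch of the Airflow scheduling work referenced above, written as a small Python DAG. The DAG id, bucket, dataset, table, and schedule are hypothetical, and the GCSToBigQueryOperator import assumes the Airflow Google provider package; this illustrates the pattern rather than the project's actual DAG.

# Hypothetical Airflow DAG: load daily CSV files from Cloud Storage into a
# BigQuery staging table on a nightly schedule. Project, bucket, dataset, and
# table names are placeholders, not the actual client configuration.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="gcs_to_bigquery_staging",
    start_date=datetime(2020, 1, 1),
    schedule_interval="0 2 * * *",   # run nightly at 02:00
    catchup=False,
) as dag:
    # Single task: truncate-and-load the staging table from the landing bucket.
    load_staging = GCSToBigQueryOperator(
        task_id="load_daily_extract",
        bucket="example-landing-bucket",
        source_objects=["daily/extract_*.csv"],
        destination_project_dataset_table="example-project.staging.daily_extract",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )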

PROJECT NAME: Moody's

Sep 2016 – Aug 2017
ETL Informatica Developer
Location: Pune
Implementation partner: AtosSyntel


Moody's Investors Service
Moody's Corporation is the parent company of Moody's Investors Service, a leading provider of credit ratings, research, and analysis covering debt instruments and securities in the global capital markets, and Moody's KMV, a credit risk management technology firm serving the world's largest financial institutions. This project mainly deals with customer credit rating data. The data is fed from the KMV application and loaded into the integrated database.
Responsibilities:
Responsible for the MLQM (Member Life Quality Measures) and Quality Score Card Project using ETL process, Design and Development
Data loading from Flat Excel sources to the DA Layer in Oracle Data mart using Informatica 9.6.1 version
Loaded the Lab and Biometric Data for each month release into the relational tables in the DA layer
Developed the Mappings to Load the data from Oracle Data mart to the SBI (Supplemental Business Intelligence) which is in SQL server and Scheduled the sessions to run for each month release
Data staging in the MLQM Stage Database (SQL server) by building the new ETL to load the data into that schema in MLQM. Created the mappings and used the business logic to load the data into the tables according to the Client's requirements
Exported the calculated quality measures custom data to flat files (Excel) using Informatica and reported to the Provider portal using SSRS
Created ETL jobs to run the Workflows to move the data from Oracle to SQL server and SQL server to Flat files
Worked with Reflection FTP Client to transfer the source files into the Client's server and to define the source path at Source File Directory
Worked in a UNIX environment to start and run the sessions in Informatica
Created shell scripts in UNIX to run the ETL jobs (a Python analogue using pmcmd is sketched after this project)
Used various transformations like Expression, Aggregator, Sequence Generator, Update Strategy, Joiner, SCD Type 1 and Type 2, Filter, Normalizer, and Lookup (connected & unconnected) based on different conditions; worked in the Transformation Designer to design new transformations and reuse existing ones during development
Generated complex SQL queries and overrides in the Source Qualifier and Lookup conditions wherever needed
Used joins via Joiner transformations to join different target tables, and user-defined joins in the Source Qualifier to join different sources such as flat files (.csv, Excel, XML) and relational sources
Used lookup conditions and joined tables having primary key and foreign key relationships using both cached and uncached lookups
Analyzed the data flow and worked on the session log files to resolve issues and errors by modifying the mappings and mapplets in the Mapping Designer and by checking the ODBC connection strings to make the workflows run successfully
Involved in the entire project lifecycle from analysis, planning, configuration, development and Reporting
Responsible for creating Workflows and sessions using Informatica workflow manager and monitor the workflow run and statistic properties on Informatica Workflow Monitor
Involved in writing stored procedures in Oracle SQL Developer and Microsoft SQL Server
Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and packages Oracle
Developed tabular queries for efficient report analysis using PIVOT/UNPIVOT in T-SQL.
Worked with SSRS to report the MLQM custom measures data to the Provider portal for each month's release deployment and was involved in production management
Developed SAP Business Objects Web Intelligence reports making use of several Queries (Union, Intersection, Minus)
Designed Business Objects Universe based on the XI Repository and developed Business Objects reports and Crystal Reports
Participating in Daily Status Check-point team meetings, Development meetings with Lead and conducting internal and external reviews as well as formal walk through among various teams and documenting the proceedings
Involved in regular discussions with the Facets team to enter test data
Weekly DRB meetings with the testing team and fixing identified problems in existing production data and developed one-time scripts to correct them
Excellent interpersonal and communication skills; experienced in working with senior-level managers, businesspeople, and developers across multiple disciplines


Environment: Informatica PowerCenter 9.6.1, Oracle 11g/12c, Microsoft SQL Server Management Studio 2012, T-SQL, SSRS, SSIS, Facets 4.7.1, TOAD 10, SQL*Plus, Putty, Reflection FTP Client, Mainframe 8210, Business Objects 4.1, HP Client, Microsoft Visio, MS Office, UNIX, Windows 10
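
The UNIX shell scripts referenced above started Informatica workflows through pmcmd; below is a rough Python analogue of that wrapper. The integration service, domain, folder, and workflow names are placeholders, and the password is shown inline only for brevity; in practice the same call was made from shell scripts scheduled on the UNIX host.

# Illustrative Python equivalent of a shell script that kicks off an
# Informatica workflow with pmcmd. All names below are placeholders.
import subprocess
import sys

def start_workflow(folder: str, workflow: str) -> int:
    """Start an Informatica workflow via pmcmd and wait for it to finish."""
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "INT_SVC_DEV",      # integration service (placeholder)
        "-d", "Domain_Dev",        # domain (placeholder)
        "-u", "etl_user",
        "-p", "etl_password",      # placeholder; use pmpasswd or an env var in practice
        "-f", folder,
        "-wait", workflow,
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
    return result.returncode

if __name__ == "__main__":
    # Hypothetical folder and workflow names.
    sys.exit(start_workflow("MLQM_STAGE", "wf_load_mlqm_stage"))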



PROJECT NAME: DTNA

Jun 2012 – Aug 2016
ETL Informatica Developer
Location: Pune
Implementation partner: AtosSyntel

Client: DTNA
Daimler Trucks North America is a leading automobile manufacturer offering highly customized trucks to clients.
This project involved building an advanced analytical platform for DTNA. It is used as a data lake for a better understanding of the business and facilitates advanced analytics.

Responsibilities:
Interacting with business analysts and on-site coordinators for requirement analysis
Worked as ETL Developer involved in the development of ETL process using Informatica to load data from source EBS database into the target Oracle database
Designing and developing mappings to extract data from diverse sources including flat files
Designed and Developed the Sales Order Aggregate to resolve the performance Issues
Successfully completed assigned tracks before deadlines
Analysis, design, development, and integration of mainframe sources with Informatica PowerCenter Designer
Extracted data from the mainframe source fields, performed the required transformations, and loaded the data into XML tags for HP Exstream using the Informatica PowerCenter Designer tool (a rough Python analogue of the XML step is sketched after this project)
Mostly used transformations such as XML Generator, Application Source Qualifier, Normalizer, Expression, Lookup (connected and unconnected), Joiner, Filter, and Sequence Generator.
Created data maps in PowerExchange to pull data from mainframe flat files; performed mainframe and ETL unit testing, provided UAT support, and resolved defects
Scheduling Mainframes jobs using Control-M scheduling tool
Involved in the testing process throughout the project
Involved in debugging data mismatches, fixed them, and prepared an analysis document for the mismatches
Worked on a Production Clone instance for the same project based on the client request and hand over to client in a couple of weeks' time including the data validation
Worked on Migration from DEV environment to QA and Production Environment
Performed unit testing

Environment: Informatica PowerCenter 8.1, DB2, Mainframes, COBOL, Business Objects 6.0, Oracle 10g, SQL*Loader, PL/SQL, UNIX Shell Programming, Linux and Windows NT
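
The mainframe-to-XML flow above was built in PowerCenter with the XML Generator transformation; the following is only a rough Python analogue of that step, with a hypothetical fixed-width record layout, included to illustrate the shape of the transformation rather than the actual implementation.

# Rough Python analogue of the mainframe-to-XML step: slice fixed-width
# records (as PowerExchange would expose them) into fields and emit XML tags
# for downstream document composition. The record layout is hypothetical.
import xml.etree.ElementTree as ET

RECORDS = [
    "0001TRUCK MODEL A      2015",
    "0002TRUCK MODEL B      2016",
]

def parse_record(line: str) -> dict:
    """Slice a fixed-width record into named fields (layout is illustrative)."""
    return {
        "order_id": line[0:4].strip(),
        "description": line[4:24].strip(),
        "model_year": line[24:28].strip(),
    }

root = ET.Element("Orders")
for line in RECORDS:
    fields = parse_record(line)
    order = ET.SubElement(root, "Order", attrib={"id": fields["order_id"]})
    ET.SubElement(order, "Description").text = fields["description"]
    ET.SubElement(order, "ModelYear").text = fields["model_year"]

# The serialized XML would feed the document-composition step described above.
print(ET.tostring(root, encoding="unicode"))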


PROJECT NAME: Kronos ITO

Nov 2010 – May 2012
Location: Pune
BI Developer
Implementation partner: AtosSyntel


KRONOS operates in the sales domain, and its business mainly deals in selling software and hardware products such as Workforce Timekeeper, HR, Payroll, and InTouch terminals; it also maintains educational services and product maintenance. The business processes for these products and services (quoting, ordering, shipping, invoicing, depot, etc.) are handled by Oracle Apps. KRONOS also has other applications such as Amdocs, OpenAir, and SharePoint sites for customer support and other services. All of this data is pushed to the EDW environment by scheduled jobs and served via Cognos reports, giving employees complete details of their customers for further business growth and analysis.

Responsibilities:
Developing reports as per the mockups provided by the client.
Code reviews of junior developers' work.
Developing reports as per business requirements.
Analysis of reports and data validation.
Resolving report issues.
Providing resolutions to reported issues.
Testing report functionality and preparing unit test cases/results.
Migration document preparation.

EDUCATION:
Master of Computer Applications from Acharya Nagarjuna University in 2006.
Bachelor of Commerce from Acharya Nagarjuna University in 2002.


________________________________________________________________
Thanks and regards,
Sai Kiran
Sr Bench Sales Recruiter, Cloud Info Inc.
Direct : 302-206-7225 EXT:107 | Email : [email protected]
LinkedIn : https://www.linkedin.com/in/kiran-rasuri-8009ba190/
