Rajasekhara Reddy Lakku
Sr. ETL / Talend / Power BI Developer
[email protected]
Location: Dallas, Texas, USA
Relocation: Yes
Visa: H1B

PROFESSIONAL SUMMARY

10+ years of professional IT experience with a Data Warehousing and Business Intelligence background in design, development, analysis, implementation, and post-implementation support of DW/BI applications.
Extensive experience in the development, implementation, testing, and support of Data Warehousing and Data Integration solutions using IICS and Informatica PowerCenter.
Extensively worked on ETL mappings and on the analysis and documentation of OLAP report requirements.
Experience in migrating data from other databases to Snowflake (a load sketch follows this summary).
Solid experience and understanding in implementing large-scale data warehousing programs and E2E data integration solutions on Snowflake Cloud, AWS S3, Informatica Intelligent Cloud Services (IICS-CDI), and Informatica PowerCenter integrated with multiple relational databases (Oracle, Microsoft SQL Server, PostgreSQL).
Knowledge of Python programming for data processing and for handling data integration between on-prem and cloud databases or data warehouses.
Experience handling the Slowly Changing Dimension (SCD) technique in an insurance application.
Experience in OLTP/OLAP system study and analysis, developing dimensional models using star schema and snowflake schema techniques.
Good experience with cloud platforms such as Azure, GCP, and AWS.
Great team player with team-leading experience and excellent communication skills.
Very good experience writing complex SQL queries.
Involved in end-to-end development activities: technical design, coding, review, deployment, QA/UAT support, and release documentation.
Created, edited and maintained SQL functionality in SQL Server and Azure SQL Databases.
Exposure to complete Software Development Life Cycle (SDLC).
Tuned Informatica mappings/sessions for better ETL performance by eliminating bottlenecks.
Analyzed workflow, session, event, and error logs for troubleshooting the Informatica ETL process.
Designed and developed the UNIX shell scripts for the automation of ETL jobs.
Worked in a fully Agile methodology and participated in status calls and Scrum calls.
Experience in Microsoft Power BI reports and dashboards, distributing them to end clients for leadership-level business decisions.
Proactive on production issues, punctual in meeting deadlines, and consistently follows a First Time Right (FTR) and On Time Delivery (OTD) approach.
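
Illustrative of the Snowflake migration bullet above, a minimal SQL sketch of an S3-to-Snowflake bulk load; the stage, file format, bucket, and table names are hypothetical, not taken from any specific engagement.

-- All object names below are hypothetical; shown only to illustrate the load pattern.
CREATE OR REPLACE FILE FORMAT csv_fmt
  TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1;

CREATE OR REPLACE STAGE policy_stage
  URL = 's3://example-bucket/policies/'
  CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...')
  FILE_FORMAT = (FORMAT_NAME = csv_fmt);

-- Bulk-load all staged files into the target table.
COPY INTO dw.policy_raw
  FROM @policy_stage
  ON_ERROR = 'ABORT_STATEMENT';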

TECHNICAL SKILLS

Methodologies: Agile (Scrum, Kanban)
Databases: Microsoft SQL Server (2005, 2018), Oracle (11g, 12c, 19c, 21c), PostgreSQL (6.0)
Cloud: AWS (S3), Snowflake, SnowSQL, Snowpipe
ETL Tools: IICS, Informatica PowerCenter 10.4/10.2/9.6.1
Reporting Tools: Tableau, Power BI
Applications: Microsoft Office 2016 (Excel, Outlook, PowerPoint, Visio, Word, SharePoint)
Scheduling: TWS, Autosys
Scripts: Advanced SQL, Python, PL/SQL, Unix Shell scripting
Code Management Process/Tools: GitLab, GitHub, Bitbucket
Project Management Process: JIRA, Confluence
Design: Microsoft Visio
Operating Systems: Windows, Linux

Education: Bachelor's in Computer Science, JNT University, 2012


PROJECT EXPERIENCE

Client: Mphasis, Dallas, TX (Nov 1, 2022 - Present)
Role: Sr. ETL Developer

Responsibilities:
As an ETL Developer, liaised with SMEs and business users to understand the business requirements, clarify open gaps, and gather the necessary information about the Retirement Readiness Mart (RRM) application.
Proactively participated in daily Scrum calls and status meetings with the RRM project team and client managers.
Engaged in planning, analysis, design, coding, and unit testing for ETL activities from different source feeds into target systems.
Responsible for designing and developing mappings, mapplets, sessions, and workflows to load data from source to staging and from staging to target.
Extracted data from Oracle, Excel, and JSON files using Informatica mappings and SQL/PL-SQL scripts.
Developed UNIX scripts for event waits, file waits, and pre-session tasks.
Loaded data from disparate non-Hadoop sources, both structured and unstructured.
Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems, including loading nested JSON-formatted data into Snowflake tables.
Designed, Developed, and Implemented ETL processes using IICS Data integration.
Created IICS connections using various cloud connectors in IICS administrator.
Installed and configured the Windows Secure Agent and registered it with the IICS org.
Experience working with ETL/ELT tools such as Informatica and Matillion for data loads.
Extensively used performance-tuning techniques while loading data into Snowflake using IICS.
Developed complex Informatica Cloud task flows (parallel) with multiple mapping tasks and task flows.
Developed Mass Ingestion tasks (file ingestion) to ingest large datasets from on-prem into Snowflake.
Staged API and Kafka data (in JSON format) into Snowflake, flattening it for the different functional services (see the flatten sketch after this list).
Configured EMR clusters on AWS and used them for running Spark jobs.
Implemented data pipelines to process batch data by integrating AWS S3, Hive, and AWS Redshift.
Experience building Snowpipe and using Snowflake Clone and Time Travel (see the Snowpipe sketch after this list).
Prepared Unit test cases and technical specifications.
Created new database connections and improved load and database performance. Created and updated existing parameter files on Linux for Informatica workflows.
Created code inventory sheets that present the code in a readable format for all parties (business and L3 support).
Designed complex scheduling frameworks and processes for larger applications using the Tivoli scheduler.
Provided on-call support for production job failures and led the effort, working with various teams, to resolve issues.
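
A minimal Snowflake SQL sketch of the JSON flattening pattern referenced above; the table, column, and JSON path names are assumptions for illustration only.

-- Hypothetical staging table holding raw JSON in a VARIANT column.
CREATE OR REPLACE TABLE stg.events_raw (payload VARIANT);

-- Produce one row per element of the nested "transactions" array.
SELECT
    e.payload:accountId::STRING   AS account_id,
    t.value:txnId::STRING         AS txn_id,
    t.value:amount::NUMBER(12,2)  AS amount
FROM stg.events_raw e,
     LATERAL FLATTEN(INPUT => e.payload:transactions) t;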
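
And a hedged sketch of the Snowpipe, Clone, and Time Travel features mentioned above; the pipe, stage, and table names are invented (the stage is assumed to exist, as in the load sketch in the summary).

-- Hypothetical auto-ingest pipe that loads new S3 files as they land on the stage.
CREATE OR REPLACE PIPE dw.policy_pipe AUTO_INGEST = TRUE AS
  COPY INTO dw.policy_raw FROM @policy_stage;

-- Zero-copy clone for a point-in-time test copy.
CREATE TABLE dw.policy_raw_clone CLONE dw.policy_raw;

-- Time Travel: query the table as it was one hour ago.
SELECT COUNT(*) FROM dw.policy_raw AT(OFFSET => -3600);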

Environment: Informatica 10.4/10.2, IICS, AWS (S3), Snowflake, Linux Ubuntu, EDW, Oracle 21c, Extract, Transform, Load (ETL), SQL, Tivoli Workload Scheduler (TWS), Jira, ServiceNow, Jenkins, GitHub

Client: Accenture, India (Mar 31, 2021 - Oct 29, 2022)
Role: ETL /Talend Developer

Key Responsibilities:
Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
Created mapping documents to outline data flow from sources to targets.
Analyzed the business requirements, designed, and developed the ETL mappings involving complex business rules.
Developed dimensional data models for the data warehouse projects using star schema (a schema sketch follows this list), and developed data mapping and filtering procedures.
Developed mappings, transformations, and mapplets using Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter.
Gathered requirements from the source systems and created processes using Informatica.
Built design specifications for the ETL process flow based on the requirements, using Informatica.
Designed the ETL process and scheduled the stage and mart loads for the data mart.
Designed, developed, and deployed UNIX shell scripts.
Configured and managed Informatica servers and implemented data quality management solutions that handle millions of customer transactions.
Performed systems and data quality assurance and system testing, ensuring that software and systems perform to specification.
Troubleshot production support issues and maintained the triage of support tickets.
Extracted large volumes of data from mainframe systems to a target Oracle database.
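
A minimal star-schema sketch of the dimensional modeling described above; all table and column names are illustrative assumptions.

-- Hypothetical dimension table with a surrogate key.
CREATE TABLE dim_customer (
    customer_key   INTEGER PRIMARY KEY,   -- surrogate key
    customer_id    VARCHAR(20),           -- natural key from the source system
    customer_name  VARCHAR(100)
);

-- Hypothetical fact table referencing the dimension.
CREATE TABLE fact_sales (
    sale_id       INTEGER PRIMARY KEY,
    customer_key  INTEGER REFERENCES dim_customer (customer_key),
    sale_date     DATE,
    amount        NUMERIC(12,2)
);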

Environment: Informatica 9.6.1, Unix, EDW, Oracle (10.7/11/11.5.7/11.5.10.2) Database, Microsoft SQL Server 2018, Extract, Transform, Load (ETL), SQL, Autosys, Jira, Jenkins


Client: CGI Information Systems and Management Consultants Pvt Ltd (Feb 3, 2014 - Mar 30, 2021)
Role: ETL Developer
Key Responsibilities:

Analyzed the requirements for ETL design and development, extracting data stored in different sources such as Oracle, SQL Server, and DB2.
Participated in designing and developing an environment that would facilitate the use of Informatica to transform data coming from OLTP systems into data marts.
Used Agile methodology for the SDLC and utilized Scrum meetings for creative and productive work.
Developed complex Informatica Mappings with transformations like lookup, router, aggregator, expression, update strategy, joiner etc.
Worked with Informatica 9.6.1 to create source/target connections and to monitor and synchronize data in the data warehouse.
Wrote PL/SQL procedures, called from the Stored Procedure transformation, to perform database actions such as truncating the target before load, deleting records based on a condition, and renaming tables (see the PL/SQL sketch after this list).
Extensively worked on transformations such as Source Qualifier, Filter, Joiner, Aggregator, Expression, and Lookup.
Used session logs, workflow logs, and the Debugger to debug sessions and analyze problems associated with the mappings and generic scripts.
Designed and developed complex Informatica mappings including SCD Type 2 (Slowly Changing Dimension Type 2; see the SCD sketch after this list).
Developed UNIX Shell scripts to automate repetitive database processes and maintained shell scripts for data conversion.
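
A minimal PL/SQL sketch of the pre-load procedure pattern referenced above, the kind called from a Stored Procedure transformation; the procedure and table names are assumptions.

-- Hypothetical pre-load step: empty the target table before the Informatica load.
CREATE OR REPLACE PROCEDURE prc_truncate_target AS
BEGIN
  EXECUTE IMMEDIATE 'TRUNCATE TABLE dw_policy_stage';
END prc_truncate_target;
/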
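
And a hedged, Oracle-flavored SQL sketch of the SCD Type 2 pattern (expire the changed current row, then insert the new version); every table and column name here is invented for illustration.

-- Step 1: close out current rows whose tracked attributes changed in the source.
UPDATE dim_customer d
   SET d.eff_end_date = SYSDATE,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND s.customer_name <> d.customer_name);  -- any tracked attribute

-- Step 2: insert a new current version for changed and brand-new customers.
INSERT INTO dim_customer (customer_id, customer_name,
                          eff_start_date, eff_end_date, current_flag)
SELECT s.customer_id, s.customer_name, SYSDATE, NULL, 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');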

Environment: Informatica 9.6.1, Linux Ubuntu, EDW, Oracle 19c Database, Microsoft SQL Server 2008 R2, Extract, Transform, Load (ETL), SQL, Autosys, Jira, Jenkins

Client: CIBC IONS BANK (Jan 2018 - Mar 29, 2021)
Role: SSIS/SSRS Developer
Roles and Responsibilities:

Designed metadata using the Pentaho Metadata Editor, which acts as a semantic layer between reports and the database.
Used custom SQL to enhance the performance of the reports and gained reporting knowledge.
Communicated with onsite teams and senior management on project status.
Extensively worked on designing and debugging advanced SQL queries.
Reviewed and ensured the quality of deliverables.
Worked on improving performance of the product to suit the client environment.
Designed various reports using SSRS Report Designer and SSIS packages based on the requirements.
Involved in the creation, development, and deployment of SSIS packages in SQL Server 2012.
Created SSIS packages with error handling, as well as complex SSIS packages using various data transformations such as Conditional Split, Cache, Foreach Loop, Multicast, and Derived Column.
Experience dealing with Slowly Changing Dimensions (SCD) in SSIS.
Prepared unit test documents for developed reports and packages.
Extracted data from the internal data warehouse system into SSRS.
Designed and developed the stored procedures, queries, and views necessary to support SSRS reports (see the T-SQL sketch after this list).
Worked on Agile methodology.
Held daily Scrum calls with clients and updated the status of current PI items.
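
A minimal T-SQL sketch of the kind of report-backing stored procedure described above; the procedure, table, and parameter names are assumptions, not actual project objects.

-- Hypothetical SSRS dataset procedure; the report passes the date range as parameters.
CREATE PROCEDURE dbo.usp_GetClaimSummary
    @StartDate DATE,
    @EndDate   DATE
AS
BEGIN
    SET NOCOUNT ON;
    SELECT c.ClaimId, c.ClaimDate, c.Status, c.Amount
    FROM   dbo.Claims AS c
    WHERE  c.ClaimDate BETWEEN @StartDate AND @EndDate
    ORDER BY c.ClaimDate;
END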

Environment: MS SQL Server 2014/2016, MS SQL Server Integration Services (SSIS), MS SQL Server Reporting Services (SSRS), MS SQL Server Analysis Services (SSAS), DAX, Power BI, Visual Studio 2014, Agile, T-SQL, SQL Profiler, XML, Team Foundation Server (TFS), MS Excel, Toad, TWS, MS Access, Windows 8

Client: CGI, Hyderabad, India (Nov 2015 - Dec 2017)
Role: SSRS&SSIS Developer

CGI Advantage 360 offers an alternative, out-of-the-box approach to government ERP: a multi-tenant solution with a defined feature set and a prescribed upgrade schedule, delivered as a cloud-based, functionally rich ERP solution for mid-tier local government, faster and cheaper.


Roles and Responsibilities:
Designed metadata using the Pentaho Metadata Editor, which acts as a semantic layer between reports and the database.
Used custom SQL to enhance the performance of the reports and gained reporting knowledge.
Communicated with onsite teams and senior management on project status.
Extensively worked on designing and debugging advanced SQL queries.
Reviewed and ensured the quality of deliverables.
Worked on improving performance of the product to suit the client environment.
Designed various reports using SSRS Report Designer and SSIS packages based on the requirements.
Involved in the creation, development, and deployment of SSIS packages in SQL Server 2012.
Created SSIS packages with error handling, as well as complex SSIS packages using various data transformations such as Conditional Split, Cache, Foreach Loop, Multicast, and Derived Column.
Experience dealing with Slowly Changing Dimensions (SCD) in SSIS.

Environment: MS SQL Server 2014/2016, MS SQL Server Integration Services (SSIS), Pentaho, MS SQL Server Reporting Services (SSRS), MS SQL Server Analysis Services (SSAS), DAX, Power BI, Visual Studio 2014, Agile, T-SQL, SQL Profiler, XML, Team Foundation Server (TFS), MS Excel, Toad, TWS, MS Access, Windows 8



Client: IFS & BKS, PRIMEBASE, Hyderabad, India (Aug 2012 - Oct 2015)
Role: SQL BI (ETL) Developer

Designed, developed, and deployed SQL Server Integration Services (SSIS) packages to validate, extract, transform, and load data from multiple homogeneous and heterogeneous information sources (CSV, DAT, Excel, Oracle DB).
Filtered data from stage to EDW using complex SQL statements in Execute SQL Tasks, and implemented various constraints for data consistency and to preserve data integrity.
Worked with various control flow items like For-Each Loop Container, Script task, Execute SQL task, Send Mail Task, SFTP Task, Execute Package Task as per business requirement.
Extensively used transformations such as Derived Column, Conditional Split, Aggregate, Script Component, Merge Join, and Union All.
Implemented error handling using Event Handlers (OnError) and Configure Error Output in ETL loads.
Created package configurations and variables and used expressions to make the packages dynamic.
Handled Slowly Changing Dimensions (SCD) Type 2 in order to maintain the history of the data.
Performed tuning and optimization of slow running packages by reducing the data flow usage.
Modified and recreated various packages for business needs and added more alerts in case of package failure.
Created indexes to optimize the performance of T-SQL queries and created views to reduce complexity for end users (see the T-SQL sketch after this list).
Created and modified various Stored Procedures using T-SQL.
Scheduled packages to run daily and monthly using SQL Server Agent Jobs.
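
A short T-SQL sketch of the index and view tuning pattern mentioned above; all object names are illustrative assumptions.

-- Hypothetical covering index to speed up a frequent date-range filter.
CREATE NONCLUSTERED INDEX IX_Orders_OrderDate
    ON dbo.Orders (OrderDate)
    INCLUDE (CustomerId, TotalAmount);
GO

-- Hypothetical view that hides join complexity from end users.
CREATE VIEW dbo.vw_OrderSummary AS
SELECT o.OrderId, o.OrderDate, c.CustomerName, o.TotalAmount
FROM   dbo.Orders AS o
JOIN   dbo.Customers AS c ON c.CustomerId = o.CustomerId;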

Environment: MS SQL Server 2012/2014, SSIS, TFS, Microsoft Visual Studio.