
Raja Kondaveeti - ETL / DataStage Developer / Product Support Owner
404-508-5797 | [email protected]
Location: Chester, Pennsylvania, USA
Relocation: Yes
Visa: H1B
Summary:
10+ years of professional experience in developing, testing, and maintaining enterprise applications, including experience as a test lead.
Experience in testing data warehousing applications using Informatica PowerCenter versions 8.x and 10, and Hadoop HDFS systems.
Experience in testing strategies for Extraction, Transformation and Loading (ETL) mechanisms using Informatica PowerCenter.
Experience working in Agile methodology and related activities.
Experience in functional testing and regression testing.
Experience with reverse analysis of Informatica mappings, sessions, and workflows.
Expertise in writing SQL, PL/SQL, and ANSI SQL queries, including joins, views, stored procedures, and functions, using Oracle and Teradata.
Coordinate with all stakeholders to obtain test data.
Good knowledge of how data flows from source to target via intermediate tables in data mappings.
Experience with data models and transformation logic.
Knowledge of dimensional modelling (star schema, snowflake schema).
Good knowledge of Power BI, Snowflake, Control-M, and ServiceNow.
Experience with UNIX/Linux commands.
Positive team member, capable of taking initiative and working under stringent timelines.
Proven ability to quickly learn new technologies and apply them to business solutions.
Technical Skills:
Languages: SQL, PL/SQL, Hive, Python
Methodology: Agile Methodology
Operating Systems: Windows, Linux
Tools: Informatica 10, SQL Developer, JIRA, Control-M, Data Analytics Studio, Confluence, PuTTY
Prototyping Tools: MS Visio, Figma
Databases and Tools: MS SQL, MySQL, Cassandra, MongoDB, MariaDB, Teradata
Frameworks: Hadoop, Spark, Microsoft Azure
Version Control: GitHub, Bitbucket



Project Details:
Cloud Data Networks Feb 24 to present
Project Title: Edukare
Role: ETL / DataStage Developer
Responsibilities:
Create staging tables and grant the necessary permissions on them.
Load data into the staging tables from different sources such as Excel files and various databases (a minimal load sketch follows this list).
Configured clustered, distributed, and scalable parallel environments and updated the data within the repositories, data marts, and data warehouse.
Ensure that data flows correctly from source tables to target tables.
Work closely with the platform development team to coordinate all activities.
Provide support to customers on issues relating to the storage, handling, and access of data.
Prepare valid test scenarios, gather the required stubs, and write unit test cases to prove that the business logic works as expected.
Participate in the design and development of the data warehouse, as well as user creation and SOP documentation.
Work on the requirements, design, and integration of software components with a variety of stakeholders, including functional specifications and thorough use cases for all development initiatives.
Create test plans and ensure that all test scenarios are executed comprehensively for a quality product outcome.
Actively participate in daily meetings and report any errors that occurred during development.
Create and schedule job flows via Control-M.
Create, update, and maintain job schedules through Control-M calendars, and troubleshoot Control-M jobs and batch flows.
Ensured proper handling of job flows for automation of manual jobs and schedulers, and monitored job flows and job failures.
Worked very closely with all business units and assisted with script debugging to ensure problems were resolved with minimal impact.
Support the team by analyzing failures with the functional and development teams to resolve issues.
Test jobs in the lower test environments and deploy them to production.
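
A minimal sketch of the Excel-to-staging load described above, in Python with pandas and SQLAlchemy; the connection string, file name, table name, and role are illustrative assumptions, not the project's actual values.

```python
# Hypothetical Excel-to-staging load; all names below are placeholders.
import pandas as pd
from sqlalchemy import create_engine, text

# Assumed connection string; the real project used its own credentials/DSN.
engine = create_engine("postgresql://etl_user:secret@dbhost:5432/warehouse")

# Read one source extract from Excel; other sources would use read_sql/read_csv.
df = pd.read_excel("enrollments.xlsx")

# Land the rows in a staging table, replacing any previous run's contents.
df.to_sql("stg_enrollments", engine, if_exists="replace", index=False)

# Grant read access to downstream consumers (GRANT syntax varies by database).
with engine.begin() as conn:
    conn.execute(text("GRANT SELECT ON stg_enrollments TO reporting_role"))
```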

Environment: MongoDB, ANSI SQL, NoSQL, AWS, Azure, front end (HTML, CSS, and JavaScript), back end (Node.js and Spring), Linux, OAuth payment gateways, DataStage, Control-M 9.0, Windows.
Bank of America Oct 21 to Feb 24
Project Title: Promotional Ops
Role: ETL and Hadoop Senior L2/L3 Production Support Analyst
Responsibilities:
Ensure timely logging of issues in the SOR and engage in their resolution.
Assess the potential impact to the business and SLA, including downstream impacts and financial and reputational loss.
Escalate early to off-shore leads and on-shore teams if there are any potential impacts.
Keep off-shore and on-shore teams informed by email and share updates throughout the triage and resolution process.
Initiate timely command center communication if required, troubleshoot DataStage job failures, and provide root cause analysis (a triage sketch follows this list).
Collaborate and communicate with developers and SMEs when impacts occur.
Consider the worst-case scenario while assessing the issue and its impact.
Supported production DataStage jobs as needed and developed DataStage tables to load data into the data warehouse.
Maintain documentation for all developed jobs, circulate it to all stakeholders, and update the Confluence pages.
Keep a log of critical events in the chain for post-problem review.
Take ownership and ensure periodic communication to all stakeholders, including upstream and downstream applications and business owners.
Own the end-to-end process by communicating resolution steps via email whenever a known issue occurs.
Monitor all job flows and schedulers in the Control-M tool and provide support when ongoing issues arise.
Track all issue incidents in the BMC Remedy tool.
Provide walkthroughs and demos after completion of test results.
Capture all unresolved issues in the Jira tool and assign them back to the developer team for resolution.
Take care of CRQs and sign off along with the developers.
Provide sign-off ensuring that all test scenarios have been covered.
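
One way such a triage step could look in Python with PyHive: after re-running a failed load, compare staging and warehouse row counts for the affected load date. The host, schema, table, and partition column names are assumptions for illustration only.

```python
# Hypothetical post-reload check; host, tables, and columns are placeholders.
from pyhive import hive

conn = hive.connect(host="hive-gateway.example.com", port=10000, username="support")
cur = conn.cursor()

def row_count(table: str, load_date: str) -> int:
    # Count rows for one load date (partition column name is illustrative).
    cur.execute(f"SELECT COUNT(*) FROM {table} WHERE load_dt = %s", (load_date,))
    return cur.fetchone()[0]

src = row_count("stg.promotions", "2023-06-30")
tgt = row_count("dw.promotions", "2023-06-30")
print(f"staging={src} warehouse={tgt} delta={src - tgt}")
# A non-zero delta would go into the incident ticket before sign-off.
```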

Environment: Hive, SQL, RDBMS/PLSQL, Teradata, Informatica 10, Jira, BMC Remedy, DataStage, Control-M 9.0, Data Analytics Studio, DeProdX, AutoSys, Apache Atlas, Confluence, PuTTY, Hadoop Big Data, Spark, Microsoft Azure DevOps, Windows.

Westpac Jan 20 to Sept 21
Project Title: Program Shield
Role: DevOps Developer
Responsibilities:
Actively participate in requirements-gathering meetings to create a Python Spark framework that brings data from Teradata to Hive (an illustrative sketch follows this list).
Design and develop data frames using Python on Spark.
Identify, design, and implement internal process improvements.
Build the infrastructure required for optimal extraction, transformation, and loading (ETL) of data from a wide variety of sources such as Teradata, SQL Server, and Oracle, using Spark, Python, Hive, and other big data technologies.
Prepare metadata sheets to drive data processing through the Azure data framework.
Continuous integration experience deploying code using GitHub.
Create pull requests covering the entire flow of code changes, ask for input or make suggestions where changes are required, then merge the code and sign off in one place via GitHub.
Deploy code through the Azure deployment process.
Take care of code migration to the different environments.
Coordinate with the test team on any errors faced during UAT, functional, and production testing.
Create and run Control-M job schedules, adding multiple servers and predecessor/successor dependencies.
Provide quality control for job creation and review of changes.
Build and validate that job flows run correctly, including calendar requirements and scheduling criteria, using the Control-M scheduler.
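
An illustrative PySpark sketch of the Teradata-to-Hive ingestion pattern described above; the JDBC URL, driver class, credentials, and table names are placeholder assumptions, since the real framework was driven from metadata sheets.

```python
# Hypothetical Teradata-to-Hive pull; connection details are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("teradata_to_hive")
    .enableHiveSupport()
    .getOrCreate()
)

# Pull one source table from Teradata over JDBC (requires the Teradata JDBC jar).
src_df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:teradata://td-host/DATABASE=sales")
    .option("driver", "com.teradata.jdbc.TeraDriver")
    .option("dbtable", "sales.transactions")
    .option("user", "etl_user")
    .option("password", "secret")
    .load()
)

# Write to a managed Hive table; the metadata sheets would supply these names.
src_df.write.mode("overwrite").saveAsTable("staging.transactions")
```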
Environment: Hive, Teradata, Informatica 10, Jira, Ambari, Control-M, Data Analytics Studio, Apache Atlas, Confluence, PuTTY, Big Data, Hadoop, Spark, Microsoft Azure, ServiceNow, Windows.


Aviva UK Ltd Jan 18 to Dec 19
Project Title: Cyclops
Role: ETL Developer & Test Analyst
Responsibilities:

Involved in ETL development and analysis, database design, code development, implementation, and end-to-end data testing.
As a tester, ensure that data flows to the target tables as per the business logic implementation.
Fix run issues and perform maintenance in live system support.
Ensure that NULL values are not coming through the indexed columns (a validation sketch follows this list).
Ensure that no duplicate data is loaded in the end-to-end system.
Prepare test scenarios and queries and execute them with on-time delivery.
Ensure that the reconciliation data matches the source system.
Attended daily/weekly meetings with developers, service owners, and stakeholders to discuss upcoming releases, sprint planning, and any blockers.
Developed defect summary and test summary reports once testing was done.
Provided test scenario evidence walkthroughs to business and tech teams, along with the client, as part of the sign-off criteria and business scenarios.
Participate in sprint planning, look-ahead, and retrospective meetings.
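
A hedged sketch of how the NULL, duplicate, and reconciliation checks above might be scripted in Python against Teradata; the teradatasql driver usage is standard, but the host, credentials, tables, and columns are illustrative assumptions.

```python
# Hypothetical data-quality checks; tables and columns are placeholders.
import teradatasql

checks = {
    # Key columns should never arrive as NULL in the target.
    "null_keys": "SELECT COUNT(*) FROM dw.policy WHERE policy_id IS NULL",
    # The natural key must stay unique end to end.
    "duplicates": (
        "SELECT COUNT(*) FROM ("
        "  SELECT policy_id FROM dw.policy"
        "  GROUP BY policy_id HAVING COUNT(*) > 1"
        ") d"
    ),
    # Target totals should reconcile with the source extract.
    "recon_delta": (
        "SELECT s.total - t.total"
        " FROM (SELECT SUM(premium) AS total FROM stg.policy) s"
        " CROSS JOIN (SELECT SUM(premium) AS total FROM dw.policy) t"
    ),
}

with teradatasql.connect(host="tdprod", user="tester", password="secret") as conn:
    cur = conn.cursor()
    for name, sql in checks.items():
        cur.execute(sql)
        print(name, cur.fetchone()[0])  # every check should report 0
```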
Environment: Teradata, Informatica 8, UNIX, SQL, PLSQL, Control-M 7.0


Aviva UK Ltd Jun 16 to Dec 17
Project Title: EDW
Role: ETL Developer
Responsibilities:
Mainly involved in ETL development and analysis, database design, code development, implementation, and end-to-end data testing.
Ran daily, weekly, and monthly load jobs as a load runner.
Worked on Informatica Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
Fixed run issues while supporting jobs at the system level.
Check that the data is loaded as expected from source to target tables via intermediate tables.
Send the EOD snapshot to the stakeholders (a minimal sketch follows this list).
Ensure that no duplicate data is loaded as part of the end-to-end system.
Prepare documents to provide knowledge transfer to whoever onboards to the team.
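
A minimal sketch, using only Python's standard library, of what the EOD snapshot mail above could look like; the SMTP host, addresses, and counts are illustrative assumptions.

```python
# Hypothetical end-of-day load snapshot mail; host and addresses are placeholders.
import smtplib
from email.message import EmailMessage

def send_eod_snapshot(loaded_rows: int, failed_jobs: int) -> None:
    msg = EmailMessage()
    msg["Subject"] = "EDW EOD load snapshot"
    msg["From"] = "etl-support@example.com"
    msg["To"] = "edw-stakeholders@example.com"
    msg.set_content(
        f"Rows loaded today: {loaded_rows}\n"
        f"Failed jobs: {failed_jobs}\n"
    )
    with smtplib.SMTP("mailhost.example.com") as smtp:
        smtp.send_message(msg)

send_eod_snapshot(loaded_rows=1_250_000, failed_jobs=0)
```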

Environment: Teradata, Oracle 10g, Informatica 8, SQL, PLSQL, Windows.

SEI Investments Dec 15 to May 16
Project Title: SEI Global Wealth Platform
Role: ETL Developer
Responsibilities:
Involved in ETL workflow development.
Worked on Informatica Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
Involved in the development of mappings from Source to Staging and then to the Warehouse Database.
Developed data mappings between source systems and warehouse components using Mapping Designer.
Used most of the transformations, such as Source Qualifier, Aggregator, Expression, Lookup, Union, Sequence Generator, and Update Strategy.
Used Informatica Workflow Manager to create workflows, worklets, and sessions to run with the logic embedded in the mappings.
Extensively used ETL to load data from flat files, covering both fixed-width and delimited formats, as well as from relational databases (a parsing sketch follows this list).
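
For illustration, a pandas sketch of reading the two flat-file layouts mentioned above before staging; the file names, column names, and field widths are assumptions, since the real layouts came from the file specifications.

```python
# Hypothetical flat-file reads; file names, columns, and widths are placeholders.
import pandas as pd

# Fixed-width extract: each field occupies a fixed byte range per the file spec.
fixed = pd.read_fwf(
    "accounts_fixed.dat",
    widths=[10, 30, 12],  # account_id, account_name, balance
    names=["account_id", "account_name", "balance"],
)

# Pipe-delimited extract from another upstream system.
delimited = pd.read_csv(
    "accounts_delim.txt",
    sep="|",
    names=["account_id", "account_name", "balance"],
)

# Union both sources before handing off to the warehouse load.
combined = pd.concat([fixed, delimited], ignore_index=True)
print(combined.head())
```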
Environment: Informatica 8, Oracle 10g, SQL, PLSQL, LINUX, Windows, Control-M 7.0

Nike Inc. Jul 13 to Nov 15
Project Title: Retail Shoe mart
Role: ETL Tester
Responsibilities:
Provided L1 support to the team using the specifications gathered from the business analysis teams.
Validate the target data for accurate loading into target databases (a validation sketch follows this list).
Tested the scripts and queries according to the requirements.
Prepare test scenarios and test scripts for functional testing.
Coordinate with the solution designer to understand the process flow.
Prepare the test plan.
Raise and update defects in the JIRA tool.
Participate in daily calls and follow up as needed.
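
One hypothetical form such a target-validation script could take against Oracle with cx_Oracle; the connection string, table names, and the uppercase-name mapping rule are invented for illustration.

```python
# Hypothetical source-to-target rule check; all names below are placeholders.
import cx_Oracle

conn = cx_Oracle.connect("tester/secret@oracle-host/ORCL")
cur = conn.cursor()

# Rows whose target value disagrees with the mapped source value should be zero.
cur.execute("""
    SELECT COUNT(*)
    FROM src_customers s
    JOIN tgt_customers t ON t.customer_id = s.customer_id
    WHERE t.customer_name <> UPPER(TRIM(s.customer_name))
""")
mismatches = cur.fetchone()[0]
assert mismatches == 0, f"{mismatches} rows failed the name mapping rule"
```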
Environment: Oracle 10g, Informatica 8, OBIEE, SQL, PLSQL, Windows.

Professional Experience:
Working as an IT Analyst with Cloud Data Networks LLC from Feb 2024 to present.
Worked as an IT Analyst with TATA Consultancy Services, Bangalore, from Dec 2015 to Feb 2024.
Worked as a Software Engineer at Delta Outsourcing Solutions (P) Ltd, Bangalore, from July 2013 to Nov 2015.
Worked as an Assistant Manager at ROC UK Ltd, UK, from Jan 2010 to March 2013.
Education:
Master's in Business Management from Bedfordshire University, United Kingdom, 2010.
Master of Computer Applications from J.N.T University, Hyderabad, India, 2008.