Durgesh is looking for a Test Data Management, Service Virtualization and ETL/QA Testing role (Remote, USA)
Email: [email protected]
Hi,

Hope you are doing well. Please find the attached resume for your reference and let me know if this profile works for you. Visa: H1B. Do let me know if any suitable opening is available. You can reach me at (860) 498-2498 or [email protected].

Durgesh K
[email protected] | (860) 498-2498

15+ years of IT industry experience with expertise in the Continuous Testing concepts of Test Data Management (TDM), Service Virtualization and ETL/QA Testing. A highly motivated, result-oriented professional experienced in delivering high-quality products that meet customer needs. A delivery-focused person with strong attention to detail.

Key Skills:
- Good understanding and well-versed hands-on experience in Test Data Management concepts, with in-depth knowledge of TDM implementations.
- Domain expert with sound knowledge in Life Insurance (New Business & Disability Income), Healthcare (Member & Provider) and Banking (Mortgage Servicing).
- Well versed in TDM assessment, TDM strategy and TDM planning.
- Hands-on experience with the continuous testing tools Broadcom TDM and Delphix. Partnered with the Delphix vendor to coordinate and install Delphix in the client environment. Developed a detailed roadmap and strategy to meet the client's TDM goals with respect to Delphix. Conducted scrum meetings with all stakeholders to drive and accomplish tasks related to the Delphix tool implementation.
- Experienced in synthetic data generation, data profiling, data subsetting, data masking, in-place and in-flight masking, automation of TDM activities using the Javelin automation tool, TDoD, and self-service form creation with integration of ARD and the Broadcom TDM portal.
- Well versed in the Broadcom TDM suite, with good hands-on experience in GT Datamaker, Fast Data Masker, GT Subset, the Broadcom TDM Portal, the Javelin automation tool and ARD.
- Good hands-on experience in data profiling, data sampling, creating transformation maps, and generating synthetic data for both relational databases and non-relational formats (CSV, XML, JSON) using the Broadcom TDM portal and GT Datamaker.
- Good hands-on experience creating data subset scripts with GT Subset and generating export/delete/import batch files along with the in-flight masking technique.
- Good hands-on experience masking data in relational databases and flat files (CSV, JSON, XML) using both Fast Data Masker and the Broadcom TDM Portal.
- Good hands-on experience working with MS SQL Server, Oracle DB and DB2.
- Good knowledge of developing TDM automation visual workflows using the Javelin automation tool and integrating them with the Broadcom TDM Portal.
- Good knowledge and hands-on experience with TDM features such as synthetic data generation rules, subset scripts, masking scripts using FDM, transformation maps using GT Datamaker, and integration of TDM activities via Javelin.
- Excellent proficiency with JIRA and Confluence, following agile methodology to help improve software quality.
- Good knowledge of integrating GT Datamaker, GT Subset, FDM, the Broadcom TDM portal and the Javelin automation tool.
- Served in the roles of Informatica TDM Architect, Data Analyst, ETL Test Analyst, Quality Assurance Analyst and Production Support.
- Led a POC with Informatica Test Data Management to deliver data masking and data generation solutions.
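The flat-file masking described above can be sketched outside any vendor tool. A minimal Python example (the column names, salt and file layout are illustrative assumptions, not from an actual project) applying deterministic hash-based masking to PII columns of a CSV:

```python
import csv
import hashlib
import io

# Columns treated as PII in this hypothetical file layout.
PII_COLUMNS = {"ssn", "email"}
SALT = "demo-salt"  # fixed salt => deterministic, referentially consistent masks


def mask_value(value: str) -> str:
    """Replace a sensitive value with a salted hash; same input -> same mask."""
    digest = hashlib.sha256((SALT + value).encode()).hexdigest()
    return digest[:12]


def mask_csv(src, dst) -> None:
    """Copy a CSV, masking only the configured PII columns."""
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        for col in PII_COLUMNS & set(row):
            row[col] = mask_value(row[col])
        writer.writerow(row)


raw = "id,ssn,email\n1,123-45-6789,[email protected]\n"
out = io.StringIO()
mask_csv(io.StringIO(raw), out)
print(out.getvalue())
```

Because the masking is deterministic, the same source value always maps to the same masked value, which preserves referential consistency across files — similar in spirit to the consistent-masking options in dedicated masking tools.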
- Extensively worked on HL7 and EDI files (837, 834, 835, 270, 820, 271, 275).
- ETL and data integration experience developing ETL mappings and scripts using Informatica Test Data Management (TDM), Informatica PowerCenter 10.x/9.x/8.x/7.x/6.x and IDQ.
- Proficient in ETL Testing, Functional Testing, Automation Testing, Browser Compatibility Testing, SOAP UI/SOA Testing, Database Testing, Volume Testing, Stress Testing, Load Testing, Regression Testing, Integration Testing, User Acceptance Testing and Defect Management.
- Extensive experience writing SQL queries, stored procedures, functions, packages, triggers, exception handlers and cursors using MS SQL Server Management Studio, PL/SQL Developer, SQL*Plus and TOAD.
- Experience with the Autosys, Control-M and CA7 scheduling tools.
- Proficient Test Manager with extensive experience defining test strategy for large-scale integration/E2E test efforts and leading testing through all STLC phases, including test planning, test design, test development, test execution, test scheduling, test coordination, test data management, defect triage and metrics reporting.
- Experience defining automation strategy, automation frameworks, automation test optimization, automation test estimations and roadmaps, including automation script design, development, maintenance and execution.
- Experience in automation scope identification, assessment and cost-benefit projection.
- Good knowledge of DevOps and Big Data ecosystems: Hadoop, MapReduce, HDFS, Hive and Pig.
- Experience developing and maintaining automated regression suites based on discussions with BAs and SMEs.
- Experience maintaining scripts in version control systems such as Bitbucket, GitHub and SourceTree.
- Experience configuring and integrating automation scripts with continuous integration (CI/CD) tools such as Jenkins for continuous integration and testing.
- Served in the roles of Data Analyst, ETL Test Analyst, Quality Assurance Analyst, Developer and Production Support.
- As a Subject Matter Expert (SME), able to review requirements, designs, test strategies, test plans, test scenarios, test data and scripts.
- Expertise in coordinating and delivering large testing projects involving multiple scrum teams spread across geographies in an onsite/offshore delivery model.
- Proficient in maintaining the QA team's capacity planning, utilization and assignment allocations; adept at identifying overall QA milestones, dependencies, issues, risks and assumptions.
- Excellent documentation skills, with good experience preparing master test plans, detailed test execution plans, test scripts, RTMs, impact assessments, defect tracking reports and end-of-test reports using Microsoft Office.
- Experience in QA stakeholder management, coordinating business analysts, architects, IT and business project managers, development teams and UAT teams across projects.
- Experience providing best-in-class, optimized test solutions to meet aggressive project timelines.

Technical Skills
Databases: Oracle 11g/10g, IBM DB2, MS SQL Server, MongoDB, Big Data
Operating Systems: Unix, Windows NT/2000/XP/7, OS/400, Mainframe
ETL/TDM Tools: Broadcom TDM, Delphix, IBM Optim, Informatica TDM, Informatica PowerCenter 10.x; DataStage 8.5, 9.1, 11.5 & 11.7; HDFS, MapReduce, Hive, Pig
Automation Tools: Selenium, Python, Java, Cucumber, JUnit, TestNG, Maven, Bitbucket, Jenkins
Languages/Technologies: Python, Shell scripting, SQL, PL/SQL, Unix, RPG400, COBOL, CL400
Test Management Tools: HP Application Lifecycle Management, Jira
Web Service Testing Tool: SOAP UI
Data Warehouses: Snowflake, Teradata, Oracle

Professional Experience

PricewaterhouseCoopers (PwC) | Apr 2024 - till date
TDM Lead/Architect
Key Accomplishments:
- Requirement gathering of test data needs and analysis of technical requirements; project assessment, planning and scheduling.
- Test data facilitation using tools such as CA TDM, FDM, Agile Designer, Javelin and PowerShell/batch scripting.
- Creation of generators, rule setups, configurations and pre- and post-actions for synthetic data generation needs.
- Data scanning, profiling and masking of databases for sensitive fields, masking PII fields based on project requirements using seed lists; this also involves identification of sensitive fields and masking previews.
- Creation and modification of custom classifiers and seed lists.
- Creation of a self-service UI catalog using ARD (Agile Designer) for application users to trigger masking/data generation jobs, including passing parameters from the UI.
- Creation of Javelin workflows to incorporate actions required before and after triggering masking/data generation jobs.
- Adding batch processes to trigger masking/data generation jobs in the DevOps CI/CD pipeline through batch/PowerShell scripts.
- Synthetic data generation in databases, CSV/flat files and JSON formats for upload into MongoDB.
- Managing risk tracking, estimation and monitoring of data requirements.
- Coordinating within the team and with multiple application teams and stakeholders to provide the best-quality test data.
- Following agile methodology and continuously updating the Azure DevOps portal for use in scrum.
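The JSON-format synthetic data generation for MongoDB mentioned above can be illustrated with a small Python sketch (the record schema, field names and value ranges are hypothetical, chosen only to show the shape of the output):

```python
import json
import random
import string

random.seed(7)  # fixed seed so test runs are reproducible


def synthetic_member(member_id: int) -> dict:
    """Generate one synthetic record; field names are illustrative only."""
    return {
        "memberId": f"M{member_id:06d}",
        "firstName": "".join(random.choices(string.ascii_lowercase, k=6)).title(),
        "planCode": random.choice(["GOLD", "SILVER", "BRONZE"]),
        "premium": round(random.uniform(100.0, 900.0), 2),
    }


# Emit newline-delimited JSON, one document per line.
records = [synthetic_member(i) for i in range(1, 4)]
ndjson = "\n".join(json.dumps(r) for r in records)
print(ndjson)
```

A file in this newline-delimited form can then be loaded into a collection with `mongoimport`, keeping real production data out of test environments entirely.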
Management activities:
- SOP creation and documentation of requirement definitions
- Monitoring and debugging of issues and logs
- Coordination with the client and their stakeholders
- Overall risk tracking and monitoring of projects and releases
- Coordinating with other PwC departments: UNIX, DBA and networking teams

KeyBank - Cleveland, OH | Jan 2022 - Mar 2024
TDM Lead/Architect
Key Accomplishments:
KeyBank, the primary subsidiary of KeyCorp, is an American regional bank headquartered in Cleveland, Ohio, and is the only major bank based in Cleveland. KeyBank is one of the largest banks in the United States; its customer base spans retail, small business, corporate, commercial and investment clients.
- Created self-service tiles for data mining activities: registering the necessary database tables, creating required variables, creating data mining rules in generators, and creating the necessary actions (SQL, workflows) and configurations.
- Created self-service forms with ARD by integrating the generators developed, and published the forms to the CA TDM portal.
- Well versed in the concepts of synthetic data generation, data masking, data subsetting and creating visual workflows using Javelin.
- Well versed in synthetic data generation rules, with a good understanding of the tool and its end-to-end integration.
- Good proficiency in the Broadcom CA TDM tool, with in-depth knowledge of how to use it to create, provision and manage test data.
- Created synthetic data for functional test teams and development teams by understanding the functional requirements and delivering data accordingly.
- Good problem-solving skills, always prepared to troubleshoot and resolve issues related to test data provisioning and management.
- Provided training and mentoring to team members and clients to enhance their TDM skills on Broadcom CA TDM.
- Maintain good interactions with the vendor for support, updates and issue resolution.
- A good team player who understands requirements through active conversations with clients and delivers the best possible TDM methods, helping the team remain GDPR compliant.
- Stay current with industry-standard TDM tools and continuously seek ways to improve TDM processes and efficiency.
- Developed a detailed roadmap and strategy to meet the client's TDM goals with respect to Delphix.
- Partnered with the Delphix vendor to coordinate and install Delphix in the client environment.
- Virtualized data from PROD into vDBs and created a dSource for the POC.
- Created multiple rule sets for database-level masking for the Delphix masking POC.

Truist Bank, VA, USA | Aug 2020 - Dec 2021
TDM Lead
Key Accomplishments:
- Served as the single point of contact for all test data needs/requirements for a large program.
- Designed a data masking proof-of-concept framework for several application teams using Broadcom TDM and Delphix.
- Transformed existing configurations from Delphix to CA TDM.
- Leveraged dataset vDBs and created shared branches for the data created.
- Applied both in-place and on-the-fly masking methods using Delphix, including during capture of data from vDBs.
- Implemented replication against PROD vDBs and delivered masked data into non-PROD environments.
- Performed data subsetting from production and pre-production environments; discovered and identified data relationships.
- Set up a gold copy in a vDB as the TDM repository to minimize the impact on PROD.
- Leveraged BigID and integrated it with Delphix for data governance.
- Integrated the Delphix tool with the CI/CD pipeline.
- Created rich, versatile synthetic data using CA TDM for databases such as Oracle and SQL Server.
- Created synthetic data and published it to SQL, XML, JSON, CSV, Excel and numerous other formats using CA TDM.
- Masked PII production data using Fast Data Masker.
MassMutual, MA, USA | Jul 2016 - Jul 2020
TDM Lead / Test Manager
Key Accomplishments:
- Served as the single point of contact for all test data needs/requirements for a large program.
- Designed a data masking proof-of-concept framework for several application teams using Broadcom TDM.
- Created self-service tiles for data mining activities: registering the necessary database tables, creating required variables, creating data mining rules in generators, and creating the necessary actions (SQL, workflows) and configurations.
- Created self-service forms with ARD by integrating the generators developed, and published the forms to the CA TDM portal.
- Well versed in the concepts of synthetic data generation, data masking, data subsetting and creating visual workflows using Javelin.
- Well versed in synthetic data generation rules, with a good understanding of the tool and its end-to-end integration.
- Good proficiency in the Broadcom CA TDM tool, with in-depth knowledge of how to use it to create, provision and manage test data.
- Created synthetic data for functional test teams and development teams by understanding the functional requirements and delivering data accordingly.
- Provided training and mentoring to team members and clients to enhance their TDM skills on Broadcom CA TDM.

Aetna, CT, USA | May 2014 - Jun 2016
TDM Analyst
Key Accomplishments:
- Managed new TDM engagements through RFQs and estimates.
- Accountable for preparing the master test data strategy and master test data delivery plan for applicable large-scale projects.
- Created synthetic data for flat files such as XML, JSON and EDI files used for API testing.
- Reviewed test data requests developed by team members.
- Generated a test data summary report at the end of each testing phase and circulated it to stakeholders.
- Produced weekly and daily status reports; reported defects in Jira and followed them until closed.
- Performed data validation, data profiling and data frequency analysis.
- Performed data mining and data masking using the Informatica TDM tool based on requirements.
- Held discussions with all relevant teams and key POCs to understand and agree on scope and milestones.
- Used the Informatica PowerCenter and IDQ tools, working mainly on mapplets and masking objects; implemented masking rules per client requirements.
- Troubleshot database performance issues and implemented necessary database changes.
- Coordinated between offshore and onsite teams on test data obfuscation and test data delivery management.

CareFirst, MD, USA | Jan 2013 - Apr 2014
ETL Lead / Data Analyst
Key Accomplishments:
- Analyzed data flows and data models pertaining to Personal Health Information (PHI) within non-production environments.
- Gathered inputs from stakeholders for in-scope applications/systems and reviewed data compliance/IT security and PHI/PII data elements.
- Prepared a customized data masking solution strategy using the Informatica TDM tool.
- Prepared specific masking algorithms and seed lists.
- Worked on TDM tool installation support and configuration, including the data subset, persistent masking, discovery option and data validation option components.
- Experience with the DVO tool for testing masked and non-masked data.
- Prepared masking scripts for tables with PHI data in FACETS and non-FACETS databases.
- Developed and implemented the test plan and test strategy for data masking.
- Reported quality metrics to the project team and stakeholders.
- Prepared masking process documentation for FACETS and non-FACETS DB schemas, including detailed designs, run books, user training materials and deployment plans.

TIAA-CREF LI, NC, USA | Jan 2011 - Dec 2012
ETL Lead / Data Analyst
Key Accomplishments:
- Validated ETL DW load statistics and data mart loads on an ongoing basis.
- Validated various ETL, ODS, DataStage and report batch jobs scheduled using Autosys.
- Provided critical batch reports to the customer.
- Validated that source-to-target ETL mappings conform to transformation rules and business rules.
- Validated DataStage jobs by running them in DataStage Director; checked DataStage ETL jobs for data quality and record counts.
- Validated data moving from multiple source systems to the ODS and EDW.
- Prepared complex SQL scripts for functional and data validation of ETL applications.
- Validated record counts between source and target for passive and active ETL transformations.

Staples Inc., USA | Apr 2008 - Jan 2011
Software Developer
Key Accomplishments:
- Developed the TSD (Technical Specification Document) from functional requirements and the Business Requirement Document.
- Responsible for development and maintenance of critical applications.
- Created DFD and ER diagrams; validated both the databases and the front-end applications.
- Prepared implementation and release plans.
- Checked data flow from front end to back end using SQL queries to extract data from the database.
- Performed database back-end testing with field-level validation in SQL/400.

Auto Loan Origination, HSBC USA | Nov 2005 - Mar 2008
Software Developer
Key Accomplishments:
- Developed the TSD (Technical Specification Document) from functional requirements and the Business Requirement Document.
- Engaged in problem analysis and resolution along with coding, review and testing.
- Involved in unit testing and integration testing; responsible for final delivery to the customer.
- Coached and mentored team members; involved in resource utilization, resource rotation and people management.
- Involved in preparation of the implementation plan.
- Reported and prioritized defects using Quality Center and presented reports in weekly team meetings.
- Worked closely with technical operations and release management teams to resolve production issues.
- Responsible for development and maintenance of critical applications.
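The source-to-target record count check described above can be sketched as a small, self-contained Python example (sqlite3 stands in for the actual source system and EDW; the table name and data are hypothetical):

```python
import sqlite3

# Stand-in databases; in practice these would be the source system and the EDW.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
tgt.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.0), (3, 30.0)])
tgt.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.0), (3, 30.0)])


def row_count(conn: sqlite3.Connection, table: str) -> int:
    """Return the row count of a table (table name is trusted in this demo)."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]


def validate_counts(table: str) -> bool:
    """A passive transformation should move every source row to the target."""
    s, t = row_count(src, table), row_count(tgt, table)
    print(f"{table}: source={s} target={t} -> {'PASS' if s == t else 'FAIL'}")
    return s == t


validate_counts("orders")
```

For active transformations (filters, aggregations), the expected target count would instead be derived from the transformation rule rather than compared for strict equality.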
Environment: AS400, DB2, Quality Center, SQL/400, PL/SQL, TOAD, Autosys, UNIX, Oracle 9.0, Java, RPGILE, COBOL, SYNON
[email protected] View all |
Fri Nov 15 00:08:00 UTC 2024 |