Excellent Consultant On My Bench - Datastage Developer Resume
[email protected]
Location: Remote, USA
Relocation: Any
Visa: H1B
Hello,

This is Lisa.

Please find below the attached resume of my consultant.


MUKESH
Phone: 216-785-5232
[email protected]
http://linkedin.com/in/mukesh-rathore-2505mukesh



Professional Summary
14+ years of IT experience in ETL using IBM DataStage, data analysis, data warehouse implementation, data migration, data conversion, and SQL.
Expertise in IBM DataStage versions 11.7/9.1/8.7/8.5/8.0.1/7.5.x, using components such as DataStage Designer, Manager, Director, Administrator, and Console; data profiling using IBM Information Analyzer and data cleansing using QualityStage.
Executed software projects for the insurance, banking, retail, and healthcare industries.
Extensive experience in extracting data from various sources, transforming it as required, and loading it into data warehouses, data marts, and databases using DataStage.
Developed DataStage Parallel Extender jobs using stages such as Transformer, Aggregator, Change Capture, Filter, Join, Merge, Lookup, Sort, Pivot Enterprise, Column/Row Generator, Surrogate Key Generator, Remove Duplicates, Funnel, database connectors, Hierarchical Data, MQ Connector, Sequential File, Data Set, Unstructured Data, Peek, Java, XML, Salesforce Connector, and Kafka Connector.
Strong experience in building generic, reusable code components such as shared containers.
Experience working with large-scale applications handling very large data volumes.
Migrated DataStage jobs from versions 7.5.2 and 8.0.1 to 9.1 and then to 11.3.
Migrated mainframe batch-processing jobs to DataStage jobs.
Migrated data from one Salesforce system to another.
Migrated an Oracle database to Snowflake and converted the ETL jobs to use the Snowflake Connector stage instead of the Oracle Connector stage.
Experience using dbt and Airflow.
Troubleshot and helped improve existing ETL jobs following best practices and standards.
Knowledge of data modeling techniques such as star schema and snowflake schema.
Designed and implemented ETL processes for data transactions related to the Enterprise Data Warehouse, Operational Data Store (ODS), and data marts to support Business Intelligence operations.
Strong knowledge of databases such as Oracle (12c), DB2, Greenplum, and Teradata.
Strong knowledge of cloud-based data storage such as Snowflake.
Hands-on experience with complex SQL, PL/SQL, partitioning, indexes, SQL performance tuning, query optimization, explain plans, stored procedures, and database utilities such as FastLoad, MultiLoad, and bulk load.
Experience with UNIX commands, scripting, data ingestion, and big data solutions.
Provided guidance on DataStage administration tasks such as scheduling jobs, troubleshooting job errors, and identifying issues in unusually long-running jobs.
Good knowledge of scheduling tools such as Autosys, Control-M, and ESP CA Workstation.
Used Hyperion and OBIEE tools to create reports.
Expert in unit testing, system testing, integration testing, UAT, and production checkouts.
Solid design, coding, testing, and debugging skills.
Familiar with version management tools such as SVN, GitLab, and GitHub.
Developed ETL jobs using RESTful (GET, PUT) web services; see the sketch after this summary.
Knowledge of REST and SOAP service calls, XML, XSD, JSON, HDFS (Hadoop), DevOps, and Spectrum.
Knowledge of microservices, OpenShift, Initiate MDM SE, Kafka router-streamer, JavaScript, and HTML5.
Managed the Splunk environment, including monitoring and alerting on system performance and availability.
Working experience with CI/CD pipelines: Jenkins (CI) and uDeploy (IBM UrbanCode Deploy, CD).
Designed and implemented a database-migration flow for DDL/DML promotions using Flyway and CI/CD concepts.
Strong interpersonal skills in areas such as teamwork, facilitation, communication, and negotiation.
Exceptional verbal and written communication skills; excellent team player with leadership skills.
Ability to handle and organize multiple projects and deadlines.
Handled customer communication and management reporting.
Experience in the onshore-offshore delivery model, Agile, and DevOps.
Translated requirements and data mapping documents to a technical design.
Hands-on with database client tools such as Oracle, IBM DB2, SQL Developer, TOAD, pgAdmin, and Hive.
Received appreciation at the organization level while dealing with the most demanding customers.
Hands-on with Rally and Jira Agile tools and Azure DevOps (ADO).
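
For illustration of the RESTful GET/PUT pattern noted above, a minimal sketch in Python using the requests library; the endpoint URL, record fields, and IDs are hypothetical stand-ins (the actual jobs made these calls from the DataStage Java Integration stage):

import requests

BASE_URL = "https://api.example.com/v1/members"  # hypothetical endpoint

def fetch_member(member_id):
    """GET one member record; raise on HTTP errors."""
    resp = requests.get(f"{BASE_URL}/{member_id}", timeout=30)
    resp.raise_for_status()
    return resp.json()

def update_member(member_id, record):
    """PUT the updated record back to the service."""
    resp = requests.put(f"{BASE_URL}/{member_id}", json=record, timeout=30)
    resp.raise_for_status()

record = fetch_member("12345")        # hypothetical member id
record["status"] = "ACTIVE"
update_member("12345", record)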

Skills Summary
Operating Systems: UNIX, Linux, and Windows
Languages/Scripting: SQL, XML, XSD, JSON, Initiate API, REST web services, microservices
Databases: Oracle 12c, MS SQL Server, IBM DB2, Teradata
Scheduling: ESP, CA7 Workstation, Control-M
Office Tools: MS Office, MS Access, MS PowerPoint, Visio, SharePoint, Confluence
Apps: Microsoft Teams, WebEx, Skype, Slack
Tools: Salesforce, pgAdmin, Hive, PuTTY, WinSCP, TOAD, SQL Developer, Teradata Studio
Agile: Rally, Jira, Azure DevOps
Version Management Tools: SVN, GitLab, GitHub
Cloud Technologies: Snowflake, Hive
ETL Tools: IBM InfoSphere DataStage and QualityStage Designer, IBM InfoSphere DataStage and QualityStage Director, IBM InfoSphere Information Server Manager, IBM InfoSphere Information Server Administrator
Code Build/Deployment: Jenkins (build tool) to create and upload packages or bundles; uDeploy (deployment tool) to deploy the packages or bundles

Educational Qualification
Master of Science, UMKC (2010-2012), GPA: 3.52.
IBM DataStage certified.

Training and workshops attended
Attended Agile Workshop (Agile 101 and Agile story writing).

Professional Experience

Client Name : Express Scripts-EVERNORTH, CT (Jun 2022 to Current)
Role : Senior DataStage Development Contract Engineer.
Project's Description:
This project loads data from 24 sources. The ETL reads the source data and converts incoming changes (delta loads) into XML messages that are consumed by a Kafka queue and landed in HDFS (Hadoop), while coverage information is written directly to the ODS (Oracle DB).
Roles and Responsibilities:
In an FSE role on a fully Agile team, responsible for gathering requirements from the PO, development, testing in all environments, and deployments/checkouts through production.
Extensive experience in all phases of the SDLC, i.e., Agile methodology and the Waterfall model.
Used ETL job control, Jenkins, and uDeploy.
Generated Reports from DB2 Databases.
Developed the logical and physical ETL design documents from source to target mapping document.
Developed ETL jobs that make GET and PUT calls to RESTful APIs using the Java Integration stage.
Designed and developed jobs using IBM InfoSphere DataStage 11.x, 9.x.
Developed DataStage jobs with required stages such as DB2 Connector, MQ, XML, Aggregator, Filter, Funnel, Join, Lookup, Merge, Remove Duplicates, Sort, Change Capture, Hierarchical Data, file stages, connector stages, and Transformer.
Developed processes to extract source data from different systems and to cleanse, transform, integrate, and load the extracts.
Developed DataStage jobs using the Change Capture stage to compare data changes between source tables and data mart tables.
Tuned the performance of parallel jobs, working closely with DBAs and data modelers.
Used complex queries for data extraction from source and staging.
Responsible for Unit testing, System testing, Integration testing, UAT and Performance Testing.
Used shared containers for code reuse and for implementing complex business logic, which increased productivity.
Extracted data from a DB2 database and loaded it into an Oracle DB.
Provided assistance for optimization and performance tuning of DB2 SQL & Stored Procedures.
Proficient in using JCL and CA-7 for invoking DataStage jobs.
Worked on job sequences to control the execution of the job flow using activities and triggers such as Job Activity, Wait For File, Email Notification, Sequencer, Exception Handler, and Execute Command.
Involved in fixes for bugs identified during production.
Created UNIX scripts, PL/SQL, and stored procedures in support of ETLs.
Created JSON files using the DataStage Hierarchical Data stage and loaded data into HDFS (Hadoop) via a Kafka topic, as shown in the sketch after this list.
Migrated old v9 IBM InfoSphere DataStage jobs to v11.7.x and provided post-deployment support.
Developed ETL jobs using bulk load and Teradata FastLoad.
Followed set standards and best practices in ETL development and refactored existing jobs.
Used Hyperion and OBIEE tools to create reports off the ODS and SML databases.
Performed analytics on the ODS and data clean-up activities to determine the fallout of a particular SR ticket.
Monitored server capacity during PVS testing using TeamQuest.
Involved in the preparation of ESP/CA7 scripts to schedule the jobs and ETL job controls.
Played a significant tech lead role in various phases of the project life cycle, such as requirements definition, functional and technical design, testing, production support, and implementation.
Attended daily Scrum meetings, demos, and sprint planning meetings.
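
For illustration of the JSON-to-Kafka flow referenced in the list above, a minimal sketch in Python using the confluent-kafka client; the broker address, topic name, and record fields are hypothetical, and the project itself implemented this inside DataStage with the Hierarchical Data and Kafka Connector stages:

import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "broker1:9092"})  # hypothetical broker

def delivery_report(err, msg):
    # Log delivery failures rather than dropping them silently.
    if err is not None:
        print(f"Delivery failed for key {msg.key()}: {err}")

def publish_delta(record):
    # Serialize one delta row to JSON and queue it for the topic.
    producer.produce(
        topic="member-deltas",            # hypothetical topic name
        key=str(record["member_id"]),
        value=json.dumps(record),
        callback=delivery_report,
    )

publish_delta({"member_id": 1, "coverage": "DENTAL", "op": "UPDATE"})
producer.flush()  # block until all queued messages are delivered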

Environment: IBM InfoSphere DataStage (versions 11.3.x, 9.x), Initiate MDM SE (11.6), IBM DB2 13 z/OS, MQ Series 7.0.1, Oracle 12c, Teradata, Kafka, Salesforce, Hadoop, AWS, RESTful web services, OBIEE reports, Pitney Bowes Spectrum, UNIX/Linux.

Client Name : EVERNORTH, Windsor, Connecticut (Dec 2020 to Jun 2022)
Role : Sr. DataStage Developer.
Project's Description:
Evernorth Behavioral Health, Inc. is a subsidiary of Cigna that provides behavioral health services for individuals, couples, and families. DataStage ETL is used to perform a multitude of data-loading tasks, such as reading source data from a file and loading it into a data mart.
Roles and Responsibilities:
In an FSE role on a fully Agile team, responsible for gathering requirements from the PO, development, testing in all environments, and deployments/checkouts through production.
Used ETL job control, Jenkins, and uDeploy.
Developed DataStage jobs with required stages such as MQ, XML, Aggregator, Filter, Funnel, Join, Lookup, Merge, Remove Duplicates, Sort, Change Capture, Hierarchical Data, file stages, connector stages, and Transformer.
Migrated an Oracle database to Snowflake and converted the ETL jobs to use the Snowflake Connector stage instead of the Oracle Connector stage (see the first sketch after this list).
Worked on migration of Salesforce data from one system to another.
Tuned the performance of parallel jobs, working closely with DBAs and data modelers.
Used complex queries for data extraction from source and staging.
Used Airflow for job runs and scheduling via DAGs (see the DAG sketch after this list).
Experience working with dbt (Snowflake).
Developed ETL jobs using bulk load and Teradata FastLoad.
Involved in fixes for bugs identified during production.
Created UNIX scripts, PL/SQL, and stored procedures in support of ETLs.
Attended daily Scrum meetings, demos, and sprint planning meetings.
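
For illustration of the Snowflake conversion noted above, a minimal sketch in Python using the snowflake-connector-python package; the account, credentials, stage, and table names are hypothetical, and the jobs themselves used the DataStage Snowflake Connector stage rather than Python:

import snowflake.connector

# Hypothetical connection values; the real jobs supplied these via
# DataStage connector properties, not a Python script.
conn = snowflake.connector.connect(
    account="myorg-myaccount",
    user="ETL_USER",
    password="***",
    warehouse="ETL_WH",
    database="EDW",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # Bulk-load files already uploaded to a named stage into the target table.
    cur.execute(
        "COPY INTO STAGING.MEMBER_COVERAGE "
        "FROM @ETL_STAGE/member_coverage/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    print(cur.fetchall())  # COPY INTO returns one status row per loaded file
finally:
    conn.close()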
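
And for illustration of the Airflow scheduling noted above, a minimal DAG sketch assuming Airflow 2.x; the DAG id, schedule, project, and job names are hypothetical stand-ins for the actual jobs:

from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="nightly_etl_load",            # hypothetical DAG id
    start_date=datetime(2021, 1, 1),
    schedule_interval="0 2 * * *",        # run nightly at 02:00
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="run_extract_job",
        bash_command="dsjob -run -wait PROJ extract_job",  # hypothetical dsjob call
    )
    load = BashOperator(
        task_id="run_load_job",
        bash_command="dsjob -run -wait PROJ load_job",
    )
    extract >> load  # load runs only after extract succeeds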

Environment: IBM InfoSphere DataStage (version 11.3.x), MQ Series 7.0.1, Oracle 12c, Salesforce, Hadoop, RESTful web services, Control-M, Snowflake, Pitney Bowes Spectrum, UNIX/Linux.

Client Name : CIGNA, Windsor, Connecticut (Oct 2015 to Dec 2020)
Role : Sr. DataStage Developer.
Project's Description:
Cigna is a global health service company that offers health, life and accident, and dental insurance. DataStage ETL is used to perform a multitude of data-loading tasks, such as reading source data from a file delivered via C:D (Connect:Direct), parsing the file, and delivering an XML message to a Kafka queue.
Roles and Responsibilities:
Developed the logical and physical ETL design documents from source to target mapping document.
Developed ETL jobs that make GET and PUT calls to RESTful APIs using the Java Integration stage.
Designed and developed jobs using IBM InfoSphere DataStage 11.x, 9.x.
Developed processes to extract source data from different systems and to cleanse, transform, integrate, and load the extracts.
Developed DataStage jobs using the Change Capture stage to compare data changes between source tables and data mart tables.
Responsible for unit testing, system testing, integration testing, UAT, and performance testing.
Used shared containers for code reuse and for implementing complex business logic, which increased productivity.
Worked on job sequences to control the execution of the job flow using activities and triggers such as Job Activity, Wait For File, Email Notification, Sequencer, Exception Handler, and Execute Command.
Involved in fixes for bugs identified during production.
Created UNIX scripts in support of ETLs.
Created JSON files using the DataStage Hierarchical Data stage and loaded data into HDFS (Hadoop) via a Kafka topic.
Migrated old v9 IBM InfoSphere DataStage jobs to v11.7.x and provided post-deployment support.
Followed set standards and best practices in ETL development and refactored existing jobs.
Used Hyperion and OBIEE tools to create reports off the ODS and SML databases.
Performed analytics on the ODS and data clean-up activities to determine the fallout of a particular SR ticket.
Monitored server capacity during PVS testing using TeamQuest.
Played a significant tech lead role in various phases of the project life cycle, such as requirements definition, functional and technical design, testing, production support, and implementation.
Attended daily Scrum meetings, demos, and sprint planning meetings.

Environment: IBM InfoSphere DataStage (versions 11.3.x, 9.x), Initiate MDM SE (11.6), MQ Series 7.0.1, Oracle 12c, Teradata, Kafka, Salesforce, Hadoop, AWS, RESTful web services, OBIEE reports, Control-M, Pitney Bowes Spectrum, UNIX/Linux.

Client Name : UNFI, Providence, Rhode Island (Apr 2015 to Oct 2015)
Role : Sr. ETL/DataStage Supervisor.
Project's Description: UNFI is the leading independent national distributor of natural, organic, and specialty foods in the United States. The objective of the EIW Inventory Optimization Forecast project is to bring inventory data from the IO Forecast and business systems into the Enterprise Information Warehouse (EIW) and make the data accessible for reporting via One Source. Data is extracted via FTP, processed in IBM DataStage, and loaded into Teradata for reporting.
Roles and Responsibilities:
Involved in analyzing business requirements and design documents.
Attended Scrum meetings to discuss what was done since the previous meeting, what would be done before the next one, and any blockers.
Performed Column Analysis, Primary Key Analysis, Foreign Key Analysis and Cross Domain Analysis using IBM Information Analyzer.
Developed DataStage jobs using the Change Capture stage to compare data changes between source tables and data mart tables.
Involved in development of DataStage jobs with required stages such as Aggregator, Filter, Funnel, Join, Lookup, Remove Duplicates, Surrogate Key Generator, Sort, and Transformer.
Developed data cleansing procedures using QualityStage to standardize names, addresses, and areas.
Created pattern and data overrides to override data values not handled by the rule sets.
Created PL/SQL procedures/Packages to implement complex business logic.
Involved in fixes for bugs identified during production runs.
Worked on conversion of all old jobs from SAS to DataStage version 11.3.
Environment: IBM InfoSphere DataStage and QualityStage (version 11.3), IBM Information Analyzer (v11.3), SQL Server, Teradata, UNIX/Linux, flat files.

Client Name : CIGNA, Bloomfield, Connecticut (Apr 2013 to Apr 2015)
Role : ETL/DataStage.
Project's Description: Cigna is a global health service company that offers health, life and accident, and dental insurance. This project was responsible for defining and implementing a new data movement strategy that de-normalizes the One View (Medical, Pharmacy, and Dental) schema into a more business-friendly reporting schema.
Roles and Responsibilities:
Involved in analyzing business requirements and design documents.
Designed and developed jobs using DataStage for loading the data into Dimension and Fact Tables.
Worked in an onsite-offshore model and explained the mapping and business rules to the developers.
Involved in development of DataStage jobs with required stages such as Aggregator, Filter, Funnel, Join, Lookup, Merge, Remove Duplicates, Sort, and Transformer.
Developed processes to extract source data from different systems and to cleanse, transform, integrate, and load the extracts.
Used complex queries for data extraction from source and staging.
Involved in fixes for bugs identified during production within the existing functional requirements.
Worked on migrating all the old jobs from version 7.5.2 to 9.1.2, plus new development on v9.
Involved in preparation of ESP/CA7 scripts to schedule the jobs and ETL Job Controls.
Environment: IBM DataStage (9.1.x, 7.5.x), Oracle 10g, SQL Server, DB2, Teradata, UNIX/Linux.

Client Name : First Object, Inc., Irving, Texas (Jun 2012 to Apr 2013)
Role : Software Engineer.
Project's Description:
Assigned as an ETL (DataStage) consultant to design and develop the data model and ETL to support Business Intelligence applications. Developed ETL processes to extract data from source systems, transform it per business requirements, and load the data into dimensional data models to support BI reports.
Roles and Responsibilities
Understanding the existing business model and translating business requirements into ETL jobs.
Implemented Extract, Transformation, and Load functionality for Enterprise Data Warehouse and Data Marts using DataStage designer.
Extensively used the Sequential File, Data Set, Filter, Funnel, Join, Lookup, Copy, and Aggregator stages during ETL development.
Worked with DataStage Director to run jobs, monitor them, and check log files.
Environment: DataStage 7.0, Oracle, UNIX/Linux and MS Windows.

Client Name : DRTC Ltd, AP (May 2009 to Aug 2010)
Role : Jr. Programmer/Intern.
Roles and Responsibilities
Understanding project requirements and translating them into specifications and programming deliverables.
Testing and debugging the product in controlled, real situations.
Maintaining the systems and updating as per requirements.
Data extraction, transformation, and loading from different source systems.

******** PROFESSIONAL REFERENCES CAN BE PROVIDED UPON REQUEST ********






Thanks & Regards,
Lisa,
Bench Sales Recruiter
Stier Solutions Inc


T. (610) 735 8665 | E. [email protected]