
Suman Meriga
Senior Data Engineer / ETL Developer
Email: [email protected]
Phone: +1-(312)-488-7328
LinkedIn: www.linkedin.com/in/suman-meriga-474a48190
Location: Malvern, Pennsylvania, USA
Relocation: Yes
Visa: H1B

Professional Summary
11+ years of experience in Information Technology with a strong background in system analysis, design, development, Data Warehousing, and Business Intelligence ETL projects, covering requirement gathering, analysis, design, data modeling, data architecture, development, integration, implementation, maintenance, testing, and production support of applications.
Extensive experience designing, developing, documenting, and testing ETL parallel jobs and mappings using DataStage, Snowflake, DBT, and Informatica Intelligent Cloud Services (IICS) to populate tables in data warehouses, data marts, and cloud S3 buckets.
Strong skills in IBM DataStage 11.7/11.5/9.1, Snowflake, DBT, Informatica Intelligent Cloud Services (IICS), SQL programming, IBM DB2, Netezza performance tuning, and shell scripting.
7+ years of experience with Control-M for ETL scheduling: creating Control-M processes and monitoring jobs for upstream and downstream processes.
Used Snowflake zero-copy cloning to maintain clones of tables without duplicating storage.
Experience building Snowpipe.
Experience using Snowflake cloning and Time Travel.
Built ETL pipelines into and out of the data warehouse using a combination of Python and Snowflake's SnowSQL; wrote SQL queries against Snowflake.
Experience with Snowflake multi-cluster warehouses.
Created Snowflake Tasks to schedule and automate jobs.
Hands-on experience bulk loading and unloading data to and from Snowflake tables using the COPY command (see the sketch after this list).
Proficient in understanding business processes/requirements and translating them into technical requirements.
Good exposure to Snowflake cloud architecture, SnowSQL, and Snowpipe for continuous data loading.
Experience designing and developing jobs and scripts to load data into and extract data from Hive.
Knowledge and experience in the design, development, and deployment of Big Data projects using Hadoop; coded complex SQL (ELT scripts) to load data into data warehouse foundation and aggregate tables.
Experience creating detailed technical design artifacts: data flow diagrams (DFDs), program architecture designs, and data model designs.
Used various partitioning techniques such as Hash, Same, Entire, and Modulus to improve job performance, and created DataStage job sequences.
Used DataStage Director to monitor jobs and Control-M to schedule and run them.
Good knowledge of data warehouse concepts such as dimensions, facts, star schema, and snowflake schema.
Good knowledge of IICS Data Integration and Monitor for implementing mappings, mapping configuration tasks (MCTs), and taskflows.
Served on the Code Review Board (CRB), reviewing code to ensure target standards were met and deliverables were of high quality.
Provided post-production implementation support and worked with the business on program enhancements.
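
To illustrate the Snowflake bulk load/unload, Task scheduling, cloning, and Time Travel items above, the sketch below shows the general pattern. The database, schema, stage, table, and warehouse names (my_db, staging, stg_s3, customer_stg, my_wh) are hypothetical placeholders, not objects from any client project.

-- Hypothetical objects throughout: my_db, my_wh, stg_s3, customer_stg.
-- Bulk load CSV files from an external S3 stage into a staging table.
COPY INTO my_db.staging.customer_stg
  FROM @my_db.staging.stg_s3/customer/
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"')
  ON_ERROR = 'ABORT_STATEMENT';

-- Unload the same table back to the stage as compressed CSV files.
COPY INTO @my_db.staging.stg_s3/exports/customer_
  FROM my_db.staging.customer_stg
  FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
  OVERWRITE = TRUE;

-- Automate the load with a Task on a cron schedule, then enable it.
CREATE OR REPLACE TASK my_db.staging.load_customer_stg
  WAREHOUSE = my_wh
  SCHEDULE = 'USING CRON 0 2 * * * UTC'
AS
  COPY INTO my_db.staging.customer_stg
    FROM @my_db.staging.stg_s3/customer/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
ALTER TASK my_db.staging.load_customer_stg RESUME;

-- Zero-copy clone (no extra storage until data diverges) and a Time Travel read.
CREATE TABLE my_db.staging.customer_stg_clone CLONE my_db.staging.customer_stg;
SELECT * FROM my_db.staging.customer_stg AT (OFFSET => -3600);

The Time Travel query reads the table as of one hour earlier; the available retention window depends on the account edition and table settings.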

Technical Skills
ETL Tools: IBM InfoSphere DataStage 11.7/9.1, Informatica Intelligent Cloud Services (IICS)
Big Data: HDFS, Hive
Databases: Netezza, Oracle, MS SQL Server, Teradata, DB2, Redshift, Snowflake
Scheduling Tools: Control-M, Stonebranch
Operating Systems: Windows NT/2002, Windows Server 2003/2008/2012, Windows 98/Vista/XP/2003/7/10.
Cloud technologies: Snowflake, DBT, IICS, AWS S3

Education & Certifications
Completed Master of Computer Applications from Sri Venkateswara University in March 2011.

Certifications
IBM DataStage 8.7
SnowPro Core Certification

Work Experience:
Client: NYL (New York Life)    April 2022 – April 2024
Role: Data Engineer
Project: EDM
New York Life Insurance Company (NYL) is the third-largest life insurance company in the United States, the largest mutual life insurance company in the country, and a Fortune 500 corporation. This project is primarily aimed at preparing the Enterprise Data Model to generate reports for senior management.

Analyzed and understood business requirements and converted them into technical designs.
Involved in all phases of the SDLC; created detailed analysis and design documents with source-to-target mappings.
Gathered requirements from business teams.
Performed root cause analysis on issues and reported the results back to the infrastructure lead.
Performed SQL analysis on source and target tables relating to ETL mapping issues as required.
Applied a strong background in database design and ETL tools.
Worked on SnowSQL and Snowpipe.
Created Snowpipe for continuous data loading (see the sketch after this list).
Used COPY to bulk load data.
Set up data sharing between two Snowflake accounts.
Created internal and external stages and transformed data using DBT.
Created jobs to implement SCD Type 1 and Type 2 loads using DBT.
Worked with the Time Travel mechanism and table cloning.
Knowledge of working with Azure and Google Cloud data services.
Consulted on Snowflake data platform solution architecture, design, development, and deployment, focused on bringing a data-driven culture across the enterprise.
Implemented change data capture (CDC) in Snowflake using Streams to load deltas into the data warehouse.
Unloaded data from Snowflake tables into the client-required file formats (CSV, .dat, and .txt files).
Engaged in data analysis and profiling to answer business/functional questions.
Developed UNIX shell scripts, FTP transfers, and DataStage jobs to process and load data files.
Collaborated with subject matter experts and analyzed user needs to determine requirements.
Developed, updated, and maintained reusable software solutions to ease the work of other engineers and save development cost and effort.
Created a detailed implementation plan and coordinated with different teams on the implementation of common jobs and deployment.
Developed and supported the extract, transform, and load (ETL) process for data migration.
Implemented data purging and retention logic per compliance requirements.
Created the test environment for the staging area, loaded the staging area with data from multiple sources, and migrated objects across environments (DEV, UAT, and Prod).
Used Control-M for ETL process scheduling; created Control-M processes and monitored jobs for upstream and downstream processes.
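
A minimal sketch of the Snowpipe continuous-load and Stream-based CDC pattern referenced above. The raw_db and edw objects, the orders tables, and the columns are hypothetical placeholders; the actual pipelines used project-specific objects.

-- Hypothetical objects: raw_db.public.ext_stage (S3), orders_raw, orders_stream, edw.public.orders_dim.
-- Snowpipe: continuously load files that land in the external stage
-- (AUTO_INGEST requires S3 event notifications pointed at the pipe's queue).
CREATE OR REPLACE PIPE raw_db.public.orders_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw_db.public.orders_raw
    FROM @raw_db.public.ext_stage/orders/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Stream: captures the deltas (inserts, updates, deletes) on the raw table.
CREATE OR REPLACE STREAM raw_db.public.orders_stream
  ON TABLE raw_db.public.orders_raw;

-- Apply new and changed rows to the warehouse table; reading the stream in a
-- DML statement advances its offset, so each run sees only fresh deltas.
MERGE INTO edw.public.orders_dim AS tgt
USING (
  SELECT order_id, status, amount
  FROM raw_db.public.orders_stream
  WHERE METADATA$ACTION = 'INSERT'   -- new rows plus the new image of updated rows
) AS src
  ON tgt.order_id = src.order_id
WHEN MATCHED THEN
  UPDATE SET tgt.status = src.status, tgt.amount = src.amount
WHEN NOT MATCHED THEN
  INSERT (order_id, status, amount) VALUES (src.order_id, src.status, src.amount);

In practice a MERGE like this is typically wrapped in a scheduled Task so the deltas are applied continuously after each Snowpipe load.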


Environments: IBM DataStage 11.7, Informatica Intelligent Cloud Services (IICS), Windows XP, UNIX, PuTTY, WinSCP, Jira, Hadoop, HDFS, Hive, AWS S3, Snowflake, Redshift, GitHub, DBT

Client: CBA (Commonwealth Bank of Australia)    Feb 2021 – April 2022
Role: ETL Developer
Project: CVM
Commonwealth Bank is an Australian multinational banking and financial services company headquartered in Sydney, Australia. This project is primarily aimed at identifying CVM candidates among CBA customers and retrieving their details.

Worked on DataStage Designer and Director; involved in the development of DataStage jobs.
Involved in staging data through the ETL tool DataStage.
Used DataStage to extract data from different sources such as Oracle and flat files, transform it according to requirements, and then load it.
Implemented performance tuning options such as partitioning, sorting data before the Join stage, removing unnecessary columns, and handling nulls.
Tracked and monitored errors using DataStage Director.
Took regular backups of the developed jobs using the DataStage Designer Export/Import utility.

Environments: DataStage 9.1, Oracle 9i, Windows XP, UNIX, PuTTY, WinSCP, Jira

Client: USAA    Mar 2017 – Jan 2021
Role: ETL Developer
Project: IT Data and Analytics
The IT Data and Analytics system deals with financial analysis, delivery performance, resource type mix and optimization, infrastructure capacity and operational performance analysis, and availability/incident management. It provides best-in-class analytics, trusted and actionable, to optimize the business of IT and help business stakeholders effectively manage a diverse IT business portfolio, standardize a minimum set of reporting across all programs, leverage best practices across IT, and improve the ability to forecast funding and scope delivery.
The data warehouse is designed for data analysis. Source data from Netezza, Oracle, flat files, Salesforce, and SNOW is extracted and cleansed using ETL operations. The primary focus of the project is to pull initial and incremental data from the various USAA source systems and load it into the data warehouse. The scope is to capture the details of USAA products and construct a data warehouse that maintains their historical and current data, enabling decisions that enhance business processes.

Understood client requirements by reviewing the functional documents.
Served as the primary on-site technical lead during the analysis, planning, design, development, and implementation stages of data quality projects using Integrity.
Gathered requirements from business users and developed code per the business logic.
Implemented delta load jobs using the Change Capture stage.
Developed UNIX file-dependency scripts per business requirements.
Used DataStage to extract data from different sources such as Salesforce and flat files, transform it according to requirements, and then load it into the target.
Used various stages such as Transformer, Copy, Lookup, Sort, Funnel, Join, Data Set, Merge, Aggregator, Change Data Capture, Salesforce Connector, Remove Duplicates, and Sequential File.
Prepared project-related documentation to help others understand the code developed.
Prepared unit test results.
Involved in performance tuning of ETL jobs.
Handled Control-M process scheduling; created Control-M processes and monitored jobs for upstream and downstream processes.
Environments: DataStage 11.7, Web Services, Oracle 9i, Windows, UNIX, PuTTY, WinSCP, RTC, Jira
Client: USAA    Jun 2016 – Feb 2017
Role: ETL Developer
Project: LDRPS
United Services Automobile Association (USAA) is a Texas-based, diversified financial services group of companies, including a Texas Department of Insurance-regulated reciprocal inter-insurance exchange and subsidiaries offering banking, investing, and insurance to people and families that serve, or served, in the United States military.
The Living Disaster Recovery Planning System (LDRPS) is used for the following business purposes:
business continuation and no loss of USAA customer data;
no loss of business data, recovery of all projects from a disaster, and keeping systems active at all times.


Client: USAA    Mar 2015 – May 2016
Role: ETL Developer
Project: FAH

Business Problem: The enterprise's current financial data infrastructure requires too much manual manipulation and lacks systems integration.
Business Benefit Summary: Benefits are currently listed as No Financial Benefits (NFB); however, there are almost certainly benefits around full-time employee (FTE) reductions and efficiencies, as well as a reduction in compliance costs. While change communications are being coordinated, the team is to establish and certify these benefits as soon as possible.
Project Concept Definition: Implement Financial Accounting Hub solutions to field a single source of accounting truth and improve the financial insight of USAA's product line leaders, financial analysts, and senior financial officers.


Client: USAA    July 2013 – Mar 2015
Role: ETL Developer
Project: Mortgage
Business Problem:
As mortgage migrates to the new loan origination system, the business teams will need Operational & Enterprise Reporting capabilities to continue supporting the mortgage product and members.
Business Benefit Summary:
Establish mortgage Operational & Enterprise Reporting capabilities in support of the new loan origination system.
Project Concept Definition:
Establish mortgage Operational & Enterprise Reporting capabilities in support of the new loan origination system.

Understood client requirements by reviewing the functional documents.
Worked on DataStage Designer and DataStage Director; involved in the development of DataStage jobs.
Used DataStage to extract data from sources such as Oracle and flat files, transform it according to requirements, and then load it into Oracle tables and flat files.
Used various stages such as Copy, Modify, Lookup, Transformer, Sort, Funnel, Row/Column Generator, Join, Data Set, Merge, Aggregator, Sequential File, Remove Duplicates, and Surrogate Key Generator.
Handled SIT issues.
Involved in developing complex jobs using DataStage Designer and job parameters.
Took regular backups of the developed jobs using the DataStage Designer Export/Import utility.
Tracked and monitored errors using DataStage Director.
Handled job recoveries during load aborts.
Involved in performance tuning of ETL jobs.
Monitored DataStage jobs through Control-M.

Environments: DataStage 11.7/9.1, Web Services, Oracle 9i, Windows, UNIX, PuTTY, WinSCP, RTC, Jira
