
SRI CHEE - ETL DATA ENGINEER
[email protected]
Location: Frisco, Texas, USA
Relocation: TX
Visa: H1B
16 years of experience in diverse technology and business environments, with comprehensive hands-on expertise in delivering BI solutions, Data Engineering & Data Management, Cloud Data Architecture, Business Intelligence Strategy, Advanced Analytics and Business Intelligence applications.
Experience in identifying and implementing statistical methods and Machine Learning Models to achieve research objectives.
Experience in creating ad-hoc reports and dashboards for validating program integrity.
Experience acting as a conduit between customer needs and project teams: defining high-value solutions to meet those needs and ensuring the teams successfully deliver them.
Experience leading and engaging multiple stakeholder groups to understand business needs, prototype new ideas and technologies, and drive data innovation.
Developed meaningful insights from data by creating actionable narratives that simplify complex ideas for varied audiences.
Assisted in developing a BI roadmap, evaluating the current state and planning the future state to lay out the vision and goals.
Experience in providing technical oversight to the project teams in the unit to ensure development standards are being met.
Experience in providing estimates for various projects; developed estimation templates that were used across the department.
Experience in evaluating upgrade paths for software such as Informatica and Oracle.
Experience in creating audit jobs to validate data between source system and data warehouse.
Works proactively to identify and resolve compliance issues ahead of audits.
Ensures compliance with data governance requirements aligned with data owners and business objectives.
Extensively used ETL methodology to support data extraction, transformation and loading (building ETL pipelines) in a corporate-wide ETL solution using Informatica PowerCenter 10.x, Informatica Developer 10.x (DQ, MDM and BDM), Informatica DIH, Informatica Intelligent Cloud Services (IICS) and AWS Glue.
Data integration with Amazon Redshift, S3 and a data lake from an on-prem Oracle database using Informatica Cloud.
Very good experience in gathering business requirements, converting them into functional and technical specifications, and developing mapping documents that map source system data to the destination tables in the data warehouse model.
Experience in data warehouse techniques and practices, including ETL processes, dimensional data modeling (Star Schema, Snowflake Schema, FACT & Dimension Tables), OLTP and OLAP.
Successfully implemented migration projects (source system migrations and technology migrations): analyzed the existing system's functionality and prepared plans for migrating to the new system/technology while keeping business functionality intact.
Working experience on AWS services like IAM, EC2, S3, RDS, Redshift, Glue, EMR and VPC.
Extensive experience in converting on-prem PowerCenter jobs to IICS.
Proficient in API programming using Python.
Architected and implemented end-to-end automation solutions to speed up execution and avoid manual intervention, for both existing processes and new processes with scope for automation.
Experience in developing solution patterns, data flow diagrams, data models and ER models for optimal data storage in databases.
Experience in integration of various data sources like Salesforce, Amazon Redshift, RDBMS, flat files, XML data into RDBMS tables, Amazon S3 and flat files using Informatica, Glue and Python scripting.
Experience in analyzing stored procedures, packages, database triggers and functions in PL/SQL and converting them into ETLs to speed up the data load process and ease troubleshooting.
Good experience in writing Oracle SQL queries, from simple to complex with analytical functions; tuned queries by reading explain plans and applying the right Oracle hints in the right places.
Experience in working with global teams located across multiple geographies.
Good hands-on experience with Python 2.x and 3.x installation on Linux servers and implementing automated solutions.
Strong UNIX shell scripting experience automating file handling processes, enabling error checks, moving files between servers (including passwordless user access setup) and sending e-mail notifications (see the sketch after this summary).
Created dashboards using Tableau.
Supported QA, UAT and performance testing phases of the development cycle.
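A minimal sketch of the file-movement automation described in the scripting bullets above. The production work used UNIX shell scripts; this Python version is illustrative only, and the bucket, SMTP host and addresses are hypothetical placeholders.

import smtplib
from email.message import EmailMessage
from pathlib import Path

import boto3
from botocore.exceptions import ClientError

# Hypothetical names for illustration only.
BUCKET = "example-landing-bucket"
SMTP_HOST = "smtp.example.com"
ALERT_TO = "[email protected]"

def notify(subject: str, body: str) -> None:
    # Send a plain-text alert e-mail, mirroring the shell-script notifications.
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = "[email protected]"
    msg["To"] = ALERT_TO
    msg.set_content(body)
    with smtplib.SMTP(SMTP_HOST) as smtp:
        smtp.send_message(msg)

def move_to_s3(local_dir: str, prefix: str) -> None:
    # Upload each file in the landing directory to S3, with an error check per file.
    s3 = boto3.client("s3")
    for path in Path(local_dir).glob("*.csv"):
        try:
            s3.upload_file(str(path), BUCKET, f"{prefix}/{path.name}")
            path.unlink()  # remove the local copy only after the upload succeeds
        except ClientError as err:
            notify(f"S3 upload failed: {path.name}", str(err))
            raise

if __name__ == "__main__":
    move_to_s3("/data/outbound", "daily-loads")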

Cloud Evaluations and Cloud Architectures
More than 4 years of experience providing cloud data architectures, cloud migration strategies, roadmaps and architectures for enterprise data warehouse platforms and data lakes in the cloud.
Onsite/Offshore Model - Expertise in building and working with teams located across multiple geographies.

Technology Expertise
Collaborated with various business executives and IS management (CIO, Bureau Section Chiefs and AVP) on initiatives and projects, presenting information strategies, roadmaps, technology trends and recommendations. Strong expertise in delivering presentations to a variety of audiences, including internal IS senior management, business steering committees and external vendors.
Extensive knowledge and experience of modern data architecture and technologies such as data appliances, cloud computing (AWS, Microsoft Azure), ETL, reporting, big data technologies like Hadoop, data visualization, data streaming and analytics.
Proficient in data modeling and dimensional modeling (conceptual, logical and physical models).
Expertise in statistical methods.
Experience in building Machine Learning models.
Experience in Data profiling, Data quality and Data governance.


PROFESSIONAL CERTIFICATIONS

AWS Certified Solutions Architect - Associate.
AWS Certified Cloud Practitioner.
PCAP - Certified Associate in Python Programming.
Oracle - SQL Developer.

EDUCATION


Bachelor's in Computer Science Engineering from Jawaharlal Nehru Technological University (accredited by NAAC with grade A), India.


TECHNICAL SKILLS

ETL TOOLSET: Informatica PowerCenter (8.x/9.5/10.x), Informatica MDM 10.x, Informatica Cloud (IICS), Informatica PowerExchange for Mainframes, Microsoft SQL Server packages (SSIS, SSAS).

REPORTING TOOLSET: SAP Business Objects.

VISUALIZATION TOOLSET: Tableau, DOMO.

DIMENSIONAL MODELING: Ralph Kimball, Bill Inmon.

DATABASES: DB2, MS SQL Server, Oracle 8i/9i/10g/11g/12c/Exadata, PostgreSQL, Teradata, Redshift, Azure DB, Snowflake, DynamoDB.

SCRIPTING LANGUAGES: Python, R, SAS, XML, SQL, PL/SQL, UNIX Shell Scripting.

CLOUD: AWS (Amazon Web Services) - EC2, Glue, S3, EMR.

BIG DATA: Apache Spark, Hive, Pig, HDFS.

SCHEDULERS: UC4, Autosys, Control-M, Airflow.

MISCELLANEOUS TOOLS: PuTTY, Erwin, MS VS Code, JIRA, Toad, Oracle SQL Developer, SQL Navigator, IBM Data Studio, MS Visio.




DOMAIN KNOWLEDGE

P&C Insurance, Entertainment, Retail, Pharma, Health Care, Telecom and Public Administration.

WORK EXPERIENCE

CLIENTS:

Fidelity Jan 23 - Present

Role: Principal Data Engineer
PROJECT INFORMATION

BUSINESS METRICS DATA LAKE: BMDL has been set up to house data from multiple source systems (HR, WI, FI, PI) in a Snowflake data warehouse. Once the source data is loaded into Snowflake, datasets and dashboards are created in DOMO for reporting.
Responsible for designing and building data pipelines for BMDL.
Created data pipelines in the IICS Data Integration tool for data movement to and from the cloud.
Created batch scripts to move files from on-prem locations to AWS S3.
Extensively worked on API programming using Python to download datasets from DOMO (see the sketch at the end of this list).
Migrated the on-prem Bitbucket Stash repository to a GitHub repository for the BMDL stack.
Converted the on-prem Jenkins CI/CD pipeline to a cloud Jenkins pipeline.
Converted legacy ETL pipelines into Snowflake SQL.
Worked with a global team located across multiple geographies.
Created a dynamic data ingestion process to create views for source system objects in the data lake for DOMO reporting.
Created datasets and dashboards in DOMO.
Developed API programs using Python to load applications and source databases.
Created schemas in Snowflake for database reorganization.
Created SnowSQL scripts for data ingestion (see the sketch after the environment line).
Converted existing ETL pipelines in Informatica PowerCenter to Informatica Intelligent Cloud Services (IICS).
Collaborated with Data Analysts from different teams, creating views of source objects in Snowflake and onboarding them to DOMO.
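A minimal sketch of the DOMO dataset download mentioned above, assuming hypothetical credentials and dataset ID; the endpoints follow DOMO's public Data API, but treat the exact paths and parameters as illustrative.

import os
import requests

# Hypothetical IDs/credentials for illustration only.
API = "https://api.domo.com"
CLIENT_ID = os.environ["DOMO_CLIENT_ID"]
CLIENT_SECRET = os.environ["DOMO_CLIENT_SECRET"]
DATASET_ID = "example-dataset-id"  # placeholder

def get_token() -> str:
    # OAuth2 client-credentials grant scoped to data access.
    resp = requests.post(
        f"{API}/oauth/token",
        params={"grant_type": "client_credentials", "scope": "data"},
        auth=(CLIENT_ID, CLIENT_SECRET),
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def download_dataset(dataset_id: str, out_path: str) -> None:
    # Export the dataset as CSV, including the header row.
    resp = requests.get(
        f"{API}/v1/datasets/{dataset_id}/data",
        params={"includeHeader": "true"},
        headers={"Authorization": f"Bearer {get_token()}", "Accept": "text/csv"},
    )
    resp.raise_for_status()
    with open(out_path, "w", encoding="utf-8") as f:
        f.write(resp.text)

if __name__ == "__main__":
    download_dataset(DATASET_ID, "domo_dataset.csv")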
Environment: Informatica Intelligent Cloud Services (IICS), Informatica Data Management Cloud, Python, Informatica PowerCenter, Snowflake, AWS, Control-M, Jenkins, Bitbucket, Oracle, SQL Server, DOMO.
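A minimal sketch of the SnowSQL-style ingestion referenced above, rewritten here with the Snowflake Python connector; the stage, table and file-format names are hypothetical.

import os
import snowflake.connector

# Hypothetical warehouse/database/schema names for illustration only.
conn = snowflake.connector.connect(
    account=os.environ["SF_ACCOUNT"],
    user=os.environ["SF_USER"],
    password=os.environ["SF_PASSWORD"],
    warehouse="LOAD_WH",
    database="BMDL",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # Load the day's files from an external S3 stage into a staging table.
    cur.execute("""
        COPY INTO STAGING.SRC_EVENTS
        FROM @BMDL.STAGING.S3_LANDING/daily-loads/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    print(cur.fetchall())  # per-file load results returned by COPY
finally:
    conn.close()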

DCF, State of WI, Madison, WI Jul 15 - Dec 22

The Department of Children and Families (DCF) aims to promote the economic health and social well-being of Wisconsin's children and families. Worked on multiple projects, including federal projects, simultaneously as a BI solution lead, designing right-sized solutions and providing technical oversight.

Role: Business Intelligence Lead

Involved in every phase of project life cycle from project scoping through project implementation.
Managed relationships with multiple business stakeholders and recommended products and services to satisfy client business needs.
Responsible for managing various SDLC processes from requirements gathering through implementation and support.
Researched and recommended new technologies, architectures, tools and techniques to keep the organization on par with industry standards for data and information management.
Assisted the customers in the identification of data, information and analytical needs. Suggested relevant, effective and efficient solutions to best meet customer needs and budgets.
Experience in end-to-end data quality testing and support in an enterprise warehouse environment.
Experience in maintaining data quality, data consistency and data accuracy for data quality projects.
Provided production support to schedule and execute production batch jobs and analyzed log files on Informatica 8.6 and 9.1 Integration servers.
Experience in data profiling and scorecard preparation using Informatica Analyst.
Strong knowledge of Informatica IDQ 9.6.1 transformations and the PowerCenter tool.
Strong exposure to source-to-Confidential data flows and data models for various data quality projects.
Parameterized Informatica ETLs to reduce build time, effectively managing ETLs in production.
Guided and mentored other developers on the team to resolve technical issues and fully understand requirements; performed peer reviews, kept an eye on all project activities and deliverables, and provided immediate assistance when something went wrong in the daily production batch cycle or other production loads.
Involved in daily status calls with onsite project managers and DQ developers to update test status and defects.
Created detailed report documentation for the reports built, covering report requirements (how objects and measure objects are created), intended audience and security policy.
Provided training to report users on the Business Objects Universe and the Webi tool.
Worked with the data governance team and research team while building the technical solution.
Worked as interim project manager for a release when that position became vacant in the middle of the release.
Created ad-hoc reports with complex SQL queries on DB2 and Oracle databases for the budget team (see the sketch after the environment line).
Created dashboards using Tableau for the Bureau of Program Integrity.
Created data extracts in MS Excel using pivot tables and SAS plugins for the Policy team.
Extensively used SAS to validate federal reports against source application data.
Environment: Informatica 9.x/10.x, Informatica IDQ 10.x, IICS, Oracle 12c/Exadata, IBM DB2, SAP BO 4, MS SQL Server 2017, Tableau, Snowflake, SQL Developer, PostgreSQL, Python 3, R, SAS.
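A minimal sketch of the kind of ad-hoc analytical query used for the budget team, run here via python-oracledb; the connection details, table and column names are hypothetical.

import os
import oracledb  # thin mode needs no Oracle client install

# Hypothetical DSN and schema; the window function illustrates the kind of
# analytical SQL used for ad-hoc budget reporting.
conn = oracledb.connect(
    user="report_user",
    password=os.environ["DB_PASSWORD"],
    dsn="dbhost/ORCLPDB1",
)
cur = conn.cursor()
cur.execute("""
    SELECT program_code,
           fiscal_month,
           spend_amt,
           SUM(spend_amt) OVER (PARTITION BY program_code
                                ORDER BY fiscal_month) AS running_spend
    FROM   budget_fact
    ORDER  BY program_code, fiscal_month
""")
for row in cur:
    print(row)
conn.close()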







Infosys Limited, TX Nov 11 - Jun 15

Infosys Limited: With over three decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.

CLIENTS:

American Family Insurance, Madison, WI Nov 11 - Jun 15

Role: Technology Lead

RESPONSIBILITIES
Interacted with business representatives and senior management for requirements analysis and to define business and functional specifications.
Designed and implemented Business Intelligence solutions providing financial, operational and analytical reporting for subject areas such as Policy, Claims, Billing, Financial, Health and Education, for both the legacy DB2 EDW and Advance BIC.
Architected the migration strategy for moving BIC Analytics from Oracle to Greenplum.
Architected a solution combining data from American Family and Homesite to provide a centralized reporting platform for the Mindi program.
Involved in numerous cloud POCs to shape the future BI cloud architecture.
Involved in Performance tuning for the Data Warehouse Environment.
Designed ETL processes to load all the history from legacy sources to the new BI environment.
Designed ETL processes to load new products data from Guidewire Policy Center to the new BI environment.
Maintain technical documentation.
Create test strategy and integration test plan.
Assign technical resources to project tasks.
Coordinate the user acceptance testing.
Assist the project team with the data modeling efforts and review of the final model.
Worked with other data stewards to ensure consistency, avoid redundancy, and provide integrated data.
Provide guidance to technical staff for consistent methods of extracting from the source systems.
Provide guidance to technical staff for consistent methods of creating fact and dimension tables (see the sketch after the environment line).
Facilitate and communicate standards and best practices to technical staff.
Ensure the architecture standards and guidelines for the BI program are being met.
Maintain the documentation on data structures already developed or being developed to reduce redundancy.
Extensively worked on setting standards for the offshore team.

Environment: Informatica PC 9X, Autosys, Oracle 12g, IBM DB2, SAP BO 3, SQL Developer, Postgres SQL, Hadoop, Pig, Hive.
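A minimal sketch of the consistent fact/dimension method referenced above, under assumed file and column names: incoming rows are matched to a dimension on their natural key, first-seen keys get new surrogate IDs, and fact rows carry the surrogate key.

import csv
import itertools

# Hypothetical files/columns; illustrates the surrogate-key pattern only.
dim = {}                       # natural key -> surrogate key
next_key = itertools.count(1)

def surrogate(natural_key: str) -> int:
    # Assign a new surrogate key the first time a natural key appears.
    if natural_key not in dim:
        dim[natural_key] = next(next_key)
    return dim[natural_key]

with open("policies.csv", newline="") as src, open("fact_policy.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    writer.writerow(["policy_sk", "premium_amt"])
    for row in csv.DictReader(src):
        writer.writerow([surrogate(row["policy_no"]), row["premium_amt"]])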




Tata Consultancy Services, India Apr 08 - Oct 11

Tata Consultancy Services: TCS's organization structure is domain-led and empowered to give customers a single window to industry-specific solutions. Agile industry units have embedded capabilities to enable rapid responses that provide a competitive edge to our customers. This is coupled with a unique Global Network Delivery Model (GNDM), spanning 40 global locations, that is today recognized as the benchmark of excellence in technology deployment.

Senior Software Engineer
Worked as a developer on multiple projects across diverse domains (Health Care and Telecom).

Role: Lead BI Developer

RESPONSIBILITIES
Built a brand-new data mart to support the client's regulatory and analytical needs.
Participated in all phases such as Business Requirements gathering, technical design, high level design, detail design, development, review, testing and support.
Involved in business analysis, prepared the Detail Design Document (DDD) and data mapping documents, and developed the technical specifications.
Utilized Informatica PowerCenter 9.1.0/8.6.1 to extract data from various sources like flat files, mainframe files, SQL Server and CSV files into staging tables.
Designed techniques for the Change Data Capture (CDC) and for the Full refresh of the Datamart.
Implemented performance tuning techniques to increase the performance of the loading process.
Constructed and executed system and integration test cases, and presented test results to Business Analysts.
Created and maintained shell scripts to automate processes, schedule jobs, and validate the parameters/variables passed to the workflows.
Implemented data masking methods to protect PI and PHI information in line with HIPAA (see the sketch after the environment line).
Took periodic tests on the Health Insurance Portability and Accountability Act (HIPAA).
Trained new team members on the Health Insurance Portability and Accountability Act (HIPAA).
Delivered phased projects to develop the Enterprise Data Warehouse and Data Marts supporting Business Intelligence.
Created ad-hoc and scheduled reports using Business Objects.
Prepared and executed unit and system test cases.
Worked extensively on SQL queries on Oracle databases for data integrity and data quality measures.

Environment: Informatica 8.6, Oracle 9i/10g, SAP BO XI, MS SQL Server 2008, MS SQL Server packages (SSIS, SSRS and SSAS), HP QC, SQL Developer, Teradata, Unix.
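A minimal sketch of one data masking approach consistent with the HIPAA bullet above, not the project's actual method; the column names, input file and key source are hypothetical.

import csv
import hashlib
import hmac

# PII/PHI columns are replaced with keyed hashes so values stay consistent
# across loads without being reversible.
SECRET_KEY = b"rotate-me"          # in practice, pulled from a secrets store
PII_COLUMNS = {"ssn", "member_name", "dob"}

def mask(value: str) -> str:
    # Keyed, deterministic hash: same input -> same token, but not reversible.
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

with open("claims.csv", newline="") as src, open("claims_masked.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        for col in PII_COLUMNS & set(row):
            row[col] = mask(row[col])
        writer.writerow(row)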