Babitha - Data Engineer
[email protected]
Location: Atlanta, Georgia, USA
Visa: H-1B

SUMMARY
Over 10 years of IT experience using SQL, PL/SQL and ETL methodologies to support data extraction, migration, transformation and loading with Informatica PowerCenter and the MSBI suite, plus manual testing for client/server and web-based applications.
Experience developing data pipelines with AWS EMR, S3, Glue, the Glue Data Catalog and Athena, along with Spark and PySpark.
Good understanding and hands-on experience of AWS cloud architecture, Azure, Databricks and Snowflake, along with DevOps technologies such as Git, Jenkins, Docker and DevSecOps.
Experience in Extract, Transform, Load (ETL) of data from various sources into data warehouses and data marts using Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor, Metadata Manager) and SSIS.
Exposure to and knowledge of generative AI and LLMs: Amazon Bedrock, Amazon CodeWhisperer.
Scrum Master experience.
Engaged in the bidding process to secure new projects and extend contracts.
TECHNICAL SKILLS
Cloud Technologies: Amazon Web Services (AWS), Azure, Databricks
DevOps: Git, Jenkins, Docker, Kubernetes, Ansible, Nagios, DevSecOps
ETL & Analytical Tools: Informatica PowerCenter 9.5, 9.x, 8.x, 7.x, SSIS, SSAS, AWS Glue
Reporting Tools: Power BI, SSRS, Tableau
Operating Systems: UNIX, Windows 98/NT/2000/XP/10/11
Databases/Data Warehouses: Oracle 11g/10g/9i, MS SQL Server, MySQL, Redshift, Snowflake
Programming Languages: SQL, JavaScript, C#.NET, C++, C, Java, R, Python, PySpark

EDUCATION
Master of Science in Computer Science, North Dakota State University, USA. (Dec, 2015)
Master of Technology, Computer Science Engineering, Osmania University, India. (Oct, 2010)

CERTIFICATION
AWS Certified Data Analytics - Specialty - Feb 2023
Azure Data Fundamentals - Oct 2022
AWS Certified Solutions Architect - Associate - Jun 2020
Oracle PL/SQL Developer Certified Associate - Oct 2016

PROFESSIONAL EXPERIENCE
Capgemini: Senior Software Engineer - Feb 2022 to present

Client: McDonald's
Role: Lead Data Engineer
Responsibilities:
Build, scale and maintain data pipelines that process millions of daily customer transactions from Redshift databases into AWS S3 and Glue, accessible through Athena.
Build datasets capturing customer journeys, third-party deliveries, tokenisation, basket analysis and occasion analysis for both the UK and Ireland markets.
Support deployed data applications and analytical models as a trusted advisor to Data Scientists and other data consumers, identifying data problems and guiding issue resolution with partner Data Engineers and source data providers.
Explore new data sources to provide additional data powering the data analytics team's Tableau visualisations.
Review code before deployment in AWS CodeCommit.
Implement and support a platform providing ad-hoc access to large datasets.
Run data quality tests with the Great Expectations framework.
Coordinate closely with clients and internal teams.
Manage internal and external stakeholders.
Mentor and assist other engineers in and outside my areas of ownership and expertise.
Assumed the role of Scrum Master and its associated responsibilities.
Manage workflow using JIRA.
Capture design and documentation in Confluence.
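A minimal illustration of the kind of row-level quality checks this pipeline runs (the production work uses the Great Expectations framework; the column names and rules below are hypothetical, not taken from the project):

```python
# Hypothetical sketch of row-level data quality checks on transaction
# extracts; "order_total" and "market" are illustrative field names.

def check_transactions(rows):
    """Validate a batch of transaction dicts; return a list of failures."""
    failures = []
    for i, row in enumerate(rows):
        # Expect a non-null, positive order total.
        if row.get("order_total") is None or row["order_total"] <= 0:
            failures.append((i, "order_total must be positive"))
        # Expect the market to be one of the served markets.
        if row.get("market") not in {"UK", "IE"}:
            failures.append((i, "unknown market"))
    return failures

good = {"order_total": 12.5, "market": "UK"}
bad = {"order_total": None, "market": "FR"}
print(check_transactions([good, bad]))
```

In a real deployment these rules would live in an expectation suite and gate the load step, failing the batch before it reaches Athena-facing tables.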

Currys Plc (previously Dixons Carphone), Multiplay - Sep 2017 - Feb 2022
Dixons Carphone is Europe's number one electrical and telecommunications retailer and services company, operating across nine countries. Multiplay is the part of the business where broadband, home phone and television services from top suppliers in the market are promoted to customers across the UK.

Role: Data Manager
Responsibilities:
Managing pricing and promotional data for the Multiplay broadband products for various sales channels.
Creating/adding new products and propositions in the pricing database.
Creating/updating services, equipment, packages (products) and offers (promotions) on the base Multiplay broadband packages/products.
Coordinating with the trading team to decide the upcoming propositions.
Launching products on various sales channels (in-store retail, indirect/call centres, ecommerce) on time and as needed.
Creating workflows in CRM for gift cards, bill credits, cash back vouchers.
An expert in the DMT (Data Management Tool) used to manage the pricing database.
Creating and managing campaigns and partners in CRM.
Producing weekly reports on sales and submission rates for individual broadband, phone and TV service providers across sales channels: Ecommerce, Retail and Partnerships.
Analysing YoY and MoM sales data to identify new opportunities and ways to reduce pre-submission and post-submission customer order dropout rates.
Responsible for a monthly sales report from the sales cube showing orders placed, cancelled, installed, and dropout rates.
Identifying new opportunities and ways of working with the current system to improve the performance, efficiency, scalability, and usability of the existing pricing system.
Active in proposing new features and presenting the ideas to the developers and stakeholders of the applications.
Mentoring new team members in process, data, and applications.
Documenting, updating, and managing the concepts and terminology used within the current pricing system.
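The dropout rate in the monthly sales report above reduces to a simple ratio of placed orders that never reached installation; a hypothetical sketch (the status values and field names are illustrative, not from the actual sales cube):

```python
# Hypothetical dropout-rate calculation for the monthly sales report;
# order statuses ("installed", "cancelled", "dropped") are illustrative.

def dropout_rate(orders):
    """Share of placed orders that were cancelled or dropped before install."""
    placed = len(orders)
    if placed == 0:
        return 0.0
    installed = sum(1 for o in orders if o["status"] == "installed")
    return round((placed - installed) / placed, 4)

sample = [
    {"status": "installed"},
    {"status": "cancelled"},
    {"status": "installed"},
    {"status": "dropped"},
]
print(dropout_rate(sample))  # → 0.5
```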

Microsoft, M&O CMO (Fargo, USA) - Jan 2016 - Jun 2016

Microsoft is an American multinational company that develops, manufactures, licenses, supports and sells computer software, personal computers and services. The project goal was to transform and consolidate marketing data from multiple source systems into one centralized place, which involved substantial business and technical analysis, and analysing and reporting on sales data across product categories and channels to support the company's sales and marketing efforts.

Role: Data Analyst
Responsibilities:
Understanding, analysing and transforming the data (marketing leads) from various marketing operations.
Developed a wide range of Business Intelligence solutions used to manage the output of large data volumes for marketing managers; developed and deployed SSIS packages using Team Foundation Server.
Used ETL processes to achieve effective business use of high volumes of data.
Built and maintained SQL scripts and complex queries for data analysis and extraction.
Assisting with the transition to a new marketing automation system, analysing existing data practices and making recommendations for improvements.
Generating reports using Power BI for various marketing teams.
Ensure best practices are applied and integrity of data is maintained through security, documentation, and change management.
Mentor and coach new team members on business process and technical aspects.
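A sketch of the kind of SQL used for marketing-lead analysis in this role (run here against SQLite purely for illustration; the actual environment was MS SQL Server, and the table and column names are invented):

```python
# Hypothetical marketing-lead analysis query; "leads", "channel" and
# "qualified" are illustrative names, and SQLite stands in for
# MS SQL Server so the sketch is self-contained.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE leads (channel TEXT, qualified INTEGER)")
conn.executemany(
    "INSERT INTO leads VALUES (?, ?)",
    [("email", 1), ("email", 0), ("web", 1), ("web", 1)],
)
# Lead volume and qualification rate per marketing channel.
rows = conn.execute(
    """
    SELECT channel,
           COUNT(*) AS total_leads,
           AVG(qualified) AS qualification_rate
    FROM leads
    GROUP BY channel
    ORDER BY channel
    """
).fetchall()
print(rows)  # → [('email', 2, 0.5), ('web', 2, 1.0)]
```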

Open Access Technology International (OATI), Minneapolis, USA - Jan 2015 - Aug 2015

Open Access Technology International, Inc. is the industry leader in the North American energy industry, providing advanced application solutions as web-based services to over 700 customers. The project I worked on was aimed at the design, development and migration of OATI products from heterogeneous data sources, and the upgrade of Informatica and Oracle Database on Linux. It involved using Informatica Data Services to profile and document the structure and quality of all data.

Role: Junior Software Developer
Responsibilities:
Gathering business requirements for the data-warehouse as well as business-intelligence reports to be used by the management.
Creating, monitoring, modifying, & communicating the project plan with other team members.
Worked on PowerCenter tools including Designer/Repository and Workflow Manager/Monitor.
Extensively used Informatica transformations such as Source Qualifier, Rank, Expression, Router, Filter, Lookup, Joiner, Aggregator, Normalizer and Sorter, along with their transformation properties, in Informatica PowerCenter 9.5.
Translated business processes into Informatica mappings to build data marts.
Converted WML (web plus) scripts into C# .NET programs using the .NET Framework.
Responsible for writing test cases to cover overall quality assurance using HP ALM.
Responsible for Integration, Functional and End to End testing.
Involvement in Test Execution, Results Analysing and Defect Reporting.
Attended daily status calls with the internal team and weekly calls with the client, and updated the status report.

Sree Nipuna Technologies, India - June 2009 - May 2013

Role: Informatica Developer
Responsibilities:
Involved in the complete life cycle of developing an enterprise data warehouse application and developed the ETL architecture using Informatica.
Designed data warehouse target tables using dimensional modelling techniques: Star and Snowflake schemas.
Created fact and dimension tables.
Extracted data from various data sources, transformed it according to business requirements and loaded it into the targets.
Created complex mappings using various transformations: Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, etc.
Performed unit testing and user acceptance testing to check that the data loaded into the targets is accurate and satisfies user requirements.
Documented in Confluence pages; mentoring and knowledge transfer.
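The fact and dimension tables mentioned above follow the usual star-schema shape: a central fact table of measures keyed to surrounding dimension tables. A hypothetical sketch (SQLite stands in for the actual Oracle targets, which were loaded via Informatica rather than SQL inserts; all names are invented):

```python
# Hypothetical star-schema sketch: one dimension table, one fact table,
# and the typical star-join aggregation query. Illustrative names only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    -- Dimension: one row per product.
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,
        product_name TEXT
    );
    -- Fact: one row per sale, keyed to its dimensions.
    CREATE TABLE fact_sales (
        product_key INTEGER REFERENCES dim_product(product_key),
        sale_date TEXT,
        amount REAL
    );
    """
)
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget')")
conn.execute("INSERT INTO fact_sales VALUES (1, '2013-01-05', 9.99)")
# Typical star-join query: facts aggregated by a dimension attribute.
rows = conn.execute(
    """
    SELECT p.product_name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.product_name
    """
).fetchall()
print(rows)  # → [('Widget', 9.99)]
```

A Snowflake schema differs only in that dimensions are further normalized into sub-dimension tables.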