P BHAGATH
Sr. ETL Informatica Developer
Location: Abingdon, Illinois, USA
Visa: H1B
Contact: +1 (804) 666-9819
Email: [email protected]
LinkedIn: https://www.linkedin.com/in/bhagath5680456205/

Professional Summary:

IT professional with over 10 years of experience in data warehousing and ETL development. Expert in designing and implementing EDW, ODS, and Data Marts using Informatica, SSIS, and Talend. Proficient in SQL, PL/SQL, and Big Data technologies like Hive and Sqoop. Skilled in data quality, governance, and performance optimization. Proven ability to lead cross-functional teams and deliver robust data solutions.


Specialized in designing and constructing Enterprise Data Warehouses (EDW), Operational Data Stores (ODS), Data Marts, and Decision Support Systems (DSS) using Multidimensional and Ralph Kimball Dimensional modeling (Star and Snowflake schema) concepts.
Proficient in Change Data Capture (CDC) and daily load strategies for Data Warehouses and Data Marts, including Slowly Changing Dimensions (Type 1, Type 2, and Type 3), Surrogate Keys, and Data Warehouse principles.
Extensive experience in ETL methodologies, encompassing Data Profiling, Data Migration, Extraction, Transformation, and the design of data conversions from diverse source systems, including Oracle, DB2, SQL Server, Teradata, Hive, as well as non-relational sources like flat files, XML, and Mainframe files.
Highly skilled in developing Informatica Mappings, Mapplets, Sessions, Worklets, and Workflows for data integration and loading.
Proficient in the installation and management of Informatica PowerCenter, Metadata Manager, Data Explorer, and Data Quality.
Experienced in Big Data technologies such as Hive and Sqoop.
Developed Python User-Defined Functions (UDFs) for Pig and Hive to preprocess and filter data sets in distributed environments (see the sketch after this summary).
Strong command of SQL, PL/SQL, including packages, functions, stored procedures, triggers, and materialized views for implementing business logic in Oracle databases.
Extensive experience in ETL testing, using tools like Informatica (PowerCenter/PowerMart), Teradata, and Business Objects, alongside expertise in working with relational databases, including Oracle, SQL Server, DB2, UDB, MS Access, and Teradata.
Proficient in ETL mappings, data analysis, and documentation of OLAP report requirements, with a solid understanding of OLAP concepts and the complexities of managing large data sets.
Expert knowledge of data governance, ensuring that data management policies and standards are consistently followed across the organization.
Extensive background in Dimensional Modeling, Data Migration, Data Cleansing, and Data Staging for operational sources using ETL and data mining features for data warehousing.
Experienced in enhancing and deploying SSIS Packages from development to production servers.
Proficient in ETL implementation using SQL Server Integration Services (SSIS) and Reporting Services (SSRS).
Triaged data quality issues by analyzing quality scorecards and dashboards through to resolution, ensuring data quality met required standards.
Expert-level skills in testing Enterprise Data Warehouses using various ETL tools, including Informatica PowerCenter, DataStage, Ab Initio, and SSIS.
Extensive use of DataStage Change Data Capture for DB2 and Oracle files, employing change capture stages in parallel jobs.
Comprehensive knowledge and experience in process improvement, normalization/de-normalization, data extraction, data cleansing, and data manipulation.
Took ownership of creating and maintaining data quality processes and procedures to ensure ongoing compliance with established standards.
Proficient in evaluating data profiling, cleansing, integration, and extraction tools, including Informatica, Kalido, and Composite Software.
Proficient in creating comprehensive specification documents delineating source-to-target systems.
Possess a deep understanding of the Data Warehouse project development life cycle.
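
The Python UDF work above can be illustrated with a minimal sketch, assuming a Hive TRANSFORM streaming setup; the column layout (id, amount, status), null token handling, and filter rules are hypothetical, not the actual production schema.

    #!/usr/bin/env python
    # Hive TRANSFORM streaming script: reads tab-delimited rows from stdin,
    # drops malformed or null-amount records, and emits cleaned rows.
    import sys

    for line in sys.stdin:
        fields = line.rstrip("\n").split("\t")
        if len(fields) != 3:
            continue                          # skip malformed rows
        rec_id, amount, status = fields
        if amount in ("", "\\N"):             # Hive streams NULL as \N
            continue
        try:
            amt = float(amount)
        except ValueError:
            continue
        print("\t".join([rec_id, "%.2f" % amt, status.strip().upper()]))

From Hive this would be wired in as, for example: ADD FILE clean_rows.py; SELECT TRANSFORM (id, amount, status) USING 'python clean_rows.py' AS (id, amount, status) FROM src_table;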




TECHNICAL SKILLS:


Operating Systems: Windows, Mac, Linux (Red Hat), UNIX (Solaris, AIX v5.2, SunOS 5.10)
Languages: SQL, PL/SQL, Python, T-SQL, UNIX Shell Scripts, Perl Scripting, Java, XML, Sqoop, Hive
ETL Tools: Informatica PowerCenter, Informatica Intelligent Cloud Services (IICS), Informatica Developer (IDQ), Data Management, Big Data, DataStage, DTS, SSIS
Databases: Oracle, Siebel, MS SQL Server R2, DB2, MySQL, Greenplum, Redshift, Netezza, Teradata
Methodologies: Agile, Waterfall, Snowflake Schema
Scheduling Tools: Autosys, Control-M, Informatica Scheduler
Reporting Tools: Business Objects, Tableau, OBIEE


PROFESSIONAL EXPERIENCE:

Client: BCBSA, Chicago, IL. March 2023 - Till Date
Sr. ETL Informatica Developer
Responsibilities:
Collaborated within a cross-functional team of Architects and Developer Consultants to conceptualize, create, and enhance software solutions.
Acted as the primary process expert on data quality, demonstrating a deep understanding of data quality principles, methodologies, and best practices.
Spearheaded the development of comprehensive project documentation, encompassing Functional, Technical, and ETL Specification documents.
Designed and executed ETL mappings and processes in strict adherence to company standards, employing Informatica PowerCenter as the primary tool.
Pioneered the setup of Hadoop configurations, including the establishment of Hadoop clusters and Hive connections, utilizing Informatica BDM (Big Data Management).
Reviewed data to identify patterns, trends, errors, or inconsistencies that affected data quality.
Articulated data quality issues effectively to the technical team, facilitating understanding and prompt action.
Crafted intricate PL/SQL program units (stored procedures and functions) and orchestrated jobs using Unix shell scripts.
Managed complex ETL mappings, with a focus on accommodating slowly changing dimensions efficiently.
Optimized source queries to control temporary storage space utilization and introduced delay intervals as per business requirements to enhance performance.
Consistently applied existing ETL standards during the development of mappings.
Extensively utilized Informatica tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
Drove end-to-end data processing, encompassing source analysis, data transformation, mapping, and target data loading using Informatica Power Center Designer.
Leveraged Informatica Workflow Manager to craft workflows, establish database connections, define sessions, and orchestrate batches for executing mappings.
Contributed significantly to Data Mart model design using Erwin, following the Star schema methodology.
Championed performance optimization efforts and streamlined the extraction of header and footer information within a single file (see the sketch after this list).
Developed measurement plans to assess the quality of data from different sources, ensuring alignment with client expectations.
Generated reports on data quality issues and provided recommendations for resolving problems, highlighting the impact on business processes.
Employed SSIS to craft ETL packages, overseeing data validation, extraction, transformation, loading into data warehouse databases, and the processing of SSAS cubes.
Masterminded intricate mappings in Informatica, incorporating a wide range of PowerCenter transformations.
Developed a robust Change Data Capture (CDC) mechanism using Informatica PowerExchange for select interfaces, considering project requirements and constraints.
Translated business requirements from Functional specifications into technical specifications to architect ETL methodologies.
Assumed responsibility for the development, support, and maintenance of ETL processes using Informatica Power Center.
Managed metadata associated with ETL processes, facilitating data warehouse population.
Led the inception-to-production setup of Informatica BDM and Hadoop cluster environments.
Executed logging and deployment of diverse packages within SSIS.
Created DataStage jobs to import data from heterogeneous sources like Oracle 9i, text files, and SQL Server.
Fostered close collaboration with Business users, Informatica Product support group, and Hortonworks teams.
Orchestrated various tasks within workflows, including sessions, events, decisions, emails, commands, worklets, assignments, timers, and workflow scheduling.
Engineered Informatica mappings to extract data from a range of sources, including Oracle, files, and Salesforce, implementing multiple transformations for Salesforce data integration.
Devised a hierarchical structure for Contact accounts in Salesforce using ETL Informatica logic and mappings.
Extracted data from multiple databases, including Oracle, SQL Server, DB2, and flat files, using Informatica for centralized storage within a data warehouse repository.
Managed metadata within DataStage Manager, handling metadata importation from the repository, creating new job categories, and defining new data elements.
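
A sketch of the single-file header/footer handling referenced above, assuming 'H'/'T' record-type tags, a pipe delimiter, and a trailer carrying the expected detail count; real layouts come from the interface specification.

    # Single-pass split of a flat file carrying a header record, detail rows,
    # and a trailer (footer) holding the expected record count.
    def split_feed(path, delim="|"):
        header, details, trailer = None, [], None
        with open(path) as fh:
            for raw in fh:
                parts = raw.rstrip("\n").split(delim)
                if parts[0] == "H":
                    header = parts[1:]        # e.g. feed name, business date
                elif parts[0] == "T":
                    trailer = parts[1:]       # e.g. expected detail count
                else:
                    details.append(parts)
        # Reconcile the detail count against the trailer before loading.
        if trailer is not None and int(trailer[0]) != len(details):
            raise ValueError("trailer count %s != %d details read"
                             % (trailer[0], len(details)))
        return header, details, trailer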

Environment: Informatica, Informatica BDM, Informatica IICS, Informatica IDQ, Oracle, Toad, Autosys, Data Integration, Linux, Unix, SQL, XML, ETL, SSIS, Power BI, ServiceNow, DataStage, GitHub, Putty, JIRA, SQL Server Reporting Services, AWS, AWS Redshift, SQL Server, Teradata, Snowflake
Client: AT&T, Dallas, TX. Mar 2021 - Feb 2023
ETL Informatica Developer

Responsibilities:
Orchestrated the development of extraction and loading processes, harnessing the power of Informatica, to retrieve data from upstream sources.
Acted as a crucial support resource, swiftly identifying and resolving production issues and job failures resulting from unforeseen circumstances.
Crafted Linux scripts, enabling the smooth execution of application maintenance tasks.
Expertly manipulated data using a wide array of Informatica transformations, including Joiner, Expression, Lookup, Aggregate, Filter, and Update Strategy.
Developed and established standard processes for executing data quality, including analysis, issue identification, and root cause analysis.
Created and enforced data quality standards, ensuring that data received from various sources met the client's quality standards.
Strategically established staging tables in alignment with the data warehouse's implementation design.
Formulated procedures for efficiently transferring data from diverse systems to the Data Warehousing system.
Skillfully authored SQL queries and PL/SQL programs, creating new packages and procedures and fine-tuning existing ones.
Demonstrated proficiency in utilizing Informatica modules, including Repository Manager, Designer, Workflow Manager, and Workflow Monitor, to manage end-to-end ETL processes.
Executed ETL operations encompassing data extraction, transformation, and loading, moving data from sources such as Excel, flat files, and Oracle into MS SQL Server, employing tools such as the BCP utility, DTS, and SSIS.
Conducted data profiling on sources to analyze data content, quality, and structure during the mapping development phase.
Implemented automated and scheduled cloud jobs, coupled with email notifications for failure alerts, ensuring seamless daily operations.
Analyzed requirements and devised business logic for the ETL process, actively participating in ETL design and documentation.
Designed, developed, and rigorously tested applications using Informatica PowerCenter, Informatica Data Quality, and Informatica Master Data Management, aligning them with functional specifications.
Engineered ETL jobs and custom transfer components to facilitate the movement of data from Oracle Source Systems to SQL Server, leveraging SSIS.
Created data mappings and workflows using Informatica PowerCenter to extract, transform, and load data into the target reporting environment.
Pioneered the containerization and deployment of ETL and REST services on AWS ECS via the CI/CD Jenkins pipeline.
Exhibited familiarity with AWS cloud services such as EC2, Elastic Container Service (ECS), Simple Storage Service (S3), and Elastic MapReduce (EMR).
Leveraged Talend Data Quality for in-depth analysis of source data quality, supporting informed decision-making.
Demonstrated extensive design, development, and testing proficiency with Talend Integration Suite, coupled with expertise in performance tuning of mappings.
Developed a Python Flask-based web service on a Postgres database, serving as the backend for a real-time dashboard (see the sketch after this list).
Designed and implemented CRUD scripts to efficiently load transactional data into Hive and HBase, employing Thrift and Python scripting.
Successfully loaded diverse data types (Structured, JSON, XML, flat files, etc.) into the Snowflake schema and Star schema data warehouse.
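
A minimal sketch of the Flask-on-Postgres dashboard backend mentioned above; the table, columns, connection string, and route are hypothetical stand-ins, not the actual service.

    from flask import Flask, jsonify
    import psycopg2

    app = Flask(__name__)
    DSN = "dbname=dashboard user=etl host=localhost"   # hypothetical connection string

    @app.route("/metrics/latest")
    def latest_metrics():
        # Return the most recent load metrics for the real-time dashboard.
        conn = psycopg2.connect(DSN)
        try:
            with conn.cursor() as cur:
                cur.execute(
                    "SELECT metric_name, metric_value, captured_at"
                    " FROM load_metrics ORDER BY captured_at DESC LIMIT 20")
                rows = cur.fetchall()
        finally:
            conn.close()
        return jsonify([
            {"name": n, "value": float(v), "captured_at": t.isoformat()}
            for n, v, t in rows
        ])

    if __name__ == "__main__":
        app.run(port=5000)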

Environment: Informatica BDM, Informatica Power Center, Talend Big Data Studio, Informatica IDQ, Python, Oracle, SQL, PL/SQL, TOAD, MySQL, Unix, SQL Server, OBIEE, Oracle EBS, Hadoop, Hive, Datastage, Teradata, Snowflake, Shell Scripts, Autosys Scheduler, AWS Redshift, ETL

Client: Honeywell, Dallas, TX. Apr 2019 - Dec 2020
ETL Informatica Developer

Responsibilities:
Collaborated closely with business users and business analysts to comprehend intricate business requirements.
Developed plans to enhance data quality by identifying the causes of errors or discrepancies and implementing effective solutions.
Provided technical direction and mentored other engineers and data analysts in data quality best practices and techniques.
Crafted efficient Informatica mappings, facilitating the seamless loading of data from diverse sources such as Oracle, CSI, and SQL Server into Data Warehousing and Data Mart systems.
Leveraged XML transformations to extract data from XML files and populate staging databases.
Architected and executed ETL mappings that aligned with project objectives, extracting and transforming data from various sources to meet specific requirements.
Designed and implemented robust Informatica ETL mappings, effectively extracting both master and transactional data from heterogeneous data feeds.
Conducted detailed analysis of business process workflows and contributed to the development of ETL procedures for data movement from source to target systems.
Automated sessions within Informatica ETL using UNIX shell scripts, streamlining data processing tasks.
Engineered dynamic parameter file generation within mappings, enhancing flexibility and reusability (see the sketch after this list).
Played a pivotal role in fine-tuning mappings and sessions to optimize performance and throughput.
Implemented Change Data Capture (CDC) processes using Informatica PowerExchange, ensuring data accuracy and timeliness.
Collaborated on LAI application to validate and monitor data flow from various third-party applications.
Exhibited expertise in SIR and EDI applications for sourcing data and loading it into target environments.
Developed automated processes in Informatica for data extraction, transformation, and consolidation from multiple sources into a single application for storage.
Proficiently utilized various lookup cache types, including static cache, persistent cache, re-cache from database, and shared cache.
Orchestrated the creation of parameter files and runtime parameter manipulation for sessions, mappings, and variables.
Developed PL/SQL scripts, stored procedures, indexes, constraints, partitions, and triggers in Oracle, enhancing data management capabilities.
Created PL/SQL procedures in Greenplum, aligning with specific project requirements.
Contributed to the creation of Greenplum functions and views, catering to customer use cases.
Engineered complex Talend ETL jobs for migrating data from flat files to databases, implementing robust error handling and comprehensive logging methods.
Designed and coded ETL/Talend jobs to efficiently process and load data into target databases.
Collaborated on the verification and testing of data, ensuring alignment with application requirements.
Utilized TDM (Test Data Management) tools to create graphs and generate workflows, streamlining data testing and management processes.
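
The dynamic parameter file generation above can be sketched as a small Python step that writes a PowerCenter-style parameter file before the session runs; the folder, workflow, session, parameter, and connection names are hypothetical.

    from datetime import date, timedelta

    def write_param_file(path, folder, workflow, session, load_date):
        # Emit a [Folder.WF:workflow.ST:session] section with run-specific values.
        prev = load_date - timedelta(days=1)
        lines = [
            "[%s.WF:%s.ST:%s]" % (folder, workflow, session),
            "$$LOAD_DATE=%s" % load_date.strftime("%Y-%m-%d"),
            "$$PREV_LOAD_DATE=%s" % prev.strftime("%Y-%m-%d"),
            "$DBConnection_SRC=ORA_SRC",    # assumed connection object names
            "$DBConnection_TGT=ORA_DWH",
        ]
        with open(path, "w") as fh:
            fh.write("\n".join(lines) + "\n")

    write_param_file("/tmp/wf_daily_load.par", "SALES", "wf_daily_load",
                     "s_m_load_fact_sales", date.today())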

Environment: Informatica Power Center, Erwin, MS Visio, Python, Talend, Oracle, SQL, PL/SQL, TOAD, MySQL, SQL Server, Mainframe, XML, Autosys, UNIX Shell Scripting, MKS Integrity, WinSCP, Putty, JIRA
Client: Tvisha Technologies Incorporation, Hyderabad, India. Nov 2016 - Nov 2018
ETL Technical Developer

Responsibilities:
Facilitated communication with business customers to address issues and gather requirements.
Optimized Informatica workflows with shell scripts to enhance ETL flow.
Implemented FTP site monitoring via Informatica file watch events for external mainframe files.
Provided production support, resolving issues, and troubleshooting problems.
Developed ETL programs using Informatica to meet business requirements.
Created data validity, accuracy, integrity, and other quality tests across various components of the Data Platforms, utilizing Informatica and related tools.
Designed and built data quality frameworks and tools to automate data quality services and applications, improving efficiency and accuracy.
Conducted performance tuning at both functional and mapping levels, optimizing data transfer over the network with relational SQL.
Orchestrated ETL workflows using Unix shell scripts (see the sketch after this list).
Designed and implemented complex aggregate, joiner, and lookup transformations to enforce business rules in ETL mappings for target Facts and Dimensions.
Utilized Informatica parameter files for mapping and workflow variables, FTP connections, and relational connections.
Led data migration from PeopleSoft Financials using Informatica PowerCenter and scheduled data via Data Warehouse Administration Console (DAC) for metadata management.
Improved ETL performance through indexing and caching strategies.
Developed, maintained, and administered complex ETL processes using IBM DataStage.
Created new ETL processes to make data sources readily available to the business using IBM DataStage.
Collaborated with fellow ETL developers to resolve complex scenarios and monitored daily ETL progress with source system owners.
Engaged in data warehouse enhancements and maintenance, including stored procedure modification for code enhancements.
Worked efficiently in an Informatica version-based environment, utilizing deployment groups for object migration.
Reviewed and analyzed functional requirements, mapping documents, and performed problem-solving and troubleshooting.
Conducted unit testing at various ETL levels and actively participated in team code reviews.
Applied a variety of transformations, including Stored Procedure, Connected and Unconnected Lookups, Update Strategy, Filter, and Joiner transformations, to implement complex business logic.
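
The shell-script workflow orchestration above can be sketched in Python around Informatica's pmcmd command-line client; the service, domain, folder, and workflow names and the inline credentials are placeholders (real credentials would come from a secured store).

    import subprocess

    def run_workflow(folder, workflow):
        # Start the workflow, block until it completes (-wait), then check rc.
        cmd = ["pmcmd", "startworkflow",
               "-sv", "INT_SVC", "-d", "DOM_PROD",   # assumed service/domain
               "-u", "etl_user", "-p", "etl_pass",   # placeholder credentials
               "-f", folder, "-wait", workflow]
        rc = subprocess.call(cmd)
        if rc != 0:
            raise RuntimeError("%s failed; pmcmd returned %d" % (workflow, rc))

    run_workflow("FINANCE", "wf_load_gl_daily")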

Environment: Informatica, Oracle, XML, SQL Server, Web Services, DB2 Mainframe, DAC, Cognos, JIRA, HP QC

Client: Byteridge, Hyderabad, India. Jul 2013 - Oct 2016
ETL Developer

Responsibilities:
Utilized Informatica PowerCenter Designer to analyze, extract, and transform data from diverse source systems (Oracle, DB2, SQL Server, and flat files) while applying business rules.
Conducted analysis of business requirements, technical specifications, source repositories, and physical data models for ETL mapping and process flow.
Implemented Slowly Changing Dimensions (SCD) for select tables as per user requirements (see the sketch after this list).
Optimized Informatica mappings for efficient data loading performance.
Configured and managed workflows and sessions to transport data to target Oracle warehouse tables using Informatica Workflow Manager.
Documented technical specifications, business requirements, and functional specifications for the development of Informatica Extraction, Transformation, and Loading (ETL) mappings.
Designed and executed ETL mappings to extract and transform data from diverse sources to meet project requirements.
Developed Informatica ETL mappings for extracting both master and transactional data from heterogeneous data feeds.
Analyzed business process workflows and contributed to the development of ETL procedures for data movement.
Proficiently utilized Informatica PowerCenter client tools, including Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformations Developer.
Translated high-level design specifications into simple ETL coding, adhering to mapping standards and employing expression transformations.
Participated in data warehouse enhancements and maintenance activities, focusing on performance tuning and code enhancements.
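
A sketch of the Slowly Changing Dimension (Type 2) logic referenced above: expire the current dimension row when a tracked attribute changes, then insert a new version. Table, column, and sequence names are hypothetical, and conn is any DB-API connection (for example, cx_Oracle).

    EXPIRE_SQL = """
    UPDATE dim_customer
       SET eff_end_date = :load_date, current_flag = 'N'
     WHERE customer_id = :customer_id
       AND current_flag = 'Y'
       AND (address <> :address OR segment <> :segment)"""

    INSERT_SQL = """
    INSERT INTO dim_customer
        (customer_key, customer_id, address, segment,
         eff_start_date, eff_end_date, current_flag)
    VALUES
        (dim_customer_seq.NEXTVAL, :customer_id, :address, :segment,
         :load_date, DATE '9999-12-31', 'Y')"""

    CURRENT_SQL = """
    SELECT COUNT(*) FROM dim_customer
     WHERE customer_id = :customer_id AND current_flag = 'Y'"""

    def apply_scd2(conn, row):
        # row holds customer_id, address, segment, and load_date bind values.
        cur = conn.cursor()
        cur.execute(EXPIRE_SQL, row)              # close out a changed current row
        changed = cur.rowcount > 0
        cur.execute(CURRENT_SQL, {"customer_id": row["customer_id"]})
        has_current = cur.fetchone()[0] > 0
        if changed or not has_current:            # changed, or brand-new customer
            cur.execute(INSERT_SQL, row)
        conn.commit()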

Environment: Informatica, Oracle, SQL Server, SQL, T-SQL, ETL, OBIEE, Toad, Erwin, Unix, Tortoise SVN, Flat Files

Education:

Dallas Baptist University, Dallas, TX. Jan 2019 - Dec 2020
Master of Science (MS), Computer Science

Jawaharlal Nehru Technological University, Hyderabad, India. Aug 2009 - May 2013
Bachelor of Technology (BTech), Computer Science and Engineering
