
Madhu - PL/SQL Developer
[email protected]
Location: Dallas, Texas, USA
Relocation:
Visa: H1B

PROFESSIONAL SUMMARY:
Overall 8+ years of IT experience as an Oracle PL/SQL Developer and Data Engineer, with extensive expertise in the analysis, design, and implementation of business applications using Oracle RDBMS, database development, ETL processes, data modeling, report development, and Big Data technologies.
Actively engaged in all phases of the SDLC, from analysis to maintenance, with a focus on meeting tight deadlines and delivering high-quality solutions.
Extensive experience with data modeling, data governance, ETL processes, and cloud platforms including Snowflake Cloud, AWS Redshift, and Informatica (IICS-CDI, PowerCenter), integrated with relational databases (MySQL, Teradata, Oracle, Sybase, SQL Server, DB2).
Proficient in data modeling techniques (normalization, denormalization, ER modeling) for both transactional and analytical databases, while implementing data governance policies, role-based access control (RBAC), and data masking to meet regulatory and compliance standards.
Proficient in configuring and managing Oracle RMAN for automated backup, recovery, and disaster recovery strategies, including full, incremental, and archival backups, ensuring data availability.
Proficient in handling and manipulating JSON data within Oracle databases using SQL/JSON functions like JSON_VALUE, JSON_QUERY, JSON_EXISTS, and JSON_TABLE (a minimal example follows this summary).
Experienced in integrating JSON data storage and querying within Oracle Database versions 12c and above, leveraging native JSON support for efficient data retrieval.
Skilled in performance optimization techniques such as table functions, indexes, table partitioning, materialized views, and query rewrites, along with tuning using tools like Explain Plan, SQL Trace, STATSPACK, and AWR reports.
Expertise in Oracle client-server application development (11g/10g/9i/8i) with a strong command of PL/SQL, SQL*Plus, TOAD, SQL*Loader, and Oracle Forms for customization and extension.
Developed and automated ETL workflows using UNIX shell scripts, Python, and scheduling tools like Apache Airflow, Control-M, Autosys, and crontab, with experience in data migration from Teradata to AWS Snowflake.
Leveraged Python libraries (NumPy, Pandas, PySpark) within Databricks for advanced data analysis and integrated Tableau with ETL processes for seamless data visualization.
Hands-on experience with relational and NoSQL databases (MySQL, MongoDB, Cassandra, PostgreSQL) and proficiency in the Cloudera ecosystem (HDFS, Hive, Sqoop, HBase, Kafka, Spark SQL) for big data processing.
Strong technical and analytical skills with a deep understanding of ER modeling for OLTP and dimensional modeling for OLAP, as well as managing warehouse operations and administering WMS.
Experienced in developing automated scripts for data processing and migration using Python, SQL, and Unix shell scripting, and providing technical support for database connectivity, data integrity, and performance issues.
Experienced in Agile environments using JIRA for task management and defect tracking, with strong proficiency in version control systems like Git and Azure DevOps for collaborative development.
Skilled in leveraging VS Code for efficient coding, debugging, and integration with CI/CD pipelines.
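
A minimal sketch of the SQL/JSON usage described above, assuming a hypothetical orders table whose payload column stores a JSON document with a customer object and an items array:

    -- Shred the items array into relational rows alongside scalar JSON values.
    SELECT o.order_id,
           JSON_VALUE(o.payload, '$.customer.name') AS customer_name,
           jt.item_sku,
           jt.qty
    FROM   orders o,
           JSON_TABLE(o.payload, '$.items[*]'
             COLUMNS (item_sku VARCHAR2(20) PATH '$.sku',
                      qty      NUMBER       PATH '$.qty')) jt
    WHERE  JSON_EXISTS(o.payload, '$.items');  -- keep only documents that carry items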

VISA STATUS:
H1B

PROGRAMMING SKILLS:
Programming Languages: Python, C++, Unix Shell Scripting
Web Technologies: HTML, CSS, Bootstrap, JavaScript
Data Analytics/Visualization Tools: Power BI, Tableau
ETL Tools: Informatica PowerCenter
Database Technologies: Oracle SQL/PL/SQL, MySQL, Snowflake
Scheduling/ITSM Tools: Control-M, ServiceNow
Cloud Technologies: Azure Data Factory, Azure Databricks
Operating Systems: Windows, Linux, UNIX
Version Control/Repository Tools: Git, Azure DevOps
Database Management Tools: SQL*Plus, SQL Developer, Toad, SecureCRT

CERTIFICATIONS:
IBM Data Analyst:
https://www.coursera.org/account/accomplishments/specialization/certificate/VM9XEWHLSLCE
Oracle Cloud Infrastructure:
https://catalog-education.oracle.com/pls/certview/sharebadge?id=D40557F1B1EF441AC93617B5869BF5E4F62074BF59748066DC9E5CD0AD54D3B0
Python for Data Science:
https://www.credly.com/badges/75522bca-60cd-462b-8e51-1bae74d4278c/linked_in_profile

WORK EXPERIENCE:
CMIS
Nationwide Insurance, Dallas, TX
Oracle PL/SQL Developer Oct 2023 - Present

Involved in full Software Development Life Cycle (SDLC) - Business Requirements Analysis, preparation of Technical Design documents, Data Analysis, Logical and Physical database design, Coding, Testing, Implementing, and deploying to business users.
Developed complex mappings using Informatica PowerCenter Designer to transform and load data from source systems such as Oracle and Sybase into the final target database.
Analyzed source data coming from different sources such as flat files, transformed it according to business rules, and loaded it into the target tables.
Developed cron scripts to automate recurring tasks on the UNIX operating system.
Supported various business teams with data mining and reporting by writing complex SQL, from basic queries to advanced OLAP functions such as ranking, partitioning, and windowing (see the sketch at the end of this list).
Tuned SQL queries using Explain Plan, analyzing data distribution and index usage, collecting statistics, defining indexes, and revising correlated subqueries.
Developed shell scripts for job automation that generate a log file for every job.
Wrote complex SQL using joins, subqueries, and correlated subqueries, including queries for cross-verification of data.
Worked on the development of data warehouse, data lake, and ETL systems using relational and non-relational technologies (SQL and NoSQL).
Experience in developing Spark applications using Spark-SQL in Databricks for data extraction, transformation, and aggregation from multiple file formats.
Worked extensively on SQL, PL/SQL, and UNIX shell scripting.
Expertise in creating PL/SQL Procedures, Functions, Triggers and cursors.
Developed under Scrum methodology in a CI/CD environment using Jenkins.
Developed UNIX shell scripts to run batch jobs in Autosys and load data into production.
Participated in the architecture council for database architecture recommendations.
Utilized UNIX shell scripts to add headers to flat-file targets.
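
A minimal sketch of the ranking/windowing style referenced above; the policy_premiums table and its columns are hypothetical:

    -- Rank accounts by premium within each region and show the regional total.
    SELECT region,
           account_id,
           premium,
           RANK() OVER (PARTITION BY region ORDER BY premium DESC) AS premium_rank,  -- rank within region
           SUM(premium) OVER (PARTITION BY region)                 AS region_total   -- windowed aggregate
    FROM   policy_premiums
    WHERE  policy_year = 2023;
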
Environment: Oracle DB, cron, UNIX Shell Scripting, SQL, Databricks, GitHub.


Virtusa (CITIBANK), Tampa, FL
Database Developer April 2023 - Sep 2023

Collaborated closely with the front-end design team, facilitating seamless integration by providing essential stored procedures and packages, along with insightful data analysis.
Implemented data loading tasks using SQL*Loader to efficiently process flat files received from diverse facilities on a daily/weekly basis.
Developed and adapted numerous UNIX shell scripts to accommodate evolving project needs and meet client requirements effectively.
Employed advanced PL/SQL techniques, including packages, procedures, triggers, functions, indexes, and collections, to execute complex business logic with precision and efficiency.
Managed version control proficiently through Bitbucket, ensuring the integrity of code repositories and promoting collaborative efforts within a dynamic team environment.
Conducted thorough debugging and testing of SQL and PL/SQL code, consistently identifying and resolving issues to uphold the reliability and robustness of critical database applications.
Implemented performance tuning techniques for SQL and PL/SQL, utilizing tools such as EXPLAIN PLAN, SQL Trace, TKPROF, AWR, ADDM, and AUTOTRACE to optimize application performance.
Monitored and tuned database performance as per application and business requirements, including index rebuilding and statistics updating, to enhance overall system efficiency.
Leveraged advanced PL/SQL features such as Records, Tables, Object types, and Dynamic SQL to enhance application functionality and facilitate adaptable data management.
Implemented comprehensive performance tuning strategies, resulting in improved database efficiency and enhanced application performance.
Employed extensive exception-handling mechanisms for efficient error management, ensuring streamlined debugging and effective error messaging within applications (a minimal sketch follows this list).
Ensured secure remote server access and data transfers by operating within the secure environment of Tectia SSH Terminal, safeguarding data integrity and confidentiality.
Contributed actively to the enhancement of the codebase by introducing innovative modifications and enhancements, aligning applications with evolving business needs, and fostering continuous improvement.
Proficiently utilized a variety of tools including uDeploy, Visual Studio, Eclipse, Git BASH, and Bitbucket to streamline development processes and promote efficient collaboration.
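
A minimal sketch of the exception-handling pattern referenced above; the procedure name, staging table, and error-log table are hypothetical:

    CREATE OR REPLACE PROCEDURE load_facility_file (p_batch_id IN NUMBER) IS
    BEGIN
      -- Move one batch of staged facility rows into the target table.
      INSERT INTO facility_data (batch_id, facility_cd, amount)
      SELECT batch_id, facility_cd, amount
      FROM   facility_stage
      WHERE  batch_id = p_batch_id;
      COMMIT;
    EXCEPTION
      WHEN DUP_VAL_ON_INDEX THEN  -- batch already loaded
        ROLLBACK;
        INSERT INTO load_errors (batch_id, err_msg)
        VALUES (p_batch_id, 'Duplicate rows in batch ' || p_batch_id);
        COMMIT;
      WHEN OTHERS THEN            -- log and re-raise anything unexpected
        ROLLBACK;
        INSERT INTO load_errors (batch_id, err_msg)
        VALUES (p_batch_id, SUBSTR(SQLERRM, 1, 4000));
        COMMIT;
        RAISE;
    END load_facility_file;
    /
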
Environment: Oracle RDBMS 19c, Oracle SQL Developer, SQL*Loader, PL/SQL, Shell Scripts, Tectia SSH Terminal, Bitbucket, Git BASH, uDeploy, TeamCity, Visual Studio, Eclipse.

Escape Velocity (Zurich Insurance Group), Hyderabad, India
PL/SQL / SQL Developer July 2017 - July 2021

Developed SQL queries, encompassing intricate joins, subqueries, and aggregations, utilizing SQL Navigator to streamline development.
Generated server-side PL/SQL scripts for data manipulation, validation, and the creation of materialized views across remote instances.
Developed PL/SQL triggers and master tables to automate the creation of primary keys, enhancing database efficiency.
Engineered PL/SQL stored procedures, functions, and packages to orchestrate seamless data movement from staging areas to data marts.
Implemented scripts to instantiate new tables, views, and queries, accommodating enhancements within the application architecture.
Created diverse database objects including tables, views, materialized views, procedures, and packages using Oracle tools such as Toad, PL/SQL Developer, and SQL*Plus.
Employed bulk collection extensively within PL/SQL objects to optimize performance and enhance system responsiveness (sketched after this list).
Utilized advanced PL/SQL features like Records, Tables, Collections (nested tables and arrays), and Dynamic SQL to streamline code compatibility across Oracle and DB2 databases.
Conducted comprehensive performance tuning and optimization for SQL queries and data retrieval, employing indexing strategies and other techniques to boost system responsiveness across Snowflake and traditional databases.
Developed and implemented ETL (Extract, Transform, and Load) components based on filter rules, leveraging Informatica PowerCenter, UNIX, and PL/SQL to extract data from diverse source systems and calculate essential metrics.
Conducted rigorous testing and validation of Informatica mappings, stored procedures, functions, and integration processes to ensure data accuracy and consistency.
Scheduled and managed batch jobs for daily data loads, facilitating seamless data integration between Oracle and DB2 systems.
Applied optimization techniques derived from database performance tuning to enhance the efficiency and responsiveness of Terraform-managed infrastructure.
Collaborated closely with infrastructure teams to deploy, maintain, and optimize databases, facilitating a smooth transition to collaborative infrastructure deployment using Terraform.
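
A minimal sketch of the bulk-collection pattern referenced above for moving staged rows into a data mart; the table names are hypothetical, and both tables are assumed to share the same column layout:

    DECLARE
      TYPE t_rows IS TABLE OF stg_sales%ROWTYPE;
      l_rows t_rows;
      CURSOR c_stg IS SELECT * FROM stg_sales;
    BEGIN
      OPEN c_stg;
      LOOP
        FETCH c_stg BULK COLLECT INTO l_rows LIMIT 1000;  -- bounded memory per batch
        EXIT WHEN l_rows.COUNT = 0;
        FORALL i IN 1 .. l_rows.COUNT                     -- one context switch per batch
          INSERT INTO sales_mart VALUES l_rows(i);
        COMMIT;
      END LOOP;
      CLOSE c_stg;
    END;
    /
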
Environment: Oracle RDBMS 12c, Snowflake, Terraform, Informatica PowerCenter 8.1 (Designer, Workflow Manager, Workflow Monitor), SQL Navigator, UNIX, SQL*Loader, SQL Developer.

Sizmek, Hyderabad, India
Python Developer Sep 2015 - June 2017

Conducted proactive client interactions to meticulously gather and analyze business requirements, ensuring a comprehensive understanding of their needs for Snowflake DB integration.
Implemented REST APIs with Python and Django, facilitating smooth integration with Snowflake DB to enhance data processing capabilities and enable efficient data retrieval.
Spearheaded the design and implementation of a robust Data Quality Framework for Snowflake on Spark (PySpark), enabling comprehensive schema validation and data profiling to maintain data integrity and reliability.
Utilized Snowflake for Python scripting and leveraged Python's unittest library for rigorous program testing and efficient data loading into the database.
Harnessed the power of Spark (PySpark) to effectively manipulate unstructured data, applying text mining techniques on Snowflake-stored user utilization data to extract valuable insights.
Adhered to the AGILE development methodology for Snowflake integration, fostering flexibility and responsiveness to evolving project requirements and client needs.
Conducted comprehensive analysis of API flows, identifying optimization opportunities to enhance the customer experience, with a particular emphasis on streamlining Snowflake-related processes.
Implemented a Test-Driven Development (TDD) approach to the development of essential services for the application, effectively managing version control using Git.
Demonstrated notable success in minimizing fraudulent activity and maximizing API revenue generation through the implementation of sophisticated Snowflake DB-related data processing and analysis techniques.
Environment: Snowflake, Python, Django, Spark (PySpark), Agile Development, Test-Driven Development (TDD), Git, Python unittest.

ACADEMIC PROJECTS:

Maze Runner (C++): Designed and built a maze runner in Visual Studio that reaches the destination by the shortest path. Users can modify the grid before starting the maze and watch the traversal step by step; a stack records the moves made so they can be retraced.
Agent Moving in a Real-World Environment (Python): Applied performance measurement, environment interaction, and actuators (PEAS) to guide an agent's navigation in a stochastic setting that mimics a real-world environment. Designed and built to demonstrate a simple, table-driven agent in Python.
Tic-Tac-Toe (Python): Designed and built a Tic-Tac-Toe game in Python to demonstrate the alpha-beta search algorithm.
Chat Bot (Python, ML): Created a chatbot in Python for Twitch, an online platform that provides chatbot integration for its clients.

EDUCATION:

University of North Texas, Denton, TX
Master of Science in Computer Science Aug. 2021 - Dec. 2022

Jawaharlal Nehru Technological University, Hyderabad, India
Bachelor of Engineering in Computer Science Aug. 2011 - June 2015
