Location: Dallas, Texas, USA
Relocation: Ready to relocate
Visa: H4EAD
FATHIMA J
Email: [email protected]
PH No: (469) 806-3533
Sr. Database Developer

Summary
Deeply passionate about PL/SQL development, with 7+ years of hands-on experience and a strong command of Oracle databases. Expertise in crafting complex back-end modules for multi-tier applications, coupled with a dedication to troubleshooting challenges head-on.
Proficient in developing robust data solutions using PL/SQL and Oracle databases, spanning versions 9i through 21c. Skilled in architecting and implementing ETL processes to seamlessly integrate data from diverse sources.
Proficient in crafting database solutions that adhere to industry-standard design principles, ensuring scalability, efficiency, and data integrity.
Proficient in utilizing GIT for version control, enabling collaborative development efforts, and ensuring code integrity across projects.
Proficient in integrating Oracle databases with C# applications, facilitating seamless communication between backend data layers and frontend user interfaces.
Demonstrated ability to troubleshoot and optimize PL/SQL code for performance and reliability. Experienced in diagnosing and resolving complex database issues to maintain seamless operations.
Utilized tools like Explain Plan and SQL Trace to identify optimization opportunities, resulting in a streamlined database environment and improved application performance.
Designed and developed intuitive user interfaces and comprehensive reports using Oracle Forms and Reports, ensuring seamless interaction and presentation of data for end-users.
Hands-on experience in Oracle Database Administration, including installation, configuration, and maintenance of Oracle database systems, ensuring optimal performance, security, and reliability.
Demonstrated expertise in database design and data modeling, utilizing tools like TOAD and SQL Developer to architect efficient data structures and schemas, facilitating seamless data management and retrieval.
Successfully migrated legacy systems to Oracle databases, leveraging SQL*Loader for efficient data migration and transformation, ensuring data integrity and consistency across platforms.
Possess intermediate-level proficiency in T-SQL development, enabling effective data manipulation and management in SQL Server environments. Capable of contributing to projects requiring T-SQL expertise.
Designed and implemented complex T-SQL stored procedures, functions, and triggers to support backend business logic, ensuring efficient data processing and manipulation.
Proven track record in ETL (Extract, Transform, Load) processes, proficiently handling data integration tasks across various platforms. Skilled in designing and implementing efficient ETL workflows to support business objectives.
Developed intricate data systems that transform, cleanse, and harmonize data from diverse source systems, channeling it to the Curate layer for consumer access.
Created robust and dependable systems for data processing, ensured high test coverage, and built data quality (DQ) tools to compare data between source and target systems.
Proficient in executing data export, import, and related operations using TOAD, SQL*Loader, and SQL Developer.
Skilled in application development leveraging advanced Oracle features such as Bulk Collections, Table Functions, Autonomous Transactions, Dynamic SQL, Object types, and Records.
Provided support for advanced PL/SQL, utilizing Cursors, REF Cursors, and Native Dynamic SQL.
Proficient in handling DDL, DML, TCL, DCL, DRL, T-SQL and database objects.
Developed complex triggers using PL/SQL to ensure data integrity and enforce business rules.
Utilized Cursor Variables to pass query result sets between PL/SQL programs and client applications.
Extensively worked on the Extraction, Transformation, and Load (ETL) process using PL/SQL for populating database tables.
Extensive experience in data migration techniques, employing Oracle external tables, SQL*Loader, UTL_FILE-based loaders, and batch processing.
Improved the performance of slow SQL queries by implementing indexes and using FORALL and BULK COLLECT.
Proficient in crafting logical and physical database designs, employing data modeling techniques with Erwin and Informatica.
Highly motivated self-learner, capable of swiftly acquiring proficiency and adapting to new technologies and methods.
Enthusiastic advocate for Agile methodologies, with hands-on experience in Agile/Scrum environments. Leveraged Agile principles to drive collaboration, efficiency, and continuous improvement across project teams, resulting in streamlined processes and enhanced project outcomes.
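The bulk-processing pattern cited in the bullets above (BULK COLLECT with FORALL to cut context switching) can be sketched roughly as below; the staging and target table names are illustrative placeholders, not from any actual project:

```sql
-- Hedged sketch: hypothetical orders_stg (staging) and orders (target) tables.
DECLARE
  TYPE t_orders IS TABLE OF orders_stg%ROWTYPE;
  l_orders t_orders;
  CURSOR c_stg IS SELECT * FROM orders_stg;
BEGIN
  OPEN c_stg;
  LOOP
    -- Fetch in batches to bound PGA memory usage
    FETCH c_stg BULK COLLECT INTO l_orders LIMIT 1000;
    EXIT WHEN l_orders.COUNT = 0;
    -- One context switch per batch instead of one per row
    FORALL i IN 1 .. l_orders.COUNT
      INSERT INTO orders VALUES l_orders(i);
    COMMIT;
  END LOOP;
  CLOSE c_stg;
END;
/
```

Note the exit test on `l_orders.COUNT` rather than `c_stg%NOTFOUND`, so a final partial batch is still processed.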

Technical Skills:
Databases: Oracle 10g/11g/12c/19c/21c (PL/SQL), SQL Server 2000/2008/2012, MS Access 2016, MySQL, PostgreSQL
Oracle Tools: TOAD, SQL*Loader, SQL Developer, SQL Navigator, SQL*Plus, Reports 12c/11g, Forms 12c/11g
ETL Tools: Informatica PowerCenter 10.5.3, DataStage, ODI Studio, SSIS
Data Analysis: Requirement Analysis, Business Analysis, Detail Design, Data Flow Diagrams, Data Definition Tables, Business Rules, Data Modeling, Data Warehousing, System Integration, Tableau, Power BI
Programming Languages: C#, SQL, PL/SQL, T-SQL, UNIX Shell Scripting, Perl, XML

Professional Experience:

Charles Schwab - Chicago, IL Aug 2022 to Present
Sr. PL/SQL Developer
Responsibilities:
Coordinated with the front-end design team to provide the stored procedures and packages they required, along with the necessary insight into the data.
Optimized database schemas for performance and usability, utilizing normalization techniques and appropriate indexing strategies.
Worked on SQL*Loader to load data from flat files obtained from various facilities every day.
Created and modified several UNIX shell Scripts according to the changing needs of the project and client requirements.
Wrote UNIX shell scripts to process files on a daily basis: renaming the file, extracting the date from the file, unzipping the file, and removing junk characters before loading it into the base tables.
Involved in the continuous enhancements and fixing of production problems.
Generated server-side PL/SQL scripts for data manipulation and validation and materialized views for remote instances.
Proactively identified and addressed technical debt within the codebase, refactoring and optimizing existing PL/SQL routines to improve maintainability and scalability, fostering a culture of continuous improvement and code craftsmanship.
Developed PL/SQL triggers and master tables for automatic creation of primary keys.
Created PL/SQL stored procedures, functions, and packages for moving the data from staging area to data mart.
Created scripts to create new tables, views, queries for new enhancement in the application using TOAD.
Created indexes on the tables for faster retrieval of the data to enhance database performance.
Involved in data loading using PL/SQL and SQL*Loader, calling UNIX scripts to download and manipulate files.
Performed SQL and PL/SQL tuning and Application tuning using various tools like EXPLAIN PLAN, SQL*TRACE, TKPROF and AUTOTRACE.
Extensively involved in using hints to direct the optimizer to choose an optimum query execution plan.
Used Bulk Collections for better performance and easier retrieval of data by reducing context switching between the SQL and PL/SQL engines.
Created PL/SQL scripts to extract data from the operational database into simple flat text files using the UTL_FILE package.
Partitioned the fact tables and materialized views to enhance the performance.
Extensively used bulk collection in PL/SQL objects for improving performance.
Created records, tables, collections (nested tables and arrays) for improving Query performance by reducing context switching.
Used Pragma Autonomous Transaction to avoid mutating problem in database trigger.
Extensively used the advanced features of PL/SQL like Records, Tables, Object types and Dynamic SQL.
Collaborating with business analysts, database administrators, and other developers to understand requirements and translate them into PL/SQL solutions.
Implementing data security measures, such as role-based access control, data encryption, and auditing, using PL/SQL constructs.
Testing and debugging PL/SQL code to ensure it meets functional and non-functional requirements, including performance, scalability, and reliability.
Proven ability to collaborate effectively with cross-functional teams, leveraging strong communication skills to foster a cohesive working environment and drive project success.
Collaborated with database administrators and software engineers to design and implement database schemas that align with backend application requirements and performance goals.
Contributed to enhancing system stability by 25% through the identification and resolution of critical performance bottlenecks, ensuring seamless operation of mission-critical applications.
Achieved a significant 35% improvement in data processing speeds by fine-tuning PL/SQL procedures, enhancing the responsiveness of applications and enabling faster decision-making based on real-time insights.
Effectively made use of Table Functions, Indexes, Table Partitioning, Collections, Analytical Functions, Materialized Views, and Query Rewriting.
Used JIRA for issue tracking and project management.
Involved in development activities of Overnight Batch Processes.
Configured Jenkins to automate the build process for the FGV application, defining build jobs, configuring build steps, setting up build environments, and scheduling and monitoring builds to streamline development and deployment workflows.
Migrated PVCS source control to TFS.
Created automation scripts for monitoring Oracle WebLogic queue manager to track incoming XML loads.
Created JSON output using Oracle JSON array and JSON object function for front end Application.
Developed complex ETL process to populate tables for SCS Data Visualization.
Contributed to the development of the ETL architecture and established Source to Target mappings for loading data into the Data warehouse.
Provided production support, receiving production calls and working on trouble reports received from clients.
Involved in Analysis, Design and Development, test, and implementation of Informatica transformations and workflows for extracting the data from the multiple systems.
Worked cooperatively with the team members to identify and resolve various issues relating to Informatica.
Developed mappings in multiple schema databases to load the incremental data load into dimensions.
Designed jobs in Control-M for scheduling and monitoring.
Involved in the documentation of several utilities using Oracle Designer.
Created UNIX Shell Script for sending email to alert users for erroneous records.
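The flat-file extract step described above (writing operational data out with UTL_FILE) might look roughly like the following; the directory object, file name, and query are placeholders, not details from the actual project:

```sql
-- Hedged sketch: DATA_DIR is a hypothetical Oracle directory object.
DECLARE
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  l_file := UTL_FILE.FOPEN('DATA_DIR', 'extract.txt', 'w');
  FOR r IN (SELECT customer_id, status FROM customers) LOOP
    -- One pipe-delimited line per row
    UTL_FILE.PUT_LINE(l_file, r.customer_id || '|' || r.status);
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
EXCEPTION
  WHEN OTHERS THEN
    -- Make sure the handle is released even on failure
    IF UTL_FILE.IS_OPEN(l_file) THEN
      UTL_FILE.FCLOSE(l_file);
    END IF;
    RAISE;
END;
/
```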

Environment: Oracle 11g, SQL*Plus, TOAD, SQL*Loader, SQL Developer, Shell Scripts, UNIX, Windows XP.

Costco - Dallas, TX Nov 2020 to July 2022
Sr. PL/SQL Developer
Responsibilities:
Installed and configured Oracle database systems, ensuring seamless setup and alignment with organizational requirements and best practices.
Maintained Oracle database systems, including routine tasks such as patching, upgrading, and monitoring, ensuring continuous operation and stability.
Optimized Oracle database performance through effective configuration tuning, index optimization, and query optimization techniques, enhancing system responsiveness and efficiency.
Designed and developed PL/SQL packages, procedures, functions, triggers, and other database objects to support complex business requirements and applications.
Wrote efficient and scalable PL/SQL code adhering to coding standards and best practices.
Optimized PL/SQL code for performance, ensuring efficient execution and minimal resource utilization.
Created and maintained database stored procedures, functions, and triggers to enforce business rules, data integrity, and data manipulation operations.
Conceptualized and implemented data models that accurately represent business requirements, facilitating smooth application development and maintenance.
Branched, merged, and resolved conflicts within version control systems, maintaining a well-organized and coherent codebase.
Leveraged Git features such as tagging and branching strategies to streamline development workflows and facilitate release management.
Configured CI/CD workflows to automate build, test, and deployment processes, reducing manual overhead, and accelerating time-to-market for software releases.
Designed and implemented efficient data access layers in C# applications, leveraging frameworks like Entity Framework to interact with Oracle databases securely and optimally.
Developed RESTful APIs and web services using C# to facilitate data exchange and interaction with Oracle databases, ensuring robustness, scalability, and adherence to industry standards.
Actively contributed to the enhancement of database security measures, leveraging PL/SQL constructs to implement role-based access control and data encryption techniques, ensuring the confidentiality and integrity of sensitive information in compliance with industry standards.
Established proactive monitoring and alerting systems for database performance, leveraging tools like SQL Trace and JIRA, to identify and resolve issues before they impact operations, ensuring high availability and reliability.
Collaborated with database administrators, application developers, and business analysts to understand requirements and translate them into PL/SQL solutions.
Created and maintained database objects such as tables, views, triggers, and stored procedures to support APEX applications.
Integrated APEX applications with external data sources, including RESTful services, web services, and third-party systems.
Redesigned scripts with new Oracle features such as MERGE, Data Pump Import (impdp), and optimizer hints.
Participated in tuning/optimization of queries, modifying scripts to accommodate required changes in production for speed.
Prepared PL/SQL scripts to validate and reconcile source and target data.
Tested and debugged PL/SQL code, ensuring it met functional and non-functional requirements.
Spearheaded the optimization of PL/SQL codebase, resulting in a notable 30% reduction in query execution time, enhancing overall database efficiency and user experience.
Implemented innovative solutions using PL/SQL features like Bulk Collections and Autonomous Transactions, resulting in a 25% increase in system performance and reliability.
Led successful project implementations by leveraging Oracle databases and related tools such as TOAD and SQL Developer, ensuring seamless integration and delivery within tight deadlines.
Implemented data security measures, such as role-based access control and data encryption, using PL/SQL constructs.
Provided support and troubleshooting for existing PL/SQL applications and database objects.
Spearheaded database development initiatives, optimizing performance and efficiency through meticulous PL/SQL coding and continuous performance tuning efforts.
Engaged in cross-functional collaboration to address performance bottlenecks and technical challenges, leveraging a combination of technical expertise and effective communication skills to drive solutions.
Developed complex triggers and analytical tools to extract actionable insights from vast datasets, empowering stakeholders with valuable business intelligence for informed decision-making.
Tuned Large Complex Queries and Improved Performance of PL/SQL procedures.
Proactively identified opportunities for process enhancement and innovation, consistently seeking ways to optimize database performance, streamline operations, and enhance the overall efficiency of data management systems.
Adopted Agile methodologies in PL/SQL development processes, leading sprint planning sessions and daily stand-ups to promote transparency and collaboration, resulting in accelerated project delivery and improved team morale.
Designed and developed logical and physical data models using concepts such as Star Schema, Snowflake Schema, and Slowly Changing Dimensions.
Created schema objects such as tables, views, materialized views, sequences, constraints, and indexes.
Involved in fixing code issues raised during SIT (System Integration Testing) and UAT (User Acceptance Testing), and worked with the Oracle support team through Service Requests (SRs) on seeded performance issues.
Extensively worked on preparing test scenarios, test scripts on different applications to help the users better understand the application.
Actively involved in pre-go-live and post-go-live activities, including deployment.
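The MERGE-based script redesigns mentioned above typically collapse a separate update/insert pair into one upsert statement; a minimal sketch, assuming hypothetical staging and target tables:

```sql
-- Hedged sketch: staging-to-target upsert with MERGE (placeholder tables/columns).
MERGE INTO products tgt
USING products_stg src
   ON (tgt.product_id = src.product_id)
WHEN MATCHED THEN
  UPDATE SET tgt.price      = src.price,
             tgt.updated_at = SYSDATE
WHEN NOT MATCHED THEN
  INSERT (product_id, price, updated_at)
  VALUES (src.product_id, src.price, SYSDATE);
```

Because matching and branching happen in a single statement, the table is scanned once rather than twice, which is usually the point of such a redesign.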

Environment: Oracle Database, Oracle APEX, C#, SQL, PL/SQL, SQL*Plus, UNIX shell scripting, SQL*Loader, Oracle data modeling tools, SIT, UAT.

Johnson & Johnson - New Brunswick, NJ May 2019 to Oct 2020
SQL Database Developer
Responsibilities:
Processed web server logs by developing multi-hop Flume agents using the Avro sink and loaded the data into MongoDB for further analysis; also extracted files from MongoDB through Flume for processing.
Applied expert knowledge of MongoDB and NoSQL data modeling, tuning, and disaster-recovery backups; used MongoDB for distributed storage and processing via CRUD operations.
Extracted and restructured the data into MongoDB using import and export command line utility tool.
Set up fan-out workflows in Flume, designing a V-shaped architecture to take data from many sources and ingest it into a single sink.
Created, dropped, and altered tables at run time without blocking updates and queries, using HBase and Hive.
Worked with different join patterns and implemented both map-side and reduce-side joins.
Wrote Flume configuration files for importing streaming log data into HBase with Flume.
Imported several transactional logs from web servers with Flume to ingest the data into HDFS.
Used Flume with a spooling directory source to load data from the local file system (LFS) into HDFS.
Installed and configured Pig; wrote Pig Latin scripts to convert data from text files to Avro format.
Created Partitioned Hive tables and worked on them using HiveQL.
Loaded data into HBase using both bulk and non-bulk loads.
Worked with the continuous integration tool Jenkins and automated end-of-day JAR builds.
Worked with Tableau and Integrated Hive, Tableau Desktop reports and published to Tableau Server.
Developed MapReduce programs in Java for parsing the raw data and populating staging Tables.
Set up the whole application stack, and configured and debugged Logstash to send Apache logs to AWS Elasticsearch.
Developed Spark code using Scala and Spark-SQL/Streaming for faster testing and processing of data.
Analyzed the SQL scripts and designed the solution to implement using Scala.
Used Spark SQL to load JSON data, create SchemaRDDs, and load them into Hive tables; handled structured data using Spark SQL.
Implemented Spark Scripts using Scala, Spark SQL to access hive tables into Spark for faster processing of data.
Extracted, transformed, and loaded data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL, and U-SQL (Azure Data Lake Analytics). Ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed it in Azure Databricks.
Tested Apache Tez for building high performance batch and interactive data processing applications on Pig and Hive jobs.
Explored Spark to improve the performance and optimization of existing algorithms in Hadoop using Spark Context, Spark SQL, PostgreSQL, Scala, DataFrames, Impala, OpenShift, Talend, and pair RDDs.
Set up data pipelines using TDCH, Talend, Sqoop, and PySpark based on the size of the data loads.
Implemented real-time analytics on Cassandra data using the Thrift API.
Designed Columnar families in Cassandra and Ingested data from RDBMS, performed transformations and exported the data to Cassandra.
Led testing efforts in support of projects/programs across a large landscape of technologies (UNIX, AngularJS, AWS, Sauce Labs, Cucumber JVM, MongoDB, GitHub, Bitbucket, SQL, NoSQL databases, APIs, Java, Jenkins).
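The partitioned Hive tables described above can be sketched in HiveQL as follows; the table, columns, and partition values are illustrative assumptions, not details from the actual project:

```sql
-- Hedged sketch: hypothetical web-log table partitioned by date, stored as Avro.
CREATE TABLE IF NOT EXISTS web_logs (
  ip    STRING,
  url   STRING,
  bytes BIGINT
)
PARTITIONED BY (log_date STRING)
STORED AS AVRO;

-- Load a single day's partition from a staged table
INSERT OVERWRITE TABLE web_logs PARTITION (log_date = '2020-01-15')
SELECT ip, url, bytes
FROM web_logs_stg
WHERE dt = '2020-01-15';
```

Partitioning by date lets HiveQL queries that filter on `log_date` prune whole directories instead of scanning the full table.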
Environment: MS SQL Server 2012/2008R2, Visual Studio, MS SQL Server Integration Services (SSIS), Power BI, Informatica PowerCenter, Informatica Intelligent Cloud Services (IICS), MS SQL Server Analysis Services (SSAS), DAX, Agile, T-SQL, SQL Profiler, XML, Team Foundation Server (TFS), MS Excel, MS Access, Windows 8, PySpark, AWS EMR, S3.

Vanguard - Philadelphia, PA Jan 2018 to Apr 2019
Data Analyst
Responsibilities:
Responsible for all data-related aspects of the project. Created reports in a cloud-based environment using Amazon Redshift and published them on Tableau.
Developed Python APIs to dump the array structures in the processor at the failure point for debugging. Worked extensively with ER/Studio on several projects in both OLAP and OLTP applications.
Created SQL tables with referential integrity and developed queries using SQL and PL/SQL. Performed Data Analysis and data profiling using complex SQL on various sources systems including Oracle.
Developed the required data warehouse model using Star schema for the generalized model. Implemented Visualized BI Reports with Tableau.
Worked on stored procedures for processing business logic in the database. Extensively used Teradata Viewpoint for performance monitoring and tuning.
Performed Extract, Transform and Load (ETL) solutions to move legacy and ERP data into Oracle data warehouse. Developed and maintained data dictionary to create metadata reports for technical and business purpose.
Worked with normalization and denormalization concepts and design methodologies. Worked on the reporting requirements for the data warehouse.
Worked on SQL Server concepts SSIS (SQL Server Integration Services), SSAS (Analysis Services) and SSRS (Reporting Services).
Developed complex T-SQL code such as stored procedures, functions, triggers, indexes, and views for the business application.
Wrote complex SQL, PL/SQL procedures, functions, and packages to validate data and support the testing process. Configured/scripted business rules and transformation rules in Informatica.
Analyzed and translated business needs into solution data models. Implemented data strategies and created physical data models.
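A star-schema model of the kind used above can be sketched as one fact table referencing conformed dimensions; the table and column names here are hypothetical:

```sql
-- Hedged sketch: minimal star schema with two dimensions and one fact table.
CREATE TABLE dim_customer (
  customer_key  NUMBER PRIMARY KEY,
  customer_name VARCHAR2(100)
);

CREATE TABLE dim_date (
  date_key      NUMBER PRIMARY KEY,
  calendar_date DATE
);

CREATE TABLE fact_sales (
  customer_key NUMBER REFERENCES dim_customer (customer_key),
  date_key     NUMBER REFERENCES dim_date (date_key),
  sale_amount  NUMBER(12,2)
);
```

Keeping measures in the fact table and descriptive attributes in the dimensions is what allows BI tools such as Tableau to slice the measures by any dimension attribute.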
Environment: ER/Studio, SQL, Python, APIs, OLAP, OLTP, PL/SQL, Oracle, Teradata, BI, Tableau, ETL, SSIS, SSAS, SSRS, T-SQL, Redshift, IICS, data modeling.

Appshark Software - Hyderabad, India Sep 2016 to Nov 2017
Data Analyst
Responsibilities:
Designed and built reports, processes, and analyses with a variety of business intelligence tools and technologies. Transformed data from various sources into meaningful insights to support the development of global strategy and initiatives.
Involved in requirements gathering, source data analysis, identified business rules for data migration, and for developing data warehouse/data mart.
Collected data using SQL Script, created reports using SSRS, and used Tableau for data visualization and custom reports analysis.
Performed Exploratory Data analysis (EDA) to find and understand interactions between different fields in the dataset, handling missing values, detecting outliers, data distribution, and extracting important variables graphically.
Worked with the Python libraries NumPy, pandas, and SciPy for data wrangling and analysis, and used Matplotlib for plotting graphs and visualization.
Performed data collection, cleaning, wrangling, analysis, and machine learning model building on datasets in both R and Python.
Used Agile methodologies to emphasize face-to-face communication and ensure that each iteration passed through the full SDLC.


Infinite Computer Solutions - Bangalore, India Jul 2015 to Aug 2016
SQL Developer
Responsibilities:
Worked with Digital and Analytics teams, focused on the Health Insurance processes for delivering solutions. Created Joins, User Defined Functions (UDFs), Complex Stored Procedures, Indexes, Tables, and other T-SQL scripts.
Created views to simplify user-interface implementation and defined triggers on them to ensure consistent data entry into the database.
Developed physical and logical database architecture for warehouses and servers, and normalized all tables in the database to third normal form (3NF).
Migrated data from text files and Excel spreadsheets to SQL Server databases using SSIS. Extensive experience in Data Definition, Data Manipulation, Data Query, and Transaction Control Language.
Gathered requirements by interacting with business users and mapped them to design and implementation, following the Agile development methodology.
Installed, upgraded, and configured Microsoft SQL Server, and migrated data from SQL Server 2008 to SQL Server 2012.
Designed and created tables, views, user-defined data types, indexes, stored procedures, cursors, triggers, and transactions.
Automated scripts in T-SQL, BCP, and DTS to import data from various ODBC sources into production SQL Server databases.
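A stored procedure of the kind listed above might be sketched in T-SQL as follows; the table, columns, and procedure name are placeholders, not from the actual project:

```sql
-- Hedged sketch: hypothetical claim-status update wrapped in a transaction.
CREATE PROCEDURE dbo.usp_UpdateClaimStatus
  @ClaimId   INT,
  @NewStatus VARCHAR(20)
AS
BEGIN
  SET NOCOUNT ON;
  BEGIN TRY
    BEGIN TRANSACTION;
    UPDATE dbo.Claims
       SET Status    = @NewStatus,
           UpdatedAt = GETDATE()
     WHERE ClaimId = @ClaimId;
    COMMIT TRANSACTION;
  END TRY
  BEGIN CATCH
    -- Roll back on any error, then re-raise for the caller
    IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
    THROW;
  END CATCH
END;
```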