
Pradeep Kumar - Snowflake/Teradata Developer
[email protected]
Location: Tampa, Florida, USA
Relocation: No (remote only)
Visa: GC EAD
PRADEEP KUMAR BONAMSHETTY | Phone: 508-644-6732
Sr. Snowflake Architect | Email: [email protected]

Certifications
SnowPro Core Certified
AWS Certified Cloud Practitioner
Teradata Certified Professional
PROFESSIONAL SUMMARY

12+ years of IT experience in Database Design, Administration, and Architecting Enterprise Data Warehouses on Teradata/Snowflake.
Architect the data warehouse and provide guidance to the team in implementation using Snowflake cloud-based features.
Hands-on experience with Snowflake features - Time Travel, Fail-Safe, Data Streams, Tasks, Zero Copy Cloning, Data Masking, Data Sharing, Clustering Keys, Resource Monitors, RBAC controls, etc. (see the sketch at the end of this summary).
Possesses strong leadership skills with a willingness to lead, create Ideas, and be assertive.
Experience in performance tuning of Snowflake pipelines and virtual warehouses.
Good knowledge of and exposure to data analysis/pipeline development, automation & orchestration platforms, and open-source tools like Data Build Tool (DBT), Airflow, Spark, and Kafka.
Experience as Teradata Administrator for managing regular DBA activities like Create Database, Users, Tables, Views, macros, Procedures, checking the tables skew, profiles, roles, grant/revoke permissions, space management (spool, temporary, permanent space), compression, troubleshooting the failure jobs.
Proficient in Teradata Application Support, Performance Tuning, Optimization, User & Security Administration, Data Administration and setting up the environments.
Proficient in Database Design (Logical and Physical). Deep understanding of relational as well as NoSQL (DynamoDB, MongoDB) data stores, methods, and approaches (dimensional modeling, star and snowflake schemas).
Monitoring Teradata CPU/load utilization, space management, and system health trends periodically for future enhancements and pipeline projects.
Worked with Teradata Viewpoint and Teradata PDCR to monitor the systems and to generate various performance and system-related reports.
Experience as On-Call DBA Support (24X7) as part of a scheduled rotation with team members.
Good understanding of the AWS, Azure, and GCP cloud platforms and experience with services such as S3, EC2, IAM, Cloud Storage, and Blob Storage.
Familiarity with a variety of programming/scripting languages like Python, Java, C/C++.
Involved in Data Migrations and Disaster Recovery projects.
Programming experience in Teradata SQL, Stored Procedures, Macros and Triggers.
Responsible for the design and implementation of DW Dimensional Model Architecture using Join Indexes and Partitioned Primary Indexes to achieve and exceed the desired performance.
Used TARA GUI for BAR operations on the Teradata databases and monitored the backup jobs; used the most efficient methods for data and DDL backups and environment refreshes.
Define and maintain TASM Workloads, TASM exceptions, filters and throttles as needed.
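The Snowflake features called out above (Time Travel, Zero Copy Cloning, Resource Monitors) follow a simple SQL pattern; a minimal illustrative sketch, assuming hypothetical object names (sales, sales_dev, dev_wh, dev_monitor):

    -- Time Travel: query a table as it existed two hours ago
    SELECT * FROM sales.orders AT (OFFSET => -7200);

    -- Recover a dropped table within the retention window
    UNDROP TABLE sales.orders;

    -- Zero Copy Cloning: instant, storage-free copy for dev/testing
    CREATE DATABASE sales_dev CLONE sales;

    -- Resource Monitor: cap compute spend on a warehouse
    CREATE RESOURCE MONITOR dev_monitor WITH CREDIT_QUOTA = 100
      TRIGGERS ON 90 PERCENT DO SUSPEND;
    ALTER WAREHOUSE dev_wh SET RESOURCE_MONITOR = dev_monitor;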

TECHNICAL SKILLS

Databases: Teradata, Snowflake, Oracle, Microsoft SQL Server, Greenplum, DynamoDB, MongoDB, Redshift, RDS, BigQuery
Tools & Frameworks: Teradata SQL Assistant, BTEQ, Teradata Viewpoint, Teradata Administrator, Teradata Dynamic Workload Manager, Unity Director, Data Mover, TSET, Priority Scheduler, MultiLoad, FastLoad, FastExport, TPump, SQL*Plus, Peregrine Service Center, EURC (End User Request Center), ITSM (Information Technology Service Management), JIRA, ServiceNow, Informatica PowerCenter, Informatica Metadata Manager, Manta, Databricks, Tableau, Power BI, Data Build Tool (DBT), Microsoft SQL Server Management Studio, Airflow, ActiveBatch, AutoSys, Control-M
Languages: C, C++, Java, JavaScript, Python, SQL, PL/SQL
Backup & Recovery: DSA, TARA GUI Backup/Restore, Teradata ARC, Symantec NetBackup
Cloud Services: AWS, Azure, GCP
Operating Systems: Microsoft Windows, UNIX, Linux


EDUCATION
Bachelor's in Computer Science, JNTU Hyderabad, India
Master's in Computer Science, University of Central Missouri, Missouri, USA


Professional Experience:

AT&T - (Tampa, Florida) Nov 2021 - Jul 2023
Snowflake Architect
Responsibilities:

Experience in designing and implementing a fully operational solution on Snowflake Cloud Data Warehouse.
Hands-on experience with Snowflake features - Time Travel, Fail-Safe, Data Streams, Tasks, Zero Copy Cloning, Data Masking, Data Sharing, Clustering Keys, etc.
Define roles, privileges required to access different database objects.
Hands-on experience with Snowflake internal and external stages. Bulk loading from the external stage (AWS S3) into Snowflake using the COPY command (see the sketch at the end of this section).
Good exposure to different types of Snowflake tables and views; handled large and complex data sets in CSV and JSON file formats.
Define virtual warehouse sizing in Snowflake for different types of workloads.
Created Snowpipe for continuous data loading and set up data sharing between two Snowflake accounts.
Redesigned views in Snowflake to improve performance.
Provide production support for Data Warehouse issues such as data load problems, transformation/translation issues, etc.
Administration of Teradata database production and non-production activities like User Management, Object and Storage Management, code review and migrations as per Teradata Best practices.
Experienced in performance optimization techniques and structured query language (SQL) tuning.
Implement Teradata Security on Database access and System-enforced security features based on user groups. Grant and Revoke object permissions on databases/users/roles as per the access layer architecture.
Real-time monitoring of the Teradata databases, ensuring their highest performance and availability using Viewpoint.
Monitor QueryGrid portlets for Teradata-to-Hadoop QG jobs, troubleshoot issues with failed QG jobs, and communicate with developers.
Assist developers, DBAs in designing, architecture, development and tuning queries of the project. This included modification of queries, Index selection, and refresh statistic collection.
Performing backup and restore processes using TARA GUI.
Support 24/7 On-Call/backup responsibilities, troubleshoot and resolve end user issues to ensure data warehouse meets service level agreements.
Generate CPU/Load utilization, Space usage growth reports and various performance, system related reports and work with capacity planning and performance improvements.
Collaborate with other DBA members and Architects in planning, testing, configuration and execution of patches, installations, upgrades and migrations.
Use TaYS (Teradata at Your Service) to work with Teradata customer support for system down/restart and other miscellaneous issues.
Maintain and contribute to policies, standards, procedures, and documentation of best practices/process improvements.
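A minimal sketch of the stage / COPY / Snowpipe / data-sharing pattern described above; all object names (raw_db, orders_stage, the S3 bucket, s3_int, orders_share, consumer_acct) are hypothetical placeholders, not the client's actual objects:

    -- External stage over an S3 bucket
    CREATE STAGE raw_db.public.orders_stage
      URL = 's3://my-bucket/orders/'
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    -- One-time bulk load from the stage into a table
    COPY INTO raw_db.public.orders
      FROM @raw_db.public.orders_stage
      ON_ERROR = 'CONTINUE';

    -- Snowpipe: continuous load of newly arriving files
    CREATE PIPE raw_db.public.orders_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_db.public.orders
      FROM @raw_db.public.orders_stage;

    -- Secure data sharing with a consumer account
    CREATE SHARE orders_share;
    GRANT USAGE ON DATABASE raw_db TO SHARE orders_share;
    GRANT USAGE ON SCHEMA raw_db.public TO SHARE orders_share;
    GRANT SELECT ON TABLE raw_db.public.orders TO SHARE orders_share;
    ALTER SHARE orders_share ADD ACCOUNTS = consumer_acct;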

Verizon - (Tampa, Florida) Apr 2019 - Sep 2021
Snowflake DBA
Responsibilities:

Support multiple Teradata Systems for Verizon Wireless & Wireline.
Manage Teradata Database using Teradata Administrator, Teradata SQL Assistant and BTEQ. Create /Modify/Drop Teradata objects like Tables, Views, Join Indexes, Triggers, Macros, Procedures, Databases, Users, Profiles and Roles.
Proactively monitoring database space, identifying tables with high skew, and working with the data modeling team to change the Primary Index on highly skewed tables (see the skew-check sketch at the end of this section).
Error handling and support of existing applications for failed reloads using Teradata MultiLoad, BTEQ and FastLoad.
Updated numerous BTEQ/SQL scripts, making appropriate DDL changes; optimized and tuned SQL to fix Teradata spool issues.
Monitoring bad queries, aborting bad queries using Viewpoint, looking for blocked sessions, and working with development teams to resolve them.
Performing backup and restore processes using TARA GUI.
Support 24/7 On-Call/backup responsibilities, troubleshoot and resolve end user issues to ensure data warehouse meets service level agreements.
Worked on capacity planning; produced disk and CPU usage growth reports using Teradata DBQL and ResUsage.
Worked on Space considerations and managed Perm, Spool and Temp Spaces.
Review DDL code requests following some of the DDL standards and perform the deployments.
Use TaYS (Teradata at Your Service) to work with Teradata customer support for system down/restart and other miscellaneous issues.
Proactively sharing best practices and recommending improvements to the process, in order to increase the quality of the data.
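A minimal sketch of the skew check behind the space-monitoring work above, based on the standard DBC.TableSizeV view; the 50% threshold is illustrative:

    SELECT  DatabaseName,
            TableName,
            SUM(CurrentPerm) AS TotalPermBytes,
            CAST((1 - AVG(CurrentPerm) / NULLIF(MAX(CurrentPerm), 0)) * 100
                 AS DECIMAL(5,2)) AS SkewFactorPct
    FROM    DBC.TableSizeV
    GROUP BY 1, 2
    HAVING  (1 - AVG(CurrentPerm) / NULLIF(MAX(CurrentPerm), 0)) * 100 > 50
    ORDER BY 4 DESC;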

Wells Fargo - (Charlotte, North Carolina) Sep 2018 - Apr 2019
Sr. Teradata DBA
Responsibilities:

Proactively monitoring database space, identifying tables with high skew, and working with the data modeling team to change the Primary Index on highly skewed tables.
Managing database space, allocating new space to database, moving space between databases as needed.
Working closely with end users to understand and resolve issues.
Performance Tuning, SQL query optimization and code analysis.
Live server monitoring; tracked killer queries, blocked sessions, and high-CPU-consuming sessions/users, and worked with developers and end users to resolve them.
Built monthly and weekly CPU/space usage growth reports and provided recommendations.
Effectively used PDCR/DBQL for various object management and performance reports and issues (see the DBQL sketch at the end of this section).
Monitor and Abort Sessions for blocked, high CPU consuming, highly skewed and long running queries.
Monitored Database / Table Space Usage & Skewness at Database / Table / Vproc level.
Work on production and UAT deployments and access requests from end users.
Reviewed SQL for missing joins and join constraints, stats, mismatched aliases, and casting errors.
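A minimal sketch of the DBQL-based CPU/usage reporting mentioned above; the 7-day window is illustrative, and PDCR history tables can be substituted for DBC.DBQLogTbl where PDCR is in place:

    SELECT  CAST(StartTime AS DATE) AS LogDate,
            UserName,
            COUNT(*)          AS QueryCnt,
            SUM(AMPCPUTime)   AS TotalAmpCpu,
            SUM(TotalIOCount) AS TotalIO
    FROM    DBC.DBQLogTbl
    WHERE   CAST(StartTime AS DATE) >= CURRENT_DATE - 7
    GROUP BY 1, 2
    ORDER BY 4 DESC;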

Expedia Inc. (Seattle, Washington) Sep 2017 - Sep 2018
Sr Teradata DBA
Responsibilities :

Teradata Database Administration and resolving tickets/issues within committed SLAs.
Performance Tuning, SQL query optimization and code analysis.
Proactively monitoring database space, identifying tables with high skew, and working with the data modeling team to change the Primary Index on highly skewed tables.
Managed database space, allocating new space to database, moving space between databases as needed.
Built monthly, weekly CPU/Space Usage Growth Reports and provided recommendations.
Live server monitoring; tracked killer queries, blocked sessions, and high-CPU-consuming sessions/users, and worked with developers and end users to resolve them.
Performed and maintained DSA weekly full backup jobs and monitored them in Viewpoint.
Effectively used PDCR-DBQL for different object management and performance reports/issues.
Participated in On-call and weekend support responsibilities.
Monitor and Abort Sessions for blocked, high CPU consuming, highly skewed and long running queries.
Enabled Access logging and DBQL logging in DEV, UAT following Teradata best practices.
Monitored Database / Table Space Usage & Skewness at Database / Table / Vproc level.
Review DDL code requests following some of the DDL standards and perform the deployments.
Reviewed SQL for missing joins and join constraints, stats, mismatched aliases, and casting errors.
Worked closely with end users to understand and resolve issues.

Cigna-HealthSpring (Nashville, Tennessee) Oct 2016 - Sep 2017
Sr Teradata DBA
Responsibilities:

Teradata Database Administration and resolving tickets/issues within committed SLAs.
Implemented Teradata Security on Database access, Control, Monitor and System-enforced security features based on user groups.
Performance Tuning, SQL query optimization and code analysis.
Prepared environment container scripts for specific application projects and deployed them.
Managed database space, allocating new space to database, moving space between databases as needed.
Built monthly, weekly CPU/Space Usage Growth Reports and provided recommendations.
Granted and Revoked object rights on databases/users/roles per the access layer architecture.
Live server monitoring; tracked killer queries, blocked sessions, and high-CPU-consuming sessions/users, and worked with developers and end users to resolve them.
Participated in Data Migrations between Production & DR systems.
Participated in On-call and weekend support responsibilities.
Worked with Informatica team and Business Objects team in testing their connection to Teradata.
Created new ETL, Analyst, Reporting, DBA, and Architect roles and profiles.
Created an access matrix for management purposes, to track rights granted to different profiles through different roles (see the role/profile sketch at the end of this section).
Created application databases in Dev and assigned appropriate privileges on the objects to different roles.
Enabled Access logging and DBQL logging in DEV, UAT following Teradata best practices.
Monitored Database / Table Space Usage & Skewness at Database / Table / Vproc level.
Assist developers to write DDL code following best practices.
Review DDL code requests following some of the DDL standards and perform the deployments.
Reviewed SQL for missing joins and join constraints, stats, mismatched aliases, and casting errors.
Worked closely with end users to understand and resolve issues.
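A minimal sketch of the role/profile pattern behind the access-layer work above; role, profile, database, and user names (and the placeholder password) are hypothetical:

    CREATE ROLE etl_app_rw;
    GRANT SELECT, INSERT, UPDATE, DELETE ON app_db TO etl_app_rw;

    CREATE ROLE rpt_app_ro;
    GRANT SELECT ON app_db TO rpt_app_ro;

    CREATE PROFILE etl_profile AS
      SPOOL = 200e9,
      TEMPORARY = 50e9,
      PASSWORD = (EXPIRE = 90, MAXLOGONATTEMPTS = 3);

    CREATE USER etl_user AS
      PERM = 0,
      PASSWORD = temp_pwd_123,   -- placeholder; expires per profile policy
      PROFILE = etl_profile;

    GRANT etl_app_rw TO etl_user;
    MODIFY USER etl_user AS DEFAULT ROLE = etl_app_rw;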

Comcast (Philadelphia, PA) Apr 2016 - Sep 2016
Sr Teradata Analyst
Responsibilities:

Analysis of complex SQL scripting and lineage within Informatica Metadata Manager.
Review BTEQ scripts being used to load data tables and confirm the lineage is matching how the BTEQ scripts are developed.
Report discrepancies discovered through analysis and the suggested resolution.
Documenting all the validated issues, including missing lineages and BTEQs, and clearly explaining them to developers.
Performing the validations in UAT and Prod and making sure both systems are in sync.
Confirm the objects are one-time load tables or Incremental tables based on the analysis for missing lineages.
Installed all Teradata DEV, UAT and PROD Catalogs in Informatica Metadata Manager to perform data lineage analysis.
Configure the properties and the necessary files into the Manta directories.
Establish and maintain end-to-end data lineage accurately reflecting UAT, PRODUCTION processes and data flows in Manta.
Drive data analysis meetings for data migration from external data sources, and documenting data conversion requirements.
Retrieve all the BTEQs involved in a target lineage object for migration from DEV to UAT and UAT to PROD.
Write SQL against the audit columns to confirm whether a table is an initial load or an incremental load (see the sketch at the end of this section).
Validating the number of BTEQs under each database in IMM lineage between Dev, UAT and Prod.
Work with Manta technical support team and raise tickets for any issues/blockers.
Worked on the validation status documents & reporting to manager on regular basis.
Followed the agile methodology, attending daily 15-minute scrum calls to synchronize activities and create a plan for the next 24 hours.
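A minimal sketch of the audit-column check used to decide whether an object is a one-time (initial) load or an incremental load; the table and the load_ts audit column are hypothetical, since the actual audit columns vary by application:

    SELECT  COUNT(DISTINCT CAST(load_ts AS DATE)) AS DistinctLoadDates,
            MIN(load_ts) AS FirstLoadTs,
            MAX(load_ts) AS LastLoadTs
    FROM    app_db.customer_dim;
    -- A single distinct load date suggests a one-time load; many dates with a
    -- recent MAX(load_ts) suggest the table is loaded incrementally.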

JPMC (Newark, Delaware) Aug 2012 - Feb 2016
Teradata DBA

Responsibilities:

Monitoring the EURC (End User Request Center) Queue and solving the requests before the SLA.
Participated in the Data Migration (DDL reviews and implementations) activities between Development and Production environments through the standard ITSM process.
Support 24/7 on-call/backup responsibilities as part of a scheduled rotation with other team members, resolving P1/P2/P3/P4 tickets before SLA breach.
Monitor the Shared Mailbox regarding the end user questions/issues and working with them.
Manage Teradata Database using Teradata Administrator, Teradata SQL Assistant and BTEQ.
Create / Modify / Drop Teradata objects like Tables, Views, Join Indexes, Triggers, Macros, Procedures, Databases, Users, Profiles and Roles.
Monitor Database / Table Space Usage & Skewness at Database / Table / Vproc level.
Monitor and Abort Sessions for blocked, high CPU consuming, highly skewed and long running queries.
Using DBQL and Access Logging to monitor database activity.
Managing database space, allocating new space to databases, and moving space between databases as needed.
Monitor backup results and resolve/troubleshoot failed database backups.
Use Teradata Explain to analyze and improve query performance.
Execute DIAGNOSTIC HELPSTATS ON FOR SESSION to find missing statistics on tables referenced by queries (see the sketch at the end of this section).
Granting & Revoking of Object Rights, System Rights on Databases / Users / Roles.
Performed the validations of the deployments.
Collect Statistics on NUPIs/USIs/NUSIs/non-indexed join columns/small tables for better performance of queries.
Supported DR releases to sync objects between Prod and DR and generated reports.
Supported in several Teradata Expansion releases performing Validations, making necessary profile changes and resolving password issues.
Submitted, enabled, and started Data Mover (DM) jobs.
Monitored logs of failed DM jobs and troubleshot them using the DM repository framework.
Generated DM reports with sync issues between PROD & DR.
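A minimal sketch of the missing-stats workflow described above (table and column names are hypothetical):

    DIAGNOSTIC HELPSTATS ON FOR SESSION;

    -- EXPLAIN output now ends with the optimizer's recommended statistics
    EXPLAIN
    SELECT o.order_id, c.cust_name
    FROM   app_db.orders o
    JOIN   app_db.customers c
           ON o.cust_id = c.cust_id
    WHERE  o.order_dt >= CURRENT_DATE - 30;

    -- Collect the recommended statistics on join and filter columns
    COLLECT STATISTICS ON app_db.orders    COLUMN (cust_id);
    COLLECT STATISTICS ON app_db.orders    COLUMN (order_dt);
    COLLECT STATISTICS ON app_db.customers COLUMN (cust_id);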

State Farm (Bloomington, IL) Feb 2011 - Jul 2012
Teradata DBA
Responsibilities:

Understanding the specifications and analyzing data according to client requirements.
Creating roles and profiles as needed; granting privileges to roles and adding users to roles based on requirements.
Managing database space, allocating new space to databases, and moving space between databases as needed.
Proactively monitoring bad queries, aborting bad queries using PMON, looking for blocked sessions and working with development teams to resolve blocked sessions.
Worked on moving tables from test to production using FastExport and FastLoad.
Extensively worked with DBQL data to identify high usage tables and columns.
Implemented secondary indexes on highly used columns to improve performance
Worked on exporting data to flat files using Teradata FEXPORT.
Worked exclusively with Teradata SQL Assistant to interface with Teradata.
Written several Teradata BTEQ scripts to implement the business logic.
Populated data into Teradata tables by using Fast Load utility.
Created complex Teradata macros, views, and stored procedures to be used in reports (see the macro sketch at the end of this section).
Did error handling and performance tuning in Teradata queries and utilities.
Creating error log tables for bulk loading.
Actively involved in the TASM workload management setup across the organization: defined TASM workloads, developed TASM exceptions, and implemented filters and throttles as needed.
Used the Teradata Manager collection facility to set up AMP usage collection, canary query response, spool usage response, etc.
Developed complex mappings using multiple sources and targets in different databases, flat files.
Developed Teradata BTEQ scripts; automated workflows and BTEQ scripts.
Query optimization (explain plans, collect statistics, Primary and Secondary indexes).
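A minimal sketch of the reporting view/macro pattern mentioned above; database, table, and column names are hypothetical:

    CREATE VIEW rpt_db.v_daily_claims AS
    SELECT claim_dt,
           region_cd,
           COUNT(*)       AS claim_cnt,
           SUM(claim_amt) AS claim_total
    FROM   app_db.claims
    GROUP BY 1, 2;

    CREATE MACRO rpt_db.m_claims_by_region (in_region VARCHAR(10)) AS (
      SELECT claim_dt, claim_cnt, claim_total
      FROM   rpt_db.v_daily_claims
      WHERE  region_cd = :in_region
      ORDER BY claim_dt;
    );

    -- Usage from BTEQ or SQL Assistant
    EXEC rpt_db.m_claims_by_region ('NE');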

OTC (Oklahoma, OK) Jan 2010 - Jan 2011
Teradata DBA
Responsibilities:

Performed Data analysis and prepared the Physical database based on the requirements.
Used Teradata Utilities to ensure High System performance as well as High availability.
Implementation of TASM for performance Tuning and Workload Management.
Used analyst tools like TSET, Index Wizard, and Stats Wizard to improve performance.
Responsible for populating warehouse-staging tables.
Responsible for capacity planning and performance tuning.
Prepared performance metrics.
Worked on and developed crontab scripts to automate monitoring tasks.
Created Teradata objects like Databases, Users, Profiles, Roles, Tables, Views and Macros.
Developed complex mappings using multiple sources and targets in different databases, flat files.
Developed BTEQ scripts for Teradata.
Automated Workflows and BTEQ scripts
Responsible for tuning the performances of Informatica mappings and Teradata BTEQ scripts.
Worked with DBAs to tune the performance of the applications and Backups.
Worked on exporting data to flat files using Teradata FEXPORT.
Query optimization (explain plans, collect statistics, Primary and Secondary indexes).
Built tables, views, UPIs, NUPIs, USIs, and NUSIs (see the DDL sketch at the end of this section).
Written several Teradata BTEQ scripts to implement the business logic.
Worked exclusively with Teradata SQL Assistant to interface with Teradata.
Wrote various macros and automated batch processes.
Writing UNIX Shell Scripts for processing/cleansing incoming text files.
Used CVS as a versioning tool.
Coordinating tasks and issues with the Project Manager and client on a daily basis.
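A minimal sketch of the kind of DDL behind the bullet above, showing a UPI for even row distribution and a NUSI for lookups; the table, columns, and index names are hypothetical:

    CREATE MULTISET TABLE stg_db.tax_filings (
        filing_id   INTEGER NOT NULL,
        taxpayer_id INTEGER NOT NULL,
        filing_dt   DATE FORMAT 'YYYY-MM-DD',
        filing_amt  DECIMAL(15,2)
    )
    UNIQUE PRIMARY INDEX (filing_id)       -- UPI: unique, evenly distributed
    INDEX idx_taxpayer (taxpayer_id);      -- NUSI: speeds taxpayer lookups

    COLLECT STATISTICS ON stg_db.tax_filings COLUMN (taxpayer_id);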