
Kranti Kumar - Sr Snowflake Developer/Admin
[email protected]
Location: Ellicott City, Maryland, USA
Relocation: No
Visa: GC
KRANTI KUMAR
PROFESSIONAL SUMMARY
Around 17 years of total IT experience in the analysis, design, development, testing, implementation, administration, and support of database systems in Linux and Windows environments, including 12+ years of experience with Teradata Enterprise Data Warehouse (EDW) and 3 years of experience with cloud data warehouses (AWS, Azure, and Snowflake).
Experience working with clients of all sizes in the healthcare, financial, and telecom industries.
Expertise in advanced Snowflake concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy cloning, and Time Travel (see the SQL sketch at the end of this summary).
Served as the Snowflake Database Administrator responsible for leading data model design and database migration deployments for production releases, ensuring that database objects and corresponding metadata were successfully implemented in production.
Installed all the client Teradata tools and have extensive experience with TTU 13.00/13.10/14.0/14.10/15.10/16.20 client tools, including Teradata Manager, SQL Assistant (Queryman), and Teradata Administrator.
Expert in planning and executing hardware/software upgrades and expansions, system tuning and troubleshooting on UNIX; knowledge of database recovery and restart procedures, database monitoring, and periodic performance tuning.
Implemented various Teradata-recommended best practices while defining Profiles, Roles, Alerts, Multi-Value Compression, Data Mover, and Backup and Restore. Involved in migrating objects from Teradata to Snowflake in AWS. Experienced in UNIX shell scripting for file manipulation and text processing.
Experience in Performance Tuning and Query optimization
Experience with AWS account setup and configuring S3, EC2, and SQS instances.
Implemented solutions using Snowflake's data sharing, cloning, and Time Travel.
Implemented role-based access control in Snowflake.
Good experience architecting data pipeline solutions using AWS, Snowflake, and Python.
Implemented data ingestion strategies in Snowflake using Snowpipe and external tables.
Implemented automated Python frameworks for ETL processes in Snowflake.
Created and implemented a BAR strategy for the Teradata application and system databases.
Implemented ETL-level statistics collection on indexes and on columns involved in constraints or join conditions.
Extensive experience with Release Management activities using MKS Tool & NetBackup
Hands-on experience with AWS S3, EC2, Lambda, DMS, CloudWatch, RDS, and Redshift services.
In-depth understanding of Snowflake multi-cluster warehouse sizing and credit usage.
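The following SQL sketch illustrates the kind of Snowflake administration work summarized above: a resource monitor, a sized warehouse bound to it, and a zero-copy clone restored via Time Travel. The object names, credit quota, and time offset are hypothetical examples, not taken from any client engagement.

    -- Resource monitor capping monthly credit spend (hypothetical quota)
    CREATE RESOURCE MONITOR monthly_quota_rm
      WITH CREDIT_QUOTA = 500
      TRIGGERS ON 80 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND;

    -- Virtual warehouse sized for ETL and bound to the monitor
    CREATE WAREHOUSE etl_wh
      WAREHOUSE_SIZE = 'MEDIUM'
      AUTO_SUSPEND = 300
      AUTO_RESUME = TRUE
      RESOURCE_MONITOR = monthly_quota_rm;

    -- Zero-copy clone of a table as it looked one hour ago (Time Travel)
    CREATE TABLE sales_recovered CLONE sales AT (OFFSET => -3600);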
SKILLS:
Databases: Teradata, SQL Server, Oracle, Snowflake
Teradata DBA tools: Teradata Manager, Viewpoint Server, Query Manager, BTEQ, FastExport, FastLoad, MultiLoad, TPump, and TARA GUI
Scripting Languages: Shell scripting and Python
CI/CD tools: Bitbucket, Jenkins, Git
Cloud: AWS, Azure, GCP
ETL Tools: Talend, Ab Initio, Informatica
BI/Reporting Tools: Crystal Reports, WebFOCUS, and Tableau
EDUCATION:
Master's in Computer Applications, Osmania University, 2003
Bachelor's degree, Computer Science, Osmania University, 2000
CERTIFICATIONS:
AWS Certified Solutions Architect - Associate
SnowPro Core Certified
PROFESSIONAL EXPERIENCE
Client: DELTA DENTAL, CA Mar 2023 to Present
Role: Snowflake Developer/Admin.
Responsibilities:
Implemented Snowpipe for near-real-time data ingestion (a Snowpipe and masking-policy SQL sketch follows this section)
Automated the purging process and data reconciliation between source and target using Python scripting
Shared sample data with the customer for UAT by granting access
Created Snowpipe pipes for continuous data loading
Implemented Python scripts for migrating Teradata objects to Snowflake objects
Worked on tuning Snowflake queries
Implemented and supported automation for production deployments in Snowflake
Diagnosed and troubleshot Snowflake database errors
Implemented Snowflake user/query log analysis and alerts
Implemented data encryption/decryption/masking policies to protect PII/PCI data
Demonstrated proficiency in index design, query plan optimization, and analysis
Created Snowflake utilization and capacity plan
On-call responsibilities on a rotational basis for Snowflake database administration
Developed secure access to objects in Snowflake with Role-Based Access Control (RBAC)
Configured SSO using Azure AD
Integrated Tableau and Power BI with Snowflake
Created warehouses, databases, and all database objects (schemas, tables, etc.) and access management policies
Analyzed production workloads and developed strategies to run Snowflake with scale and efficiency
Experienced in Snowflake performance tuning, capacity planning, handling cloud spend and utilization
Defined and automated DBA functions
Developed operational best practices, monitoring, SLA definitions, and metrics measurement
Defined and documented best practices and support procedures, and helped with on-call support
Worked on data platform cost, credit consumption, and optimizing cloud spend
Worked on cache usage, Streams, NoSQL data, Snowpipe, and Secure Data Sharing, and automated workflows using advanced SQL and Python
Created data sharing between two Snowflake accounts
Granted roles to users as required
Implemented SSO setup using Azure AD integration.
Created new accounts using the ORGADMIN role
Integrated Snowflake with Power BI, Tableau, and DBeaver
Created and monitored credit usage for all accounts using the ORGADMIN account
Implemented data replication between East and West regions
Created internal and external stages and transformed data during load
Added cluster keys to tables as needed
Migrated Teradata data into an S3 bucket and loaded it into Snowflake tables
Used Git for version control and collaboration
Worked with CI/CD to test and deploy code
Used Airflow to orchestrate data pipelines
Optimized and fine-tuned queries and cloned production data for code modifications and testing
Designed logical and conceptual data models (Erwin), roles (RBAC), network security policies, warehouse policies (multi-cluster, sizing, scaling), and data sharing
Implemented cost optimizations on Snowflake by analyzing and re-engineering consumption by various services (warehouses, Snowpipe, clustering); reduced costs by 25%
Implemented AWS configurations for S3, EC2, and SQS
Monitored performance, bad queries, and user logins
Environment: Snowflake (Business Critical), AWS, Scrum, Tableau, Azure, GitLab, Power BI, Airflow, Python, ADF, Informatica, JIRA, Red Hat Linux, Matillion, ServiceNow
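A minimal Snowflake SQL sketch of the continuous-ingestion and PII-protection patterns described above; the schema, pipe, stage, table, column, and role names are hypothetical placeholders.

    -- Continuous load from an external stage via Snowpipe (auto-ingest)
    CREATE PIPE ingest.claims_pipe AUTO_INGEST = TRUE AS
      COPY INTO ingest.claims
      FROM @ingest.claims_stage
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

    -- Masking policy that reveals PII only to a privileged role
    CREATE MASKING POLICY pii_mask AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***MASKED***' END;

    ALTER TABLE ingest.claims MODIFY COLUMN member_ssn SET MASKING POLICY pii_mask;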


Client: AIR, Columbia, MD Aug 2019 to Feb 2023
Role: Snowflake Developer/Admin.
Responsibilities:
Created and managed users and roles using the USERADMIN role
Used Time Travel (up to 56 days) to recover missing data
Implemented Snowpipe for near-real-time data ingestion
Automated the purging process and data reconciliation between source and target using Python scripting
Shared sample data with the customer for UAT by granting access
Created Snowpipe pipes for continuous data loading
Implemented Python scripts for migrating Teradata objects to Snowflake objects
Migrated all TPT and FastLoad scripts in Teradata to Snowflake COPY statements (see the COPY INTO sketch after this section)
Worked on tuning Snowflake queries
Implemented and supported automation for production deployments in Snowflake
Diagnosed and troubleshot Snowflake database errors
Implemented Snowflake user/query log analysis and alerts
Implemented data encryption/decryption/masking policies to protect PII/PCI data
Demonstrated proficiency in index design, query plan optimization, and analysis
Created Snowflake utilization and capacity plan
On-call responsibilities on a rotational basis for Snowflake database administration
Developed secure access to objects in Snowflake with Role-Based Access Control (RBAC)
Configured SSO using Azure AD
Integrated Tableau and Power BI with Snowflake
Created warehouses, databases, and all database objects (schemas, tables, etc.) and access management policies
Analyzed production workloads and developed strategies to run Snowflake with scale and efficiency
Experienced in Snowflake performance tuning, capacity planning, handling cloud spend and utilization
Defined and automated DBA functions
Developed operational best practices, monitoring, SLA definitions, and metrics measurement
Defined and documented best practices and support procedures, and helped with on-call support
Worked on data platform cost, credit consumption, and optimizing cloud spend
Worked on cache usage, Streams, NoSQL data, Snowpipe, and Secure Data Sharing, and automated workflows using advanced SQL and Python
Created data sharing between two Snowflake accounts
Granted roles to users as required
Implemented SSO setup using Azure AD integration.
Created new accounts using the ORGADMIN role
Integrated Snowflake with Power BI, Tableau, and DBeaver
Created and monitored credit usage for all accounts using the ORGADMIN account
Implemented data replication between East and West regions
Created internal and external stages and transformed data during load
Added cluster keys to tables as needed
Migrated Teradata data into an S3 bucket and loaded it into Snowflake tables
Used Git for version control and collaboration
Worked with CI/CD to test and deploy code
Used Airflow to orchestrate data pipelines
Optimized and fine-tuned queries and cloned production data for code modifications and testing
Designed logical and conceptual data models (Erwin), roles (RBAC), network security policies, warehouse policies (multi-cluster, sizing, scaling), and data sharing
Implemented cost optimizations on Snowflake by analyzing and re-engineering consumption by various services (warehouses, Snowpipe, clustering); reduced costs by 25%
Implemented AWS configurations for S3, EC2, and SQS
Monitored performance, bad queries, and user logins
Environment: Snowflake (Business Critical), AWS, Scrum, Tableau, Power BI, Git, Azure, Airflow, Teradata 16.x, Python, ADF, Informatica, PTC, JIRA, Red Hat Linux, ServiceNow, SVN.
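A minimal sketch of the TPT/FastLoad-to-COPY migration pattern referenced above: Teradata export files land in S3 and a Snowflake COPY statement replaces the old load job. The stage name, storage integration, table name, and file pattern are hypothetical.

    -- External stage over the S3 bucket holding Teradata exports
    CREATE STAGE td_export_stage
      URL = 's3://example-td-exports/claims/'
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = (TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1);

    -- COPY statement standing in for the old TPT / FastLoad script
    COPY INTO edw.claims
      FROM @td_export_stage
      PATTERN = '.*claims_.*[.]csv'
      ON_ERROR = 'ABORT_STATEMENT';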
Client: GENERAL DYNAMICS INFORMATION TECHNOLOGY (CMS-IDR), Baltimore, MD Jan 2015 to Jul 2019
Role: Teradata Data Engineer
Responsibilities:
Scheduled different types of backup jobs, such as weekly/monthly/yearly and full backups, using the NetBackup console
Migrated all BTEQ scripts and stored procedures to Snowflake
Expertise in DSA backup and restore
Expertise in Solaris/Linux administration tasks, such as setting ACLs on directories, unlocking user accounts, and creating directories and subdirectories
Performed system administrator functions in SUSE Linux Enterprise Server (SLES) 10 & 11
Expertise in writing ARC scripts
Worked with the Teradata MultiLoad and FastLoad utilities to load data from Oracle and SQL Server to Teradata
Tuned SQL queries to overcome spool space errors and improve performance (a statistics/EXPLAIN sketch follows this section)
Worked with BTEQ in a UNIX environment and executed TPT scripts from the UNIX platform
Monitored Teradata backups and media servers
Promoted code from DEV to TST, TST to INT, and INT to PRD
Created Teradata restore scripts
Converted all DEV, TST, INT, and PRD TARA scripts to DSA backup scripts
Copied PROD database data into DEV and VAL servers as part of the DEV and VAL refresh
Created multiple Teradata backup scripts using TARA
Expertise with the MKS tool to track code promotions
Implemented project migration forms in different environments
Created multiple application IDs with directory structures in the UNIX environment
Created different NetBackup policies with various retention periods
Performed unlock procedures on application IDs
Resolved critical backup failure issues in the NetBackup console
Involved in various Database upgrades and Database Refresh
Environment: Teradata 16.20/15.10/14.10, TARA GUI 14.10, Teradata Viewpoint, SUSE Linux 11.x, Solaris 10.0/11.0, TDT, NetBackup 8.x/7.x, BTEQ, PTC, Remedy, DD890, DD7200, DSA 15.10, DSA 16.20, Red Hat Linux, ServiceNow, SVN, AWS, Matillion 1.51.
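A small Teradata SQL sketch of the spool-space tuning step mentioned above: collecting statistics on the join columns and checking the plan with EXPLAIN. The table and column names are hypothetical.

    -- Collect statistics on the join columns so the optimizer estimates rows correctly
    COLLECT STATISTICS ON edw.claims COLUMN (member_id);
    COLLECT STATISTICS ON edw.members COLUMN (member_id);

    -- Inspect the plan (and estimated spool) before and after the change
    EXPLAIN
    SELECT m.member_id, COUNT(*)
    FROM edw.claims c
    JOIN edw.members m ON c.member_id = m.member_id
    GROUP BY m.member_id;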
Client: LOCKHEED MARTIN, MD Dec 2010 to Dec 2014
Role: Teradata DBA (CMS-IDR)
Responsibilities:
Installed all the client Teradata tools and have extensive experience with TTU 13.x/14.x client tools in Windows and UNIX environments
Performed system administrator functions in SUSE Linux Enterprise Server (SLES) 10 & 11
Expertise in writing ARC scripts
Monitored Teradata backups and media servers
Created multiple Teradata backup scripts using TARA
Expertise with the MKS tool to track code promotions
Promoted code from DEV to TST, TST to INT, and INT to PRD
Created Teradata restore scripts
Worked with the Teradata MultiLoad and FastLoad utilities to load data from Oracle and SQL Server to Teradata
Tuned SQL queries to overcome spool space errors and improve performance
Worked extensively on tuning queries by eliminating cursor-based stored procedures and using set-based UPDATE statements (a set-based UPDATE sketch follows this section)
Worked with BTEQ in a UNIX environment and executed TPT scripts from the UNIX platform
Created views and conformed views and mapped them to the underlying tables
Created TPT script templates for all incoming medical, eligibility, and pharmacy flat files
Designed all submission files according to HIPAA standards
Conducted design review meetings with the senior architect for new requirements to get approval, and designed mappings according to company standards
Used SQL Assistant to query Teradata tables
Designed complex mappings involving target load order and constraint-based loading
Created NetBackup policies and scheduled jobs using the NetBackup console
Ran SRFs in different environments
Implemented project migration forms in different environments
Created multiple application IDs with directory structures in the UNIX environment
Performed unlock procedures on application IDs
Implemented two-factor authentication on all Teradata nodes and media servers
Copied data from one environment to another environment using TDT
Resolved critical backup failure issues in the NetBackup console
Used Teradata 13.10/14.0 Viewpoint to monitor system performance and health under load
Involved in various Database upgrades and Database Refresh
Coordinated with Informatica admins on implementing Informatica code promotions
Involved in Data Domain setup for Teradata backups
Environment: Teradata 12.0/13.0/13.10/14.0, TARA GUI 13.0/13.10/14.0, Teradata Viewpoint, SUSE Linux 9.0/10/11.0, Solaris 9.0/10.0/11.0, TDT, TASM, PDCR, NetBackup 6.5/7.5, BTEQ, PTC, Remedy, DD890.
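A minimal Teradata SQL sketch of the set-based rewrite mentioned above, replacing a row-at-a-time cursor loop with a single join update; the table and column names are hypothetical.

    -- Set-based update joining the target dimension to the staging table
    UPDATE tgt
    FROM edw.member_dim AS tgt, stg.member_stage AS src
    SET address_line1 = src.address_line1,
        city          = src.city
    WHERE tgt.member_id = src.member_id;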
Client: STATE OF MICHIGAN (DEPARTMENT OF COMMUNITY HEALTH) Feb 2007 to Aug 2010
Role: Teradata Developer
Responsibilities:
Developed logical and physical models using the Erwin tool based on requirements gathered from business users
Created databases and users in dev, test, and production environments
Expertise in writing ARC scripts
Designed, developed, optimized, and maintained database objects such as tables, views, indexes, soft RI, common procedures, and macros; tested and implemented systems
Created roles and profiles on an as-needed basis; granted privileges to roles and added users to roles based on requirements
Scheduled different types of backup jobs, such as weekly/monthly/yearly and full backups
Managed database space, allocated new space to databases, and moved space between databases on an as-needed basis
Assisted developers and DBAs in project design, architecture, development, and query tuning, including query modification, index selection, and refreshing statistics collection
Proactively monitored and aborted bad queries using PMON, looked for blocked sessions, and worked with development teams to resolve blocked sessions
Performed administrative tasks in the Solaris/Linux environment
Proactively monitored database space, identified tables with high skew, and worked with the data modeling team to change the Primary Index on tables with high skew
Worked on moving tables from test to production using FastExport and FastLoad
Extensively worked with DBQL data to identify high usage tables and columns
Implemented secondary indexes on highly used columns to improve performance
Developed various DBQL reports, such as the top 10 queries with high CPU and the top 10 queries with high I/O (a DBQL report sketch follows this section)
Implemented various Teradata alerts using the Alert facility in Teradata Manager; involved in setting up alerts to page the DBA for events such as node down, AMP down, too many blocked sessions, high data skew, etc.
Resolved critical backup failure issues
Installed Teradata Tools in Windows/Unix Environment
Used the Teradata Manager collection facility to set up AMP usage collection, canary query response, spool usage response, etc.
Worked on capacity planning; reported disk and CPU usage growth using Teradata Manager, DBQL, and ResUsage
Good understanding of Teradata workload management concepts; extensively worked with TASM and TDWM; worked with various user groups and developers to define TASM workloads, developed TASM exceptions, and implemented filters and throttles on an as-needed basis
Worked on exporting data to flat files using Teradata FastExport
Wrote several Teradata BTEQ scripts to implement the business logic
Populated data into Teradata tables using the FastLoad utility
Created complex Teradata macros, views, functions, and stored procedures to be used in the reports
Environment: Teradata V2R6, SUSE Linux, Solaris, NetBackup 5.x/6.x, Crystal Reports, Business Objects, Informatica.
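A sketch of the kind of DBQL report mentioned above, pulling the top CPU consumers from the query log; the 7-day window is an assumption, and DBQLogTbl column availability varies slightly by Teradata release.

    -- Top 10 queries by AMP CPU time over the last week (hypothetical window)
    SELECT TOP 10
           UserName,
           QueryID,
           AMPCPUTime,
           TotalIOCount,
           StartTime,
           QueryText
    FROM DBC.DBQLogTbl
    WHERE CAST(StartTime AS DATE) >= CURRENT_DATE - 7
    ORDER BY AMPCPUTime DESC;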
Client: VERIZON, TX Aug 2005 to Jan 2007
Role: Teradata Developer
Responsibilities:
Gathered data requirements from Users and transformed into Logical and Physical Data Models
Designed the mapping document for the multiple billing systems to the EDGE Data warehouse model
Developed Normalized Data Warehouse and Dimensional Modeling using Star Schema and Snowflake Schema
Implemented two-level roles and rights for user security management
Designed, developed, optimized, and maintained database objects such as tables, views, indexes, common procedures, and macros; tested and implemented systems
Developed a data archival strategy, end-user roles strategy, legacy database interfaces, and disaster recovery procedures for the corporate data store
Worked on loading data from several mainframe flat-file sources into staging using Teradata MLOAD, FLOAD, and BTEQ (a BTEQ load sketch follows this section)
Expertise in writing ARC scripts
Reviewed ETL jobs, mapping documents, and design documents
Developed ETL scripts for MLOAD, FastLoad, and TPump to load data into the data warehouse
Monitored system performance using utilities such as PMON as well as the DBC ResUsage tables
Wrote several Teradata BTEQ scripts to implement the business logic
Worked exclusively with Teradata SQL Assistant to interface with the Teradata database
Environment: NCR Teradata V2R5/R6, Teradata Manager, UNIX/Linux, Informatica, Business Objects, Crystal Reports, BTEQ, Erwin.
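A minimal BTEQ sketch (BTEQ commands wrapping SQL) of the flat-file staging loads described above; the logon alias, credentials, file path, and staging table are hypothetical placeholders.

    .LOGON tdprod/etl_user,password
    -- Import a pipe-delimited flat file and insert each record into staging
    .IMPORT VARTEXT '|' FILE = /data/inbound/billing_feed.txt
    .REPEAT *
    USING (acct_no VARCHAR(20), bill_amt VARCHAR(20), bill_dt VARCHAR(10))
    INSERT INTO stg.billing_stage (acct_no, bill_amt, bill_dt)
    VALUES (:acct_no, :bill_amt, :bill_dt);
    .LOGOFF
    .QUIT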