
Naveen Reddy - Snowflake Developer
Email: [email protected] | Desk: 314 635 5133
Location: Saint Louis, Missouri, USA
Relocation:
Visa: H1B

________________________________________
PROFESSIONAL SUMMARY

Around 14 years of IT experience in the analysis, design, development, testing, and implementation of enterprise application integration and client-server applications.
In-depth knowledge of the Snowflake database, data warehouse management, and middleware technologies.
Solid experience architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake cloud data platform, including data modeling, data analysis, data integration, data architecture, data warehousing, cloud-based technologies, DBT, and query performance tuning; experienced with Power BI and Tableau.
Hands-on experience migrating SQL Server to Snowflake and in data modeling. Strong knowledge of SQL, data diagrams, data dictionaries, and relational databases.
Loading data from files staged in an external (Amazon S3) stage and copying the data into target tables with COPY INTO; a minimal sketch of this pattern follows this summary. Developing data pipelines on Spark.
Wrote SQL queries that generated Snowflake-compatible DDL with 100% accuracy using Teradata DBC tables. Working knowledge of Airflow.
Experience with the Snowflake data warehouse, with a deep understanding of Snowflake architecture and performance tuning.
Worked on Amazon S3 components. Built logical and physical data models for Snowflake as requirements changed, and designed and coded the required database structures and components.
Extensive knowledge of connecting to Snowflake accounts from the Python client API; developed scripts using Python and experienced in Python development.
Created stored procedures, user-defined functions/user-defined table functions, and complex SQL scripts in an efficient manner.
Experienced with Snowflake multi-cluster warehouse sizing, credit usage, and virtual warehouses.
Hands-on work with Snowflake cloud technology and Snowflake data sharing, including writing SQL queries against Snowflake. Able to develop UNIX shell scripts to extract, load, and transform data.
Involved in production support for data warehouse issues such as data load problems and transformation/translation errors.
Experienced Azure data engineer with experience in Azure development, Azure Data Factory, Azure Data Lake, and SSIS.
Good technical experience with various components of MuleSoft Anypoint Platform, RAML, and APIs.
Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Tasks, and Streams. Worked on Amazon S3 products. Knowledge of ETL/ELT activities such as processing data from multiple source systems.
Worked on metadata management for Snowflake, Tasks, and Streams.
Developed batch integrations to transfer data in bulk between enterprise applications using MuleSoft Anypoint Platform and an enterprise service bus.
Used several connectors, including HTTP, Database, Salesforce, Workday, Azure, Queues, RabbitMQ, File, and SFTP.
Used SQL and database technologies extensively on several projects, including writing efficient SQL to read and write data. Well experienced in database programming such as stored procedures, functions, and triggers in MS SQL Server 2000 and Oracle.
Good exposure to continuous integration, deployment, and delivery tools. Supported release deployments with DevOps CI/CD tools such as Jenkins and GitHub.
Led development and execution of data solutions for different projects. Also wrote technical documentation and updated project documents in Confluence.
Extensively used various adapters (File, Database, MQSeries, and Email) as sources and destinations in data transformations (XML to EDI, EDI to flat file, database to EDI, database to database, and XML, JSON, and COBOL copybook formats).
Experience with EDI ANSI X12, HL7, and EDIFACT standards in healthcare (HIPAA guidelines), with logic based on EDI transactions.
Responsible for designing, configuring, deploying, troubleshooting, and maintaining applications moving from DEV to TEST to PRD.
Strong debugging, troubleshooting, and problem-solving skills, with an excellent understanding of system development methodologies, techniques, and tools; managed teams supporting technical activities for projects. Experienced in Agile methodologies and in working with stakeholders.
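
As referenced above, a minimal sketch of the S3-stage-and-COPY pattern using snowflake-connector-python; the account, credentials, stage, and table names are illustrative placeholders, not details from any actual engagement:

    import snowflake.connector

    # Connect to a Snowflake account (all credentials here are placeholders).
    conn = snowflake.connector.connect(
        account="my_account",
        user="my_user",
        password="my_password",
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    cur = conn.cursor()

    # Define an external stage over an S3 bucket (hypothetical bucket and keys).
    cur.execute("""
        CREATE STAGE IF NOT EXISTS s3_stage
          URL = 's3://example-bucket/incoming/'
          CREDENTIALS = (AWS_KEY_ID = 'xxx' AWS_SECRET_KEY = 'yyy')
          FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)

    # Bulk-load the staged CSV files into the target table.
    cur.execute("COPY INTO target_table FROM @s3_stage PATTERN = '.*[.]csv'")
    conn.close()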


TECHNICAL SKILLS

Languages: Java, Microsoft Visual Basic 6.0, SQL, PL/SQL, Python, JSP, JDBC, JMS, EJB, Apache Kafka, Spark, Airflow, JavaScript.
Operating Systems: MS Windows 95/NT/2000, MS-DOS, UNIX (Solaris), AIX
Cloud Data Warehouse: Snowflake (SnowSQL, Snowsight, Snowpipe), AWS, Snowflake Cloud Data Warehouse (4.14), Teradata, Azure.
Industry Standards: HIPAA, EDIFACT, X12, HL7, SWIFT, EDI, HTML, XML, XSLT, Web Services, JSON, WSDL, and SOAP
Others: SAP, Siebel, Airflow, PeopleSoft, Power BI, ITX, WMB, IIB, WTX, MuleSoft, shell scripting, OBIEE, SnapLogic.
Databases: Snowflake, Teradata, ETL, DBT, SQL, RDBMS, MongoDB, Informatica Cloud, SQL Server 2000, DB2, Oracle 9i/10g, Workday, MS Access 2000, Databricks, Cloud Data Warehouse (4.14).


EDUCATION

Bachelor of Technology

Certification: AWS Developer Associate.

EXPERIENCE SUMMARY

AT&T, Dallas, TX Jan 22 Till date
Sr. Snowflake Developer

Responsibilities:
Created various stages (internal and external) to load data from sources such as on-premises systems. Defined virtual warehouse sizing in Snowflake for different types of workloads. Integrated Snowflake with external services using SnapLogic.
Responsible for all activities related to data migration, development, implementation, administration, and support of ETL/ELT processes for a large-scale Snowflake cloud data warehouse.
Managed junior members of the development team. In a lead role, responsible for ensuring data is properly stored, maintained, and used to meet organizational goals.
Acted as technical owner of multiple core modules of the platform, accountable for technical quality and delivery timelines. Coached and mentored junior team members.
Hands-on processing of streaming structured and semi-structured data, providing data pipeline solution options in Snowflake. Worked on Streams, the optimizer, data sharing, Tasks, and Snowpipe. Used virtual warehouses and tuned query performance with micro-partitions. Led the team and trained them in Snowflake development.
Worked on Informatica Cloud, which supports Extract, Load, Transform (ELT) and Extract, Transform, Load (ETL) patterns for data integration with Snowflake.
Implemented role-based access control (RBAC) in Snowflake, ensuring secure data access and managing user privileges effectively (see the sketch after this list). Created and managed Snowflake roles, granting appropriate privileges to users based on their responsibilities and data access requirements. Involved in data governance, data security, and data compliance in Snowflake.
Hands-on experience in Snowflake development and administration. Worked with cross-functional teams on integration testing, worked with stakeholders, and participated in management calls such as daily stand-ups, scrums, and requirement-gathering calls. Worked on Glue connectors and Airflow.
Worked on Azure design, implementation, and data analytics solutions on the Snowflake cloud data warehouse.
Hands-on experience copying and transforming data with Snowflake in Azure, including configuring an Azure container for loading data. Worked on large datasets in formats such as JSON and Parquet.
Worked mostly on data protection with Time Travel scenarios in Snowflake. Loaded data into Snowflake tables from an internal stage and from a local machine. Hands-on with metadata and system objects such as query history, granting users access through roles, table clustering, and the benefits of materialized views.
Used COPY, LIST, PUT, and GET commands to validate internal and external stage files, and imported and exported from internal stages (Snowflake) versus external stages (S3 buckets).
Wrote Python with NumPy and pandas, and used Python to run SQL queries in Snowflake. Good knowledge of data warehousing, creating table structures and star schemas within Snowflake.
Applied modeling techniques such as normalization, worked on fact tables, and worked in both OLTP and OLAP environments.
Experience in a Snowflake cloud data warehousing shared technology environment, providing stable infrastructure, robust design architecture, best practices, and automated SCBD (secure database connections, code review, deployment process) utilities.
Good exposure to Snowpipe configuration for continuous data ingestion. Hands-on experience with virtual warehouses, multi-cluster warehouses, and auto-scaling.
Hands-on with Time Travel, Fail-safe, data streams, Tasks, cloning, data masking, data sharing, and query pruning concepts in Snowflake. Worked on user stages, table stages, named stages, and external stages in Snowflake. Worked on SCD techniques, i.e., slowly changing dimensions (type 1, type 2, type 3).
Worked with DBT to connect to the Snowflake database and create models that transform the data, and used Data Vault for initial and incremental loads.
Worked on query performance tuning of sessions and mappings, writing complex SQL procedures and functions to extract, transform, and load data. Performed troubleshooting, analysis, and resolution of critical issues. Experience developing data pipelines in Spark.
Used SnowSQL, which enables users with appropriate privileges to manage roles within Snowflake: assigning roles, granting roles to users, and managing the role hierarchy, allowing administrators to control access effectively.
Designing and implementing data ingestion pipelines from multiple sources using Azure Databricks.
Used the Power BI canvas to create visualizations such as charts, tables, maps, and graphs based on the imported data. Configured data refresh settings to automatically update data from Snowflake on a regular basis.
Worked on GitHub to automate the deployment of Snowflake changes to the production environment, including promoting changes from development and staging environments.
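
A minimal sketch of the RBAC setup referenced in this list, run through the Python connector; the role, user, database, and schema names are hypothetical:

    import snowflake.connector

    # Connect with an administrative role (all names are placeholders).
    conn = snowflake.connector.connect(
        account="my_account", user="admin_user",
        password="my_password", role="SECURITYADMIN",
    )
    cur = conn.cursor()

    # Create a role and grant it read access to a reporting schema.
    cur.execute("CREATE ROLE IF NOT EXISTS analyst_role")
    cur.execute("GRANT USAGE ON DATABASE analytics TO ROLE analyst_role")
    cur.execute("GRANT USAGE ON SCHEMA analytics.reporting TO ROLE analyst_role")
    cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA analytics.reporting TO ROLE analyst_role")

    # Assign the role to a user based on their responsibilities.
    cur.execute("GRANT ROLE analyst_role TO USER report_user")
    conn.close()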

Environment & Tools: Snowpipe, SnowSQL, Snowflake, AWS, Azure, Python, SQL Server, Oracle 10g, Tableau, Power BI, Red Hat Linux, EDI, Kafka, Airflow, Databricks, UltraEdit-32, Windows 2000

Blue Shield of California, Oakland, CA Jul 21 Dec 21
Sr. Snowflake Developer

Responsibilities:
Understood business functional requirements, documented them, and performed feasibility analysis to understand the impact on existing interfaces and end users. Knowledge of cache architecture.
Created a Snowpipe for continuous data loading (see the sketch after this list). Used COPY to bulk-load the data.
Worked with authentication; SnowSQL provides the ability to switch roles within a session.
Worked with both Maximized and Auto-scale functionality. Used temporary and transient tables on different datasets. Knowledge of Matillion ETL for Snowflake.
Extensive knowledge of creating stored procedures and user-defined functions/user-defined table functions. Worked on healthcare transactions (HL7 v2, CCD, FHIR) in Snowflake.
Experienced in migrating on-premises databases to the Microsoft Azure environment (Blobs, Azure Data Warehouse, and SSIS Azure components).
Developed code using the UI or SnowSQL; configured S3 buckets, integration objects, file formats, and staged data in the Snowflake system. Transformed data in the staging layer and used Snowflake Tasks and Snowpipes to schedule data loads.
Used Snowflake features such as zero-copy cloning, Time Travel, and data sharing as part of development and testing. Wrote test cases and use cases to validate loaded data and raised defect radars with the source team.
Developed Teradata stored procedures to report business metrics through Teradata tables.
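
A minimal sketch of defining a Snowpipe for continuous loading, as referenced in this list; the pipe, stage, and table names are placeholders:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password",
        database="ANALYTICS", schema="STAGING",
    )
    cur = conn.cursor()

    # A pipe wraps a COPY statement; with AUTO_INGEST enabled, new files are
    # loaded as S3 event notifications arrive, with no manual COPY runs.
    cur.execute("""
        CREATE PIPE IF NOT EXISTS member_pipe
          AUTO_INGEST = TRUE
          AS COPY INTO members FROM @s3_stage FILE_FORMAT = (TYPE = JSON)
    """)
    conn.close()
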
Environment & Tools: Snowflake Cloud Data Warehouse, Workday, Snowpipe, HL7, AWS, Azure, Python, SQL Server, Linux, EDI, MuleSoft, and Windows 2000.

Hamburg Commercial Bank Jan 21 Jun 21
Snowflake Developer

Responsibilities:
Developed Teradata stored procedures to report business metrics through Teradata tables.
Extracted 200+ tables from Teradata and loaded them into Snowflake as part of the history load.
Built complex ETL jobs that transform data visually with data flows or by using compute services such as Azure Databricks and Azure SQL Database.
Designed and implemented data pipelines using Apache NiFi and Airflow, processing over 2 TB of data daily (see the sketch after this list).
Created file-watcher and file transfer jobs with the correct archival methodology.
Reduced Snowflake space used by adding transient tables where appropriate and ensuring optimum clustering column definitions.
Documented all deliverables for a smooth handover to the production team.
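
A minimal sketch of how such a daily Airflow pipeline might be wired (Airflow 2.x API); the DAG id, task id, and load step are illustrative, not taken from the actual project:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def load_batch():
        # Placeholder for the extract/load step (e.g., pull a Teradata
        # extract and push it to Snowflake); real logic would go here.
        pass

    # One run per day; catchup=False skips backfilling past dates.
    with DAG(
        dag_id="daily_ingest",
        start_date=datetime(2021, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="load_batch", python_callable=load_batch)
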
Environment & Tools: Snowflake, Snowpipe, Teradata, SQL Server, Python, Airflow, Azure, Windows 2000, UNIX.

Wal-Mart, Bentonville, AR Nov 14 Jan 21
Sr. Snowflake Developer

Responsibilities:
Strong experience migrating other databases to Snowflake and with Snowflake multi-cluster warehouses.
Experience using Snowflake cloning and Time Travel (see the sketch after this list). In-depth knowledge of data sharing in Snowflake.
Created and used Snowflake database, schema, and table structures and stored procedures for users.
Wrote advanced SQL scripts as per the business plan to transform the data.
Worked with cross-functional teams on ad hoc tasks such as optimizing database space usage and database/server migration.
Maintained strong hands-on skills with MuleSoft Anypoint Platform and stayed current on MuleSoft products, strategies, and best practices for RAML understanding and implementation.
Analyzed input and output specifications and developed data conversion maps using WTX 8.2, including new maps for interfaces such as 810, 820, 850, and 855, converting EDI to XML/copybook, IDoc to flat file to XML, and XML to EDI.
Successfully performed Snowflake integration with Azure AD (single sign-on).
Involved in pre-production support and early-life defect resolution; monitored project execution and reported on project performance.
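
A minimal sketch of the cloning and Time Travel features referenced in this list, via the Python connector; the table names and offset are illustrative:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password",
        database="ANALYTICS", schema="PUBLIC",
    )
    cur = conn.cursor()

    # Zero-copy clone: an instant copy of a table that shares the
    # underlying storage, handy for development and testing.
    cur.execute("CREATE TABLE orders_dev CLONE orders")

    # Time Travel: query the table as it existed 30 minutes ago.
    cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -60 * 30)")
    print(cur.fetchone())
    conn.close()
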
Environment & Tools: ITX 9.0, WTX 8.4, IBM WebSphere MQ 7.0, SoapUI, Python, Azure, EDI, Snowflake, MuleSoft, AWS.

Suncor, Alberta, Canada Jan 11 Oct 14
WebSphere Developer

Responsibilities:
Designed all the X12 transactions and integrated the EDI and XML system.
Participated in change management process and production deployment.
Used WMB flows to route data to XML and EDI systems. Involved in EDI customizations.
Developed message flows with HTTP nodes (HTTPInput, HTTPRequest, HTTPReply).
Involved in troubleshooting using logs and traces, and designed generic error-handling subflows to handle exceptions.
Environment & Tools: WTX 8.4, ITX 9.0, WebSphere MQ, Message Broker, IIB 9, Windows 2000

AmerisourceBergen, Conshohocken, PA Jan 10 Dec 10
Software Developer

Responsibilities:
Production support for Cyclone Commerce message monitoring.
Involved in the pedigree process for 856 transactions. Provided production support for major EDI transactions such as 850, 855, and 856.
Reprocessed EDI error files, mainly for 850/855 transactions.
Environment & Tools: DataStage TX 6.7/8.0, Commerce Manager, Cyclone 5.3, EPOS 7.4, Doc man, CSOS

First Choice Health Plan Oct 09 Dec 09
Software Developer

Responsibilities:
Developed functional acknowledgments and trading partner validation using conventional type trees and maps.
Worked on interfaces such as 834, 837, 270, 271, 997, and HL7 from HIPAA transactions.
Environment & Tools: Mercator 6.5/6.7, HL7, Type Designer, Map Designer, EDI ANSI X12, Oracle 8i.