
RAMESH - SNOWFLAKE ARCHITECT
[email protected]
Location: Edison, New Jersey, USA
Relocation:
Visa: H1B
Employer : [email protected]; [email protected]; (609) 778-4215 ext 1000

Ramesh

Senior Snowflake Data Engineer / Data Architect / Lead ETL Developer


Sr. Snowflake Developer | Databricks | Informatica PowerCenter 8.x, 9.x, 10.x, Informatica Admin and IICS | Azure, AWS | Data Warehousing | Business Intelligence | Oracle, DB2, Teradata, SQL Server | SQL, PL/SQL, AGILE methodology, Control-M, Automic user interface, Git/GitHub, UNIX, and Windows.
Profile Summary
Over 16 years of experience in the design, development, and management of enterprise-level data solution projects (Enterprise Data Warehouse, Cloud Data Warehouse, Enterprise Data Lake, data ingestion, data migration, data pipelines, and Business Intelligence).
Experience across domains such as Banking, Finance, Healthcare, Insurance, and Retail.
4+ years of experience in the migration, design, and implementation of large-scale data from on-premises systems to the Snowflake Data Warehouse.
Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Snowpark, Python, Tasks, Streams, Time Travel, the Optimizer, Metadata Manager, data sharing, and stored procedures.
Hands-on experience with Informatica Intelligent Cloud Services (IICS).
Extensively used IICS components such as Mapping tasks, Session tasks, Workflow tasks, PowerCenter tasks, Data Integration, connectors, and Secure Agent groups.
Experience with development, integration, migration, and maintenance/support of applications.
Strong exposure to Azure Databricks for migrating data from on-premises systems to the cloud by building ETL pipelines.
Extensive experience with data quality in terms of data cleansing, transformation, profiling, enrichment, and validation.
Strong experience creating DBT models, seeds, snapshots, and tests.
Strong experience with the ETL tools Informatica PowerCenter, Informatica Admin, B2B Data Transformation, IICS, Snowflake, and Databricks.
Good exposure to handling interpersonal conflicts, leading employee training, managing deadlines, and building company culture.
Experienced in the design, development, and maintenance of business applications, including implementation of data warehousing solutions, the full lifecycle (design, build, and test), and development of Data Warehouse/Data Mart databases.
Very good exposure to Snowflake core features; Snowflake certified professional.
Skilled in working with Hive, Spark SQL, Kafka, and Spark Streaming for ETL tasks and real-time data processing.
Hands-on experience with an integrated data model framework for enterprise data warehousing.
Excellent exposure to implementing, building, and testing client databases through defined framework process flows using Informatica PowerCenter 8.x, 9.x, 10.x, the DSF framework, B2B Data Transformation, Oracle, DB2, and SQL Server.
Hands-on experience with Informatica administration, installation, upgrades, and maintenance.
Hands-on experience with the AGILE methodology.
Managed all aspects of the SDLC process - requirement analysis, time estimates, functional specification, design, development, testing, packaging, and support/maintenance.
Excellent experience in providing Business Intelligence solutions for Data warehousing.




Technical Skills
ETL Tools: Snowflake, IICS, Informatica PowerCenter 7.x, 8.x, 9.x and 10.x, Informatica Admin, DBT (Data Build Tool), and Azure Databricks
Big Data Technologies: MapReduce, Hive, Tez, Python, PySpark, Scala, Kafka, Spark Streaming, Oozie, Sqoop, ZooKeeper, HDFS, SQL, YARN, Pig Latin, Spark, Storm, Flume, T-SQL
Cloud Technologies: Azure Data Factory, Azure Databricks, Logic Apps, Function App, Snowflake, Azure DevOps, Azure Data Lake, Azure SQL Database, Azure Synapse Analytics, Active Directory, Application Insights, Azure Monitoring, Azure Search, Key Vault
Operating Systems: UNIX, Linux, and Windows
Databases & Languages: Oracle, SQL Server 2000, DB2, Teradata, SQL, PL/SQL, Java, Unix scripting
Scheduling Tools: Tivoli, Automic User Interface, DAC, Control-M, Autosys
Reporting: Business Objects, MicroStrategy
Management & Leadership Skills: Project management, business analysis, and development
Version Control: Git/GitHub, Jenkins
Methodology: AGILE, Scrum
Operational Activities: Demand fulfillment, team management, project driving
Domain Knowledge: Insurance, Banking, Retail, Chemical, and Healthcare (HIPAA, EDI, HITECH, SOC-audited)

Work experience:
Currently working as Lead Consultant - US at UBS from July 2023 to date.
Worked as Lead Consultant - US at Infosys Limited from Mar 2021 to July 2023.
Worked as DWH Architect at Zentek Infosoft India Pvt Ltd from Aug 2020 to March 2021.
Worked as Principal Consultant at CiberSites India Pvt Ltd from Sep 2011 to Jan 2017.
Worked as Senior Consultant at IBM India Pvt Ltd from Aug 2009 to Sep 2011.
Worked as Software Engineer at Hewlett-Packard from Jul 2006 to Aug 2009.
Worked as Software Engineer at Wipro from Mar 2006 to Jul 2006.
Worked as Associate Consultant at Optimum Info System from Apr 2005 to Mar 2006.


1. Project Name WMPC-SaMaRA & ARISK analytics
Client UBS- New Jersey-USA
Role Data Engineer / ETL Developer
Duration Aug 2020 To date
Environment Snowflake, IICS, DBT, Informatica PowerCenter 10.5.2, Databricks, Python, Kafka, Oracle, IBM DB2, SQL, PL/SQL, AGILE framework, AIX, JIRA, GitLab, Unix, Windows.



Project Description: UBS Group AG/UBS Group SA/UBS Group Inc. ("UBS") is a public company incorporated under the laws of Switzerland, with registered and principal offices in Zurich. UBS is listed on the Swiss Stock Exchange (SIX) and the New York Stock Exchange (NYSE).
UBS has several applications to support customers and maintains one central repository of all sales/marketing data used throughout all eCenter applications. This has many modules, such as the Load Concurrency Process, Standard Extract, ODS, DWH, and data marts.

Roles and Responsibilities:

Helping to formulate estimates and timelines for project activities, setting related goals, acting as a Snowflake architect, and mentoring the developers.
Design scalable, high-performance data architectures using Snowflake, including data warehousing, data lakes, and real-time data pipelines.
Created integration objects, external tables, file formats, and stages, and used COPY INTO/Snowpipe to ingest CSV/Parquet/JSON data continuously from Azure containers (see the ingestion sketch after this list).
Very good experience creating IICS Mappings, Tasks, Taskflows, PowerCenter tasks, Synchronization tasks, Replication tasks, etc.
Experience with advanced performance features such as Pushdown Optimization (PDO) and partitioning.
Played an important role in the migration from on-premises systems to Snowflake by analyzing existing systems (Analysis & Planning, Code & Application Migration, Quality & Performance, Cutover & User Training).
Providing direction to the developers when needed and participating in the weekly Technical Integration Working Group meetings.
Experienced in writing queries using SnowSQL, Snowpipe, Streams, Tasks, and UDFs, as well as in reverse-engineering activities.
Extensively used Data Build Tool (DBT) for data transformations (models, seeds, snapshots, and tests) and code deployment (see the DBT model sketch after this list).
Optimizing Snowflake configurations, query performance, and data pipelines to ensure efficient use of resources and meet performance requirements; designing and implementing data security and access control.
Developing ETL pipelines into and out of the data warehouse using a combination of Python and Snowflake SnowSQL, and writing SQL queries against Snowflake.
Implemented solutions using Snowflake data sharing, cloning, and Time Travel (see the cloning and Time Travel sketch after this list).
Experience in building near-real-time ETL pipelines using Snowflake, Kafka and PySpark.
Proficient in scripting languages such as Python, PySpark, and Scala, enabling seamless integration of custom functionalities into data pipelines.
Experienced with the Continuous Data Protection lifecycle: Time Travel and Fail-safe.
Integrated Kafka with Azure Databricks to enable seamless data integration and synchronization.
Cloned historical objects: duplicated objects at their current state or at a specific point in their history.
Responsible for migrating Informatica PowerCenter workflows to IICS.
Extensively used IICS Mappings, Tasks, Taskflows, PowerCenter tasks, Data Integration, connectors, Secure Agent groups, and related components.
Experienced in partitioning strategies and multi-cluster warehouses in Snowflake to ensure optimal query performance and scalability.
Responsible for creating a migration framework to move ETL processes from on-premises Informatica PowerCenter to IICS.
Hands-on working experience with a diverse range of file formats, including CSV, JSON, Parquet, and Avro, to efficiently store, process, and exchange data within data engineering pipelines and analytics workflows.
Executed Hive scripts through Hive on Spark and Spark-SQL to address diverse data processing needs.
Utilized Kafka, Spark Streaming, and Hive to process streaming data, developing a robust data pipeline for ingestion, transformation, and analysis.
Demonstrated ability to design and implement data integration strategies between Snowflake and external systems, leveraging technologies such as Apache Kafka, Airflow, or custom-built orchestration frameworks to ensure seamless data movement and synchronization.
Develop proofs-of-concept (POCs) in Spark using Python to compare performance with Hive and SQL/Oracle, ensuring that POCs adhere to security and governance standards.
Responsible for key architecture decisions and for designing policy creation and enforcement middleware to support dynamic, context-aware policies for access control and resource allocation.
Work with a team of architects and engineers to develop proof-of-concept components.
Created standalone Tasks and trees of Tasks, and called stored procedures from Tasks.
Experience with Streams: change data capture from existing data sources into Snowflake, performing the enhancements required to implement the necessary optimizations (see the Streams and Tasks sketch after this list).
Provided extended support to help and fix the performance issues in production instance.
Developing and monitoring solutions focused on the presentation, logic, and data tiers that allow more junior team members to understand and address service-impacting issues, in order to proactively remediate issues prior to escalation.
Develop and maintain ETL processes to move data from source systems to Snowflake stages.
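
Ingestion sketch - a minimal SnowSQL illustration of the stage / COPY INTO / Snowpipe pattern referenced above. The table, file format, stage, pipe, notification integration, container URL, and SAS token are all placeholder names assumed for the example, not the project's actual objects.

  -- Illustrative only: target table and all object names are placeholders
  CREATE TABLE IF NOT EXISTS sales_raw (sale_id NUMBER, customer_id NUMBER, amount NUMBER, loaded_at TIMESTAMP_NTZ);

  CREATE OR REPLACE FILE FORMAT csv_ff TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"';

  -- External stage over an Azure container (URL and SAS token are placeholders)
  CREATE OR REPLACE STAGE az_stage
    URL = 'azure://<account>.blob.core.windows.net/<container>/sales/'
    CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>')
    FILE_FORMAT = (FORMAT_NAME = 'csv_ff');

  -- One-off bulk load from the stage
  COPY INTO sales_raw FROM @az_stage PATTERN = '.*[.]csv' ON_ERROR = 'CONTINUE';

  -- Continuous ingestion via Snowpipe; assumes a notification integration exists for the container
  CREATE OR REPLACE PIPE sales_pipe
    AUTO_INGEST = TRUE
    INTEGRATION = '<notification_integration>'
  AS
    COPY INTO sales_raw FROM @az_stage;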
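DBT model sketch - an example of the kind of transformation model described above, written as an incremental dbt model in Snowflake SQL. The model/source names (fct_sales, stg_sales) and the unique key are assumptions for illustration.

  -- models/marts/fct_sales.sql (illustrative dbt model)
  {{ config(materialized='incremental', unique_key='sale_id') }}

  select
      s.sale_id,
      s.customer_id,
      s.amount,
      s.loaded_at
  from {{ ref('stg_sales') }} s
  {% if is_incremental() %}
    -- on incremental runs, only process rows newer than what is already loaded
    where s.loaded_at > (select max(loaded_at) from {{ this }})
  {% endif %}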
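Cloning and Time Travel sketch - minimal statements illustrating the features mentioned above; table/schema names, the offset, and the query ID are placeholders.

  -- Query a table as it was one hour ago (Time Travel)
  SELECT * FROM sales_dim AT (OFFSET => -3600);

  -- Zero-copy clone of a schema at its current state
  CREATE OR REPLACE SCHEMA analytics_clone CLONE analytics;

  -- Clone a table as it existed just before a given statement (query ID is a placeholder)
  CREATE TABLE sales_dim_restore CLONE sales_dim BEFORE (STATEMENT => '<query_id>');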
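Streams and Tasks sketch - an illustration of the change-data-capture pattern referenced above, where a Stream records changes on a staging table and a scheduled Task merges them into a target table. The stream, task, table, and warehouse names and the five-minute schedule are assumptions.

  -- Illustrative only: object names, warehouse, and schedule are placeholders
  CREATE OR REPLACE STREAM sales_raw_stream ON TABLE sales_raw;

  CREATE OR REPLACE TASK merge_sales_task
    WAREHOUSE = etl_wh
    SCHEDULE = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('SALES_RAW_STREAM')
  AS
    MERGE INTO sales_dim t
    USING sales_raw_stream s
      ON t.sale_id = s.sale_id
    WHEN MATCHED THEN UPDATE SET t.amount = s.amount
    WHEN NOT MATCHED THEN INSERT (sale_id, customer_id, amount) VALUES (s.sale_id, s.customer_id, s.amount);

  ALTER TASK merge_sales_task RESUME;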


2. Project Name Medica Applications
Client MEDICA
Role Principal Consultant
Duration Feb 2015 To Jan 2017
Environment Informatica PowerCenter 9.x/10.x, Informatica PIM 360, Oracle, SQL Server, Teradata, AGILE framework, OBIEE, DAC, Automic, Unix, Windows 2000.

Medica was founded by physicians in 1975 as Physicians Health Plan, the first open-access health plan in the states. In 1991, PHP merged with Share to become Medica, and in 1994 Medica merged with HealthSpan to form Allina Health System, an integrated organization offering both health care coverage and medical services. Medica became an independent health plan in 2001.
Medica has several applications to support customers, mainly CCMS, COPIS, MDW, DAC, and Red Bricks, and one central repository of all claim data used throughout all eCenter applications. This has many modules, such as the Load Concurrency Process, Standard Extract, ODS, CMS, EBM, and Global Symmetry.

Roles and Responsibilities:

Helping to formulate estimates and timelines for project activities, setting related goals, acting as an ETL architect, and mentoring the developers.
Providing direction to the developers when needed and participating in the weekly Technical Integration Working Group meetings.
Participating in the ETL design phase and creating ETL design specs for mappings.
Using Informatica PowerCenter 9.x to load historical and incremental data into dimension/fact tables.
Responsible for working with business owners and loading data from various source systems such as flat files, XML files, unstructured files, and other feeds.
Understanding existing business model and customer requirements and adding best practice techniques.
Work with a team of architects and engineers to develop proof-of-concept components.
Write and maintain architecture and detailed design specifications.
Manage a team of senior engineers and contractors to implement the key components of the product according to specifications.
Work with QA manager to develop test specifications and collaborating with application developers and architects to define application middleware requirements.
Creating and implementing procedures to bring new services onto the middleware platform and identifying ways to automate repeatable work.
Developing and monitoring solutions focused on the presentation, logic, and data tiers that allow more junior team members to understand and address service-impacting issues, in order to proactively remediate issues prior to escalation.
Maintaining and debugging existing applications that interface with a database back end.
Preventing system resource and application consumption from negatively affecting service availability.
Writing UNIX scripts to implement business rules and directives on the existing framework.
Designed and implemented the export profiles and export format templates to extract data from PIM for publishing.
Defined the file provisioning process (DW preprocessing steps); achieved 100% automation of the file provisioning process using UNIX, Informatica mappings, and Oracle utilities.

3. Project Name Consolidated Claim Data Base (CCDB)
Client Health Management Systems (HMS)
Role Team Lead
Duration Sep 2011 To Jan 2015
Environment Informatica PowerCenter 9.x, B2B Data Transformation, Oracle, DB2, Teradata, Unix, and Windows 2000.
Health Management Systems is a pioneer in cost containment, coordination of benefits, and program integrity services for government health care programs. Using information technology and data mining techniques, HMS identifies other insurance coverage, coordinates benefits, and recovers overpayments. HMS enables Medicare and Medicaid programs throughout the United States to recover their claims from the carriers and thereby financially helps the government through cost containment services.
The Consolidated Claim Data Base (CCDB) is the central repository of all claim data used throughout all eCenter applications. This has many modules like Load Concurrency Process, Standard Extract, Phenox cycle and Calc totals, CCDB Reformats for different clients, BLOB Reformats Process.
Roles and Responsibilities:

Assigning tasks, prioritizing tasks and Enforcing project processes and policies.
Helping to formulate estimates and timelines for project activities and setting related goals and acting as a technical mentor for developers.
Assisting developers with design work, analysis and requirements gathering.
Providing direction to the developers when needed and participating in the weekly Technical Integration Working Group meetings.
Completing a weekly status report for the project manager and assisting with production support issues.
Participating in peer code reviews to make sure that coding standards are followed.
Responsible for key architecture decisions and for designing policy creation and enforcement middleware to support dynamic, context-aware policies for access control and resource allocation.
Work with a team of architects and engineers to develop proof-of-concept components.
Write and maintain architecture and detailed design specifications.
Developed complex pre-stage mappings to reformat data from the Property Source system (XML, flat files) to CCDB-standard flat files, from flat files to the staging area, and from the staging area to the production DB.
Used B2B Data Transformation to convert data from any format to any other format (COBOL to XML, PDF to XML, flat file to XML, and XML to XML).

4. Project Name Achmea Solvency II
Client Achmea-Netherlands
Role Sr. Consultant
Duration Aug 2009 To Sep 2011
Environment Informatica PowerCenter 8.1.1, DB2, Cognos, Unix, Windows 2000.

Project Description:
Achmea is an integrated financial services group with a clear and demonstrable focus on value creation. Their core business is insurance (life, non-life, and health) and services relating to pensions and health.
The goal of Solvency II is to provide a foundation platform for corporate Business Intelligence based on an Enterprise Data Model, which will support consolidation of the company's information resources and facilitate the implementation of a flexible, data-centric view of the Achmea business.
The Solvency II solution will be delivered in a phased manner, via a series of Business Releases, each of which will target different source systems and business subject areas.
Roles and Responsibilities:
Participating in the ETL design phase and creating ETL design specs for mappings.
Understanding existing business model and customer requirements.
Providing optimal Data warehousing solutions to the Business.
Using Informatica PowerCenter 8.1.1 to load historical and incremental data into dimension/fact tables.
Extracting data from multiple source systems: Oracle RDBMS and flat files.
Migrating data from various source systems to the DB2 database.
Using Transformations like Aggregator, Router, Joiner, Expression, Update Strategy, Lookup and Stored Procedures.
5. Project Name Woolworths EDW
Client Woolworths-Australia
Role Senior ETL Developer
Duration Jan 2009 To Aug 2009.
Environment Informatica power center 8.1.1, Oracle9i, Neoview, Business Objects, Unix, Windows 2000.

Project Description:
Woolworths Limited is committed to supporting the 'National Privacy Principles for the Fair Handling of Personal Information', which set clear standards for the collection, access, storage, and use of the personal information obtained as part of its business operations. Woolworths remains a down-to-earth, honest Australian company, dedicated to providing customers with a convenient and enjoyable shopping experience each and every time.
The goal of the EDW is to provide a foundation platform for corporate Business Intelligence based on an Enterprise Data Model, which will support consolidation of the company's information resources and facilitate the implementation of a flexible, data-centric view of the Woolworths business.
The EDW solution will be delivered in a phased manner, via a series of Business Releases, each of which will target different source systems and business subject areas.
Roles and Responsibilities:

Participating in the ETL design phase and creating ETL design specs for mappings.
Understanding existing business model and customer requirements.
Providing optimal Data warehousing solutions to the Business.
Using Informatica PowerCenter 8.1.1 to load historical and incremental data into dimension/fact tables.
Extracting data from multiple source systems.
Designing ETL Data flows from Source systems to Target systems.
Preparation of the unit test plan and test results along with the test data.
Migrating data from various source systems to the Neoview database.
Using Transformations like Aggregator, Router, Joiner, Expression, Update Strategy, Lookup and Stored Procedures.


6. Project Name Master well data management (MWDM).
Client Marathon Oil Corporation-Houston(USA)
Role Senior ETL Developer
Duration Feb 2008 To Dec 2008
Environment Informatica PowerCenter 8.1.1/8.6, Oracle9i, SQL Server, Unix, Windows 2000.
Project Description:
Marathon Oil Corporation (NYSE: MRO) is an integrated international energy company engaged in exploration and production; oil sands mining; integrated gas; and refining, marketing, and transportation. Operating across the globe, Marathon is among the world's leading integrated energy companies, applying innovative technologies to discover and develop valuable energy resources, providing high-quality products to the marketplace, and delivering value to all of the Company's stakeholders.
Roles and Responsibilities:
Participating in the ETL design phase and creating ETL design specs for mappings.
Understanding existing business model and customer requirements.
Applying all detective and corrective rules and loading the data into MWDM tables.
Applying all QA controls and loading data into the error log table.

7. Project Name Data information factory and Research Information Factory.
Client Pfizer USA
Role Senior ETL Developer
Duration April 2007 To Feb 2008.
Environment Informatica 8.1, Oracle9i, Unix, Windows 2000, Business Objects 6.5
Project Description: Pfizer pharmaceuticals help over 150 million people throughout the world live longer, healthier lives. With medicines across 11 therapeutic areas, Pfizer helps to treat and prevent many of the most common, and most challenging, conditions of our time. It also builds partnerships to help ensure access to its medicines and to educate and empower consumers. And, with a broad range of animal vaccines and medicines, it helps to protect the health of both pets and farm animals.
Roles and Responsibilities:
Understanding existing business model and customer requirements.
Providing optimal Data warehousing solutions to the Business.
Using Informatica PowerCenter 8.1.1 to load historical and incremental data into dimension/fact tables.
Designing ETL Data flows from Source systems to Target systems.

8. Project Name GSCB (Global Solution Center in Bangalore)
Client HP
Role ETL Developer
Duration July 2006 To March 2007
Environment Informatica 8.1, Oracle9i, Unix, Windows 2000, Business Objects 6.5
Project Description: The Global Solution Center in Bangalore (GSCB) provides tailored services to meet specific business requirements, along with the use of cutting-edge technologies, leading to high levels of accuracy and speed. GSCB provides end-user and technical-user support.
Roles and Responsibilities:
Understanding existing business model and customer requirements.
Providing optimal Data warehousing solutions to the Business.
Using Informatica PowerCenter 8.1.1 to load historical and incremental data into dimension/fact tables.
Extracting data from multiple source systems.
Designing ETL Data flows from Source systems to Target systems.
9. Project Name GHM for D&B (Global Hygiene Match)
Client DNB
Role ETL Developer
Duration March 2006 To July 2006.
Environment Informatica 7.1.1, Oracle8i, DB2, Windows 2000, UNIX, MicroStrategy 7.3
Project Description: D&B is the world's leading source of business information and insight, enabling companies to Decide with Confidence for 165 years. D&B's global commercial database contains more than 100 million business records. The database is enhanced by the proprietary DUNSRight Quality Process, which transforms the enormous amount of data collected daily into decision-ready insight.
Roles and Responsibilities:
Using Informatica PowerCenter 7.1.1 to load historical and incremental data into dimension/fact tables.
Designing ETL Data flows from Source systems to Target systems.
Preparation of the unit test plan, system test cases and test results with the test data.
Migrating data from OLTP system to OLAP system.
Using transformations such as Aggregator, Source Qualifier, Joiner, Expression, Update Strategy, Lookup, and Sequence Generator.


10. Project Name Enterprise Data Warehouse (EDW)
Client Bellsouth (BIST)
Role ETL Developer
Duration April 2005 To March 2006
Environment Informatica 7.1.1, Oracle8i, Sybase, Windows 2000, UNIX, and MicroStrategy 7.3
Project Description:
BellSouth Corporation is a Fortune 100 telecommunications service company headquartered in Atlanta, USA, serving more than 44 million customers in the United States and 14 other countries.
Roles and Responsibilities:

Providing optimal Data warehousing solutions to the Business.
Creating wrapper script and workflows for TNG Error scenarios.
Worked on Data Marts, Data Cleansing, Defects, End User Tickets and TNG.



Education:
Master of Computer Applications, Tagore Engineering College, Madras University, 2004.
