
Krishna - Data Architect/Modeler/Engineer-AWS
[email protected]
Location: Princeton, New Jersey, USA
Relocation: Yes
Visa: H1B
Professional Summary

Highly creative and skilled BI and Data Architect with extensive experience in designing, implementing, and optimizing cloud data warehousing solutions, primarily on the Snowflake platform and Tableau. Committed to delivering efficient, scalable data architecture that enables data-driven decision-making for businesses. Seeking opportunities to apply expertise in Snowflake and data engineering to drive innovation and success. Dedicated BI and Data Consultant with more than 15 years of experience leading and collaborating on Data Warehousing, Data Modeling, Data Quality, ETL, and Business Intelligence initiatives. Brings in-depth technical knowledge to project road mapping, launch, and provisioning, with an organized approach to balancing technical and functional tasks.


Experienced Data Architect/Data Modeler and Data Analyst with high proficiency in requirement gathering and data modeling.
Good understanding of and hands-on experience with both Agile and Waterfall environments.
Skilled in identifying and resolving Snowflake database/data issues, as well as issues in AWS Redshift and Kafka.
Excellent knowledge of migrating servers, databases, and applications from on-premises environments to AWS and Snowflake.
Design data warehouse architecture after understanding and analyzing legacy databases, including Hadoop/Big Data sources.
Experience in designing and modeling data warehouses using tools such as Erwin, PowerDesigner, and ER/Studio.
Expert in generating on-demand and scheduled reports for business analysis and management decision-making using various BI tools, including Tableau, OBIEE, Oracle BI Applications, QlikView, and AWS QuickSight.
Design data marts using dimensional data modeling with star and snowflake schemas.
Experience in Logical Data Model (LDM) and Physical Data Models (PDM) using Erwin data modeling tool.
Extensive experience in development of PL/SQL, Stored Procedures, Triggers, Functions, Packages, performance tuning and optimization for business logic implementation.
Extensive knowledge in architecting Extract, Transform, Load (ETL) environments using Informatica PowerCenter.
Strong expertise in Amazon Redshift, S3, AWS Glue, Terraform, and other AWS services.
Experience in Python programming and UNIX shell scripting for processing large volumes of data from varied sources and loading them into distributed MPP databases such as Vertica, Teradata, Snowflake, and Redshift; also extensive experience with Oracle, SQL Server, and MySQL.
Excellent technical and analytical skills with clear understanding of design goals of ER modeling for OLTP and dimension modeling for OLAP.
Excellent knowledge of the Ralph Kimball and Bill Inmon approaches to Data Warehousing.
Excellent knowledge of data validation, data cleansing, data verification, and identifying data mismatches.
Experience in data scrubbing/cleansing, data quality, data mapping, data profiling, and data validation in Informatica IDQ.
Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export
Strong analytical and problem-solving skills, excellent communication and presentation skills, and a good team player.

Education and Certifications
MCA - Master of Computer Applications - JNT University, Hyderabad - 06/2007
BCA - Bachelor of Computer Applications - Andhra University, Hyderabad - 05/2004
Certified - Tableau Desktop Specialist - June 2020
Certified - OBIEE Foundation Specialist - August 2015
Completed various badges in Snowflake University
Trained in Snowflake Data Warehouse Core
Trained in Azure Cloud Fundamentals
https://www.credly.com/users/krishna-swayampakula/badges

Technical Skills
BI TOOLS: Tableau 9.x/10.x/2020.x, Tableau Prep, Snowsight, OBIEE 10g, OBIA 7.9.6x, QlikView 12, OAS, OBI Publisher, AWS QuickSight, Power BI.
ETL TOOLS: Informatica PowerCenter, IDQ and Informatica Cloud, DAC 10g/11g, AppWorx, Airflow, GitHub, Git, Python, Fivetran HVR, AWS Glue.
OS: MS DOS, Windows 95/98/2000/2003, Windows XP.
DATABASES: AWS Redshift, Netezza, Snowflake, Oracle 11i/10g (SQL), DB2, SQL Server, Teradata, NoSQL.

Professional Experience
Pamten Inc 12/2021 - Till Date

Client: Blue Owl - USA 06/2023 - Till Date
Role: BI and Data Architect

Responsibilities:
Working as lead Data Modeler, responsible for designing and building an Enterprise Data Warehouse for the Capital Markets domain covering Front Office, Middle Office, and Back Office credit business.
Build the architecture in Snowflake, with SQL Server and Azure data storage as sources for the Equity and Credit business models.
Responsible for data governance, with Snowflake serving as the centralized data hub.
Reverse engineer source system tables and design ER diagrams for various transactional tables.
Responsible for designing reports and migrating existing reports from Excel to Power BI and Tableau.
Manage code deployments using GitHub and Jira.
Tools Used: Snowflake, Azure, Temporal.io, Power BI, Tableau 2020, Bitbucket, Jira, MS SQL Server, Erwin, MDM, SQL Server data modeler. Domain: Capital Markets, funds processing, equity and credit business.


Client: Quiet Platforms - USA 12/2022 - 06/2023 (Remote)
Role: Data Architect DBE Team

Responsibilities:
Responsible for designing and building the data warehouse model for Quiet Platforms supply chain management data in Snowflake.
Built and documented the Snowflake architecture for the entire data warehouse.
Created and documented data lineage and data dictionaries for all source and warehouse tables.
Reverse engineered source system tables and designed ER diagrams for various transactional tables.
Created ETL pipelines using GCP buckets and Temporal.io (a sketch follows this section).
Administered various Snowflake compute and storage scenarios, implementing cost-effective methodologies.
Managed code deployments using GitHub and Jira.
Developed and implemented MDM strategies to ensure data accuracy, consistency, and integrity across the organization.
Implemented an MDM platform on Snowflake to centralize master data management, improving data accessibility and usability.
Extensively used Fivetran HVR for data migration and replication from SQL Server to Snowflake.
Tools Used: Snowflake, GCP buckets, Temporal.io, Tableau 2020, Bitbucket, Jira, Datadog, Cassandra, MongoDB, DBT, MS SQL Server, Erwin, MDM, SQL Server data modeler, Fivetran HVR.
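
Below is a minimal, illustrative Python sketch of the kind of GCP-bucket-to-Snowflake load step referenced above, not the engagement's actual code; the bucket, table, stage, and connection parameters are hypothetical placeholders, and credentials handling is omitted.

import snowflake.connector
from google.cloud import storage

def load_shipments(bucket_name: str, blob_path: str) -> None:
    # Download the raw extract from the GCS bucket.
    gcs = storage.Client()
    gcs.bucket(bucket_name).blob(blob_path).download_to_filename("/tmp/shipments.csv")

    # Push the file to the table's internal stage, then COPY it into the table.
    conn = snowflake.connector.connect(
        account="myaccount", user="etl_user", password="***",
        warehouse="ETL_WH", database="SUPPLY_CHAIN", schema="RAW",
    )
    try:
        cur = conn.cursor()
        cur.execute("PUT file:///tmp/shipments.csv @%SHIPMENTS_RAW")
        cur.execute("COPY INTO SHIPMENTS_RAW FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)")
    finally:
        conn.close()

load_shipments("quiet-extracts", "daily/shipments.csv")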

Client: State Street Financials - USA 12/2021 - 11/2022
Role: Data Architect, DBE Team - Bridgewater, NJ
Responsibilities:
Developed ETL pipelines using AWS Redshift for an internal data mart; responsible for migrating the data mart from IBM Netezza to AWS Redshift.
Created deployment artifacts and deployed them to the AWS cloud platform.
Updated PL/SQL procedures to AWS standards; created multiple Snowflake Streams to capture delta records arriving from AWS S3.
Used Python to extract application data in XML/JSON format.
Used Python loops for data transformation and masking (a sketch follows this section).
Managed code deployments using GitHub and Jira on AWS Redshift; designed and maintained management and ETL status dashboards in Tableau.
Used Python programming and UNIX shell scripting for processing large volumes of source data and loading it into distributed MPP databases such as Snowflake and Redshift.
Developed and executed comprehensive MDM strategies using AWS and Snowflake, ensuring seamless integration, standardization, and governance of data arriving from Hadoop/Big Data sources.
Designed and implemented scalable, secure cloud-based MDM architecture leveraging AWS services such as Amazon S3 and Amazon RDS.

Tools Used: AWS Redshift, AWS Glue, Git, IAM, Tableau 2020, Netezza, MDM, SQL, Airflow DAGs, Fivetran HVR, Kafka, Hadoop. Domain knowledge of financial services, including account and digital processing.
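
A minimal sketch of the Python transformation/masking loop mentioned above; the input layout (one JSON document per line) and the specific fields being masked are hypothetical.

import json

SENSITIVE_FIELDS = {"ssn", "account_number", "email"}

def mask_record(record: dict) -> dict:
    # Replace sensitive values with a fixed mask; pass everything else through.
    return {k: ("***MASKED***" if k in SENSITIVE_FIELDS else v) for k, v in record.items()}

with open("extract.json") as src, open("extract_masked.json", "w") as dst:
    for line in src:  # one JSON document per line
        dst.write(json.dumps(mask_record(json.loads(line))) + "\n")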

DXC Technology - USA 06/2019 - 12/2021
Princeton, NJ

Client: OneAmerica USA 05/2021 - 11/2021
Role: Data Architect - Bridgewater, NJ
Responsibilities:
Designed and developed virtual warehouses and created data pipelines with Snowpipe on the Snowflake Enterprise Data Warehouse.
Scheduled Snowflake Tasks to load insurance claims data arriving from AWS S3 into transient tables that support Tableau reporting.
Designed Tableau reports using data extracts from the Snowflake virtual warehouse; created multiple Snowflake Streams to capture delta records arriving from AWS S3.
Used Python loops for data transformation and masking.
Created and configured data retention timelines for various source tables in Snowflake; managed source data in CSV, Parquet, and JSON formats; created stages and configured AWS integration on Snowflake for smooth data flow (a sketch follows this section).
Created data pipelines using Python for JSON and Parquet formats.
Employed DBT and Snowflake's ETL capabilities to efficiently integrate data from various sources, transforming raw data into consistent, high-quality master data.
Implemented data quality controls and validation rules using Snowflake and AWS services to improve data accuracy and reliability in MDM processes.
Tools Used: Snowflake 5.43, AWS S3, AWS Glue, Terraform, IAM, Tableau 2020, DBT, SnowSQL, Fivetran.
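
A minimal sketch of the stage/stream/task setup described above, issued through the Snowflake Python connector; all object names, the S3 URL, and the schedule are hypothetical, and the storage integration/credentials for the external stage are omitted.

import snowflake.connector

DDL = [
    # External stage over the S3 landing bucket (credentials omitted).
    """CREATE STAGE IF NOT EXISTS claims_stage
       URL = 's3://claims-landing/daily/'
       FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)""",
    # Stream that captures delta records on the raw table.
    "CREATE STREAM IF NOT EXISTS claims_delta ON TABLE CLAIMS_RAW",
    # Task that moves only changed rows into the reporting table, hourly.
    """CREATE TASK IF NOT EXISTS load_claims
       WAREHOUSE = ETL_WH
       SCHEDULE = '60 MINUTE'
       WHEN SYSTEM$STREAM_HAS_DATA('CLAIMS_DELTA')
       AS INSERT INTO CLAIMS_REPORTING SELECT * FROM claims_delta""",
    "ALTER TASK load_claims RESUME",
]

conn = snowflake.connector.connect(
    account="myaccount", user="etl_user", password="***",
    warehouse="ETL_WH", database="CLAIMS", schema="RAW",
)
try:
    for statement in DDL:
        conn.cursor().execute(statement)
finally:
    conn.close()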

Client: Corteva Agriscience USA 11/2020 - 05/2021
Role: Data Architect Princeton, NJ
Responsibilities:
Modeled data from the legacy DB2 data warehouse and created data pipelines using Snowpipe and Snowflake Streams to migrate it to SQL Server, then transformed the data to replicate legacy financial application screens as Power BI reports.
The migration to Snowflake also strengthened data governance and compliance: the platform's metadata management capabilities and built-in governance frameworks were used to establish data standards, define data lineage, and enforce data privacy and security policies in line with industry regulations such as GDPR.
Tools Used: Erwin data modeler, Tableau, Power BI, DB2, Snowflake, DBT, SQL Server, Azure Blob Storage.

Client: Blue Cross Blue Shield California 05/2020 - 10/2020
Role: Data Architect
Responsibilities:
Designed and implemented Snowflake data warehouse architecture, enabling seamless integration of structured and semi-structured data from various sources.
Developed and optimized ETL/ELT workflows to load, transform, and process petabytes of data in real time.
Collaborated with data engineering and business intelligence teams to ensure data consistency and accuracy in reporting.
Implemented data security measures, including role-based access control and encryption, to maintain data privacy and compliance (a sketch follows this section).
Conducted performance tuning and query optimization to enhance system efficiency and reduce processing time by 40%.
Led the migration of on-premises data infrastructure to Snowflake on AWS, resulting in significant cost savings and improved scalability.
Utilized Informatica's real-time data replication features to ensure master data consistency across applications and systems.
Implemented robust data governance policies and access controls on AWS and Snowflake platforms to protect sensitive master data and comply with data privacy regulations.
Leveraged Snowflake's metadata management capabilities and AWS services to maintain comprehensive metadata repositories for MDM entities, enabling data lineage and governance.
Worked with Hadoop/Big Data platforms to integrate legacy data sources.
Tools Used: Snowflake, AWS S3, AWS Glue, DBT, Informatica PowerCenter 10.2 HF2, Tableau, SQL, AutoSys, Hadoop, Spark.
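
A minimal sketch of the role-based access control pattern described above, again via the Snowflake Python connector; the role, warehouse, schema, and service-user names are hypothetical placeholders.

import snowflake.connector

GRANTS = [
    "CREATE ROLE IF NOT EXISTS BI_READER",
    "GRANT USAGE ON WAREHOUSE REPORTING_WH TO ROLE BI_READER",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE BI_READER",
    "GRANT USAGE ON SCHEMA ANALYTICS.MARTS TO ROLE BI_READER",
    # Read-only access to current and future tables in the mart schema.
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE BI_READER",
    "GRANT SELECT ON FUTURE TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE BI_READER",
    "GRANT ROLE BI_READER TO USER tableau_svc",
]

conn = snowflake.connector.connect(
    account="myaccount", user="admin_user", password="***", role="SECURITYADMIN",
)
try:
    for grant in GRANTS:
        conn.cursor().execute(grant)
finally:
    conn.close()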

Client: Pfizer Inc 06/2019 - 05/2020
Role: Data Architect (Pharma) - Peapack-Gladstone, NJ
Responsibilities:
Designed and developed various Tableau reports and the associated Informatica PowerCenter interfaces that extract the data feeding those reports.
Created various data pipelines using Informatica and PL/SQL interfaces at the ETL level. Responsible for the Data Mapping, Integration and Change management documents and versions.
During the migration, evaluated the existing data infrastructure, identified key pain points and inefficiencies, and worked closely with Informatica consultants on a migration plan tailored to pharmaceutical data requirements, covering clinical trial data, patient records, regulatory information, and manufacturing data.
Used Informatica's data integration and data quality capabilities to build a centralized, unified view of the data: extracting, transforming, and loading data from disparate sources while cleansing and enriching it for pharmaceutical research, compliance, and reporting purposes.

Tools Used: Informatica PowerCenter 10.2 HF2, SQL, Control-M, Tableau. Domain: Pharma data.


IBM - USA 07/2013 - 05/2019

Client: Portland General Electric - USA 06/2018 - 05/2019 Portland, OR
Role: BI Architect and Onsite Coordinator
Responsibilities:
Designed and developed various interfaces using Informatica PowerCenter as part of the Transmission, Distribution, and Generation metrics. Performed performance tuning on complex application interfaces running in daily batch mode.
Converted metric requirements into Informatica interfaces; developed, unit tested, and tracked any bugs/issues raised.
Created dashboards for various metrics in Tableau and produced data lineage documents as a value-add. Gained knowledge of Oracle Utility Analytics and the utilities industry.
Tools Used: Informatica PowerCenter 10.2, SQL, AppWorx, Tableau 10, Oracle Utility applications.

Client: MetLife USA 01/2018 - 06/2018 Dublin, OH
Role: Data Analyst and BI Developer
Responsibilities:
Worked as a BI Architect and Data Analyst on various interfaces as part of the integration data mapping.
Responsible for two major integration data mappings from XML sources to Hadoop.
Created various transformation rules to meet the target file structures and built several interfaces using Informatica PowerCenter.
Developed various rules to process data quality using Informatica IDQ.
Responsible for the Data Mapping, Integration and Change management documents and versions.
Responsible for the entire BI modeling and maintenance during the OBIEE-to-Tableau migration.
Assisted leadership by completing executive research and preparing materials, and worked with the testing team to review test cases in conjunction with functional requirements.
Converted data requirements into Jira stories and tracked any bugs/issues raised. Gained knowledge of insurance claims, customer structure, and amendments.
Tools Used: Informatica PowerCenter 10.2, Informatica IDQ, SQL, Jira, AutoSys, OBIEE, XML, Hadoop.

Client: Pearson Education USA 07/2017 - 12/2018 Hoboken, NJ
Role: Consultant BI
Responsibilities:
Designed and Implemented Tableau 10.2.3 for Rights & Permissions and Oracle BI 11.1.1.9.1 for Royalties Module of Reporting solution for the Pearson Royalty department.
Designed 6 Projects and 150+ workbooks from 25+ Legacy Data sources with Oracle and Flat files.
During the requirements-gathering phase, defined and documented requirements for 17 reporting categories and 110+ canned reports, and developed the reports in OBIEE under various dashboards for the Royalties module.
Conducted workshops to gather requirements. Involved in data validation of the results in Tableau by validating the numbers against the data in the database tables by querying on the database.
Worked on various reports using best practices and different visualizations like Donut charts, Bars, Lines and Pies, Maps, Scatter plots, Bubbles, Bullets, Heat maps and Highlight tables in Tableau.
Tools Used: OBIEE, BI Apps, Tableau, Oracle Discoverer, SQL, Erwin data modeler, Informatica PowerCenter, DAC.

Client: TIAA-CREF USA 06/2016 - 06/2017 Iselin, NJ
Role: Consultant BI
Responsibilities:
Developed custom reports/Ad-hoc queries using Tableau and assigned them to application specific dashboards.
Developed different kinds of reports (pivots, charts, tabular) using global and local filters in Tableau 9.
Involved in administration tasks such as Setting permissions, managing ownerships and providing access to the users and adding them to the specific group for Rights & Permissions Module.
Developed complex calculated fields for the business logic, field actions, set parameters to include various filtering capabilities for the dashboards, and to provide drill down features for the detailed reports.
Provided production support to Tableau users and wrote custom SQL to support business requirements. Integrated Tableau with OBIEE dashboards using the SOAP protocol and WSDL. Designed and developed suitable user interfaces such as dashboards, multiple chart types, trends, custom requests for Excel export, and objects for management dashboard reporting using Power Pivot.
Involved in DEV, Stage, and Production server installation and configuration of OBIEE and Informatica 8.6.1.
Tools Used: Informatica Power Center 8.6.1, Tableau 9 and OBIEE, IBM Information Analyzer.

Client: Macy's USA 07/2013 - 05/2016 Hyderabad, India
Role: Consultant BI
Responsibilities:
As a team lead, responsible for analysis, design, development, testing, and implementation of customized Tableau solutions for new implementations and upgrades for IBM's US customer Macy's.
Led the technical team for the HFA project, which included Financial, Procurement & Spend, Employee Expenses, and Enterprise Asset Management analytics implementations with IBM Maximo.
Delivered custom implementations such as Green Grade2, the Fedfar enhancement, and production load performance improvements involving both Tableau and OBIEE reports.
Created interactive dashboards for forecasting with reference lines, combination charts, quick filters, and parameters. Scheduled data refreshes in OBIEE for weekly and monthly increments based on business changes to ensure views and dashboards displayed the updated data accurately.
Designed and created metadata and ETL solutions to aid the reporting process across the enterprise. Worked with onshore and offshore teams to ensure on-time project delivery.
Maintained and monitored data documentation for all data warehouse and reporting formats. Responsible for effective communication between the project team and the customer.
Provided day-to-day direction to the project team and regular project status updates to the customer.
Utilized in-depth functional and technical experience with Oracle BI Analytics and other leading-edge products, combined with industry and business skills, to deliver solutions to the customer.
Tools Used: Informatica PowerCenter 8.6.1, OBIEE, ODI, PL/SQL, Unix, Oracle Exadata, IBM Maximo asset management, Information Analyzer.

Infosys Ltd. - India 04/2012 - 06/2013

Client: University of Melbourne - Australia
Role: Consultant BI
Responsibilities:
The University needed to upgrade from Oracle 11i to Release 12 to ensure business continuity, as Oracle support for 11i was due to cease in November 2013; the existing reporting landscape was also spread across two different architectures to address the needs of single-domain and cross-domain users.
Involved in DEV, Stage, and Production server installation and configuration of OBIEE 10.1.3.4.1, OBIA 7.9.6.2, Informatica 8.6.1, and DAC. Worked as a BI Admin involved in the migration of the RPD and Web Catalog from DEV to UAT to Prod.
Involved in data reconciliation of selected reports out of the 255 predefined reports under Financial BI Apps. Enabled the EBS single sign-on security model, matching the predefined security model shipped with Financial BI Apps.
Tools Used: Informatica PowerCenter 8.6.1, OBIA 7.9.6.2, Teradata, SQL, DAC, OBIEE.



CSC - Computer Sciences Corporation 02/2011 - 04/2012

Client: Beckman & Coulter INC - USA
Role: Consultant BI
Responsibilities:
Monitored the daily production load and its performance. Involved in DEV and Production server configuration of OBIEE 10.1.3.4.1, OBIA 7.9.6.2, Informatica 8.6.1, and DAC. Performed data validation and fixed issues.
Developed and deployed code changes for reported issues. Resolved Remedy tickets raised by users, submitting fixes along with change management, migration, and Unit Test Plan (UTP) documents.
Tools Used: Informatica PowerCenter, OBIEE, DAC, SQL, Oracle EBS.

Mahindra Satyam - India 07/2007 - 01/2011

Client: Old Mutual Cape Town, South Africa 12/2009 - 01/2011
Role: Senior Software Developer - BI & Onsite Coordinator
Responsibilities:
Involved in DEV, Stage, and Production server installation and configuration of OBIEE 10.1.3.4.1, OBIA 7.9.6.2, Informatica 8.6.1, and DAC. Administered the complete migration of the RPD and Web Catalog from DEV to UAT to Prod.
Involved in data reconciliation of selected reports out of the 255 predefined reports under Financial BI Apps.
Enabled the EBS single sign-on security model, matching the predefined security model shipped with Financial BI Apps. Configured action links providing drill-back to the source EBS system.
Configured the SMTP mailbox and managed client-centric BI Delivers. Configured BI Publisher, Briefing Book Reader, and MS Office integration. Managed security privileges for each subject area and dashboard according to user requirements.
Tools Used: Informatica PowerCenter, OBIEE, OBIA Financials, BI Publisher, DAC, SQL, Oracle EBS.

Client: Cisco USA 07/2007 - 11/2009
Role: Senior Software Developer - BI
Responsibilities:
Worked extensively with Physical Layer, Business Layer and Presentation Layer.
Developed different kinds of executive-level reports and dashboards providing insight into sales reporting. Generated customized reports using drill-downs and aggregation tables to meet business requirements. Involved in requirement gathering, interacting with business analysts and business users.
Extensive experience working with views, including charts, pivot tables, and narratives; also created selectors to drive interactivity in Business Intelligence requests.
Tools Used: OBIEE, BI Publisher, SQL, Oracle EBS.
