
Rambabu - Azure Cloud Engineer / Phone: 404-382-9211
[email protected]
Location: Dallas, Texas, USA
Relocation: yes
Visa: H1B

Professional Summary:

Highly experienced and motivated cloud computing professional with 20 years of experience designing, deploying, and managing Azure/AWS infrastructure. Skilled in developing and implementing best practices for cloud operations, backup and recovery, disaster recovery, security, and performance optimization. Proven track record of leading teams and delivering high-quality results.
Experienced Azure architect, providing custom cloud solutions that promote best practices and security governance.
Excellent problem-solving skills and ability to troubleshoot complex technical issues in Azure environments.
Migrate infrastructure to the Azure IaaS cloud: identify applications with stronger privacy requirements to leave on-premises and move the non-critical, non-sensitive infrastructure components to Azure IaaS.
Create functional design specifications, cost management reports (Apptio) based on LOB budgets, and Azure/AWS reference architectures, and assist with other project deliverables as needed.
Experience with the APIM core components: developer portal, gateway, and Azure portal.
Implemented Azure API Management (APIM), security, firewall configuration (inbound/outbound rules), and cloud-to-cloud integration (public, private).
Good knowledge of and experience with DevOps scripting and tools: Ansible, Bash, Git, Artifactory, and Terraform.
Knowledge of best practices and IT operations in an always-up, always-available service including Infrastructure Design Patterns.
Experience in handling PII information and defining strategies that follow Data Privacy guidelines by using TDM tools.
Experience providing comprehensive support to L3 project teams, ensuring timely and efficient project execution.
Provide recommendations on cloud migrations and cloud infrastructure best practices based on clients' needs, then prepare technical implementation roadmaps for Azure adoption.
Experience in data modeling using data modelers: star/snowflake schemas, fact and dimension tables, and physical and logical data modeling.
Excellent SQL skills and strong hands-on experience with scripting languages such as Python, along with cloud technologies.
Implemented Azure PIM authentication for admin operations to improve the security score.
Design state-of-the-art technical solutions on Azure that address customers' requirements for scalability, reliability, high availability, security, business continuity, and performance, and that leverage existing investments in Azure/Microsoft platforms.
Migrate customers' on-premises workloads into Azure and increase their consumption of the platform by providing architecture and deployment guidance, supporting the development of each customer's cloud adoption model, and providing appropriate recommendations to overcome blockers.
Deployed Azure IaaS virtual machines (VMs) and Cloud services (PaaS role instances) into secure VNets and subnets by using Terraform scripts.
Define cloud architecture, design, and implementation plans for hosting complex application workloads on MS Azure.
Architect solutions using MS Azure PaaS services such as Azure Synapse (workspace, studio), ML, Postgres, SQL Server, HDInsight, Service Bus, etc.
Played ETL Data Architect role and provided cloud solutions for Snowflake database.
Responsible for collaborating on and setting cloud vision; providing thought leadership in cloud infrastructure and cloud services architecture to meet operational objectives for cloud solutions.
Educate customers of all sizes on the value proposition of managed services on Azure and participate in architectural discussions to ensure solutions are designed and provided for successful deployment in the cloud.
Implemented best practices to reduce spend by identifying unused or over-provisioned Azure resources and decommissioning them (see the sketch below).
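
A minimal sketch of the kind of clean-up automation described above, flagging unattached managed disks as decommission candidates. It assumes the azure-identity and azure-mgmt-compute packages; the subscription ID is a placeholder.

```python
# Sketch: list unattached managed disks in a subscription as candidates
# for decommissioning. Assumes azure-identity and azure-mgmt-compute
# are installed; the subscription ID is a placeholder.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

compute = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")

# A managed disk with no managed_by reference is not attached to any VM,
# which makes it an orphan candidate (snapshot first, then decommission).
for disk in compute.disks.list():
    if disk.managed_by is None:
        print(f"Orphaned disk: {disk.name} ({disk.disk_size_gb} GB)")
```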

Education:

Bachelor's degree in Electrical and Electronics Engineering (EEE) from JNTU Hyderabad, India.

CERTIFICATIONS:
Microsoft Azure Fundamentals (AZ-900).
NCFM Certification from NSE (National Stock Exchange, India).
Core Java Certification from Sun Microsystems.

Professional Experience:
Kyndryl Inc., New York, NY (Remote) Aug 2022 - Present
Azure Infrastructure & Operations
Projects: Cloud Infrastructure

Responsibilities:

Excellent problem-solving skills and ability to troubleshoot complex technical issues in Azure environments.
Proficient in Azure cloud services, including Azure Virtual Machines, Azure SQL Database, Dedicated SQL Pool, Azure Storage, Azure Networking, Azure Data Lake, Azure Synapse, and Azure Active Directory.
Strong understanding of data governance principles, including data classification, data lineage, data masking, and data retention policies.
Experience with Azure security and compliance features, such as Azure Security Center, Azure Policy, and Azure Key Vault.
Administered Azure subscriptions, resource groups, and role-based access control (RBAC) to ensure proper governance and security.
Configured Azure PIM (just-in-time) authentication for admin operations to improve security for all customers.
Implemented well-structured access on Gen2 Data Lake storage using ACL controls.
Monitored Azure services and resources using Azure Monitor and implemented automated alerts and notifications for proactive issue resolution.
Deployed and managed Azure virtual machines, App Service, Azure SQL Database, Azure OpenAI, and other PaaS offerings.
Worked closely with development teams to deploy applications to Azure and optimize their performance and scalability.
Conducted regular backups and disaster recovery tests to ensure business continuity and data protection.
Developed numerous PowerShell and Terraform scripts for automation, and implemented infrastructure as code (IaC) using ARM templates.
Developed an Azure CLI script for creating AD groups, adding members, and granting user access (see the sketch after this list).
Built relationships between clients, stakeholders, and project teams, facilitating effective communication and ensuring alignment with project objectives.
Provide comprehensive support to L3 project teams, ensuring timely and efficient project execution.
Collaborate with cross-functional teams to gather project requirements and develop detailed project plans.
Monitor project timelines and deliverables, identifying potential risks and implementing mitigation strategies.
Implemented data migration solutions from on-premises to Azure using backup/restore, SQL Server Migration Assistant, and Azure migration tools.
Implemented Azure Synapse (dedicated SQL, serverless SQL, and notebooks) for data ingestion and analysis across all pipelines.
Built many pipelines in Synapse and used serverless SQL pools with Data Lake Gen2 for data processing and analysis.
Used Azure Synapse Apache Spark pools for data processing and moved data to the data lake for BI reporting.
Applied Azure DevOps and automation best practices to achieve highly available and reliable systems.
Participated in strategic project-planning meetings, providing guidance and expertise on system options, risk, impact, and costs vs. benefits, and created and shared operational requirements and development forecasts to allow timely and accurate planning of projects and tasks.
Implemented various security-hardening solutions as part of the Get-Well Plan.
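
A minimal sketch of the AD-group automation mentioned in the bullets above, wrapping the Azure CLI from Python. The group name and member object IDs are hypothetical, and it assumes an already-authenticated az session.

```python
# Sketch: create an AD group and add members via the Azure CLI.
# Requires an authenticated "az" session; names/IDs are placeholders.
import subprocess

def run_az(*args: str) -> None:
    """Invoke the Azure CLI, raising if the command fails."""
    subprocess.run(["az", *args], check=True)

def create_group_with_members(group: str, member_object_ids: list[str]) -> None:
    # "az ad group create" needs a display name and a mail nickname.
    run_az("ad", "group", "create",
           "--display-name", group, "--mail-nickname", group)
    for object_id in member_object_ids:
        run_az("ad", "group", "member", "add",
               "--group", group, "--member-id", object_id)

create_group_with_members("app-team-readers", ["<user-object-id>"])
```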

Skills:
Cloud Services: Virtual Machines, Storage Solutions, Network Security, Database and Synapse Services
Cloud Operations: Resource Access, Deployment, Management, Monitoring, Performance Optimization
Disaster Recovery Planning: Business Continuity Planning, Backup and Recovery Solutions
Security: Network Security, Identity and Access Management, Compliance and Auditing
Team Management: Performance Management, Resource Allocation, Mentoring and Coaching/Runbooks
Development: Building all kinds of resources through automation scripts.

Standard Insurance, Portland, OR Dec 2017 - Aug 2022
Azure Architect & Operations
Projects: Azure Operation/Data Governance

Responsibilities:

Administered and provided operations support for Azure subscriptions, resource groups, and role-based access control (RBAC) to ensure proper governance and security.
Helped develop a high-performing cloud operations team, mentoring junior cloud engineers and helping with customer needs.
Worked with development teams to identify opportunities to optimize cloud infrastructure and reduce costs, drawing on Apptio tool experience.
Provided technical support to customers and resolved cloud-related incidents.
Built relationships between clients, stakeholders, and project teams, facilitating effective communication and ensuring alignment with project objectives.
Provide comprehensive support to L3 project teams, ensuring timely and efficient project execution.
Collaborate with cross-functional teams to gather project requirements and develop detailed project plans.
Monitor project timelines and deliverables, identifying potential risks and implementing mitigation strategies.
Implemented data migration solutions from on-premises to Azure using backup/restore, SQL Server Migration Assistant, and Azure migration tools.
Implemented Synapse (dedicated and serverless SQL) and Azure Data Factory for processing on-premises data (EMR, claims, and enrollments).
Built pipelines in Azure Synapse for data copy and analysis, and generated reports using Power BI.
Developed sync batch and streaming workflows with a built-in Stonebranch scheduler and Bash scripts to automate the Data Lake systems.
Used Azure Synapse Apache Spark pools for analytical data processing and moved data to the data lake for BI reporting (see the sketch after this list).
Work closely with stakeholders to understand and incorporate non-functional requirements into solutions.
Create proofs-of-concept to demonstrate ideas and concepts that may be foreign to customers or prospects.
Overseeing the evolution of infrastructure & communicating with leadership on the state of Azure infrastructure.
Meeting current and future demands for high availability, integrity, and confidentiality of the data.
Applied Azure DevOps and automation best practices to achieve highly available and reliable systems.
Participated in strategic project-planning meetings, providing guidance and expertise on system options, risk, impact, and costs vs. benefits, and created and shared operational requirements and development forecasts to allow timely and accurate planning of projects and tasks.
Improved and maintained current CI/CD pipelines, processes, and automation, enabling development teams to deploy regularly while maintaining high performance levels.
Worked with fellow engineers to define deployment, configuration, and monitoring requirements.
Plan, build, configure, and optimize private and public cloud infrastructure for high-performance, scalable, and reliable consumer websites.
Provided access by creating AD groups and adding members in Azure AD services using Python and Terraform scripts.
Acted as a technical SME for serious incidents and complex issues related to Azure/AWS.
Automated multiple components, including subscription access, resource-group access, and AD group creation and membership, using Python and Terraform scripts.
Implemented various security-hardening solutions as part of the Get-Well Plan.
Proactively cleaned up orphaned resources (disks, snapshots, NICs, NSGs) after securing backups, using automated PowerShell scripts.
Implemented Azure best practices to reduce spend by identifying and decommissioning orphaned resources.
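
A minimal PySpark sketch of the Synapse Spark-pool processing referenced above. The storage paths and column names are placeholders, and in a Synapse notebook the spark session is already provided.

```python
# Sketch: aggregate raw claims data in a Spark pool and write a curated
# set back to the lake for BI reporting. Paths/columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims-bi-prep").getOrCreate()

# Read raw claims data landed in Data Lake Gen2 (abfss path is hypothetical).
raw = spark.read.parquet(
    "abfss://raw@<storageaccount>.dfs.core.windows.net/claims/")

# Aggregate to the grain the BI reports need.
summary = (raw.groupBy("member_id", "claim_month")
              .agg(F.sum("paid_amount").alias("total_paid"),
                   F.count("claim_id").alias("claim_count")))

# Write the curated output back to the lake for Power BI to consume.
summary.write.mode("overwrite").parquet(
    "abfss://curated@<storageaccount>.dfs.core.windows.net/claims_summary/")
```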


Ahold Delhaize, Greenville, South Carolina (SC) Jan 2017 - Dec 2017
Cloud System Engineer
Projects: Delhaize Migration

Responsibilities:

Creating VMs, storage accounts, and virtual networks with applications from the Azure Marketplace.
Creating labs and virtual machines for different environments, along with setting up policies.
Building and installing servers through Azure Resource Manager templates or the Azure portal (see the sketch after this list).
Using Azure Databricks clusters for analytical data processing, loading the warehouse database, and supporting BI reporting.
Migrating an On-premises virtual machine to Azure Resource Manager Subscription with Azure Site Recovery.
Virtual Machine Backup and Recovery from a Recovery Services Vault using Azure PowerShell and Portal.
Worked on Microsoft Azure Storage - Storage accounts, blob storage, managed and unmanaged storage.
Creating and configuring Azure Active Directory services for authenticating applications in Azure cloud services.
Hands-on experience with site-to-site VPNs, virtual networks, network security groups, load balancers, and storage accounts.
Worked on ETL mappings and workflows to load the Snowflake cloud data warehouse for annual reporting.
Granted role-based access control on Azure resource groups for the development and QA environments.
Worked on the Configuration of load-balanced sets and Azure Traffic manager.
Involved in requirements gathering, designing and development phases.
Involved in PI Planning, Agile scrum meetings, daily stand-up meetings and agile sprint planning/review meetings with product owners.
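
A minimal sketch of driving an ARM template deployment from Python, as an alternative to the portal flow described above. It assumes azure-identity and azure-mgmt-resource; the template file, resource group, and parameter values are hypothetical.

```python
# Sketch: deploy an ARM template programmatically. The template file,
# resource group, and parameters below are placeholders.
import json
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.resource.resources.models import (
    Deployment, DeploymentProperties)

client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")

with open("vm_template.json") as f:  # hypothetical exported template
    template = json.load(f)

deployment = Deployment(properties=DeploymentProperties(
    mode="Incremental",
    template=template,
    parameters={"vmName": {"value": "demo-vm-01"}}))

# begin_create_or_update returns a poller; result() blocks until done.
poller = client.deployments.begin_create_or_update(
    "rg-demo", "vm-deployment", deployment)
print(poller.result().properties.provisioning_state)
```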

TD Bank, Mount Laurel, NJ Jan 2015 - Dec 2016
ETL & Data Engineer
Projects: Collection & Recovery

Responsibilities:

Involved in requirements gathering, designing, and development phases.
Expertise in IaaS, PaaS, SaaS, and AAD.
Migrated infrastructure to the IaaS cloud: identified applications with stronger privacy requirements to leave on-premises and moved the non-critical, non-sensitive infrastructure components to Azure IaaS (Infrastructure-as-a-Service).
Experience with Apache Spark, Hadoop, Databricks, Azure Data Factory
Developed batch and streaming workflows with a built-in Stonebranch scheduler and Bash scripts to automate the Data Lake systems.
Worked on ETL mappings and workflows to load the Snowflake cloud data warehouse for annual reporting.
Created data pipelines between multiple data sources and a Spark-based analytics platform (Databricks) using Python (see the sketch after this list).
Upgraded all Windows 2008 servers to Windows 2014 servers: targeted the sunset of Windows Server 2008 and delivered Windows 2014 server infrastructure to the LOB.
Built infrastructure and applications during the migration process.
Developing scripts for build, deployment, maintenance, and related tasks using Jenkins, Docker, Maven, Python, Bash & Kubernetes.
Developed ETL mappings and workflows to build LOB-based data marts, loading the Snowflake cloud data warehouse for BI reporting.
Provided operations support and monitoring for ETL jobs.
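
A minimal Python/Spark sketch of the source-to-Databricks pipeline work described above. The JDBC URL, credentials, and table names are placeholders, and on Databricks the spark session is already provided.

```python
# Sketch: pull a relational table over JDBC and land it as a Delta table
# for analytics. Host, credentials, and table names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("source-to-databricks").getOrCreate()

# Read the source table over JDBC (URL and table are hypothetical).
source_df = (spark.read.format("jdbc")
             .option("url", "jdbc:sqlserver://<host>:1433;databaseName=collections")
             .option("dbtable", "dbo.recovery_accounts")
             .option("user", "<user>")
             .option("password", "<password>")
             .load())

# Land it as a Delta table for downstream analysis and reporting.
(source_df.write.format("delta")
          .mode("overwrite")
          .saveAsTable("collections.recovery_accounts"))
```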

Eldorado Inc. (an Mphasis Ltd company), USA Nov 2013 - Jan 2015
Senior ETL/Database Lead
Project: Javelina Support (Healthcare Claims Process)

Responsibilities:

Worked as a data migration consultant, converting various complex objects such as EMR, customer master data, vendor master data, joint operations agreements, etc.
Converted the SSIS ETL project to Informatica at the client's request.
Played a lead role in designing customer experience profile backend processes.
Created the conceptual model for the data warehouse using the Erwin data modeling tool.
Knowledge of EDI claims in HIPAA transactions and ANSI X12 code sets 837 (I/P/D), 835, 270, and 271, as well as HL7.
Knowledge of coding and billing tools for ICD-10-CM/PCS, CPT, and HCPCS.
Worked with protected health information (PHI), which under US law is any information about health status or the provision of health care.
Designed & developed the reports using Cognos 8 Report Studio, Query Studio, and Analysis Studio reports.
Customized Cognos Connection with appropriate reports and security measures.
Developed ETL mappings and workflows to load input feeds using Informatica PowerCenter.
Utilized and developed AutoSys job schedules using JIL code.
Developed complex mappings in Informatica to load data from various sources.
Well versed in Informatica DVO for data-validation testing, writing SQL queries against sources and targets.
Implemented data cleansing and profiling for dimension tables using Informatica Data Quality (IDQ).
Experience in defining and configuring landing tables, staging tables, base objects, lookups, query groups, queries/custom queries, packages, hierarchies, and foreign key relationships.
Created procedures to truncate data in the target before the session as per requirement.
Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
Used Python scripts to migrate data from the legacy system to the new system (see the sketch after this list).
Implemented SOAP and REST web services for real-time processing.
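
A minimal sketch of the legacy-to-new migration scripting mentioned above, using pandas with SQLAlchemy. The connection strings, table names, and the cleanup rule are all hypothetical.

```python
# Sketch: extract from a legacy database, apply a cleanup rule, and load
# into the new system's staging table. All names are placeholders.
import pandas as pd
from sqlalchemy import create_engine

legacy = create_engine(
    "oracle+cx_oracle://user:pass@legacy-host:1521/?service_name=LEGACY")
target = create_engine(
    "oracle+cx_oracle://user:pass@new-host:1521/?service_name=CLAIMS")

# Extract a batch from the legacy schema.
df = pd.read_sql("SELECT * FROM legacy_claims", legacy)

# Example transform: normalize a key column (hypothetical rule).
df["member_id"] = df["member_id"].str.strip().str.upper()

# Load into the new system's staging table.
df.to_sql("stg_claims", target, if_exists="append", index=False)
```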

Environment: Informatica ETL, TDM data masking, BI reporting, J2EE, JavaScript, Spring 3.0, Tomcat Server 7.0, XML, SVN, JUnit, Oracle 11g, and Jenkins.

Mphasis Ltd, Bangalore, India Sept 2007 - Nov 2013
Role: ETL/BI Developer
Project: Javelina
Product: Javelina phase I. Javelina is a Java web-based product used for the claims-processing systems of health-care insurance clients. Informatica PowerCenter tools are used for data warehousing and conversion of legacy-system data. We designed various reports using SSRS, BIRT, Tableau, MS SQL Server, and Oracle.

Responsibilities:
Responsible for gathering project requirements by interacting directly with the client and performing analysis accordingly.
Delivered data conversion projects for existing clients using Pentaho 5.2 ETL tools such as Kettle and Spoon; Kettle is a powerful extraction, transformation, and loading (ETL) engine.
Experience in HIPAA transactions and ANSI X12 code sets 837 (I/P/D), 835, 270, and 271.
Created the conceptual model for the data warehouse using the Erwin data modeling tool.
Extracted, scrubbed, and transformed data from flat files, Oracle, SQL Server, and Netezza, then loaded it into the Teradata database using Informatica tools.
Worked on optimizing ETL procedures in Informatica version 8.6.
Implemented logical and physical data modeling with star and snowflake techniques using Erwin in the data warehouse as well as the data mart.
Worked on database-level tuning and SQL query tuning for the data warehouse and OLTP databases.
Used the Informatica Repository Manager to create folders and add users for the new developers.
Developed UNIX shell scripts for calling Informatica mappings and running tasks daily (see the sketch below).
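
A minimal sketch of the scripted workflow runs described above, shown here in Python rather than shell. The pmcmd service, domain, credentials, folder, and workflow names are placeholders.

```python
# Sketch: start an Informatica workflow with pmcmd, as the daily shell
# scripts did. Service/domain/credentials/names are placeholders.
import subprocess

def start_workflow(folder: str, workflow: str) -> None:
    # Add "-wait" to block until the workflow finishes instead of
    # returning once it has been started.
    subprocess.run(
        ["pmcmd", "startworkflow",
         "-sv", "IntSvc_Dev", "-d", "Domain_Dev",
         "-u", "etl_user", "-p", "<password>",
         "-f", folder, workflow],
        check=True)

start_workflow("DWH_DAILY", "wf_load_fact_claims")
```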

Environment: Informatica Power Center 8.6, Pentaho, PL/SQL, DB2, Oracle 10g, Teradata, TOAD, Erwin, SSRS, Unix, SQL Server 2008, Netezza, Query Surge, Windows XP, Visio 2003

Syntel (India) Ltd, India Jul 2006 - Aug 2007
Role: Analyst Programmer
Project: UK Collective Fund Services (CFS), DICE Project

This project helps State Street to report Breaches in the financial functioning of Clients to Risk Management and Compliance (RMC). CFS (Collective Fund Services) has been appointed by a number of clients to perform the pricing administration function for regulated and non-regulated funds.

Responsibilities:
Involved in the requirements gathering for the warehouse. Presented the requirements and a design document to the client.
Created ETL jobs to load data from the staging area into the data warehouse.
Designed and developed complex aggregate, join, and lookup transformation rules (business rules) to generate consolidated (fact/summary) data using Informatica PowerCenter (see the sketch after this list).
Designed and developed mappings using Source Qualifier, Aggregator, Joiner, Lookup, Sequence Generator, Stored Procedure, Expression, Filter, and Rank transformations.
Extensively worked with Netezza database to implement data cleanup, and performance tuning techniques.
Used Kettle for development and Carte for executing scripts.
Developed and maintained optimized SQL queries in the Data Warehouse.
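
A toy sketch approximating the aggregate/join/lookup pattern above in pandas (the real rules were built in Informatica PowerCenter); all tables and columns are invented for illustration.

```python
# Sketch: lookup (join) a dimension, then aggregate to a fact/summary
# grain, mirroring the transformation rules described above.
import pandas as pd

trades = pd.DataFrame({"fund_id": [1, 1, 2], "amount": [100.0, 50.0, 75.0]})
funds = pd.DataFrame({"fund_id": [1, 2], "fund_name": ["Alpha", "Beta"]})

fact = (trades.merge(funds, on="fund_id", how="left")          # lookup
              .groupby(["fund_id", "fund_name"], as_index=False)
              .agg(total_amount=("amount", "sum"),             # aggregate
                   trade_count=("amount", "count")))
print(fact)
```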

Environment: Windows XP/NT, Informatica PowerCenter 8.1, JIRA, UNIX, Netezza, SQL, Erwin, Actuate, SVN, WebLogic Server

Nelito System Ltd, India May 2004 - July 2006
Role: ETL/BI Developer
Project: Merlin and Endeavour projects for Merrill Lynch, UK

Responsibilities:
Extensively worked with data modelers to implement logical and physical data modeling to create an enterprise-level data warehouse.
Created and Modified T-SQL stored procedures for data retrieval from MS SQL Server database.
Automated mappings to run using UNIX shell scripts, which included Pre and Post-session jobs and extracted data from Transaction System into Staging Area.
Extensively used Informatica PowerCenter to extract data from various sources and load into the staging database.
Extensively worked with Informatica Tools - Source Analyzer, Warehouse Designer, Transformation developer, Mapplet Designer, Mapping Designer, Repository manager, Workflow Manager, Workflow Monitor, Repository server and Informatica server to load data from flat files, legacy data.

Environment: Informatica, JIRA, SVN, Actuate Reports, WebLogic, PL/SQL, MS Access, SQL Server, Windows 2000, UNIX

Indus Business Systems Ltd, India Sept 2002 - May 2004
Role: Software Engineer
Project: Med-Plexus - Key solutions Inc. (www.medplexus.com)

Description: MedPlexus is a healthcare provider system for the U.S. healthcare industry in general and for physicians' offices in particular. This information system helps physicians automate their information and process claims, billing, payments, and appointment-processing requirements.

Responsibilities:
Understanding the client requirements for moving data from source to target.
Translated the requirements into designs such as HLD and LLD.
Involved in designing the dimensional model.
Prepared the DDL scripts for creating the tables and supporting table records.
Extensively used Pentaho tools such as Spoon, Carte, and Kitchen, and developed transformations.
Cleansed the source data, standardized vendor addresses, extracted and transformed data with business rules, and built data modules using the Spoon designer.
Developed schedules to automate the update processes, sessions, and batches.
Analyzed, designed, constructed, and implemented ETL jobs using Kitchen (see the sketch below).
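
A minimal sketch of launching a Pentaho Kitchen job from Python, in the spirit of the scheduling described above. The job file, parameter, and log level are placeholders.

```python
# Sketch: run a Kitchen job (.kjb) and surface failures. The job path
# and parameter are hypothetical.
import subprocess

result = subprocess.run(
    ["kitchen.sh",
     "-file=/etl/jobs/load_vendor_dim.kjb",
     "-param:RUN_DATE=2004-05-31",
     "-level=Basic"],
    capture_output=True, text=True)

# Kitchen exits non-zero when the job fails.
if result.returncode != 0:
    raise RuntimeError(f"Kitchen job failed:\n{result.stderr}")
```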

Environment: Pentaho 3.8, iReports, Actuate Reports, JBoss, Windows NT, PL/SQL, Excel, Oracle

Worked as an Electrical Engineer (non-IT) Jun 2000 - Aug 2002