Karthik Nagavelly
Senior Cloud / DevOps Engineer
Email: [email protected]
Mobile: +1 7187137088
Location: Dallas, Texas, USA
Relocation: Anywhere in the USA
Visa: H1B

PROFESSIONAL SUMMARY:
Results-driven DevOps Engineer with over 9 years of experience building and automating robust cloud infrastructure. Possesses deep expertise in leading cloud platforms such as AWS, Azure, and GCP, leveraging tools like Terraform, Ansible, Chef, Puppet, Jenkins, GitLab CI/CD, Kubernetes, and Docker to design, develop, and deploy scalable, secure, and performant cloud solutions. Proven track record of optimizing infrastructure for cost-efficiency, enhancing agility through automation, and fostering collaboration across development and operations teams.
Proficient in implementing cloud solutions using various AWS services, including EC2, VPC, S3, Glacier, EFS, Kinesis, IAM, Lambda, Directory Service, Security Groups, CloudFront, OpsWorks, CloudFormation, Elastic Beanstalk, RDS, SNS, SQS, SES, DynamoDB, Redshift, EMR, ELB, and Route 53.
Experienced in creating and maintaining user and admin-level documents for various DevOps tools.
Skilled in administering operating systems and DevOps tools including Linux, Bitbucket, Jenkins, CircleCI, GitHub, GitLab, JIRA, Nexus, Confluence, VersionOne, and SVN.
Developed the company's DevOps strategy in a mixed environment of Linux (RHEL, CentOS, Ubuntu) servers and implemented a cloud strategy based on Amazon Web Services.
Expertise in building CI/CD pipelines as an iterative process using Jenkins and experience in using the Configuration Management tool Ansible.
Experience in implementing automated testing scripts using tools like KafkaUnit or Confluent's Kafka testing framework to validate Kafka cluster behavior.
Experience with OpenShift system integration, Infrastructure as a service, and Cloud knowledge.
Managed private cloud and hybrid cloud configurations, patterns, and practices on Windows Azure and SQL Azure, including Azure Web and database deployments.
Strong understanding of Citrix XenApp architecture, including versions 6.5 and 7.6.
Experience in designing Cloud architectures for customers looking to migrate or develop new PaaS, IaaS, or hybrid solutions utilizing Microsoft Azure or AWS.
Proficient in creating and managing cloud Infrastructure on AWS and GCP using Terraform and Ansible.
Experience in DevOps Engineering in automating, building, and deploying code within different environments.
Experience with container-based deployments using Docker, working with Docker images, Docker Hub, Docker registries, and Kubernetes.
Experience in developing and implementing infrastructure as code, automated provisioning, and configuring using Chef, Puppet, and Ansible.

TECHNICAL SKILLS:
Cloud Environments: Microsoft Azure, AWS, GCP
Version Control Tools: Git, GitLab, Azure Repos, Bitbucket
Configuration Management Tools: Ansible, Chef, Terraform, CloudFormation
Scripting/Programming Languages: Bash & Shell scripting, Python, Java, .NET
CI/CD Tools: Git, Bitbucket, Maven, ANT, Gradle, Jenkins, Azure Pipelines, GitLab Pipelines, SonarQube
Build Tools: Maven, ANT
Container Tools: Docker, Kubernetes, ECS
Monitoring Tools: Splunk, Graylog, App Insights, ELK, Grafana, CloudWatch, Nagios
Operating Systems: Windows, UNIX, RHEL, CentOS, Ubuntu
Databases: MySQL, NoSQL, MongoDB, DynamoDB, MSSQL, PostgreSQL, Oracle
Web/Application Servers: Apache Tomcat, WebLogic, Oracle Application Server, Nginx, JBoss, Citrix

EDUCATION:
B.Tech (ECE) from Jaya Mukhi Institute of Technological Science, Jawaharlal Nehru Technological University (JNTU-H)

CERTIFICATIONS:
AWS Solutions Architect
Azure Fundamentals

PROFESSIONAL EXPERIENCE:
Client: Fidelity Investments Dallas, Texas June 2023 to Present
Role: Senior Cloud DevOps Engineer
Responsibilities:
Configured Azure Automation Scripts for application deployment, leveraging Azure Compute, Web, and Mobile services, Blobs, Resource Groups, Azure Data Lake, HDInsight Clusters, Azure Data Factory, Azure SQL, Cloud Services, and ARM services.
Developed Terraform code to automate the deployment and management of Azure Databricks workspaces, improving infrastructure scalability and reliability.
Developed HTTP triggers within Azure Functions, integrated with Application Insights for monitoring and load testing, and used the Python API to upload agent logs into Azure Blob Storage (see the Python sketch after this list).
Designed and implemented Service Oriented Architecture using Azure Data Lake Store and Azure Data Factory for data processing.
Utilized Azure Storage Queues for inter-process communication and configured Azure load balancer for traffic distribution.
Performed load testing on Kafka clusters to assess performance under heavy loads and identify bottlenecks.
Designed and implemented CI/CD pipelines to deploy and manage Azure Databricks, ensuring efficient and reliable software delivery.
Created and managed Azure SQL database, including monitoring, restoring, and migration from On-premises Microsoft SQL server.
Worked on container orchestration with OpenShift, leveraging Kubernetes container storage and automation for enhanced multi-tenancy within container platforms.
Deployed .NET and Python applications through Azure DevOps, focusing on development efficiency with Build Pipelines and Test Plans.
Created CI/CD pipelines for applications in Azure DevOps, integrating with GitHub, VSTS, and artifacts, and deploying to Kubernetes cluster.
Used Terraform for Infrastructure as Code, implementing modules for deployment across Azure.
Deployed microservices-based applications on Azure Kubernetes Service (AKS), utilizing an Ingress API gateway, MySQL and SQL databases, and Cosmos DB.
Led container-based application implementation on Azure Kubernetes using Azure Kubernetes Service (AKS) and Kubernetes Cluster.
Implemented Jenkins pipelines into Azure pipelines for microservices builds and deployment to Kubernetes.
Automated infrastructure activities using Ansible playbooks and maintained Docker container clusters managed by Kubernetes.
Configured Unity Catalog in Azure Databricks, managing catalog permissions to ensure secure and controlled access to data and resources.
Used Chef for configuration management within GCP, configuring and networking Virtual Private Cloud (VPC).
Ensured high availability for IaaS VMs and PaaS role instances in Azure, collaborating with business process managers for big data technologies utilization.
Wrote Python and Bash scripts and used Apache Airflow for job scheduling within GCP.
Used Apache Airflow in the GCP Cloud Composer environment to build data pipelines and worked with Terraform to provision environments on the GCP platform.
Conducted integration testing of Kafka with other components of the tech stack to ensure seamless data flow and system reliability.
Integrated Jenkins/Helm/Kubernetes/Vault with GCP for automated releases, created Service accounts, VM instances, and VPC networks using Terraform on GCP.
Automated Compliance Policy Framework for multiple projects in GCP, deployed GKE on GCP with the help of Gitlab-Jenkins-Terraform integration.
Designed and implemented scalable cloud-based web applications using AWS and GCP, migrated applications to the PKS and GCP cloud.
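A minimal sketch of the kind of HTTP-triggered Azure Function used for the agent-log uploads described above. This is a hypothetical illustration only: the container name, connection-string setting, and payload handling are assumptions, not the production code, and the function's binding configuration (function.json) is omitted.
```python
# Hypothetical HTTP-triggered Azure Function that stores a posted agent log in
# Blob Storage; names and settings are placeholders, binding config omitted.
import os
import logging

import azure.functions as func
from azure.storage.blob import BlobServiceClient


def main(req: func.HttpRequest) -> func.HttpResponse:
    """Accept an agent log in the request body and persist it as a blob."""
    log_name = req.params.get("name", "agent.log")
    body = req.get_body()
    if not body:
        return func.HttpResponse("Empty log payload", status_code=400)

    # Connection string supplied via application settings (placeholder name).
    conn_str = os.environ["LOG_STORAGE_CONNECTION_STRING"]
    blob_service = BlobServiceClient.from_connection_string(conn_str)
    blob_client = blob_service.get_blob_client(container="agent-logs", blob=log_name)
    blob_client.upload_blob(body, overwrite=True)

    logging.info("Uploaded %d bytes to agent-logs/%s", len(body), log_name)
    return func.HttpResponse(f"Stored {log_name}", status_code=201)
```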

Environment: Azure Cloud, Azure DevOps, GCP, Terraform, AKS, Ansible, Maven, Azure Monitoring, PowerShell, Windows.

Client: HCL Technologies Chennai, India September 2021 to April 2023
Role: Cloud/AWS DevOps Engineer
Responsibilities:
Implemented AWS solutions using EC2, S3, RDS, DynamoDB, Route53, EBS, Elastic Load Balancer, and Auto Scaling groups.
Configured AWS IAM and security groups in the VPC's public and private subnets.
Created AWS Lambda functions to insert data into and retrieve data from DynamoDB (see the Python sketch after this list).
Managed AWS EC2 instances utilizing Auto Scaling, Elastic Load Balancing, and Glacier for various environments.
Implemented security testing for Kafka to ensure data confidentiality, integrity, and availability.
Utilized AWS Lambda for serverless architecture, deploying via gulp and AWS CloudFormation.
Provisioned highly available EC2 instances using Terraform and CloudFormation.
Migrated legacy and monolithic systems to Amazon Web Services using Terraform.
Managed Kubernetes charts using Helm, creating reproducible builds and managing releases.
Implemented a production-ready, load-balanced, highly available Kubernetes cloud infrastructure.
Created Kubernetes clusters, pods, replication controllers, and services using YAML files.
Used Build Automation tools like ANT, Maven, Artifactory, and Jenkins for continuous integration.
Set up Artifactory Pro as a container with a secure private Docker registry and local Docker repositories.
Designed an ELK (Elasticsearch, Logstash, Kibana) system for monitoring and alerting.
Installed and configured the Apache web server to serve application traffic.
Conducted fault tolerance testing by simulating node failures and network partitions to ensure Kafka cluster resilience.
Implemented a Continuous Delivery framework using Jira, SVN, Bamboo, Maven, Nexus, and Puppet.
Performed server migration from Physical to Virtual (P2V) and Virtual to Physical (V2P).
Deployed infrastructure as code in AWS using Terraform, CloudFormation, and AWS SDKs.
Managed Red Hat LINUX user accounts, groups, directories, file permissions, and Sudo rules.
Used CloudWatch service to monitor and maintain infrastructure, creating alerts for unusual activity.
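A minimal sketch of the kind of Lambda handler behind the DynamoDB inserts and lookups noted above, using boto3. The table name, key, and attributes are hypothetical placeholders, not the actual schema.
```python
# Hypothetical Lambda handler that writes an item to DynamoDB and reads it back;
# table and attribute names are placeholders.
import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("orders")  # placeholder table name


def lambda_handler(event, context):
    """Insert the incoming record, then retrieve it by its primary key."""
    order_id = event["order_id"]

    # Insert (or overwrite) the item.
    table.put_item(Item={"order_id": order_id, "status": event.get("status", "new")})

    # Retrieve it by primary key.
    response = table.get_item(Key={"order_id": order_id})
    item = response.get("Item", {})

    return {"statusCode": 200, "body": json.dumps(item)}
```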

Environment: AWS Lambda, AWS S3, EC2, RDS, AWS Route 53, Jenkins, Terraform, EBS, Linux, Ant, Maven, Datadog, Puppet, Docker, Python, Ansible, Unix, Jira, Git, Kubernetes, DynamoDB, Nginx, Splunk, ELK.

Client: JDA Software Bangalore, India November 2017 to August 2021
Role: DevOps Engineer
Responsibilities:
Managed Azure infrastructure including Azure Web Roles, Worker Roles, SQL Azure, Azure Storage, and Azure AD Licenses.
Implemented Cloud Solutions using various AWS Services such as EC2, VPC, S3, Glacier, Security-Groups, and CloudFront.
Managed Identity & Access Management with Azure Active Directory, Azure Identity, and Multi-Factor Authentication (MFA).
Managed external storage configurations in Azure Databricks, enabling seamless integration with external data sources and improving data accessibility.
Built Jenkins jobs and created Infrastructure from GitHub repositories containing Terraform code.
Provisioned AWS networking Infrastructure like VPC, subnets, and security groups using Terraform reusable modules and Jenkins shared libraries.
Virtualized servers using Docker for test and dev environments, and configured automation using Docker containers.
Wrote Ansible playbooks using Python SSH as the wrapper to manage configurations of Azure nodes, and tested the playbooks on Azure instances using the Python SDK.
Developed automation tools using Shell scripting and PowerShell scripting for Branching, build creation, and deployments.
Performed WebLogic Server administration tasks such as installing, configuring, monitoring, and performance tuning on a Linux environment.
Extensive experience working with Kubernetes for orchestrating the deployment, scaling, and management of Docker containers.
Created Docker images using Docker files, managed images, and Docker volumes, and wrote JSON format scripts to automate and integrate Docker with Kubernetes.
Proficient in virtualization technologies like VMware, Docker for containerization, and Kubernetes for container orchestration.
Used Jenkins pipelines to drive all microservices builds to the Docker registry and deployed to Kubernetes, managing Pods using Kubernetes.
Built and maintained Docker container clusters managed by Kubernetes on GCP, using Linux, Bash, GIT, and Docker.
Developed Private cloud system solution on CoreOS using Kubernetes (Docker weave).
Migrated AWS infrastructure from Elastic Beanstalk to Docker with Kubernetes, managed local deployments in Kubernetes, created local clusters, and deployed application containers.
Managed containers with Docker, writing Dockerfiles, setting up automated builds on Docker Hub, and installing and configuring Kubernetes.
Worked with Kubernetes to manage containerized applications using its nodes, ConfigMaps, selectors, and Services, and deployed application containers as Pods (see the Python sketch below).
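A minimal sketch of deploying an application container as Pods with the official Kubernetes Python client, illustrating the kind of Deployment and Pod management described above. The image, labels, replica count, and namespace are hypothetical placeholders.
```python
# Hypothetical sketch: create a Deployment that runs an application container as
# Pods, using the official Kubernetes Python client; names are placeholders.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a cluster

labels = {"app": "demo-service"}
container = client.V1Container(
    name="demo-service",
    image="registry.example.com/demo-service:1.0.0",  # placeholder image
    ports=[client.V1ContainerPort(container_port=8080)],
)
deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="demo-service", labels=labels),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels=labels),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels=labels),
            spec=client.V1PodSpec(containers=[container]),
        ),
    ),
)

# Submit the Deployment; the controller then creates and manages the Pods.
client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```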

Environment: Azure Cloud, AWS Cloud, Bitbucket, Jenkins, Gradle, Kubernetes, Docker, Terraform, Ansible, Dynatrace, Shell scripting, Linux.


Client: Mercedes-Benz India Pvt. Ltd Bangalore, India January 2015 to October 2017
Role: Build & Release Engineer
Responsibilities:
Configured SAML authentication for AWS accounts using single sign-on and deployed cross-account roles with AWS CloudFormation stack sets.
Automated cloud deployments with Python (boto and Fabric) and AWS CloudFormation templates (see the Python sketch after this list).
Managed IAM Policies, providing access to AWS resources, and designed and refined workflows for access granting.
Deployed and configured ELK (Elasticsearch, Logstash, Kibana) for log analytics and application monitoring, integrating with AWS Lambda and CloudWatch and storing logs and metrics in an S3 bucket via Lambda functions.
Utilized Azure DevOps, GitHub Actions, and Octopus for building CI/CD pipelines.
Implemented CI/CD for applications using Jenkins for integration with GitHub, Maven, JUnit, and Nexus Artifactory, and Ansible for continuous deployment into testing, staging, and production environments.
Created, developed, and tested environments for various applications by provisioning Kubernetes clusters on AWS with Docker, Ansible, and Terraform.
Created Docker images with Docker files, managed Docker containers, and set up Docker Registry for local image storage and download from Docker Hub.
Configured Kubernetes provider with Terraform to interact with Kubernetes resources for creating services such as Deployments, Services, Ingress rules, Config Maps, and Secrets in different Namespaces.
Built and maintained Docker container clusters managed by Kubernetes, using Linux, Bash, GIT, and Docker, and utilized Kubernetes and Docker for the runtime environment of the CI/CD system.
Integrated Kubernetes with HashiCorp Vault for injecting configurations at runtime using init, config sidecars, and persistent volume sharing between app and config containers.
Deployed instances with Ansible playbooks, wrote Ansible modules for integration with Apache Tomcat and AWS, and used Ansible playbooks for deploying applications.
Designed an ELK (Elasticsearch, Logstash, Kibana) system for monitoring and searching enterprise alerts, configured the ELK stack on AWS, used Logstash to output data to AWS S3, and terminated SSL on Nginx to optimize the API servers.
Worked with Red Hat OpenShift Container Platform for Docker and Kubernetes, using Kubernetes for deploying, scaling, load balancing, and managing Docker containers with multiple namespace versions.
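A minimal sketch of automating a CloudFormation deployment with boto3, in the spirit of the Python-driven deployments described above. The stack name, template file, and parameters are hypothetical placeholders.
```python
# Hypothetical sketch of driving a CloudFormation deployment with boto3;
# the stack name, template path, and parameters are placeholders.
import boto3

cloudformation = boto3.client("cloudformation")

with open("network-stack.yaml") as template_file:  # placeholder template
    template_body = template_file.read()

cloudformation.create_stack(
    StackName="network-stack",
    TemplateBody=template_body,
    Parameters=[{"ParameterKey": "Environment", "ParameterValue": "staging"}],
    Capabilities=["CAPABILITY_NAMED_IAM"],  # needed when the template creates IAM resources
)

# Block until the stack reaches CREATE_COMPLETE (or the waiter raises on failure).
waiter = cloudformation.get_waiter("stack_create_complete")
waiter.wait(StackName="network-stack")
```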

Environment: AWS, Azure, GitLab, Jenkins, Nexus, Go, GitHub Actions, Maven, Gradle, Docker, Kubernetes, Ansible, VMware, WebLogic, Tomcat, CloudWatch, Perl, Oracle 10g/11g, GitHub, Grafana, HashiCorp Vault, Ruby, Nagios, YAML, Splunk, JIRA, AppDynamics.


REFERENCES:
Available upon request