
Deepika Kadapa - DevOps Engineer
Email: [email protected]
Phone: 469-919-0494
Location: Fuquay-Varina, North Carolina, USA
Relocation: No
Visa: H4 EAD

PROFESSIONAL SUMMARY

10+ years of experience in the IT industry with a major focus on Cloud Computing, DevOps tools and technologies, Continuous Integration, Continuous Delivery, and Continuous Deployment (CI/CD pipelines), Version Control, Build and Release Management, Monitoring tools, Software Configuration Management, and Linux/Windows System Administration and Automation.
Actively involved in all stages of the software development life cycle (SDLC) using Agile and Waterfall methodologies.
Hands-on experience with Microsoft Azure Cloud services, Storage Accounts, Virtual Networks, and Azure Service Fabric.
Expertise in AWS Cloud Administration, including services such as EC2, S3, EBS, VPC, ELB, AMI, SNS, RDS, IAM, Route 53, Auto Scaling, CloudFront, CloudWatch, CloudTrail, CloudFormation, OpsWorks, and Security Groups.
Experience in designing and architecting serverless web applications using AWS Lambda, API Gateway, DynamoDB, and Security Token Service (STS), and configuring inbound/outbound rules in AWS Security Groups according to requirements.
Experience in migrating on-premises infrastructure to cloud platforms such as AWS/Azure; involved in virtualization using VMware and infrastructure orchestration using containerization technologies such as Docker and Kubernetes.
Used Terraform modules for a two-tier architecture including AWS resources such as VPCs, subnets, security groups, EC2 instances, load balancers, Auto Scaling groups, CloudWatch alarms, ECS clusters, and S3 buckets for logs.
Hands-on expertise with configuration management tools such as Chef, Puppet, and Ansible; created several cookbooks, manifests, and playbooks to automate infrastructure maintenance and configuration.
Extensive experience using Maven and ANT as build management tools for building deployable artifacts (JAR, WAR, and EAR) from source code.
Experienced in branching, tagging, and maintaining versions across different SCM tools such as GitHub and Subversion (SVN) on Linux and Windows platforms.
Experience in setting up baselines, branching, merging, and automation processes using Shell, Perl, Ruby, Python, and Bash scripts.
Hands-on experience packaging files and publishing them to artifact repositories such as Nexus and JFrog Artifactory, and performing code quality analysis with SonarQube.
Worked on Docker components such as Docker Engine, Docker Hub, Docker Compose, Docker Registry, and Docker Swarm, as well as Kubernetes.
In-depth understanding of the principles and best practices of Software Configuration Management (SCM) and CI/CD in Agile, Scrum, and Waterfall methodologies.
Good knowledge of and experience in using Elasticsearch, Kibana, CloudWatch, Nagios, and Grafana for logging and monitoring.
Expertise in Jenkins pipelines to drive all microservice builds out to the Docker registry and deploy them to Kubernetes; created and managed Pods using Kubernetes.
Expertise in designing and creating multiple deployment strategies for CI/CD pipelines using Jenkins and Bamboo; shortened deployment cycles by automating deployments.
Worked on implementing entire CI/CD pipelines using Groovy scripts.
Good knowledge of application servers such as Oracle WebLogic, JBoss, WebSphere, and Apache Tomcat, including installation, integration, and maintenance.
Deep involvement in Linux/UNIX system administration, system builds, server builds, installations, upgrades, patches, migration, and troubleshooting on RHEL.
Worked on various operating systems such as UNIX, Ubuntu, RHEL, CentOS, Fedora, and Windows across production, test, and development servers.






TECHNICAL SKILLS:


Cloud Environment: Amazon Web Services (AWS), Azure
Scripting: Shell Scripting, Python, Ruby, VBScript, Groovy, PowerShell
Version Control Tools: SVN (Subversion), Git, GitHub, TFS, GitLab
Configuration Management: Chef, Puppet, Ansible
CI Tools: Jenkins, Hudson, Bamboo, GitLab CI
Build Tools: ANT, Maven, Gradle
Container Technologies: Docker, Docker Swarm, Docker Compose, Kubernetes
Monitoring Tools: Nagios, Splunk, Grafana, Kibana
Web/Application Servers: WebLogic, WebSphere, Apache Tomcat
Operating Systems: UNIX/Linux (Red Hat 5/6, CentOS, SUSE), Ubuntu 14, Windows Server 2008
Network Protocols: TCP/IP, SMTP, SOAP, HTTP/HTTPS, DNS, DHCP, NFS, SNMP, BGP, EIGRP
Databases: Oracle, MS SQL Server, MySQL, DynamoDB


PROFESSIONAL EXPERIENCE


Client Name: USAA, San Antonio, Texas    Dec 2021 - Present
Role: DevOps Engineer
Roles & Responsibilities:

Implemented a CI/CD pipeline using GitLab with automated unit tests, static code analysis, and code coverage, and deployed the applications to all environments.
Worked on CI/CD in a containerized microservices environment.
Worked on documentation and testing for virtualization; evaluated new features and functionality, worked on proofs of concept, researched new features, and supported new features in the client environment.
Used Agile methodology throughout the project.
Deployed and monitored scalable infrastructure on Amazon Web Services (AWS) and performed configuration management.
Created a pipeline to build, test, and deploy applications to an EKS cluster.
Worked on Kafka streaming audit logging and a Spring Boot API for a heartbeat API maintenance project.
Responsible for installing and configuring Apache Kafka.
Used Kafka to collect website activity data and for stream processing.
Created dashboards to monitor application logs in ELK and created developer guide documentation.
Created dashboards in Datadog and wrote queries for API tracing of applications deployed in EKS.
Created pipelines and deployed applications to the Kubernetes cluster using the EKS service in AWS.
Added integration testing to the existing pipeline using the Cypress framework.
Worked on desktop tools distribution automation and developed best practices for application development within the Shared Systems team.
Created a pipeline to configure and run performance tests and report their results; worked on converting ReadyAPI scripts to Taurus test framework scripts.
Involved in designing, implementing, and modifying Python code.
Developed Python and shell scripts to automate the build process (an illustrative sketch follows this list).
Automated AWS EC2/VPC/S3 provisioning with Terraform, Python, and Bash scripts.
Created training documentation on how to install Terraform locally and run the pipeline.
Deployed and configured infrastructure using Terraform.
Worked on Python, PowerShell, and Bash scripting.
Wrote Terraform scripts for CloudWatch alerts.
Created YAML templates to automate the process.
Developed application enhancements to improve business performance.
Created AWS KMS keys using Terraform scripts to manage permissions at USAA.
Worked on a POC to migrate the public cloud image from V2 to V3 and deploy the application to the EKS cluster.
Worked on end-to-end testing in the virtualized network.
Tested JSON APIs using RESTful clients such as Postman and cURL.
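
The build-automation bullet above mentions Python and shell scripts, but no source is included in the resume. The following is a minimal, hypothetical Python sketch of the kind of build wrapper described, assuming a Maven project with the mvn binary on the PATH; the paths and names are illustrative and not taken from the USAA project.

```python
#!/usr/bin/env python3
"""Hypothetical build wrapper: run the Maven build and stage the packaged artifacts."""
import shutil
import subprocess
import sys
from pathlib import Path

PROJECT_DIR = Path("~/workspace/sample-service").expanduser()  # illustrative path
STAGING_DIR = Path("/tmp/build-artifacts")                     # illustrative path


def run(cmd):
    """Run a command in the project directory and fail loudly on a non-zero exit."""
    print("+ " + " ".join(cmd))
    subprocess.run(cmd, cwd=PROJECT_DIR, check=True)


def main():
    run(["mvn", "clean", "package"])
    STAGING_DIR.mkdir(parents=True, exist_ok=True)
    # Collect packaged JAR/WAR/EAR files from the Maven target directory.
    for artifact in (PROJECT_DIR / "target").glob("*.?ar"):
        shutil.copy2(artifact, STAGING_DIR)
        print(f"staged {artifact.name}")


if __name__ == "__main__":
    try:
        main()
    except subprocess.CalledProcessError as exc:
        sys.exit(f"build failed: {exc}")
```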



Environment: AWS, ELK, Datadog, Postman, Java, CloudWatch, EKS, Terraform, Maven, Kubernetes, Shell Scripting, PowerShell, JFrog Artifactory, GitLab, Jira, Kafka, Spring Boot, YAML.





Client Name: Zoetis Inc., Exton, PA    Nov 2019 - Nov 2021
Role: DevOps/Cloud Engineer

Roles & Responsibilities:

Collaborated with the cross-functional engineering team to design, build, and support automated processes enabling seamless delivery across the SDLC, including system builds and software deployment.
Worked on Azure Site Recovery and Azure Backup; deployed instances in Azure environments and data centers, migrated them to Azure using Azure Site Recovery, collected data from all Azure resources using Log Analytics, and analyzed the data to resolve issues.
Configured Azure Multi-Factor Authentication (MFA) as part of Azure AD Premium to securely authenticate users, created custom Azure templates for quick deployments, and performed advanced PowerShell scripting. Deployed Azure SQL DB with geo-replication and Azure SQL Data Sync to a standby database in another region, with failover configuration.
Worked on serverless services; created and configured HTTP triggers in Azure Functions with Application Insights for monitoring, and performed load testing on the applications using Visual Studio Team Services (VSTS), also called Azure DevOps Services.
Used GitHub Actions runners for CI/CD to deploy code and web applications to Azure.
Implemented a CI/CD pipeline with Docker, Jenkins (with the TFS plugin installed), Team Foundation Server (TFS), GitHub, and Azure Container Service: whenever a new TFS/GitHub branch is started, Jenkins, the Continuous Integration (CI) server, automatically attempts to build a new Docker container from it.
Worked with Terraform templates to automate Azure IaaS virtual machines using Terraform modules, and deployed virtual machine scale sets in production environments.
Worked on OpenShift for container orchestration with Kubernetes container storage and automation to enhance container platform multi-tenancy; also worked on Kubernetes architecture and design, troubleshooting issues, and multi-regional deployment models and patterns for large-scale applications.
Deployed Windows Kubernetes (K8s) clusters with Azure Container Service (ACS) from the Azure CLI, and utilized Kubernetes and Docker as the runtime environment of the CI/CD system to build, test, and deploy with Octopus Deploy.
Created multiple Ansible playbooks for machine creation and for SQL Server, cluster server, and MySQL installations.

Designed, wrote, and maintained Python scripts for administering Git, using Jenkins as a full-cycle continuous delivery tool for package creation, distribution, and deployment onto Tomcat application servers via shell scripts embedded in Jenkins jobs (an illustrative sketch follows this list).
Maintained artifacts in binary repositories using JFrog Artifactory and pushed new artifacts by configuring the Jenkins Artifactory plugin in the Jenkins project.
Used Jira as the defect tracking system and configured various workflows, customizations, and plugins for the Jira bug/issue tracker; integrated Jenkins with Jira and GitHub.
Worked with various scripting languages such as Bash, Perl, Shell, Ruby, PHP, and Python.
Built and managed a highly available monitoring infrastructure to monitor different application servers such as JBoss and Apache Tomcat and their components using Nagios.
Automated weekly releases with ANT/Maven scripting for compiling Java code, debugging, and placing builds into the Maven repository.
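
The Git/Jenkins/Tomcat bullet above describes Python-driven deployment onto Tomcat application servers, but the resume does not include the scripts themselves. Below is a minimal, hypothetical Python sketch that deploys a WAR through Tomcat's manager text API, which is one common way to script such a deployment (the actual work used shell scripts embedded in Jenkins jobs). The host, credentials, and paths are illustrative only, and requests is a third-party library (pip install requests).

```python
#!/usr/bin/env python3
"""Hypothetical deployment helper: upload a WAR to Tomcat via the manager text API."""
import sys

import requests  # third-party: pip install requests

TOMCAT_URL = "http://tomcat.example.internal:8080"  # illustrative host
CREDENTIALS = ("deployer", "change-me")             # illustrative credentials


def deploy_war(war_path: str, context_path: str) -> None:
    """PUT the WAR to /manager/text/deploy so Tomcat (re)deploys the context."""
    with open(war_path, "rb") as war:
        response = requests.put(
            f"{TOMCAT_URL}/manager/text/deploy",
            params={"path": context_path, "update": "true"},
            data=war,
            auth=CREDENTIALS,
            timeout=300,
        )
    response.raise_for_status()
    if not response.text.startswith("OK"):
        sys.exit(f"deployment rejected: {response.text.strip()}")
    print(response.text.strip())


if __name__ == "__main__":
    # Usage: python deploy_war.py target/app.war /app
    deploy_war(sys.argv[1], sys.argv[2])
```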

Environment: Azure, Terraform, Maven, Jenkins, Ansible, Azure ARM, Azure AD, Kubernetes, Python, Ruby, XML, Shell Scripting, PowerShell, Nexus, JFrog Artifactory, Git, Jira, GitHub, Docker, Windows Server, TFS, VSTS, Kafka.

Client: Bristol Myers Squibb, Princeton, NJ Aug 2018 - Nov 2019
Role: Sr. Cloud/DevOps Engineer

Responsibilities:

Designed the data models used in AWS Lambda applications aimed at complex analysis, creating analytical reports for end-to-end traceability and definition of key business elements from Aurora.
Worked on Amazon Kinesis for monitoring, collecting server logs, and checking application data and metrics using the dashboard.
Responsible for standing up back-end resources in the cloud using infrastructure as code.
Integrated Amazon Kinesis with an S3 bucket to store the generated application data logs.
Implemented CI using Jenkins with automated unit tests, static code analysis, code coverage, code duplication, and code standards using SonarQube.
Designed AWS CloudFormation templates to create custom-sized VPCs, subnets, and NAT to ensure successful deployment of web application and database templates.
Responsible for Continuous Integration and Continuous Delivery process implementation using Jenkins along with Python and Shell scripts to automate routine jobs.
Integrated Kubernetes with Hashicorp Vault to inject configurations at runtime for each service using init, config sidecars and persistent volume sharing between app and config containers.
Created Python scripts for Git pre-push and SVN commit hooks (an illustrative sketch follows this list).
Implemented AWS solutions using ECS, S3, RDS, EBS, Elastic Load Balancers, and Auto Scaling groups; optimized volumes and EC2 instances.
Wrote Terraform scripts to automate AWS services, including web servers, ELB, CloudFront distributions, databases, EC2, database security groups, and S3 buckets.
Implemented a CI pipeline to create and manage AWS AMIs using HashiCorp Packer, Terraform, and Ansible.
Worked on Kubernetes to deploy, scale, load-balance, and manage Docker containers with multiple namespaced versions on pod clusters across nodes in QA, Test, and Production environments.
Utilized Nexus for Maven dependency management as well as for storing snapshot and release build binaries (WARs and EARs).
Configured Kubernetes Master and various nodes, configured Consul for service discovery, interacted with API server using Kubectl command line utility.
Responsible for creating and maintaining automated builds for projects written in Java, C/C++, PHP, and HTML/CSS using Jenkins.
Wrote CI/CD pipelines in Groovy scripts to enable end-to-end setup of build and deployment using Jenkins, and developed end-to-end CI/CD pipelines in Jenkins to retrieve code, run tests, and push build artifacts to Nexus.
Deployed application build artifacts using Jenkins and shell scripts in multiple integrated environments.
Troubleshot build and performance issues in Jenkins, and generated metrics on the master's performance along with job usage and the number of builds being run.
Involved in DevOps migration/automation processes for build and deploy systems.
Set up and maintained a logging and monitoring subsystem using Kibana, Grafana, and Alertmanager.
Used Grafana for analysis and visualization.
Configured Terraform to reliably version and create infrastructure on Azure; created resources using Azure Terraform modules and automated infrastructure management.
Ensured all steps were successfully completed and documented their completion times and issue logs.
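
The hook-scripting bullet above mentions Python Git pre-push and SVN commit hooks without showing them. The sketch below is a minimal, hypothetical example of a Git pre-push hook (saved as .git/hooks/pre-push and made executable) that blocks direct pushes to protected branches; the branch list and override variable are illustrative, not taken from the project.

```python
#!/usr/bin/env python3
"""Hypothetical Git pre-push hook: refuse direct pushes to protected branches.

Git feeds the hook lines of the form
'<local ref> <local sha1> <remote ref> <remote sha1>' on standard input."""
import os
import sys

PROTECTED_REFS = {"refs/heads/main", "refs/heads/master"}  # illustrative list


def main() -> int:
    if os.environ.get("ALLOW_PROTECTED_PUSH") == "1":
        return 0  # explicit override, e.g. for release engineers
    for line in sys.stdin:
        parts = line.split()
        if len(parts) != 4:
            continue
        remote_ref = parts[2]
        if remote_ref in PROTECTED_REFS:
            print(f"pre-push: direct pushes to {remote_ref} are blocked", file=sys.stderr)
            return 1
    return 0


if __name__ == "__main__":
    sys.exit(main())
```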


Environment: AWS, Azure, Terraform, Maven, Jenkins, Ansible, Kubernetes, Python, Shell Scripting, JFrog Artifactory, Groovy, Git, Jira, GitHub, Docker.


Client: SunNet Solution, Houston, TX Jan 2018 - Aug 2018
Role: Sr. Cloud/DevOps Engineer

Responsibilities:

Implemented a Continuous Delivery framework using Jenkins, Chef, Maven, and Nexus in a Linux environment.
Installed Chef Server Enterprise on-premises, set up workstations, bootstrapped the nodes using Knife, and automated testing of Chef recipes/cookbooks with Test Kitchen and ChefSpec.
Launched Amazon EC2 instances on Amazon Web Services (Linux/Ubuntu) and configured the launched instances for specific applications by configuring Elastic Load Balancers (ELB) with EC2 Auto Scaling groups.
Managed local deployments in Kubernetes, creating a local cluster and deploying application containers.
Managed containers using Docker by writing Dockerfiles, set up automated builds on Docker Hub, and installed and configured Kubernetes.
Created S3 buckets and managed their policies; utilized S3 and Glacier for storage and backup on AWS.
Implemented a Continuous Delivery pipeline with Docker, GitHub, and AWS.
Automated the build and release management process including monitoring changes between releases.
Used Splunk to monitor the system logs and notify the incident management system upon exceeding thresholds; used Jira for tracking and ticketing.
Configured Docker containers by creating Docker Compose files, and pushed Docker images from the Docker registry onto EC2 instances to deploy the applications using Kubernetes. Worked extensively with Docker images, attaching to running containers, removing images, managing directory structures, and managing containers.
Set up and built AWS infrastructure resources such as VPC, EC2, S3, IAM, EBS, Elasticsearch, Security Groups, Auto Scaling, Lambda, and RDS in CloudFormation, and was involved in deploying content to the AWS cloud platform using EC2, S3, and EBS.
Created functions and assigned roles in AWS Lambda to run Python scripts, and used AWS Lambda with Java to perform event-driven processing.
Wrote AWS Lambda functions in Python and invoked PowerShell scripts for data transformation and analytics on large data sets in EMR clusters and AWS Kinesis Data Streams (an illustrative sketch follows this list).
Used AWS services to launch Linux and Windows machines, created security groups, and wrote basic PowerShell scripts to take backups and mount network shared drives.
Managed and optimized Continuous Delivery tools such as Jenkins, and troubleshot build issues during the Jenkins build process.
Created Jenkins jobs to continuously build the projects; scheduled jobs to build upon every check-in to Subversion.
Integrated Maven with Subversion to manage and deploy project related tags.
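
The Lambda bullet above mentions Python functions processing Kinesis data but does not include code. The following is a minimal, hypothetical handler for a Kinesis Data Streams trigger that decodes and lightly transforms each record; the field names and transformation are illustrative, since the real logic is not shown in the resume.

```python
"""Hypothetical AWS Lambda handler for a Kinesis Data Streams trigger."""
import base64
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)


def lambda_handler(event, context):
    """Decode each Kinesis record and apply a simple, illustrative transformation."""
    processed = 0
    for record in event.get("Records", []):
        payload = base64.b64decode(record["kinesis"]["data"])
        try:
            body = json.loads(payload)
        except json.JSONDecodeError:
            logger.warning("skipping non-JSON record: %r", payload[:100])
            continue
        # Illustrative transformation: normalise one field name before downstream use.
        body["event_type"] = body.pop("eventType", "unknown")
        logger.info("processed record with event_type=%s", body["event_type"])
        processed += 1
    return {"processed": processed}
```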

Environment: Amazon Web Services (AWS), Docker, Linux/Ubuntu, Maven, Jenkins, Ansible, Python, Shell Scripting, Artifactory, Git, Jira, GitHub.






Client: Realogy, Madison, NJ    June 2016 - Dec 2017
Role: DevOps Engineer

Responsibilities:

Involved in all phases of the Software Development Lifecycle, including requirements collection, analysis of customer specifications, development, handling change requests, bug fixing, code review, and customization of the application.
Implemented and tested desktop virtualization and introduced VMware View to the client.
Designed and implemented a Puppet-based configuration management system for all new Linux machines.
Set up the Puppet master and clients, and wrote scripts to deploy applications to Dev, QA, and production environments.
Developed Puppet modules with Jenkins for continuous integration and continuous deployment of managed products and related services.
Used Hibernate ORM tools which automate the mapping between SQL databases and objects in Java.
Extensively used Hibernate in data access layer to access and update information in the database.
Integrated code quality analysis tools such as Checkstyle and FindBugs with CI tools.
Managed Sonatype Nexus repositories to upload and download the artifacts (jar, war & ear).
Created branches and tags in the Git repository and provided branch access permissions to the dev team.
Used Jenkins for Continuous Integration and deployment to WebLogic Server.
Configured nightly builds in Jenkins to check the sanity of the Java source code.
Worked on the release and deployment of large-scale Java/J2EE web applications.
Supported production, development, and testing systems; performed UNIX system upgrades, UNIX network configuration, and user/group maintenance in both production and development environments; worked on LDAP integration for single sign-on.
Collaborated with DBAs in managing their databases through backup and recovery procedures; handled Exchange 2000 servers along with Active Directory.

Environment: Jenkins, VMware, Puppet, SQL, Sonar, UNIX, J2EE, FindBugs, WebLogic Server.


Client: Zensar Technologies, Hyderabad, India Oct 2013 - July 2014
Role: Build & Release Engineer

Responsibilities:

Involved in planning, defining, and designing data based on business requirements, and provided documentation.
Worked on release management, environment management, continuous deployment, and continuous integration.
Provided support for more than 5 different applications for configuration management and builds, designed and deployed for production and lower environments.
Used the ANT build tool for deployment scripts and deployment processes via Jenkins to move builds from one environment to another.
Administration/Maintenance experience of source control management systems, such as Git. Managed project dependencies by creating parent-child relationships between projects.
Installed and administered repository to deploy the artifacts generated by ANT and to store the dependent jars, which were used during the build.
Used Nexus repository manager to share the artifacts by configuring the repository manager.
Worked on continuous integration tools such as Jenkins for end-to-end automation of all builds and deployments.
Built and deployed Java/J2EE applications to Tomcat application servers as part of the continuous integration process, and automated the whole process by implementing CI/CD using Jenkins.
Installed and configured the packages using YUM and RPM package managers.
Developed automated processes that run daily via cron to check disk usage and clean up file systems in Unix/Linux environments using shell scripting (an illustrative sketch follows this list). Developed unit and functional tests in Python and Ruby, and developed and maintained shell scripts for build and release tasks.
Assisted in monitoring servers and responding to event notifications (service outages, load alerts, etc.) using Nagios.
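
The cleanup bullet above describes daily cron-driven disk-usage checks and filesystem cleanup done with shell scripts; to keep all examples in this document in one language, here is a minimal, hypothetical Python equivalent rather than the original shell version. The threshold, scratch directory, and retention period are illustrative.

```python
#!/usr/bin/env python3
"""Hypothetical daily cron job: warn on nearly full filesystems and purge old scratch files."""
import shutil
import time
from pathlib import Path

ALERT_THRESHOLD_PERCENT = 85                 # illustrative alert threshold
SCRATCH_DIR = Path("/var/tmp/build-cache")   # illustrative cleanup target
MAX_AGE_DAYS = 7                             # illustrative retention period


def check_disk(mount_point: str = "/") -> None:
    """Print a warning when filesystem usage crosses the threshold."""
    usage = shutil.disk_usage(mount_point)
    percent_used = usage.used / usage.total * 100
    if percent_used >= ALERT_THRESHOLD_PERCENT:
        print(f"WARNING: {mount_point} is {percent_used:.1f}% full")


def purge_old_files() -> None:
    """Delete files in the scratch directory older than the retention period."""
    if not SCRATCH_DIR.is_dir():
        return
    cutoff = time.time() - MAX_AGE_DAYS * 86400
    for path in SCRATCH_DIR.rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
            print(f"removed {path}")


if __name__ == "__main__":
    check_disk("/")
    purge_old_files()
```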

Environment: Jenkins, ANT, Git, Shell scripting, Python, Ruby, UNIX, WebLogic Server, Nagios.



Client: Tanzanite Technologies Inc. May 2011 - Oct 2013
Role: IT Engineer

Responsibilities:

Involved in planning, defining, and designing data based on business requirements, and provided documentation.
Built .NET applications by configuring Atlassian Bamboo and integrating with Jira, Bitbucket, and Confluence.
Used JMeter and Selenium for load testing and front-end performance testing.
Created and maintained build automation in shell; managed build fixes, merges, and release cuts as needed; and wrote scripts for environment changes in Bash and Perl for mapping WebSphere modules to the enterprise application.
Worked in UNIX and Windows environments, including shell and Perl scripts.
Installed and configured a DHCP server to give IP leases to production servers.
Used Agile (Scrum, XP) methodology, including test-driven development and pair-programming concepts.
Installed Oracle patches and performed troubleshooting; created and modified application-related objects; created profiles, users, and roles; and maintained system security.
Used REST APIs while developing an application, as an intermediary between the front-end and back-end servers.

Environment: .NET, Jira, Bitbucket, JMeter, Selenium, WebSphere, DHCP, UNIX.


EDUCATION DETAILS:

B.Sc. (Bachelor of Science), S.V. University, India
MCA (Master of Computer Applications), S.V. University, India.