Yedunandan - Cloud DevOps Engineer role |
surya.k@logicplanet.com |
Location: Dallas, Texas, USA |
Relocation: |
Visa: H1B |
Resume file: Yedunandan resume_1743600480649.docx |
PROFESSIONAL SUMMARY:
- 12 years of IT experience as a Cloud DevOps, CI/CD, and Data Engineer across domains including Telecom, Banking, Finance, Healthcare and clinical research, and Public Education.
- Azure DevOps Architect with extensive experience designing, implementing, and managing scalable DevOps solutions on Azure.
- Experience with SCM tools Subversion (SVN) and GitHub.
- Proficient in automating end-to-end software delivery pipelines using Azure DevOps, GitHub Actions, Jenkins, and other CI/CD tools.
- Adept at building robust infrastructure using Infrastructure as Code (IaC) tools such as Terraform and ARM templates.
- Expertise in implementing Azure networking designs, including hub-and-spoke models, load balancing, and security controls.
- Strong background in container orchestration using Kubernetes, Docker, and Azure Kubernetes Service (AKS).
- Designed and implemented CI/CD pipeline automation with Azure DevOps, GitHub Actions, and Jenkins.
- Set up monitoring and observability with Grafana, Splunk, and Azure Monitor.
- Extensive experience migrating on-prem infrastructure to the Azure cloud using Terraform.
- Hands-on experience with AWS cloud services such as EC2, IAM, CodeCommit, and CodeStar.
- Managed, implemented, and troubleshot Apache Tomcat servers/instances in all environments; performed application monitoring using AppDynamics.
- Configured distributed, multi-platform server monitoring using Nagios.
- Tracked projects using JIRA to provide updates to management.
- Used log tools such as Splunk and Datadog.
- Knowledge of scripting languages such as Shell, Bash, and Python.
- Expertise in Azure scalability and availability: built VM availability sets via the Azure portal to provide resiliency for IaaS-based solutions, and Virtual Machine Scale Sets (VMSS) via Azure Resource Manager (ARM) to manage network traffic.
- Experience working with AWS CodePipeline and creating CloudFormation JSON templates to create custom-sized VPCs and migrate a production infrastructure into AWS utilizing CodeDeploy, CodeCommit, and OpsWorks.
- Experience writing Infrastructure as Code (IaC) in Terraform, Azure Resource Manager, and AWS CloudFormation; created reusable Terraform modules in both Azure and AWS cloud environments.

TECHNICAL SKILLS:
Cloud Environments: Microsoft Azure, Amazon Web Services, Google Cloud Platform
Build and Testing Tools: Maven, ANT, Gradle, Selenium, JIRA
Databases: Oracle 9i/10g/11g, Teradata, MySQL, Snowflake, SQL Server
Scripting Languages: Python, Shell, Bash, PowerShell, YAML
Operating Systems: Unix, Linux, Ubuntu, macOS, Windows NT/2000/2003/XP/7/8/10
Other Tools: WinSCP, IBM UCD, Active Directory, AMW
Monitoring Tools: Autosys, M-Control
IDE Tools: Microsoft Visual Studio, NetBeans, Eclipse, PyCharm, Oracle SQL DBA
Web Technologies: HTML5, CSS3, Bootstrap, JSON, jQuery, JavaScript, C#, ASP.NET, XML
Monitoring and Bug Tracking Tools: Nagios, Splunk, Dynatrace, AppDynamics, Datadog
Version Control: SVN, Git, GitHub, GitLab, Azure Repos
Configuration Management: Chef, Puppet, Ansible, Terraform
Deployment Tools: Bamboo, Jenkins
Container Tools: Docker, Kubernetes, OpenShift
Networking Protocols: DNS, DHCP, FTP/TFTP, NFS, SMTP, TCP/IP, HTTP/HTTPS, WAN, LAN

Education and Certification:
Bachelor of Technology, Electronics and Communication Engineering, JNTUK
Certified AZ-900
Certified AZ-400

PROFESSIONAL EXPERIENCE:
Role: Sr. Azure DevOps Architect (October 2024 - Till date)
Company: Hitachi Digital Services, Dallas, TX
Responsibilities:
- Expertise in Azure infrastructure management (Azure Web Roles, Worker Roles, SQL Azure, Azure Storage, Azure AD licenses, Office 365).
- Expertise in architecting and implementing Azure service offerings such as Azure Cloud Services, Azure Storage, IIS, Azure Active Directory (AD), Azure Resource Manager (ARM), Blob Storage, Azure VMs, SQL Database, Azure Functions, Azure Monitor, and Azure Service Bus.
- Built fully automated CI/CD pipelines using GitLab and Ansible in all the respective environments for products.
- Developed Ansible playbooks for managing application/OS configuration files in GitHub, integrating with GitLab, verifying with GitLab plugins, and deploying applications into Linux environments.
- Designed custom transformation processes via Azure Data Factory and automation pipelines.
- Built and installed servers through Azure Resource Manager (ARM) templates and the Azure portal; deployed ARM-based resources.
- Created PowerShell and Python scripts for various systems administration tasks to automate repeated processes.
- Wrote Terraform templates (Azure Infrastructure as Code) to build staging and production environments and set up build automation for GitLab CI/CD workflows.
- Used GitLab workflows to automate testing, building, and deployment of applications.
- Designed, deployed, and maintained OpenShift clusters in a hybrid cloud environment supporting large-scale enterprise applications.
- Worked on Microsoft Azure cloud services such as Azure Databricks and Azure Data Lake Store.
- Developed and maintained scripts for batch and data processing using Python, shell scripting, and other scripting languages.
- Automated development processes such as code linting, static code analysis, and security scanning using GitHub Actions.
- Wrote custom Ansible playbooks for deployment orchestration and to simplify and automate day-to-day server administration tasks.
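The kind of Ansible playbook described above can be sketched as follows; the host group, package name, and file paths here are illustrative placeholders, not taken from the actual projects:

```yaml
# Hypothetical playbook: configure Tomcat on a "web" host group,
# push an application config file from source control, and restart the service.
- name: Configure application servers
  hosts: web
  become: true
  tasks:
    - name: Install Apache Tomcat
      ansible.builtin.yum:
        name: tomcat
        state: present

    - name: Deploy application configuration template
      ansible.builtin.template:
        src: templates/app.properties.j2
        dest: /opt/tomcat/conf/app.properties
        mode: "0644"
      notify: Restart tomcat

  handlers:
    - name: Restart tomcat
      ansible.builtin.service:
        name: tomcat
        state: restarted
```

The handler pattern shown here restarts the service only when the configuration file actually changes, which is the usual idiom for config-driven deployments.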
- Automated infrastructure activities such as continuous deployment, application server setup, and stack monitoring using Ansible playbooks, and integrated Ansible with GitLab workflows.
- Developed several playbooks using Ansible and YAML and deployed the applications/services on the client hosts.
- Created reusable, tested infrastructure with versioned Terraform modules for Staging, Testing, and Production environments in GCP.
- Created projects, VPCs, subnetworks, and GKE clusters for QA and prod environments using Terraform.
- Used Terraform scripts to automate instances; wrote Ansible playbooks with Python SSH as the wrapper to manage configurations of AWS nodes, deploy critical applications, and proactively manage change.
- Monitored Application Insights and forwarded logs to Splunk by triggering respective functions and pushing events; used Splunk search, WMI diagnostics, Splunk crash logs, and alert scripts for real-time analysis and visualization.
- Monitored servers, switches, and ports using the Nagios monitoring tool, and assisted internal Splunk users in designing and maintaining production-quality dashboards.
Environment: Azure, Azure DevOps, GitLab CI/CD workflows, Ansible playbooks, Splunk, Terraform, Python, ARM, Azure Data Factory.

Role: Sr. Azure DevOps Architect (November 2023 - October 2024)
Company: AT&T, Dallas, TX
Responsibilities:
- Managed, deployed, and optimized applications on the Pivotal Cloud Foundry (PCF) platform.
- Involved in designing, implementing, and modifying Python code.
- As a lead, played a critical role in planning, designing, and implementing DevOps culture and practices within the organization, particularly on Microsoft's Azure cloud platform.
- Set up and maintained PCF environments across different stages (development, staging, production).
- Used GitHub Actions to automate testing, building, and deployment of applications.
- Designed, deployed, and maintained OpenShift clusters in a hybrid cloud environment supporting large-scale enterprise applications.
- Worked on Microsoft Azure cloud services such as Azure Databricks and Azure Data Lake Store.
- Developed and maintained scripts for batch and data processing using Python, shell scripting, and other scripting languages.
- Automated development processes such as code linting, static code analysis, and security scanning using GitHub Actions.
- Established JDBC connections to various RDBMS databases to pull data into Azure Data Lake.
- Collaborated with development teams to containerize microservices and deploy them on OpenShift, leveraging Docker and Kubernetes.
- Developed Databricks Asset Bundle YAML scripts for deploying pipelines in Azure Databricks.
- Migrated on-prem infrastructure to Azure using Terraform, automating resource provisioning while maintaining infrastructure as code (IaC).
- Cloned TFS repositories with Git-TFS and pushed them to GitHub, preserving branches and commit history.
- Mapped out TFS build and release pipelines and translated them into GitHub Actions, Jenkins, or other CI/CD systems that integrate with GitHub.
- Utilized Power BI to build interactive, visually appealing dashboards and reports.
- Migrated data from Netezza to Azure Data Lake using Sqoop and Spark jobs.
- Configured and monitored Azure Kubernetes Service clusters in different regions.
- Worked on serverless services; created and configured HTTP triggers in Azure Functions with Application Insights for monitoring, and performed load testing using Visual Studio Team Services (VSTS), also called Azure DevOps Services.
- Configured Azure RBAC so that one team manages virtual machines in a subscription while another handles virtual networks.
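A minimal sketch of the lint/static-analysis automation described above, as a GitHub Actions workflow; the specific tools (flake8, bandit) and Python version are illustrative assumptions, not taken from the actual projects:

```yaml
# Hypothetical .github/workflows/code-quality.yml:
# run linting and a security scan on every push and pull request.
name: code-quality
on: [push, pull_request]

jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - name: Install analysis tools
        run: pip install flake8 bandit
      - name: Lint
        run: flake8 .
      - name: Security scan
        run: bandit -r .
```

Each push then fails fast on style or security findings before any deployment stage runs.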
- Worked with CMake, Makefiles, and CMocka for build and test automation of complex software applications.
- Set up and maintained automated API testing frameworks as part of CI/CD pipelines; tested API requests and responses, ensuring expected behavior with accurate data validation.
- Created pipelines in Azure Data Factory using linked services to various source systems, used Databricks notebooks for transformation, and loaded the data into Azure Blob Storage.
- Used Azure Kubernetes Service to deploy a managed Kubernetes cluster in Azure; created AKS clusters in the Azure portal and with the Azure CLI, and used template-driven deployment options such as Resource Manager templates and Terraform.
- Used Akamai tooling to analyze content delivery performance, cache hit ratios, and geographical traffic data.
- Evaluated API performance, including response time, throughput, and scalability under various loads.
- Installed, set up, and configured Apache Kafka and Apache ZooKeeper; used Kafka to collect website activity and for stream processing.
- Worked on Azure Fabric, microservices, IoT, and Docker containers in Azure, and set up a Terraform continuous build integration system.
- Implemented Akamai DDoS protection solutions to prevent distributed denial-of-service attacks on web applications.
- Developed microservice onboarding tools leveraging Python and Jenkins, allowing easy creation and maintenance of build jobs and Kubernetes deployments and services.
- Used Power Query Editor in Power BI to clean and transform data: filtering, merging, pivoting, and aggregating as needed.
- Set up Azure Firewall for network security as part of Azure security.
- Worked with Terraform templates to automate Azure IaaS virtual machines using Terraform modules and deployed virtual machine scale sets in the production environment.
- Expertise in architecting and implementing Azure service offerings such as Azure Cloud Services, Azure Storage (ADLS, AFS), IIS, Azure Active Directory (AD), Blob Storage, Azure VMs, SQL Database, Azure Functions, Azure Monitor, and Azure Service Bus.
- Defined security rules (allow/deny) based on source/destination IP, port, and protocol; associated NSGs with subnets and Azure network interfaces.
- Managed access to Azure resources using Azure RBAC.
- Created reusable, tested infrastructure with versioned Terraform modules for Staging, Testing, and Production environments in GCP.
- Worked on custom transformation process design via Azure Databricks, Azure Data Factory, and automation pipelines.
- Automated server infrastructure setup for DevOps services using Ansible, shell, and Python scripts.
- Set up GCP firewall rules to allow or deny traffic to and from VM instances based on specific configurations, and used GCP Cloud CDN (Content Delivery Network) to deliver content from GCP cache locations, drastically improving user experience and latency.
- Added build pipeline tasks for code analysis, version tracking, and Azure security checks.
- Configured cost alerts and budgets in Azure to notify responsible teams when spending approaches thresholds or deviates from forecasts, as part of FinOps.
Environment: Azure, Azure DevOps, Git, Bitbucket, Azure Kubernetes Service, Azure Pipelines, Jenkins, Terraform, GCP, Power BI, Akamai, Azure IaaS, Azure Storage, Azure Active Directory, Azure VMs, SQL, Azure Monitor, Azure Service Bus, RBAC, Azure Databricks, Azure Data Factory, Python, PowerShell, Shell.

Role: Sr. Cloud DevOps Architect (October 2021 - November 2023)
Client: AT&T DirecTV
Responsibilities:
- Created CI/CD pipelines using Azure DevOps and integrated SonarQube code analysis, code coverage, etc. into those pipelines.
- Utilized scripting (Bash, Python, Ansible, PowerShell, etc.) to automate routine tasks and streamline deployment processes.
- Monitored and troubleshot OpenShift environments, ensuring high availability, performance, and security compliance.
- Customized and managed PCF buildpacks responsible for compiling and running applications on the platform.
- As a lead, enforced best practices for code management, testing, automation, and deployment across all teams.
- Implemented Azure load balancers and networking models (hub-and-spoke) to enhance OpenShift deployments.
- Orchestrated deployment of applications to multiple platforms (e.g., Kubernetes, Docker, VMs) using GitHub Actions.
- Created Azure DevOps dashboards depicting build and release data.
- Used CMake to generate build files, which in turn use a Makefile to define how to compile and link the source code.
- Integrated GitHub Actions with other tools such as JIRA for project management, SonarQube for code quality, and Terraform for infrastructure as code (IaC).
- Followed secure coding best practices and used Azure Application Gateway for web applications as part of Azure security.
- Worked with Git/TFS repositories.
- Implemented and managed secure communication protocols (TLS/SSL) for applications running on PCF.
- Created custom Azure RBAC roles where the built-in roles were insufficient.
- Created multiple Python, Bash, Shell, and Ruby scripts for various application-level tasks.
- Configured Azure Multi-Factor Authentication (MFA) as part of Azure AD Premium to securely authenticate users, and created custom Azure templates for quick deployments with advanced PowerShell scripting.
- Deployed Azure SQL DB with geo-replication and Azure SQL Data Sync to a standby database in another region, with failover configuration to enhance availability.
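The Azure DevOps pipelines described above can be sketched as a YAML pipeline definition; the Python version, test command, and task versions are illustrative assumptions rather than the actual project configuration:

```yaml
# Hypothetical azure-pipelines.yml: build, test, and publish test results.
trigger:
  branches:
    include: [main]

pool:
  vmImage: ubuntu-latest

steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: "3.12"
  - script: pip install -r requirements.txt
    displayName: Install dependencies
  - script: pytest --junitxml=test-results.xml
    displayName: Run tests
  - task: PublishTestResults@2
    inputs:
      testResultsFiles: test-results.xml
```

Publishing JUnit-format results is what feeds the build and release dashboards mentioned above.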
- Developed microservice onboarding tools leveraging Python and Jenkins, allowing easy creation and maintenance of build jobs and Kubernetes deployments and services on GKE clusters.
- Automated IAM secrets policy management for HashiCorp Vault by integrating it with Jenkins; deployed Google Cloud SQL databases and load balancers for the Aquasec container security tool inside GCP using Cloud SDK and Python.
- Refactored monolithic applications into microservices and component-based architectures.
- Configured new VNets or used existing ones, and defined address spaces (CIDR blocks) for subnets within the VNet as part of Azure networking.
- Designed data models and created relationships between tables using primary and foreign keys in Power BI.
- Replicated on-prem firewall and security group configurations in Azure using NSGs and Azure Firewall.
- Worked on serverless services; created and configured HTTP triggers in Azure Functions with Application Insights for monitoring, and performed load testing using Visual Studio Team Services (VSTS), also called Azure DevOps Services.
- Created Azure DNS zones for application domains and configured DNS records (A, CNAME, MX, etc.) within the zones as part of Azure networking.
- Created Azure Automation assets, graphical runbooks, and PowerShell runbooks to automate specific tasks; deployed Azure AD Connect, configured Active Directory Federation Services (AD FS) authentication flow and ADFS installation using Azure AD Connect; involved in administrative tasks including build, design, and deployment of the Azure environment.
- Managed Kubernetes charts using Helm: created reproducible builds of the Azure Kubernetes applications, managed Kubernetes manifest files, and managed releases of Helm packages.
- Built reports with visuals such as tables, charts, maps, and KPI cards using Power BI.
- Granted database administrators (DBAs) rights to manage Azure SQL databases within a subscription using Azure RBAC.
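The Kubernetes manifests managed via Helm, as described above, typically take a shape like the following; the service name, image registry, and resource figures are hypothetical placeholders:

```yaml
# Hypothetical Deployment manifest of the kind templated through a Helm chart.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sample-service
spec:
  replicas: 3
  selector:
    matchLabels:
      app: sample-service
  template:
    metadata:
      labels:
        app: sample-service
    spec:
      containers:
        - name: sample-service
          image: myregistry.azurecr.io/sample-service:1.0.0
          ports:
            - containerPort: 8080
          resources:
            requests:
              cpu: 100m
              memory: 128Mi
```

In a Helm chart, the image tag and replica count would be pulled from values files, which is what makes the releases reproducible across environments.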
- Wrote numerous Python scripts for tasks such as importing data into MySQL, and replaced proprietary compiled Python code with easy-to-read, modifiable custom code accomplishing the same tasks.
- Monitored and tracked deployments using Datadog and Dynatrace.
- Formed a cross-functional team with representatives from finance, engineering, and cloud operations to oversee FinOps practices.
Environment: Azure, Azure DevOps, Git, Azure Pipelines, Jenkins, GCP, Terraform, Docker, Kubernetes, Power BI, Python, Azure Networking, Azure IaaS, Azure Storage, Azure Active Directory, Azure VMs, SQL, Azure Monitor, Azure Functions, graphical runbooks, PowerShell runbooks.

Role: Sr. Cloud DevOps Engineer (May 2021 - October 2021)
Client: Chewy
Responsibilities:
- Created CI/CD pipelines and integrated SonarQube code analysis, code coverage, etc. into them.
- Integrated different kinds of tests (unit, smoke, regression, etc.) into CI/CD pipelines.
- Automated build and deployment processes using Bash, Python, and shell scripts with a focus on CI/CD, leveraging cloud DevOps services.
- Built infrastructure on a private cloud using IaaS Terraform.
- Built and deployed Docker images on AWS ECS and automated the CI/CD pipeline.
- Provided ongoing operational support for OpenShift clusters, including upgrades, patching, and scaling.
- Worked on the AWS IaaS stack with components including VPC, ELB, Auto Scaling, EBS, AMI, ECS, EMR, Kinesis, Lambda, CloudFormation templates, CloudFront, CloudTrail, ELK Stack, Elastic Beanstalk, CloudWatch, EKS, and DynamoDB.
- Implemented CI/CD for the EKS environment using GitLab, deploying through Helm charts and Kubernetes manifest files.
- Set up GitLab repositories and runners for build automation; created the gitlab-ci.yaml file to kick off the build process in environment stages running in Docker containers.
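The staged gitlab-ci.yaml described above can be sketched roughly as follows; the Maven and Helm images, chart path, and release name are illustrative assumptions, not the actual project values:

```yaml
# Hypothetical .gitlab-ci.yml: staged build/test/deploy, each job in a Docker image.
stages: [build, test, deploy]

build:
  stage: build
  image: maven:3.9
  script:
    - mvn package -DskipTests

test:
  stage: test
  image: maven:3.9
  script:
    - mvn test

deploy:
  stage: deploy
  image: alpine/helm:3.14
  script:
    - helm upgrade --install sample-app ./chart --namespace prod
  only:
    - main
```

The `image:` key per job is what runs each stage inside its own Docker container on the GitLab runner.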
- Collaborated with developers to implement GitOps practices for continuous integration and deployment on OpenShift.
- Built reports with visuals such as tables, charts, maps, and KPI cards using Power BI.
- Designed a distributed private cloud solution using Kubernetes (Docker) on CoreOS and used it to deploy, scale, load-balance, and manage Docker containers with multiple namespaced versions.
- Automated build and deployment processes using Bash, Python, and shell scripts with a focus on CI/CD, leveraging cloud DevOps services.
- Supported networking and storage configurations to optimize OpenShift workloads for performance and scalability.
- Used Power Query Editor in Power BI to clean and transform data: filtering, merging, pivoting, and aggregating as needed.
- Installed and administered CI/CD tools such as Jenkins for managing the weekly build, test, and deploy chain, and Git with a Test/Prod branching model for weekly releases.
Environment: AWS, Azure, Git, Jenkins, Bash, Python, AWS ECS, VPC, ELB, Auto Scaling, EBS, AMI, EMR, Kinesis, Lambda, CloudFormation templates, CloudFront, CloudTrail, ELK Stack, Elastic Beanstalk, CloudWatch, EKS, DynamoDB.

Role: Sr. DevOps Engineer (Feb 2020 - May 2021)
Client: New Mexico Public Education Department, Santa Fe, NM
Responsibilities:
- As a member of the Release Engineering group, redefined processes and implemented tools for software builds, patch creation, source control, and release tracking and reporting.
- Used ADO.NET and C# for data access from the database, with data objects such as DataAdapter, DataReader, DataSet, and DataTable for consistent access to Azure SQL data sources.
- Worked closely with QA, business, DBA, and product operations teams to identify QA and UAT cycle release schedules for non-prod and prod environments.
- Created build definitions and release definitions for continuous integration and continuous deployment.
- Worked with Git/GitHub for code check-ins/checkouts, branching, etc.
- Created dashboards on Azure Boards for CI/CD pipelines, work items, and bugs.
- Resolved merge conflicts, configured triggers, and queued new builds within the release pipeline.
- Worked on CI/CD from VSTS and GitHub to deploy to Windows servers using Azure Pipelines.
- Provisioned Azure Kubernetes clusters in AKS and managed the clusters and nodes using the kubectl and az command-line utilities.
- Used Docker Compose and Docker Machine to create Docker containers for testing applications in the QA environment, and automated deployment, scaling, and management of containerized applications across clusters of hosts using Azure Kubernetes Service.
- Automated server infrastructure setup for DevOps services using Ansible, shell, and Python scripts.
- Developed a customer-facing web application using ASP.NET 4.0 and C#, converting output to an XML data file.
- Involved in converting a classic ASP web application to the latest ASP.NET MVC5, C#, and AngularJS.
- Created Azure services using ARM templates (JSON), ensuring no changes to the present infrastructure during incremental deployment.
- Installed, administered, upgraded, and troubleshot console and database issues for AppDynamics.
- Identified critical applications for system resource utilization (CPU, memory, threads, etc.); monitored JVM heap size using AppDynamics.
- Monitored and tracked deployments.
- Automated end-to-end application testing using Selenium QA automation.
Environment: Azure, Azure DevOps, ASP.NET 4.0 C#, ASP.NET MVC5, AngularJS, Selenium QA automation, Azure Pipelines, GitHub, Azure Kubernetes clusters, Docker, Python scripts, AppDynamics.

Role: DevOps Engineer (July 2018 - Feb 2020)
Client: American Heart Association, Bangalore, India
Responsibilities:
- Designed and configured Azure virtual networks, subnets, Azure network settings, DHCP address blocks, DNS settings, and Azure security policies, and configured BGP routes to enable ExpressRoute connections between on-premises data centers and the Azure cloud.
- Worked on Azure Fabric, microservices, IoT, and Docker containers in Azure, and set up a Terraform continuous build integration system.
- Worked with Azure Cloud Services (PaaS, IaaS, SaaS): Azure Storage (ADLS, AFS), Web Apps, Active Directory, Azure Application Insights, Logic Apps, Data Factory, Service Bus, Traffic Manager, Azure Monitoring, Azure OMS, Key Vault, Cognitive Services (LUIS), Azure SQL, Cloud Services, Resource Groups, ExpressRoute, Load Balancing, and Application Gateways.
- Implemented data encryption, secure transmission, and auditing mandated under the Security Rule for healthcare compliance standards such as HIPAA and HITRUST.
- Configured and deployed Azure automation scripts for a multitude of applications utilizing the Azure stack for Compute, Web and Mobile, Blobs, Resource Groups, Azure Data Lake, HDInsight clusters, Azure Data Factory, Azure SQL, Cloud Services, and ARM services.
- Used Docker Compose and Docker Machine to create Docker containers for testing applications in the QA environment, and automated deployment, scaling, and management of containerized applications across clusters of hosts using Kubernetes.
- Used Kubernetes to deploy, scale, and load-balance; worked with Docker Engine, Docker Hub, Docker images, and Docker Compose for handling images for installations and domain configurations.
- Used Ansible and Ansible Tower as configuration management tools to automate repetitive tasks, quickly deploy critical applications, and proactively manage change.
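A minimal sketch of the kind of Docker Compose file used for QA test stacks as described above; the image names, port, and database choice are hypothetical placeholders:

```yaml
# Hypothetical docker-compose.yml: an application container
# plus its database dependency for a QA test environment.
services:
  app:
    image: myregistry/sample-app:latest
    ports:
      - "8080:8080"
    environment:
      DB_HOST: db
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example
```

`depends_on` ensures the database container starts before the application, which is usually sufficient for throwaway QA stacks.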
- Integrated Jenkins with various DevOps tools such as Nexus, SonarQube, and Ansible, and ran the Jenkins CI/CD system on a Kubernetes container environment, utilizing Kubernetes and Docker as the runtime for building, testing, and deploying.
- Implemented a high-availability setup with the help of the AppDynamics CoE team; integrated AppDynamics with ServiceNow for auto-ticketing and incidents.
- Wrote PowerShell scripts to automate Azure cloud system creation, including end-to-end infrastructure, VMs, storage, and firewall rules.
- Enforced HIPAA rules and penalties for non-compliance, particularly in the use of electronic health records (EHRs).
Environment: Azure, Azure DevOps, Azure virtual networks, subnets, Azure networking, ExpressRoute, Storage, Web Apps, Data Factory, Service Bus, Traffic Manager, Azure Monitoring, Key Vault, Azure Data Lake, Azure Data Factory, Azure SQL, load balancer, AppDynamics, VMs, storage, firewalls.

Role: DevOps Engineer (March 2017 - June 2018)
Client: Barclays Bank UK, Pune, India
Responsibilities:
- Set up scripts for creating new snapshots and deleting old snapshots in Amazon S3, and set up lifecycle policies to back up AWS S3 bucket data.
- Created build definitions and release definitions for continuous integration and continuous deployment.
- Worked with Git/GitHub for code check-ins/checkouts, branching, etc.
- Resolved merge conflicts, configured triggers, and queued new builds within the release pipeline.
- Installed and administered CI/CD tools such as Jenkins for managing the weekly build, test, and deploy chain, and Git with a Test/Prod branching model for weekly releases.
- Migrated infrastructure to AWS; used the Maven build tool for developing Kafka ingestion and Spark-Scala projects.
- Automated AWS components such as EC2 instances, security groups, ELB, RDS, and IAM through Terraform.
- Provisioned Kubernetes clusters in EKS and managed the clusters and nodes using kubectl and related command-line utilities.
- Created alarms and notifications for EC2 instances using CloudWatch.
- Worked with the Elasticsearch, Logstash, and Kibana (ELK) stack.
- Created Lambda functions to automate snapshot backups on AWS and set up scheduled backups.
- Worked with Terraform templates to automate the AWS IaaS VPN using Terraform modules and deployed virtual machine scale sets in production environments.
- Managed AWS design architectures with AWS IaaS/PaaS, DevOps, storage, and database components; worked with cloud platform teams on implementing new features on the AWS platform and on design and development of scripts and automation for the AWS cloud.
- Monitored and tracked deployments using Dynatrace, Datadog, and CloudWatch.
Environment: AWS, GitHub, EC2 instances, security groups, ELB, RDS, IAM, Terraform, Elasticsearch, Logstash, Kibana, storage, databases, Datadog, Dynatrace, CloudWatch.

Role: Site Reliability Engineer (Feb 2015 - March 2017)
Client: Deutsche Bank Germany, Bangalore, India
Responsibilities:
- Managed AWS design architectures with AWS IaaS/PaaS, DevOps, storage, and database components; worked with cloud platform teams on implementing new features on the AWS platform and on building scripts and automation for the AWS cloud.
- Set up scripts for creating new snapshots and deleting old snapshots in Amazon S3, and set up lifecycle policies to back up AWS S3 bucket data.
- Created automated pipelines in AWS CodePipeline to deploy Docker containers to AWS ECS using services such as CloudFormation, CodeBuild, CodeDeploy, S3, and Puppet.
- Built and configured a virtual data center in the AWS cloud to support Enterprise Data Warehouse hosting, including a Virtual Private Cloud (VPC), public and private subnets, security groups, route tables, and Elastic Load Balancer.
- Managed Ansible playbooks with Ansible roles; created services in Ansible for automating continuous deployment.
- Used Jenkins as a continuous integration tool: created new jobs, managed required plugins, configured jobs, selected source code management tools, build triggers, build systems, and post-build actions, scheduled automatic builds, and distributed build reports.
Environment: AWS, AWS S3, AWS CodePipeline, Docker containers, AWS ECS, CloudFormation, CodeBuild, CodeDeploy, S3, Puppet.

Role: Build & Release Engineer (Feb 2013 - Feb 2015)
Client: GE Healthcare, India
Responsibilities:
- Automated test, build, and deployment using Jenkins, Maven, Tomcat, and shell scripts for existing proprietary systems.
- Participated in the product release cycle across development, QA, and production environments.
- Set up builds using Chef as a configuration management tool; established Chef best-practice approaches to system deployment with tools such as Vagrant, managing Chef cookbooks as independently version-controlled units of software deployment.
- Involved in developing and building shell scripts.
- Ensured that EHRs met HIPAA standards for protecting ePHI, often involving the use of secure cloud services and encrypted storage solutions.
- Managed bugs and changes into the production environment using the JIRA tracking tool.
- Assisted the end-to-end release process, from planning release content through actual release deployment to production.
- Wrote various SQL and PL/SQL scripts and stored procedures to support applications.
- Deployed application packages to the Apache Tomcat server.
- Coordinated with software development and QA teams.
Environment: Jenkins, Maven, Tomcat, shell scripts, Chef, SQL, PL/SQL, JIRA, Apache Tomcat server.
732-512-0009 Ext 831 | Surya.k@logicplanet.com |