
Santosh Gandham - Azure DevOps Engineer
[email protected]
Location: Atkins, Virginia, USA
Relocation: yes
Visa: H1B
Santosh Gandham (Certified Azure DevOps Engineer)
Email: [email protected] Mobile: 972-945-5031
LinkedIn: https://www.linkedin.com/in/santosh-gandham-270947148/
Summary
Around 10 years of experience in cloud and Copado DevOps engineering on AWS and Azure. Recently configured Azure settings for DNS, address blocks, and security policies. Experienced in building out cloud architectures to accommodate new applications and in deploying Java, .NET, and Python applications to the cloud.
Guided the cloud security group, identifying opportunities for improvement and driving those improvements through the enterprise. Collaborated closely with security architects on developing cloud security frameworks for the enterprise, covering usage strategy, monitoring, alerting, reporting, and blocking.
Examined current cloud security practices, identified key risks, and executed programs to address them.
Configured Azure Key Vault and key management policies, and was involved in security product assessments such as Azure Firewall (a brief sketch follows).
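A minimal sketch of the kind of Key Vault setup described above, assuming the Azure CLI; the vault name, resource group, key name, and object ID are hypothetical placeholders.
  # Hypothetical names; adjust to the target subscription and application
  az keyvault create --name contoso-kv --resource-group rg-security --location eastus
  az keyvault key create --vault-name contoso-kv --name app-encryption-key --protection software
  az keyvault set-policy --name contoso-kv --object-id <app-object-id> --secret-permissions get list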

Technical Skills
AWS and Azure experience
Cloud architecture and security
Experience developing with Java.
Copado, Salesforce, GitHub, SQL, Excel
Git, Jenkins, Chef, SVN, JIRA, Rally
Developer Console, Workbench, Sandboxes

Certifications
Copado
Microsoft Azure Security Engineer Associate

Professional Experience

State of CA, Sacramento, CA Mar 21 To Present
Salesforce DevOps Engineer

Responsibilities:
Working on large-scale, complex Salesforce development projects, along with integrating data from other systems and repositories.
Used Copado for deployments end to end, from setup through creating user stories and promoting and deploying them, collaborating closely with team members and the business community. Think through hard problems and work with the team to make solutions a reality.
Served as build and release engineer, responsible for building software from the ground up and delivering it to end users.
Implemented a Continuous Delivery pipeline with Docker, Jenkins, GitHub, and AWS AMIs.
Integrated source control into a CI/CD pipeline so that each approved change is automatically reflected in staging and production databases as required.
Created many Lightning components and server-side controllers to meet business requirements.
Worked on Salesforce.com configuration, such as creating Custom Objects, Custom Fields, Buttons, Links, Record Types, Page Layouts, User Profiles, Workflow Approvals, and Validation Rules.
Good knowledge and hands-on experience in the new Salesforce version, Lightning.
Gained knowledge of code deployments from one sandbox to another environment using the Eclipse IDE and Change Sets.
Implemented Validation Rules, Assignment Rules, Sharing Rules, and Escalation Rules according to the application requirements.
Refreshed and configured sandboxes for different teams.
Used Jenkins to run code builds from Git for Salesforce deployments (a minimal sketch follows at the end of this section).
Executed various manual Salesforce tasks (pre- and post-deploy) to configure the respective Salesforce environments for deployment and to reconfigure them per business needs after deployment.
Reviewed and executed various pre-deployment and post-deployment manual tasks in JIRA, reviewing JIRA continuously to ensure task quality and to assign the right statuses after execution in the various environments.
Provided direction to other members of the business applications team on using Copado and resolving deployment issues.
Set up Copado Change Management to follow release management best practices, controlling user stories and their metadata changes as they move through the environment landscape until they reach production.
Good knowledge of Salesforce governor limits; worked with large data sets and with teams developing Apex code designed to work within those limits.
Environment: Apex Classes, Apex Triggers, Visualforce, SFDC Eclipse plug-ins, Force.com IDE for Eclipse, Apex deployment tools, Data Loader, Copado, Change Sets, Workbench.
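A minimal sketch of the Jenkins-driven Salesforce deployment step referenced above, assuming the Salesforce CLI (sfdx) is available on the build agent; the branch name, org alias, manifest path, and credentials are hypothetical placeholders (Copado promotions handled the governed release path).
  # Run from a Jenkins job workspace; names and credentials are placeholders
  git checkout release/uat && git pull origin release/uat
  sfdx force:auth:jwt:grant --clientid "$SF_CLIENT_ID" --jwtkeyfile server.key --username deploy.user@example.org --setalias uat
  sfdx force:source:deploy --manifest manifest/package.xml --targetusername uat --testlevel RunLocalTests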

EY, Alpharetta, GA Aug 19 To Feb 21
Salesforce DevOps Engineer.

Responsibilities:
Prepared capacity and architecture plan to create the Azure Cloud environment to host migrated IaaS VMs and PaaS role instances for refactored applications and databases.
Experience customizing Salesforce CRM to generate Web-to-Lead and Web-to-Case records.
Worked on Salesforce.com configuration, such as creating Custom Objects, Custom Fields, Buttons, Links, Record Types, Page Layouts, User Profiles, Workflow Approvals, and Validation Rules.
Performed Salesforce.com Administrator activities where I was responsible for Creating Roles, Profiles, Page Layouts, Workflow Alerts and Actions, and Approval Workflows
Examined current cloud security practices, identified key risks, and executed programs to address them.
Configured Azure Key Vault and key management policies, and was involved in security product assessments such as Azure Firewall.
Refreshed and configured sandboxes for different teams.
Used Copado for deployments end to end, from setup through creating user stories, promoting, and deploying, collaborating closely with team members and the business community.
Maintained the release queue, prepared deployment plans, and worked with agile teams in a collaborative approach; developed a deep understanding of the Copado product, including the tool's limitations.
Built and installed servers through Azure Resource Manager (ARM) templates or the Azure Portal (sketched at the end of this section).
Implemented Chef recipes for build deployments on internal data center servers, and reused and modified the same recipes to deploy directly onto Amazon EC2 instances.
Worked on cloud automation using AWS CloudFormation templates.
Conducted business analysis by working with end users to identify system and operational requirements.
Automated various manual Salesforce administrator operations for the bank using Process Builder. Worked with the developer teams to develop Apex code for the same, as part of ongoing research toward automating different processes at the bank.
Good knowledge of Apex Classes, Triggers, Batch Apex, Test Classes, Visualforce pages, Web Services, etc., used to achieve complex business functionality.
Configured Copado user story options, including the checkbox that, when selected, creates a promotion and a deployment for the user story.
Used Jenkins and Bitbucket to perform Salesforce and Mule module deployments.
Configured out of the box reports and dashboards as well as custom reports.
Designed various types of email templates for auto response to customers.
Created custom dashboards for the manager's home page and gave dashboard access to authorized people.
Worked with Copado org credentials; related values are added automatically by Copado based on the selected credential.
Used Jenkins to run code builds from Git for Salesforce deployments.
Created sandboxes and performed all configuration activities on the new sandboxes per the bank's policies.
Introduced to Mule; checked and changed Mule build numbers/versions to ensure the new Mule versions were set in the repo, and undeployed older or duplicate versions to replace them with the new ones.
Created and configured new users and their permissions, assigning different profiles to users as required for them to perform their duties.
Performed end-to-end Salesforce deployments using auto-deploys: ran deletes by pulling delete scripts from the respective branches and executing them in the anonymous window of the Developer Console, and ran upserts through Jenkins by manually setting build parameters to push code updates and new code into the respective Salesforce environments.
Executed various manual Salesforce tasks (pre- and post-deploy) to configure the respective Salesforce environments for deployment and to reconfigure them per business needs after deployment.
Reviewed and executed various pre-deployment and post-deployment manual tasks in JIRA, reviewing JIRA continuously to ensure task quality and to assign the right statuses after execution in the various environments.
Executed Apex scripts in various Salesforce environments to test and complete new functionality developed by the developer teams.
Environment: Git, Jenkins, Chef, SVN, Ansible, Lambda, AWS EC2, AWS S3, RDS, Elastic Beanstalk, AWS ELB, AWS SQS, AWS CloudWatch, Route 53, Ant, Maven, Shell (bash), Nagios, Apache Tomcat Application Server, Docker, Azure services, Azure APIs, GCP
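A minimal sketch of an ARM template deployment like the one referenced above, assuming the Azure CLI; the resource group and template file names are hypothetical placeholders.
  # Hypothetical resource group and template files
  az group create --name rg-app-prod --location eastus2
  az deployment group create --resource-group rg-app-prod --template-file azuredeploy.json --parameters @azuredeploy.parameters.json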

FISERV, Alpharetta, GA Sep 18 To Aug 19
DevOps/AWS Cloud Engineer

Responsibilities:
Utilized AWS services such as EC2, VPC, Auto Scaling, S3, EBS, ELB, CloudFormation templates (CFT), Lambda, IAM, SNS, SQS, DynamoDB, Elastic Beanstalk, and CloudWatch to build highly available, scalable, and self-healing applications.
Performed Auto Scaling, set up Elastic Load Balancing (ELB) and AMIs, and utilized EBS to store persistent data, mitigating failure by using snapshots.
Implemented a Continuous Delivery pipeline with Docker, Jenkins, GitHub, and AWS AMIs.
Virtualized the AWS servers using Docker; created the Dockerfiles and version-controlled them.
Worked on cloud automation using AWS CloudFormation templates.
Created AWS S3 buckets, performed folder management in each bucket, and managed CloudTrail logs and objects within each bucket.
Innovated using infrastructure-as-code tools like Azure Resource Manager to deploy on Azure.
Developed Chef cookbooks, recipes, resources, and run lists; managed Chef client nodes; and uploaded cookbooks to the Chef server from the workstation.
Managed and supported customer environments in the Microsoft Azure cloud.
Created development workflows in the enterprise, including the use of Azure Artifacts as a source repository.
Provided guidance on AWS and GCP best practices to internal customers and external vendors.
Experience in deployment automation and related tooling (Terraform and AWS CloudFormation).
Built, changed, and versioned infrastructure, handling multiple providers with Terraform simultaneously.
Created the AWS VPC network for the installed instances and configured the Security Groups and Elastic IPs accordingly.
Wrote ARM templates and set up automation for resource provisioning using Azure PowerShell.
Used Terraform for server provisioning.
Automated tasks using Ansible playbooks, shell scripting, and Python; provisioned and patched servers regularly using Ansible.
Integrated Chef cookbooks into Jenkins jobs for the CD framework and created roles and environments, using Chef handlers for different auto-kickoff jobs as required.
Worked on log management tools like Logstash and Elasticsearch.
Experienced with setting up, configuring, and maintaining the ELK stack (Elasticsearch, Logstash, and Kibana).
Created Master-Slave configuration using existing Linux machines and EC2 instances to implement multiple parallel builds through a build farm.
Wrote templates for AWS infrastructure as code using Terraform to build staging and production environments (see the sketch at the end of this section).
Maintained cloud infrastructure using a combination of Jenkins, Ansible, and Terraform to automate the CI/CD pipeline in AWS.
Environment: AWS, Ansible, Git, Docker, Kubernetes, Terraform, Python, Java, Shell scripting, JIRA, Jenkins, Maven, Nexus, Apache Tomcat, SonarQube, Bitbucket, Azure Artifacts, Azure portal, GCP
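A minimal sketch of the Terraform workflow referenced above for staging and production environments, assuming remote state and per-environment variable files; the backend config, workspace, and file names are hypothetical placeholders.
  # Hypothetical backend config, workspace, and variable file names
  terraform init -backend-config=backend.hcl
  terraform workspace select staging || terraform workspace new staging
  terraform plan -var-file=staging.tfvars -out=tfplan
  terraform apply tfplan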

ATOS/DISNEY, Orlando, FL Aug 17 To Aug 18
DevOps Cloud Engineer

Responsibilities:
Designed DevOps workflows for multiple applications by orchestrating Test, Build, Release, and Deploy phases through various CI/CD pipelines using Git, Maven, Jenkins, Docker, Chef, Puppet, and CloudFormation tools.
Implemented a CI/CD pipeline with Jenkins, GitHub, Nexus, Maven, and AWS AMIs.
Managed multiple AWS accounts with multiple VPCs for both production and non-prod, where primary objectives included automation, build-out, integration, and cost control.
Created S3 buckets, managed their policies, and utilized S3 and Glacier for storage and backup on AWS.
Implemented Chef to spawn new servers with the right configuration.
Implemented multi-tier application provisioning in Amazon cloud Services, integrating it with Chef.
Installed and configured Chef and written recipes to automate the administrative tasks.
Created Automated Deployment using Chef Cookbooks.
Set up profiles, thresholds, and threshold-based alerts, and created various dashboards in Dynatrace and AppDynamics. Using Application Performance Management in AppDynamics, monitored microservices deployed in elastic infrastructure, spotting thread contention issues and their root causes, and integrated the alerting system with Jira.
Scripted in PowerShell and Python; experienced with systems and IT operations, including monitoring operations. Used a ticketing service to manage tickets as well as to build backend automation.
Wrote Bash scripts to deploy dynamic content to Tomcat web servers and WebSphere application servers (sketched at the end of this section).
Migrated build and test environments into the cloud infrastructure and maintained them.
Worked on AWS services: EC2, IAM, S3, Lambda, CloudWatch, DynamoDB, SNS, Elastic Beanstalk, VPC, ELB, RDS, EBS, Route 53, ECS, and Auto Scaling.
Administered databases using RDS, MySQL and DynamoDB in AWS and executed the DML and DDL scripts.
Monitored the deployment phase using Nagios and Splunk; used Puppet, Chef, and Ansible in the code deployment phase, along with testing and debugging.
Environment: Maven, Jenkins, Nexus, Bash, GIT, JIRA, SonarQube, Apache Tomcat, WebSphere, PowerShell.
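A minimal sketch of the kind of Bash deploy script referenced above for Tomcat; the WAR path, Tomcat home, and service name are hypothetical placeholders.
  #!/usr/bin/env bash
  # Hypothetical WAR path, Tomcat home, and service name
  set -euo pipefail
  WAR=target/app.war
  TOMCAT_HOME=/opt/tomcat
  sudo systemctl stop tomcat
  sudo cp "$WAR" "$TOMCAT_HOME/webapps/app.war"
  sudo systemctl start tomcat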

PANASONIC AVIONICS CO, Lake Forest, IL Feb 16 To Jul 17
DevOps Engineer

Responsibilities:
Worked on Managing the Private Cloud Environment using Chef.
Managed and optimized Continuous Delivery tools like Jenkins.
Installed, configured, and administered the Jenkins Continuous Integration tool.
Developed and implemented Software Release Management strategies for various applications according to the agile process.
Developed build and deployment scripts using Maven as the build tool in Jenkins to move artifacts from one environment to another.
Performed branching, tagging, and release activities in the Git version control tool (a minimal sketch follows at the end of this section).
Automated deployment of builds to different environments using Jenkins.
Built and deployed Java/J2EE applications to a web application server in an Agile continuous integration environment and automated the whole process.
Used Jenkins for Continuous Integration and deployment into WebSphere Application Server.
Used Maven as build tool on Java projects for the development of build artifacts on the source code.
Developed build and deployment processes for Pre-production environments.
Used Subversion as source code repositories.
Developed automation scripting in Python (core) using Puppet to deploy and manage Java applications across Linux servers.
Managed SVN repositories for branching, merging, and tagging.
Environment: Shell Script, Git, Jenkins, Puppet, Artifactory, Linux, Maven, WebSphere, JIRA
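A minimal sketch of the Git branching and tagging release flow referenced above; the branch name and version tag are hypothetical placeholders.
  # Hypothetical release branch and version tag
  git checkout -b release/1.4 develop
  git tag -a v1.4.0 -m "Release 1.4.0"
  git push origin release/1.4 --tags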


Education:
Bachelor of Technology in Computer Science and Engineering 2013
Master's Degree in Computer Science and Engineering 2016