
Vijay - AWS DevOps Engineer / AWS Cloud Engineer
[email protected]
Location: Jersey City, New Jersey, USA
Relocation: Open anywhere in the USA
Visa: H1B
CAREER HIGHLIGHTS:

Certified AWS DevOps Engineer with around 8 years of extensive IT experience and expertise in Infrastructure Engineering, Cloud Engineering, and Linux Engineering.

Exposed to all aspects of the Software Development Life Cycle (SDLC), including analysis, planning, development, testing, implementation, and post-production analysis of projects, and to methodologies such as Agile, Scrum, and Waterfall.

Extensive experience in Amazon Web Services (AWS) Cloud services such as EC2, VPC, S3, CodeCommit, CodeBuild, CodeDeploy, CodePipeline, IAM, EBS, RDS, ELB, Route53, DynamoDB, Lambda, GuardDuty, Config, Macie, Service Catalog, CloudFormation, Auto Scaling, CloudFront, CloudTrail, CloudWatch, Elasticsearch, Elastic File System (EFS), Elastic Beanstalk, EKS, SNS, SQS, SES, SWF, and AWS Direct Connect.

Worked on various AWS security services, including AWS Identity and Access Management (IAM), Amazon Inspector, AWS Key Management Service (KMS), Amazon Macie, AWS Shield, AWS WAF, Amazon GuardDuty, and AWS Security Hub.

Firm grasp on AWS Cloud Security, leveraging Linux, and Windows operating systems, using the AWS console and CLI (command line interface). Worked with application development teams for implementation of best security practices within the SDLC. Performed security reviews on new and legacy applications.

Monitored resources and applications using AWS CloudWatch, including creating alarms on metrics for EBS, EC2, ELB, RDS, S3, and SNS, and configured notifications for alarms generated based on defined events.

Experience designing, building, and operating virtualized solutions using private, hybrid, and public cloud technologies; planned and executed DevSecOps processes and regulatory compliance for cloud migration plans. Knowledge of High Availability (HA) and Disaster Recovery (DR) options in AWS.

Experience with migration services such as AWS Server Migration Service (SMS), used to migrate on-premises workloads to AWS more easily and quickly via the rehost (lift-and-shift) methodology, and AWS Database Migration Service (DMS), used to migrate on-premises Sybase data to RDS SQL Server.


Experience migrating a production infrastructure into Amazon Web Services utilizing AWS CloudFormation.

Hands on experience in Architecting Legacy Data Migration projects such as Teradata to AWS Redshift migration and from on-premises to AWS Cloud.

Experience configuring Docker containers per branch and deploying them using Elastic Beanstalk.

Experience designing, installing, and implementing an Ansible configuration management system for managing web applications, environment configuration files, users, mount points, and packages.

Extensively worked on Jenkins and Hudson, installing, configuring, and maintaining them for Continuous Integration (CI) and end-to-end automation of all builds and deployments, including implementing CI/CD for databases using Jenkins. Integrated with DevOps teams to help them transform into DevSecOps teams.

Experience managing UrbanCode Deploy (uDeploy) configuration, administration, upgrades, security, and maintenance of systems and platforms such as web and application tiers. Hands-on experience in deployment automation using Shell/Ruby scripting.

Experience setting up baselines, branching, merging, and automation processes using Shell, Ruby, and PowerShell scripts. Experience using version control tools SVN, Git, GitHub, and Bitbucket.

Expertise in application builds, deployment, smoke testing and release promotion for complex applications and infrastructure. Performed several types of testing like smoke, functional, system integration, white box, black box, gray box, positive, negative and regression testing.

EDUCATION:

Master's in Information Assurance, Wilmington University (GPA 3.5)

Master's in Computer Science, Stratford University (GPA 3.65)

Certifications:

AWS Certified DevOps Engineer - Professional (Amazon Web Services)

TECHNICAL SKILLS:

Operating Systems: Windows, macOS, RHEL 4/5/6/7, Ubuntu, CentOS

Versioning Tools: Subversion, GitHub, GitLab, Bitbucket

CI Tools: Jenkins, Bamboo, Hudson

CD Tools: IBM UrbanCode Deploy (uDeploy)

Build Tools: Ant, Maven, Gradle

Bug Tracking Tools: JIRA, ServiceNow

Programming Languages: Python, C, C++

Scripting Languages: Shell, Python, JavaScript, Bash, Ruby

Web/Application Servers: Apache Tomcat, JBoss, Nginx

Databases: MySQL, MongoDB, NoSQL

Monitoring Tools: Amazon CloudWatch, CloudTrail, Nagios, Splunk, Nexus, Datadog

Configuration Management Tools: Chef, Ansible

Virtualization Technologies: vSphere, VMware Workstation, Oracle VirtualBox, Hyper-V

Container Tools: Docker, ECS, Kubernetes, EKS

Testing Tools: Selenium, JUnit

Networking/Protocols: TCP/IP, REST APIs, routing protocols, subnets, VPN, Route53

Repositories: Nexus, Git, Artifactory

Cloud Technology: AWS (EC2, ELB, VPC, RDS, IAM, CloudFormation, S3, CloudWatch, CloudTrail, Lambda, Service Catalog, Config, EFS, X-Ray, ECS, EKS, Step Functions, SNS, SQS, DynamoDB)

WORK EXPERIENCE:

AWS CLOUD ENGINEER

MACQUARIE GROUP, PHILADELPHIA, PA | SEP 2020 - PRESENT

Responsibilities:

Set up a new baseline multi-account AWS environment that is secure and well-architected per the organization's security standards, using AWS Service Catalog through AWS Control Tower, and supported the entire organization in setting up access and making sure everything they need is up and running.

Built and supported multiple services such as AWS Organizations, AWS Identity and Access Management, AWS Config, AWS CloudTrail, and AWS Service Catalog using AWS Control Tower.

Implemented and supported a CI/CD pipeline in Dev, UAT, and Pre-Prod to deploy applications on ECS using AWS deployment tools: when a code check-in happens in CodeCommit, CodeBuild automatically triggers and creates or updates an existing CloudFormation stack; CloudFormation then creates/updates the AWS resources required to run our application.

Supported end-to-end AWS automation: creating accounts and configuring roles, policies, users, EC2 launches, Service Catalog, security groups, network ACLs, S3, API creation, and CloudFront.

Implemented AWS Step Functions to automate and orchestrate Amazon SageMaker-related tasks such as publishing data to S3, training the ML model, and deploying it for prediction.

Experience in DevSecOps strategy for security auditing and continuous monitoring of the entire infrastructure; applied Web Application Firewall (WAF) rules for blocking attacks such as SQL injection via pattern matching.

Played a pivotal role in implementing the DevSecOps model across the organization and platform, and implemented vulnerability checks at different phases.

Built CloudFormation templates for SNS, SQS, Elasticsearch, DynamoDB, Lambda, EC2, VPC, RDS, S3, IAM, and CloudWatch service implementations, and integrated them with Service Catalog.
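A template of the kind described above can be sketched directly in Python as the JSON document CloudFormation accepts; the resource names here (AlertTopic, AlertQueue) are illustrative, not taken from any actual project:

```python
import json

def build_sns_sqs_template():
    """Build a minimal CloudFormation template (as a dict) wiring an
    SNS topic to an SQS queue via a subscription. Names are illustrative."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "AlertTopic": {"Type": "AWS::SNS::Topic"},
            "AlertQueue": {"Type": "AWS::SQS::Queue"},
            "TopicToQueue": {
                "Type": "AWS::SNS::Subscription",
                "Properties": {
                    "Protocol": "sqs",
                    "TopicArn": {"Ref": "AlertTopic"},
                    # Fn::GetAtt resolves the queue's ARN at stack creation time
                    "Endpoint": {"Fn::GetAtt": ["AlertQueue", "Arn"]},
                },
            },
        },
        "Outputs": {
            "TopicArn": {"Value": {"Ref": "AlertTopic"}},
        },
    }

# Serialized form, ready to hand to CloudFormation or Service Catalog
template_json = json.dumps(build_sns_sqs_template(), indent=2)
```

In practice a template like this would be registered as a Service Catalog product so teams can launch it without direct CloudFormation access.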

Implemented a Config aggregator to enhance compliance across accounts and centralize management.

Built an AWS Lambda function in Python to monitor the creation of new resources that do not meet organization standards, and enabled notifications so appropriate action could be taken.
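The core of such a monitoring function is a compliance check; a minimal sketch follows, where the required tag keys and the event shape are illustrative assumptions rather than the actual organization standard:

```python
# Required tag keys are an illustrative assumption, not a real standard.
REQUIRED_TAGS = {"Owner", "CostCenter", "Environment"}

def missing_tags(resource_tags):
    """Return the set of required tag keys absent from a resource's tags."""
    present = {t["Key"] for t in resource_tags}
    return REQUIRED_TAGS - present

def handler(event, context=None):
    """Lambda-style handler: inspect tags carried on the event and report
    non-compliance. The SNS publish step (boto3 sns.publish) is omitted so
    the sketch runs without AWS credentials."""
    gaps = missing_tags(event.get("tags", []))
    return {"compliant": not gaps, "missing": sorted(gaps)}
```

In a real deployment the event would come from a CloudTrail/EventBridge rule on resource-creation API calls, and a non-empty `missing` list would trigger the SNS notification.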

Created reusable TypeScript components and services to consume REST APIs using a component-based architecture in Angular.

Worked on newer Angular features such as the new if-else syntax, ng-templates, and form validators.

Experienced in creating multiple VPCs and public and private subnets as required, and distributing them as groups into the VPC's various Availability Zones.

Developed unit and functional tests in Python; managed the code migration from TFS, CVS, and StarTeam to a Bitbucket repository.

Created NAT gateways and instances to allow communication from the private instances to the internet through bastion hosts.

Used security groups, network ACLs, internet gateways, and route tables to ensure a secure zone for the organization in the AWS public cloud. Integrated ServiceNow with Splunk to generate incidents from Splunk. Administered Apache HTTP Server 2.0.

Consulted with stakeholders to gather and document requirements for data governance projects, helping to establish agreed upon data definitions and consistent data capture across the company.

Critically evaluated information gathered from multiple sources and worked with customers to assess whether data conformed to data governance approved mappings and standards.

Implemented a serverless architecture using API Gateway, Lambda, and DynamoDB, and deployed AWS Lambda code from Amazon S3 buckets. Created a Lambda deployment function and configured it to receive events from an S3 bucket.
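A handler in such an API Gateway / Lambda / DynamoDB setup can be sketched as below; the table name, attribute names, and event shape are illustrative assumptions, and the actual `dynamodb.put_item` call is stubbed so the sketch runs without AWS credentials:

```python
import json

def build_put_item(event, table_name="ItemsTable"):
    """Translate an API Gateway proxy event body into the parameters a
    boto3 dynamodb.put_item call would take. Names are illustrative."""
    body = json.loads(event["body"])
    return {
        "TableName": table_name,
        "Item": {
            "id": {"S": body["id"]},                          # partition key
            "payload": {"S": json.dumps(body.get("payload", {}))},
        },
    }

def handler(event, context=None):
    """Lambda proxy-integration handler: parse the request, build the
    DynamoDB write, and return an API Gateway-shaped response."""
    params = build_put_item(event)
    # dynamodb.put_item(**params) would go here in a real function
    return {"statusCode": 201,
            "body": json.dumps({"id": params["Item"]["id"]["S"]})}
```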

Implemented and supported a centralized logging system using Logstash configured as an ELK stack (Elasticsearch, Logstash, and Kibana) to monitor system logs, AWS CloudWatch, VPC Flow Logs, CloudTrail events, changes in S3, etc.

Wrote one-click deployments using the Serverless (SLS) framework and AWS CDK (backed by CloudFormation, implemented with the Python SDK).

Implemented AWS cost budget notifications per environment using Python with Lambda functions and the SNS notification service.

Deployed new Splunk systems and monitored Splunk internal logs from the Monitoring Console (MC) to identify and troubleshoot existing or potential issues.

Experienced with event-driven and scheduled AWS Lambda functions to trigger various AWS resources.

Wrote Ansible playbooks with Python SSH as the wrapper to manage configurations of AWS nodes, and tested playbooks on AWS instances using Python. Experienced with Ansible Tower to more easily manage enterprise Ansible deployments.

Configured the pipeline to periodically check the repository and start when changes are detected.

Configured the repository to generate an Amazon CloudWatch Events event upon changes.

Configured the pipeline to start in response to the event. Created and configured Elastic Load Balancers and Auto Scaling groups to distribute traffic and provide a cost-efficient, fault-tolerant, and highly available environment.

Wrote Ansible playbooks to launch AWS instances and used Ansible to manage web applications, configuration files, mount points, and packages.

Attached an AWS IAM policy to the developer IAM group that denies pushing commits, merging pull requests, and adding files to the master branch.
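A branch-protection policy of this kind can be sketched as a Python function that builds the IAM policy document; the repository ARN is supplied by the caller, and the actions and condition key shown are the standard CodeCommit ones for these operations:

```python
def branch_protection_policy(repo_arn, branch="master"):
    """Build an IAM policy document (as a dict) denying pushes, merges,
    and file additions on a given branch of a CodeCommit repository."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Deny",
            "Action": [
                "codecommit:GitPush",
                "codecommit:MergePullRequestByFastForward",
                "codecommit:PutFile",
            ],
            "Resource": repo_arn,
            # codecommit:References is the condition key CodeCommit exposes
            # for restricting actions to specific branch refs
            "Condition": {
                "StringEqualsIfExists": {
                    "codecommit:References": [f"refs/heads/{branch}"]
                }
            },
        }],
    }
```

The resulting dict can be serialized with `json.dumps` and attached to the developer group via IAM's `put_group_policy`.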

Attached a resource policy to the CodeCommit repository that denies members of the developer IAM group the actions of pushing commits, merging pull requests, and adding files to the master branch.

Set up an AWS Lambda function that runs every 15 minutes to check for repository changes and publishes a notification to an Amazon SNS topic.

Developed and maintained Python, shell, and PowerShell scripts for build, release, and automation tasks.

Integrated services like Bitbucket, AWS CodePipeline, and AWS Elastic Beanstalk to create a deployment pipeline.

Created S3 buckets in the AWS environment to store files, some of which serve static content for a web application.

Configured S3 buckets with various lifecycle policies to archive infrequently accessed data to cheaper storage classes as required.
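Such a lifecycle policy can be sketched as the configuration dict boto3's `put_bucket_lifecycle_configuration` accepts; the day thresholds and rule ID here are illustrative assumptions:

```python
def lifecycle_configuration(ia_days=30, glacier_days=90, expire_days=365):
    """Build an S3 LifecycleConfiguration dict: transition objects to
    STANDARD_IA, then Glacier, then expire them. Thresholds are illustrative."""
    return {
        "Rules": [{
            "ID": "archive-infrequent-data",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},   # empty prefix applies to all objects
            "Transitions": [
                {"Days": ia_days, "StorageClass": "STANDARD_IA"},
                {"Days": glacier_days, "StorageClass": "GLACIER"},
            ],
            "Expiration": {"Days": expire_days},
        }],
    }
```

Applied with something like `s3.put_bucket_lifecycle_configuration(Bucket=name, LifecycleConfiguration=lifecycle_configuration())`.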

Configured the application to run in the datacenter using Terraform. Developed and deployed stacks using AWS CloudFormation Templates (CFT) and Terraform.

Experienced with Ansible playbooks for virtual and physical instance provisioning, configuration management, patching, and software deployment; wrote shell scripts to bootstrap instances.

Implemented Ansible to manage all existing servers and automated build/configuration of new servers.

Possess good knowledge of creating and launching EC2 instances using AMIs of Linux, Ubuntu, RHEL, and Windows.

Used Bamboo pipelines to drive all micro services builds out to the Docker registry and then deployed to Kubernetes, Created Pods and managed using Kubernetes.

Designed an ELK system to monitor and search enterprise alerts. Installed, configured, and managed the ELK stack for log management within EC2 / Elastic Load Balancer for Elasticsearch.

Implemented domain name service (DNS) through Route53 to have highly available and scalable applications.

Created EBS volumes for storing application files for use with EC2 instances whenever they are mounted to them.

Experienced in creating RDS instances to serve data through servers for responding to requests.

Wrote templates for AWS infrastructure as code using Terraform to build staging and production environments.

Automated regular tasks using Python code and leveraged Lambda function wherever required.

Implemented Amazon Macie, GuardDuty, centralized CloudTrail, centralized AWS Config, and RedLock integration.

Knowledge of containerization management and setup tools Kubernetes and ECS; deployed Docker containers on AWS ECS.

Implemented WAF & Shield across all AWS environments to keep the environments safe and secure.


Environment/Tools: EC2, Elastic Load Balancing, ECS, EKS, CloudFront, PowerShell, CloudFormation, ElastiCache, CloudWatch, Route53, Redshift, Lambda, DynamoDB, RDS, Terraform, Jira, Ansible, Bash scripts, Bitbucket, Service Catalog, X-Ray, GuardDuty, Config, Macie, VPC, CloudTrail, IAM, RedLock, Organizations, WAF, S3, Elasticsearch, SNS, SQS, and SES.

AWS CLOUD ENGINEER

EMC INS, DES MOINES, IA | FEB 2020 - AUG 2020

Responsibilities:

Maintained and supported a CI/CD pipeline with Bamboo in a Docker container environment, utilizing Docker Swarm and Docker as the runtime environment for the CI/CD system to build, test, and deploy to the DEV, UAT, and PROD environments.

Set up the ETL process by reading DDL statements from S3 buckets, validated the queries on the Athena platform, and loaded data into DynamoDB tables.

Automated the process of keeping track of succeeded and failed Athena queries by writing Python code in a Lambda function.

Experience in writing SAM template to deploy serverless applications on AWS cloud.

Hands-on experience on working with AWS services like Lambda function, Athena, DynamoDB, Step functions, SNS, SQS, S3, IAM etc.

Designed and supported ETL processes in AWS Glue to migrate campaign data from external sources such as S3 (ORC/Parquet/text files) into AWS Redshift.

Created external tables with partitions using Hive, AWS Athena, and Redshift. Scheduled production jobs using Airflow, CloudWatch, AWS Lambda, and AWS Glue.

Developed and supported applications written for AWS S3, Lambda, DynamoDB, SQS, SNS, and the AWS Serverless Application Model (SAM).

Experienced with event-driven and scheduled AWS Lambda functions to trigger various AWS resources.

Integrated SQS and DynamoDB with Step Functions to iterate through a list of messages and update their status in a DynamoDB table.
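A state machine for iterating messages like this can be sketched as an Amazon States Language definition built in Python; the state names, input field (`$.messages`), and concurrency limit are illustrative assumptions:

```python
def map_state_machine(worker_lambda_arn):
    """Build an Amazon States Language definition (as a dict) whose Map
    state iterates over a list of messages, invoking a worker Lambda for
    each item. Names and field paths are illustrative."""
    return {
        "StartAt": "ProcessMessages",
        "States": {
            "ProcessMessages": {
                "Type": "Map",
                "ItemsPath": "$.messages",   # array in the execution input
                "MaxConcurrency": 5,
                "Iterator": {
                    "StartAt": "UpdateStatus",
                    "States": {
                        "UpdateStatus": {
                            "Type": "Task",
                            "Resource": worker_lambda_arn,
                            "End": True,
                        },
                    },
                },
                "End": True,
            },
        },
    }
```

Serialized with `json.dumps`, this definition is what `stepfunctions.create_state_machine` expects; the worker Lambda would do the DynamoDB status update per message.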

Used Go Programming Language (Golang) and Scala in the development of the application.

Implemented different state machines, using most of the available Step Functions states such as Map, Parallel, Choice, and Task.

Involved in Upgrade of Bamboo & Artifactory Server by scheduling backups in S3.

Managed the code repository by maintaining code in Bitbucket, improving branching and code-merge practices to suit the development team's needs.

Experience maintaining Atlassian products such as JIRA, Confluence, and Bamboo. Configured and managed AWS Simple Notification Service (SNS) and Simple Queue Service (SQS).

Installation, configuration, and management of RDBMS and NoSQL tools such as DynamoDB.

Created a Lambda deployment function and configured it to receive events from an S3 bucket.

Configured S3 buckets with various life cycle policies to archive the infrequently accessed data to storage classes based on requirement.

Attached a resource policy to the CodeCommit repository that denies members of the developer IAM group the actions of pushing commits, merging pull requests, and adding files to the master branch.

Integrated services like Bitbucket, AWS Code Pipeline, Bamboo and AWS Elastic Beanstalk to create a deployment pipeline.

Automated regular tasks using Python code and leveraged Lambda function wherever required.

Configured the repository to generate an Amazon CloudWatch Events event upon changes.

Integrated the AWS S3 logs with the Athena service and exported the data in CSV format by leveraging Lambda with Python code.
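The Lambda side of such an export boils down to the parameters passed to boto3's `athena.start_query_execution`; a minimal builder is sketched below (the database name and output location are caller-supplied assumptions, and Athena itself writes results as CSV to the output location):

```python
def athena_query_params(query, database, output_s3_uri):
    """Build the keyword arguments for boto3's
    athena.start_query_execution call."""
    return {
        "QueryString": query,
        "QueryExecutionContext": {"Database": database},
        # Athena writes the result set (CSV) under this S3 prefix
        "ResultConfiguration": {"OutputLocation": output_s3_uri},
    }
```

Used as `athena.start_query_execution(**athena_query_params(sql, db, loc))`, after which the CSV result object can be copied or served from the output prefix.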

Worked with the Data Engineering team to analyze the data in the S3 bucket using AWS Athena and understand data patterns.

Worked with tools like Bamboo to build jobs and place the latest files for deployment in an AWS S3 bucket.

Configured AWS SNS to publish messages that trigger an AWS Lambda function.

Defined and documented best practices and strategies for application deployment and infrastructure maintenance.

Implemented the application's CI/CD pipeline using Bitbucket/Bamboo and Lambda.

Implemented Step Functions to trigger a series of Lambdas executing different functionalities.

Automated application builds and deployments with no human intervention during build or deployment.

Environment/Tools: EC2, Elastic Load Balancing, ECS, EKS, CloudFormation, CloudWatch, Route53, Redshift, Lambda, DynamoDB, Terraform, Jira, PowerShell, Bitbucket, Bamboo, Apache Mesos, X-Ray, VPC, CloudTrail, IAM, S3, Elasticsearch, SNS, and SQS.

AWS CLOUD ENGINEER

JEFFERIES LLC, JERSEY CITY, NJ | FEB 2019 - JAN 2020

Responsibilities:

Set up a new baseline multi-account AWS environment that is secure and well-architected per the organization's security standards, using AWS Service Catalog through AWS Control Tower, and supported the entire organization in setting up access and making sure everything they need is up and running.

Built multiple services such as AWS Organizations, AWS Identity and Access Management, AWS Config, AWS CloudTrail and AWS service catalog using AWS Control Tower.

Implemented a CI/CD pipeline in Dev, UAT, and Pre-Prod to deploy applications on ECS using AWS deployment tools: when a code check-in happens in CodeCommit, CodeBuild automatically triggers and creates or updates an existing CloudFormation stack; CloudFormation then creates/updates the AWS resources required to run our application.

Worked on ECS deployment of an application on a web server hosted in containers, using CloudFormation for Infrastructure as Code.

Worked on AWS Cloud Security services such as IAM (Identity and Access Management), Identity Federation, Certificate Manager, Key Management Service, CloudTrail, SSO, GuardDuty, Secrets Manager, Config, etc.

Implemented Amazon Macie, GuardDuty, centralized CloudTrail, centralized AWS Config, and RedLock integration.

Built an AWS Lambda function in Python to monitor the creation of new resources that do not meet organization standards, and enabled notifications so appropriate action could be taken.

Built CloudFormation templates for SNS, SQS, Elasticsearch, DynamoDB, Lambda, EC2, VPC, RDS, S3, IAM, CloudWatch services implementation and integrated with Service Catalog.

Worked with AWS Secrets Manager to protect secrets and to easily rotate, manage, and retrieve database credentials, API keys, and OAuth tokens. Used CloudWatch to monitor AWS cloud resources and the applications deployed on AWS by creating alarms, enabling notification services, and monitoring stats from all services in AWS solutions.

Created CloudFront distributions to serve content from edge locations to users to minimize the load on the frontend.

Worked on tagging standards for proper identification and ownership of EC2 instances and other AWS services such as CloudFront, CloudWatch, RDS, S3, Route53, SNS, SQS, and CloudTrail.

Created read replicas of AWS RDS in multiple Availability Zones using backups and snapshots, scaling out based on requirements. Worked with networking teams to configure AWS Direct Connect, establishing a dedicated connection between datacenters and the AWS cloud.

Implemented Config-Aggregator to enhance compliance across the accounts and centralized management.

Created functions and assigned roles in AWS Lambda to run Python scripts, and AWS Lambda using Python to perform event-driven processing. Created Lambda jobs and configured Roles using AWS CLI.

Experience migrating databases using AWS Database Migration Service (DMS): homogeneous migrations such as Oracle to Oracle, and heterogeneous migrations between different database platforms such as Oracle to Amazon Aurora and Confidential SQL to MySQL.

Experienced with the event-driven and scheduled AWS Lambda functions to trigger events in a variety of AWS resources using boto3 modules. Implemented the AWS Cost budget notifications environment-wise by using Python with Lambda functions and SNS notification service.

Experienced in creating multiple VPCs and public, private subnets as per requirement and distributed them as groups into various availability zones of the VPC. Created NAT gateways and instances to allow communication from the private instances to the internet through bastion hosts.

Used security groups, network ACLs, internet gateways, and route tables to ensure a secure zone for the organization in AWS public cloud.

Configured S3 buckets with various life cycle policies to archive the infrequently accessed data to storage classes based on the requirement.

Environment/Tools: EC2, Elastic Load Balancing, EKS, CloudFront, CloudFormation, ElastiCache, CloudWatch, Route53, Redshift, Lambda, DynamoDB, Terraform, Jira, Ansible, Bash scripts, Bitbucket, Service Catalog, X-Ray, GuardDuty, Macie, VPC, CloudTrail, IAM, RedLock, Organizations, WAF, S3, Elasticsearch, SNS, SQS, and SES.

AWS CLOUD ENGINEER

EBAY INC., SAN JOSE, CA | FEB 2018 - JAN 2019

Responsibilities:

Configured and deployed GIT repositories with branching, forks, tagging, merge requests and notifications.

Automated weekly releases with Maven scripting, compiling Java code, debugging, and placing builds into the Maven repository.

Experienced in authoring pom.xml files, performing releases with Maven release plugins and managing artifacts in Maven internal repository.

Involved in installing Jenkins on Linux environment and implemented a Master and Slave configuration to run multiple build operations in parallel.

Deployed and monitored microservices using Pivotal Cloud Foundry, and managed domains and routes with Cloud Foundry. Worked with Docker Swarm and deployed Spring Boot applications.

Built enterprise-wide DNS historical data trend-analysis tools in Go (Golang) with a REST API for user queries, deployed as a microservice.

Implemented Continuous Integration and Continuous Deployment using the CI tool Jenkins, enhancing Jenkins shared libraries with Chef, Groovy DSL, shell, and Python scripts.

Created stage timeline in Web UI client using JavaScript and implemented stage deletion in CLI using Golang. Used Go to run the Docker Command Line Interface tools.

Enhanced, fixed functional and performance related bugs in core product written in GO (Golang).

Monitored Splunk dashboards and alerts, and configured scheduled alerts based on internal customer requirements.

Implemented new JIRA workflows for the QA teams and worked on splitting the JIRA and Remedy servers' configuration.

Designed an ELK system to monitor and search enterprise alerts. Installed, configured, and managed the ELK stack for log management within EC2 / Elastic Load Balancer for Elasticsearch.

Used SonarQube for continuous inspection of code quality and to perform automatic reviews of code to detect bugs. Managing AWS infrastructure and automation with CLI and API.

Used Jenkins pipelines to drive all micro services builds out to the Docker registry and then deployed to Kubernetes, Created Pods and managed using Kubernetes.

Building/Maintaining Docker container clusters managed by Kubernetes Linux, Bash, GIT, Docker, on GCP (Google Cloud Platform).

Created a Kubernetes cluster with objects such as Pods, Deployments, Services, and ConfigMaps; created reproducible builds of the Kubernetes applications; managed Kubernetes manifest files and Helm packages; and implemented Kubernetes to deploy, scale, load-balance, and manage Docker containers across multiple namespaced versions.

Worked on microservices for a Continuous Delivery environment using Docker.

Worked on Ansible for configuration management and infrastructure automation. Also created inventory in Ansible for automating continuous deployment and wrote playbooks using YAML scripting.

Installed Docker using Docker Toolbox and worked on creating Docker containers and Docker consoles for managing the application lifecycle.

Set up log analysis, shipping AWS logs to Elasticsearch and Kibana, and managed searches, dashboards, custom mappings, and data automation.

Worked with Terraform key features such as Infrastructure as Code, execution plans, resource graphs, and change automation.

Set up JFrog Artifactory on AWS so that only a single copy of any binary is ever stored on the file system.

Used Minikube to manage local deployments in Kubernetes, created local cluster and deployed application containers.

Created PostgreSQL RDS database instances in an AWS cluster, making use of EC2 and VPC, launched via a CloudFormation template.

Implemented logging solutions with Elasticsearch, Logstash & Kibana.

Used Kubernetes to manage containerized applications using its nodes, Config Maps, selector, Services and deployed application containers as Pods.

Utilized Kubernetes as the runtime environment of the CI/CD system to build, test, and deploy.

Implemented and maintained the monitoring and alerting of corporate servers/storage using AWS CloudWatch, Nagios and New Relic.

Used Nagios for application and hardware resource monitoring and wrote new plugins in Nagios to monitor resources.

Experience with Atlassian JIRA installation, administration and maintenance.

Installed various Jira plugins such as the Jira Client, the Jira Importer plugin, the Jira Charting plugin, the connector for Microsoft Project, Jira Misc Custom Fields, and the Remedy ticket system. Migrated Jira across environments and worked on Jira database dumps.

Developed unit and functional tests in Python and Ruby, and managed the code migration from TFS, CVS, and StarTeam to a Subversion repository.

Developed Python scripts to take backup of EBS volumes using AWS Lambda and CloudWatch.
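The selection logic of such a backup script can be sketched as below; the `Backup=true` tag convention and the snapshot naming scheme are illustrative assumptions, with the actual boto3 `describe_volumes`/`create_snapshot` calls omitted so the sketch runs locally:

```python
from datetime import date

def volumes_to_backup(volumes):
    """Select volumes (dicts shaped like ec2.describe_volumes output)
    carrying an illustrative Backup=true tag."""
    return [v for v in volumes
            if any(t["Key"] == "Backup" and t["Value"] == "true"
                   for t in v.get("Tags", []))]

def snapshot_description(volume_id, on=None):
    """Description string for the create_snapshot call made per volume."""
    on = on or date.today()
    return f"automated-backup-{volume_id}-{on.isoformat()}"
```

In the real function, a CloudWatch (EventBridge) schedule triggers the Lambda, which calls `ec2.create_snapshot(VolumeId=..., Description=snapshot_description(...))` for each selected volume.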

Used IAM to create new accounts, roles and groups which are engaged in enabling Lambda functions for dynamic creation of roles.

Launched Amazon EC2 cloud instances using Amazon Web Services (Linux/Ubuntu) and configured launched instances for specific applications.

Environment/Tools: EC2, ELB, CloudFront, PowerShell, CloudFormation, ElastiCache, CloudWatch, Route53, Lambda, DynamoDB, RDS, Terraform, Jira, Ansible, Bash scripts, Bitbucket, VPC, CloudTrail, IAM, S3, Elasticsearch, SNS, SQS, and SES.

LINUX ENGINEER

GLOBAL LOGIC, HYDERABAD, INDIA | JAN 2014 - DEC 2015

Responsibilities:

Experience in implementing new processes and policies for build process involved in auditing.

Used Chef for implementing Automated Application Deployment.

Assisted developers in integrating their code with the mainline.

Implementing build automation in Jenkins using Bash scripting for daily builds.

Ensuring release to test environments by merging conflict code.

Experience in managing GIT and SVN as source control systems.

Managed Nexus for artifact and dependency management.

Defined the build and automated testing infrastructure by educating the development and QA teams with the tools and processes.

Performed hardware and software installations, upgrades, and maintenance, patch administration, kernel modification/upgrades, file system management, performance and security analysis and network configuration/tuning.

Configured Jenkins to perform builds in the non-production and production environments.

Created and deployed a tool to automate branch and project creation in Subversion using Perl and Chef Scripts.

Hands on experience with Ruby/Rails to deploy production and development stacks.

Wrote automation and deployment scripts.

Used Maven to work with development and QA automation and deployments.

Experience in designing and implementing continuous integration pipeline using Agile methodologies.

Installation and configuration of Solaris 9/10 and Red Hat Enterprise Linux 5/6 systems.

Involved in building servers using jumpstart and kickstart in Solaris and RHEL respectively.

Installation and configuration of RedHat virtual servers using ESXi 4/5 and Solaris servers (LDOMS) using scripts and Ops Center.

Performed package and patches management, firmware upgrades and debugging.

Addition and configuration of SAN disks for LVM on Linux, and Veritas Volume Manager and ZFS on Solaris LDOMs.

Configuration and troubleshooting of NAS mounts on Solaris and Linux Servers.

Configuration and administration of ASM disks for Oracle RAC servers.

Analyzing and reviewing the System performance tuning and Network Configurations.

Managed Logical volumes, Volume Groups, using Logical Volume Manager.

Troubleshooting and analysis of hardware and failures for various Solaris servers (Core dump and log file analysis)

Performed configuration and troubleshooting of services like NFS, FTP, LDAP and Web servers.

Installation and configuration of VxVM, Veritas file system (VxFS).

Management of Veritas Volume Manager (VxVM), Zettabyte File System (ZFS), and Logical Volume Manager (LVM).

Involved in patching Solaris and RedHat servers.

Configured and maintained Network Multipathing in Solaris and Linux.

Configuration of Multipath, EMC power path on Linux, Solaris Servers.

Provided production support and 24/7 support on rotation basis.

Performed a POC on Tableau, including running load tests and measuring system performance with a large amount of data.

Environment: Solaris 9/10/11, RedHat Linux 4/5/6, AIX, Sun Enterprise Servers E5500/E4500, Sun Fire V1280/480/440, Sun SPARC 1000, HP 9000K, L, N class Server, HP & Dell blade servers, IBM RS/6000, VMware ESX Server, LAN/WAN/NOC Administration, Systems Installation, Configuration & Upgrading, PowerShell, Linux Servers, VBScript, LAN/WAN, Oracle Databases, SQL, MYSQL, NOS Patches & Updates.