Looking for ELK_AWS Developer ( Need Only H1B_C2C_EX Verizon profiles ) at Irving, Texas, USA |
Email: [email protected] |
From: Suma, Rurisoft Technologies [email protected] | Reply to: [email protected]

ELK/AWS Developer
Location: Irving, TX (Onsite), Only Local Profiles

Responsibilities:
- Shipped Apache logs to a Kinesis stream using the Amazon Kinesis agent, with Kinesis Firehose streaming the data directly to Amazon Elasticsearch.
- Experimented with CloudWatch agents to ship server logs to the CloudWatch service, triggering a Lambda function that streamed them to Elasticsearch.
- Developed build and deployment scripts in Jenkins using Ant and Maven as build tools to promote artifacts between environments; created new jobs and branches through Jenkins.
- Created Python scripts to automate services on AWS.
- Wrote an Ansible script as a POC to install, deploy, and configure the ELK stack in the AWS cloud environment, and built a CI/CD pipeline with multiple Jenkins jobs.
- Configured AWS Identity and Access Management (IAM) groups and users for improved login authentication.
- Extended the pipeline above by fetching server logs from the Kinesis stream through Logstash and streaming them to Amazon Elasticsearch.
- Researched Beats; used Filebeat to ship server logs to Logstash, which acted as the parser, transforming and streaming logs to Amazon ES (correlating all instance, system, and Apache server metrics). Adding Metricbeat to this setup also provides the correlation.
- Created and configured the CI/CD pipeline for deploying ELK services and Lambda functions using Jenkins.
- Handled operations and maintenance support for AWS cloud resources, including launching, maintaining, and troubleshooting EC2 instances, S3 buckets, Virtual Private Clouds (VPC), and Elastic Load Balancers (ELB).
- Created and managed a Docker deployment pipeline for custom application images in the cloud using Jenkins.
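The Logstash parsing step described above can be sketched in Python. This is a minimal, hypothetical stand-in for a grok pattern over the Apache common log format; the regex and function name are illustrative, not part of Logstash or any AWS API:

```python
import re

# Rough equivalent of a grok pattern for the Apache common log format,
# as Logstash would apply before streaming events to Elasticsearch.
APACHE_COMMON = re.compile(
    r'(?P<client>\S+) \S+ (?P<user>\S+) \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) (?P<protocol>[^"]+)" '
    r'(?P<status>\d{3}) (?P<size>\d+|-)'
)

def parse_apache_line(line):
    """Return a structured event dict for one Apache log line, or None."""
    m = APACHE_COMMON.match(line)
    if not m:
        return None
    event = m.groupdict()
    event["status"] = int(event["status"])
    event["size"] = 0 if event["size"] == "-" else int(event["size"])
    return event

event = parse_apache_line(
    '127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] '
    '"GET /index.html HTTP/1.0" 200 2326'
)
```

In the actual pipeline this parsing happens inside Logstash (or a Lambda transform on the Firehose delivery stream); the sketch only shows the shape of the structured event.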
- Presented dynamic dashboards in Kibana using a Python script, with correlation across Apache, system, and instance metrics.
- Used Ansible playbooks to set up a continuous delivery pipeline, including provisioning AWS environments.
- Experimented with td-agent as a shipper; shipped logs to Logstash, which parsed and streamed them to Amazon Elasticsearch with all metrics.
- Started a Filebeat agent locally, integrated with a Kafka queue; fetched the data from Kafka through Logstash, parsed it, and streamed it to Elasticsearch with correlation.

Keywords: continuous integration, continuous deployment, S3, information technology, Texas
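The metric correlation mentioned above (lining up Apache, system, and instance metrics against each other) can be illustrated with a small Python sketch. The `correlate` helper and the event shape are assumptions for illustration only, not an Elastic or AWS API:

```python
from collections import defaultdict

def correlate(streams, bucket_s=60):
    """Group events from several metric streams into shared time buckets
    so apache/system/instance metrics can be compared side by side.
    Each event is a dict carrying an epoch-seconds 'ts' field.
    Hypothetical helper for illustration only."""
    buckets = defaultdict(dict)
    for name, events in streams.items():
        for ev in events:
            bucket = int(ev["ts"]) // bucket_s * bucket_s
            buckets[bucket].setdefault(name, []).append(ev)
    return dict(buckets)

# Example: an apache event and a system event land in the same minute,
# while the instance metric falls into the next bucket.
streams = {
    "apache":   [{"ts": 30, "status": 200}],
    "system":   [{"ts": 45, "cpu": 0.7}],
    "instance": [{"ts": 70, "net_in": 1024}],
}
by_minute = correlate(streams)
```

In Kibana this kind of bucketing is done by date-histogram aggregations over the indexed events; the sketch only shows the underlying idea of aligning heterogeneous metrics on a shared time axis.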
[email protected] View all |
Thu Nov 14 20:33:00 UTC 2024 |