
ELK/AWS Developer at Remote, USA
Email: [email protected]

Location: Irving, TX (Onsite) - Only Local Profiles

Responsibilities:

Started with the Amazon Kinesis Agent to ship Apache logs to a Kinesis stream, from which Kinesis Firehose streamed the data directly to Amazon Elasticsearch.
Also experimented with the CloudWatch agent to ship server logs to CloudWatch, triggering a Lambda function that streamed them to Elasticsearch.
Developed build and deployment scripts using ANT and Maven as build tools in Jenkins to move artifacts from one environment to another, and created new jobs and branches through Jenkins.
Created Python scripts to automate services on AWS.
Wrote an Ansible script as a POC to install, deploy, and configure the ELK stack in the AWS cloud environment, and also built a CI/CD pipeline with multiple Jenkins jobs.
Configured AWS Identity and Access Management (IAM) groups and users for improved login authentication.
In addition to the above pipeline, fetched server logs from the Kinesis stream through Logstash and streamed them to Amazon Elasticsearch.
Finally, researched Beats: used Filebeat to ship server logs to Logstash, which acted as the parser, transforming the logs and streaming them to Amazon ES (with correlation across instance, system, and Apache server metrics).
Created and configured the CI/CD pipeline for deploying ELK services and Lambda functions using Jenkins.
Adding Metricbeat to the above setup provides the same metric correlation.
Handled operations and maintenance support for AWS cloud resources, including launching, maintaining, and troubleshooting EC2 instances, S3 buckets, Virtual Private Clouds (VPC), and Elastic Load Balancers (ELB).
Created and managed a Docker deployment pipeline for custom application images in the cloud using Jenkins.
Presented dynamic dashboards in Kibana, generated via Python scripts, correlating Apache, system, and instance metrics.
Used Ansible playbooks to set up a Continuous Delivery pipeline, including provisioning the AWS environments.
Experimented with td-agent (Fluentd) as a shipper: logs were shipped to Logstash, parsed, and streamed to Amazon Elasticsearch with all the metrics.
Locally ran a Filebeat agent integrated with a Kafka queue; Logstash fetched the data from Kafka, parsed it, and streamed it to Elasticsearch with correlation.
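The "Kinesis stream through Logstash to Amazon Elasticsearch" leg could look like the following pipeline configuration. This is a sketch, assuming the community `logstash-input-kinesis` plugin is installed; the stream name, region, endpoint, and index name are placeholders, not values from the posting:

```
input {
  kinesis {
    kinesis_stream_name => "apache-logs"   # hypothetical stream name
    region              => "us-east-1"     # hypothetical region
  }
}
filter {
  # Parse raw Apache combined-format lines into structured fields.
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  # Use the log's own timestamp as the event time.
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  elasticsearch {
    hosts => ["https://my-domain.us-east-1.es.amazonaws.com:443"]  # hypothetical ES endpoint
    index => "apache-logs-%{+YYYY.MM.dd}"
  }
}
```

The same filter/output sections apply unchanged when the input is swapped to `beats { port => 5044 }` for the Filebeat variant, or to a `kafka` input for the Kafka-queue variant described above.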
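The Ansible POC for installing and configuring ELK components might be sketched as a playbook like this, here shown for Filebeat on Debian/Ubuntu hosts. The group name `elk_nodes` and the 7.x repository version are assumptions for illustration:

```yaml
# Minimal sketch: install and start Filebeat from the Elastic APT repo.
- hosts: elk_nodes
  become: yes
  tasks:
    - name: Add the Elastic package signing key
      apt_key:
        url: https://artifacts.elastic.co/GPG-KEY-elasticsearch
        state: present

    - name: Add the Elastic package repository
      apt_repository:
        repo: "deb https://artifacts.elastic.co/packages/7.x/apt stable main"
        state: present

    - name: Install Filebeat
      apt:
        name: filebeat
        state: present
        update_cache: yes

    - name: Enable and start the Filebeat service
      service:
        name: filebeat
        state: started
        enabled: yes
```

Running such a playbook from a Jenkins job is one straightforward way to get the CI/CD pipeline the bullet points describe.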
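The "Python scripts to automate services on AWS" and the Apache-log shipping steps above can be sketched together: before a record is put on a Kinesis stream (e.g. with boto3's `kinesis.put_record`), each raw Apache access-log line is typically converted to a structured JSON record. A minimal sketch, assuming the common Apache "combined" log format (the function name and sample line are illustrative, not from the posting):

```python
import json
import re

# Regex for the Apache "combined" LogFormat (an assumption; adjust
# the pattern if the servers use a custom LogFormat directive).
APACHE_COMBINED = re.compile(
    r'(?P<client>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<size>\d+|-)'
)

def log_line_to_record(line):
    """Turn one Apache access-log line into a JSON record suitable
    as the Data payload of a Kinesis put_record call.
    Returns None for lines that do not match the expected format."""
    match = APACHE_COMBINED.match(line)
    if match is None:
        return None
    fields = match.groupdict()
    fields["status"] = int(fields["status"])
    # Apache writes "-" when no body was sent; normalize to 0 bytes.
    fields["size"] = 0 if fields["size"] == "-" else int(fields["size"])
    return json.dumps(fields)

# Hypothetical sample line in combined format:
sample = ('203.0.113.7 - - [12/Nov/2024:21:38:00 +0000] '
          '"GET /index.html HTTP/1.1" 200 5316 "-" "curl/8.0"')
record = log_line_to_record(sample)
```

Structuring records on the shipper side keeps the downstream Logstash/Firehose configuration simpler, since fields arrive pre-parsed.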

--

Tue Nov 12 21:38:00 UTC 2024
