Opening for Systems Infrastructure Engineer or Infrastructure DevOps Engineer (USC/GC only) at Remote, USA
Email: [email protected]
From: Jai, Spear Staffing, [email protected]
Reply to: [email protected]
Role: "Systems Infrastructure Engineer" or "Infrastructure DevOps Engineer"
Location: 100% remote (anywhere in the USA)
Work authorization: USC or GC only

They are looking for infrastructure engineer candidates who came up the ranks of Unix/Linux first and then moved to the cloud (AWS or Azure is okay, but they use AWS), with at least 2 years of experience building out ElasticSearch. At least 2 years of building out ElasticSearch is a must.

The company plans to use ElasticSearch to consume all types of unstructured data: every financial news article in the world, videos, emails, social media such as X (Twitter) and YouTube, financial podcasts, etc., and then configure it for AI so their financial researchers can access petabytes of data a day.

Candidates must have at least mid-level Python coding skills, as 25% of the work is coding. Their work experience should include data engineering, so tools like Airflow, Argo Workflows, or similar, plus Databricks, are a HUGE PLUS, as is any Kafka experience for moving data around as fast as possible.

The company, an international fixed income asset manager with its main office in Newport Beach, California, is seeking a 100% remote Systems Infrastructure Engineer or Infrastructure DevOps Engineer with 2 years of building out ElasticSearch. They need ElasticSearch built out from scratch: not just supported or maintained, but an ElasticSearch platform built out in a cloud environment. The role is 100% remote, so the candidate can be anywhere in the USA, working on a team spread across four US locations; one manager is in Minnesota and the other is in Austin, Texas.

Systems Infrastructure Engineer or Infrastructure DevOps Engineer with ElasticSearch

Job Description: We need a Systems Infrastructure Engineer or Infrastructure DevOps Engineer to assist developers in utilizing AI for processing unstructured data. This involves establishing a solution for the acquisition, movement, and ingestion of large volumes of unstructured data on a daily basis, and making this data available for AI processing.

Skills, in priority order (from must-have to nice-to-have), include:
- DevOps that can code: 25% coding, 75% operations
- ElasticSearch and Python
- Kubernetes (K8s), Airflow/Dagster/Airbyte, Argo Workflows, Databricks
- Infrastructure as Code (IaC): Terraform, Cloud Development Kit (CDK)
- Monitoring (Datadog, Splunk, PagerDuty): managing a production system
- Event-based systems: Kafka
- APIs, Layer 3/7 networking (Envoy)
- Application authentication (Okta)
- AI stack: OpenAI, Claude 3, Pinecone, SageMaker, Bedrock

Contact: [email protected]
Keywords: artificial intelligence, information technology, green card
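
For illustration only, and not part of the client's stated requirements: a minimal sketch of the kind of ingestion work described above, assuming the official Elasticsearch Python client. The endpoint, index name, field names, and sample document are hypothetical values made up for this example.

    # Illustrative sketch only: bulk-indexing unstructured documents
    # (articles, transcripts, emails) into Elasticsearch so a downstream
    # AI/search layer can query them. Index and field names are assumptions.
    from datetime import datetime, timezone

    from elasticsearch import Elasticsearch, helpers

    es = Elasticsearch("http://localhost:9200")  # endpoint is an assumption

    INDEX = "unstructured-news"  # hypothetical index name


    def to_actions(docs):
        """Wrap raw documents as bulk-index actions for the helpers.bulk API."""
        for doc in docs:
            yield {
                "_index": INDEX,
                "_source": {
                    "source": doc.get("source", "unknown"),  # e.g. news, podcast, email
                    "title": doc.get("title", ""),
                    "body": doc.get("body", ""),
                    "ingested_at": datetime.now(timezone.utc).isoformat(),
                },
            }


    def ingest(docs):
        """Bulk-index a batch of documents; returns (success_count, errors)."""
        return helpers.bulk(es, to_actions(docs), raise_on_error=False)


    if __name__ == "__main__":
        sample = [{"source": "news", "title": "Example headline", "body": "Example article text."}]
        print(ingest(sample))

In practice, a batch step like this would likely sit behind an orchestrator such as Airflow or Argo Workflows and read from a stream such as Kafka rather than an in-memory list, in line with the stack listed above.
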
Fri Aug 16 00:11:00 UTC 2024