Priorities of Aug 28th — Plano, Texas, USA
Email: [email protected]
Hello Partner,

Position 1: Kubernetes Architect
Location: Plano, TX
Pay Rate: $80/hr. on C2C
Duration: 12+ Months
Note: This is a specific Kubernetes position; please do not submit pure DevOps resources with only Kubernetes knowledge.

Job Description:
- Collaborate closely with senior leadership, Enterprise Architects, and engineering leaders to develop robust cloud-based container solutions.
- Identify and recommend opportunities to develop functional and technical product strategies for the domains, and build actionable short- and long-term product roadmaps by partnering with business product managers, lead systems architects, and more experienced engineers.
- Make decisions and resolve issues regarding strategic products of engineering teams to meet business objectives.
- Oversee engineering and operations to deliver commitments aligned with strategic product priorities.
- Collaborate and consult with agile teams and product managers to drive outcomes.
- Interact directly with third-party vendors and technology service providers.
- Manage allocation of people and financial resources for technology engineering.
- Proactively monitor capacity, performance, and cost metrics to ensure quality and identify opportunities for improvement.
- Explore new technologies and solutions to push our capabilities forward.

Qualifications:
- 15+ years of IT experience; 3+ years of project management.

Technical Skills:
- 15+ years of industry experience and 5+ years of experience designing, building, securing, and managing Kubernetes on the cloud at scale.
- Experience with messaging and data streaming technologies such as Kafka, Event Hubs, etc.
- Experience with different Kubernetes flavors such as Rancher, OpenShift, etc.
- Hands-on, expert knowledge of and experience with Linux, cloud platforms (Azure preferred), and infra automation such as Terraform, Docker Compose, GitOps, and shell scripting.
- Proficient in modern DevOps programming languages such as Python, Golang, and PowerShell.
- Good knowledge of monitoring tools such as Prometheus, Grafana, AppDynamics, Dynatrace, or related tools.
- Hands-on experience with container traffic management tools such as Nginx, Istio, etc.
- Experience working with public cloud, Microsoft Azure preferred.
- Familiarity with Agile best practices.
- Architect-level cloud certification.
- Certifications such as the following are a plus: Certified Kubernetes Application Developer (CKAD), Certified Kubernetes Administrator (CKA), and Certified Kubernetes Security Specialist (CKS).

Non-Technical Skills:
- Excellent analytical, decision-making, and problem-solving skills.
- Must be able to multitask in a fast-paced environment, focusing on timeliness, documentation, and communication with peers and business users alike.
- Ability to communicate verbally and in writing to technical and non-technical audiences of various levels within or outside the organization (executives, regulators, clients, etc.).
- Results-oriented, business-focused, and successful at interfacing across multiple organizational units.

Position 2: Data Engineer
Location: Plano, TX (Onsite)
Duration: Long-term Contract
Rate: $55/hr. on C2C (max)

Roles and Responsibilities:
- 12+ years skilled in Business Intelligence/Data Warehouse, data and technical architecture, methodology, data governance, data modeling, ETL tools, Big Data, and Data Lake.
- Strong expertise applying industry best-practice methods and sound enterprise architecture, data architecture/management, and integration techniques across domains.
- End-to-end responsibility for data modeling (OLTP, OLAP).
- Data analysis experience writing complex queries, unions, joins, and aggregations.
- Data analysis experience using major RDBMS and warehouse solutions such as Snowflake, Redshift, etc.
- Experience with ELT/ETL processes using Glue and Glue DataBrew transformations.
- Use Airflow to schedule and time data transfers.
- Understanding of deep JSON structures and partitioning using Kafka, Spark, Scala, S3, etc.
- Work with the Data Science team to build segmentations, ML use cases, forecasting, etc.
- Experience working with SageMaker and Jupyter notebooks for deep data analysis.
- Work with BI specialists to design, develop, and enhance connectors to get closer to business use cases.
- Migrate existing custom pipelines to a normalized connector approach.
- Help educate CT teams on data integration and validation standards, and drive clean ingestion/egestion patterns for the platform.

Experience and Qualifications:
- Work with APIs to extract and ingest data.
- Work with virtual warehouses and configure them for optimal performance and efficiency.
- Conduct ETL data integration and cleansing transformations using Glue Spark scripts.
- Work on aggregations of data coming from applications and APIs, storing results in a historic table.
- Experience with streaming data analytics and building streaming pipelines and connectors.
- Experience with connections to BI solutions such as Tableau, including configuration of roles and policies within AWS.
- Leverage Lambda, Glue, and Step Functions to cleanse and transform data.
- Work with DWH technologies such as EC2, S3, Redshift, Athena, and Snowflake to churn large data sets and partition them in readable formats and in real time.

Position 3: Senior Data Architect (AWS)
Location: Plano, TX (Onsite)
Duration: Long-term Contract
Rate: $75/hr. on C2C (max)

Roles and Responsibilities:
- We are seeking a Data Architect with 15+ years of experience and expertise in data platforms to join our team. As a Data Architect, you will also be responsible for developing and architecting data strategies that enable secure and performant use of the data platform for BI solutions, advanced analytics, and the querying engine.
- End-to-end responsibility for data collection, transformation, storage, governance, and consumption.
- Manage and define processes for external and internal data integrations.
- Work closely with software engineers and DevOps teams to ensure APIs, data models, connectors, and microservices are efficient and performant.
- Lead architecture to build self-service data access capabilities and smart APIs that drive data decisions.
- Lead cost-efficiency and cost-optimization efforts to keep data warehouse/virtual warehouse costs low.
- Improve data quality and data validation layers to help the BI solutions highlight rich data.
- Solid understanding of Kafka fundamentals; leverage Kafka Streams to provide near-real-time reporting solutions.
- Collaborate with development teams and product owners to implement data strategies and plans that align with business objectives and performance goals.
- Design and implement virtual warehouses for existing and new use cases.

Experience and Qualifications:
- Hands-on experience developing enterprise data solutions using AWS services.
- Extensive hands-on experience with DWH, API, and Data Lake environments.
- Strong understanding of cloud infrastructure and services such as EMR, Athena, Scala, Databricks, Snowflake, Spark, RDS, Glue, Redshift, Apache Airflow, SQS, and SNS.
- Lead architecture of solutions to build self-service composite tools composed of microservices.
- Excellent communication skills, with the ability to communicate technical concepts and solutions to both technical and non-technical stakeholders and management.
- Experience leading solutions around technology, data, Machine Learning, Data Engineering, data governance and regulatory compliance, and managing deliverables.

Thanks,
Rishi
Alpharetta, GA 30022
Phone: (470) 668-2443
Fax Line: (866) 431-2320
www.cohetech.com
Mon Aug 28 21:50:00 UTC 2023