Urgent requirement for AWS DevOps Engineer with Big Data exp at Remote, USA
Email: [email protected]
From: Praveen Kumar, Magicforce [email protected]
Reply to: [email protected]

Job Title: AWS DevOps Engineer with Big Data exp
Location: Remote
Duration: 1+ Year

Job Description:

Mandatory Skills: AWS, Lambda, Python, Apache Spark, DynamoDB, DevOps, Big Data

- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Strong programming skills in languages such as Python or Scala.
- Experience with data engineering tools and technologies, including ETL frameworks.
- Knowledge of cloud computing platforms, especially AWS.
- Familiarity with big data processing frameworks such as Apache Spark.
- Ability to optimize and fine-tune Databricks clusters for performance.
- Excellent problem-solving and troubleshooting skills.
- Strong communication and collaboration skills.

Preferred Qualifications:

- AWS certification in Big Data or a related field.
- Previous experience with data lakes and data warehousing solutions.
- Familiarity with data governance and compliance standards.

Please fill in the skill set matrix below and reply:

Skill              | Years of Experience | Self-rating out of 10
Lambda             |                     |
AWS                |                     |
Python             |                     |
Apache Spark       |                     |
Big Data           |                     |
DynamoDB           |                     |
DevOps             |                     |
Health care domain |                     |
Thu May 09 18:20:00 UTC 2024