Python PySpark with AWS -- Wilmington, DE at Wilmington, Delaware, USA
Email: [email protected]
ONLY LOCAL

Job Title: Python PySpark with AWS
Location: Wilmington, DE

Job Description:
- Collaborate with the team to build out features for the data platform and consolidate data assets.
- Build, maintain, and optimize data pipelines built with Spark.
- Advise, consult, and coach other data professionals on standards and practices.
- Work with the team to define company data assets.
- Migrate the CMS data platform into Chase's environment.
- Partner with business analysts and solutions architects to develop technical architectures for strategic enterprise projects and initiatives.
- Build libraries to standardize how we process data.
- Loves to teach and learn, and knows that continuous learning is the cornerstone of every successful engineer.
- Has a solid understanding of AWS tools such as EMR or Glue, including their pros and cons, and can convey that knowledge clearly.
- Implement automation on applicable processes.

Mandatory Skills:
- 5+ years of experience in a data engineering position.
- Proficiency in Python (or similar) and SQL.
- Strong experience building data pipelines with Spark.
- Strong verbal and written communication.
- Strong analytical and problem-solving skills.
- Experience with relational datastores, NoSQL datastores, and cloud object stores.
- Experience building data processing infrastructure in AWS.

Bonus:
- Experience with infrastructure-as-code solutions, preferably Terraform.
- Cloud certification.
- Production experience with ACID-compliant formats such as Hudi, Iceberg, or Delta Lake.
- Familiarity with data observability solutions and data governance frameworks.

Best Regards,
Shivani Garg
Percient Technologies
M: (646) 978-5220
Wed Sep 04 19:51:00 UTC 2024