Data Platform Engineer :: Remote (USA)
Email: [email protected]
Please share resumes at [email protected].

Data Platform Engineer (Python, AWS Data Lake, Java, SQL)
Visa: GC, USC
Location: 100% remote; must sit in the EST time zone, no exceptions
Contract to hire: 6-12 month contract, then conversion to full-time

Job Description:
The client is seeking a Data Platform Engineer to join their growing team. You will help build fundamental platform capabilities on their next-generation AWS Data Lakehouse platform, with Python, Java, and SQL as the primary development languages and Terraform as the declarative infrastructure language. Source code is managed in GitLab. The platform includes leading-edge tools such as Databricks, Fivetran, Airflow, and dbt. Specific projects will include creating Databricks workspaces, automating role and access privileges, and building platform monitoring and observability with Datadog and PagerDuty, along with many more key features that will enable rapid growth, performance, and security on the next-generation data platform.

Responsibilities:
- Create data platform features such as enterprise role-based access controls, data anonymization, user onboarding and setup, and workspaces within Databricks.
- Create data ingestion patterns and pipelines, following standards that support self-service analytics users across all bank domains (deposits, loans, checking, etc.), via real-time, near-real-time, and batch processes.
- Be a hands-on contributor who can build data products as needed, from source-to-target ingestion to transformative data models and data marts that provide high-quality, trusted data for self-service consumers.
- Interact with a variety of engineering team members to grow their professional capabilities, development, and understanding of data platform features.
- Promote the adoption and execution of industry best practices for data ingestion and transformation, from source-to-target model development and dashboard analytics to data access, quality, and governance.
- Help establish and document designs, standards, and user guides that help engineering teams consume the data platform features you create.

Required and Preferred Experience:
- 2+ years of hands-on data engineering experience using tools such as Python, SQL, dbt (Data Build Tool), Ascend.io, or Informatica, with source control in GitLab/Bitbucket
- 2+ years of experience with AWS-based cloud engineering technologies such as Redshift, Databricks, Glue, Step Functions, and EC2, including cluster management and configuration
- 1+ years of experience with config-as-code/infrastructure-as-code tools such as Terraform
- Basic understanding of modern data platform capabilities
- Ability to work in a team environment and not take yourself too seriously
- Preferred: experience building/maintaining software and/or engineering platforms and frameworks

Thanks and regards,
Akashika Pandey
Technical Resource Specialist
Ace Technologies Inc
2375 Zanker Road, Suite 250, San Jose, CA 95131
Phone: 1-408-617-7200, Ext. 4286 | Email/Hangouts: [email protected]
Reporting Manager: Manish Sharma | Email: manish@acetechnologies.com | Phone: 1-408-617-7200, Ext. 4298
Escalations: [email protected]
Posted: Fri Aug 02 22:43:00 UTC 2024