
Data Architect || Owings Mills, onsite at least 2 times a week || local candidates only
Email: [email protected]
From: Hanshika, Vyze ([email protected])

Reply to: [email protected]

Job Description -

Adobe Experience Platform, Python, PySpark, Airflow, etc. Local candidates preferred.

Data Architect 

Location: Owings Mills, HYBRID (onsite at least 2 times a week, no exceptions)

Duration: 12 months

Interview: initial interview is a 1-hour panel with 3 team members (technical screening); 2nd round if required

At least 5 years of hands-on experience in the data architecture space
Build frameworks, not just pipelines; act as a liaison between data engineers and architects
PySpark, Python (must be able to code, not just write scripts); at least 2 years with a modern data stack: Spark, Snowflake (Redshift or data lakes also work in lieu of Snowflake), etc.
ETL/ELT: dbt, Airflow (workflow management), data quality with Great Expectations
SQL knowledge that transfers to DB2: writing SQL and implementing transformation logic
Documentation and implementation: architects socialize a solution with the data engineers; this is a hands-on, working data architect role that also presents to and socializes solutions with stakeholders
Database modeling experience
DevOps: Jenkins, GitLab required
Adobe Experience Platform: mostly batch ingestion; must understand the different ingestion patterns (batch, micro-batch, etc.) and the different AEP integration implementations, and in general know what reference ingestions would look like on AWS
Plus: Informatica, SnapLogic, Kafka, newer database platforms such as Postgres
Team size: fewer than 10 people on the architecture team; this resource will work with 1-2 people on data architecture and different implementations; 8-9 engineers will use the reference implementations
Onsite days are Tuesday through Thursday; most of the team comes in on 2 of those days
Needs technical knowledge, good interpersonal skills, coding ability, etc.
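The intake notes distinguish batch from micro-batch ingestion patterns. A minimal, framework-agnostic sketch of the difference (all function names are hypothetical; AEP's actual ingestion APIs are not shown here):

```python
from typing import Iterable, Iterator, List


def batch_ingest(records: List[dict]) -> List[List[dict]]:
    """Batch ingestion: the full dataset lands as one large unit."""
    return [records]


def micro_batch_ingest(records: Iterable[dict], batch_size: int) -> Iterator[List[dict]]:
    """Micro-batch ingestion: the same data arrives in small, frequent chunks."""
    buffer: List[dict] = []
    for record in records:
        buffer.append(record)
        if len(buffer) == batch_size:
            yield buffer
            buffer = []
    if buffer:  # flush the final partial chunk
        yield buffer


events = [{"id": i} for i in range(10)]
full_load = batch_ingest(events)                          # one batch of 10 records
chunks = list(micro_batch_ingest(events, batch_size=4))   # chunks of 4, 4, 2 records
```

The same data flows through both paths; only the delivery granularity differs, which is what drives the choice of AEP integration pattern.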

We are seeking a Tech Data Architect specializing in hands-on data engineering work for our technology & business partners. You will be part of the Distribution & Marketing Solutions Architecture team, steering strategic technology direction, defining target-state architecture and roadmaps, and helping build reference implementations in partnership with our Platform Engineering teams.

Tech Architect Job Responsibilities:

5+ years of experience as a data architect

Experience with Adobe Experience Platform integration patterns

Experience building frameworks and reference implementations

Experience in SQL is a must

Experience coding in Python and PySpark for server-side/data processing

2+ years of experience using a modern data stack (Spark, Snowflake) on cloud platforms (AWS)

Experience building ETL/ELT pipelines for complex data engineering projects (using Airflow, dbt, and Great Expectations would be a plus)

Experience with database modeling and normalization techniques

Experience with DevOps tools such as Git, Jenkins, and GitLab CI
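The SQL requirement above is about implementing transformation logic in the database, not just querying it. A small sketch of that idea, using Python's built-in sqlite3 as a stand-in for DB2 (table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INTEGER, amount_cents INTEGER, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 1250, "complete"), (2, 400, "cancelled"), (3, 2000, "complete")],
)

# Transformation logic expressed in SQL: filter, derive a column, and
# materialize a curated table, as an ELT step might do in the warehouse.
conn.execute(
    """
    CREATE TABLE curated_orders AS
    SELECT order_id,
           amount_cents / 100.0 AS amount_dollars
    FROM raw_orders
    WHERE status = 'complete'
    """
)

total = conn.execute("SELECT SUM(amount_dollars) FROM curated_orders").fetchone()[0]
# total covers the two completed orders: 12.50 + 20.00 = 32.5
```

In practice a tool like dbt would version and orchestrate such SELECT-based transformations, but the core skill is the same: pushing the transformation into SQL rather than application code.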

Skills that would be a plus:

ETL tools (Informatica, SnapLogic, dbt, etc.)

Experience with Snowflake or other cloud data warehousing products

Exposure to workflow management tools such as Airflow

Exposure to messaging platforms such as Kafka

Exposure to NewSQL platforms such as CockroachDB, Postgres, etc.

Keywords: continuous integration, information technology
[email protected]
View all
Tue Feb 13 21:35:00 UTC 2024
