
Opening for Sr. Data Engineer (USC & GC only) at Remote, Remote, USA
Email: [email protected]
From:

jai,

spear staffing

[email protected]

Reply to:   [email protected]

Sr. Data Engineer / Architect

Duration: 6 months

Tampa, Florida 33634, US -- Hybrid; onsite once a week, with flexibility. Must be within 1-2 hours' driving distance of Tampa.

Rate: $60/hr on C2C max.

TIME BLOCKS (Days/Times): Teams interview with Derek. Interview slots June 11th, 12th, and 13th, 3pm and after on all days.

INTERVIEW PROCESS: Teams video interview to hire.

Must Have: HubSpot

TOP SKILLS REQUIRED

SIZZLES: Please include 4 bullets highlighting the skill set required. Thank you!

1. Building data warehouses
2. PostgreSQL and MySQL are a must, including native scripting
3. Python experience and ETL development are a must
4. HubSpot is highly preferred, or other relevant API/CRM system experience

CLIENT'S FORMAL JOB DESCRIPTION

Data Warehouse and Reporting Engineer

Position Overview: We are seeking a skilled and motivated Data Warehouse and Reporting Engineer to join our team. The ideal candidate has extensive experience working with databases such as PostgreSQL and MySQL, is proficient in Python scripting, possesses expertise in ETL (Extract, Transform, Load) processes, and is well-versed in designing and maintaining data lakes and data warehouses. This role requires a strong foundation in data architecture, data modeling, and reporting to ensure our data infrastructure supports efficient analytics and reporting operations.

Key Responsibilities:

1. Data Warehouse Design and Management:
   - Collaborate with cross-functional teams to design, develop, and optimize data warehouse solutions tailored to business requirements.
   - Create and maintain data models, schemas, and structures for efficient storage and retrieval of data.
   - Monitor and enhance data warehouse performance and scalability to accommodate growing data volumes.

2. ETL Development:
   - Design and implement ETL processes to extract data from various sources, transform it to meet business needs, and load it into the data warehouse.
   - Develop and maintain ETL pipelines using industry best practices and tools.

3. Data Lake Implementation:
   - Work with the team to design and build scalable data lake solutions for storing raw and processed data.
   - Implement data governance and security measures to ensure data integrity and compliance.

4. Reporting and Analytics:
   - Collaborate with stakeholders to understand reporting requirements and design effective dashboards, visualizations, and reports.
   - Develop reporting solutions that provide actionable insights to drive business decisions.

5. Database Management:
   - Administer and optimize PostgreSQL, MySQL, and other database systems to ensure high availability, performance, and data integrity.
   - Monitor database performance and troubleshoot issues as they arise.

6. Scripting and Automation:
   - Utilize Python scripting to automate data processing tasks, data quality checks, and other routine operations.

Qualifications:

- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- Proven experience (5-7 years) as a Data Warehouse Engineer, ETL Developer, or similar role.
- Strong proficiency in PostgreSQL, MySQL, and other relational databases.
- Expertise in designing and optimizing data warehouse solutions.
- Proficiency in ETL tools and processes.
- Experience with data lake architecture and management.
- Advanced knowledge of Python programming for data manipulation and automation.
- Familiarity with reporting and analytics tools such as Tableau, Power BI, or similar.
- Strong problem-solving skills and the ability to work in a collaborative team environment.
- Excellent communication skills to interact with technical and non-technical stakeholders.

Preferred:

- Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
- Knowledge of data governance, security, and compliance best practices.
- Familiarity with data streaming technologies (e.g., Kafka, Spark Streaming).
- Certification in relevant areas (e.g., AWS Certified Data Analytics, Google Cloud Professional Data Engineer).


Keywords: business intelligence, information technology
Mon Jun 03 20:20:00 UTC 2024


