Sr. Data Engineer/Architect (Hybrid) - Tampa, Florida, USA (Local Needed; USC/GC only)
Email: [email protected]
From: Ankit Upadhyay, Pivotal Technologies <[email protected]>
Reply to: [email protected]

Position: Sr. Data Engineer/Architect
Duration: 6-month contract to hire
Location: Tampa, Florida 33634, US. Hybrid: onsite once a week, with flexibility; candidates must be within a 1-2 hour driving distance of Tampa.
Pay rate: $58/hr on C2C
Sponsorship: Must be able to convert without sponsorship; the client does not offer sponsorship.

Top skills required:
1. Building data warehouses
2. PostgreSQL and MySQL; knowledge of native scripting is a must
3. Python experience and ETL development is a must
4. HubSpot is highly preferred, or other relevant API/CRM system experience

Interview process: Teams video interview to hire, with Derek. Time blocks: June 11th, June 12th, and June 13th, 3pm and after on all days.

Client's formal job description:

Data Warehouse and Reporting Engineer

Position Overview:
We are seeking a skilled and motivated Data Warehouse and Reporting Engineer to join our team. The ideal candidate has extensive experience working with databases such as PostgreSQL and MySQL, is proficient in Python scripting, possesses expertise in ETL (Extract, Transform, Load) processes, and is well-versed in designing and maintaining data lakes and data warehouses. This role requires a strong foundation in data architecture, data modeling, and reporting to ensure our data infrastructure supports efficient analytics and reporting operations.

Key Responsibilities:

Data Warehouse Design and Management: Collaborate with cross-functional teams to design, develop, and optimize data warehouse solutions tailored to business requirements. Create and maintain data models, schemas, and structures for efficient storage and retrieval of data.
Monitor and enhance data warehouse performance and scalability to accommodate growing data volumes.

ETL Development: Design and implement ETL processes to extract data from various sources, transform it to meet business needs, and load it into the data warehouse. Develop and maintain ETL pipelines using industry best practices and tools.

Data Lake Implementation: Work with the team to design and build scalable data lake solutions for storing raw and processed data. Implement data governance and security measures to ensure data integrity and compliance.

Reporting and Analytics: Collaborate with stakeholders to understand reporting requirements and design effective dashboards, visualizations, and reports. Develop reporting solutions that provide actionable insights to drive business decisions.

Database Management: Administer and optimize PostgreSQL, MySQL, and other database systems to ensure high availability, performance, and data integrity. Monitor database performance and troubleshoot issues as they arise.

Scripting and Automation: Utilize Python scripting to automate data processing tasks, data quality checks, and other routine operations.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- Proven experience (5-7 years) as a Data Warehouse Engineer, ETL Developer, or in a similar role.
- Strong proficiency in PostgreSQL, MySQL, and other relational databases.
- Expertise in designing and optimizing data warehouse solutions.
- Proficiency in ETL tools and processes.
- Experience with data lake architecture and management.
- Advanced knowledge of Python programming for data manipulation and automation.
- Familiarity with reporting and analytics tools such as Tableau, Power BI, or similar.
- Strong problem-solving skills and the ability to work in a collaborative team environment.
- Excellent communication skills to interact with technical and non-technical stakeholders.
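For candidates unfamiliar with the term, the extract-transform-load cycle named in the responsibilities above can be sketched in a few lines of Python. This is an illustrative example only, not part of the client's posting: SQLite stands in for the PostgreSQL/MySQL systems named in the role, and the table and column names (orders, fact_orders, amount_cents) are hypothetical.

```python
# Minimal ETL sketch: extract rows from an operational store, transform
# them, and load them into a warehouse fact table. SQLite is used here
# purely so the example is self-contained and runnable; the role itself
# targets PostgreSQL and MySQL. All schema names are hypothetical.
import sqlite3


def run_etl(source: sqlite3.Connection, warehouse: sqlite3.Connection) -> int:
    # Extract: pull raw order rows from the source system.
    rows = source.execute(
        "SELECT order_id, amount_cents, customer FROM orders"
    ).fetchall()

    # Transform: convert cents to dollars and normalize customer names.
    transformed = [
        (order_id, amount_cents / 100.0, customer.strip().lower())
        for order_id, amount_cents, customer in rows
    ]

    # Load: upsert into the warehouse fact table.
    warehouse.execute(
        "CREATE TABLE IF NOT EXISTS fact_orders "
        "(order_id INTEGER PRIMARY KEY, amount_usd REAL, customer TEXT)"
    )
    warehouse.executemany(
        "INSERT OR REPLACE INTO fact_orders VALUES (?, ?, ?)", transformed
    )
    warehouse.commit()
    return len(transformed)
```

In a production pipeline the same three stages would typically be orchestrated by a scheduler, with the transform step enforcing data-quality checks before the load, as the job description's "Scripting and Automation" section implies.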
Preferred:
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
- Knowledge of data governance, security, and compliance best practices.
- Familiarity with data streaming technologies (e.g., Kafka, Spark Streaming).
- Certification in relevant areas (e.g., AWS Certified Data Analytics, Google Cloud Professional Data Engineer).

Thanks & Regards,
Ankit Upadhyay
Technical Recruiter
Office: +1 (703) 570-8775 (Ext. 217)
Email: [email protected]
Connect with me: linkedin.com/in/ankit-upadhyay-a689a1232
Fri May 31 05:30:00 UTC 2024