Data Engineer with Talend, Snowflake, Hadoop @ Richmond, Virginia, USA (Onsite)
Email: [email protected]
Need: H1B with passport number.

Hi,

Hope you are doing great. Please go through the requirement below and send your updated resume in Word format if you are interested.

Job Title: Data Engineer with Talend, Snowflake, Hadoop
Location: Richmond, Virginia (Onsite)
Duration: 6+ Months

Qualifications:
1. Strong understanding of data warehousing (dimensional modeling, ETL, etc.) and RDBMS concepts
2. Minimum 5 years of working experience with the Talend ETL tool
3. Minimum 5 years of working experience in SQL, stored procedures, and table design
4. Minimum 5 years of working experience in SQL query optimization and ETL data-loading performance
5. Minimum 2 years of working experience with the Snowflake cloud data warehouse
6. Experience as a Data Engineer on Hadoop platforms with components such as Hive, Kafka, NiFi, Spark, etc. is a big plus
7. Minimum 2 years of working experience in shell scripting
8. Experience with real-time streaming technologies is preferred
9. Experience deploying machine learning models and automating processes in production is a plus
10. Experience with cloud technologies (AWS, Azure, GCP) is a big plus

Responsibilities:
As a core member of a team of Data Engineers/ETL Developers, responsibilities include but are not limited to:
1. Design, develop, and maintain secure, consistent, and reliable ETL solutions supporting critical business processes across the various business units.
2. Ensure data solutions are compliant with enterprise security standards.
3. Work in complex multi-platform environments on multiple project assignments.
4. Develop and perform tests, validate data flows, and prepare ETL processes to meet complex business requirements, including designing and implementing complex end-to-end solutions using BI platforms.
5. Coordinate with analysts and developers to ensure jobs are designed and developed to meet minimum support standards and best practices before migration into the production environment.
6. Partner with various infrastructure teams, application teams, and architects to generate process designs and complex transformations of various data elements to provide the business with insights into their business processes.
7. Use strategies such as indexing and partitioning to fine-tune the data warehouse and big data environments, improving query response time and scalability.
8. Define and capture metadata and rules associated with ETL processes.
9. Assist the production support team in resolving production job failures, data issues, and performance tuning for ETL development and process support; may require weekend/off-business-hours work.

Thanks & Regards,
Karunakar Basireddy
Sr. Technical Recruiter
Email Id: [email protected]