Senior Data Engineer with Datadog Exp. at Remote, USA
Email: [email protected]
Title: Senior Data Engineer with Datadog Exp. (Hybrid)
City: Livonia, MI (local with DL)
Duration: 6+ months
Interview: Video
Visa: USC, GC, TN, OPT/EAD, CPT/EAD, EAD/GC, H4/EAD

Top Skills:
1. AWS competence
2. Snowflake - construct data pipelines in Snowflake
3. Datadog for monitoring
4. Will be doing a coding assignment
5. Architect and implementation experience
6. Extensive AWS components experience, e.g., building pipelines in AWS and coding them in Terraform

Work Breakdown (day in the life):
50% finding custom solutions
30% POCs - fix, test, tweak, and make sure it works
20% enabling for production; a specific requirement is to pull all the logs from the data pipelines into Datadog and have a dashboard that shows everything

Hiring Manager Notes:
The hiring manager goes through resumes line by line; one candidate listed "AWS" twelve times, and it was a red flag. She wants someone who thinks outside the box and comes up with solutions (that is why she says "architect"), and who will say "I don't know, but I can learn it" rather than bluff.

Sample Interview Questions:
1. Tell me about the AWS infrastructure you coded in Terraform (e.g., some Lambda functions and some S3 buckets).
2. AWS components: EC2 instances, S3 buckets.
3. Snowflake: efficient ETL for data ingestion.
4. Data quality considerations with Lambda: what problems have you faced using it?
5. Datadog configured as code with Terraform.
6. Measuring cost and cost optimization in AWS.
7. How many ways can you move data from on-prem to Redshift/Snowflake? For example: how many ways can you move data from a SQL Server in your company's data center to Redshift or Snowflake?

In-Office Schedule: Tuesday/Wednesday

Feedback on Candidates They Have Seen So Far: they want someone who can come in and architect, design, build, and implement data pipelines.

Thanks & Regards
Rishu Arora | Sr. Technical Recruiter
D: 215-799-9177 | First Ring Solutions LLC | Philadelphia, PA 19102
Note: Due to a high volume of calls, I may miss your call; email is the better way to reach me.
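Interview prep note: the "how many ways to move data" question above usually expects answers such as AWS DMS, AWS Glue, Snowpipe, a third-party ELT tool, or a bulk export to S3 followed by Snowflake's COPY INTO. As a minimal sketch of that last path (all table, stage, and prefix names below are hypothetical examples, not anything from this posting), here is Python that builds the COPY INTO statement you would run after uploading extracted CSV files to an external S3 stage:

```python
# Sketch of one common answer to "how many ways can you move data
# on-prem to Snowflake": export to CSV -> upload to an S3 external
# stage -> run COPY INTO. Names here are hypothetical placeholders.

def copy_into_sql(table: str, stage: str, prefix: str) -> str:
    """Build a Snowflake COPY INTO statement that loads CSV files
    previously uploaded under `prefix` in the external stage `stage`."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage}/{prefix} "
        "FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '\"' SKIP_HEADER = 1) "
        "ON_ERROR = 'ABORT_STATEMENT'"
    )

# Example: load the 2024-09-03 extract of an orders table.
print(copy_into_sql("SALES.ORDERS", "SALES_STAGE", "orders/2024-09-03/"))
```

The other options (DMS, Glue, Snowpipe) trade this manual orchestration for managed replication or auto-ingest, which is the comparison the interviewer is likely probing for.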
[email protected] View all |
Posted: Tue Sep 03 19:46:00 UTC 2024