Need Data Engineer with Trading and Investment exp — Local to PA only, USC/GC, HYBRID, Face to Face Interview at Malvern, Pennsylvania, USA
Email: [email protected]
From: Amandeep Kaur, Techrooted INC [email protected]
Reply to: [email protected]

Hi,

This is Aman, Technical Recruiter at Techrooted. We have an excellent job opportunity for you. If you are interested, please share your updated resume along with your contact details so that we can proceed further.

Data Engineer with Trading and Investment domain experience (OneTick or KDB)
Location: Malvern, PA (local to PA only; 3 days per week onsite; HYBRID)
Visa: USC or GC only
Interview: In-person
A solid LinkedIn profile is a must.

Job Description:
The client needs a very senior-level Data Engineer with experience in OneTick or KDB (platforms used to capture investment data). They use OneTick but are open to KDB as well. They also need the core skills of a strong Data Engineer: very strong Python, SQL (Hive), pipeline building, and PySpark. The hiring manager confirmed that the skills in the job description are accurate and standard for a Sr. Data Engineer.

The team works with our global trading desks to optimize their trading strategies, saving millions of dollars for clients every year. The team works with our trader and portfolio manager partners, across asset classes and both passive and active mandates, to conduct data-driven analyses and build tools that help shape our trading strategy. As an Engineer on the full-stack team, you will help design, implement, and maintain a modern, robust, and scalable platform that enables the TAS team to meet the increasing demands of the various trading desks.

Must haves:
- Proficiency in Python programming
- Strong expertise in SQL, Presto, Hive, and Spark
- Knowledge of trading and investment data
- Experience with big data technologies such as Spark and with developing distributed computing applications using PySpark
- Experience with libraries for data manipulation and analysis, such as Pandas, Polars, and NumPy
- Understanding of data pipelines, ETL processes, and data warehousing concepts
- Strong experience in building and orchestrating data pipelines
- Experience in building APIs
- Write, maintain, and execute automated unit tests using Python
- Follow Test-Driven Development (TDD) practices in all stages of software development
- Extensive experience with key AWS services/components including EMR, Lambda, Glue ETL, Step Functions, S3, ECS, Kinesis, IAM, RDS PostgreSQL, DynamoDB, time-series databases, CloudWatch Events/EventBridge, Athena, SNS, SQS, and VPC
- Proficiency in developing serverless architectures using AWS services
- Experience with both relational and NoSQL databases
- Skills in designing and implementing data models, including normalization, denormalization, and schema design
- Knowledge of data warehousing solutions like Amazon Redshift
- Strong analytical skills with the ability to troubleshoot data issues
- Good understanding of source control, unit testing, test-driven development, and CI/CD
- Experience with OneTick or KDB

Nice to haves:
- Proficiency in data visualization tools and the ability to create visual representations of data, particularly using Tableau

Regards,
Amandeep Kaur
Sr. Technical Recruiter
E: [email protected]
TechRooted Inc.
14 Wall Street, 20th Floor | New York, NY 10005
www.techrooted.com/
Fri Dec 06 00:29:00 UTC 2024