Senior Scala Developer - New Castle, Delaware (F2F) at Delaware, Ohio, USA |
Email: [email protected] |
From: Pankaj, Stellent IT [email protected] | Reply to: [email protected]

Senior Scala/Java Developer - New Castle, Delaware (F2F)
Phone and F2F interviews | Long Term

Must Have:
- 5+ years of experience in Hadoop/big data technologies
- 3+ years of hands-on experience as a Scala developer (with a previous Java background)

Job Description:
Senior Developer (Scala/Java, 5+ years of experience). Looking for hard-core Scala/Java candidates with a strong academic record, ideally with a good degree in a mathematical or scientific discipline. This role is for a Data Engineering Lead working on the Vanguard Big Data Platform. The team is responsible for the maintenance and development of leading Big Data initiatives and use cases that provide business value.

Job Background/Context:
The Vanguard platform supports operational, real-time, event-based processing and compute applications as well as analytics consumption use cases, including machine/deep learning. The technology team is responsible for building and maintaining a multitude of applications, including:
- Real-time ingestion/stream processing and data distribution via Big Data APIs
- Building out canonical models and data conformance
- Implementing best-in-class data management and data ingestion
- Leveraging new storage engines such as Kudu that enable analytics on fast-changing data
- Leveraging GPU implementations to enable advanced machine learning
- Enhancing self-service capability for data science and ML practitioners

This is a hands-on development role that offers exposure to the full development cycle while working closely with our business and technology stakeholders. Responsibilities include:
- Analysis and development across lines of business, including Payments, Digital Channels, Liquidities, and Trade
- Cross-training and sharing functional and technical knowledge
- Aligning to Engineering Excellence development principles and standards
- Promoting and increasing our development productivity scores for coding
- Fully adhering to and evangelizing a complete Continuous Integration and Continuous Deployment pipeline

Development Value:
This role would open career opportunities for the successful individual to establish their profile in the Data Innovation and Architecture organization. It provides an opportunity in Institutional Client Banking to work closely with the business to deliver value-added data solutions.

Knowledge/Experience:
- 5+ years of experience in Hadoop/big data technologies
- 3+ years of hands-on experience as a Scala developer (with a previous Java background)
- Experience with Spark/Storm/Kafka or equivalent streaming/batch processing and event-based messaging
- Experience with API development and use of JSON/XML/Hypermedia data formats
- Experience with relational and NoSQL database integration and data distribution principles
- Unix/Linux
- Experience with containerization and related technologies (e.g., Docker, Kubernetes)
- Experience with all aspects of DevOps (source control, continuous integration, deployments, etc.)
- Comprehensive knowledge of the principles of software engineering and data analytics
- Advanced knowledge of the Hadoop ecosystem and Big Data technologies
- Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, Hive, Pig, Impala, Spark, Kafka, Kudu, Solr)
- Knowledge of agile (Scrum) development methodology is a plus
- Strong development/automation skills
- Cloudera/Hortonworks/AWS EMR and S3 experience a plus

Qualifications:
- Strong academic record, ideally with a good degree and a mathematical or scientific background
- Strong communication skills
- Self-motivated
- Willingness to learn
- Excellent planning and organizational skills

Pankaj Kumar
IT Technical Recruiter
Email: [email protected]
Gtalk: [email protected]

Keywords: machine learning, S3, information technology |
Tue Feb 13 03:14:00 UTC 2024 |