Data Platform Engineer - New York, New York, USA
Email: [email protected]
From: Kajal Katara, Technocraft Solutions LLC <[email protected]>
Reply-To: [email protected]

Hello,

Hope you are doing well. My name is Kajal Katara, and I am an Associate Recruiter at Technocraft Solutions LLC. I am reaching out to you about an exciting job opportunity with one of our clients.

Role: Data Platform Engineer
Location: New York, NY (Onsite)

Pro Tip: Look for someone with a strong financial/investment/hedge fund background.

We are a global team of alternative investment managers passionate about delivering uncommon value to our investors and shareholders. With over 30 years of proven expertise across Private Equity, Credit, and Real Estate, and across regions and industries, we're known for our integrated businesses, our strong investment performance, our value-oriented philosophy, and our people.

We are seeking a hands-on Data Platform Engineer consultant to architect and implement solutions for Data Pipeline Orchestration (ELT, ETL, Enrichment, Storage) and Distribution Platform Management (API, Egress, Data Sharing, Cataloging). The ideal candidate is a well-rounded, hands-on engineer who is passionate about delivery, streamlining data integration tasks, optimization, and solutions that support data applications. The Lead Engineer is expected to understand and be ready to embrace modern data platforms such as Azure ADF, Databricks, Synapse, Snowflake, and Azure API Management, as well as to innovate on ways to streamline the developer experience. Our engineers have the aptitude to complete tasks independently and to work alongside team members focused on a common goal.

Responsibilities:
- Work as part of a team to deliver API platforms.
- Use the API Management platform to design and implement the requirements of the API layer, such as policies covering security, caching, limits, logging, and request and response modifications (a policy sketch follows this list).
- Harness modern application best practices: code quality, API test coverage, Agile development, DevOps, observability, and support.
- Maintain programming standards and ensure use of the pattern/template for API proxies.
- Design and implement data pipeline solutions that improve data engineer productivity for onboarding data while maintaining accuracy and joinability.
- Conduct code reviews and maintain automated test coverage.
- Standardize the CI/CD setup for API management tools and automated deployment.
- Use problem-solving skills to help your peers research and select tools, products, and frameworks that are vital to supporting business initiatives.
- Take responsibility for large data API requests and the security of API consumption.
- Ensure stability of API and APIM performance and maintain SLAs.
- Implement OAuth/Okta integration for communication between API producers and consumers.
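For illustration only (this sketch is not part of the client's posting): a minimal Azure API Management policy of the kind the responsibilities above describe, combining OAuth 2.0 token validation against an Okta authorization server with rate limiting, response caching, and a request-header modification. The Okta domain, audience, call limits, and cache duration below are placeholder assumptions, not values from the role.

    <policies>
      <inbound>
        <base />
        <!-- Security: validate the Okta-issued OAuth 2.0 access token (placeholder Okta domain and audience) -->
        <validate-jwt header-name="Authorization" failed-validation-httpcode="401"
                      failed-validation-error-message="Token missing or invalid">
          <openid-config url="https://example.okta.com/oauth2/default/.well-known/openid-configuration" />
          <audiences>
            <audience>api://default</audience>
          </audiences>
        </validate-jwt>
        <!-- Limits: throttle each subscription to 100 calls per minute (illustrative numbers) -->
        <rate-limit calls="100" renewal-period="60" />
        <!-- Caching: serve repeated requests from the APIM cache -->
        <cache-lookup vary-by-developer="false" vary-by-developer-groups="false" />
        <!-- Request modification: pass a correlation id downstream to support logging -->
        <set-header name="X-Correlation-Id" exists-action="skip">
          <value>@(context.RequestId.ToString())</value>
        </set-header>
      </inbound>
      <backend>
        <base />
      </backend>
      <outbound>
        <base />
        <!-- Cache successful responses for five minutes (illustrative duration) -->
        <cache-store duration="300" />
      </outbound>
      <on-error>
        <base />
      </on-error>
    </policies>

In an actual engagement, the issuer URL, audience, limits, and cache settings would come from the client's Okta tenant and APIM product configuration rather than the values shown here.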
Mandatory Skills Description:

Qualifications & Experience
- 7+ years of proven industry experience; Master's or Bachelor's degree in IT or a related field
- Hands-on development expertise in Java, Python, GraphQL, SQL, JUnit, Spring Boot, OpenAPI, Spark, Flink, Kafka
- Experience working in cloud data platforms such as Azure, Snowflake, Yellowbrick, SingleStore, GBQ
- Understanding of databases, API frameworks, and governance frameworks, with expertise in hosting and managing platforms such as Hadoop, Spark, Flink, Kafka, and Spring Boot; BI tools such as Tableau and Alteryx; and governance tools such as Collibra, Soda, and Amazon Deequ
- Strong understanding of the Twelve-Factor App methodology
- Solid understanding of API and integration design principles and patterns, with experience in web technologies
- Ability to design object-oriented, modularized, clean, and maintainable code and to create policies in Java, JavaScript, Node.js, Python, etc.
- Experience implementing requirements of the API layer (security, throttling, OAuth 2.0, TLS, certificates, Azure Key Vault, caching, logging, request and response modifications, etc.) using an API management platform
- Experience creating custom policies in XML, Java, JavaScript, Node.js, Python, etc. in an API management platform
- Experience with test-driven development and API testing automation
- Demonstrated track record across the full project lifecycle and development, as well as post-implementation support activities
- Significant experience designing, deploying, and supporting production cloud environments such as Azure and Kubernetes
- Experience with Azure DevOps CI/CD tools to build and deploy Java/API packages and to automate API deployment
- Hands-on experience designing and developing high-volume REST APIs using standard API protocols and data formats

Nice-to-Have Skills:

Additional Qualifications
- Financial experience: Public and Alternative Asset Management
- Familiarity with NoSQL/NewSQL databases
- Experience working with Azure API and database platforms
- Strong documentation capability and adherence to testing and release management standards
- Design, development, modification, and testing of databases that support Data Warehousing and BI business teams
- Familiarity with SDLC methodologies and defect tracking (JIRA, Azure DevOps, ServiceNow, etc.)

Soft Skills:
- An analytical and logical thought process for developing project solutions
- Strong interpersonal and communication skills; works well in a team environment
- Ability to deliver under competing priorities and pressures
- Excellent organizational skills in the areas of code structuring and partitioning, commenting, and documentation for team alignment and modifications

Thanks and Regards,
Kajal Katara
Associate Recruiter
Technocraft Solutions LLC
Email: [email protected]

Keywords: continuous integration, continuous deployment, JavaScript, access management, business intelligence, database, information technology, New York
Wed Jan 25 22:46:00 UTC 2023