Open Requirements at McLean, Virginia, USA
Email: [email protected]
Hello, please find the open requirements below. Please reach me at [email protected].

Position: Java Full Stack Engineer
Location: McLean, VA (Hybrid: Day-1 onsite; 3 days onsite, 2 days remote)

Job Description:
- 9+ years of experience in design and development of applications using Java 8+/J2EE, Spring, Spring Boot, RESTful services, and a UI framework
- 2+ years of experience in design and development of microservices using Spring Boot and REST APIs
- Strong knowledge/experience in an ORM framework (JPA/Hibernate)
- Good knowledge and experience in Docker and Kubernetes
- 2+ years of experience in a UI framework (Angular or ReactJS)
- 1+ years of experience designing and implementing cloud-based solutions using AWS services (EC2, IAM, S3, Lambda, etc.)
- Good knowledge and experience in an RDBMS, preferably PostgreSQL
- Strong experience with the DevOps toolchain (Jenkins, Artifactory, Maven/Gradle, Git/Bitbucket)
- Good knowledge of technical concepts: security, transactions, monitoring, performance

Nice to have:
- Experience with OAuth implementation using Ping Identity
- Familiarity with API management (Apigee) and service mesh (Istio)
- Experience with Elasticsearch, Logstash & Kibana
- Good knowledge and experience with queue-based implementations
- Good knowledge and experience in NoSQL (MongoDB)
- Experience with scripting languages on Unix; Python

Soft skills:
- Fast learner of new technologies and tools
- Works independently, contributing to the success of assigned project(s)
- Participates in discussions with project teams to understand the application design and build process, and helps deploy applications to target environments
- Degree in Computer Science, Engineering, or equivalent
- Preferably AWS certified

Position: Java Lead with Strong Big Data
Location: McLean, VA
(Hybrid: Day-1 onsite; 3 days onsite, 2 days remote)

Job Description (Mandatory):
- 10+ years of experience in solution, design, and development of applications using Java 8+/J2EE, Spring, Spring Boot, microservices, and RESTful services
- Must have strong Big Data experience and a background working with data-heavy systems
- Must be strong in AWS event-based architecture, Kubernetes, and ELK (Elasticsearch, Logstash & Kibana)
- Must have excellent experience designing and implementing cloud-based solutions using AWS services (S3, Lambda, Step Functions, AMQ, SNS, SQS, CloudWatch Events, etc.)
- Must be well experienced in design and development of microservices using Spring Boot, REST APIs, and GraphQL
- Must have solid knowledge and experience in NoSQL (MongoDB)
- Good knowledge and experience with queue-based implementations
- Strong knowledge/experience in an ORM framework (JPA/Hibernate)
- Good knowledge of technical concepts: security, transactions, monitoring, performance
- Should be well versed in TDD/ATDD
- Should have experience with Java, Python, and Spark
- 2+ years of experience designing and implementing cloud-based solutions using AWS services
- Strong experience with the DevOps toolchain (Jenkins, Artifactory, Ansible/Chef/Puppet/Spinnaker, Maven/Gradle, Atlassian tool suite)
- Very good knowledge and experience with non-functional (technical) requirements such as security, transactions, and performance
Nice to have:
- Experience in a UI framework (Angular or ReactJS)
- Experience with OAuth implementation using Ping Identity
- Experience in API management (Apigee) or service mesh (Istio)
- Good knowledge and experience with queue/topic (ActiveMQ) based implementations
- Good knowledge and experience with schedulers and batch jobs
- Experience with scripting languages on Unix
- Preferably AWS certified

Position: Tech Lead (Spark and Python)
Location: McLean, VA (Hybrid: Day-1 onsite; 3 days onsite, 2 days remote)
Duration: long term
Note: Need someone who worked in Java and then moved into Spark (Big Data) and Python programming, and is working at a lead or senior level. Sample resume is attached.

Job Description (Mandatory):
- 10+ years of experience in solution, design, and development of applications using Java 8+/J2EE, Spring, Spring Boot, microservices, and RESTful services, with strong Big Data experience and a background working with data-heavy systems
- Develop, program, and maintain applications using the Apache Spark open-source framework
- Work with different aspects of the Spark ecosystem, including Spark SQL, DataFrames, Datasets, and streaming
- Proven experience as a Spark developer or in a related role
- Strong programming skills in Java, Scala, or Python
- Familiarity with big data processing tools and techniques
- Experience with the Hadoop ecosystem
- Good understanding of distributed systems
- Experience with streaming data platforms
- Must be strong in AWS event-based architecture, Kubernetes, and ELK (Elasticsearch, Logstash & Kibana)
- Must have excellent experience designing and implementing cloud-based solutions using AWS services (S3, Lambda, Step Functions, AMQ, SNS, SQS, CloudWatch Events, etc.)
- Must be well experienced in design and development of microservices using Spring Boot, REST APIs, and GraphQL
- Must have solid knowledge and experience in NoSQL (MongoDB)
- Good knowledge and experience with queue-based implementations
- Strong knowledge/experience in an ORM framework (JPA/Hibernate)
- Good knowledge of technical concepts: security, transactions, monitoring, performance
- Should be well versed in TDD/ATDD
- Should have experience with Java, Python, and Spark
- 2+ years of experience designing and implementing cloud-based solutions using AWS services
- Strong experience with the DevOps toolchain (Jenkins, Artifactory, Ansible/Chef/Puppet/Spinnaker, Maven/Gradle, Atlassian tool suite)
- Very good knowledge and experience with non-functional (technical) requirements such as security, transactions, and performance
- Excellent analytical and problem-solving skills

Nice to have:
- Experience with OAuth implementation using Ping Identity
- Experience in API management (Apigee) or service mesh (Istio)
- Good knowledge and experience with queue/topic (ActiveMQ) based implementations
- Good knowledge and experience with schedulers and batch jobs
- Experience with scripting languages on Unix
- Preferably AWS certified

Position: Data Architect (Sybase & MongoDB mandatory)
Location: McLean, VA (Hybrid: Day-1 onsite; 3 days onsite, 2 days remote)

Job Description:
- 5+ years of experience leading architecture governance processes such as technology selection and architecture review boards, implementing cloud (AWS) solutions, building proofs of concept, and migrating large-scale workloads to the cloud using native solutions
- 5+ years of experience securing enterprise applications on public cloud service providers, including IaaS, PaaS, and SaaS, and performing application security assessments
- 5+ years of experience applying enterprise integration patterns using API gateways and cloud-native services, and architecting for multi-resiliency using enterprise patterns
- 2+ years of experience developing and implementing layered security architecture, designing architectures to remediate vulnerabilities, and securing applications in the cloud
- 7+ years of experience managing stakeholders, including negotiating, influencing, and persuading others
- Excellent collaboration skills, including written and verbal communication
- 5+ years of experience as an enterprise or solution architect with a development background using Python or Java on the cloud
- 5+ years of experience with AWS services such as EC2, ECS Fargate, Lambda, PostgreSQL, MongoDB, SageMaker, Athena, Glue, VPC, CloudFront, etc., and experience using APIs for developing or programming software
- 5+ years of experience designing systems to control, monitor, and manage large, complex, and sometimes geographically dispersed IT infrastructure and applications to optimize IT service delivery

Desired experience (Data Architect Q&A):
Q: What type of data are they working with? A: Loan/mortgage data in Sybase, to be migrated to MongoDB.
Q: Are they working with a specific DBMS? What recommendations are you looking for this person to make? A: After the analysis, the recommendations should cover new/alternative data sources to be utilized by the applications currently using the Sybase database.
Q: How much ETL experience does this person need to have? A: 1-3 years should be good.

Position: Data Analyst (Sybase & MongoDB mandatory)
Location: McLean, VA (Hybrid: Day-1 onsite; 3 days onsite, 2 days remote)

Job Description:
- Analyze the current source of data for key data elements, understand current usage, and identify new/alternative sources for these data elements
- Experience in Sybase and DB2 is needed; knowledge of ECDA is desired
- Identify customer needs and the intended use of requested data in the development of database requirements, and support the planning and engineering of databases
- Maintain comprehensive knowledge of database technologies, complex coding languages, and computer systems
- Collaborate with developers and project managers to understand integration requirements
- Assist in designing and implementing API integrations between different software systems in modeling applications
- Write and maintain API documentation for internal and external stakeholders
- Troubleshoot and debug integration issues, identifying and resolving technical obstacles
- Stay up to date with industry trends and best practices in API integration
- Work respectfully and cooperatively with people of different functional expertise toward a common goal
- Skilled in documentation and database reporting for analysis, data discovery, and decision-making, using relevant software such as Crystal Reports, Excel, or SSRS
- Experienced in analyzing data to identify trends or relationships and to draw conclusions from the data

Data Analyst Q&A:
Q: What type of data are they working with? A: Sybase/MongoDB.
Q: Do they need specific visualization/dashboarding experience? Are you open to candidates with different tools? A: Tableau / Donado.
Q: What functions does this person need to be able to execute within SQL (joins, tables, etc.)? A: The primary focus has to be on data mapping for the existing/new data sources, in addition to the typical SQL operations.
Position: Data Modeler - MongoDB (can be other NoSQL databases as well)
Location: McLean, VA (Hybrid: Day-1 onsite; 3 days onsite, 2 days remote)

Job Description:
- Designs, implements, and documents data architecture and data modeling solutions, including the use of MongoDB and other NoSQL databases
- Develops conceptual, logical, and physical data models; implements RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL)
- Oversees and governs the expansion of existing data architecture and the optimization of data query performance via best practices

Skills needed:
- 10 years of data modeling experience, with 5+ years of hands-on relational, dimensional, and/or analytic experience (using MongoDB and other NoSQL data platform technologies, and ETL and data ingestion protocols)
- Specialized technical knowledge of the MongoDB platform or similar NoSQL technologies
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER/Studio, or others) required
- In-depth knowledge of modeling/architectural patterns, governance methodologies, and potential limitations within MongoDB
- Ability to configure schemas and perform MongoDB data modeling
- Experience in database security management
- Knowledge of MongoDB administration and installation on AWS and Red Hat
- In-depth understanding of MongoDB architecture

Note: We need a data modeler for NoSQL databases (not necessarily MongoDB only; other NoSQL databases are fine). This is not a MongoDB DBA position; it is primarily a data modeler role working with unstructured data (JSON file formats). There is no relational data here, so the ideal candidate's thought process should be aligned with working with Big Data and non-relational NoSQL data.
Fri Nov 03 20:44:00 UTC 2023