
Need :: Java FSD :: Data Analyst with MongoDB :: SDET :: Snowflake + Airflow + Kubernetes at Plano, Texas, USA
Email: [email protected]
Hello Partners,

This is Vinay from Virtual Networx Inc. Please go through the priority requirements below and share suitable resumes with me at [email protected], or call me at 469-209-6237.

If you have already been submitted to Hexaware, please ignore.

Role: Java FSD with AWS

Client location: Plano, TX / Reston, VA

Required Skills: 
AWS (EC2, ECS, S3, CloudWatch, SQS, SNS, AWS Lambda), Spring Boot, Java 8/11, Microservices, and Angular (if full stack)

10+ years of programming experience with Java, J2EE, XML, and Web Services

Experience with Angular 8 development, JavaScript, CSS, and HTML (if full-stack candidate)

Microservices, Spring, Spring Boot, Hibernate, and REST-based web services

Experience with the Agile Scrum software development methodology

Experience developing in distributed application environments.

AWS experience, including AWS CodeBuild, AWS CodeCommit, AWS CodePipeline, AWS Lambda, API Gateway, AWS CLI/YAML/CloudFormation, and serverless deployment

Experience with application integrations (SOAP/REST web services, ESB, JMS, file/data transfers, etc.)
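As a rough illustration of the serverless items listed above (AWS Lambda fronted by API Gateway), a minimal handler might look like the sketch below. This is a generic example, not code from the client; the event fields follow the standard API Gateway proxy-integration shape, and Python is shown only because Lambda supports it alongside Java.

```python
import json

def handler(event, context):
    """Minimal AWS Lambda handler for an API Gateway proxy integration.

    queryStringParameters may be None when no query string is sent,
    so default to an empty dict before reading parameters.
    """
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    # API Gateway expects statusCode/headers/body in the proxy response
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

The same shape applies whether the function is deployed via CloudFormation, the AWS CLI, or a serverless framework; only the packaging differs.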

_________________________________________________________________________________________

Please share your best candidates, with real-time experience on MongoDB/NoSQL.

Data Analyst with MongoDB/NoSQL

Job Location: McLean, VA

Work Model: Hybrid from Day 1 (3 days onsite, 2 days remote)

Client: Hexaware/Freddie Mac

Job Description:

5 to 7 years of experience, with good hands-on experience on the database side

Hands-on experience with data modeling; not the full design, but should understand the tables.

Strong database knowledge: joins, normalization concepts

Work across client engagements, providing expertise in data collection, data analysis, data mapping, data profiling, data mining and data modeling.

Responsible for inspecting, cleansing, transforming, and modeling data and will address issues related to data completeness and quality.

Work directly with our software development team to ensure that we are creating best-in-class solutions to solve our customers' complex data challenges.

Monitor the quality of data and examine complex data to optimize the efficiency and quality of the data being collected.

Support the deployment, monitoring, and maintenance of production use cases.

Skills:

Must be proficient in SQL (DB2), NoSQL (MongoDB queries), and JIRA

Exposure to AWS cloud-based systems, API integrations, and ETLs

Proficient in software or data testing.

Candidates must possess strong communication skills and problem-solving abilities.

Excellent customer skills are a must, as well as a strong aptitude to learn and adapt to new technologies.

Nice to have skills:
Data modeling experience, Data Lakes, Snowflake SQL

Collibra, Business Objects
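The database fundamentals called out above (joins and aggregation over normalized tables) can be sketched with Python's built-in sqlite3 module; the tables, names, and amounts are made up purely for illustration, and the client's actual stack is DB2/MongoDB.

```python
import sqlite3

# Two normalized tables: loans reference borrowers by id,
# so borrower names are stored exactly once.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE borrower (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE loan (
        id INTEGER PRIMARY KEY,
        borrower_id INTEGER REFERENCES borrower(id),
        amount REAL
    );
    INSERT INTO borrower VALUES (1, 'Ann'), (2, 'Bob');
    INSERT INTO loan VALUES (10, 1, 250000.0), (11, 1, 30000.0), (12, 2, 90000.0);
""")

# Inner join plus aggregate: total loan amount per borrower
rows = conn.execute("""
    SELECT b.name, SUM(l.amount)
    FROM borrower b
    JOIN loan l ON l.borrower_id = b.id
    GROUP BY b.name
    ORDER BY b.name
""").fetchall()
print(rows)  # [('Ann', 280000.0), ('Bob', 90000.0)]
```

The same join/group-by reasoning carries over to DB2 SQL, and to MongoDB's aggregation pipeline via $lookup and $group.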

_________________________________________________________________________________________________

Role: SDET | AR-260842

Location: McLean, VA (Day 1 onsite, hybrid)

Experience: 5-7 Years

Job Description:

Creates an overall automation strategy for the application under test

Supports the application team in their continuous delivery efforts

Participates in code review meetings

Supports other testers and automation engineers in approach design, script development, execution, and reporting

Analyzes and communicates test results and defect resolution tasks

Stays up to date with industry trends and best practices in order to apply them to existing projects

Influences stakeholders to adopt automation practices to help make the software development processes more efficient

Mandatory:

6+ years of proven experience; well versed in Java, Selenium, BDD & TDD, Cucumber, Gherkin, and CI/CD tools

Strong knowledge and proven experience in API automation (REST Assured or Karate), covering both front-end and back-end automation

Strong knowledge of databases (DB2, MongoDB, or Snowflake) and experience writing SQL queries

Must have strong communication skills, with the ability to work with all management levels

Good to have:

Preferred experience with file handling in Java

Good knowledge of or experience with Cypress

Preferably certified in AWS.

Degree in Computer Science, Engineering, or equivalent

__________________________________________________________________________________________________

Role: Snowflake + Airflow + Kubernetes

Location: Chicago, IL

Duration: Long Term

Experience: 10+ Years

Responsibilities:

Design, implement, and maintain data pipelines on Snowflake, ensuring scalability, reliability, and performance.

Develop and optimize data ingestion processes from various sources, including Azure Blob Storage, Azure Data Lake, databases, APIs, and streaming data sources.

Implement data transformation workflows using SQL, Python, and Airflow to cleanse, enrich, and aggregate raw data for downstream consumption.

Collaborate with data scientists and analysts to understand data requirements and implement solutions that enable advanced analytics and machine learning.

Design and implement data governance policies and procedures to ensure data quality, security, and compliance with regulatory requirements.

Perform performance tuning and optimization of Snowflake data warehouse, including query optimization, resource management, and partitioning strategies.

Develop monitoring, alerting, and logging solutions to ensure the health and availability of data pipelines and Snowflake infrastructure.

Stay up-to-date with the latest trends and technologies in data engineering, cloud computing, and workflow orchestration, and recommend relevant tools and practices to enhance our data infrastructure.
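The transformation step described above (cleanse, enrich, and aggregate raw data before loading) can be sketched as a plain Python function of the kind an Airflow task would call, for example via PythonOperator, with the result written on to a Snowflake stage. This is an illustrative sketch only; the field names and rules are assumptions, not the client's actual pipeline.

```python
from collections import defaultdict

def cleanse_and_aggregate(raw_rows):
    """Cleanse, enrich, and aggregate raw records for downstream loading.

    raw_rows is a list of dicts as they might arrive from blob storage
    or an API feed. Field names ("region", "amount") are illustrative.
    """
    totals = defaultdict(float)
    for row in raw_rows:
        # Cleanse: drop rows missing the grouping key
        if not row.get("region"):
            continue
        # Cleanse: drop rows whose amount does not parse as a number
        try:
            amount = float(row.get("amount", ""))
        except ValueError:
            continue
        # Enrich + aggregate: normalize the key, sum per region
        totals[row["region"].strip().upper()] += amount
    return dict(totals)

raw = [
    {"region": "il ", "amount": "10.5"},
    {"region": "IL", "amount": "4.5"},
    {"region": "", "amount": "3"},       # dropped: no region
    {"region": "tx", "amount": "oops"},  # dropped: bad amount
]
print(cleanse_and_aggregate(raw))  # {'IL': 15.0}
```

In an Airflow DAG this function would sit between an ingestion task and a Snowflake load task, with scheduling, retries, and alerting handled by the orchestrator rather than the transform code itself.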

Qualifications:

Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience).

Minimum of 8 years of experience working as a Data Engineer, with a focus on cloud-based data platforms.

Strong expertise in Snowflake data warehouse, including experience with Snowflake architecture, SQL, and performance optimization.

Hands-on experience with Azure cloud platform, including Azure Blob Storage, Azure Data Lake, and Azure SQL Database.

Proficiency in workflow orchestration tools such as Apache Airflow, including DAG definition, task scheduling, and error handling.

Experience with data modeling concepts and techniques, including dimensional modeling and data warehousing best practices.

Strong programming skills in SQL and Python, with experience in data manipulation, transformation, and analysis.

Solid understanding of data governance, security, and compliance requirements, particularly in a regulated industry.

Excellent problem-solving skills and the ability to troubleshoot complex issues in data pipelines and infrastructure.

Strong communication skills and the ability to collaborate effectively with cross-functional teams.

Posted: Mon Apr 01 20:45:00 UTC 2024
