Hiring :: Java FSD :: Data Analyst :: Snowflake with Airflow :: DB2 Database Administrator at McLean, Virginia, USA
Email: [email protected]
Hello Partners, this is Vinay from Virtual Networx Inc. Please go through the priority requirements below and share suitable resumes with [email protected], or call me at 469-209-6237. If you have already been submitted to Hexaware, please ignore.

Title: Java Fullstack Developer
Location: McLean, VA (Hybrid | 3 days onsite, 2 days remote)
Start Date: ASAP
Client: Hexaware/Freddie Mac

Job Description
Mandatory:
- 8+ years of experience in design and development of applications using Java 8+/J2EE, Spring, Spring Boot, RESTful services, and a UI framework
- 2+ years of experience in design and development of microservices using Spring Boot and REST APIs
- Strong knowledge/experience in ORM frameworks - JPA/Hibernate
- Good knowledge and experience in Docker and Kubernetes
- 2+ years of experience in one of the UI frameworks - Angular/ReactJS
- 1+ years of experience in designing and implementing cloud-based solutions using various AWS services (EC2, IAM, S3, Lambda, etc.)
- Good knowledge and experience in any RDBMS/PostgreSQL
- Strong experience with the DevOps tool chain (Jenkins, Artifactory, Maven/Gradle, Git/Bitbucket)
- Good knowledge of technical concepts: security, transactions, monitoring, performance

Nice to have:
- Experience with OAuth implementation using Ping Identity
- Familiarity with API management (Apigee) and service mesh (Istio)
- Experience with Elasticsearch, Logstash, and Kibana
- Good knowledge and experience in any queue-based implementation
- Good knowledge and experience in NoSQL (MongoDB)
- Experience with scripting languages such as Unix shell and Python

Soft Skills:
- Fast learner of new technologies and tools
- Works independently, contributing to the success of assigned project(s)
- Participates in discussions with project teams to understand the application design and build process, and helps deploy applications to target environments
- Degree in Computer Science, Engineering, or equivalent
- Preferably certified in AWS

_________________________________________________________________________________________

Title: Data Analyst with MongoDB/NoSQL
Location: McLean, VA
Work Model: Onsite from Day 1; Hybrid (3 days onsite, 2 days remote)
Client: Hexaware/Freddie Mac

Job Description:
- 5 to 7 years of experience, with good hands-on experience on the database side
- Hands-on experience with data modeling - not full design, but should understand the tables
- Strong database knowledge: joins, normalization concepts
- Work across client engagements, providing expertise in data collection, data analysis, data mapping, data profiling, data mining, and data modeling
- Responsible for inspecting, cleansing, transforming, and modeling data, and for addressing issues related to data completeness and quality
- Work directly with our software development team to ensure that we are creating best-in-class solutions to solve our customers' complex data challenges
- Monitor the quality of data and examine complex data to optimize the efficiency and quality of the data being collected
- Support the deployment, monitoring, and maintenance of production use cases

Skills:
- Must be proficient in SQL (DB2), NoSQL (MongoDB queries), and JIRA
- Exposure to AWS cloud-based systems, API integrations, and ETLs
- Proficient in software or data testing
- Candidates must possess strong communication skills and problem-solving abilities
- Excellent customer skills are a must, as well as a strong aptitude to learn and adapt to new technologies
Nice to have skills:
- Data modeling experience, data lakes, Snowflake SQL, Collibra, Business Objects

_____________________________________________________________________________________________

Role: Snowflake with Airflow | 253160
Location: Chicago, IL (Hybrid, 3 days a week)
Client: The Northern Trust

Responsibilities:
- Design, implement, and maintain data pipelines on Snowflake, ensuring scalability, reliability, and performance
- Develop and optimize data ingestion processes from various sources, including Azure Blob Storage, Azure Data Lake, databases, APIs, and streaming data sources
- Implement data transformation workflows using SQL, Python, and Airflow to cleanse, enrich, and aggregate raw data for downstream consumption
- Collaborate with data scientists and analysts to understand data requirements and implement solutions that enable advanced analytics and machine learning
- Design and implement data governance policies and procedures to ensure data quality, security, and compliance with regulatory requirements
- Perform performance tuning and optimization of the Snowflake data warehouse, including query optimization, resource management, and partitioning strategies
- Develop monitoring, alerting, and logging solutions to ensure the health and availability of data pipelines and Snowflake infrastructure
- Stay up to date with the latest trends and technologies in data engineering, cloud computing, and workflow orchestration, and recommend relevant tools and practices to enhance our data infrastructure

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience)
- Minimum of 8 years of experience working as a Data Engineer, with a focus on cloud-based data platforms
- Strong expertise in the Snowflake data warehouse, including experience with Snowflake architecture, SQL, and performance optimization
- Hands-on experience with the Azure cloud platform, including Azure Blob Storage, Azure Data Lake, and Azure SQL Database
- Proficiency in workflow orchestration tools such as Apache Airflow, including DAG definition, task scheduling, and error handling
- Experience with data modeling concepts and techniques, including dimensional modeling and data warehousing best practices
- Strong programming skills in SQL and Python, with experience in data manipulation, transformation, and analysis
- Solid understanding of data governance, security, and compliance requirements, particularly in a regulated industry
- Excellent problem-solving skills and the ability to troubleshoot complex issues in data pipelines and infrastructure
- Strong communication skills and the ability to collaborate effectively with cross-functional teams

__________________________________________________________________________________________________

Job Title: DB2 Database Administrator
Shift: 24/7 rotational (Shift 1: 8 AM-5 PM, Shift 2: 4 PM-1 AM, Shift 3: 12 AM-9 AM)
Location: Plano, TX (onsite)
Client: Freddie Mac

Job Description:
Skills Needed: DB2, Production Support, Monitoring, L1, L2, L3, DB Joins

Responsibilities:
- Provide DB2 DBA support to the application development team
- Ensure the integrity, availability, and performance of DB2 database systems by providing technical support and maintenance
- Monitor database performance and recommend improvements for operational efficiency
- Assist in capacity planning, space management, and data maintenance activities for the database system
- Perform database enhancements and modifications as per requirements
- Perform database recovery and backup tasks on a daily and weekly basis
- Develop and maintain patches for database environments
- Identify and recommend database techniques to support business needs
- Assist in the implementation of new features and program fixes in databases
- Maintain database security and disaster recovery procedures
- Perform troubleshooting and maintenance of multiple databases
- Resolve any database issues in an accurate and timely fashion
- Monitor databases regularly to check for errors such as existing locks and failed updates
- Oversee utilization of data and log files
- Manage database logins and permissions for users
Posted: Tue Mar 05 21:09:00 UTC 2024