Immediate Requirement: Big Data Architect at Washington, DC, USA
Email: [email protected]
From: Avinash Kumar, ZealHire Inc [email protected]
Reply to: [email protected]
Position: Big Data Architect
Location: Washington, DC (DMV area; candidates must already live near Washington, DC — willing-to-relocate candidates are not accepted)
Duration: Long-Term Contract
Visa Status: USC/GC/GC-EAD only (the description also lists H4-EAD)
Work Arrangement: 100% onsite for the first month, then either fully remote or hybrid (one day per week onsite)

IMPORTANT: Prior federal/government experience is preferred. Candidates must possess all the skills listed under "Qualifications/Skills," including: 16+ years of total experience; experience with Databricks (a MUST); experience with Big Data and Data Architecture, including Databricks or data lakes; experience with Tableau; experience with API/Web Services (REST/SOAP); 3+ years' experience with deployment and management of data science tools and modules such as JupyterHub; and 3+ years' experience with ETL, data processing, and analytics using languages such as Python, Java, or R.

Job Description: Seeking an experienced IT Consultant to support the design, development, implementation, and maintenance of an enterprise Big Data solution as part of our Data Modernization Effort. This role provides expertise to support the development of a Big Data / Data Lake system architecture that supports enterprise data operations, including Internet of Things (IoT) / Smart City projects, the enterprise data warehouse, the open data portal, and data science applications. This is an exciting opportunity to work as part of a collaborative senior data team. The architecture includes Databricks, Microsoft Azure platform tools (including Data Lake and Synapse), Apache platform tools (including Hadoop, Hive, Impala, Spark, Sedona, and Airflow), and data pipeline/ETL development tools (including StreamSets, Apache NiFi, and Azure Data Factory).
Requirements:
- Coordinates IT project management, engineering, maintenance, QA, and risk management.
- Plans, coordinates, and monitors project activities.
- Develops technical applications to support users.
- Develops, implements, maintains, and enforces documented standards and procedures for the design, development, installation, modification, and documentation of assigned systems.
- Provides training for system products and procedures.
- Performs application upgrades.
- Performs monitoring, maintenance, and reporting on real-time databases, real-time network and serial data communications, and real-time graphics and logic applications.
- Troubleshoots problems.

Qualifications/Skills:
- Required: 5+ years' experience implementing Big Data storage and analytics platforms such as Databricks and data lakes.
- Required: 5+ years' knowledge of Big Data and Data Architecture and implementation best practices.
- Required: 5+ years' knowledge of architecture and implementation of networking, security, and storage on cloud platforms such as Microsoft Azure.
- Required: 5+ years' experience with deployment of data tools and storage on cloud platforms such as Microsoft Azure.
- Required: 5+ years' knowledge of data-centric systems for the analysis and visualization of data, such as Tableau, MicroStrategy, ArcGIS, Kibana, and Oracle.
- Required: 10+ years' experience querying structured and unstructured data sources, including SQL and NoSQL databases.
- Required: 5+ years' experience modeling and ingesting data into and between various data systems through the use of data pipelines.
- Required: 5+ years' experience implementing Apache data products such as Spark, Sedona, Airflow, Atlas, NiFi, Hive, and Impala.
- Required: 5+ years' experience with API / Web Services (REST/SOAP).
- Required: 3+ years' experience with complex event processing and real-time streaming data.
- Required: 3+ years' experience with deployment and management of data science tools and modules such as JupyterHub.
- Required: 3+ years' experience with ETL, data processing, and analytics using languages such as Python, Java, or R.
- Required: 16+ years' experience planning, coordinating, and monitoring project activities.
- Required: 16+ years' experience leading projects and ensuring compliance with established standards/procedures.
- Preferred: 3+ years' experience with the Cloudera Data Platform.
- Bachelor's degree in IT or a related field.
Posted: Fri Mar 15 19:20:00 UTC 2024