
Urgent direct-client requirement: Big Data Engineer (Remote, USA)
Email: [email protected]
Hi professionals,

Good morning, and I hope you are all doing well.

Below is the job description for the Big Data Engineer role. If you have any suitable candidates, please share their resumes along with their contact details, current location, and visa status.

Urgent and direct client

Role: Big Data Engineer

Client: Peraton/CDC

Location: Remote (East Coast)

Type: Contract

Rate: Open

Eligibility: US citizens only, 10+ years of experience, on W2 or 1099

Responsibilities for the Role:

Collaborate with CDC and other public health entities to translate workflows into business requirements in support of data and system integration projects

Work with data scientists and data visualization developers to productionize their analytical workloads, visualizations and dashboards

Directly responsible for production pipelines, their orchestration and change management

Evaluate current data and help define development/ETL strategies for moving data from heterogeneous source systems to a data warehouse/data lake, and surface reporting and analytics using MS Power BI or other cloud visualization and analytical tools

Communicate and/or address build, deployment, and operational issues as they arise

Work on workflow optimization and execution improvements

Automate the monitoring of data processes and outputs
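
The monitoring responsibility above could be approached in many ways; the following is a minimal, hypothetical sketch (thresholds and function names are illustrative, not part of the posting) of an automated check on a pipeline's output:

```python
from datetime import datetime, timedelta, timezone

def check_pipeline_output(row_count, last_updated, min_rows=1, max_age_hours=24):
    """Validate one pipeline output: enough rows, and fresh enough.

    Returns a list of alert strings; an empty list means the output is healthy.
    The thresholds (min_rows, max_age_hours) are illustrative assumptions.
    """
    alerts = []
    if row_count < min_rows:
        alerts.append(f"row count {row_count} below minimum {min_rows}")
    age = datetime.now(timezone.utc) - last_updated
    if age > timedelta(hours=max_age_hours):
        alerts.append(f"output is stale: last updated {age} ago")
    return alerts
```

In practice such checks would run on a schedule (e.g., from an orchestrator) and route non-empty alert lists to a notification channel.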

Solve day-to-day customer and production challenges

Interact with, and support, a variety of different teams (engineering, quality, management, etc.)

Collaborate with Engineering and Platform teams to improve automation of workflows, code testing and deployment

Collaborate with Engineering and Platform teams on the latest technologies for data management

Monitor all data update processes and outputs to ensure predictable quality

Communicate with customers to discuss any issues with received data and help them identify and fix data issues

Iterate on best practices to increase the quality and velocity of deployments

Design and implement secure automation solutions for production environments

Strong communication and organizational skills

Experience working in a cross-functional team in a dynamic environment

Ability to work independently and deliver to deadlines

Ability to solve problems with minimal direction

Strong deductive reasoning ability

Great attention to detail and accuracy

Ability to work in a dynamic team environment using Agile methodology.

Basic Qualifications:

The qualified applicant will meet the job requirements listed below and be able to fulfill the responsibilities above.

Bachelor's degree in Engineering, Information Systems, Computer Science, or Information Technology, or 7 years of equivalent experience.

Design and implement scalable data pipelines and data storage on Azure using Data Factory, Service Bus, ADLS, Synapse, and a Spark-based architecture.

Automate the deployment and operation of data pipelines using Azure Data Factory (ADF) and Databricks Spark

Determine data storage structural requirements by analyzing and reviewing objectives, key business metrics and reporting requirements with customer.

Demonstrated expertise working with ETL on SQL, JSON, CSV/TSV, and Parquet data sources in cloud object storage such as S3 or Azure ADLS Gen2, using big data cloud technologies.
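
As one small illustration of the kind of ETL step this bullet describes, here is a hypothetical sketch (standard library only; the function name and sample data are invented, and Parquet/cloud-storage handling is omitted) that normalizes a CSV or TSV extract into JSON records:

```python
import csv
import io
import json

def delimited_to_records(text, delimiter=","):
    """Parse delimited text (CSV or TSV) into a list of dicts,
    the row shape typically staged before loading a lake or warehouse."""
    reader = csv.DictReader(io.StringIO(text), delimiter=delimiter)
    return [dict(row) for row in reader]

# Illustrative sample extract.
raw = "state,cases\nGA,10\nNC,7\n"
records = delimited_to_records(raw)
payload = json.dumps(records)  # JSON form ready for a downstream load step
```

A production version would read from object storage, validate schemas, and write Parquet rather than in-memory JSON, but the transform shape is the same.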

Provide high level expertise in applicable public health disciplines to collect, abstract, query/code, analyze, and interpret scientific data contained within information systems and cloud databases, S3, Azure Blob (ADLS) and other data structures related to public health.

Help build data pipelines leveraging Data Factory and Databricks notebooks, with experience working with Delta Lake (Databricks) and demonstrated knowledge of data flows.

Working knowledge of relational and NoSQL databases: Cosmos DB, SQL Server, Azure Synapse SQL, Postgres, AWS, MongoDB

Knowledge of scalable and low-latency implementations for data products using Spark, Kafka, Elasticsearch, or similar

Knowledge of orchestration and monitoring of pipelines, including their failure, success, and recovery
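
The failure/success/recovery handling mentioned above might look like the following minimal sketch (the function name, retry count, and delay are assumptions for illustration, not from the posting):

```python
import time

def run_with_recovery(task, retries=3, delay_seconds=0.0):
    """Run a pipeline task, retrying on failure and reporting the outcome.

    `task` is any zero-argument callable; retries/delay_seconds are
    illustrative defaults. Returns (status, attempts) where status is
    "success" or "failed".
    """
    for attempt in range(1, retries + 1):
        try:
            task()
            return ("success", attempt)
        except Exception:
            if attempt < retries:
                time.sleep(delay_seconds)  # pause before the recovery attempt
    return ("failed", retries)
```

Real orchestrators (ADF, Airflow, Databricks Workflows) provide this behavior declaratively, but the underlying retry loop is the same idea.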

Implement Comprehensive Testing and Continuous Integration frameworks for schema, data, and functional processes/pipelines.

Provide recommendations on opportunities for leveraging new data sources, data reporting capabilities, and integration of systems.

Provides meaningful knowledge transfer of design decisions, component composition and technical solutions to the program staff.

Consult with CDC scientists and epidemiologists on the algorithms needed to support research.

Advanced SQL knowledge and experience working with relational and non-relational databases, as well as designing Tables, Schemas, or Collections

Familiarity with relational and data warehouse optimizations such as materialized views, indexing, and query optimization.
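
To make the indexing/query-optimization point concrete, here is a small, hypothetical demonstration using SQLite as a stand-in for a warehouse engine (the table and index names are invented; EXPLAIN QUERY PLAN is SQLite's equivalent of a query planner's explain output):

```python
import sqlite3

# In-memory SQLite database standing in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cases (state TEXT, report_date TEXT, count INTEGER)")
conn.executemany(
    "INSERT INTO cases VALUES (?, ?, ?)",
    [("GA", "2023-08-01", 10), ("NC", "2023-08-01", 7), ("GA", "2023-08-02", 12)],
)

# Without an index, the planner scans the whole table for this predicate.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM cases WHERE state = 'GA'"
).fetchall()

# After adding an index on the filter column, the planner seeks via the index.
conn.execute("CREATE INDEX idx_cases_state ON cases(state)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM cases WHERE state = 'GA'"
).fetchall()
```

The same scan-versus-seek distinction drives indexing and materialized-view choices in Synapse, Snowflake, and other engines, just with different explain tooling.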

Manage data in cloud storage and data technologies such as Spark, Databricks, and Snowflake environments, using scripts and automation.

Strong dedication/commitment to automation, simplicity, and smooth-running systems

Experience with Python/Scala integration with Databricks

Preferred Qualifications:

Knowledge of Hive Metastore and the Hadoop ecosystem

Experience developing dashboards with Power BI and/or QuickSight

Thanks & Regards 

Akhil Reddy

Administrative Assistant

Keshav Consulting Solutions, LLC

Phone: 919-439-7374

Email: [email protected]

Address: 5470 McGinnis Village Place, Suite 102, Alpharetta, GA 30005

Website: www.keshavconsulting.com

Mon Aug 28 20:05:00 UTC 2023


