Urgent opening for Sr. Data Engineer at Jacksonville, FL (Hybrid - need local), Jacksonville, Florida, USA
Email: [email protected]
From: Shivani Gupta, ibridgetechsoft.com [email protected]
Reply to: [email protected]

Hi,

Hope you are doing great! My name is Shivani Gupta from ibridgetechsoft.com. I am actively recruiting for a Sr. Data Engineer position at Jacksonville, FL (Hybrid - local candidates only) with one of our clients in the United States. Please have a look at the job description below; if interested, feel free to call me or reply to this email. If not, you may refer anyone you know who is looking for a job or a job change.

Job Title: Sr. Data Engineer
Job Location: Jacksonville, FL (Hybrid - need local)
Experience Required: 10+ Yrs.
Position Type: Long Term Contract

REQUIRED:
- Experience in AWS data platform services such as AWS EMR, Step Functions, Event Scheduler, SNS, S3, CodeCommit, CodePipeline
- Experience in the Snowflake platform
- Experience in data ingestion, data transformation, and data loading
- Experience in Azure data platforms such as Azure Databricks, Azure Synapse, Azure Data Factory, ADLS Gen2, etc.
- Working knowledge of Python, PySpark, or equivalent
- 4+ years of data ingestion and transformation experience using AWS data services such as EMR, Step Functions, Event Scheduler, SNS, Python & SQL Server SSIS (any combination)
- 2+ years of experience in the Snowflake platform
- 8+ years of ETL experience with designing star/snowflake schemas
- 2+ years of experience with Azure data platforms (ADF, Synapse workspace, Databricks, Event Hub, Service Bus, Logic App, ADLS Gen2) and Python
- 2+ years of experience in web technologies such as React.JS, Node.JS, HTML, JavaScript
- Proficiency with HTML, CSS, JavaScript/jQuery, local storage, and cross-browser compatibility

JOB DUTIES:
- Design, develop, document & maintain complex data pipelines using AWS data services such as EMR, Step Functions, SNS, and Event Scheduler to source data from multiple sources including RDBMS, Parquet files (on AWS S3 buckets), Excel, and text files.
- Design, develop, document & maintain complex data pipelines using Azure data services such as Logic App, ADF, ADLS Gen2, Azure Databricks, Azure Synapse workspace, Event Hub, and Service Bus to source data from multiple sources including RDBMS, Parquet files (ADLS Gen2, S3 bucket, GCP, on-prem), Excel, and text files.
- Deploy & automate (using CI/CD pipelines) Azure and AWS cloud services using IaC (Infrastructure as Code)
- Develop & design web solutions using ReactJS, Redux & NodeJS
- Evaluate, debug, and modify existing complex Python/PySpark code & SSIS packages in accordance with business requirements.
- Perform data analysis and test/debug software solutions
- Analyze existing ETL jobs, develop ELT pipelines & data warehouse applications, or formulate logic for moderately complex new systems and devise moderately complex algorithms.
- Practice current development methods/techniques, including CI/CD pipelines using AWS CodeCommit & AWS CodePipeline and Azure DevOps, and establish development standards (coding standards, documentation standards, and testing standards) to ensure the quality and maintainability of automated solutions.

EDUCATION: Bachelor's degree in Computer Science, IT, or equivalent

Please share your updated resume at [email protected] or feel free to call me at +1 904-587-1278.
Posted: Fri Mar 08 01:37:00 UTC 2024