C2H || USC only || Ohio Local Only || Data Engineer with Python || Cincinnati, OH (Onsite) || 8+ Years || Cincinnati, Ohio, USA
Email: [email protected] |
Hello All,

I hope you are doing well. I have an urgent requirement for a Data Engineer located in Cincinnati, OH (Onsite). Please have a look and let me know if you are interested.

Key Responsibilities:
- Design, develop, and maintain data pipelines for the extraction, transformation, and loading (ETL) of data from various sources into our data warehouse.
- Collaborate with finance executives, data scientists, analysts, and other stakeholders to understand data requirements and ensure data quality, reliability, and accessibility.
- Create and optimize SQL queries for data extraction, transformation, and analysis, with a strong emphasis on PostgreSQL.
- Use Python for scripting, automation, and data manipulation tasks, ensuring code efficiency and maintainability.
- Develop and maintain data infrastructure in cloud environments (AWS or Azure), including setting up and configuring the necessary services, data storage, and orchestration.
- Implement DevOps best practices for continuous integration and continuous deployment (CI/CD) of data pipelines and associated infrastructure.
- Monitor, troubleshoot, and optimize data pipelines to ensure performance, scalability, and data accuracy.
- Document data engineering processes, standards, and best practices for the team.

Qualifications:
- Bachelor's degree in Computer Science, Data Engineering, or a related field (or equivalent work experience).
- Strong proficiency in Python, including experience with data manipulation libraries such as Pandas.
- Solid expertise in SQL, with a focus on PostgreSQL, and the ability to write complex, efficient queries.
- Proficiency in Linux command-line scripting and Bash.
- Experience working with cloud platforms; Azure experience is a plus.
- Knowledge of data warehousing concepts and data modeling.
- Familiarity with data pipeline orchestration tools such as Apache Airflow.
- Experience with version control systems (e.g., Git) and CI/CD practices.
- Excellent problem-solving skills and the ability to work independently or as part of a team.
- Strong communication and collaboration skills to work effectively with cross-functional teams.

Nice-to-Have Skills:
- Experience with big data technologies such as Spark, and exposure to distributed computing.
- Knowledge of containerization and container orchestration tools (e.g., Docker, Kubernetes).
- Familiarity with data security and compliance best practices.
- Previous work in Agile or Scrum development environments.

--
Keywords: continuous integration, continuous deployment, information technology, Ohio
Thu Nov 02 00:07:00 UTC 2023