JOB | Data Engineer | Remote OK at Remote, USA
Email: [email protected]
From: Manoj Rathee, Sunray Enterprise, Inc. [email protected]
Reply to: [email protected]

Hello,

I hope this mail finds you in good health and doing well! We currently have the job position listed below available. Kindly go through the job description and share your latest updated resume, visa copy, and photo ID so that I can submit your profile to the client.

Job Position: Data Engineer
Location: Remote OK
Duration: Long Term

Role Description:

The Senior Data Engineer will work collaboratively with business and technical team members to build data pipelines and stage high-quality data for consumption by business analysts, data scientists, and visualization developers. You will work with product owners, architects, and developers to build scalable real-time and near-real-time data integration solutions that enable our Data & Analytics platform to provide relevant, timely, and accurate data to self-service BI tools, to web platforms via APIs, and to data science tools. This position requires a team player who is results-oriented; a high sense of integrity, good communication skills, great attention to detail, and a proven track record of working with different partners and technologies are essential qualities for a successful candidate.

Job Experience Requirements:
- Bachelor's degree in Computer Science, Engineering, Information Systems, Business, or an equivalent field of study (required)
- 7+ years of experience working with data solutions
- 3+ years of experience coding in Python, Scala, or a similar scripting language
- 3+ years of experience developing data pipelines at scale on AWS (preferred), Azure, or Snowflake
- 2+ years of experience designing and implementing data ingestion with real-time streaming tools such as Kafka or Kinesis (see the sketch after this list); SAP, Salesforce, or other cloud integrations preferred
- 3+ years of experience working with MPP databases such as Snowflake (preferred), Redshift, or similar
- 2+ years of experience with serverless ETL processes (Lambda, AWS Glue, Matillion, or similar)
- 1+ years of experience with big data technologies such as EMR, Hadoop, Spark, Cassandra, MongoDB, or other open-source big data tools
- Knowledge of professional software engineering best practices across the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
- Experience designing, documenting, and defending designs for key components in large distributed computing systems
- Demonstrated ability to learn new technologies quickly and independently
- Demonstrated ability to achieve stretch goals in an innovative, fast-paced environment
- Ability to handle multiple competing priorities in a fast-paced environment
- Excellent verbal and written communication skills, especially technical communication
- Strong interpersonal skills and a desire to work collaboratively
- Experience participating in an Agile software development team, e.g. Scrum
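By way of illustration only (not part of the client's requirements), here is a minimal Python sketch of the streaming-ingestion pattern named above: a Kafka consumer that micro-batches events into a Snowflake staging table. It assumes the confluent-kafka and snowflake-connector-python packages; the broker address, topic, credentials, and ORDERS_RAW table are placeholders.

```python
# Illustrative sketch: consume events from a Kafka topic and micro-batch
# them into a Snowflake staging table. All names below are placeholders.
import json

from confluent_kafka import Consumer      # assumed dependency: confluent-kafka
import snowflake.connector                 # assumed dependency: snowflake-connector-python

BATCH_SIZE = 500

consumer = Consumer({
    "bootstrap.servers": "broker:9092",    # placeholder broker
    "group.id": "orders-ingest",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])             # placeholder topic

conn = snowflake.connector.connect(        # placeholder credentials
    user="ETL_USER", password="...", account="my_account",
    warehouse="LOAD_WH", database="RAW", schema="EVENTS",
)
cur = conn.cursor()

batch = []
try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())    # validate that the payload is JSON
        batch.append((event.get("id"), msg.value().decode("utf-8")))
        if len(batch) >= BATCH_SIZE:
            # Land raw payloads in a staging table; downstream models flatten them.
            cur.executemany(
                "INSERT INTO ORDERS_RAW (EVENT_ID, PAYLOAD) VALUES (%s, %s)",
                batch,
            )
            conn.commit()
            consumer.commit()              # commit offsets only after the load succeeds
            batch.clear()
finally:
    consumer.close()
    conn.close()
```

Committing the Kafka offsets only after the warehouse insert succeeds gives at-least-once delivery; deduplicating downstream (e.g. on EVENT_ID) would tighten that further.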
Job Responsibilities:
- Build, deploy, and maintain critical, scalable data pipelines that assemble large, complex data sets meeting functional and non-functional business requirements
- Work closely with SMEs, data modelers, architects, analysts, and other team members on requirements to build scalable real-time, near-real-time, and batch data solutions
- Contribute design, code, configurations, and documentation for components that manage data ingestion, real-time streaming, batch processing, and data extraction, transformation, and loading into a Data Lake / Cloud Data Warehouse / MPP store (Snowflake, Redshift, or similar technologies); a serverless load sketch follows this list
- Own one or more key components of the infrastructure and work to continually improve them, identifying gaps and improving the platform's quality, robustness, maintainability, and speed
- Cross-train other team members on technologies being developed, while continuously learning new technologies from other team members
- Interact with technical teams across Cepheid and ensure that solutions meet customer requirements in terms of functionality, performance, availability, scalability, and reliability
- Perform development, QA, and DevOps roles as needed to take end-to-end responsibility for solutions
- Keep up with current trends in big data and analytics, evaluate tools, and pace yourself for innovation
- Mentor junior engineers and create the necessary documentation and runbooks while still delivering on goals
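Likewise for illustration only, a minimal sketch of the serverless ETL pattern referenced in the list above: an AWS Lambda handler triggered by S3 object-created events that loads new files into Snowflake with COPY INTO. The @RAW_STAGE stage, ORDERS_RAW table, and SF_* environment variables are illustrative assumptions, not details from the client.

```python
# Illustrative sketch: event-driven serverless load from S3 into Snowflake.
import os

import snowflake.connector                 # assumed: vendored into the Lambda package


def handler(event, context):
    # S3 "ObjectCreated" notifications carry the key of each new file.
    keys = [rec["s3"]["object"]["key"] for rec in event["Records"]]

    conn = snowflake.connector.connect(
        user=os.environ["SF_USER"],        # assumed env vars / secrets
        password=os.environ["SF_PASSWORD"],
        account=os.environ["SF_ACCOUNT"],
        warehouse="LOAD_WH", database="RAW", schema="EVENTS",
    )
    try:
        cur = conn.cursor()
        for key in keys:
            # @RAW_STAGE is an assumed external stage on the landing bucket;
            # ORDERS_RAW is assumed to have a single VARIANT column.
            cur.execute(
                f"COPY INTO ORDERS_RAW FROM @RAW_STAGE/{key} "
                "FILE_FORMAT = (TYPE = 'JSON') ON_ERROR = 'CONTINUE'"
            )
        return {"loaded": len(keys)}
    finally:
        conn.close()
```

In production, Snowpipe or AWS Glue could replace a hand-rolled handler; the Lambda version mainly illustrates the event-driven COPY pattern.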
Hope to hear from you soon!
Thu Oct 19 01:10:00 UTC 2023