Data Engineer || Onsite || Contract at Remote, Remote, USA
Email: [email protected]
From: Asad Saeed, DigitalDhara [email protected]
Reply to: [email protected]
Title: Sr Data Engineer
Location: CA
Duration: 6+ Months

JD:
Role: Senior Data Engineer
Exp: 10+ years
Location: Onshore

Job Experience Requirements:
Bachelor's degree in Computer Science, Engineering, Information Systems, Business, or an equivalent field of study required.
7+ years of experience working with data solutions.
3+ years of experience coding in Python, Scala, or a similar scripting language.
3+ years of experience developing data pipelines at scale on the AWS Cloud Platform (preferred), Azure, or Snowflake.
2+ years of experience designing and implementing data ingestion with real-time data streaming tools such as Kafka, Kinesis, or similar. SAP/Salesforce or other cloud integrations are preferred.
3+ years of experience working with MPP databases such as Snowflake (preferred), Redshift, or similar MPP databases.
2+ years of experience working with serverless ETL processes (Lambda, AWS Glue, Matillion, or similar).
1+ years of experience with big data technologies such as EMR, Hadoop, Spark, Cassandra, MongoDB, or other open-source big data tools.
Knowledge of professional software engineering best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations.
Experience designing, documenting, and defending designs for key components in large distributed computing systems.
Demonstrated ability to learn new technologies quickly and independently.
Demonstrated ability to achieve stretch goals in a highly innovative and fast-paced environment.
Ability to handle multiple competing priorities in a fast-paced environment.
Excellent verbal and written communication skills, especially in technical communications.
Strong interpersonal skills and a desire to work collaboratively.
Experience participating in an Agile software development team, e.g., Scrum.

Job Responsibilities:
Responsible for building, deploying, and maintaining critical, scalable data pipelines that assemble large, complex sets of data meeting functional and non-functional business requirements.
Works closely with SMEs, data modelers, architects, analysts, and other team members on requirements to build scalable real-time, near-real-time, and batch data solutions.
Contributes design, code, configurations, and documentation for components that manage data ingestion, real-time streaming, batch processing, and data extraction, transformation, and loading into a data lake, cloud data warehouse, or MPP database (Snowflake, Redshift, or similar technologies).
Owns one or more key components of the infrastructure and works to continually improve it, identifying gaps and improving the platform's quality, robustness, maintainability, and speed.
Cross-trains other team members on technologies being developed, while also continuously learning new technologies from other team members.
Interacts with technical teams across Cepheid and ensures that solutions meet customer requirements in terms of functionality, performance, availability, scalability, and reliability.
Performs development, QA, and DevOps roles as needed to ensure total end-to-end responsibility for solutions.
Keeps up with current trends in big data and analytics, evaluates tools, and paces innovation accordingly.
Mentors junior engineers and creates necessary documentation and run-books while still delivering on goals.

Thanks,
Asad
[email protected]

Keywords: quality analyst information technology California
[email protected] View all |
Mon Nov 13 23:18:00 UTC 2023