URGENT NEED: Big Data Architect | Portland, OR (Onsite)
Email: [email protected]
Hi,

Pleasure mailing you. Please go through the requirement below and let me know if you are comfortable with the position. Please send me your updated resume along with your best hourly rate, work authorization status, and availability. An early response is greatly appreciated.

Job Title: Big Data Architect
Location: Portland, OR (Onsite)
Duration: 12+ months

Responsibilities (the primary tasks, functions, and deliverables of the role):
- Design and build reusable components, frameworks, and libraries at scale to support analytics products
- Design and implement product features in collaboration with business and technology stakeholders
- Identify and solve issues concerning data management to improve data quality
- Clean, prepare, and optimize data for ingestion and consumption
- Collaborate on the implementation of new data management projects and the restructuring of the current data architecture
- Implement automated workflows and routines using workflow scheduling tools
- Build continuous integration, test-driven development, and production deployment frameworks
- Analyze and profile data to design scalable solutions
- Troubleshoot data issues and perform root cause analysis to proactively resolve product and operational issues

Experience:
- Strong understanding of data structures and algorithms
- Strong understanding of solution and technical design
- Strong problem-solving and analytical mindset
- Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders
- Able to quickly pick up new programming languages, technologies, and frameworks
- Experience building cloud-scalable, real-time, and high-performance data lake solutions
- Fair understanding of developing complex data solutions
- Experience working on end-to-end solution design
- Willing to learn new skills and technologies
- Passion for data solutions

Required and Preferred Skill Sets:
- Hands-on experience with Databricks and AWS: EMR (Hive, PySpark), S3, Athena
- Familiarity with Spark Structured Streaming
- Minimum 4 years of working experience with the Hadoop stack, dealing with huge volumes of data in a scalable fashion
- 10+ years of hands-on experience with SQL, ETL, data transformation, and analytics functions
- 8+ years of hands-on Python experience, including batch scripting, data manipulation, and distributable packages
- 4+ years of experience with batch orchestration tools such as Apache Airflow or equivalent, preferably Airflow
- 4+ years working with code versioning tools such as GitHub or Bitbucket; expert-level understanding of repo design and best practices
- Familiarity with deployment automation tools such as Jenkins
- 8+ years of hands-on experience designing and building ETL pipelines; expert with data ingestion, change data capture, and data quality; hands-on experience with API development
- 4+ years designing and developing relational database objects; knowledgeable in logical and physical data modelling concepts; some experience with Snowflake
- Familiarity with Tableau or Cognos use cases
- Familiarity with Agile; working experience preferred

Thanks,
Kushal P
Recruiter - US IT
Phone: 732-795-1267
Email: [email protected] | www.tekskills.in
INDIA | USA | CANADA | UK | AUSTRALIA
ISO 9001:2015 | Appraised at CMM Level 3 | WMBE Certified Company
Tue Jul 16 00:08:00 UTC 2024