Data Platform Admin in Mountain View, California, USA
Email: ryan@nityainc.com
https://jobs.nvoids.com/job_details.jsp?id=2064985&uid=
From: Nitya, Nitya Software Solutions (ryan@nityainc.com)
Reply to: ryan@nityainc.com

Role: Data Platform Admin
Location: Mountain View, CA (Day 1 onsite)
C2C

Must have:
- Strong experience in Java and Spring Boot frameworks.
- Experience with data platforms such as Snowflake, Databricks, Google BigQuery, AWS Redshift, or similar.
- Strong understanding of data warehouse and data lake architectures.
- Proficiency in SQL for querying and managing data.
- Knowledge of data pipeline tools (e.g., Apache Airflow, Informatica, Talend).
- Familiarity with scripting languages like Python or Bash for automation.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP).
- Familiarity with containerization and orchestration tools like Docker and Kubernetes.
- Understanding of access control, encryption, and data security best practices.
- Knowledge of compliance requirements (e.g., GDPR, CCPA, HIPAA).
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
- Ability to prioritize tasks and manage time effectively in a fast-paced environment.

What You'll Do:
- Manage and maintain the data platform infrastructure to ensure availability, reliability, and performance.
- Monitor and troubleshoot platform-related issues to minimize downtime.
- Configure, optimize, and upgrade data platform components (e.g., data lakes, data warehouses, or databases).
- Oversee the ingestion, processing, and storage of data in the platform.
- Monitor ETL/ELT pipelines for performance and error resolution.
- Ensure pipelines meet data governance and compliance requirements.
- Manage user roles, permissions, and access to the data platform.
- Implement and enforce data security policies and best practices.
- Audit access logs and ensure compliance with regulations like GDPR or HIPAA.
- Optimize platform configurations for better query performance and scalability.
- Perform regular capacity planning and resource allocation.
- Work with engineering teams to enhance data storage and retrieval efficiency.
- Integrate data platform components with other enterprise systems and tools.
- Collaborate with data engineers, analysts, and stakeholders to support data-driven initiatives.
- Document platform configurations, workflows, and processes for knowledge sharing.
- Implement monitoring tools to track platform health and usage.
- Automate repetitive administrative tasks using scripting or orchestration tools.

Good to have:
- Experience with big data tools like Apache Hadoop, Spark, or Kafka.
- Familiarity with BI tools like Power BI, Tableau, or Looker.
- Experience with cloud-native ML services like AWS SageMaker, Google Vertex AI, or Azure ML Studio.
- Knowledge of database indexing, partitioning, and caching techniques.
- Hands-on experience with query performance tuning and workload management.
- Familiarity with monitoring tools like Datadog, Prometheus, or Grafana.
- Experience with infrastructure-as-code tools like Terraform or Ansible.
- Understanding of machine learning workflows and integration with the data platform.
- Familiarity with data catalog and lineage tools (e.g., Alation, Collibra).
- Certifications in relevant cloud platforms (e.g., AWS Certified Data Analytics, GCP Data Engineer, Azure Data Engineer).
- Certifications in data platform technologies like Snowflake or Databricks.
Posted: 07:06 AM 09-Jan-25