Snowflake Developer position, ONSITE at Richardson, Texas, USA
Email: [email protected]
From: Hari Prasad, Smartfolks [email protected]
Reply to: [email protected]

Hi,

Greetings from Smart Folks!

My name is Hari Prasad, and we have a job opportunity for you for a Snowflake Developer role. Please find the job description below. If you are available and interested, please send a Word copy of your resume to [email protected], or call me at 469-425-3345.

Role: Snowflake Developer
Work Location: Richardson, TX (Day 1 onsite)

Must-have skills:
- Snowflake
- Python
- PySpark

Nice-to-have skills:
- Databricks
- Palantir Foundry

Detailed job description:
- Develop and support the data ingestion procedures needed for the Snowflake and Palantir Foundry environments.
- Define data requirements, gather and mine structured and unstructured data at large scale, and validate data by running various data tools.
- Support standardization, customization, and ad-hoc data analysis, and develop the mechanisms to ingest, analyze, validate, normalize, and clean data.
- Implement health checks on new data sources and apply rigorous, iterative data analytics.
- Work with compliance, security, and legal teams to create data policy, and develop interfaces and retention models that require synthesizing or anonymizing data.
- Develop and maintain data engineering best practices, and contribute insights on data analytics and visualization concepts, methods, and techniques.

Experience:
- 4+ years of experience with data ingestion into the Snowflake and Palantir platforms; experience with Databricks is a big plus.
- Experience with the Azure cloud environment.
- Experience working with ORC and Parquet file formats.
- Experience writing Python transformations to perform insert and delete-insert logic on data frames (see the sketch after this list).
- Experience using PySpark SQL to create load scripts, and with performance tuning of Spark SQL queries.
- Experience working with datasets in the cloud.
- Experience with relational and non-relational databases (Teradata, Vertica, SQL Server, Oracle, MySQL, MongoDB), and with HBase and the HBase shell.
- Ability to develop PySpark scripts to ingest large datasets, and to use Python and/or Java to create transformation and cleansing processes.
- Ability to use Databricks to ingest very large datasets.
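As a rough illustration of the delete-insert ingestion work described above, the sketch below shows one way to load a batch into Snowflake with PySpark and the Snowflake Spark connector. The connection options, table name (ORDERS), key column (ORDER_ID), and file path are hypothetical placeholders for illustration only, not details from this posting.

    # Minimal sketch: delete-insert (upsert-by-key) load into Snowflake with PySpark.
    # All names below (snowflake_options, ORDERS, ORDER_ID, the Parquet path) are
    # illustrative assumptions, not part of the job description.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("snowflake_delete_insert_demo").getOrCreate()

    # Connection options for the Snowflake Spark connector (placeholder values).
    snowflake_options = {
        "sfURL": "<account>.snowflakecomputing.com",
        "sfUser": "<user>",
        "sfPassword": "<password>",
        "sfDatabase": "ANALYTICS",
        "sfSchema": "PUBLIC",
        "sfWarehouse": "LOAD_WH",
    }

    # Incoming batch (e.g. a Parquet extract) and the current target table.
    incoming = spark.read.parquet("/data/incoming/orders/")
    existing = (
        spark.read.format("net.snowflake.spark.snowflake")
        .options(**snowflake_options)
        .option("dbtable", "ORDERS")
        .load()
    )

    # Delete-insert on data frames: drop existing rows whose key appears in the
    # new batch, then append the new batch, so each key keeps its latest version.
    merged = (
        existing.join(incoming.select("ORDER_ID"), on="ORDER_ID", how="left_anti")
        .unionByName(incoming)
    )

    # Overwrite the target table with the merged result.
    (
        merged.write.format("net.snowflake.spark.snowflake")
        .options(**snowflake_options)
        .option("dbtable", "ORDERS")
        .mode("overwrite")
        .save()
    )

In practice the same pattern is often pushed down to the warehouse as a staged load followed by a MERGE statement; the data-frame version above keeps the whole sketch in PySpark.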
Posted: Wed Feb 08 22:07:00 UTC 2023