Big Data Cloud Engineer position in Tampa, Florida, USA
Email: [email protected]
From: Hari Prasad, Smart Folks <[email protected]>
Reply to: [email protected]

Hi,

Greetings from Smart Folks! My name is Hari Prasad, and we have a job opportunity for you as a Big Data Cloud Engineer (GCP). Please find the job description below. If you are available and interested, please send a Word copy of your resume to [email protected], or call me at 469-425-3345.

Role: Big Data Cloud Engineer - GCP
Work Location: Tampa, FL 33637
Minimum years of experience: 8+

Must-Have Skills: GCP, Big Data, Teradata

Detailed Job Description:
- Overall 8+ years of professional IT experience, with around 5 years of expertise in Big Data using the Hadoop framework: analysis, design, development, documentation, deployment, and integration using SQL and Big Data technologies.
- 5+ years of experience with Google Cloud services such as streaming and batch processing, Cloud Storage, Cloud Dataflow, Cloud Pub/Sub, Cloud Composer, Dataproc, DFunc, BigQuery, and Bigtable (see the illustrative sketch after this posting).
- Responsible for building scalable distributed data solutions using Hadoop.
- Experience implementing Big Data analytics, cloud data engineering, data warehouse/data mart, data visualization, reporting, data quality, and data virtualization solutions.
- Experience providing ETL solutions for any type of business model.
- Created procedures and macros in Teradata.
- Experience moving high- and low-volume data objects from Teradata and Hadoop to Snowflake.
- Developed script files for processing data and loading it into HDFS; wrote CLI commands using HDFS.
- Developed UNIX shell scripts for creating reports from Hive data.
- Experience writing SQL queries, PL/SQL programs, and query-level performance tuning.
- In-depth understanding and use of Teradata OLAP functions.
- Proficient in Teradata SQL, stored procedures, macros, views, and indexes (primary, secondary, PPI, join indexes, etc.).
- Hands-on experience with programming languages such as Java, Python, and Scala.
- Experience using Hadoop ecosystem components such as HDFS, YARN, MapReduce, Spark, Pig, Sqoop, Hive, Impala, HBase, Kafka, and Crontab.

Keywords: information technology, Florida
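As a rough illustration of the Cloud Storage and BigQuery work this posting refers to, the sketch below loads Parquet files from a bucket into a BigQuery table with the google-cloud-bigquery Python client. It is a minimal example, not part of the job description; the project, dataset, table, and bucket names are hypothetical stand-ins for whatever the actual environment uses.

    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default credentials

    # Hypothetical table and Cloud Storage URI, for illustration only.
    table_id = "my-project.analytics.events"
    uri = "gs://my-bucket/exports/events-*.parquet"

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    # Start the load job and block until it finishes.
    load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
    load_job.result()

    print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")

In practice this kind of batch load would typically be scheduled from Cloud Composer or triggered by a Dataflow pipeline, as suggested by the services listed above.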
Thu Jan 12 23:59:00 UTC 2023 |