GCP Data Engineer (Financial Domain EXP) || Dallas, TX / Little Rock, AR (Hybrid), locals only with DL || Visa: USC/H4-EAD only
Email: [email protected] |
Hi,

Hope you are doing great! Please find the requirement below. If you are comfortable with the requirement, please reply with your updated resume and I will get back to you; I would also appreciate a call back at my contact number, (302)-485-1559. Please share only relevant resumes with GCP experience from candidates local to Dallas, TX.

Position: GCP Data Engineer (Financial Domain EXP)
Location: Dallas, TX / Little Rock, AR (Hybrid); locals only, with a driver's license
Duration: 1 Year
Experience: 10 Years
Visa: USC/H4-EAD only

- This is a senior role; a minimum of 4-5 years of GCP implementation experience is required.
- 7-8 years of experience building data engineering capabilities is required.
- Clear communication is a must.
- Strong experience with SQL and Python is required.
- The client is looking for candidates with strong GCP experience who can lead the rest of the team on the GCP side.
- GCP certification is highly preferred to validate technical competency.
- The position is hybrid: 3 days a week in the office. Monday is mandatory; the other two days are flexible. Remote is not an option.
- The interview process is 3 rounds.

Please make sure candidates can speak to the following:
- The difference between a data warehouse and a data lake, and between star and snowflake schemas
- The difference between facts and dimensions
- ETL stages and layers; explain the ETL lifecycle
- Real-time vs. batch processing
- Understanding of the sprint process
- Knowledge of data profiling and quality handling
- Understanding of data quality scenarios handled as part of the data engineering scope

The requirements are below:

If you are a Google Cloud data engineer, Simmons Bank invites you to collaborate in an agile team of peers developing a cloud-based analytics platform that integrates data from a broad range of systems to enable next-gen analytical products. The Senior Data Engineer, Google Cloud Platform (GCP) is responsible for developing and delivering effective cloud solutions for different business units. This position requires in-depth knowledge of and expertise in GCP services, architecture, and best practices. The engineer will collaborate with cross-functional teams to design, implement, and manage scalable and reliable cloud solutions, and will be responsible for driving innovation and staying up to date with the latest GCP technologies and trends to provide industry-leading solutions.

Our future colleague will:
- Contribute to the bank's multiyear data analytics modernization roadmap. You will work directly on a platform based on Google BigQuery and other GCP services to integrate new data sources and model the data up to the serving layer. This is a unique opportunity, as the program is set up to completely rethink reporting and analytics with cloud technology.
- Collaborate with different business groups and users to understand their business requirements, and design and deliver the GCP architecture and the data engineering scope of work. You will work on a large-scale data transformation program whose goal is to establish a scalable, efficient, and future-proof data and analytics platform.
- Develop and implement cloud strategies, best practices, and standards to ensure efficient and effective cloud utilization.
- Work with cross-functional teams to design, implement, and manage scalable and reliable cloud solutions on GCP.
- Provide technical guidance and mentorship to the team to develop their skills and expertise in GCP.
- Stay up to date with the latest GCP technologies, trends, and best practices and assess their applicability to client solutions.
Qualifications (what will help you succeed):
- Bachelor's degree in Computer Science/IT; Master's in Data Analytics/Information Technology/Management Information Systems (preferred)
- Strong understanding of data fundamentals, knowledge of data engineering, and familiarity with core cloud concepts
- Solid implementation experience with various GCP data storage and processing services such as BigQuery, Dataflow, Bigtable, Dataform, Data Fusion, Cloud Spanner, and Cloud SQL
- Programming experience with SQL, Python, and Apache Spark
- At least 7-8 years of professional experience building data engineering capabilities for various analytics portfolios, with at least 5 years on GCP/cloud-based platforms

Your expertise in one or more of the following areas is highly valued:
- Google Cloud Platform, ideally with Google BigQuery, Cloud Composer, Cloud Data Fusion, Cloud Spanner, and Cloud SQL
- Legacy data warehouses (on SQL Server or any relational data warehouse platform)
- Our main tools: dbt, Terraform/Terragrunt, Git (CI/CD)
- A testing framework
- Business intelligence tools such as Power BI and/or Looker
- Complex migrations from legacy data warehousing solutions or on-prem data lakes to GCP
- Building generic, reusable capabilities, and an understanding of data governance and quality frameworks
- Building real-time ingestion and processing frameworks on GCP
- Adaptability to learn new technologies and products as the job demands
- Multi-cloud and hybrid-cloud experience
- Any cloud certification (preference for GCP certifications)
- Experience in the financial and banking industry

Thanks and Regards,
Pankaj Chauhan
Cell: (302)-485-1559
[email protected]
https://www.linkedin.com/in/pankajschauhan/
Accroid Inc.
1007 Orange ST, 4th FL 1651, Wilmington, DE 19801
[email protected] View all |
Wed Jun 05 19:34:00 UTC 2024 |