Multiple Open Roles at San Antonio, Texas, USA
Email: [email protected]
Job Title/Role: Cloud Data Architect
Location: San Antonio, TX
Duration: Long Term
Mandatory required skills: Cloud EDW, Snowflake, Informatica, DataStage, DBT, Big Data and NoSQL technology

Detailed Job Description:
A Technical Architect is expected to be functionally knowledgeable across Cloud EDW, Snowflake, Informatica, DataStage, DBT, Big Data, and NoSQL technology areas, with hands-on experience in data management. As a leader in our Data Solutions practice, you will provide best-fit architectural solutions for one or more projects, assist in defining the scope and sizing of work, and anchor Proof of Concept development. You will provide solution architecture for the business problem, integrate the platform with third-party services, and design and develop complex features for clients' business needs. You will collaborate with some of the best talent in the industry to create and implement innovative, high-quality solutions, and participate in sales and various pursuits focused on our clients' business needs. You will also contribute in a variety of roles spanning thought leadership, mentorship, systems analysis, architecture, design, configuration, testing, debugging, and documentation. You will challenge your leading-edge solution, consultative, and business skills through the diversity of work across multiple industry domains. This role is considered part of the Business Unit Senior Leadership team and may mentor junior architects and other delivery team members.

Responsibilities:
Own and aggressively drive forward specific areas of Cloud EDW technology architecture, including data management architectures involving batch, micro-batch, and real-time streaming of data in both cloud and on-premise solutions.
Architect market-leading solutions using Snowflake on AWS, including ETL (Informatica or DataStage) and EDW design.
Provide data architectural solutions/designs to project execution teams for implementation.
Provide architectural assessments, strategies, and roadmaps for data management.
Lead projects within architecture.
Work with Product Owners/Business Analysts to understand functional requirements, and interact with other cross-functional teams to architect, design, develop, test, and release features.
Handle project and solution estimation and team structure definition.
Develop Proof-of-Concept projects to validate new architectures and solutions.
Support multiple Agile Scrum teams with planning, scoping, and creation of technical solutions for new product capabilities, through to continuous delivery to production.
Liaise with the offshore team and clients to resolve technical dependencies, issues, and risks.
Mentor and provide architectural guidance to multiple teams building innovative applications.

Title: Power BI Developer with Cloud GCP
Location: Houston, TX
Duration: Long Term

Skills:
Must have: Power BI
Must have: SQL, including queries, stored procedures, and functions
Must have: DevOps, mainly for code deployment
Must have: Google Cloud Platform (GCP)
Must have: GCP native tools (scheduled queries, Dataform, Dataproc)
Nice to have (or familiar with; not difficult to learn given other cloud ETL/ELT experience): Python, PySpark SQL, Pandas. The ability to comprehend the code is enough.

Title: Data Analyst
Location: Hybrid - Phoenix, AZ (Locals Only)
Duration: Long Term
Local candidates are the priority.

We actually have a need for two Senior Data Analysts and one Data Engineering Lead. Please use the responsibilities and skills below for the Data Engineering Lead. Thank you for your help, Megha.
Brad is out of office until Friday; he and Wayne Xu will conduct the Data Engineering interviews for those they support after reviewing resumes.

Responsibilities:
Lead the Azure data ingestion and integration activities from onshore for the onshore-offshore team of around 8 people.
Collaborate with the WAB Data Architects and Lead Data Engineers on data pipeline design.
Coordinate across the data engineering workstreams happening in parallel, including reviews of work from the team.
Help the PM/Scrum Master with technical effort estimation.

Skills:
3+ years of experience with Azure data services, including Azure Data Factory, Azure Synapse Analytics, and Azure Data Lake Storage
6+ years of SQL scripting and database experience
5+ years of ETL experience with any ETL tool
Good understanding of data warehouse design principles
Agile DevOps working knowledge desired

Role: Ab Initio Developer
Location: Hybrid - New York
Duration: Long Term

Technical/Functional Skills:
Experience designing and delivering complex, large-volume data warehouse applications
Ab Initio ETL development (5+ years of hard-core Ab Initio experience)
Strong experience in advanced SQL programming and Oracle databases
Minimum data modeling experience
Solid and professional communication skills, both verbal and written
Unix familiarity and shell scripting experience
Strong knowledge of the Software Development Lifecycle (SDLC)
Job scheduling experience using Maestro/TWS is required
Strong experience in data quality, source systems analysis, business rules validation, source-to-target mapping design, performance tuning, and high-volume data loads
Strong written and oral communication skills are essential
Strong analytical skills and the ability to resolve problems are desired
Some knowledge of the mortgage industry is required
Ability to work independently and multi-task to meet critical deadlines in a rapidly changing environment
Flexibility to work additional hours on an as-needed basis to meet deadlines and to coordinate with the offshore team
Develop relevant functional and technical documentation

Thanks & Regards,
Syed Mubasheer
"Certified Woman Owned Minority Business Enterprise (WMBE)"
222 West Las Colinas Blvd, Suite # 540, East Towers, Irving, Texas 75039, USA
Email: [email protected]
Contact No: 469 962 2899 Ext: 542
www.techstargroup.com
Please consider the environment before printing this e-mail

Keywords: business intelligence information technology Arizona Colorado Texas Multiple Open Roles
Mon Sep 09 21:52:00 UTC 2024