
Senior Quality Automation Engineer, 12+ Years (SDET/QA) - Alpharetta, GA (F2F interview) at Alpharetta, Georgia, USA
Email: [email protected]
Position: Senior Quality Automation Engineer (SDET), 12+ Years

Location: Alpharetta, GA (Onsite) (In person interview)

Preference: Need Local to GA

Mandatory: TypeScript and Playwright experience
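For context on the mandatory stack: Playwright tests in TypeScript are commonly organized around the page object pattern. The sketch below is illustrative only (the `LoginPage` class and selectors are hypothetical, not from this role); the page is abstracted behind a minimal interface standing in for Playwright's `Page`, so the pattern can be unit-tested without a browser.

```typescript
// Minimal interface standing in for Playwright's Page, so this sketch
// runs standalone; in a real suite you would use Page from '@playwright/test'.
interface Page {
  fill(selector: string, value: string): Promise<void>;
  click(selector: string): Promise<void>;
}

// Hypothetical page object: encapsulates selectors and user flows so
// individual specs stay readable and selectors change in one place.
class LoginPage {
  constructor(private readonly page: Page) {}

  async login(user: string, password: string): Promise<void> {
    await this.page.fill('#username', user);
    await this.page.fill('#password', password);
    await this.page.click('button[type="submit"]');
  }
}

// Fake Page that records actions, useful for unit-testing page objects
// without launching a browser.
class FakePage implements Page {
  readonly actions: string[] = [];
  async fill(selector: string, value: string): Promise<void> {
    this.actions.push(`fill ${selector}=${value}`);
  }
  async click(selector: string): Promise<void> {
    this.actions.push(`click ${selector}`);
  }
}
```

In a real Playwright spec, the same `LoginPage` would be constructed from the `page` fixture inside a `test(...)` block; the abstraction above just makes the pattern testable in isolation.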

Job Description:

Seeking an experienced, resourceful quality automation engineer who can adapt and hit the ground running with minimal supervision. This individual will be passionate about end-user experience and best-in-class engineering excellence, and will be part of a tight-knit, distributed engineering team developing and delivering a comprehensive data operations management solution for Equifax's Data Fabric Platform.

As a Quality Engineer, you will be a catalyst in both the development and the testing of frontend and backend software components. You are passionate about quality and about how customers experience the products you test. You are experienced in creating, maintaining, and executing test plans to verify requirements. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration, and continuous integration). You will conduct quality-control tests to ensure full compliance with specified standards and end-user requirements. You will execute tests using established plans and scripts, document problems in an issues log, and retest to ensure problems are resolved. You will create test files to thoroughly exercise program logic and verify system flow. You will identify, recommend, and implement changes to enhance the effectiveness of Equifax's software quality engineering strategies.

Data Fabric is a GCP cloud-native modern data management platform that allows Equifax to acquire and curate data, provide entity resolution, and ingest data into a single environment. It is deployed globally in multiple regions, is highly secured, and complies with regional and internal regulatory controls under strict governance and oversight. Business units, data scientists, and many other stakeholders use APIs to consume data managed by the Data Fabric and operate data exchanges to monetize data through B2B and B2C channels.

The data operations management solution consists of:

        A web portal UI/UX that provides a single point of access to all data management and data reliability engineering

        A suite of backend API services that serve the UI and integrate with low-level Data Fabric and other third-party system APIs

        Modern data lakehouse (data lake, data warehouse, batch and streaming ELT pipelines)

The data operations roadmap envisions a set of rich management capabilities including:

        Serving a large community of geographically dispersed data operations stakeholders

        Data quality and observability management to detect, alert, and prevent data anomalies

        Troubleshooting, triaging and resolving data and data pipeline issues

        OLAP, batch and streaming big data processing, and BI reporting

        MLOps

        Real-time dashboards, alerting and notifications, case management, user/group management, AuthZ, and many other foundational capabilities
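To make the data-quality and observability bullet concrete, here is a minimal, illustrative sketch (not Equifax's actual implementation; all names are assumptions) of one such check in TypeScript: flagging columns whose null rate exceeds a threshold, the kind of rule an observability layer would alert on.

```typescript
// A row is a flat record of column name to value; null/undefined count as missing.
type Row = Record<string, unknown>;

// Compute the fraction of missing values per column across a sample of rows.
function nullRates(rows: Row[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const row of rows) {
    for (const [col, value] of Object.entries(row)) {
      const prev = counts.get(col) ?? 0;
      counts.set(col, prev + (value === null || value === undefined ? 1 : 0));
    }
  }
  const rates = new Map<string, number>();
  for (const [col, nulls] of counts) rates.set(col, nulls / rows.length);
  return rates;
}

// Columns whose null rate exceeds the threshold are candidates for an alert.
function anomalousColumns(rows: Row[], threshold = 0.1): string[] {
  return [...nullRates(rows).entries()]
    .filter(([, rate]) => rate > threshold)
    .map(([col]) => col);
}
```

In a real pipeline, the same rule would run over a sampled batch or streaming window and feed the alerting and case-management capabilities listed above.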

General Responsibilities

        Cross-Functional Work: Collaborate with global teams to integrate with existing internal systems and GCP cloud. Partner with the development team to improve service quality through rigorous testing and release procedures

        Issue Resolution: Triage and resolve product or system issues, ensuring quality and performance

        Documentation: Write technical documentation, support guides, and run books

        Agile Practices: Participate in sprint planning, retrospectives, and other agile activities

        Compliance: Ensure software meets secure development guidelines and engineering standards

Quality Engineering Accountability

        Test Strategies and Plans: Develop in conjunction with App Engineering, Architecture, and Prod Arch, including mock data creation and management.

        Regression Tests: Identify and assure creation of reusable, automated tests to detect defects early, creating a test automation suite and the necessary documentation.

        Change Management: Influence CI/CD, tools integration, and SDLC recommendations to ensure adherence to Engineering Handbook, including security.

        Validation: Validate pre-deployment and post-deployment plans, record results, and complete vulnerability and penetration testing.

        Code Quality Reports: Generate using tools like SonarQube and Fortify.

        Customer Focus: Ensure end-customer needs are met and drive processes for a flawless customer experience.

Must-Have Skills

        Cloud-Native Application Development: 3+ years. Solid experience with software QA methodologies, tools, and processes, specifically in a cloud-based environment

        Frontend and backend software testing: 5+ years of experience working in a TDD/BDD environment, utilizing technologies such as JUnit, Rest Assured, Appium, JBehave/Cucumber frameworks, and APIs (REST/SOAP)

        Java Experience: 5+ years of general proficiency with Java, in the context of writing test cases

        Frontend Development and Testing: 3+ years with Angular, JavaScript, TypeScript, or other modern web application development frameworks; Jasmine, Jest, and other unit testing frameworks; Selenium, Cucumber, and other integration testing frameworks

        Architecture Knowledge: Understanding of modular systems, performance, scalability, security

        Agile Experience: Agile development mindset and experience

        Service-Oriented Architecture: Knowledge of RESTful web services, JSON, AVRO

        Application Troubleshooting: Debugging, performance tuning, production support

        Test-Driven Development: Unit, integration, and load testing experience; profiling (e.g., Java JVM, databases); and tools such as LoadRunner and JMeter

        Documentation Skills: Strong written and verbal communication

        General SDLC: Experience with CI/CD concepts, tools including Jenkins/Bamboo, and release management concepts. Understanding of GCP services related to big data, such as BigQuery, Dataflow, Pub/Sub, GCS, and Composer/Airflow, or similar AWS solutions: Redshift, SNS, SQS, S3, Kinesis, and others
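As a concrete illustration of the load-testing line above, the nearest-rank percentile below is the kind of summary statistic (e.g., p95 latency) that tools like JMeter report. This standalone TypeScript version is a sketch for illustration, not any tool's actual code.

```typescript
// Nearest-rank percentile over latency samples in milliseconds.
// p is in [0, 100]; e.g., percentile(samples, 95) is the p95 latency.
function percentile(samplesMs: number[], p: number): number {
  if (samplesMs.length === 0) throw new Error('no samples');
  const sorted = [...samplesMs].sort((a, b) => a - b);
  // Nearest-rank: the smallest value such that at least p% of samples are <= it.
  const rank = Math.ceil((p / 100) * sorted.length);
  return sorted[Math.max(rank, 1) - 1];
}
```

A load-test assertion might then fail the build when p95 exceeds a service-level threshold, turning performance into a regression test.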

Nice-to-Have Skills

        Big Data Processing: ETL/ELT experience

        Linux/Unix: Bash shell scripting

        Scripting Languages: Groovy, Python

        Containerization & Orchestration: knowledge of Docker, Kubernetes

        Cloud Certification: Relevant certifications in cloud technologies

--

Thu Oct 03 20:58:00 UTC 2024

Location: Alpharetta, Georgia