QA Process Setup and Testing for the Meeting Platform


Industry

Office Software, Online Meetings

Country

United States

Type of Service

Manual testing

Cooperation Type

Part-time

Project Type

Mobile testing
Web testing

Overview

Anchor AI is an AI-powered meeting recording and organization tool built to simplify work life. Users can record offline conversations, transcribe uploaded audio files, create searchable transcripts, add tasks to to-do lists, and share notes with others. Anchor AI also connects to live meetings on platforms like Zoom, Google Meet, and Microsoft Teams.

Challenge

The client needed QA expertise for their web application. The product was already live when the QA engineer joined the team. At that point, the testing process was chaotic: there was no clear flow and no dedicated specialists, with other team members testing the product only occasionally. Thus, the team requested setting up the QA process, followed by ongoing software testing.

Companies tend to seek external expertise in auditing and setting up a QA process less often than in testing alone. In this case, Anchor AI’s team understood the value of quality assurance and was open to changes.

The key challenge was that the platform was already live. In such a case, it is critical to align the QA expert’s work with the existing workflows, causing minimal interference while delivering maximum efficiency.

Solution

The QA engineer joined the project during an active development phase. Their initial suggestions included the following:

  • To log all defects in Jira instead of a separate document.
  • To test all the completed tasks before the release instead of after.
  • To join the team meetings – to better understand the upcoming functionality and be able to analyze the requirements.
  • To run smoke testing after every release.

Before setting up the QA process, the QA engineer ran exploratory testing to get familiar with the product. They also suggested logging bugs using a new, standardized template, which would make the reports easier for developers to work with and keep the documentation in order.

Scope of Work

After running the first iteration of exploratory testing, the QA expert moved on to the QA process setup and testing. The activities at this and subsequent stages included:

  • Deciding on the bug-tracking system and discussing it with the team.
  • Establishing a clear and easy-to-follow task flow.
  • Implementing smoke and regression testing.
  • Providing essential baseline test coverage and the corresponding test documentation.
  • Including the QA engineer in the team meetings for requirements analysis.
  • Testing the web application’s features and sub-features on Windows and macOS.

Testing Coverage

This is an ongoing project, and the QA engineer currently works part-time. Thus, it was important to cover the critical types of testing within the agreed timeline.

  • Functional testing: to verify the platform’s features against the requirements.
  • UI testing: to check the visual consistency and functionality of the interface.
  • Compatibility testing: to run cross-browser and cross-platform checks on different devices.
  • Retesting: to verify the functionality after defect fixes.
  • Smoke testing: to check the vital functionality after every code iteration.
  • Regression testing: to run a comprehensive check between an update/upgrade and its deployment.

The QA engineer chose to use checklists instead of test cases, as this is more efficient given the available time. The application’s functionality keeps changing and growing, so the documentation is updated accordingly.

Results

Over 180 bug reports have been submitted so far, and more than 50 critical defects and blockers have been detected. The development team has started fixing the critical bugs, and the process is ongoing.

The QA expert has been proactive from the beginning of their involvement in the project. They regularly:

  • Share feedback on requirements to prevent issues at the earliest possible stage.
  • Write and update the checklists for the functionality under development.
  • Test the completed functionality before deployment.
  • Run post-release smoke testing.

After the QA specialist joined the project, the number of defects in production decreased. Now, all critical defects are discovered and fixed before release.

Let’s Start a New Project Together

QA Madness helps tech companies strengthen their in-house teams by staffing dedicated manual and automated testing experts.

Anastasiia Letychivska

Head of Growth

Ready to speed up the testing process?