
Motivation

Test IO is committed to supporting testers' growth. To make sure your testing career follows a rising path, we decided to share the most common mistakes our testers make during the early days of their careers. Even experienced testers make mistakes sometimes, and they learn from them.

It is very tempting to try everything when you are testing, even unusual steps that a normal user would never take. When this urge strikes, stop yourself before you submit a bug report. Ask yourself whether your steps reflect normal user behavior and whether the bug you found will be interesting to our customers. Always consider whether your bug impacts the normal user flow.

Let's look at some of the most common mistakes we have gathered for you.

Testing the Email Subscription using an email you don't have access to

Often our testers want to prove that the Email Subscription or Registration flow accepts invalid emails. Checking whether the server accepts invalid emails is perfectly fine and useful for our customers, but before you proceed, make sure that you understand the difference between invalid and non-existent emails. One might say they are the same thing, but they are not. Please read this article to understand what invalid emails are.

Non-existent emails, on the other hand, are addresses that were never created by you (or anyone else) and hence don't exist in the system. An attempt to validate a non-existent email will most likely produce no reaction from the server, as the address will be considered valid as long as it follows the email structure pattern. Such behavior is expected and is not considered a bug.
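To make the distinction concrete, here is a minimal JavaScript sketch of a format-only check, similar in spirit to what a server-side validator does. The regex and the addresses are illustrative, not Test IO's actual validation logic:

```javascript
// A format-only email check: it can tell an INVALID address from a
// well-formed one, but it cannot tell whether the inbox actually exists.
// The regex is a simplified illustration, not a production validator.
function looksLikeEmail(address) {
  // Anything shaped like local-part@domain.tld passes.
  return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(address);
}

console.log(looksLikeEmail("plainaddress"));             // false: invalid format
console.log(looksLikeEmail("no.such.user@example.com")); // true: valid format,
                                                         // even if no such
                                                         // inbox exists
```

This is why a well-formed but non-existent address produces no error: the server can only verify the format, not the existence of the inbox.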

The solution:

  • If you're testing with commercial email providers, you MUST use an existing inbox. That way we won't ping Gmail, Outlook, or any other email provider with invalid requests.

  • If you want to test validation, you must use qa.team emails. Since all valid qa.team usernames exist, testers will never send a request to an undeliverable inbox, so they will only be testing the validation itself.

  • For staging environments, qa.team emails should always be used for regular registration (unless stated otherwise by the customer).

  • For invalid email validation, use the examples shared in the article or create your own combination that follows the same pattern.

Reporting bugs related to browser validation

Browser validation is the HTML input validation performed by the browser based on attributes of the input element.

Here is an example of HTML5 browser validation of an email input:

<input type="email" id="email"
pattern=".+@example\.io" size="30" required>

If you see something similar, it means that browser validation is implemented and that the validation is not done by JavaScript.
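Under the hood, the browser compiles the pattern attribute into a regular expression and requires it to match the entire input value, as if it were wrapped in ^(?:...)$. Here is a simplified JavaScript sketch of that behavior; the pattern and addresses are illustrative placeholders, not a real customer's:

```javascript
// Sketch of how a browser applies an input's pattern attribute:
// the pattern must match the WHOLE value, so the browser implicitly
// anchors it. The pattern below is an illustrative placeholder.
function matchesPattern(value, pattern) {
  const anchored = new RegExp("^(?:" + pattern + ")$");
  return anchored.test(value);
}

const emailPattern = ".+@example\\.io"; // as in pattern=".+@example\.io"

console.log(matchesPattern("tester@example.io", emailPattern));  // true
console.log(matchesPattern("tester@example.com", emailPattern)); // false
```

Because this check happens in the browser itself, a value the pattern rejects never reaches the customer's server, which is why such rejections are not reportable bugs.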


One of the most common examples of browser-level behavior is the red zig-zag line the browser displays under a misspelled word you type into a form field.

Reporting such bugs is not in the scope of the test cycles run by Test IO, and such bugs will be rejected.

The solution:

  • When you are unsure what kind of input validation the customer has implemented for a particular environment (website), right-click on the page and then click View Page Source. If you notice code such as <input type="email"...>, it means that the email field is validated by the browser and that you should not report the bug in the test.

Testing without the proxy enabled when it is requested in the test instructions

Sometimes our testers don't read the test instructions thoroughly, which causes their bugs to get rejected or, in the worst scenario, earns them a warning from our Anti-Cheat Unit. One of the most common mistakes is not using a proxy when it is required. We did a bit of investigating, and here is what the issue was for testers:

In recurring tests, testers read the test instructions once, and when the next test from the same customer, for the same product, comes along, they assume the testing strategy will be the same and that rereading the instructions would be a waste of their precious time. This is where they make a mistake. The same test title doesn't mean that access to the test environment follows the same path.

Many times, our customers want to separate the traffic our testers generate from the traffic real users generate. In such cases, we use a proxy. Another case is getting access to a staging environment that is locked to anyone without the proper pass: an enabled proxy. If a tester forgets to enable the proxy, the staging environment will return 403 or 1020 errors. Reporting such bugs will trigger rejections for not following the test instructions.

The solution:

  • When you accept a test cycle in which using a proxy to access the testing environment is required, you must follow the test instructions precisely, or the bugs you find won't be considered legitimate.

Upgrading Content and Visual bugs into Functional ones to bring them within the scope of the test

Sometimes our testers unintentionally upgrade Content and Visual bugs into Functional ones because they lack the experience to determine which bugs should be upgraded. As a result, they get rejections. Many of those rejections are for Out-of-Scope reasons, which means the tester's Quality and Rank will suffer greatly.

The solution:

  • Focus on learning the difference between Content, Visual and Functional bugs. Understand that if a Content or Visual bug doesn't prevent the product's functionality, or if an intuitive workaround exists, you should not submit it as a Functional bug.

Not updating the device OS before accepting the Beta test cycle

This practice is well attested not just among our newbies but also among our experienced testers. Mostly, it is not intentional. Our testers participate in so many tests during a week that they sometimes forget to update their device's OS before signing up for Beta test cycles.

The solution:

  • It is human to make mistakes, but make sure you always read the test instructions, because they contain important information about the requested device and Beta OS.

  • When the Beta version of the OS is required, you should not test the product on the official build but on the Beta version.

Selecting the wrong device in the Bug Report

In their constant effort to submit bugs faster than their fellow testers, some testers make a terrible mistake: selecting the wrong device in the Bug Report. That is how some valid, in-scope bugs get rejected. If the test is still running, the tester can submit the same bug with the correct device selected, provided another tester hasn't already done so.

The solution:

  • Before you hit the Submit Bug button, double-check that you picked the correct bug type, severity, device and browser. If you select the wrong device or browser, you won't be able to change it after you submit the bug. You would need to delete it and resubmit it (if it hasn't already been reviewed by the TL).

Sending a message in the test cycle chat to notify TL that you made changes TL requested in the Bug Report

In the early phase of testing, our testers receive multiple Information Requests to improve their bug reports. During that time, testers might feel impatient to see the outcome of their submission. That is when testers often fall into the trap of sending several comments in the Bug Report, or even messages in the test cycle Chat, to inform the TL about the changes they made. Sounds familiar, right? We have all been there, and we learned from our mistakes. But we want you to be smarter than that and learn from ours.

The solution:

  • When you answer an Information Request with all the necessary information your TL requested, please refrain from sending multiple comments in the Bug Report and messages in the test cycle Chat. TLs receive notifications about completed Information Requests, so there is no need to get nervous or anxious. All bugs will be reviewed in time, because our system disallows closing any test with Bug Reports in the Unreviewed state.

Using Google Translate to translate the testing environment while you are recording the bug

One of the advantages of modern technology is that you don't need to speak every language in the world to test a product in a foreign language. Using a third-party translation tool like Google Translate is permitted while testing, but make sure the bug you found is not caused by Google Translate. Sometimes Google Translate breaks the environment, and our testers end up submitting non-existent bugs. In other cases, our testers submit a bug that does exist and is not caused by Google Translate, but they forget to disable the tool when recording the bug. When that happens, the bug will be rejected by the TL because it doesn't follow our standards.

The solution:

  • Every time you test an environment in a language you don't speak, use a third-party translation tool for easier understanding of the product, but disable it just before you start recording a screencast.

Reproducing a PASS bug in the test

This behavior occurs in tests where our testers are asked to submit one Functional bug with the title "PASS" to prove that the workflow described in the test instructions completes successfully. Submitting reproductions of such bugs is not allowed, and such submissions will be rejected.

The solution:

  • When you see "PASS" in the title of a bug report, please don't submit a Reproduction.

Mistakenly assuming that filtering and sorting are similar functionalities

Usually, filtering and sorting are presented together because they both help users handle large sets of items (products, movies, tickets…); however, their implementations differ greatly. Filtering reduces a collection of items based on specific criteria like size, color, or brand. Sorting orders a given data set by different criteria, such as price from low to high or date from newest to oldest.

Understanding these differences is crucial, since bugs found in these functionalities belong to different types. For instance, while issues with sorting functionality are functional bugs, most filtering issues are content bugs.
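The difference is easy to see in code. Below is a short JavaScript sketch with a made-up product list; the data and criteria are invented for the example:

```javascript
// Illustrative product data -- not from any real customer environment.
const products = [
  { name: "Hoodie",  color: "red",  price: 45 },
  { name: "T-shirt", color: "blue", price: 15 },
  { name: "Cap",     color: "red",  price: 20 },
];

// Filtering REDUCES the collection: only items matching the criterion remain.
const redItems = products.filter((p) => p.color === "red");

// Sorting REORDERS the collection: every item stays; only the order changes.
const cheapestFirst = [...products].sort((a, b) => a.price - b.price);

console.log(redItems.map((p) => p.name));      // [ 'Hoodie', 'Cap' ]
console.log(cheapestFirst.map((p) => p.name)); // [ 'T-shirt', 'Cap', 'Hoodie' ]
```

A filter can return fewer items than it started with; a sort never changes the number of items. Keeping that in mind makes it much easier to classify a bug correctly.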

The solution:

  • The easiest way to differentiate the two functionalities is to look at the type and number of options offered. The functionality is filtering when it offers numerous options that describe items' physical characteristics. The functionality is sorting when only a few options are given and they are meant to display items in a particular order.

The list of the most common mistakes doesn't end with the cases mentioned above. For more common mistakes and some really good tips, we suggest you listen to our podcast episode.

