Some of the most well-known test design techniques are Equivalence Classes, Boundary Values, Pairwise, Decision Tables, and State-Transition Diagrams.
You can use them separately, but they work even better in combination; for example, Equivalence Classes, Boundary Values, and Pairwise complement each other well. Let's talk about this.
As an example, let's look at the event-creation functionality of a calendar application. This module is a simple form with fields to fill in and a "Create" button that creates the event.
The fields are as follows: Name, Date, Time FROM, Time TO, Participant, Description, File.
The following requirements are defined for these fields:
Having familiarized ourselves with the requirements for the fields and form elements, we can define equivalence classes for each field. Let's start with positive equivalence classes (one possible option):
Name:
Date:
Time FROM/TO:
Participant:
Description:
File:
Note that for some fields (such as Name and Participant) we have defined only one equivalence class, because there is no point in splitting further: the application's behaviour should not change whether the event name is 10 characters long or 25, made up only of Latin letters or only of special characters.
Of course, we could think of many more classes for each field, but the whole point of combining test design techniques is to choose equivalence classes carefully, keeping only those most likely to reveal a bug. A larger number of "artificial" classes would only increase the number of unnecessary tests.
Now let's define the negative equivalence classes (this is where the Boundary Values technique comes in handy):
Name:
Date:
Time FROM/TO:
Participant:
Description:
File:
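The Boundary Values idea behind several of these negative classes can be sketched in a few lines of Python. The 30-character Name limit and the 100-character Description limit come from the requirements above; the helper name is our own:

```python
def boundary_lengths(limit):
    """Return the string lengths worth testing around a maximum-length limit:
    just below (valid), exactly at (valid), and just above (invalid)."""
    return [limit - 1, limit, limit + 1]

# Boundary lengths for the Name (max 30) and Description (max 100) fields.
name_lengths = boundary_lengths(30)          # [29, 30, 31] -> 31 is a negative class
description_lengths = boundary_lengths(100)  # [99, 100, 101] -> 101 is a negative class

# Sample strings of exactly those lengths, ready to paste into the form.
name_samples = ["a" * n for n in name_lengths]
```

The "just above the limit" value is exactly the "more than 30/100 characters entered" negative class used in the tables below.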
Now that we have the list of classes, we can generate the test data sets that we will use in our tests. To do this, we will use the Pairwise technique (to find unique pairs of values and cut out redundant checks). A handy tool for this is PICT. The resulting test data sets are shown in the tables below.
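As a rough illustration, a PICT model for this form might look like the sketch below. The value names are shortened, and this is our own illustration rather than the exact model used here; running `pict model.txt` prints the generated combinations, one test row per line:

```text
# model.txt — a sketch of a PICT model for the event form
Name:         valid unique within 30 chars
Date:         current date, future date
Time FROM/TO: AM/PM transition, 23:54-23:59, 00:00-00:05, any interval within a day
Participant:  valid email
Description:  valid within 100 chars, empty
File:         5 MB or less, no file attached
```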
POSITIVE CASES
| # | Name | Date | Time FROM/TO | Participant | Description | File |
|---|------|------|--------------|-------------|-------------|------|
| 1 | any valid unique value within 30 characters | current date | AM/PM transition | any valid email | any valid value within 100 characters | any file of size 5 MB or less |
| 2 | any valid unique value within 30 characters | future date | 23:54-23:59 | any valid email | empty | any file of size 5 MB or less |
| 3 | any valid unique value within 30 characters | future date | any interval within a day | any valid email | any valid value within 100 characters | no file attached |
| 4 | any valid unique value within 30 characters | current date | AM/PM transition | any valid email | empty | no file attached |
| 5 | any valid unique value within 30 characters | future date | AM/PM transition | any valid email | any valid value within 100 characters | no file attached |
| 6 | any valid unique value within 30 characters | current date | 00:00-00:05 | any valid email | empty | any file of size 5 MB or less |
| 7 | any valid unique value within 30 characters | current date | any interval within a day | any valid email | empty | any file of size 5 MB or less |
| 8 | any valid unique value within 30 characters | future date | 00:00-00:05 | any valid email | any valid value within 100 characters | no file attached |
| 9 | any valid unique value within 30 characters | current date | 23:54-23:59 | any valid email | any valid value within 100 characters | no file attached |
NEGATIVE CASES
| # | Name | Date | Time FROM/TO | Participant | Description | File |
|---|------|------|--------------|-------------|-------------|------|
| 1 | NOT unique | past date | time FROM is later than time TO | empty | more than 100 characters entered | file of more than 5 MB attached |
| 2 | more than 30 characters entered | empty | empty (both fields or one only) | incorrectly formatted email | more than 100 characters entered | file of more than 5 MB attached |
| 3 | NOT unique | empty | not fully entered time (like 11:_ _) | incorrectly formatted email | more than 100 characters entered | file of more than 5 MB attached |
| 4 | empty | non-existent date (like 32/13/2002) | not fully entered time (like 11:_ _) | empty | more than 100 characters entered | file of more than 5 MB attached |
| 5 | more than 30 characters entered | non-existent date (like 32/13/2002) | time FROM is later than time TO | incorrectly formatted email | more than 100 characters entered | file of more than 5 MB attached |
| 6 | more than 30 characters entered | past date | not fully entered time (like 11:_ _) | empty | more than 100 characters entered | file of more than 5 MB attached |
| 7 | NOT unique | non-existent date (like 32/13/2002) | empty (both fields or one only) | empty | more than 100 characters entered | file of more than 5 MB attached |
| 8 | empty | past date | empty (both fields or one only) | incorrectly formatted email | more than 100 characters entered | file of more than 5 MB attached |
| 9 | empty | empty | time FROM is later than time TO | empty | more than 100 characters entered | file of more than 5 MB attached |
In fact, each of the rows (numbered 1-9) is a separate test to be performed. This way, we do not test each field separately for positive/negative values but the entire form as a whole, which gives us more reliable test coverage. Of course, you could generate more tests by entering every possible combination of equivalence classes into this table, but this is not necessary and would produce far too many tests. By finding only the unique pairs with Pairwise, we have identified the tests most likely to help us find a bug.
It is better not to combine positive and negative values in one test; keep them separate so that you have a clear understanding of which tests have the highest priority.
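PICT does this automatically, but the idea behind pairwise generation can be sketched with a small greedy algorithm in Python. The parameter values below are taken from the positive-case table above (Name and Participant each have a single positive class, so they add no combinations); the greedy strategy is a simplification of what PICT actually does:

```python
from itertools import combinations, product

# Positive equivalence classes for the four fields that have more than
# one class (taken from the positive-case table).
parameters = {
    "Date": ["current date", "future date"],
    "Time FROM/TO": ["AM/PM transition", "23:54-23:59",
                     "00:00-00:05", "any interval within a day"],
    "Description": ["valid value within 100 characters", "empty"],
    "File": ["file of 5 MB or less", "no file attached"],
}

def pairs_of(row):
    """All (position, value) pairs covered by one candidate test row."""
    return {((i, a), (j, b))
            for (i, a), (j, b) in combinations(enumerate(row), 2)}

def generate_pairwise(params):
    """Greedy pairwise generation: repeatedly pick the candidate row
    that covers the most not-yet-covered value pairs."""
    all_rows = list(product(*params.values()))
    uncovered = set().union(*(pairs_of(r) for r in all_rows))
    tests = []
    while uncovered:
        best = max(all_rows, key=lambda r: len(pairs_of(r) & uncovered))
        tests.append(dict(zip(params, best)))
        uncovered -= pairs_of(best)
    return tests

tests = generate_pairwise(parameters)
# Far fewer tests than the 2 * 4 * 2 * 2 = 32 exhaustive combinations,
# while every pair of values still appears together at least once.
```

Since the Time FROM/TO field has four classes and every other field has two, at least 4 × 2 = 8 tests are needed to cover all pairs, which matches the size of the tables above.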
Finally, all negative test data sets should lead to the same result: the event must not be stored in the system, and an error must be displayed next to each invalid field. However, with this approach we may not immediately understand whether the event failed to save because of a negative value in one field or in several. Therefore, we can supplement the execution of our negative tests: replace the negative values with valid ones one field at a time, re-checking the form after each replacement, until only one field still holds a negative value.
This approach can surface the following hypothetical bug: imagine we have reached the point where one field contains a negative value and all the others contain valid positive values. Clicking "Create", we see that the event was saved even though an error was displayed next to the field with the negative value. This means the error-display functionality works correctly, but the functionality that should prevent storing events with invalid values contains a bug, which needs to be investigated further and filed in the bug tracking system. 😉
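The "one negative field at a time" pass described above can be sketched as a small data-set builder. The field names and negative classes come from the tables; the concrete sample values (the event name, the email address) are our own illustrative choices:

```python
# One representative positive value per field (from the positive classes).
valid = {
    "Name": "Team sync",                   # unique, within 30 characters
    "Date": "future date",
    "Time FROM/TO": "any interval within a day",
    "Participant": "user@example.com",     # any valid email
    "Description": "Weekly status call",   # valid, within 100 characters
    "File": "no file attached",
}

# One representative negative value per field (from the negative classes).
negative = {
    "Name": "x" * 31,                      # more than 30 characters
    "Date": "32/13/2002",                  # non-existent date
    "Time FROM/TO": "11:_ _",              # not fully entered time
    "Participant": "not-an-email",         # incorrectly formatted email
    "Description": "y" * 101,              # more than 100 characters
    "File": "file of more than 5 MB",
}

def single_negative_cases(valid, negative):
    """For each field, build a form data set where only that one field
    holds a negative value and everything else is valid, so a failed
    save (or a wrongly successful one) can be attributed to exactly
    one field."""
    for field, bad in negative.items():
        data = dict(valid)
        data[field] = bad
        yield field, data

cases = list(single_negative_cases(valid, negative))
```

Each generated case is then submitted via "Create", expecting the event not to be saved and an error to appear next to the single invalid field.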