User Acceptance Testing (UAT) is a key part of a successful software change process. It inevitably comes towards the end of the project, and it usually does not get the same level of focus that earlier quality assurance phases do.
This is perhaps because there is a drive to ‘Shift Left’, meaning to test early and find issues as soon as possible so they can be addressed as cheaply as possible. UAT is a ‘Shift Right’ topic, but it is just as important as earlier QA activities, and some might say more important, as it is perhaps the most costly type of testing there is.
In some cases, this ‘Shift Right’ consideration has shifted the topic of UAT over the horizon, or perhaps just into the neighbors’ yard, and it does not get the same level of attention that other more development-focused testing does.
One of the symptoms of this forgotten area is the lack of tools and technology used to support UAT and increase its efficiency and effectiveness, which often means the main tools used are spreadsheets and documents.
UAT is most often conducted through manual, tedious processes. This starts with planning what needs to be tested, how it is to be tested, who will do it, and when. A UAT manager without more suitable tools will be reaching for a spreadsheet. It continues with updating this spreadsheet as progress is made and screenshotting errors encountered, either into another spreadsheet or a Word document.
For any project beyond the most minor change, this approach will be inefficient, uninformative, slow, and tedious for the users doing the testing. There are many reasons why you should throw the spreadsheets away and stop killing your UAT efforts, but here are just five for starters.
Fortunately, technological advances to support efficient UAT practices now allow for a much simpler approach, saving time and money for businesses. By switching to modern planning and capture technology, users can continue with everyday responsibilities with the UAT solution tracking the user, documenting issues as they arise, allowing comments, and enabling further information to be recorded if needed.
1. Manual spreadsheets & documents make UAT slow
UAT is the most time-consuming and resource-hungry form of testing. It involves the most significant number of people, probably for the greatest amount of time. It comes at a time when the pressure to finish and go live is at its greatest. So, it should be the obvious place to invest in efficiency.
Although spreadsheets can be used to document Test Cases and steps, they are hard to manage and don’t work well in a shared environment, leading to errors, duplication, and an inability to re-use assets from previous tests. Using spreadsheets alone makes UAT incredibly inefficient and error-prone.
The biggest headache and most painful aspect of not using modern UAT solutions revolve around capturing test results. The time spent by users copying and pasting screenshots and comments into spreadsheets or documents can more than double the UAT effort. Not only that, the data collected may be incomplete, leaving critical holes in the audit trail or the diagnostics of an issue, resulting in more wasted time for those tasked with fixing problems.
Centralized spreadsheets create another issue: when a user doesn’t have the correct access, the testing session can grind to a halt. Also, when users copy the last project’s spreadsheet, this may introduce errors if the test cases are not the correct version, and there is no traceability to show where they have come from.
2. Individual spreadsheets create inconsistencies
In the context of user feedback, allowing flexibility in how it is expressed is good, but at the same time the information received needs structure to be used effectively and interpreted correctly. A UAT manager relies on users reporting their feedback in a consistent and usable way, with the appropriate level of detail and accuracy, but this is hard to achieve with a spreadsheet and document-based strategy.
Even if the UAT manager has designed a uniform spreadsheet for the users to ensure a systematic approach is being taken, the quality of each user’s approach will still differ.
It becomes hard for the UAT manager and other receivers of the test results to understand and process the varied and incomplete information.
3. Spreadsheets rely on user consistency
Due to the manual nature of spreadsheet UAT, managers rely on users to be consistent, accurate, and meticulous during the process. For successful testing, all detected errors and issues need to be recorded for review and analysis. Depending on the size of the project and the change, many users may be testing the new system, or very few, emphasizing the need for systematic and accurate reporting of system issues.
This reliance on the diligence of users is a significant drawback of using manual spreadsheets for UAT. Some users will have been willing to participate, whereas others will have been instructed to participate, and their responses will be less engaged and meticulous. Manual spreadsheet methods mean users’ input will differ drastically and may often fall short of comprehensive reporting; when users are busy with their own duties, reporting on the system may not be a priority.
While documenting issues or providing feedback, the steps leading up to the key moment are very often significant. This is especially true of genuine errors when the most critical data is what happened before the error appeared, so it can be understood and reproduced. This means that the user either needs to capture everything as they go along or go back and reproduce and simultaneously document the sequence of events. This is immensely time-consuming.
4. Spreadsheets create an inaccessible data mountain bottleneck
If there is a single process (test management) controlling the spreadsheet with all the allocated tasks and their statuses, it will usually only be practical for it to be maintained by one person, the UAT manager. This maintenance will be based on a mountain of feedback from users in slightly varying forms and will be dependent on their timely updates.
Furthermore, it is impossible to get a big-picture overview of the data to see how it all joins up – what has been done, where defects have come from, and what progress is being made. Consequently, a large amount of the UAT manager’s time is spent unscrambling this data mountain, converting it into a usable form, and communicating to and fro with the users, product owners, development teams, or vendors.
5. The Cobbler’s children?
A fundamental goal of User Acceptance Testing is to ensure the change being tested delivers the expected value and benefits. It achieves the required business objectives, usually improving the efficiency of a company’s operation. But if UAT is still carried out inefficiently, this will negate the improvement being tested, reducing its value because of the effort involved in the implementation.
Much like the busy Cobbler, who does not have the time to repair his own children’s shoes, we may have neglected to look after and support the users efficiently.
By utilizing capture tracking technology in the testing process, you can be assured that the testing is accurate and efficient and not distracting users from their day-to-day responsibilities.
The testing process needs to be a seamless and accurate representation of the system’s regular day-to-day activities. Capturing tools provide an excellent opportunity for the testing process to be monitored thoroughly while not requiring an excess use of time or effort from users.
Proper tools are more efficient for UAT.
Dedicated UAT solutions offer quick and easily implemented alternatives to this world of spreadsheet inefficiency and disorganization.
There are two key components:
1. A UAT management platform, which will:
- Organize test assets and enable re-use
- Provide traceability for the test hierarchy
- Support timetabling and planning
- Provide users and testers with the information they need when they need it
- Enable real-time communication and collaboration
- Provide a database of results, feedback, and issues
- Enable triage of results
- Support analysis and metrics
2. A UAT and manual test capture tool, which will:
- Auto-capture screens and inputs, saving many hours of users’ time
- Provide technical diagnostics
- Create clear result and issue reports supporting understanding and re-creation
- Be painless to use and implement