Usability testing - Nancy Direct

Improve usability of public service platforms through testing

Context

Discover usability issues on Nancy's Town Hall service platform, test them, and suggest improvements. The platform was created in 2019, as confirmed by participants.

Why Nancy?
We were using the French government's new online services, and we wondered how "mairies" (town halls) performed.
We chose Nancy, an eastern city with a significant young population, mostly students, and therefore strong growth potential.

Nancy is a town in the North East of France, not too far from Germany. It is one of the major cities in the region, the others being Strasbourg, Reims, Metz, Mulhouse and Colmar.
The project followed four steps:
1. Start with a heuristic review of the platform
2. Establish scenarios for usability testing
3. Conduct usability testing
4. Analyse results and suggest improvements

Heuristic review

The town of Nancy has a website and this platform, which provides city hall services online.

The platform presents 14 service categories, e.g. "Papers, citizenship, elections" lets users get new ID cards, obtain a birth certificate, etc.

It follows a mobile-first design and is presented as a progressive web application; however, the app only works on Android.

When a user selects a service, a file is created.
Each service ticket is tied to a tracking code. Users complete a multi-step form and submit it.
There are three ways to access the form again: users can enter the tracking code, log into their account specific to the Nancy Town Hall e-service stack (from which they can access all active procedures), or connect through the national authentication system FranceConnect.

Uncertainty management & error handling

- No indication of the maximum number of characters when filling out forms for most requests. This is inconsistent, as certain forms do include this feature.

- No pre-selected list for addresses in Nancy; users can enter incorrect or fictitious addresses.

- Without an account, it is easy to lose the tracking code if the procedure is paused but not abandoned, with no recovery method. If an account is created after the fact, the file is not retroactively tied to it; contacting city hall is necessary to recover progress.

- It is possible to click on the tracking code to receive a reminder of the code by email; however, the code block does not look like an interactive element.

- Tracking codes are easy to guess at random, with no hacking expertise required: they are 8-letter combinations that include no symbols, vowels, numbers, or lowercase letters. This creates cybersecurity risks.

- When a form is complete and the procedure has started, a bar shows the progress made by the Town Hall. However, some processing stages are not relevant to the user; only the response and the end signal have meaning for the platform's target users.

- No estimated time in days shown for procedure processing.
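The brute-force risk noted above can be checked with simple arithmetic. The sketch below assumes the code alphabet is the 21 uppercase consonants (26 letters minus the 5 vowels) and a guess rate of 1,000 attempts per second; both figures are our assumptions, not platform specifications.

```python
# Keyspace estimate for the 8-letter tracking codes described above.
# Assumption (ours): the alphabet is the 21 uppercase consonants, since
# vowels, digits, symbols and lowercase letters are excluded.
ALPHABET_SIZE = 26 - 5  # uppercase letters minus the 5 vowels
CODE_LENGTH = 8

keyspace = ALPHABET_SIZE ** CODE_LENGTH
print(f"Possible codes: {keyspace:,}")  # 37,822,859,361

# At a hypothetical 1,000 guesses per second, a full search would take:
years = keyspace / 1_000 / 86_400 / 365
print(f"Full search: ~{years:.1f} years")  # ~1.2 years
```

A keyspace this small is why adding digits, lowercase letters, or server-side rate limiting would materially raise the cost of guessing a valid code.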

Navigation & uncertainty prevention
The screenshot shows the issues with how the platform presents the tracking code to users:
1. The tracking code is on the right side of the screen. Since most users read left to right, it is not immediately noticeable.
2. Users are busy filling in the form and may not notice the tracking code.
3. When users check the right-hand column to locate their current step, they will not necessarily see the tracking code above it, since white space separates the two groupings.
4. At the end of the process, not only is the tracking code no longer on the right, but the sentence mentioning it is not graphically distinguished from the rest.

Account management

- Redirections to other domains requiring different accounts are not clearly indicated, especially for citizenship and electoral procedures. A Nancy Direct account is useless in those situations.

- When an account is created, a graphically alarming message appears (bright color + "!" symbol).

- It is possible for admins to delete user accounts for improper use or spam. However, this deletion did not prevent said users from viewing their procedures via the tracking code.

The screenshot shows an uncertainty-prevention issue: clicking on certain services redirects users to entirely different platforms with no warning. Here, a request for an ID card redirects users to the national platform for passports and ID cards.

Methodology

We decided to conduct tests with 2 groups:
- 2 Town hall workers (experts)
- 2 Nancy citizens (end users)
If we find a gap in performance between experts and end users, then we can assume that the platform is designer-centric and not user-centric.

A pilot round was conducted with another participant before the test rounds to refine our research methods. Each round consisted of three steps:
1. Pre-test survey
2. 6 usability scenarios
3. Post-test survey
All three steps are done consecutively on the same research round. Researchers are present during the tests.
Participants are encouraged to speak aloud during the test.

Usability scenarios

1. Create an account. This opening scenario eases participants into the more complex tasks.
2. Find the form to sign up for a first aid workshop; you can expand the categories but not click on any service.
3. Fill in the first page of any form. Don't send it; close the tab. Now try to resume filling in the form you started.
4. From the Nancy en Direct homepage, start a request to digitize a document at a library. Fill in the form and, just before sending it, change your email address.
5. Given a list of 4 goals, select which links you think contain the desired form, without starting them.
6. Access "pay your daycare invoices", then come back to the Nancy en Direct homepage.

Data collected:

Quantitative
- Time to perform scenario
- Success or quitting. If a scenario takes more than 4 minutes, it's considered unsuccessful.
- Where participants choose to click first
- Number of clicks
- Number of mistakes made

Qualitative
- Participants' feelings, captured through the think-aloud protocol

Scenario 1 was adapted for participants who already had an account: instead of creating one, they were asked to sign up to "Greenhouse Tuesdays".
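The quantitative measures listed above can be sketched as one record per scenario run, including the 4-minute success cutoff. This is our own illustrative structure, not the team's actual tooling; all field names and example values are hypothetical.

```python
from dataclasses import dataclass

TIME_LIMIT_S = 4 * 60  # runs over 4 minutes count as unsuccessful


@dataclass
class ScenarioRun:
    participant: str   # e.g. "expert-1", "user-2" (hypothetical labels)
    scenario: int      # 1..6
    duration_s: float  # time to perform the scenario
    completed: bool    # reached the goal (as opposed to quitting)
    first_click: str   # where the participant chose to click first
    clicks: int        # total number of clicks
    mistakes: int      # number of mistakes made

    @property
    def success(self) -> bool:
        # Success = completed within the 4-minute limit.
        return self.completed and self.duration_s <= TIME_LIMIT_S


run = ScenarioRun("user-1", 2, 215.0, True, "category list", 14, 3)
print(run.success)  # True: completed in under 4 minutes
```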

Findings

Overall, navigation and interaction issues are the most pressing.
Problems regarding error prevention do not immediately turn off users, but can add up to an unpleasant experience.
Account management issues were more difficult to test for due to the chosen scenarios.

The pre-test survey shows that all participants prefer using the online service over traveling to the Town Hall in person.

The scenarios that were the hardest to complete were:

- Scenario 2: the form is actually in "Papers, citizenship and elections". Professionals needed at least a minute; end users took between 3 and 4 minutes. One user tried to look for "first aid" in the tracking code input bar, thinking it was a search bar. Participants found the task frustrating, which confirms our hypothesis that navigation is difficult due to unclear wording and illogical organization.
 
- Scenario 3: 3 out of 4 participants failed. Participant 1 clicked on a form that directed her to another platform and did not notice, since the redirection was not graphically salient; Participant 4 did not appreciate getting no warning about the tracking code before starting the form; and Participant 3 is used to having a "save draft" button. This failure rate confirms our review.
 
- Scenario 4: we expected end users to fail the task, since changing the email address cannot be done on the form recap; users need to go back to the first step of the form. Expectations were met: experts succeeded and end users failed.
 
- Scenario 5: results confirm our review regarding wording issues.
 • Sub-task 2: "obtain authorization to occupy a public space to carry out private constructions".
No participant found the right sub-category; it took between 1:36 and 2:36 before participants abandoned the task.
 • Sub-task 3: "notify authorities of fallen trees on roads".
Everyone found the right category, but no one clicked on the right form: all agreed that it should be in "Road obstructions" and not in "Requests concerning nature areas".

Post-test survey

In every aspect, experts gave higher ratings than end users.
Ergonomics shows the widest gap between the groups: end users are tempted to use the general contact form to find a solution more quickly, or even to travel to the Town Hall to submit their requests.

Average ratings for each usability aspect of the platform, after the usability tests were conducted:
- Categorization - 2.75 out of 5
- Wording - 2.75 out of 5
- Graphic design - 3 out of 5
- Ergonomics - 2.5 out of 5
Rounded up, the platform scores 3 out of 5.
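The overall score follows directly from the four aspect ratings; a quick arithmetic check:

```python
import math

# Aspect ratings from the post-test survey (out of 5).
ratings = {
    "Categorization": 2.75,
    "Wording": 2.75,
    "Graphic design": 3.0,
    "Ergonomics": 2.5,
}

average = sum(ratings.values()) / len(ratings)
print(average)             # 2.75
print(math.ceil(average))  # 3 once rounded up
```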

Essential areas of improvement

On the home page:
1. Add a search bar and filtering options.
2. Center the information architecture (IA) and wording on users' mental models. High-level folders can be rearranged by utility and likelihood of use rather than alphabetical order.
3. Show each form's purpose without needing to click on it; tooltips or even a drop-down box can work.

On forms:
4. Clearly teach users the purpose of tracking codes and account creation before a form is well underway.

High-fidelity wireframe of the solution proposed to solve the issue of presenting tracking codes to users. We simply added a note at the top of the forms: "Do not forget to note down this tracking code somewhere. If you don't have an account, you will need to use this code to track your application. Click on the number to get an email with it."
High fidelity wireframe of the solution proposed to solve the issues of hierarchy of categories, information browsing and uncertainty.
1. We added a search bar; keywords are enough to highlight the requested service.
2. We reorganized service groups by how often they are used and how many services each group contains, replacing the alphabetical ordering.
3. For each service, we added tooltips on click: a question mark icon is clickable on each service, and a tooltip window displays what the service is about, right on the homepage.