UI PROTOTYPE & USER TESTING FOR A TAX REPORT APP
Pen, Paper, Proto.io, Adobe Illustrator
Developing a solid UI prototype that is highly functional for testing, while achieving a trustworthy look and feel.
Set up a new user account and start filling in the taxable-income section, either by using the auto-fill feature or by typing the values in manually.
When this project was assigned to me in February, it was the beginning of the tax season, a good time to run a quick test on a tax-reporting app! There are a few similar apps on the market at the moment; however, I wanted to challenge myself to see whether I could make the process any easier by simply taking a photo of the slips instead of copying the numbers down manually. Rapid prototyping with pen and paper was a perfect method to see whether the "photo-capture" process was intuitive enough at a glance, and, if it was, how users actually interact with it.
To avoid cutting a million sticky notes down to waste, I first started with a flow chart based on a short task scenario. The tax-reporting process has well-defined steps by nature, so I did not have much to sort out. But I could definitely add extra features, such as auto-fill, to help the process move faster.
I do believe that the fewer screens or steps there are, the easier it is for the user. However, tax reporting still involves quite a few steps and documents, so there is only so much I can simplify on my end. Hence, instead of merging every single section into one trillion-pixel-long screen, I decided to number each new section on its own screen, in hopes of giving the end user a sense of progress toward a goal.
Screens in order of expected appearance
From what I observed during low-fidelity prototype testing of the same task, my testers did not necessarily have a problem understanding the main functions or completing the task. However, there were a few questions and suggestions regarding smaller details that I had not thought of until I tested, such as confirmation pop-ups or differentiating between the save and submit buttons.
Along with the revisions from the low-fi prototype, I also had to come up with an aesthetic solution to make the app look dependable and approachable at the same time. I decided to use bright green as the primary body colour, to communicate both trust and easy accessibility to the audience.
- Adding a menu button and a slide-in side panel.
A hamburger icon has been added to the top left of all screens to toggle a sidebar panel containing the menu. This menu navigates to the five (six, if including the wordmark) most crucial / frequently used functions: 1) view the user profile, 2) start a new report, 3) open the saved reports, 4) check the status of submitted reports / returns, and 5) exit the app.
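If this sidebar were carried into code, its five destinations could live in a small data structure. A hypothetical TypeScript sketch follows; the `id` values and labels are my own illustrations, not taken from the actual prototype:

```typescript
// Hypothetical sketch of the sidebar menu model; identifiers are illustrative.
interface MenuItem {
  id: string;    // route or action identifier
  label: string; // text shown in the slide-in panel
}

// The five crucial / frequently used functions, in panel order
// (the wordmark at the top of the panel would be a sixth, static entry).
const sidebarMenu: MenuItem[] = [
  { id: "profile", label: "View user profile" },
  { id: "new",     label: "Start a new report" },
  { id: "saved",   label: "Saved reports" },
  { id: "status",  label: "Submitted reports / returns" },
  { id: "exit",    label: "Exit" },
];
```

Keeping the menu as data like this makes it easy to render the same list in both the prototype and a later build.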
- Also adding a mini calendar for choosing dates, instead of typing them in.
A standard drag-and-pick calendar pops up for choosing dates such as birth dates.
- Need to be able to distinguish between "save" and "submit" actions clearly via icons.
At the bottom of the screen, in the page navigator, a "submit" (letter-envelope) icon has been added beside the "save" (floppy-disk) icon. The "submit" icon only appears at the end of all the taxable-information sections. Finally, the user has to interact with a confirmation alert upon tapping "submit".
- Need to communicate / confirm with the users about saving the user profile for future use.
Another pop-up alert, regarding the user-profile update, has been added. Also, on any alert or confirmation screen, the "confirm" (green checkmark) button has been moved to the right side and the "cancel" button vice versa, for right-handed users.
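That button-ordering convention can be pinned down in one place so every alert follows it. The sketch below is hypothetical (the function and field names are mine, not from the prototype):

```typescript
// Hypothetical sketch of the alert layout convention; names are illustrative.
interface AlertButtons {
  left: string;  // "cancel" always sits on the left...
  right: string; // ...and "confirm" on the right, for right-handed users
}

// Build the button row for any confirmation alert
// (save, submit, or profile-update), keeping the ordering consistent.
function alertButtons(confirmLabel: string): AlertButtons {
  return { left: "Cancel", right: confirmLabel };
}

const submitAlert = alertButtons("Submit");
```

Centralizing the layout this way means a later change (say, mirroring for left-handed mode) touches one function instead of every dialog.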
Screens in order of expected appearance
As this project started as a course assignment, I initially prototyped in InVision, per the assignment expectations. However, while testing the med-fi prototype on an actual mobile device, I found that testers, including myself, were almost constantly "swiping" across the screen by accident, and in InVision this "swipe" navigated them to screens that were supposed to wait for a deliberate interaction. Since this UI is designed for a mobile app, and the task scenario involves an actual body motion (taking a photo using the built-in camera), I wanted to keep testing on a mobile device. (On a desktop display, InVision works just fine!) Unfortunately, I could not solve the issue on my end, so I decided to give other prototyping tools a try.
For this project I tried the Proto.io web app, which supports both layer-based and page-based workflows. I lined up multiple "pages" in order first, then started bringing layers of pop-ups and other features into the pages.
Although the photo capture is a critical part of the task, I'm not aware of any prototyping tool that allows access to a mobile device's built-in camera, so I simply simulated a camera screen separately and brought it in as a page.