Refraction Eye Exam Application
A full refraction test at home
This application enables individuals to obtain their eye parameters without visiting an optometrist. It can make a significant impact for people who face challenges accessing professional eye care, by making the exam available to a broader audience.
My Role:
Visual conceptualization, detail design, and user testing.
Background:
This cutting-edge technology is built upon distinctive targets tested and developed by professional optometrists. I had the privilege of serving as the primary UX lead, planning the user journey.
This involved combining clear graphic elements with audio instructions for seamless guidance, ultimately leading individuals through a captivating yet challenging subjective test.
Scroll through this page to follow the entire journey, covering both the design and research processes.
1. Set up
Users found eligible for the exam following a clinical questionnaire proceed to configure their environment.
The Set up process encompasses four primary conditions:
Phone Placement:
This crucial factor significantly influences the outcomes, potentially impacting the alignment of eye parameters and introducing distortions to the user's vision.
User Distance:
Success in the examination requires the user to accurately determine their position and align themselves in front of the phone.
The measured distance dictates the size of the target displayed later.
Room Lighting and Face Recognition:
Clear identification of the user by the phone camera is crucial: good lighting aids the algorithm in recognizing the iris and mitigates issues such as squinting or incomplete eye coverage.
2. Speak or Listen?
A fundamental aspect of the eye examination involves prompting users with questions about their perception of the targets.
Based on the user's response, the application determines the subsequent presentations and their formats.
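As an illustration of how a response could drive the next presentation, here is a minimal sketch. The type names, sizing rules, and prompts are assumptions for the example, not the app's actual logic:

```typescript
// Hypothetical sketch of response-driven target selection.
// Sizing factors and prompt strings are illustrative assumptions.
type Response = "yes" | "no" | "unclear";

interface Target {
  sizeArcMin: number; // visual angle of the target, in arc minutes
  spoken: string;     // audio prompt accompanying the target
}

function nextTarget(current: Target, response: Response): Target {
  switch (response) {
    case "yes": // user perceived the target: present a smaller one
      return { sizeArcMin: current.sizeArcMin * 0.8, spoken: "And now?" };
    case "no": // user missed it: present a larger one
      return { sizeArcMin: current.sizeArcMin * 1.25, spoken: "How about this one?" };
    case "unclear": // answer not recognized: repeat the same target
      return { ...current, spoken: "Please answer yes or no." };
  }
}
```

The idea is simply that each answer branches the exam, shrinking or enlarging the target until the threshold of perception is found.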
Design exploration of speech recognition elements:
Implement a visual cue on the screen to alert the user to pay attention when the app is speaking or awaiting a response. The process involved investigating speech recognition interfaces, such as Siri and Google's speech assistant, to adopt recognizable visual cues for speech indications and to understand their operational mechanisms.
The objective was to introduce an MVP version incorporating visual indicators along with a distinctive 'beep' sound to assess auditory cues. At the conclusion of the examination, users were asked about their awareness of when to vocalize: "How did you discern the appropriate moment to speak?"
The findings were inconclusive: fifty percent of participants reported seeing the visual indication but did not register the accompanying sound, while the remaining half paid no attention to on-screen events at all.
However, all participants expressed comfort when prompted to speak.
3. Finalizing the design
Avoiding unnecessary icons or distractions that might divert the user's attention.
The half circle where the target appears serves as the primary indicator for when it's time to speak or listen.
The status change is reinforced by a distinct color shift and subtle visual representations of audio waves.
This approach ensures that the UI communicates effectively without overwhelming the user with unnecessary elements.
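The indicator's status changes described above could be modeled as a small state map. The state names, colors, and wave flag below are assumptions made for this sketch, not the production values:

```typescript
// Illustrative model of the half-circle indicator's states.
// Status names, hex colors, and the wave flag are assumed for the sketch.
type ExamStatus = "presenting" | "speaking" | "listening";

interface IndicatorStyle {
  color: string;      // distinct color shift per status
  showWaves: boolean; // subtle audio-wave animation while audio is active
}

const indicatorStyles: Record<ExamStatus, IndicatorStyle> = {
  presenting: { color: "#4A90D9", showWaves: false }, // target on screen, no audio
  speaking:   { color: "#7B61FF", showWaves: true  }, // app is giving instructions
  listening:  { color: "#2ECC71", showWaves: true  }, // app awaits the user's answer
};

function styleFor(status: ExamStatus): IndicatorStyle {
  return indicatorStyles[status];
}
```

Keeping the mapping in one place means the single indicator carries all status communication, in line with the goal of avoiding extra UI elements.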
In app screens
Following extensive clinical testing with a prototype, I started designing the in-app screens.
Initially, the design had to align with the requirements identified during usability testing, as well as meet the standards set by senior management for the product.