A busy week for #theiilab! Two MS students successfully defended, and two PhD students successfully qualified. Congratulations to Varnika and Prasanthi on completing their degrees, and to Yuan and Tafadzwa (pictured) on reaching candidacy.
We developed a stripped-down image-based model that recognizes a set of silent speech commands almost as fast as #speech. We then evaluated target selection with #eye-gaze pointing + silent speech, which performed comparably to speech. We will present at #ISS2022 https://youtu.be/EZTcyicU5_0 #theiilab
#TiltWalker lets users control #telepresence #robots with one hand by tilting the phone, using a velocity-based CD mapping and a visualization optimized for comfort, clarity, and legibility. It's faster and more accurate. We will present at #ISS2022 https://youtu.be/lpmMrlTekvY #theiilab
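For readers curious what a velocity-based CD (control-display) mapping looks like in practice, here is a minimal sketch: phone tilt drives robot velocity rather than position. All names, thresholds, and values below are illustrative assumptions, not taken from the TiltWalker paper.

```python
# Hypothetical velocity-based CD mapping: tilt angle -> robot speed.
# DEAD_ZONE_DEG, MAX_TILT_DEG, and MAX_SPEED are illustrative values.

DEAD_ZONE_DEG = 5.0    # ignore small unintentional tilts
MAX_TILT_DEG = 30.0    # tilt angle that yields full speed
MAX_SPEED = 1.0        # robot speed (m/s) at full tilt

def tilt_to_velocity(tilt_deg: float) -> float:
    """Map a phone tilt angle (degrees) to a robot velocity (m/s)."""
    sign = 1.0 if tilt_deg >= 0 else -1.0
    mag = abs(tilt_deg)
    if mag < DEAD_ZONE_DEG:
        return 0.0  # inside the dead zone: no motion
    # Normalize the usable tilt range to [0, 1], then scale to speed.
    norm = min((mag - DEAD_ZONE_DEG) / (MAX_TILT_DEG - DEAD_ZONE_DEG), 1.0)
    return sign * norm * MAX_SPEED

print(tilt_to_velocity(3.0))    # inside dead zone -> 0.0
print(tilt_to_velocity(30.0))   # full tilt -> 1.0
```

The dead zone and the clamp at full tilt are common choices in tilt-based control: the former suppresses hand tremor, the latter keeps speed bounded regardless of how far the phone is tilted.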
I'm recruiting three PhD students to join the Inclusive Interaction Lab at #UCMerced in Fall 2023. See this page for details: https://www.theiilab.com/prospectivestudents.html. Email me if you're interested or have questions. Please share. #theiilab #hci #ml #a11y #vr #xr #tei #speech
We compared mid-air selection with Push, Tap, Dwell, and Pinch, each augmented with Select and Hover & Select ultrasonic feedback, in a Fitts' law study. Tap was the fastest and most accurate, and both feedback methods improved performance. We will present at #ISS2022 https://youtu.be/qUSAAMvqxF8 https://www.theiilab.com/pub/Dube_ISS2022_Mid_Air_with_Haptics.pdf #theiilab
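As background, the core metrics a Fitts' law study like this typically reports can be computed as below (Shannon formulation). The example distances, widths, and times are made up for illustration and are not results from the paper.

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1.0)

def throughput(distance: float, width: float, movement_time_s: float) -> float:
    """Throughput in bits/s: index of difficulty over movement time."""
    return index_of_difficulty(distance, width) / movement_time_s

# e.g. a 700 px movement to a 100 px target completed in 1.5 s
print(round(index_of_difficulty(700, 100), 2))  # -> 3.0 bits
print(round(throughput(700, 100, 1.5), 2))      # -> 2.0 bits/s
```

Throughput combines speed and difficulty into one number, which is why Fitts' law studies use it to compare input techniques across target sizes and distances.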