[Loadstone] Porting to Android

Dave Mielke dave at mielke.cc
Thu Oct 22 13:01:37 BST 2015


[quoted lines by Shawn Kirkpatrick on 2015/10/21 at 19:44 -0700]

>No doubt I had poor sighted assistance. I think we've all had the
>experience of having a sighted person read something that they're not
>sure about and you don't know exactly how to guide them as to what
>you need. 

Yes, I certainly have. My objection to what you wrote is that you reported 
these things as if they were more or less facts when they weren't. The risk of 
doing that is that people might actually believe you, and that might deter them 
from looking into certain devices.

>We did try the steps you listed but for some reason didn't see anything about 
>talkback. I don't know if it wasn't there or if the person just didn't see it.

It's there. It's near the bottom, so maybe the error was in not scrolling down 
to see the rest of the list. I'm not sure how the Settings list is rendered on 
that particular device - some devices present it in multiple columns and others 
in just one.

>I wish I had known about that power switch trick, that might have helped 
>things.

Note that that only works if the Accessibility Shortcut has been enabled. For 
all any of us know, some vendors might set its initial state to off. That 
switch, too, is on the Accessibility Settings screen.

Another place where you can do the "two fingers down till the phone speaks" 
gesture - and this one can't be disabled - is on the initial setup screen when 
you turn on an Android device for the very first time.

>Is that across all devices and versions or is it a new thing?

Android releases new versions often in order to get continual feedback. So, no, 
the various features came in gradually, release by release. For speech, I'd say 
Android accessibility became good in 4.1 (Jelly Bean). For braille, I'd say it 
got good in 4.2.
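
(For anyone writing an app who wants to gate behaviour on those releases, here's 
a minimal sketch in Java. The class and method names are just illustrative, but 
the version-code constants are the standard Android SDK ones - 4.1 is API level 
16 and 4.2 is API level 17.)

  import android.os.Build;

  public class AccessibilitySupport {

      // Speech-based accessibility matured in Android 4.1 (API level 16).
      public static boolean hasGoodSpeechSupport() {
          return Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN;
      }

      // Braille support matured in Android 4.2 (API level 17).
      public static boolean hasGoodBrailleSupport() {
          return Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR1;
      }
  }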

For braille users, there are two options. One is Google's BrailleBack. The 
other is brltty (brltty.com).

>That's good to hear that there are the swipe left and right gestures to
>explore screen elements; determining what's on the screen without
>these would be rather difficult.

Yes, prior to 4.1 (when that feature was introduced), it was difficult.

Note that single-finger swipes usually do things like scroll the screen. When 
Explore by Touch is on, Android requires one more finger than usual to perform 
a standard system gesture. So, for example, to scroll the screen you'd use two 
fingers rather than one. That's why using a single finger for accessibility 
gestures doesn't confuse things.
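
In case anyone's curious where that behaviour comes from: touch exploration 
isn't something the screen reader implements by watching the touch screen on 
its own; it's a mode a service like TalkBack asks the system to enter. Here's a 
rough sketch of the runtime side using the standard AccessibilityService API 
(the class name is made up, and a real service would also have to declare the 
capability in its XML configuration):

  import android.accessibilityservice.AccessibilityService;
  import android.accessibilityservice.AccessibilityServiceInfo;
  import android.view.accessibility.AccessibilityEvent;

  // Illustrative only; TalkBack does something equivalent internally.
  public class ExampleScreenReaderService extends AccessibilityService {

      @Override
      protected void onServiceConnected() {
          AccessibilityServiceInfo info = getServiceInfo();
          // Ask the system for touch-exploration mode. While it's on,
          // single-finger gestures are reserved for the screen reader,
          // and standard system gestures need one extra finger (two-finger
          // scrolling instead of one-finger, and so on).
          info.flags |= AccessibilityServiceInfo.FLAG_REQUEST_TOUCH_EXPLORATION_MODE;
          setServiceInfo(info);
      }

      @Override
      public void onAccessibilityEvent(AccessibilityEvent event) {
          // A real screen reader would speak (or braille) the event here.
      }

      @Override
      public void onInterrupt() {
          // Stop any speech that's in progress.
      }
  }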

-- 
Dave Mielke           | 2213 Fox Crescent | The Bible is the very Word of God.
Phone: 1-613-726-0014 | Ottawa, Ontario   | http://Mielke.cc/bible/
EMail: Dave at Mielke.cc | Canada  K2A 1H7   | http://FamilyRadio.org/


