Google’s “Voice Access” is decent for controlling the device through verbal commands, but you have to be looking at the screen to get results - it won’t read anything back to you.

Google’s “TalkBack” will read things on screen to you, but you have to interact with the screen physically (never mind the significant change in how interactions work - I understand why that’s needed, but it’s still a serious mental PITA to switch between the two interaction methodologies frequently).

Is there no way to just interact with it entirely verbally? A (very) simple example of what I’m looking for:

  1. “What are the current Google News headlines?”
  2. Starts reading each one aloud, along with the names of the sources.
  3. “Read the article about Trump caught making out with Elon from AP News.”
  4. Proceeds to load the article & read it aloud.

(Yeah, I know there are podcasts for this - it’s meant to illustrate the basic idea of completely verbal interaction with the device, not be an actual problem I’m looking for someone to provide a solution to.)

It just seems to me that we should be able to do this by now - especially with all the AI blow-up over the past couple of years. Can anybody point me to a usable solution to accomplish this?

TIA.

EDIT: I thought of a better example (I think), because it occurred to me that the above one could (sort of) be done with a Google Home speaker. I’m looking to be able to interact with Android apps verbally wherever possible, so my better example is “What are the latest posts made to the ‘No Stupid Questions’ community on Lemmy?” So far as I know, Google Home is not able to do such a thing. I’d like to tell Android to open my Lemmy client and start reading post headlines until it hits one I want it to open & read to me.

I’m basically looking to use apps verbally to fill in gaps that Google Home/Assistant don’t cover.

EDIT 2: Here’s an even better, more universally applicable description of what I’m after - copied from a response I gave to another comment:

Imagine someone doing some relatively mindless menial job such as working an assembly line, janitorial work, or chauffeuring - something where your mind is relatively unoccupied, but you’re not free to look at and/or touch your device (whether due to practicality or job rules). While doing that job, I want to be able to have the device read and interact with something of interest to me at that moment (ADHD is a fickle mistress), rather than just relying on podcasts with predefined content. Kind of like having someone next to me doing all the interfacing between me and the device.

EDIT 3: I added a comment with some news on options that may come close enough to doing the job which I’ve come across since posting.

  • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net · 5 days ago

    Not without tearing your hair out in frustration when it constantly mishears you or doesn’t even fucking activate. I don’t know how well Alexa or Siri work in this regard, but Android’s shit fucking sucks. Never works when I need it; but somehow wakes up in the dead of night as if someone said “Hey Google” in the dead silence.

    • SanctimoniousApe@lemmings.worldOP · 5 days ago

      Yeah, the Google Minis are hit or miss for us a lot, too. I don’t generally have the keyword activation switched on with my phone, but was going to activate it if need be for this. I’ve got a Bluetooth headset with very good background noise cancellation (according to those I’ve spoken to on the phone), so I’m hoping/assuming that will help it understand me more reliably, and that Assistant activation via its button will obviate the need for turning on keyword activation.

  • SanctimoniousApe@lemmings.worldOP · 5 days ago

    Replying to my own post in case anyone’s following this (still too new to Lemmy to know if that’s a thing - guess I’ll look into it after this).

    Apparently, Moto may be working on something that fills this need. The promo video in this article only demonstrates things I think Google Assistant might already be capable of (or maybe slightly more), but the article states that what they’re working on involves using apps on your phone to do things, rather than just being a microphone and speaker for stuff that actually happens mostly on Google’s servers. Crossing my fingers that includes doing what I want, rather than just being focused on buying things as demonstrated in the teaser video.

    In the meantime, I’ve stumbled across the fact that Google’s built-in TalkBack feature has support for keyboard shortcuts. This likely means I can use a small handheld Bluetooth game controller along with an app to map the buttons to the appropriate key presses in order to move around the screen, and thus control the Android device without looking at or touching it. That means using Google Assistant to launch the apps, and the controller to actually use them via TalkBack (rough sketch of the idea below). I’ll likely test this next weekend & report back if anyone’s interested.
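
    For anyone curious what the “app to map the buttons” part might look like under the hood, here’s a rough Kotlin sketch. It’s only illustrative and takes a different route than injecting TalkBack keyboard-shortcut key presses: it’s a hypothetical AccessibilityService that catches the gamepad buttons itself and turns them into global navigation actions. The package/class names and the button-to-action mapping are invented, and it would still need the usual accessibility-service XML and manifest plumbing.

    ```kotlin
    // Hypothetical sketch only - package, class, and button mapping are invented.
    // The service would also need res/xml/accessibility_service_config.xml with
    // android:canRequestFilterKeyEvents="true" and
    // android:accessibilityFlags="flagRequestFilterKeyEvents", plus a manifest entry.
    package com.example.controllernav

    import android.accessibilityservice.AccessibilityService
    import android.view.KeyEvent
    import android.view.accessibility.AccessibilityEvent

    class ControllerNavService : AccessibilityService() {

        // Hardware key events (including Bluetooth gamepad buttons) arrive here
        // once key-event filtering is enabled. Returning true consumes the event.
        override fun onKeyEvent(event: KeyEvent?): Boolean {
            if (event == null || event.action != KeyEvent.ACTION_DOWN) return false

            return when (event.keyCode) {
                // A -> back, B -> home (arbitrary choices for the sketch)
                KeyEvent.KEYCODE_BUTTON_A -> performGlobalAction(GLOBAL_ACTION_BACK)
                KeyEvent.KEYCODE_BUTTON_B -> performGlobalAction(GLOBAL_ACTION_HOME)
                // Shoulder buttons -> notification shade / recent apps
                KeyEvent.KEYCODE_BUTTON_L1 -> performGlobalAction(GLOBAL_ACTION_NOTIFICATIONS)
                KeyEvent.KEYCODE_BUTTON_R1 -> performGlobalAction(GLOBAL_ACTION_RECENTS)
                else -> false
            }
        }

        // Required overrides; nothing to do for this sketch.
        override fun onAccessibilityEvent(event: AccessibilityEvent?) {}
        override fun onInterrupt() {}
    }
    ```

    In practice an off-the-shelf button-mapper app should get the same result without writing any code, which is what I’ll actually try first.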

    ETA: Forgot to mention: I remembered that my kid has an iPad his school required some years ago, but so far as we can see its capabilities are roughly on par with Android’s, so switching platforms doesn’t seem to be the answer.

  • Skezlarr@aussie.zone · 7 days ago

    I might be misunderstanding the question, so if you’re looking to make an existing Android device you already have work like this, please ignore the below.

    Maybe a Google Nest Mini would do what you’re looking for? It can definitely be interacted with entirely verbally, but I guess the downside is a smaller selection of apps that work with it (there’s a list on the Google store of which apps it supports).

    • SanctimoniousApe@lemmings.worldOP · 6 days ago

      I have a few Google Home Minis, and so far as I know the Nest Mini is the same thing - just rebranded. It’s basically Google Assistant, the same thing that’s built into every Google-approved/equipped device (meaning it has their full suite of apps pre-installed). They’re just so limited. I know of no way to get them to read Lemmy posts as in my added second example.

      I also thought of another use case for what I’m after that might be more universally applicable and easily understood. Imagine someone doing some relatively mindless menial job such as working an assembly line, janitorial work, or chauffeuring - something where your mind is relatively unoccupied, but you’re not free to look at and/or touch your device (whether due to practicality or job rules). While doing that job, I want to be able to have the device read and interact with something of interest to me at that moment (ADHD is a fickle mistress), rather than just relying on podcasts with predefined content. Kind of like having someone next to me doing all the interfacing between me and the device.

      (EDIT: minor swipe keyboard corrections.)

  • Catoblepas@lemmy.blahaj.zone · 7 days ago

    Sorry if I’m overlooking something obvious here, but you’re basically asking about accessibility features, right? Whatever settings and features blind users rely on should let you navigate without looking at the screen. I realize that’s overly vague, but I only know how to get to accessibility features on iOS, sorry!

    Although I will say you probably would need to get used to listening to text at a very high speed to use it at the speed you read.

    • SanctimoniousApe@lemmings.worldOP · 7 days ago

      Imagine someone blind who also has Parkinson’s - they can’t see to use Voice Access, and they can’t control their hands well enough to interact with the screen physically in a reliable manner using TalkBack. You can’t actually use those two accessibility features together - they’re mutually exclusive in that they require you either be able to see the screen, OR be able to interact with it physically as it reads out what you’re touching. Why is there no way to interact entirely verbally?

      ETA: please see my added example in OP.

      • Catoblepas@lemmy.blahaj.zone · 7 days ago

        Someone who knows more about how (or whether) you can do this on Android will have to answer the specifics; I know on iOS you can use custom Voice Control actions (along with the default Voice Control phone navigation mode) to do more or less what you’re describing. I’d be surprised if Android has no accessibility features that work similarly.

        • SanctimoniousApe@lemmings.worldOP · 7 days ago

          iOS will open your Lemmy client, start reading posts to you aloud, and go into a post of interest upon command without you ever looking at or touching the screen (using my newer example that I added to the OP)? I’m seriously going to have to look into getting an iDevice of some sort if so.

          • Catoblepas@lemmy.blahaj.zone · 7 days ago

            I don’t know if this is still the case, but I know years back iPhones were preferred by a lot of blind people just in terms of accessibility. Digging through accessibility settings, it looks like you can use Voice Control to tell it to open Lemmy, and VoiceOver to read all the text on the screen without touching it. I don’t know about the example you added to your OP; adding phrases it would need to interpret seems more like a Siri thing (which I don’t use), so I don’t know how well that plays with Voice Control.

            I wouldn’t rush out to buy anything unless some Android people confirm it’s not doable. Apple does have people that know the software working at their stores, so they could tell you specifics for sure. And check that I’m not totally wrong, lol.

            • SanctimoniousApe@lemmings.worldOP · 7 days ago

              Yeah, I wouldn’t just jump in without looking first. If I can’t find a way to do this, then I’m definitely gonna have to take a trip to the nearest Apple Store, though. Thanks very much for the input!