Surgically implanted devices that allow paralyzed people to speak can also eavesdrop on their inner monologue.

That’s the conclusion of a study of brain-computer interfaces (BCIs) in the journal Cell.

The finding could lead to BCIs that allow paralyzed users to produce synthesized speech more quickly and with less effort.

But the idea that new technology can decode a person’s inner voice is “unsettling,” says Nita Farahany, a professor of law and philosophy at Duke University and author of the book The Battle for Your Brain.

  • teft@piefed.social · 3 days ago

    What’s to stop a government from forcibly inserting one of these into the brain of someone they need information from? It’d be real hard to stop your thoughts from revealing the location of the classified information if it’s anything like that whole “don’t think of a purple elephant” thing.

    • Admetus@sopuli.xyz · 3 days ago

      This probably pops up in a few sci-fi books, but Hyperion’s was by far the nastiest version of this, plus torture. What’s to stop big gov from doing it?

    • MysteriousSophon21@lemmy.world · 3 days ago

      Current BCIs require extensive training in which the user actively thinks specific patterns; they can’t just “read” random thoughts. The implants are also customized to specific brain regions and neural patterns, so forcing someone to use one without their cooperation would yield gibberish data at best.

      • Coopr8@kbin.earth · 3 days ago

        Think of a Clockwork Orange scenario. It’s hard not to think words when you’re shown those things in images, especially if you’re drugged.

      • teft@piefed.social · 3 days ago

        Today that’s true. I’m thinking of ten or twenty years from now, when they have enough training data from all the volunteers to make solid guesses for randoms. I just think it’s something people should keep in mind for a technology like this. It could easily be abused.