This story is part of our new Future of Gaming series, a three-site look at gaming’s most pioneering technologies, players, and makers.
With only your mind, you can imagine, you can dream improbable things, and, someday soon, you could play your favorite video games. Mind-control gaming might sound better suited to a Björk music video or a less lovely end of civilization than a world we might actually want to create, but virtual reality and neurotechnology companies are working, with much determination, on giving heads and bodies to our fantasies with science. Part of that has involved maturing virtual reality headsets from the face-fucking, barely usable concrete blocks they were in the ‘90s into more impressive, trimmer cinderblocks. The technology gives us immediate access to the kinds of alternative worlds people have been striving toward for millennia, from the many rooms of Christian Heaven to the Disneyland version of paradise, where you can devour turkey legs while surrounded by corporate icons. VR tech is inextricable from where gaming is and where it continues to go, representing both timeless fantasy and our modern ability to realize it. It’s been a long time coming. And, presently, VR research and the future of gaming technology are being determined by your brain, which is setting every industry standard.
The future is now
Whether it’s a vanilla white PlayStation VR 2 ($550), a three-eyed Meta Quest 3 ($500), or any number of others, a basic, modern VR system needs to provide you with the best-looking graphics and most intuitive controls. Anything less is a barrier to entering, and feeling fully submerged in, your virtual reality. Both of these things are rooted in perception, how we synthesize external information. To make virtual reality convincing to our senses, VR tech companies regularly tap neuroscience experts to improve their technology and user experience.
It’s as difficult as it sounds. “Most people don’t realize our brain activities are so noisy and varied,” Cody Cao, a former classmate of mine who researches audiovisual speech as a PhD student at the University of Michigan and has interned at Meta’s Reality Labs, told me over text. “It’s hard to make a generalized tool for everybody. Just within head movement alone, there are three axes of movement, and movement speed and reaction time actually tell us more straightforward info about a gamer than brain signals can.”
But tech companies are relying on our brains to take VR to its most extreme end: mind-controlled gaming. It is real, you know; researchers have been working on it for years. In 2019, former Valve experimental psychologist Mike Ambinder told an audience at the Game Developers Conference that VR operated by brain-computer interfaces (BCIs), systems that link brain signals to an external device, was the most “naturalistic” way to play games.
“What if you didn’t have to remember [every controller input]?” Ambinder said at the time. “What if you could just think about what you wanted to do and it happened? Wouldn’t that change how you play games?”
UC Berkeley computer science professor James O’Brien, whose recent research found that a person’s motion in VR identifies them as reliably as a fingerprint, likewise suggests that completely engaging brain-gaming is on its way.
VR won’t be widespread until the technology gets there, “but it will,” O’Brien says over Zoom. He remembers his first experience with VR in around 1995, when “resolution was, like, 300 by 300, or something really terrible,” and cost $200,000 “on the low end.”
Now, “the processing power in a Quest 2 is actually greater than those $200,000 [Silicon Graphics] machines,” he says. “The tracking is much better. Now, the systems use cameras doing what’s called ‘inside-out’ tracking, where the cameras look out at the world, and as you move around, by observing how the world moves, they can infer what your head is doing.”
“You can imagine [that] in a world where those technologies are a lot more mature—which they will be,” O’Brien continues, “we don’t need a big headset. [T]here’ll be some little patch that you stick on the back of your head [...]. That’s your input to the system—thinking, basically. Some movement of your arms, probably, because a lot of people think kinematically. But, either way, that will be the interface, and the display will be something that could even be implanted in your eyes someday.”
“And I think, at that point, [VR] will be ubiquitous,” O’Brien says.
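If you’re curious what that ‘inside-out’ tracking actually involves, here is a deliberately simplified sketch of the idea in Python with OpenCV: follow fixed points in the world across camera frames and work backwards to how the head must have moved. It illustrates the concept O’Brien describes, not any headset maker’s real pipeline, and the function names and camera intrinsics here are assumptions for illustration.

```python
# Toy sketch of "inside-out" tracking: the headset camera watches the
# world, and the apparent motion of static scene features is used to
# infer how the head itself moved. Illustrative only.
import cv2
import numpy as np

def estimate_head_motion(prev_frame, next_frame, camera_matrix):
    # Detect stable corner features in the earlier grayscale frame.
    prev_pts = cv2.goodFeaturesToTrack(prev_frame, maxCorners=500,
                                       qualityLevel=0.01, minDistance=8)
    # Follow those features into the next frame with optical flow.
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_frame, next_frame,
                                                   prev_pts, None)
    good_prev = prev_pts[status.flatten() == 1]
    good_next = next_pts[status.flatten() == 1]
    # From how the (static) world appears to shift, recover the camera's
    # rotation R and translation direction t -- i.e., the head's motion.
    E, _ = cv2.findEssentialMat(good_prev, good_next, camera_matrix,
                                method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, good_prev, good_next, camera_matrix)
    return R, t
```

A real headset fuses this kind of camera estimate with inertial sensors many hundreds of times per second; the sketch only shows the camera half of the idea.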
The human cost
However, when it comes to present-day VR tech, we perhaps aren’t giving our bodies—the handy containers for our brains—the priority they deserve. Alexis Souchet, who works at France’s tech institute IRT SystemX and studies virtual reality’s side effects, tells me VR hardware and software companies aren’t appropriately taking “human factors”—physical responses like dizziness and sweat—into account while rolling out products.
“Higher technical specifications don’t automatically mean better for humans,” Souchet says over email. Preliminary research indicates that in “most people,” VR induces “cybersickness, visual fatigue, and some muscle fatigue.”
Souchet recognizes that VR also offers “very promising benefits in mental treatments or pain management.” Still, “we tend to pay a lot of attention to the actual benefits, not the risks,” he says.
As VR becomes more desirable (headset sales are projected to hit 27 million units in 2028, up from 4 million units in 2018), Souchet says our neural plasticity could mold our brains to better fit our usage. As phones have already proven, we’re locked in simultaneous evolution with our devices.
But, he wonders, “if we adapt to VR, will we be less adapted to our natural environment? What would be the consequences on populations? This is the kind of question that science keeps asking about human technology development. It always drives me to refer to the well-known quote from the sociobiologist Edward O. Wilson: ‘The real problem of humanity is the following: we have Paleolithic emotions, medieval institutions, and godlike technology.’”
And mind-control gaming could make you feel more omnipotent than any technology before it. O’Brien predicts that in the next decade you’ll be able to sit down in front of your TV and, through generative artificial intelligence systems, command your perfect form of entertainment into existence. “In real time,” he says, you’ll “have these high-quality things generated on the fly. [...] I can say, ‘Hey, what happened to Black Widow? Where’d she go? I want to go off into the void where she got sucked in to see what happened to her.’”
“Thou shalt not kill off Carrie in her own movie,” I imagine myself saying. Movies and other media we might think of as static will become interactive, more like a video game, and “the difference [between the two] is going to go away,” O’Brien says.
“We’ll all be watching our own things,” he remarks, which could be isolating. But, “at the same time, [it will allow] us to connect with people. I’ll be able to go experience a virtual world with you. That will feel pretty realistic.”
Brain gaming beyond VR
But let’s look around our rooms. “Most people are going to game on their computer screen for the foreseeable future,” Neurable CEO Ramses Alcaide told me over Zoom. (Alcaide, by the way, funded Neurable’s first six months with his Hearthstone tournament winnings.) So technology in our immediate future is “less about what tools you’re using to game [with], [and more about] what tools we’re using to understand ourselves.”
Neurable’s smart headphones attempt to accomplish this: they claim to read your brain’s electrical activity with an electroencephalogram (EEG) and tell you when you’re focused.
Alcaide slides them on easily; the headphones are “made of fabric sensors,” he says. “Normally, you want metal [for EEG], but, because of our AI, we’re able to use these normal headphones and still record really good brain data.” I watch him peer at his computer screen and, on a graph, a thin blue bar curves up, then collapses when he announces he’s relaxed.
“There’s no calibration. The technology just works,” he says. “This could be integrated into your games, or your apps. Then, [we’d be] able to tell, is a gamer distracted? Are they burnt out? Are they on tilt?”
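Neurable hasn’t published exactly how it scores focus, but a common proxy in EEG research is comparing power in the beta band (associated with concentration) against the theta band (associated with mind-wandering). The Python sketch below illustrates that general idea and only that; the sample rate, band boundaries, and the `focus_score` function are assumptions for illustration, not Neurable’s method.

```python
# Hypothetical "focus" metric from a single EEG channel: the ratio of
# beta-band power to theta-band power, a common attention proxy in the
# research literature. Not Neurable's actual algorithm.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

def focus_score(samples, sample_rate=256):
    # Estimate the power spectral density of the raw signal.
    freqs, psd = welch(samples, fs=sample_rate, nperseg=sample_rate * 2)

    def band_power(low, high):
        band = (freqs >= low) & (freqs <= high)
        return trapezoid(psd[band], freqs[band])

    theta = band_power(4, 8)    # drowsiness / mind-wandering
    beta = band_power(13, 30)   # active concentration
    # Higher beta relative to theta is often read as "more focused".
    return beta / theta
```

In practice, products like Neurable’s also have to scrub out muscle and eye-movement artifacts, which is where the company says its AI does the heavy lifting.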
At the moment, Neurable is working with esports teams (and the Department of Defense, Alcaide tells me) to make its headphones “a standard part of training,” something that can provide digestible biofeedback and prevent burnout.
“It’s like, ‘hey, this [person’s] ability to focus on the game is getting to the point where they’re not making good decisions. They need to take a break,’” he says. Or, on the development side, “When you give somebody a jump scare—that could be neurally driven, right? How do you know that they learned the key piece of information from the tutorial?” he says. “There’s game development value there.”
In the comfort of your own home
Alcaide imagines Neurable’s technology could be more widely distributed in the next three years, though at a luxury price point (he didn’t reject my $300 and $400 estimates). EEG gaming, at the moment, is scrappier than that. Twitch streamer Perrikaryal, who goes by Perri, can demonstrate.
Her setup isn’t as seamless as Neurable’s unreleased tech, but it gets the job done for her viral “mind-control” gaming streams. In these, she extinguishes Elden Ring bosses with her mind, blows through Halo multiplayer with her eyes, and, eventually, will use VR research company Intuitive’s work to play Skyrim’s Real Virtual Magic mod in the most immersive way possible.
“When I’m doing EEG gaming,” she tells me over Zoom, “it’s basically a whole bunch of code that is very, very janky,” plus a Tobii eye-tracker “so I can go hands-free” and a traditional wet EEG, whose electrodes use saline solution to pick up a cleaner read on what’s happening under her scalp. Then, she hooks it all up.
“Never works the first time,” she says.
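Her exact code is her own and, by her own description, janky, but the general pattern behind rigs like hers is easy to sketch: a classifier turns a short window of EEG into a discrete label, and each label fires an ordinary game input. The Python sketch below assumes hypothetical stand-ins (`read_eeg_window`, `imagery_classifier`) and uses PyAutoGUI to send keypresses; none of it is her actual setup.

```python
# Hypothetical glue layer between an EEG imagery classifier and a game:
# each recognized mental "gesture" is mapped to a normal input event.
import pyautogui

ACTIONS = {
    "imagined_push": lambda: pyautogui.click(),         # e.g., fire/attack
    "imagined_spin": lambda: pyautogui.press("space"),  # e.g., jump
}

def run_loop(read_eeg_window, imagery_classifier):
    while True:
        window = read_eeg_window()                 # ~1 second of multichannel EEG
        label, confidence = imagery_classifier(window)
        # Only act on confident predictions so stray thoughts don't fire inputs.
        if confidence > 0.8 and label in ACTIONS:
            ACTIONS[label]()
```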
But Perri maintains that anyone could do what she does; they would just have to replicate her “supreme, supreme focus.”
When she’s playing a game, she’s “moving with head tilt, and gyro, and eye-tracking. And then, when I shoot, I’m imagining pushing a heavy boulder forward. When I jump, I’m imagining a plate spinning, and I’ve got some voice commands going so I can navigate the menus.” You could call it “involved.” Sometimes, Perri finds that playing with a normal controller “is a bit of a relief.”
“It’s easier than what I do with the EEG,” she says. But “after a certain amount of time, once I’m over that relief and ease, I’m always thinking ‘well, what’s missing?’”
For Perri, it’s the feeling of being completely absorbed by a game, a fantasy world, an ecstatic vision made tangible and, for tech companies, purchasable.
“I feel like regular gaming with the mouse and keyboard—at least, for me, because I’m exploring other avenues—we’ve done it. I think it’s done. We’ve got it as good as we’re gonna get it,” Perri says. And the future is already here. It’s been waiting, all this time, within you.
"control" - Google News
October 23, 2023 at 08:00PM
https://ift.tt/AbTq6Uk
Mind-Control Gaming Isn't Sci-Fi, It's Just Science - Kotaku
"control" - Google News
https://ift.tt/V5wgKmE
https://ift.tt/i8SutxC
Bagikan Berita Ini
0 Response to "Mind-Control Gaming Isn't Sci-Fi, It's Just Science - Kotaku"
Post a Comment