Will brainwave-controlled devices be the future of VR/AR interaction?

From Baidu VR

Everyone says that a future with mixed reality will be a very futuristic one. Beyond technical problems such as frame rate and tracking, the concept itself is also a challenge for designers, artists, directors, and other storytellers.

From the Rift and the Apple Watch to Snap Spectacles, wearable computing devices are clearly the current trend. Looking at MR through that lens, the devices on our heads will eventually carry real computing power of their own.

Computer history offers a curious parallel. Computers began to be commercialized in 1946, but only with the arrival of the graphical user interface (GUI) did computing become truly personal, practical, and intuitive. The GUI people are accustomed to has not changed much in the past 30-odd years (since the 1984 Macintosh). Today's mixed reality devices are like the IBM computers of the 1970s: they still need a GUI-style transformative invention before they can become products everyone can use.

Meta CEO and neuroscientist Meron Gribetz often talks about a "computer you don't have to learn": one you intuitively know how to use without training or habit, so that it is essentially an extension of your brain. Today that idea goes by a newer name: the brain-computer interface. Brain waves were first discovered in 1925, and in the 1970s scientists at the University of California coined the term brain-computer interface (BCI) for using brain waves to control devices. BCI has many applications in medical devices, but it has barely touched the wearable space.

The most common AR application areas at the moment are productivity work, construction, industrial assembly, transportation, sports, military, and law enforcement. These are highly dynamic workplaces: today's bulky, complicated headsets are clearly not mobile enough for them, and such environments are often poorly suited to voice input as well. This is where BCI can show its worth.

Beyond simple input, BCI could also address another major limitation of current mixed reality: cognitive overload. Much as your own brain acts as an automatic ad blocker, a BCI could adapt the interface, or let the software respond to the user's biological indicators, so as to filter out irrelevant information and reduce cognitive overload (a toy sketch of this idea appears below).

BCI also makes many interactions feel more real. Unleashing the Force with your mind, for example, is obviously far more compelling than doing it with a Touch controller; with BCI, VR becomes that much more immersive.

However, because of BCI's high cost and hardware requirements, its commercial applications have so far stayed in a handful of areas. One of them is inferring the user's intention from brain waves: the equipment costs tens of thousands of dollars and requires the user's scalp to be coated with a conductive paste to improve signal quality. Companies such as Emotiv and Interaxon are pioneers here. They have introduced low-cost headsets with "mind training" features that can deepen users' meditation and sharpen their focus at work. The startup Halo Neuroscience has built specialized headsets for professional sports teams that improve training quality through neurofeedback. The scope sounds a little narrow, but it is a real step toward reasonable prices and practical applications.
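To make the "brain as ad blocker" idea above a little more concrete, here is a minimal Python sketch of how a headset might throttle its interface based on a crude EEG-derived workload estimate. Everything in it is an illustrative assumption rather than anything shipped by the companies mentioned: the sample rate, the beta/(alpha+theta) "engagement index" heuristic, the threshold, and the priority scheme are placeholders that a real system would have to calibrate per user.

```python
import numpy as np

FS = 256  # assumed EEG sample rate in Hz (illustrative)

def band_power(signal, fs, lo, hi):
    """Average spectral power of `signal` in the [lo, hi] Hz band (simple FFT estimate)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def engagement_index(eeg_window, fs=FS):
    """Beta / (alpha + theta) power ratio, a classic but very rough workload proxy."""
    theta = band_power(eeg_window, fs, 4, 8)
    alpha = band_power(eeg_window, fs, 8, 13)
    beta = band_power(eeg_window, fs, 13, 30)
    return beta / (alpha + theta)

def filter_ui(elements, eeg_window, overload_threshold=1.0):
    """Hide low-priority UI elements when the user looks cognitively loaded.

    `elements` is a list of (label, priority) pairs where priority 0 = critical.
    The threshold here is arbitrary; a real system would calibrate it per user.
    """
    if engagement_index(eeg_window) > overload_threshold:
        return [(label, p) for label, p in elements if p == 0]
    return elements

# Toy usage: synthetic noise stands in for one second of a single EEG channel.
eeg = np.random.randn(FS)
ui = [("incoming call", 0), ("news ticker", 2), ("step counter", 1)]
print(filter_ui(ui, eeg))
```

The specific ratio is not the point; the loop is: read a biological signal, map it to an estimate of cognitive load, and let the interface shed low-priority information when that estimate crosses a line.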
Our phones, computers, tablets, and watches have already become extensions of our brains. In the future they will grow smaller and less visible, and heavy headsets will shrink into contact lenses. At that point, BCI will be the only viable link in such a complex interaction chain. VR/AR is now fighting toward its own rebirth, and BCI will be the next.