Brain interfaces and time-critical machine operation

1. Introduction

 

On the 28th of August 2020, Elon Musk gave a live presentation of Neuralink's V2 developments.

https://www.youtube.com/watch?v=iOWFXqT5MZ4

During this presentation, he showed the world some of the first living animals implanted with a brain interface, elaborating on the complete innocuity of the technology.

 

He also mentioned that the FDA had granted the device its Breakthrough Device designation in July of the same year.

 

I immediately forwarded this presentation to my kids, telling them that this was possibly the biggest breakthrough in modern human evolution and would lead to a relatively fast evolution of humankind.

 

In this blog, I am not going to try to cover all the immense possibilities of the technology, but rather focus on a field I know well: time-critical machine operator control and management.

 

I have logged about 20,000 flight hours on manned aircraft and over 5,000 hours on unmanned ones, including very fast remote-controlled jet-powered vehicles. I will initially focus on the aviating and navigating part of a flying system (immediate and near-future flight path management), then on the management of a complex machine like an airliner.

 

2. Flight path and navigation management

 

Elon Musk has clearly mentioned that the electrodes could be implanted in different regions of the neocortex, and that the brain, being quite plastic, would rearrange its neuronal network to connect to its new interface. This means that implanting a chip in the motor cortex would likely allow an operator to control a machine directly from his brain signals, without touching any tactile controls. This will be true for future machines. But what about current ones?

 

Well, most airliners have an on-board system that allows maintenance crews to connect to the airplane network and download diagnostics. At Boeing this is called the "MAT" (Maintenance Access Terminal). This system also allows an engineer to view all the live recorded parameters of the aircraft in flight. These are upper-layer systems (i.e. the flight control algorithms are not accessible, but their input parameters are).

The MAT has a computer interface that allows a PC to read these parameters and upload diagnostic software. It would be technically relatively easy (disregarding the legal implications at this stage) for Boeing to offer its airline customers an interface that broadcasts some parameters over WiFi within the flight deck to connect to the pilot's electrodes. In this case, the Neuralink system would more likely be connected to a zone of the sensory cortex.
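To make this concrete, here is a minimal sketch of what such a read-only broadcast could look like. It is purely illustrative: the parameter names, the whitelist and the JSON-over-WiFi format are my own assumptions, not an actual Boeing or Neuralink interface.

```python
# Hypothetical sketch only: a read-only parameter feed that selects a
# whitelisted subset of live aircraft parameters and serializes them for
# broadcast to a sensory interface. Names, values and the JSON format are
# illustrative assumptions, not a real Boeing or Neuralink API.

import json
import time

# Parameters the manufacturer might choose to expose (assumed names)
BROADCAST_WHITELIST = {"ias_kt", "alt_ft", "vs_fpm", "hdg_deg", "egt_c"}

def snapshot_live_parameters():
    """Stand-in for a read from the maintenance/diagnostic interface.
    Returns parameter name -> value (dummy data here)."""
    return {"ias_kt": 252.0, "alt_ft": 12500.0, "vs_fpm": -300.0,
            "hdg_deg": 274.0, "egt_c": 612.0, "fuel_kg": 18400.0}

def build_broadcast_message(parameters):
    """Keep only whitelisted parameters and add a timestamp; the result
    would be pushed over the flight deck link (e.g. WiFi) as JSON."""
    selected = {k: v for k, v in parameters.items() if k in BROADCAST_WHITELIST}
    return json.dumps({"t": time.time(), "params": selected})

if __name__ == "__main__":
    print(build_broadcast_message(snapshot_live_parameters()))
```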

 

The output from the pilots would remain tactile (i.e. hand output on the control wheel, thrust levers, rudder pedals, panel switches and knobs). However, the classic visual and sensory inputs would be enhanced by the ability to become aware of many more parameters.

 

The typical panel of sensory inputs currently used by an experienced pilot is as follows, in order of importance (this can vary from one individual to another):

 

1. Visual inputs: flight parameters (flight instrument displays, map reading), management parameters (system messages, ACARS text, knob positions)

2. Acceleration input (vestibular system input; x, y and z acceleration sensed in the organs, sometimes described as “guts flying”)

3. Sound input (aerodynamic noise, engine noise)

4. Smell input (dust smell, ozone smell, any abnormal burning smell)

5. Ear sensation (pressurization felt through ear drum position)

6. Skin sensation (airflow within the flight deck, air dryness)

7. Skeletal/skin interface for vibration (mostly coming from the engines and the flight controls, with very different frequency modes)

 

 

These traditional inputs could be complemented by a set of parameters acquired through the sensory interface from the aircraft flight computer (AIMS for Boeing). The manufacturer could choose a set of data as a feed to the operator's interface. Essentially, the type of data available to an HGS (Head-up Guidance System) would be suitable.

This would allow the individual to focus on the outside environment (VMC flying when available, or runway acquisition in IMC) and have a sensory display of HUD-type (head-up display) parameters available at all times.
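As an illustration, such a feed could be organized by phase of flight, much like HUD decluttering. The sketch below is hypothetical; the parameter names and groupings are assumptions, not an actual HGS data set.

```python
# Hypothetical sketch: which HGS/HUD-type parameters might be routed to the
# sensory interface in each phase of flight. Parameter names and groupings
# are illustrative assumptions only.

SENSORY_FEED_BY_PHASE = {
    "takeoff":  ["ias_kt", "pitch_deg", "radio_alt_ft", "n1_pct"],
    "climb":    ["ias_kt", "vs_fpm", "alt_ft", "hdg_deg"],
    "cruise":   ["mach", "alt_ft", "fuel_kg", "egt_c"],
    "approach": ["ias_kt", "glideslope_dev", "localizer_dev", "radio_alt_ft"],
}

def feed_for_phase(phase):
    """Return the parameter list for a flight phase, or an empty list."""
    return SENSORY_FEED_BY_PHASE.get(phase, [])

print(feed_for_phase("approach"))
```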

 

3. Mission management

 

However, another stream of information could be sensed by the operator for the other important side of time-critical system operation: mission management.

This includes several levels of awareness and skills (in chronological order):

 

Mission parameters awareness

Systems awareness and analysis

Communication

Decision making

 

Some of the available data could be relevant to the interface:

Mission parameter updates could be fed directly into the operator's set of sensory data.

System parameters could be available as a stream, allowing the operator to “feel” the aircraft like his own body. Possibly, a sensation of discomfort could be generated when chosen parameters drift out of their normal range (for example engine EGT; see the sketch after this list).

Decision making could be complemented by suggestions from mission control or flight operations via ACARS, each including a degree of confidence.

In case of emergency, relevant operational data could be uploaded at high speed (airfield weather, airfield operational status, available diversion choices, enhanced system knowledge, etc.).
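As a toy example of the "discomfort" idea mentioned above, the sketch below maps how far a monitored parameter (for example engine EGT) has drifted outside an assumed normal range onto a normalized discomfort intensity. The parameter names, limits and scaling are illustrative assumptions, not a certified alerting scheme.

```python
# Toy example (all limits and names are assumptions, not a certified
# alerting scheme): map how far a parameter has drifted outside its normal
# range to a 0..1 "discomfort" intensity for the sensory interface.

NORMAL_RANGES = {
    "egt_c":   (300.0, 900.0),  # engine exhaust gas temperature, deg C
    "oil_psi": (25.0, 95.0),    # engine oil pressure, psi
}

def discomfort_level(name, value, full_scale=0.10):
    """0.0 inside the normal range, rising linearly to 1.0 when the value
    exceeds the range by `full_scale` (10%) of the range width."""
    low, high = NORMAL_RANGES[name]
    width = high - low
    if low <= value <= high:
        return 0.0
    overshoot = (low - value) if value < low else (value - high)
    return min(1.0, overshoot / (full_scale * width))

print(discomfort_level("egt_c", 960.0))  # -> 1.0, a strong discomfort cue
```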

 

In parallel, the interface could read the pilot's brain waves, assess his or her awareness and alertness levels, and judge whether they suit the phase of flight (maximum alertness is neither required nor desirable in all phases of flight).
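One possible way to frame this is sketched below: a hypothetical alertness score (0 to 1) derived from the interface's readings is compared with a target band for the current phase of flight. The bands and the score itself are assumptions for illustration, not physiological or regulatory values.

```python
# Hypothetical sketch: compare an alertness score (0..1) estimated from the
# interface's brain-wave readings with a target band for the current phase
# of flight. Bands and score are illustrative assumptions, not physiological
# or regulatory values.

TARGET_ALERTNESS = {
    "cruise":   (0.3, 0.7),  # sustained maximum alertness is not wanted here
    "approach": (0.7, 1.0),
    "landing":  (0.8, 1.0),
}

def alertness_advice(phase, score):
    """Return a short advisory comparing the score with the target band."""
    low, high = TARGET_ALERTNESS[phase]
    if score < low:
        return "below target: consider a sensory prompt"
    if score > high:
        return "above target: workload or stress may be elevated"
    return "within target band"

print(alertness_advice("approach", 0.55))  # -> "below target: ..."
```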

 

Furthermore, the sensory information could wake the pilot up while he or she is taking controlled rest in the flight deck.

 

 

For airline operations, passenger (PAX) related information, medical advice and security information could be made available to the flight purser in a similar way.

 

4. Certification considerations

 

Certification is always a big step for aviation designs. Such a novel system could take a long time to get certified for aircraft integration.

How reliable will the direct-to-brain inputs be?

How can this be monitored and recorded?

 

The first answer is to record the brain interface input as part of the black box data.

The second is to restrict the data feed to the operator to what is already sent to currently certified HGS systems.
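On the recording side, a minimal sketch of what logging the interface feed could look like is given below: each message sent to the brain interface is appended, timestamped, to a recorder file so it becomes part of the black box data. The JSON-lines format and field names are my own assumptions, not an actual flight data recorder format.

```python
# Hypothetical sketch: append every message sent to the brain interface to a
# recorder file so it becomes part of the black box data. The JSON-lines
# format and field names are assumptions, not an actual FDR format.

import json
import time

def record_interface_frame(log_path, message):
    """Append one timestamped brain-interface frame to the recorder file."""
    frame = {"t": time.time(), "interface_feed": message}
    with open(log_path, "a", encoding="utf-8") as log:
        log.write(json.dumps(frame) + "\n")

record_interface_frame("fdr_interface_feed.log", {"egt_c": 612.0, "ias_kt": 252.0})
```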

 

However, other questions arise:

How will the operator perceive the inputs?

How will he organize the data stream within his thoughts?

How can this be controlled to match the currently certified layouts?

 

 

5. Legal considerations

 

Legal implications of the Neuralink system are new and possibly extensive.

They do not follow conventional legal patterns and could potentially create situations outside the scope of existing laws.

I mentioned earlier the possibility of uploading decision-making suggestions. To what extent will a direct-to-brain upload influence one's decision-making process? Will the operator still be liable for decisions made from direct-to-brain suggestions? Or will the liability be shared with the originator of the suggestion?

6. Ethical considerations

 

The ethical aspects of the Neuralink system are profound and will certainly raise many societal debates.

I mentioned earlier the possibility of monitoring the operator's awareness and alertness, and of waking the operator during rest.

How far can the intrusion into one’s brain go? How much brain monitoring can be allowed? How much privacy will be preserved?

 

7. Conclusion

 

This short article offers a glimpse into the revolution that a Neuralink-type system could bring to time-critical machine operation. I have mostly focused on aircraft, but the same would be true for very different systems, such as nuclear power plants, trains, spacecraft, cars, or simply riding your bicycle...