- So with that I'm gonna hand over now to Dr. Felpman,
and then we'll return back at the end.
- Thank you.
Hi everyone, so, starting in our mission here,
it's midnight, and our pilots have
woken up for this crucial mission.
It's still dark out.
That makes it difficult to wake up. Waking up
at midnight is hard, and becoming alert to the
level that they need to perform this crucial mission
is harder still for the pilots to attain
if they haven't received an adequate amount of sleep,
if they had uncomfortable sleeping quarters,
if there is a lot of environmental noise,
and further compounding their fatigue
are circadian rhythm differences.
And so some individuals are able to wake up at odd hours
of the night and perform at their optimal alertness and
performance levels while others struggle to become alert.
And so those who are fatigued and not alert are at an
increased risk for making the errors that can cause mishaps.
And so, whoops, actually, I jumped ahead.
To avoid the detrimental effects of fatigue,
we are evaluating how differences in circadian rhythm
can impact mission performance so that we can develop
fatigue mitigation approaches that are tailored to the
individual soldier, and to provide leaders with
the tools they need to enable our soldiers
to perform at their optimal levels.
So moving on in our mission, it's 0100 hours,
and our soldiers are gearing up for their flight,
and in our future scenario this now includes putting on
flight gear that has biometric sensors embedded.
This can include electrodes built into
their helmets, respiration and heart rate monitoring
sensors built into their tops,
and it can also include small cameras
built into the cockpit that monitor their eyes,
their eye movements, and their pupillary changes.
To make this possible, our team of researchers is
examining several different types of sensor technologies.
So we're looking at physiological monitoring of brain
activity, eye activity, respiration,
and heart rate, just to name a few.
We're doing this because we know that wearable technology
is everywhere, and is becoming less invasive.
So our studies will be able to determine what sensors our
soldiers need to wear, and what sensors are needed
to identify the cognitive changes
that may impede mission success.
So through these studies, we will be able to provide
the data that is needed to develop the algorithms
for predicting performance changes,
and develop a standardized objective metric for assessing
the impact of new technology on a soldier's performance.
So this metric could then be integrated into
the acquisition cycle, to ensure that the systems
we're trying to implement in the cockpit or in ground
vehicles actually help the human soldier
working with that machine,
rather than hindering their ability to operate.
And moving on to 0130 hours, our pilots have taken off,
and now in our future scenario they're having to
coordinate not only with other manned aircraft,
but also other unmanned systems.
So this means that they will now have increased information
coming from multiple channels that they'll need to
integrate and make decisions on,
all while flying and maintaining their situational awareness.
To successfully handle the increased flow of information
and maintain the situational awareness that keeps them safe,
our soldiers will need to be able to shift their attention
among different sources of mission input.
To help our soldiers successfully integrate
mission-essential information so that they can make
the decisions that are required,
we are examining how non-invasive brain stimulation
and pharmaceuticals can enhance their ability to
attend to different sources of information,
increase the speed at which they can
process that information, and make quick but accurate
decisions using all of those sources of information.
These will then be used in helping soldiers to enhance
their operational effectiveness before they go on a mission,
as well as to avert operator overload when it's detected
by biometric sensors during the mission.
So moving on in our mission, our pilots have been flying
for several hours, and they're now
at the critical point in the mission.
Their alertness is an absolute
necessity to ensure a successful mission,
but it's becoming difficult to maintain,
as they're now assuming control of autonomous vehicles,
as well as responding to and monitoring
video feeds of those ahead of them.
We do not currently have an objective means to detect
when an operator's performance is slipping in real time,
and so the biometric work that our team is doing
will enable real-time detection
of when the operator is becoming overloaded.
This will allow us to engage an adaptive automation system
that will ensure that the human and the machine are working
in a balanced manner so that the human stays engaged in the
mission, but does not become critically overwhelmed.
And so we're evaluating the impact of different sources of
workload that our operators will encounter,
such as cluttered visual displays,
or increased auditory feeds, and we're using this to
develop a suite of sensors that is most sensitive
for identifying overload within those different sources,
and to aid in determining which tasks
need to be offloaded to the machine
in order to maintain our operators' effectiveness.
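As an illustration only, that kind of offload decision might be sketched as follows; the channel names, the threshold value, and the task mapping are invented assumptions for the sketch, not the team's actual system.

```python
# Hypothetical sketch: decide which tasks to hand to the machine when
# per-channel workload estimates exceed a threshold. Channel names,
# threshold, and task mapping are all illustrative assumptions.

WORKLOAD_THRESHOLD = 0.8  # normalized 0-1 workload score (assumed scale)

# Which task the automation takes over when a given channel is overloaded
OFFLOAD_MAP = {
    "visual": "route_monitoring",
    "auditory": "radio_relay",
    "cognitive": "uas_waypoint_control",
}

def tasks_to_offload(workload_by_channel):
    """Return the tasks to offload to the machine, keeping the human
    engaged on everything else."""
    return [OFFLOAD_MAP[channel]
            for channel, score in workload_by_channel.items()
            if score > WORKLOAD_THRESHOLD and channel in OFFLOAD_MAP]

print(tasks_to_offload({"visual": 0.93, "auditory": 0.41, "cognitive": 0.85}))
# ['route_monitoring', 'uas_waypoint_control']
```

The point of the sketch is only the shape of the decision: per-source workload in, a minimal set of tasks to automate out, so the human stays engaged without being overwhelmed.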
And so to show a quick example of how we are able to
monitor brain activity in near real time we have this clip
from a recent study that we completed,
and so in the lower left box is what the pilot saw
when they were flying, and the lower right hand video
is what the out-the-window view was.
The upper left demonstrates how algorithms, using the
pilots' brain activity that we measured during the flight,
can be used to identify changes in their workload.
And so the yellow line which spikes upward is showing an
increased level of workload experienced.
And then the far right is a heat map showing how the pilot's
brain activity was changing during the flight.
We're most interested in increased
activity in the frontal lobes.
And so you might watch that and think
"So what? It looks kinda cool,
but what's the outcome of this?"
Well, as we continue collecting this type of data,
we can use the numbers driving the changes
in those lines on the graph to predict
when the automation needs to take control,
and use that to avoid a mishap, instead of waiting
until a mishap is likely before engaging an automated system.
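A minimal sketch of what spike detection on such a workload signal could look like, assuming a simple rolling-baseline rule; the window size and threshold multiplier are arbitrary choices for the sketch, not the team's algorithm.

```python
# Illustrative sketch: flag workload spikes in a streaming signal
# (e.g., a frontal-lobe activity index) by comparing each new sample
# to a rolling baseline. Window and k are assumed values.

from collections import deque
from statistics import mean, stdev

def spike_indices(signal, window=5, k=2.0):
    """Return indices where a sample exceeds baseline mean + k*stdev."""
    baseline = deque(maxlen=window)
    spikes = []
    for i, x in enumerate(signal):
        if len(baseline) == window:
            mu, sd = mean(baseline), stdev(baseline)
            if sd > 0 and x > mu + k * sd:
                spikes.append(i)  # automation could take control here
        baseline.append(x)
    return spikes

print(spike_indices([1.0, 1.1, 0.9, 1.0, 1.05, 3.0, 1.0]))
# [5]
```

A real system would need a more robust baseline and artifact rejection, but the same predict-then-hand-off pattern applies: act when the index crosses the learned threshold, before a mishap becomes likely.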
And so finally, our pilots are on the last
leg of their mission. It's 0830 hours, they've returned to
friendly territory, but they've been flying for about
seven hours under some pretty stressful circumstances,
and they're now at risk of becoming
fatigued and complacent.
And, in fact, we know that a significant number of
mishaps tend to happen during this time and within
these sorts of circumstances where
someone's fatigued and likely to become complacent.
And there currently is no validated method
for detecting states of inattention in real time.
Yet we know it's a threat to our aviators,
to our UAS operators, and even to our ground troops.
And so the detection of these states is critical to
ensure safe mission completion.
We will be completing work where we induce this
state in soldiers and measure their physiological
response with our biometric sensors.
This will help us to create a validated and
reliable method to detect inattention and enable the
machine to then shift controls back to the human,
instead of away from the human,
to allow the human operator to stay engaged and
situationally aware of the mission that they are completing.
So those are just a few examples of the ongoing work
that our team of researchers is completing
to help our soldiers in the future battle space,
and I will now turn it back over to Colonel Taylor
for a couple of closing remarks.
- Thank you Dr. Felpman, I think we can all appreciate
how important that work is as we particularly move to
these autonomous, blended-autonomy,
variably autonomous platforms.
Right, they need to know when they need to help you out,
and you need to know when you need
to take control back from them,
and this is the foundational work that helps us
understand when the machine needs to help the man
and when the man can then take back over from the machine.
So we see this application directly translating into
programs like Future Vertical Lift,
the next generation combat vehicle.
We even think that there are going to be applications
in soldier lethality as ground soldier systems
become more and more complicated.
For the trivia question,
who owns most of the UAS platforms in the DOD?
The United States Army, actually.
Between those three programs, Eagle, Raven, and Shadow,
we actually now have more unmanned platforms and
systems than any of the other services.
So there are now more UAS operators
in the Army than in the other services.
So with that I think I'd like to
wrap up and see if there's any questions?
I do appreciate your time and attention today.
It's been a privilege to represent not only
our laboratory but also MRMC today,
and I open the floor to any questions.
(audience mumbling)
Yes sir?
- [Male Audience Member] Do you acquire and
analyze very large databases as part of this work?
- Yes sir.
- Just describe a little bit about what that looks like.
- So, Dr. Felpman is probably better qualified to
answer that, but you can imagine that a single experiment
actually generates enormous amounts of data.
So, that enormous amount of data taken over
multiple subjects, over multiple projects,
over multiple years, we actually
have a very extensive database.
Certainly one of the things we're trying to understand
as we move into the modern era is how to apply
deep learning, learning algorithms,
and data management well,
so that we can mine not only the data we
have collected, but also data that was collected
previously and that we happen to store,
and use it in a way that helps us.
Right, we know from other work
that you can take a machine-learning algorithm
and show it thousands of pictures.
In dermatology, for example, you can show it
thousands and thousands of images,
and the machine can start to detect
basal cell carcinomas about as well as a human being;
with radiologic images it can do the same,
detecting very fine features.
So we also believe that there's a lot of great things
that can be detected in this data.
So in the next six to twelve months we are looking to
see how we can begin to apply that sort of
machine learning to the data that we do have.
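As a toy illustration of the train-then-predict pattern described here: learn from labeled feature vectors and classify new samples. All feature names and numbers below are invented for the sketch; a real effort would involve far richer data and models.

```python
# Toy sketch of supervised learning on physiological features:
# a nearest-centroid classifier over invented (heart_rate, eeg_index)
# samples labeled "nominal" or "overloaded".

from math import dist

def train_centroids(samples, labels):
    """Average the feature vectors for each label."""
    sums, counts = {}, {}
    for x, y in zip(samples, labels):
        sums.setdefault(y, [0.0] * len(x))
        counts[y] = counts.get(y, 0) + 1
        sums[y] = [a + b for a, b in zip(sums[y], x)]
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def predict(centroids, x):
    """Assign x to the label of the nearest centroid."""
    return min(centroids, key=lambda y: dist(centroids[y], x))

# Invented training data: (heart_rate, eeg_index) pairs
X = [(60, 0.2), (65, 0.3), (95, 0.8), (100, 0.9)]
y = ["nominal", "nominal", "overloaded", "overloaded"]
model = train_centroids(X, y)
print(predict(model, (98, 0.85)))
# overloaded
```

The deep-learning work described would replace this with far more capable models, but the workflow is the same: label past physiological data, fit a model, and score new operators against it.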
Any other questions?
(audience mumbling)
Again, if nothing else, I'll be standing around, we'll be
back at our booth if you'd like a further demonstration.
But again, on behalf of General Hokelman,
Commanding Sergeant Major Rogers, and General West
who's also joined us here today,
the current surgeon general, we'd just like to
thank you for the opportunity to present today.
Thank you for your attention, and we'll be
happy to answer any questions you have, so, thank you.
(audience clapping)