Brian,
I think it depends on who you talk to. There is a challenge for the AI community - I think - to figure out what the human is up to. In my opinion, a human factors challenge is to determine what the requirements are for providing data to AI systems without the human having to do so explicitly. My potential application area is air traffic control, and it would be nice if an AI could be aware of what an air traffic controller is looking at, but what (human factors) requirements do we have for an eye tracking system that would provide that data? For example, if the eye tracking system is not accurate enough, an AI system may create a whole new type of nuisance for the human ("Brian, did you notice that aircraft?" - even though you are looking right at it).
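To make the accuracy point concrete, here is a back-of-the-envelope sketch (all function names, error models, and numbers are hypothetical illustrations, not from any real eye tracking product): if the tracker's angular error is larger than the target, a naive "is the controller looking at it?" check will generate exactly the nuisance alerts described above.

```python
def gaze_on_target(gaze_deg, target_deg, tracker_error_deg):
    """Return True if the reported gaze falls within the allowed
    match window around the target center (single-axis degrees of
    visual angle; a deliberate simplification)."""
    offset = abs(gaze_deg - target_deg)
    return offset <= tracker_error_deg

def nuisance_rate(target_halfwidth_deg, tracker_error_deg):
    """Crude proxy for false 'not looking' alerts: the fraction of
    the tracker's error range that lands outside the target, under
    a uniform-error assumption (purely illustrative)."""
    if tracker_error_deg <= target_halfwidth_deg:
        return 0.0  # error always lands inside the target
    return 1.0 - target_halfwidth_deg / tracker_error_deg
```

With a target half a degree wide (half-width 0.25 deg) and a tracker accurate to 0.5 deg, `nuisance_rate(0.25, 0.5)` gives 0.5: half the time the system could conclude the controller is not looking at an aircraft they are in fact fixating, which is the kind of human factors requirement question raised above.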
It also depends on how we define AI; e.g., some will label speech recognition as AI, but do we have HF requirements for speech recognition to become part of such systems? It is one thing for Siri or Google Assistant to require me to repeat three times that I want to add bananas to my shopping list, but in a safety-critical system that would result in the user turning that function off.
------------------------------
Bernardus (Ben) Willems
Engineering Research Psychologist
Mays Landing NJ
------------------------------
Original Message:
Sent: 02-16-2023 14:59
From: Brian Green
Subject: AAAI23
Hi all,
I was lucky enough to go to the AAAI23 conference this week in DC. It was a very informative adventure.
I found it very interesting as a human factors engineer to learn that Human-AI interaction is not necessarily a human-centric consideration; often it is the human supporting the needs of AI systems.
I'm curious if any of the rest of you were there and what you thought of it.
Brian Green, Human Factors Team Lead
NRR/DRO/IOLB/HFT
"All organizations are perfectly designed to get the results they get."
~Arthur W. Jones, Organizational Design Expert for Procter & Gamble