Microsoft Responsible AI Impact Assessment Template
Role of humans
2.7 Indicate with an “X” the description that best represents the system regarding this intended use.
[ ] People will be responsible for troubleshooting triggered by system alerts but will not otherwise oversee system operation. For example, an AI system that generates keywords from unstructured text alerts the operator of errors, such as improper format of submission files.
[ ] The system will support effective hand-off to people but will be designed to automate most use. For example, an AI system that generates keywords from unstructured text can be configured by system admins to alert the operator when keyword generation falls below a certain confidence threshold.
[ ] The system will require effective hand-off to people but will be designed to automate most use. For example, an AI system that generates keywords from unstructured text alerts the operator when keyword generation falls below a certain confidence threshold (regardless of system admin configuration).
[ ] People will evaluate system outputs and can intervene before any action is taken: the system will proceed unless the reviewer intervenes. For example, an AI system that generates keywords from unstructured text will deliver the generated keywords for operator review but will finalize the results unless the operator intervenes.
[ ] People will make decisions based on output provided by the system: the system will not proceed unless a person approves. For example, an AI system that generates keywords from unstructured text will not finalize the results without review and approval from the operator.
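The oversight levels above differ in when a human is pulled into the loop. As a rough illustration only (the template does not prescribe any implementation), the confidence-threshold hand-off and the approval gate could be sketched as follows; the `generate_keywords` stand-in, the 0.7 threshold, and all names are hypothetical:

```python
# Hypothetical sketch of two oversight patterns from the list above:
# a confidence-threshold hand-off and a human-approval gate.

def generate_keywords(text):
    # Stand-in for a real keyword model: returns (keywords, confidence).
    words = [w.strip(".,") for w in text.split() if len(w) > 6]
    confidence = min(1.0, len(words) / 5)
    return words, confidence

def process(text, threshold=0.7, require_approval=False, approve=None):
    """Return finalized keywords, or None when control is handed to a person.

    - Below `threshold`, the operator is alerted and the system hands off.
    - With `require_approval=True`, results are finalized only when the
      `approve` callback (a human decision) returns True.
    """
    keywords, confidence = generate_keywords(text)
    if confidence < threshold:
        print(f"ALERT: confidence {confidence:.2f} below {threshold}; operator review needed")
        return None
    if require_approval:
        if approve is None or not approve(keywords):
            return None  # the system does not proceed without approval
    return keywords
```

Which variant applies depends on the row marked above: the alert-only rows correspond to the threshold branch, while the last row corresponds to running with `require_approval=True`.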
Deployment environment complexity
2.8 Indicate with an “X” the description that best represents the system regarding this intended use.
[ ] Simple environment, such as when the deployment environment is static, possible input options are limited, and there are few unexpected situations that the system must deal with gracefully. For example, a natural language processing system used in a controlled research environment.
[ ] Moderately complex environment, such as when the deployment environment varies and unexpected situations that the system must handle gracefully may occur, but when they do, there is little risk to people and it is clear how to mitigate issues effectively. For example, a natural language processing system used in a corporate workplace where language is professional and communication norms change slowly.
[ ] Complex environment, such as when the deployment environment is dynamic: the system will be deployed in an open and unpredictable setting or may be subject to drift in input distributions over time. There are many possible types of inputs, and inputs may vary significantly in quality. Time and attention may be at a premium when making decisions, and it can be difficult to mitigate issues. For example, a natural language processing system used on a social media platform where language and communication norms change rapidly.
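The "drift in input distributions" mentioned for complex environments is something a team can monitor rather than just assess. As one hypothetical illustration (not part of the template), a population stability index over binned input features is a common rule-of-thumb check; the bin counts and the 0.2 alert level below are made up for the example:

```python
# Hypothetical sketch of monitoring for input-distribution drift using a
# population stability index (PSI) over binned feature counts.
import math

def psi(expected_counts, observed_counts):
    """Population stability index between two binned distributions."""
    e_total = sum(expected_counts)
    o_total = sum(observed_counts)
    score = 0.0
    for e, o in zip(expected_counts, observed_counts):
        # Small floor avoids log(0) for empty bins.
        e_pct = max(e / e_total, 1e-6)
        o_pct = max(o / o_total, 1e-6)
        score += (o_pct - e_pct) * math.log(o_pct / e_pct)
    return score

baseline = [50, 30, 20]   # bin counts at deployment time (illustrative)
current = [20, 30, 50]    # bin counts observed later (illustrative)
if psi(baseline, current) > 0.2:  # 0.2 is a common rule-of-thumb alert level
    print("Input drift detected: trigger operator review")
```

A near-zero PSI means the input distribution is stable; larger values suggest the deployment environment has shifted and the mitigations identified in this assessment may need revisiting.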