πŸ“š node [[bias_(ethics fairness)]]

bias (ethics/fairness)

Go back to the [[AI Glossary]]

#fairness

  1. Stereotyping, prejudice, or favoritism towards some things, people, or groups over others. These biases can affect the collection and interpretation of data, the design of a system, and how users interact with a system. Forms of this type of bias include:

    • automation bias
    • confirmation bias
    • experimenter’s bias
    • group attribution bias
    • implicit bias
    • in-group bias
    • out-group homogeneity bias
  2. Systematic error introduced by a sampling or reporting procedure (a short sketch follows this list). Forms of this type of bias include:

    • coverage bias
    • non-response bias
    • participation bias
    • reporting bias
    • sampling bias
    • selection bias
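
A minimal Python sketch of this second sense, systematic error introduced by a flawed sampling procedure. The screen-time scenario, the numbers, and the pop-up-survey response model are all made up for illustration:

```python
import random

random.seed(0)

# Hypothetical population of 10,000 people whose true average daily
# screen time is about 3 hours.
population = [random.gauss(3.0, 1.5) for _ in range(10_000)]

# A simple random sample gives an estimate close to the true mean.
random_sample = random.sample(population, 500)
print("random sample mean:", sum(random_sample) / len(random_sample))

# A flawed procedure: suppose heavier users are more likely to answer an
# online pop-up survey, so the collected data over-represents them and the
# estimate is systematically too high (coverage/selection bias).
biased_sample = [x for x in population
                 if random.random() < min(1.0, max(0.0, x) / 6.0)]
print("biased sample mean:", sum(biased_sample) / len(biased_sample))
```

Running it shows the random sample landing near 3 hours while the biased sample drifts noticeably higher; that gap is the systematic error this definition refers to.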

Not to be confused with the bias term in machine learning models or with prediction bias.
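
For contrast, a quick sketch of those other two senses, following the standard definitions: the bias term is simply the intercept $b$ in a model such as a linear one, while prediction bias measures how far the average prediction drifts from the average label over a dataset of $N$ examples:

$$y' = b + w_1 x_1 + \cdots + w_n x_n, \qquad \text{prediction bias} = \frac{1}{N}\sum_{i=1}^{N} y'_i - \frac{1}{N}\sum_{i=1}^{N} y_i$$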
