garden/KGBicheno/Artificial Intelligence/Feminist Chatbot/The Feminist Design Tool/7 Data.md by @KGBicheno
7 Data
There are two main areas where data can be problematic:
- Training data for machine learning carrying its own inherent bias, depending on how it is chosen
- Data collection omitting questions or responses relevant to demographics outside the mainstream
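Both problems can be made visible with a simple audit before training. The sketch below is a minimal, hypothetical example (the `records` schema, `group_key`, and `threshold` are all assumptions, not part of the Feminist Design Tool): it counts how often each demographic group appears in a dataset, and flags groups that fall below a chosen share, including rows where the attribute was never collected at all.

```python
from collections import Counter

def representation_report(records, group_key, threshold=0.1):
    """Report each demographic group's share of the dataset and flag
    groups below a minimum-representation threshold.

    `records` is a list of dicts; `group_key` names the demographic
    attribute to audit. Rows missing the attribute land in an
    'unrecorded' bucket, surfacing omitted data collection.
    """
    counts = Counter(r.get(group_key, "unrecorded") for r in records)
    total = sum(counts.values())
    report = {}
    for group, n in counts.items():
        share = n / total
        report[group] = {
            "count": n,
            "share": share,
            "underrepresented": share < threshold,
        }
    return report

# Toy data only; real training data would be far larger and the
# attribute names depend on your own schema.
sample = [{"gender": "woman"}, {"gender": "man"}, {"gender": "man"},
          {"gender": "non-binary"}, {}]
print(representation_report(sample, "gender"))
```

A report like this does not remove bias on its own, but it turns the first question below ("how will you collect and treat data?") into something you can check rather than assume.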
Questions to ask yourself
- How will you collect and treat data through the development of your design?
- Are you aware of how bias might manifest itself in your training data?
- Are you aware of how bias might manifest itself in the AI techniques that power your design (like machine learning)?
- How could stakeholder-generated data and feedback be used to improve the design?
- Will the design learn from the stakeholder's behaviour, and if so, are you assuming that the design will get it right?
- What mechanisms or features could make these assumptions visible to the stakeholder and empower them to change the assumptions if they want to?
- How will you protect stakeholder data?
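On the last question, one concrete protection is to pseudonymize stakeholder identifiers before they are stored alongside conversation data. This is a minimal sketch, not the tool's prescribed method: the key handling is simplified, and the environment-variable name is a placeholder. A keyed HMAC is used rather than a bare hash so that common identifiers (like email addresses) cannot be recovered by a simple dictionary attack.

```python
import hashlib
import hmac
import os

# Hypothetical setup: in production the key would come from a secrets
# manager, never a hard-coded default.
SECRET_KEY = os.environ.get("PSEUDONYM_KEY", "change-me").encode()

def pseudonymize(identifier: str) -> str:
    """Return a stable pseudonym for a stakeholder identifier, so raw
    identities never sit next to chat logs."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

print(pseudonymize("ada@example.org"))
```

Because the same identifier always maps to the same pseudonym, logs remain linkable for improving the design (per the stakeholder-feedback question above) without exposing who said what.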
Go to [[8 Architecture]]
For the full list see [[The Feminist Design Tool]]
This is part of the [[Feminist Chatbot Main Page]]
β₯
related node [[2007 04 09 the travesty of canadian mobile data rates]]
β₯
related node [[2009 04 27 open web vancouver open restaurants open data]]
β₯
related node [[20210321205327 the_commodification_of_data]]
β₯
related node [[20210516170353 data_type]]
β₯
related node [[20210714215033 data_is_a_commodity]]
β₯
related node [[20210714215054 data]]
β₯
related node [[20210714214523 data_relations_logic_mag]]