The theory of signal detectability (TSD) has broad application to tasks where people must distinguish signal from noise. William Jones has written a long but fascinating article on how we might use TSD to help us stop keeping information that will never be used.
His argument broadly goes like this:
- To keep or not to keep, that is the question. Whether you hoard information in case it’s useful later, or rigorously trash email, magazine articles and the like, is fundamental to Personal Content Management. It is also fundamentally difficult.
- Content gets fragmented across tools. Current PCM tools (from Outlook to PDAs) all mean that information can be kept in different ways. The resulting fragmentation makes the keeping decision harder, and makes the mistakes made in that decision more costly.
- The keeping decision can be seen as a signal detection task. TSD allows for two types of error: saying “yes” to noise, and saying “no” to a signal. In PCM terms: keeping information you won’t use, and chucking information you will (see the sketch after this list).
- Strategy 1: Reduce the cost of noise. In the digital world at least, it is cheap to keep. It takes an awful lot of information to fill up my hard drive. The cost of the first type of error is therefore reduced.
- But there are basic limits to this. People, or at least I, have only a certain amount of time and attention. It may be cheap to keep, but keeping the wrong stuff means I’m likely to miss the gems I need.
- Strategy 2: Reduce the likelihood of making keeping mistakes. Jones thinks the way forward here is what he calls PUTs (Personal Unifying Taxonomies).
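For the signal detection framing, here is a minimal sketch in Python. The mapping of TSD terms onto the keep/chuck decision is my own gloss, not code from Jones’s article:

```python
# The four outcomes of a keep/chuck decision, in signal-detection terms.
# "Signal" = information you will actually use; "noise" = information you won't.
OUTCOMES = {
    ("keep", True): "hit",                  # kept it, and you needed it
    ("keep", False): "false positive",      # kept clutter you never use
    ("chuck", True): "miss",                # binned something you needed
    ("chuck", False): "correct rejection",  # binned junk, rightly
}

def classify(decision: str, turned_out_useful: bool) -> str:
    """Label a decision once you know whether the information was useful."""
    return OUTCOMES[(decision, turned_out_useful)]

print(classify("keep", False))   # false positive: the error Strategy 1 makes cheap
print(classify("chuck", True))   # miss: the error Strategy 2 tries to make rare
```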
It’s well worth a read. I haven’t taken it all in yet, but the grapes from the vine so far are:
Given the available evidence, x, the expected value, E, of keeping information factors in the probabilities (P) of a hit and of a false positive, along with the costs and benefits (V) of each:
#1 E(Keep|x) = V(Hit)*P(Information Is Useful|x) + V(False Positive)*P(Information Is Not Useful|x).
The expected value of not keeping information does likewise:
#2 E(Not Keep|x) = V(Miss)*P(Information Is Useful|x) + V(Correct Rejection)*P(Information Is Not Useful|x).
In these expressions, x is a catch-all representing the total evidence a person has readily available when making a decision. Included in x are the person’s understanding of the information itself, related information the person already “has”, and the activities for which this information will be used.
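To see the arithmetic in action, here is a minimal sketch in Python. The payoffs and the probability are made-up numbers, purely for illustration; Jones’s article supplies the equations, not these values:

```python
def e_keep(p_useful: float, v_hit: float, v_false_positive: float) -> float:
    """Equation #1: E(Keep|x)."""
    return v_hit * p_useful + v_false_positive * (1 - p_useful)

def e_not_keep(p_useful: float, v_miss: float, v_correct_rejection: float) -> float:
    """Equation #2: E(Not Keep|x)."""
    return v_miss * p_useful + v_correct_rejection * (1 - p_useful)

# Illustrative payoffs (entirely made up): a hit is worth 10, a miss costs 8,
# keeping clutter costs 1 in attention and re-finding time, binning junk gains 0.5.
p = 0.2  # P(Information Is Useful|x), estimated from the evidence x
keep = e_keep(p, v_hit=10, v_false_positive=-1)
not_keep = e_not_keep(p, v_miss=-8, v_correct_rejection=0.5)

print(f"E(Keep|x) = {keep:.2f}, E(Not Keep|x) = {not_keep:.2f}")  # 1.20 vs -1.20
print("Keep it" if keep > not_keep else "Chuck it")               # Keep it
```

Rearranging #1 > #2 (and assuming hits beat misses and correct rejections beat false positives) gives a threshold: keep whenever P(Information Is Useful|x) exceeds (V(Correct Rejection) − V(False Positive)) / ((V(Hit) − V(Miss)) + (V(Correct Rejection) − V(False Positive))). With the toy numbers above that threshold is about 0.08, which is Strategy 1 in miniature: when keeping is cheap, even long shots are worth keeping.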
Several things follow from that – not least that a person should keep information when #1 is greater than #2. The one that sold it for me, though, was this:
> People should also be more likely to keep information that is presented in a new form and over a new channel. For example, several people in the KFTF studies reported that when they first started to use a Web browser they freely created Bookmarks or Favorites; now they are much less likely to do so. The sense is that their space has “filled up.”
This struck a chord partly because of Ton Zijlstra’s recent post, but also because of Dave Pollard’s view of Personal Content Management and of a personal taxonomy. I like this idea of two complementary strategies. And I really like the fact that some of the maths behind it allows you to get clear insights expressed in diagrams. Am feeling very undertolerant of aphorisms such as “knowledge is the sound of one hand clapping, wisdom is the second hand”. Erm, perhaps mainly because I don’t understand them.
Just time for a cuppa, and a quick prayer for enlightenment.