I'm reading The Constitution of Knowledge by Jonathan Rauch (2021). It has been a very slow read for me because it is such an impressive and detailed analysis of what is ailing us today. Here is a major distinction that is largely unappreciated: information is merely "stuff," whereas knowledge must be carefully earned through the use of intricate institutions that coordinate, test, and refine human observations and conclusions. This excerpt is from page 125:
What the institutionalization of modern, fact-based journalism did was to create a system of nodes—professional newsrooms which can choose whether to accept information and pass it on. The reality-based community is a network of such nodes: publishers, peer reviewers, universities, agencies, courts, regulators, and many, many more. I like to imagine the system’s institutional nodes as filtering and pumping stations through which propositions flow. Each station acquires and evaluates propositions, compares them with stored knowledge, hunts for error, then filters out some propositions and distributes the survivors to other stations, which do the same.
Importantly, they form a network, not a hierarchy. No single gatekeeper can decide which hypotheses enter the system, and there are infinitely many pathways through it. . .
Suppose some mischievous demon were to hack into the control center one night and reverse the pumps and filters. Instead of straining out error, they pass it along. In fact, instead of slowing the dissemination of false and misleading claims, they accelerate it. Instead of marginalizing ad hominem attacks, they encourage them. Instead of privileging expertise, they favor amateurism. Instead of validating claims, they share claims. Instead of trafficking in communication, they traffic in display. Instead of identifying sources, they disguise them. Instead of rewarding people who persuade others, they reward those who publicize themselves. If that were how the filtering and pumping stations worked, the system would acquire a negative epistemic valence. It would actively disadvantage truth. It would be not an information technology but misinformation technology.
No one saw anything like that coming. We—I certainly include myself—expected digital technology to broaden and deepen the marketplace of ideas. There would be more hypotheses, more checkers, more access to expertise. How could that not be a leap forward for truth? At worst, we assumed, the digital ecosystem would be neutral. It might not necessarily tilt toward reality, but neither would it systematically tilt against reality.
Unfortunately, we forgot that staying in touch with reality depends on rules and institutions. We forgot that overcoming our cognitive and tribal biases depends on privileging those rules and institutions, not flattening them into featureless, formless “platforms.” In other words, we forgot that information technology is very different from knowledge technology. Information can be simply emitted, but knowledge, the product of a rich social interaction, must be achieved. Converting information into knowledge requires getting some important incentives and design choices right. Unfortunately, digital media got them wrong.