Everybody suffers from unrealistic expectations about the value of information. While in general it seems that more information is better, more information still may not be enough to make good decisions.

The flaw of all information systems is that they don’t account for entropy: the entropy of the information itself and the entropy of the systems used to manage it.
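For reference, here is what entropy means in the information-theoretic sense the word borrows from. This is a minimal Python sketch for illustration only, not part of any particular information system: it estimates the Shannon entropy of a message from its character frequencies, so a highly repetitive message measures lower than a varied one of the same length.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Estimate Shannon entropy (bits per character) of a message
    from its character frequencies."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A repetitive message carries less information per character
# than a varied one, even when both are the same length.
print(shannon_entropy("aaaaaaaaaa"))        # 0.0
print(shannon_entropy("status update 42"))  # noticeably higher
```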

To make things manageable, whether they admit it or not, humans reduce the amount of information they consider in any given situation. This is completely understandable, but it is also somewhat random. Things may work out well, but whether they do may have little to do with the information that was used.

To streamline things, we build systems to manage information, but trying to impose order on information results in unintended consequences. Information produces noise and massive amounts of variability.

Information systems are built around a limited set of variables. To the extent the needs behind those variables are common across a range of individuals or organizations, a category of software emerges to “manage” this information. Unsurprisingly, this software grows more complex over time, because more needs are uncovered as people work with the tools.

Information expands and becomes more complex over time, and the systems we use to manage it expand along with it. That expansion in multiple dimensions is great, until it gets to be too much.