Information Theory – Less Is More, More or Less

Published by Lex

Information theory is the underlying foundation upon which modern information technology is built.

Two key concepts of information theory are:

1) Encode as much information in as little space as possible.
2) Confirm that the information has been transmitted and received intact.
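Both concepts map onto tools most programming environments already ship. A minimal Python sketch, using zlib's DEFLATE as a stand-in for compact encoding and a SHA-256 digest as a stand-in for confirming receipt (real protocols use acknowledgements and error-correcting codes, but the shape is the same):

```python
import hashlib
import zlib

message = b"Encode as much information in as little space as possible. " * 10

# Concept 1: encode compactly. Repetitive input compresses well.
encoded = zlib.compress(message)

# Concept 2: confirm what was sent is what arrived. The sender publishes
# a digest; the receiver decodes and recomputes it to verify.
digest = hashlib.sha256(message).hexdigest()

decoded = zlib.decompress(encoded)
received_digest = hashlib.sha256(decoded).hexdigest()

assert len(encoded) < len(message)   # less space
assert received_digest == digest     # transmission confirmed
```

The point is that neither half is optional: compact encoding without verification invites silent corruption, and verification without compact encoding wastes the channel.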

These concepts are powerful, but they apply most cleanly at the technical level. The principles of information theory carry over to human-to-human communication, yet the complexity of human communication makes them much harder to apply in practice.

The first question is how much information is enough. Shorthand is great, but all parties need to be able to understand it for it to be effective. (This partly explains why so many disciplines have specialized vocabulary, but that is a subject for another day.)
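Shorthand is, in effect, a compression scheme with a shared codebook. A hypothetical sketch (the abbreviations and codebook here are invented for illustration): the message only decodes correctly when sender and receiver hold the same dictionary.

```python
# A hypothetical shared codebook: both parties must hold the same mapping.
CODEBOOK = {
    "EOD": "end of day",
    "OOO": "out of office",
    "ETA": "estimated time of arrival",
}

def expand(shorthand: str, codebook: dict) -> str:
    # Expand each token the codebook knows; pass unknown tokens through.
    return " ".join(codebook.get(token, token) for token in shorthand.split())

# With the shared codebook, the message decodes as intended.
print(expand("report ETA EOD", CODEBOOK))  # report estimated time of arrival end of day

# A receiver without the codebook just gets the raw shorthand back.
print(expand("report ETA EOD", {}))        # report ETA EOD
```

The shorthand saves space only because the expansion cost was paid up front, once, by everyone who learned the codebook.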

More things, more data, and more data about things. This creates more things to manage the data and more data to manage the things. It seems like an endless cycle, and it doesn't even begin to cover the amount of information lost in the interactions between all of these elements.

But as it has become easier to share unstructured data, information sharing has become sloppier and less precise. Dumping it all in an email or into a channel and expecting someone to find and understand it is not realistic. While this is expedient and feels like documentation, it is not effective communication. Volume is not a substitute for clarity.

In a world of ambiguity, clarity is not easy to achieve. Oversimplification is easily mistaken for clarity too.

The solution to this problem is not better technology, but much better management of information.

Across all of the senses, the human brain is always processing information: some for basic functioning and survival, the rest for other motivations, curiosity or entertainment perhaps.

And when you add up all of the individuals that make up an organization, you have an exponentially messy information situation. There is nuance, there is miscommunication, there is timing, and there is gaming of information in one form or another.

The way that most organizations deal with the increase in available knowledge is to hire more specialists to handle the information. However, more specialists mean more information coordination, more coordination means more information handoffs, and information handoffs are still clunky.

While a lot of work has been done to improve the information handoffs, the solutions don’t seem to scale very well. There are standards for information exchange, but there are still a lot of “gotchas” when moving information between systems.
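One representative "gotcha" (a hypothetical example, not drawn from any particular system): even a ubiquitous interchange format like JSON silently changes types in transit, so the receiving system must know, out of band, how to recover what the sender meant.

```python
import json
from datetime import date

# The sender's record, with a proper date type.
record = {"id": 7, "shipped": date(2024, 1, 15), "qty": 1.0}

# date isn't JSON-native, so the sender coerces it to a string on the wire.
wire = json.dumps({**record, "shipped": record["shipped"].isoformat()})

received = json.loads(wire)

# The receiver now holds a plain string where the sender had a date,
# and must apply an agreed-upon convention to parse it back.
assert isinstance(received["shipped"], str)
assert date.fromisoformat(received["shipped"]) == record["shipped"]
```

Every such conversion is a small handoff, and each one is a place where the convention can drift between systems.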

Even information that stays within one system is plagued with issues. Inconsistent policies, inconsistent data entry, programming changes, and other issues can lead to unanticipated problems that cascade down into other systems.

In theory, big data systems can filter out the noise and come up with something close to the truth, but something tells me this truth is closer to an average over time than the truth at any particular point in time.

But because the data problem is so immense, we continue to poke away at the edges hoping that a better algorithm or a simpler interface will save us.

But it is an endless cycle. Data and information are additive: they are built on what came before, regardless of the strength of the foundation.

Building effective modern information management and communication systems requires some serious rethinking about the entire information management lifecycle. Getting back to the basics of information theory seems like a good place to start.