The weirdest thing just happened.
As I often do, I participated tonight in #wjchat, a weekly Twitter chat for journalists seeking to make good use of current information technology (I'm pretty sure that the "wj" is for "web journalist"). We were talking about security and privacy, and someone posted a link to this xkcd cartoon:
After reading it and letting it sink in a bit, I gave up trying to guess what entropy bits are, and turned to Wikipedia (some journalists, by the way, view Wikipedia as totally untrustworthy because it is so easily editable; but I think over time the preponderance of good actors over bad ones is making it increasingly trustworthy and valuable). The entry titled "Entropy (information theory)" opens thus:
In information theory, entropy is a measure of the uncertainty in a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message. Entropy is typically measured in bits, nats, or bans. Shannon entropy is the average unpredictability in a random variable, which is equivalent to its information content. Shannon entropy provides an absolute limit on the best possible lossless encoding or compression of any communication, assuming that the communication may be represented as a sequence of independent and identically distributed random variables.
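If, like me, you want something concrete to hang that definition on: for a random variable whose outcomes have probabilities p1, p2, ..., the Shannon entropy is H = -(p1 log2 p1 + p2 log2 p2 + ...), and using a base-2 logarithm gives the answer in bits. Here is a small Python sketch of my own (not from the Wikipedia article, and the function name is just mine) that computes it for a few simple cases, including the roughly 44 "entropy bits" that, if memory serves, the cartoon assigns to a passphrase of four words drawn at random from a list of 2048 common words:

import math

def shannon_entropy_bits(probabilities):
    # Shannon entropy H = -sum(p * log2(p)), measured in bits.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin flip is as unpredictable as two outcomes can be: 1 bit.
print(shannon_entropy_bits([0.5, 0.5]))      # 1.0

# A heavily biased coin is more predictable, so it carries less entropy.
print(shannon_entropy_bits([0.9, 0.1]))      # about 0.47

# One word chosen uniformly from a 2048-word list: log2(2048) = 11 bits.
one_word = [1 / 2048] * 2048

# Four independent choices add up, giving the cartoon's roughly 44 bits.
print(4 * shannon_entropy_bits(one_word))    # 44.0

In other words, "entropy bits" just measure how unpredictable a choice is; the more equally likely possibilities there are, the more bits it takes to pin the choice down.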
The reference to Claude E. Shannon reminded me of this book, which I read in 1985 or so:
Then the weird thing happened. As I remembered that book, I felt this...warmth inside. The phrase "fire in the belly" comes to mind. I wanted to re-read it, to refresh my memory on its contents - and then to discuss it. I wanted it so much that I posted this on Facebook:
I'm experiencing a desire to read up on information theory, but only if I can talk about it with someone. Anyone out there with an interest in Shannon, Bernoulli, entropy bits, etc.?
Maybe it's not so weird that upon recalling the pleasure of reading a certain book, one might want to re-read it. But the desire was not just to re-read the book; the desire was to learn information theory. More specifically, to learn information theory with a group of people.
Whatever on earth for?
I mean, what will I do with the stuff I re-learn, or the new stuff I learn? Heck if I know. But maybe if I at least have someone to discuss it with, I can extract some measure of day-to-day usefulness from it all. Just as I maintain the hope that if I read consistently at my own level about neuroscience, I can apply my learnings to my community work.
Or if not, maybe I can just get better at enjoying the learning for the joy of learning. Of useless learning.
Which wouldn't be so bad.