<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://www.wikiworld.com/index.php?action=history&amp;feed=atom&amp;title=InformationTheory</id>
	<title>InformationTheory - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://www.wikiworld.com/index.php?action=history&amp;feed=atom&amp;title=InformationTheory"/>
	<link rel="alternate" type="text/html" href="https://www.wikiworld.com/index.php?title=InformationTheory&amp;action=history"/>
	<updated>2026-05-06T14:11:17Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.45.1</generator>
	<entry>
		<id>https://www.wikiworld.com/index.php?title=InformationTheory&amp;diff=1755&amp;oldid=prev</id>
		<title>imported&gt;Import: Imported current content</title>
		<link rel="alternate" type="text/html" href="https://www.wikiworld.com/index.php?title=InformationTheory&amp;diff=1755&amp;oldid=prev"/>
		<updated>2026-01-28T11:54:25Z</updated>

		<summary type="html">&lt;p&gt;Imported current content&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;&lt;br /&gt;
Claude E. Shannon&amp;#039;s 1948 &amp;#039;&amp;#039;&amp;#039;&amp;#039;A Mathematical Theory of Communication&amp;#039;&amp;#039;&amp;#039;&amp;#039; is the basis and substance of information theory.&lt;br /&gt;
&lt;br /&gt;
Shannon defined a &amp;#039;&amp;#039;&amp;#039;&amp;#039;sender&amp;#039;&amp;#039;&amp;#039;&amp;#039; and a &amp;#039;&amp;#039;&amp;#039;&amp;#039;receiver&amp;#039;&amp;#039;&amp;#039;&amp;#039; of a &amp;#039;&amp;#039;&amp;#039;&amp;#039;signal&amp;#039;&amp;#039;&amp;#039;&amp;#039; on a &amp;#039;&amp;#039;&amp;#039;&amp;#039;channel&amp;#039;&amp;#039;&amp;#039;&amp;#039;.&lt;br /&gt;
&lt;br /&gt;
Information is always a measure of the decrease of uncertainty at a receiver.&lt;br /&gt;
&lt;br /&gt;
Specifically, Shannon defined information as the reduction in the uncertainty of the receiver about the state of the sender.&lt;br /&gt;
&lt;br /&gt;
He showed that information can be measured in discrete units called &amp;#039;&amp;#039;&amp;#039;&amp;#039;bits&amp;#039;&amp;#039;&amp;#039;&amp;#039;, where one bit is the amount of information needed to decide between two equally likely alternatives (one yes/no question).&lt;br /&gt;
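A quick illustration (a Python sketch, not part of Shannon&amp;#039;s paper): the information gained from an event, measured in bits, is the negative base-2 logarithm of its probability, so a fair coin flip yields exactly one bit.&lt;br /&gt;

```python
import math

def self_information(p):
    """Bits of information gained when an event of probability p occurs."""
    return -math.log2(p)

# A fair coin flip (p = 1/2) removes one bit of uncertainty.
print(self_information(0.5))   # 1.0 bit
# One of four equally likely outcomes (p = 1/4) removes two bits.
print(self_information(0.25))  # 2.0 bits
```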
&lt;br /&gt;
He formalized the information transfer on both a noiseless and noisy channel.&lt;br /&gt;
&lt;br /&gt;
His measure was formally equivalent to entropy (disorder, S) in thermodynamics, so he called it entropy.  Entropy is most often interpreted as a loss of information, yielding uncertainty; an increase in certainty or order is negative entropy, or increased information.&lt;br /&gt;
&lt;br /&gt;
S = k log(W) is the entropy of a system with W equally probable states.&lt;br /&gt;
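As a sketch in Python (added here for illustration, not from the original page): Shannon entropy is H = -sum(p log2 p), which for W equally likely states reduces to log2(W), the information-theoretic analogue of S = k log(W).&lt;br /&gt;

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over W = 4 states gives H = log2(4) = 2 bits,
# mirroring Boltzmann's S = k log(W) for W equally probable states.
print(shannon_entropy([0.25] * 4))  # 2.0 bits
# A certain outcome carries no information: H = 0.
print(shannon_entropy([1.0]))       # 0.0 bits
```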
&lt;br /&gt;
http://en.wikipedia.org/wiki/Entropy&lt;br /&gt;
&lt;br /&gt;
When uncertainty is normalized to lie between zero and one, information gained plus remaining uncertainty equals one.&lt;br /&gt;
&lt;br /&gt;
In general systems theory, the second law of the [[LawsOfThermodynamics]] says that the entropy of a system tends to increase.  This leads to Euler&amp;#039;s formula for heat dissipation and Schrödinger&amp;#039;s equation of quantum possibilities, but in quantum systems the interactions are non-linear and completely elastic, and fail to progress toward equilibria despite this tendency.&lt;br /&gt;
&lt;br /&gt;
http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html&lt;br /&gt;
----&lt;br /&gt;
Universal information systems can be described as networks of sender-receivers that logically transform information with a delay.&lt;br /&gt;
----&lt;br /&gt;
Is disinformation relevant to information theory?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[InformationPhysics]]&lt;br /&gt;
&lt;/div&gt;</summary>
		<author><name>imported&gt;Import</name></author>
	</entry>
</feed>