INFORMATION-THEORETIC MEASURE AND PRINCIPLE OF MAXIMUM ENTROPY
Let P = (p1, p2, …, pn) be a probability distribution. The first important measure of information was given by the pioneering communication engineer C. E. Shannon [2] in 1948. The amount of information is the amount by which the uncertainty in a situation characterized by the probability distribution P is reduced once it becomes known which outcome has occurred. The problem therefore reduces to finding the uncertainty associated with P. On the basis of some plausible postulates, the measure of uncertainty of P is deduced to be
H(P) = -∑ p_i ln p_i,  i = 1, …, n        (1)
H(P) is known as a measure of entropy; here the word entropy stands for 'uncertainty'. H(P) satisfies most of the properties required of a measure of entropy: non-negativity, concavity, additivity, increase with the number of outcomes, a maximum for the uniform distribution, and a minimum of zero for a degenerate distribution. Information and uncertainty are thus intrinsically related:
Information gained = Uncertainty removed
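As an illustration of (1) and the properties listed above, here is a minimal Python sketch (the function name shannon_entropy and the use of NumPy are illustrative choices, not taken from the original text). It computes H(P) in nats and checks two of the stated properties: the uniform distribution attains the maximum value ln(n), and a degenerate distribution attains the minimum value zero.

import numpy as np

def shannon_entropy(p):
    # H(P) = -sum_i p_i ln p_i, measured in nats.
    # Zero-probability outcomes contribute nothing, by the convention 0 * ln 0 = 0.
    p = np.asarray(p, dtype=float)
    nonzero = p[p > 0]
    return float(-np.sum(nonzero * np.log(nonzero)))

# Maximum for the uniform distribution over 4 outcomes: H = ln(4) ≈ 1.386 nats
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))

# Minimum (zero) for a degenerate distribution (one certain outcome)
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))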
My QUESTION: HOW CAN THIS POWERFUL PRINCIPLE BE USED FOR PRODUCT DESIGN AND SYSTEM ANALYSIS?