Uncertainty and information : foundations of generalized information theory / George J. Klir.
By: Klir, George J.
Contributor(s): John Wiley & Sons [publisher] | IEEE Xplore (Online service) [distributor].
Material type: Book
Publisher: Hoboken, New Jersey : Wiley-Interscience, c2006
Distributor: [Piscataway, New Jersey] : IEEE Xplore, [2005]
Description: 1 PDF (xvii, 499 pages) : illustrations
Content type: text
Media type: electronic
Carrier type: online resource
ISBN: 9780471755579
Subject(s): Uncertainty (Information theory) | Fuzzy systems | Additives | Aggregates | Algorithm design and analysis | Area measurement | Argon | Bibliographies | Books | Calculus | Capacity planning | Cities and towns | Cognition | Complexity theory | Decision making | Density functional theory | Distribution functions | Entropy | Equations | Finite element methods | Fuzzy set theory | Fuzzy sets | Indexes | Information theory | Joints | Materials | Mathematical model | Measurement errors | Measurement uncertainty | Measurement units | Possibility theory | Power measurement | Probabilistic logic | Probability density function | Probability distribution | Sections | Silicon | Statistical analysis | Temperature measurement | Terminology | Tin | Uncertain systems | Uncertainty | Weaving
Genre/Form: Electronic books
Additional physical formats: Print version: No title
DDC classification: 003/.54
Online resources: Abstract with links to resource
Includes bibliographical references (p. 458-486) and indexes.
Preface -- Acknowledgments -- 1 Introduction -- 1.1. Uncertainty and Its Significance -- 1.2. Uncertainty-Based Information -- 1.3. Generalized Information Theory -- 1.4. Relevant Terminology and Notation -- 1.5. An Outline of the Book -- Notes -- Exercises -- 2 Classical Possibility-Based Uncertainty Theory -- 2.1. Possibility and Necessity Functions -- 2.2. Hartley Measure of Uncertainty for Finite Sets -- 2.2.1. Simple Derivation of the Hartley Measure -- 2.2.2. Uniqueness of the Hartley Measure -- 2.2.3. Basic Properties of the Hartley Measure -- 2.2.4. Examples -- 2.3. Hartley-Like Measure of Uncertainty for Infinite Sets -- 2.3.1. Definition -- 2.3.2. Required Properties -- 2.3.3. Examples -- Notes -- Exercises -- 3 Classical Probability-Based Uncertainty Theory -- 3.1. Probability Functions -- 3.1.1. Functions on Finite Sets -- 3.1.2. Functions on Infinite Sets -- 3.1.3. Bayes' Theorem -- 3.2. Shannon Measure of Uncertainty for Finite Sets -- 3.2.1. Simple Derivation of the Shannon Entropy -- 3.2.2. Uniqueness of the Shannon Entropy -- 3.2.3. Basic Properties of the Shannon Entropy -- 3.2.4. Examples -- 3.3. Shannon-Like Measure of Uncertainty for Infinite Sets -- Notes -- Exercises -- 4 Generalized Measures and Imprecise Probabilities -- 4.1. Monotone Measures -- 4.2. Choquet Capacities -- 4.2.1. Möbius Representation -- 4.3. Imprecise Probabilities: General Principles -- 4.3.1. Lower and Upper Probabilities -- 4.3.2. Alternating Choquet Capacities -- 4.3.3. Interaction Representation -- 4.3.4. Möbius Representation -- 4.3.5. Joint and Marginal Imprecise Probabilities -- 4.3.6. Conditional Imprecise Probabilities -- 4.3.7. Noninteraction of Imprecise Probabilities -- 4.4. Arguments for Imprecise Probabilities -- 4.5. Choquet Integral -- 4.6. Unifying Features of Imprecise Probabilities -- Notes -- Exercises -- 5 Special Theories of Imprecise Probabilities -- 5.1. An Overview -- 5.2. Graded Possibilities -- 5.2.1. Möbius Representation -- 5.2.2. Ordering of Possibility Profiles.
5.2.3. Joint and Marginal Possibilities -- 5.2.4. Conditional Possibilities -- 5.2.5. Possibilities on Infinite Sets -- 5.2.6. Some Interpretations of Graded Possibilities -- 5.3. Sugeno λ-Measures -- 5.3.1. Möbius Representation -- 5.4. Belief and Plausibility Measures -- 5.4.1. Joint and Marginal Bodies of Evidence -- 5.4.2. Rules of Combination -- 5.4.3. Special Classes of Bodies of Evidence -- 5.5. Reachable Interval-Valued Probability Distributions -- 5.5.1. Joint and Marginal Interval-Valued Probability Distributions -- 5.6. Other Types of Monotone Measures -- Notes -- Exercises -- 6 Measures of Uncertainty and Information -- 6.1. General Discussion -- 6.2. Generalized Hartley Measure for Graded Possibilities -- 6.2.1. Joint and Marginal U-Uncertainties -- 6.2.2. Conditional U-Uncertainty -- 6.2.3. Axiomatic Requirements for the U-Uncertainty -- 6.2.4. U-Uncertainty for Infinite Sets -- 6.3. Generalized Hartley Measure in Dempster-Shafer Theory -- 6.3.1. Joint and Marginal Generalized Hartley Measures -- 6.3.2. Monotonicity of the Generalized Hartley Measure -- 6.3.3. Conditional Generalized Hartley Measures -- 6.4. Generalized Hartley Measure for Convex Sets of Probability Distributions -- 6.5. Generalized Shannon Measure in Dempster-Shafer Theory -- 6.6. Aggregate Uncertainty in Dempster-Shafer Theory -- 6.6.1. General Algorithm for Computing the Aggregate Uncertainty -- 6.6.2. Computing the Aggregated Uncertainty in Possibility Theory -- 6.7. Aggregate Uncertainty for Convex Sets of Probability Distributions -- 6.8. Disaggregated Total Uncertainty -- 6.9. Generalized Shannon Entropy -- 6.10. Alternative View of Disaggregated Total Uncertainty -- 6.11. Unifying Features of Uncertainty Measures -- Notes -- Exercises -- 7 Fuzzy Set Theory -- 7.1. An Overview -- 7.2. Basic Concepts of Standard Fuzzy Sets -- 7.3. Operations on Standard Fuzzy Sets -- 7.3.1. Complementation Operations -- 7.3.2. Intersection and Union Operations -- 7.3.3. Combinations of Basic Operations.
7.3.4. Other Operations -- 7.4. Fuzzy Numbers and Intervals -- 7.4.1. Standard Fuzzy Arithmetic -- 7.4.2. Constrained Fuzzy Arithmetic -- 7.5. Fuzzy Relations -- 7.5.1. Projections and Cylindric Extensions -- 7.5.2. Compositions, Joins, and Inverses -- 7.6. Fuzzy Logic -- 7.6.1. Fuzzy Propositions -- 7.6.2. Approximate Reasoning -- 7.7. Fuzzy Systems -- 7.7.1. Granulation -- 7.7.2. Types of Fuzzy Systems -- 7.7.3. Defuzzification -- 7.8. Nonstandard Fuzzy Sets -- 7.9. Constructing Fuzzy Sets and Operations -- Notes -- Exercises -- 8 Fuzzification of Uncertainty Theories -- 8.1. Aspects of Fuzzification -- 8.2. Measures of Fuzziness -- 8.3. Fuzzy-Set Interpretation of Possibility Theory -- 8.4. Probabilities of Fuzzy Events -- 8.5. Fuzzification of Reachable Interval-Valued Probability Distributions -- 8.6. Other Fuzzification Efforts -- Notes -- Exercises -- 9 Methodological Issues -- 9.1. An Overview -- 9.2. Principle of Minimum Uncertainty -- 9.2.1. Simplification Problems -- 9.2.2. Conflict-Resolution Problems -- 9.3. Principle of Maximum Uncertainty -- 9.3.1. Principle of Maximum Entropy -- 9.3.2. Principle of Maximum Nonspecificity -- 9.3.3. Principle of Maximum Uncertainty in GIT -- 9.4. Principle of Requisite Generalization -- 9.5. Principle of Uncertainty Invariance -- 9.5.1. Computationally Simple Approximations -- 9.5.2. Probability-Possibility Transformations -- 9.5.3. Approximations of Belief Functions by Necessity Functions -- 9.5.4. Transformations Between λ-Measures and Possibility Measures -- 9.5.5. Approximations of Graded Possibilities by Crisp Possibilities -- Notes -- Exercises -- 10 Conclusions -- 10.1. Summary and Assessment of Results in Generalized Information Theory -- 10.2. Main Issues of Current Interest -- 10.3. Long-Term Research Areas -- 10.4. Significance of GIT -- Notes -- Appendix A Uniqueness of the U-Uncertainty -- Appendix B Uniqueness of Generalized Hartley Measure in the Dempster-Shafer Theory -- Appendix C Correctness of Algorithm 6.1.
Appendix D Proper Range of Generalized Shannon Entropy -- Appendix E Maximum of GSa in Section 6.9 -- Appendix F Glossary of Key Concepts -- Appendix G Glossary of Symbols -- Bibliography -- Subject Index -- Name Index.
Restricted to subscribers or individual electronic text purchasers.
Deal with information and uncertainty properly and efficiently using tools emerging from generalized information theory.

Uncertainty and Information: Foundations of Generalized Information Theory contains comprehensive and up-to-date coverage of results that have emerged from a research program begun by the author in the early 1990s under the name "generalized information theory" (GIT). This ongoing research program aims to develop a formal mathematical treatment of the interrelated concepts of uncertainty and information in all their varieties. In GIT, as in classical information theory, uncertainty (predictive, retrodictive, diagnostic, prescriptive, and the like) is viewed as a manifestation of information deficiency, while information is viewed as anything capable of reducing the uncertainty. A broad conceptual framework for GIT is obtained by expanding the formalized language of classical set theory to include more expressive formalized languages based on fuzzy sets of various types, and by expanding the classical theory of additive measures to include more expressive non-additive measures of various types.

This landmark book examines each of several theories for dealing with particular types of uncertainty at the following four levels:

* Mathematical formalization of the conceived type of uncertainty
* Calculus for manipulating this particular type of uncertainty
* Justifiable ways of measuring the amount of uncertainty in any situation formalizable in the theory
* Methodological aspects of the theory

With extensive use of examples and illustrations to clarify complex material and demonstrate practical applications, generous historical and bibliographical notes, end-of-chapter exercises to test readers' newfound knowledge, glossaries, and an Instructor's Manual, this is an excellent graduate-level textbook, as well as an outstanding reference for researchers and practitioners who deal with the various problems involving uncertainty and information. An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.
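As a purely illustrative sketch (not taken from the book), the following Python fragment computes the two classical uncertainty measures that GIT generalizes: the Hartley measure for finite sets of alternatives (covered in Chapter 2) and the Shannon entropy for probability distributions (covered in Chapter 3). The function names and example values are the editor's own assumptions, not the book's notation.

    from math import log2

    def hartley_measure(finite_set):
        # Hartley measure H(A) = log2 |A| of a nonempty finite set of alternatives (Section 2.2).
        return log2(len(finite_set))

    def shannon_entropy(probabilities):
        # Shannon entropy S(p) = -sum p_i * log2 p_i of a probability distribution (Section 3.2).
        return -sum(p * log2(p) for p in probabilities if p > 0)

    # Four equally likely alternatives carry 2 bits of uncertainty under either measure;
    # a skewed distribution carries less Shannon uncertainty.
    print(hartley_measure({"a", "b", "c", "d"}))       # 2.0
    print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
    print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))       # about 1.357

These two classical measures quantify nonspecificity and conflict, respectively; the book's later chapters develop their generalized counterparts for non-additive measures and fuzzy sets.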
Also available in print.
Mode of access: World Wide Web
Description based on PDF viewed 12/21/2015.