Decision making under uncertainty : theory and application / Mykel J. Kochenderfer, with contributions from Christopher Amato, Girish Chowdhary, Jonathan P. How, Hayley J. Davison Reynolds, Jason R. Thornton, Pedro A. Torres-Carrasquillo, N. Kemal Üre, John Vian.
By: Kochenderfer, Mykel J.
Contributor(s): IEEE Xplore (Online Service) [distributor.] | MIT Press [publisher.].
Material type: Book
Series: Lincoln Laboratory series
Publisher: Cambridge, Massachusetts : MIT Press, [2015]
Distributor: [Piscataway, New Jersey] : IEEE Xplore, [2015]
Description: 1 PDF (xxv, 323 pages) : illustrations (some color), portraits
Content type: text
Media type: electronic
Carrier type: online resource
ISBN: 9780262331708
Subject(s): Intelligent control systems | Automatic machinery | Decision making -- Mathematical models | Epitaxial layers | Excitons | Nitrogen | Radiative recombination | Silicon carbide | Temperature measurement
Genre/Form: Electronic books
Additional physical formats: Print version: No title
DDC classification: 003/.56
Online resources: Abstract with links to resource
Also available in print.
Summary: Many important problems involve decision making under uncertainty -- that is, choosing actions based on often imperfect observations, with unknown outcomes. Designers of automated decision support systems must take into account the various sources of uncertainty while balancing the multiple objectives of the system. This book provides an introduction to the challenges of decision making under uncertainty from a computational perspective. It presents both the theory behind decision making models and algorithms and a collection of example applications that range from speech recognition to aircraft collision avoidance. Focusing on two methods for designing decision agents, planning and reinforcement learning, the book covers probabilistic models, introducing Bayesian networks as a graphical model that captures probabilistic relationships between variables; utility theory as a framework for understanding optimal decision making under uncertainty; Markov decision processes as a method for modeling sequential problems; model uncertainty; state uncertainty; and cooperative decision making involving multiple interacting agents. A series of applications shows how the theoretical concepts can be applied to systems for attribute-based person search, speech applications, collision avoidance, and unmanned aircraft persistent surveillance. Decision Making Under Uncertainty unifies research from different communities using consistent notation, and is accessible to students and researchers across engineering disciplines who have some prior exposure to probability theory and calculus. It can be used as a text for advanced undergraduate and graduate students in fields including computer science, aerospace and electrical engineering, and management science. It will also be a valuable professional reference for researchers in a variety of disciplines.
Includes bibliographical references and index.
Restricted to subscribers or individual electronic text purchasers.
Also available in print.
Mode of access: World Wide Web
Description based on PDF viewed 12/29/2015.