1
The most incomprehensible thing about the world is that it is comprehensible. (Albert Einstein)
2
Bayesian Cognition
Probabilistic models of action, perception, inference, decision and learning
Julien Diard, CNRS - Laboratoire de Psychologie et NeuroCognition
Pierre Bessière, CNRS - Laboratoire de Physiologie de la Perception et de l'Action
3
To get more info: Bayesian-Programming.org and ftp://ftp-serv.inrialpes.fr/pub/emotion/bayesian-programming/Cours
4
Plan / schedule
Bessière c1 15/11: Incompleteness, uncertainty, Bayesian Program, Bayesian inference
Diard c2 29/11, c3 13/12, c4 03/01: Bayesian models in robotics and cognitive science
Diard c5 10/01: Model selection, machine learning, model distinguishability
Bessière c6 17/01: Complements: inference algorithms, maximum entropy
5
Perception test
Daniel J. Simons & Christopher Chabris, Harvard University
10
Probability Theory as an alternative to Logic
The actual science of logic is conversant at present only with things either certain, impossible, or entirely doubtful, none of which (fortunately) we have to reason on. Therefore the true logic for this world is the calculus of Probabilities, which takes account of the magnitude of the probability which is, or ought to be, in a reasonable man's mind. (James Clerk Maxwell)
11
Incompleteness and Uncertainty
A very small cause which escapes our notice determines a considerable effect that we cannot fail to see, and then we say that the effect is due to chance. H. Poincaré
12
Shape from Image
13
Shape from Motion
Colas, F., Droulez, J., Wexler, M. & Bessière, P. (2008). Unified probabilistic model of the perception of three-dimensional structure from optic flow. Biological Cybernetics, in press.
Colas, F. (2006). Perception des objets en mouvement : composition bayésienne du flux optique et du mouvement de l'observateur. Thèse INPG.
14
Illusions: the McGurk effect
Courtesy of Arnt Maasø, Associate Professor, University of Oslo.
Cathiard, M.-A., Schwartz, J.-L. & Abry, C. (2001). Asking a naive question to the McGurk effect: why does audio [b] give more [d] percepts with visual [g] than with visual [d]? In Proceedings of Auditory-Visual Speech Processing (AVSP'2001), Aalborg, Denmark.
15
Credit card fraud detection
16
Beam-in-the-Bin experiment (Set-up)
17
Beam-in-the-Bin experiment (Results)
18
Beam-in-the-Bin experiment (Results)
19
Beam-in-the-Bin experiment (Results)
20
Logical Paradigm Incompleteness
21
Bayesian Paradigm
P(M ∧ S | D ∧ C) = P(S | D ∧ C) P(M | S ∧ D ∧ C)
22
Principle
Incompleteness → Uncertainty → Decision
Preliminary Knowledge + Experimental Data = Probabilistic Representation (Bayesian Learning handles incompleteness)
Bayesian Inference turns the resulting uncertainty into decisions
23
Thesis
Probabilistic inference and learning theory, considered as a model of reasoning, is a new paradigm (an alternative to logic) to explain and understand perception, inference, decision, learning and action.
Probability theory is nothing but common sense reduced to calculation. (Marquis Pierre-Simon de Laplace)
The actual science of logic is conversant at present only with things either certain, impossible, or entirely doubtful, none of which (fortunately) we have to reason on. Therefore the true logic for this world is the calculus of Probabilities, which takes account of the magnitude of the probability which is, or ought to be, in a reasonable man's mind. (James Clerk Maxwell)
By inference we mean simply: deductive reasoning whenever enough information is at hand to permit it; inductive or probabilistic reasoning when, as is almost invariably the case in real problems, all the necessary information is not available. Thus the topic of "Probability as Logic" is the optimal processing of uncertain and incomplete knowledge. (E.T. Jaynes)
Subjectivist vs objectivist epistemology of probabilities?
24
A water treatment unit (1)
Complete simulation vs incomplete model
Observe the consequences of this incompleteness
The output O takes 11 values, 0 being the worst
25
A water treatment unit (2)
26
A water treatment unit (3)
27
A water treatment unit (4)
28
A water treatment unit (5)
29
Uncertainty on O due to inaccuracy in S
30
Uncertainty due to the hidden variable H
31
Not taking into account the effect of hidden variables may lead to a wrong decision (1)
C = 0, 1 or 2 leads to the optimal value O* = 6
With H, the "reality" is somewhat more complex
The adequate choice of C is more complex but also better informed
32
Not taking into account the effect of hidden variables may lead to a wrong decision (2)
C = 0, 1 or 2 leads to the optimal value O* = 6
With H, the "reality" is somewhat more complex
The adequate choice of C is more complex but also better informed
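To make the point concrete, here is a toy C++ sketch, with all numbers invented for illustration: a hidden binary variable H shifts the outcome O obtained for a given control C, so a model that ignores H and a model that averages over H disagree on the best choice of C.

#include <iostream>

int main() {
  // Hypothetical outcome table O(C, H) for controls C = 0..2 and a
  // hidden binary variable H; all values are invented for illustration.
  double O[3][2] = {{6, 0}, {5, 5}, {6, 1}};
  double pH1 = 0.5;  // assumed probability that H = 1

  for (int c = 0; c < 3; ++c) {
    double ignoring_H = O[c][0];                             // model without H
    double expected = (1 - pH1) * O[c][0] + pH1 * O[c][1];   // averaging over H
    std::cout << "C=" << c << "  ignoring H: " << ignoring_H
              << "  expected over H: " << expected << "\n";
  }
  // Ignoring H, C = 0 or C = 2 look optimal (O = 6); averaging over H,
  // C = 1 is the better-informed choice (5 versus 3 and 3.5).
  return 0;
}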
33
Principle
Incompleteness → Uncertainty → Decision
Preliminary Knowledge + Experimental Data = Probabilistic Representation (Bayesian Learning handles incompleteness)
Bayesian Inference turns the resulting uncertainty into decisions
34
Basic Concepts
Far better an approximate answer to the right question, which is often vague, than an exact answer to the wrong question, which can always be made precise. (John W. Tukey)
35
Bayesian Spam Detection
Classify texts into two categories, "spam" and "nonspam"
Only available information: a set of words
Adapt to the user and learn from experience
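As an illustration of the task (not the course's own model), here is a minimal naive Bayes sketch in C++; the dictionary, word counts and smoothing are invented for the example.

#include <cmath>
#include <iostream>
#include <map>
#include <string>
#include <vector>

// Hypothetical per-word statistics learned from a user's mail;
// the counts below are invented for illustration.
struct WordStats { int in_spam; int in_nonspam; };

int main() {
  std::map<std::string, WordStats> stats = {
      {"viagra", {80, 1}}, {"meeting", {5, 60}}, {"free", {70, 20}}};
  int n_spam = 100, n_nonspam = 100;  // number of mails already classified

  std::vector<std::string> mail = {"free", "meeting"};

  // Naive Bayes: P(Spam | words) ∝ P(Spam) Π_i P(word_i | Spam),
  // computed in log space, with Laplace smoothing (+1) to avoid zeros.
  double log_spam = std::log(double(n_spam) / (n_spam + n_nonspam));
  double log_nonspam = std::log(double(n_nonspam) / (n_spam + n_nonspam));
  for (const auto& w : mail) {
    const WordStats& s = stats[w];
    log_spam += std::log((s.in_spam + 1.0) / (n_spam + 2.0));
    log_nonspam += std::log((s.in_nonspam + 1.0) / (n_nonspam + 2.0));
  }
  // Normalize to get the posterior probability of "spam".
  double p_spam = 1.0 / (1.0 + std::exp(log_nonspam - log_spam));
  std::cout << "P(spam | mail) = " << p_spam << "\n";
  return 0;
}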
36
Variable
37
Probability
38
Normalization postulate: ∑_X P(X) = 1
39
Conditional probability: P(X | Y), the probability of X given the value of Y
40
Variable conjunction: X ∧ Y, a new variable whose values are the pairs of values of X and Y
41
Conjunction postulate: P(X ∧ Y) = P(X) P(Y | X) = P(Y) P(X | Y)
42
Syllogisms
Logical syllogisms:
Modus Ponens: from a ⇒ b and a, conclude b
Modus Tollens: from a ⇒ b and ¬b, conclude ¬a
Probabilistic syllogisms (when a ⇒ b, i.e. P(b | a ∧ p) = 1):
Modus Ponens: P(b | a ∧ p) = 1
Modus Tollens: P(¬a | ¬b ∧ p) = 1
and, more weakly, observing b makes a more plausible: P(a | b ∧ p) ≥ P(a | p)
43
Marginalization rule: ∑_X P(X ∧ Y) = P(Y)
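A worked toy example combining the conjunction and marginalization rules (the joint values are invented), in LaTeX:

\begin{aligned}
&P(x \wedge y) = 0.3,\quad P(x \wedge \neg y) = 0.2,\quad
  P(\neg x \wedge y) = 0.1,\quad P(\neg x \wedge \neg y) = 0.4\\
&\text{Marginalization: } P(x) = P(x \wedge y) + P(x \wedge \neg y) = 0.5\\
&\text{Conjunction: } P(y \mid x) = P(x \wedge y)/P(x) = 0.3/0.5 = 0.6
\end{aligned}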
44
Joint distribution and questions (1)
45
Joint distribution and questions (2)
3^(N+1) − 2^(N+1) possible questions
46
Joint distribution and questions (3)
With the N+1 variables of the spam model, each variable may be placed in the searched, known or free subset (3^(N+1) assignments); excluding those with an empty searched set (2^(N+1) of them) leaves 3^(N+1) − 2^(N+1) possible questions.
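A brute-force check of this count, assuming n variables (n = N + 1 in the spam model), each assigned to the searched, known or free subset, with at least one searched variable:

#include <iostream>

int main() {
  // Each of n variables goes to Search, Known or Free: 3^n assignments;
  // those where the Search subset is empty (2^n of them) are not questions.
  for (int n = 1; n <= 6; ++n) {
    long long pow3 = 1, pow2 = 1;
    for (int i = 0; i < n; ++i) { pow3 *= 3; pow2 *= 2; }
    long long count = 0;
    for (long long a = 0; a < pow3; ++a) {  // enumerate all assignments
      long long x = a;
      bool has_search = false;
      for (int i = 0; i < n; ++i) { if (x % 3 == 0) has_search = true; x /= 3; }
      if (has_search) ++count;
    }
    std::cout << "n=" << n << ": " << count
              << " questions, 3^n - 2^n = " << pow3 - pow2 << "\n";
  }
  return 0;
}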
47
Decomposition
48
Bayesian Network
49
Parametric Forms (1)
50
Parametric Forms (2)
51
Identification
52
Specification = Variables + Decomposition + Parametric Forms
Variables: the choice of the variables relevant for the problem
Decomposition: the expression of the joint probability distribution as a product of simpler distributions
Parametric Forms: the choice of a mathematical function for each of these distributions
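For instance, the spam program can be specified with the naive Bayes decomposition commonly used for this example (variable Spam plus one variable W_i per dictionary word), in LaTeX:

P(\mathit{Spam} \wedge W_0 \wedge \cdots \wedge W_{N-1} \mid d \wedge p)
  = P(\mathit{Spam} \mid d \wedge p)\,\prod_{i=0}^{N-1} P(W_i \mid \mathit{Spam} \wedge d \wedge p)

with, for example, P(Spam) a table of two values and each P(W_i | Spam) a table of word frequencies identified from the data d.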
53
Description = Specification + Identification
54
Questions (1)
55
Questions (2)
56
Questions (3)
57
Questions (4)
58
Bayesian Program = Description + Question
Program
  Description
    Specification (preliminary knowledge p)
      Variables
      Decomposition
      Parametric Forms or Recursive Questions
    Identification (experimental data d)
  Question (utilization)
59
Bayesian Program = Description + Question
60
Results: SpamSieve
61
Theoretical Basis
Content:
Definitions and notations
Inference rules
Bayesian program
Model specification
Model identification
Model utilization
62
Logical Propositions
Logical propositions are denoted by lowercase names: a, b, c
Usual logical operators: conjunction a ∧ b, disjunction a ∨ b, negation ¬a
63
Probability of Logical Proposition
We assume that to assign a probability to a given proposition a, it is necessary to have at least some preliminary knowledge, summed up by a proposition p. Of course, we will be interested in reasoning on the probabilities of the conjunctions, disjunctions and negations of propositions, denoted, respectively, by P(a ∧ b | p), P(a ∨ b | p) and P(¬a | p). We will also be interested in the probability of proposition a conditioned by both the preliminary knowledge p and some other proposition b: P(a | b ∧ p).
64
Normalization and Conjunction Postulates
Normalization postulate: P(a | p) + P(¬a | p) = 1
Conjunction postulate: P(a ∧ b | p) = P(a | p) P(b | a ∧ p) = P(b | p) P(a | b ∧ p)
Bayes rule follows directly from the conjunction postulate; Cox's theorem justifies these two postulates, which play for probabilistic inference the role the resolution principle plays for logical inference.
Why don't you take the disjunction rule as an axiom?
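Because it is a theorem: a derivation of the disjunction rule from the two postulates alone, in LaTeX:

\begin{aligned}
P(a \vee b \mid p) &= 1 - P(\neg a \wedge \neg b \mid p)\\
&= 1 - P(\neg a \mid p)\,P(\neg b \mid \neg a \wedge p)\\
&= 1 - P(\neg a \mid p)\,\bigl(1 - P(b \mid \neg a \wedge p)\bigr)\\
&= P(a \mid p) + P(\neg a \wedge b \mid p)\\
&= P(a \mid p) + P(b \mid p)\,P(\neg a \mid b \wedge p)\\
&= P(a \mid p) + P(b \mid p) - P(a \wedge b \mid p)
\end{aligned}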
65
Discrete Variables
Variables are denoted by names starting with an uppercase letter: X, Y, Z
By definition, a discrete variable X is a set of propositions x_1, …, x_n that are:
Mutually exclusive: x_i ∧ x_j is false for i ≠ j
Exhaustive: at least one of them is true
The cardinality of X is the number n of these propositions.
66
Variable Conjunction
The conjunction X ∧ Y of two variables is itself a variable: the set of propositions x_i ∧ y_j is mutually exclusive and exhaustive.
The disjunction of two variables is not a variable: the propositions x_i ∨ y_j are not mutually exclusive.
67
Conjunction rule: P(X ∧ Y | p) = P(X | p) P(Y | X ∧ p) = P(Y | p) P(X | Y ∧ p)
68
Normalization rule: ∑_X P(X | p) = 1
Proof: the propositions defining X are mutually exclusive and exhaustive, so exactly one of them is true; the result follows from the normalization and conjunction postulates.
69
Marginalization rule: ∑_X P(X ∧ Y | p) = P(Y | p)
Proof: ∑_X P(X ∧ Y | p) = ∑_X P(Y | p) P(X | Y ∧ p) = P(Y | p) ∑_X P(X | Y ∧ p) = P(Y | p), using the conjunction rule and then the normalization rule.
70
Contraction/Expansion rule
71
Rules
72
Description
The purpose of a description is to specify an effective method to compute a joint distribution on a set of variables {X_1, X_2, …, X_N}, given some preliminary knowledge p and a set of experimental data d. This joint distribution is denoted: P(X_1 ∧ X_2 ∧ … ∧ X_N | d ∧ p).
73
Decomposition
Partition the N variables into K subsets L_1, …, L_K.
Conjunction rule: P(X_1 ∧ … ∧ X_N | d ∧ p) = P(L_1 | d ∧ p) P(L_2 | L_1 ∧ d ∧ p) … P(L_K | L_{K-1} ∧ … ∧ L_1 ∧ d ∧ p)
Conditional independence: each factor P(L_k | L_{k-1} ∧ … ∧ L_1 ∧ d ∧ p) is simplified to P(L_k | R_k ∧ d ∧ p), where R_k is a subset of the variables appearing before L_k.
Decomposition: the joint distribution is then the product of these simpler distributions.
74
Parametric Forms or Recursive Questions
75
Question
Given a description, a question is obtained by partitioning the set of variables into three subsets: the searched variables, the known variables and the free variables. We define Search, Known and Free as the conjunctions of the variables belonging to these three subsets. The corresponding question is the distribution: P(Search | Known ∧ d ∧ p).
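For the spam program, for instance, the filtering question is P(Spam | W_0 ∧ … ∧ W_{N-1} ∧ d ∧ p): Search = {Spam}, Known = {W_0, …, W_{N-1}}, Free = ∅.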
76
Inference
P(Search | Known ∧ d ∧ p) = ∑_Free P(Search ∧ Free ∧ Known | d ∧ p) / P(Known | d ∧ p) = (1/Z) ∑_Free P(Search ∧ Free ∧ Known | d ∧ p), where Z is a normalization constant.
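A minimal sketch of this inference by exhaustive enumeration, on an invented three-variable joint table (X searched, Y known, Z free):

#include <iostream>

int main() {
  // Toy joint distribution P(X ∧ Y ∧ Z) over three binary variables,
  // stored as a table; the numbers are invented and sum to 1.
  double P[2][2][2] = {{{0.10, 0.05}, {0.05, 0.10}},
                       {{0.20, 0.10}, {0.10, 0.30}}};

  // Question: P(X | Y = 1), with Z as the free variable.
  // Sum the joint over Z, then normalize over X.
  double unnorm[2] = {0, 0};
  for (int x = 0; x < 2; ++x)
    for (int z = 0; z < 2; ++z)
      unnorm[x] += P[x][1][z];
  double Z = unnorm[0] + unnorm[1];  // normalization constant
  for (int x = 0; x < 2; ++x)
    std::cout << "P(X=" << x << " | Y=1) = " << unnorm[x] / Z << "\n";
  return 0;
}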
77
Two optimisation problems: summing over the free variables, and searching for the best values among the searched variables.
78
API and Inference Engine
int main() {
  // Variables
  plFloat read_time;
  plIntegerType id_type(0, 1);
  plFloat times[5] = {1, 2, 3, 5, 10};
  plSparseType time_type(5, times);
  plSymbol id("id", id_type);
  plSymbol time("time", time_type);

  // Parametric forms
  // Construction of P(id)
  plProbValue id_dist[2] = {0.75, 0.25};
  plProbTable P_id(id, id_dist);
  // Construction of P(time | id = john)
  plProbValue t_john_dist[5] = {20, 30, 10, 5, 2};
  plProbTable P_t_john(time, t_john_dist);
  // Construction of P(time | id = bill)
  plProbValue t_bill_dist[5] = {2, 6, 10, 40, 20};
  plProbTable P_t_bill(time, t_bill_dist);
  // Construction of P(time | id)
  plKernelTable Pt_id(time, id);
  plValues t_and_id(time ^ id);
  t_and_id[id] = 0;
  Pt_id.push(P_t_john, t_and_id);
  t_and_id[id] = 1;
  Pt_id.push(P_t_bill, t_and_id);

  // Decomposition: P(time ∧ id) = P(id) P(time | id)
  plJointDistribution jd(time ^ id, P_id * Pt_id);

  // Question: getting P(id | time)
  plCndKernel Pid_t;
  jd.ask(Pid_t, id, time);
  cout << "P(id | time) = " << Pid_t << "\n";

  // Read a time from the keyboard
  cout << "Time? : ";
  cin >> read_time;

  // Getting P(id | time = read_time)
  plKernel Pid_readTime;
  // ...
}

ProBT® - ProBAYES.com - Bayesian-Programming.org
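Note how the listing follows the Bayesian Program skeleton: the plSymbol declarations are the Variables, the plProbTable and plKernelTable objects are the Parametric Forms, plJointDistribution encodes the Decomposition P(time ∧ id) = P(id) P(time | id), and jd.ask() poses the Question P(id | time). The slide's code stops after declaring Pid_readTime; presumably the missing lines instantiate the conditional distribution at the value read from the keyboard, though the exact call depends on the ProBT version.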