
Bayesian Inference Algorithms Revisited


1 Bayesian Inference Algorithms Revisited

2 Inference

3 Two optimization problems

4 Symbolic Simplification

5 Exact symbolic simplification (example)

6 Question dependent: 9×10⁶

7 Reordering

8 Applying normalization

9 Factorizing

10 Result (1) = 40

11 Result (2): (21×10) + 9 + 1 = 220

12 Summary: Reorder, Normalize, Factorize
SPI: Symbolic Probabilistic Inference; SRA: Successive Restriction Algorithm
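To make the reorder/normalize/factorize summary concrete, here is a minimal sketch, on a hypothetical three-variable chain (not the slides' worked example), of how pushing sums inward cuts the cost of a marginal:

import numpy as np

# Hypothetical 3-variable chain C -> B -> A, each variable taking k values.
k = 10
rng = np.random.default_rng(0)
p_c = rng.dirichlet(np.ones(k))                 # P(C)
p_b_c = rng.dirichlet(np.ones(k), size=k).T     # P(B | C), shape (B, C)
p_a_b = rng.dirichlet(np.ones(k), size=k).T     # P(A | B), shape (A, B)

# Unordered form: sum P(A|B) P(B|C) P(C) over the full joint, ~k^3 terms.
naive = np.einsum('ab,bc,c->a', p_a_b, p_b_c, p_c)

# Reordered and factorized: push each sum inward, ~k^2 terms per step.
p_b = p_b_c @ p_c            # sum over C first
factored = p_a_b @ p_b       # then sum over B

assert np.allclose(naive, factored)

The two results are equal, but the factored order replaces an O(k³) sum with two O(k²) ones, which is the kind of saving the slides' operation counts illustrate.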

13 Question independent

14 Sharing parts (1)

15 Sharing parts (2)

16 Sharing parts (3)

17 Sharing parts (4)

18 Sharing parts (5)

19 Sharing parts (6)

20 Sharing parts (7)

21 Message passing algorithms
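The "Sharing parts" sequence is what message passing formalizes: on a chain, forward and backward messages are computed once and every marginal query reuses them. A minimal sum-product sketch, with a hypothetical chain and pairwise potentials (assumptions, not the slides' model):

import numpy as np

# Sum-product message passing on a chain X1 - X2 - ... - Xn with pairwise
# potentials phi[i] between X_i and X_{i+1} (toy model for illustration).
n, k = 5, 3
rng = np.random.default_rng(1)
phi = [rng.random((k, k)) for _ in range(n - 1)]

# Forward and backward messages are computed once...
fwd = [np.ones(k)]
for p in phi:
    fwd.append(fwd[-1] @ p)        # message arriving at X_{i+1} from the left
bwd = [np.ones(k)]
for p in reversed(phi):
    bwd.append(p @ bwd[-1])        # message arriving at X_i from the right
bwd.reverse()

# ...and shared by every marginal query: P(X_i) is proportional to fwd[i] * bwd[i].
marginals = [f * b / (f * b).sum() for f, b in zip(fwd, bwd)]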

22 Example 2

23 Question dependent

24 Junction Tree Algorithm

25 Cut-Set Algorithm

26 Max-Product & Min-Sum Algorithms
No free variables

27 Viterbi Algorithm
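A hedged sketch of the Viterbi algorithm named on the slide: max-product run in log space (equivalently min-sum on negated log-probabilities) for a generic HMM. The model matrices and observations below are assumptions for illustration, not the slide's example.

import numpy as np

def viterbi(obs, log_pi, log_A, log_B):
    """Most probable hidden-state path of an HMM, max-product in log space."""
    T, S = len(obs), log_pi.shape[0]
    back = np.zeros((T, S), dtype=int)      # argmax bookkeeping per step
    score = log_pi + log_B[:, obs[0]]       # best log-prob ending in each state
    for t in range(1, T):
        cand = score[:, None] + log_A       # every previous-state choice
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0) + log_B[:, obs[t]]
    path = [int(score.argmax())]            # backtrack from the best end state
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Toy usage: two states, two symbols.
log_pi = np.log([0.6, 0.4])
log_A = np.log([[0.7, 0.3], [0.4, 0.6]])
log_B = np.log([[0.9, 0.1], [0.2, 0.8]])
print(viterbi([0, 1, 0], log_pi, log_A, log_B))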

28 Approximate symbolic simplification: Variational methods

29 Crunching numbers: Sampling methods
Monte Carlo (MC): importance sampling, rejection sampling. Markov Chain Monte Carlo (MCMC): Metropolis sampling, Gibbs sampling.
Reference: MacKay, D. J. C. (2003), Information Theory, Inference, and Learning Algorithms, Cambridge University Press.
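A minimal sketch of Metropolis sampling, one of the MCMC variants listed above. The Gaussian random-walk proposal and the toy target are assumptions for illustration:

import numpy as np

def metropolis(log_target, x0, n_steps, step=0.5, seed=0):
    rng = np.random.default_rng(seed)
    x, lp = x0, log_target(x0)
    samples = []
    for _ in range(n_steps):
        x_new = x + step * rng.standard_normal()   # random-walk proposal
        lp_new = log_target(x_new)
        if np.log(rng.random()) < lp_new - lp:     # accept with prob min(1, ratio)
            x, lp = x_new, lp_new
        samples.append(x)
    return np.array(samples)

# Toy usage: draws whose histogram approximates a standard normal.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_steps=10_000)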

30 Bayesian Learning Revisited

31 Data and Preliminary knowledge

32 Using Preliminary Knowledge
How to deal with data? Using preliminary knowledge.

33 Direct problem: Inverse problem:

34 Bernoulli's Urn (1): Variables (Draw), Decomposition, Parametric Form
Preliminary knowledge p: "We draw from an urn containing w white balls and b black balls."

35 Bernoulli's Urn (2): Variables, Decomposition, Parametric Form, Note

36 Bernoulli's Urn (3)

37 Parameters Identification
Variables: Decomposition:
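As a hedged illustration of parameter identification in the urn example: with a uniform prior on the proportion of white balls, conjugate Beta updating gives the posterior directly (the draws below are made up):

import numpy as np

# Infer the urn's proportion theta of white balls from observed draws,
# with a uniform Beta(1, 1) prior (conjugate updating).
draws = np.array([1, 0, 1, 1, 0, 1, 1])      # 1 = white, 0 = black (made up)
n_white = int(draws.sum())
n_black = int((draws == 0).sum())

alpha, beta = 1 + n_white, 1 + n_black       # posterior is Beta(alpha, beta)
posterior_mean = alpha / (alpha + beta)      # = (n_white + 1) / (N + 2),
                                             # Laplace's succession law
print(posterior_mean)                        # 0.667 for these draws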

38 Model Selection Variables: Decomposition:
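The decomposition itself is not reproduced in this transcript; in standard Bayesian form (notation assumed here: M a model, θ its parameters), model selection compares models by their posterior,

P(M \mid D) \;\propto\; P(M)\, P(D \mid M), \qquad P(D \mid M) \;=\; \int P(D \mid \theta, M)\, P(\theta \mid M)\, d\theta.

The Wolf's dice slides below pose their inverse problem as this kind of comparison between the hypotheses H1 and H2.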

39 Summary

40 Entropy Principles
Content: Entropy Principle Statement; Frequencies and Laplace Succession Law; Observables and Exponential Laws; Wolf's Dice

41 Entropy Principle Statement
Flips of a coin: 9553 heads. What is the probability distribution of the coin?

42 Observables and Exponential Laws
Constraint levels; Maximum Entropy Distribution (proof); Partition Function; Constraints differential equation
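In standard notation (assumed here: observables f_j with constraint levels F_j), these bullets correspond to

\max_p \; H(p) = -\sum_x p(x)\ln p(x) \quad \text{subject to} \quad \sum_x p(x)\, f_j(x) = F_j,

whose solution is the exponential law

p(x) = \frac{1}{Z(\lambda)} \exp\!\Big(-\sum_j \lambda_j f_j(x)\Big), \qquad Z(\lambda) = \sum_x \exp\!\Big(-\sum_j \lambda_j f_j(x)\Big),

with the multipliers fixed by the constraint equations -\,\partial \ln Z / \partial \lambda_j = F_j.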

43 20 000 Flips
Observable; Constraint levels; Maximum Entropy Distribution; Partition Function; Constraints differential equation
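A numeric sketch of this example, assuming (as the two slide titles suggest, though the transcript does not state it) that the 9553 heads of slide 41 came from these 20 000 flips:

import numpy as np
from scipy.optimize import brentq

# Assumed constraint level: E[heads] = 9553 / 20000.
m = 9553 / 20000

# Maximum entropy on {heads, tails} with observable f(heads)=1, f(tails)=0:
# p(heads) = exp(-lam) / Z(lam),  Z(lam) = 1 + exp(-lam).
def constraint(lam):
    return np.exp(-lam) / (1.0 + np.exp(-lam)) - m   # -d ln Z/d lam - m

lam = brentq(constraint, -10.0, 10.0)                # solve the constraint
assert np.isclose(lam, np.log((1 - m) / m))          # closed form agrees
# The maxent distribution reproduces the observed frequency: p(heads) = m ≈ 0.478.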


45 Frequencies and Laplace Succession Law
Preliminary Knowledge:
1. Each of the digits is a number.
2. The data come from the same phenomenon.
3. A single variable V has been observed a certain number of times.
4. The order of these observations is not relevant.
5. The variable V may take 6 different values.
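Under these five assumptions, Laplace's succession law for a variable with 6 possible values takes the standard form (n_i is the number of observations of value v_i among the N in D):

P(V = v_i \mid D) = \frac{n_i + 1}{N + 6}.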

46 Wolf's dice (1). H1 hypothesis: the excavations shifted the center of gravity.

47 Wolf's dice (2). H2 hypothesis: the die is oblong along the 1-6 direction, and the excavations shifted the center of gravity.

48 Wolf's dice (3) Inverse Problem:

49 Theoretical Basis. Objective: justify the use of the entropy function H.
Content: What is a good representation? Combinatorial justification; information-theoretic justification; Bayesian justification; axiomatic justification; entropy concentration theorem justification

50 What is a Good Representation?

51 Combinatorial Justification
Statistical mechanics ↔ probabilistic inference: q microscopic states ↔ q propositions; macroscopic state ↔ distribution
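The correspondence works because the number of microscopic states realizing the frequencies f_i = n_i / N is the multinomial coefficient, whose log-rate tends to the entropy by Stirling's approximation:

W = \frac{N!}{n_1!\, n_2! \cdots n_q!}, \qquad \frac{1}{N}\ln W \;\longrightarrow\; -\sum_{i=1}^{q} f_i \ln f_i = H(f) \quad (N \to \infty).

So the maximum entropy distribution is the macroscopic state realizable in the greatest number of microscopic ways.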

52 Entropy Principles: Preliminary Knowledge
"Exchangeability" preliminary knowledge: D has no meaningful order; each experiment in D is independent of the others given the model and its parameters; each experiment in D corresponds to a single phenomenon.

53 Maximum Entropy for Frequencies
Variables; Decomposition; Proof

54 Minimum Cross-Entropy with Observed Frequencies
Variables; Decomposition
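In the standard form of this principle (notation assumed: prior p, observables and constraint levels as above), one minimizes the cross-entropy (Kullback-Leibler divergence) to the prior:

q^\star = \arg\min_q \sum_x q(x)\ln\frac{q(x)}{p(x)} \quad \text{s.t.} \quad \sum_x q(x)\, f_j(x) = F_j, \qquad q^\star(x) = \frac{p(x)}{Z(\lambda)} \exp\!\Big(-\sum_j \lambda_j f_j(x)\Big).

With a uniform prior p, this reduces to the maximum entropy principle of the previous slide.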

55 Shannon’s justification
Shannon, C. E. (1948), "A Mathematical Theory of Communication", Bell System Technical Journal, 27. Reprinted as Shannon, C. E. & Weaver, W. (1949), The Mathematical Theory of Communication, University of Illinois Press, Urbana.

56 Shore’s Axiomatic Justification
Shore, J. E. & Johnson, R. W. (1980), "Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy", IEEE Transactions on Information Theory, IT-26.

57 Entropy Concentration Theorem
Jaynes, E. T. (1982), "On the Rationale of Maximum-Entropy Methods", Proceedings of the IEEE, 70. Robert, C. (1990), "An Entropy Concentration Theorem: Applications in Artificial Intelligence and Descriptive Statistics", Journal of Applied Probability.

58 Probabilistic Reasoning and Decision Making in Sensory-Motor Systems
Want to know more? Bayesian-Programming.org; Probabilistic Reasoning and Decision Making in Sensory-Motor Systems, Springer, STAR series.


