Probabilistic Reasoning

Artificial Intelligence Chapter 13

Date: 23.04.25 ~ 23.04.31

Writer: 9tailwolf


Introduction


In this chapter, we learn about probabilistic reasoning. I introduce a way to represent relationships between random variables, and some inference algorithms.

The following are some probabilistic inference tasks.

  • Simple queries
  • Conjunctive queries
  • Optimal decisions
  • Value of information
  • Sensitivity analysis
  • Explanation


Bayesian Networks


A Bayesian network is a graph that represents the relationships between events. It has the following characteristics:

  • It is a directed graph.
  • It is an acyclic graph.
  • Each node has a conditional distribution given its parents: \(P(X_{i}\mid Parents(X_{i}))\).
  • A query is answered by summing, over all relevant cases, the product of the probabilities of the events.

CPT (conditional probability table): the table attached to each node that stores \(P(X_{i}\mid Parents(X_{i}))\).

Global Semantics

\(P(x_{1},x_{2},\dots,x_{n}) = \prod_{i=1}^{n} P(x_{i}\mid Parents(X_{i}))\)
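To make the global semantics concrete, here is a minimal sketch on a hypothetical two-node network Rain → WetGrass; the network and its CPT numbers are made up for illustration, not taken from the chapter.

```python
# Hypothetical CPTs: P(Rain) and P(WetGrass | Rain).
P_rain = {True: 0.2, False: 0.8}
P_wet = {True: {True: 0.9, False: 0.1},    # P(WetGrass | Rain=True)
         False: {True: 0.1, False: 0.9}}   # P(WetGrass | Rain=False)

def joint(rain, wet):
    # Global semantics: the full joint is the product of each node's
    # CPT entry given its parents.
    return P_rain[rain] * P_wet[rain][wet]

# The product over CPTs defines a proper joint distribution: it sums to 1.
total = sum(joint(r, w) for r in (True, False) for w in (True, False))
print(joint(True, True))  # 0.2 * 0.9 = 0.18
print(total)
```

The key point is that only one small CPT per node is stored, yet any full joint probability can be reconstructed from the product.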

Local Semantics

\(P(X_{i}\mid X_{1},\dots,X_{i-1}) = P(X_{i}\mid Parents(X_{i}))\)

That is, each node is conditionally independent of its other predecessors given its parents. This idea generalizes the Markov chain, which assumes that given the present, the future is independent of the past.


Enumeration Algorithm


Algorithm

def Enumeration-Ask(X, e, bn):
    Q(X) = vector()
    for x in values(X):
        Q(x) = Enumerate-All(bn.Vars, e + {X = x})
    return Normalize(Q(X))

def Enumerate-All(vars, e):
    if vars is empty:
        return 1
    Y = first(vars)
    if Y has value y in e:
        return P(y | parents(Y)) * Enumerate-All(rest(vars), e)
    return sum([P(y | parents(Y)) * Enumerate-All(rest(vars), e + {Y = y}) for y in values(Y)])
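The pseudocode above can be run on a small example. Below is a sketch on a hypothetical chain network Cloudy → Rain → WetGrass; the variable names and CPT numbers are invented for illustration. Evidence variables take their fixed value, hidden variables are summed out recursively, and the result is normalized.

```python
# Hypothetical CPTs, each a function of the partial assignment e
# giving P(var = True | parents as set in e).
CPTS = {
    "Cloudy":   lambda e: 0.5,
    "Rain":     lambda e: 0.8 if e["Cloudy"] else 0.2,
    "WetGrass": lambda e: 0.9 if e["Rain"] else 0.1,
}
VARS = ["Cloudy", "Rain", "WetGrass"]  # topological order

def prob(var, value, e):
    p_true = CPTS[var](e)
    return p_true if value else 1.0 - p_true

def enumerate_all(vars_, e):
    if not vars_:
        return 1.0
    Y, rest = vars_[0], vars_[1:]
    if Y in e:  # evidence (or already-assigned) variable: use its value
        return prob(Y, e[Y], e) * enumerate_all(rest, e)
    # hidden variable: sum over both of its values
    return sum(prob(Y, y, e) * enumerate_all(rest, {**e, Y: y})
               for y in (True, False))

def enumeration_ask(X, e):
    q = {x: enumerate_all(VARS, {**e, X: x}) for x in (True, False)}
    z = sum(q.values())
    return {x: p / z for x, p in q.items()}

print(enumeration_ask("Rain", {"WetGrass": True}))  # P(Rain=True | wet) ≈ 0.9
```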


Variable Elimination Algorithm


Algorithm

def Elimination-Ask(X, e, bn):
    factors = []
    for var in reverse(bn.Vars):
        factors = [make_factor(var, e)] + factors
        if var is a hidden variable:
            factors = sum_out(var, factors)
    return Normalize(pointwise_product(factors))
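The two core operations, pointwise product and summing out a variable, can be sketched with factors stored as plain dicts. The example below reuses the hypothetical chain Cloudy → Rain → WetGrass with invented CPT numbers, and eliminates the hidden variable Cloudy to answer P(Rain | WetGrass=True).

```python
from itertools import product

# A factor is (variable list, table mapping assignment tuples -> numbers).
def pointwise_product(f1, f2):
    v1, t1 = f1
    v2, t2 = f2
    vars_ = v1 + [v for v in v2 if v not in v1]
    table = {}
    for assign in product((True, False), repeat=len(vars_)):
        env = dict(zip(vars_, assign))
        table[assign] = (t1[tuple(env[v] for v in v1)] *
                         t2[tuple(env[v] for v in v2)])
    return (vars_, table)

def sum_out(var, f):
    vars_, t = f
    i = vars_.index(var)
    new_t = {}
    for assign, p in t.items():
        key = assign[:i] + assign[i + 1:]          # drop var's slot
        new_t[key] = new_t.get(key, 0.0) + p       # sum over its values
    return (vars_[:i] + vars_[i + 1:], new_t)

# Factors: P(Cloudy), P(Rain | Cloudy), and P(WetGrass=True | Rain)
# with the evidence already folded in (illustrative numbers).
f_c = (["Cloudy"], {(True,): 0.5, (False,): 0.5})
f_r = (["Rain", "Cloudy"], {(True, True): 0.8, (True, False): 0.2,
                            (False, True): 0.2, (False, False): 0.8})
f_w = (["Rain"], {(True,): 0.9, (False,): 0.1})

# Eliminate the hidden variable Cloudy, multiply in the evidence factor.
f = sum_out("Cloudy", pointwise_product(f_c, f_r))
f = pointwise_product(f, f_w)
z = sum(f[1].values())
posterior = {k[0]: v / z for k, v in f[1].items()}
print(posterior)  # P(Rain | WetGrass=True) ≈ {True: 0.9, False: 0.1}
```

Summing out early keeps intermediate factors small, which is exactly the advantage of variable elimination over plain enumeration.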


Inference by Stochastic Simulation


Prior Sampling

def Prior-Sample(bn):
    x = []
    for X in bn.Vars:
        x.append(a random sample from P(X | parents(X)))
    return x
Rejection Sampling

def Rejection-Sampling(X, e, bn, n):
    N = vector of zeros
    for i in range(n):
        x = Prior-Sample(bn)
        if x is consistent with e:
            N[x] += 1
    return Normalize(N)
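Here is a runnable sketch of rejection sampling, again on the hypothetical chain Cloudy → Rain → WetGrass with invented CPT numbers: draw prior samples, throw away those inconsistent with the evidence, and count the rest.

```python
import random

def prior_sample(rng):
    # Sample each variable from its CPT given its sampled parents.
    cloudy = rng.random() < 0.5
    rain = rng.random() < (0.8 if cloudy else 0.2)
    wet = rng.random() < (0.9 if rain else 0.1)
    return cloudy, rain, wet

def rejection_sample_rain_given_wet(n, seed=0):
    rng = random.Random(seed)
    counts = {True: 0, False: 0}
    for _ in range(n):
        cloudy, rain, wet = prior_sample(rng)
        if wet:                 # keep only samples consistent with evidence
            counts[rain] += 1
    return counts[True] / (counts[True] + counts[False])

print(rejection_sample_rain_given_wet(100_000))  # ≈ 0.9
```

Note the weakness this illustrates: every rejected sample is wasted work, so the method degrades badly when the evidence is unlikely.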
Likelihood Weighting

def Likelihood-Weighting(X, e, bn, n):
    W = vector of zeros
    for i in range(n):
        x, w = Weighted-Sample(bn, e)
        W[x] += w
    return Normalize(W)

def Weighted-Sample(bn, e):
    w = 1
    x = an event with values fixed by e
    for X in bn.Vars:
        if X is an evidence variable with value x_e in e:
            w *= P(x_e | parents(X))
        else:
            x[X] = a random sample from P(X | parents(X))
    return x, w
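A runnable sketch of likelihood weighting on the same hypothetical chain (invented numbers): evidence variables are never sampled; instead their likelihood is multiplied into the sample's weight, so no sample is ever rejected.

```python
import random

def weighted_sample(rng):
    w = 1.0
    cloudy = rng.random() < 0.5
    rain = rng.random() < (0.8 if cloudy else 0.2)
    # WetGrass=True is evidence: do not sample it; fold its
    # likelihood P(WetGrass=True | Rain) into the weight.
    w *= 0.9 if rain else 0.1
    return rain, w

def likelihood_weighting(n, seed=0):
    rng = random.Random(seed)
    W = {True: 0.0, False: 0.0}
    for _ in range(n):
        rain, w = weighted_sample(rng)
        W[rain] += w
    return W[True] / (W[True] + W[False])

print(likelihood_weighting(100_000))  # ≈ 0.9
```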
Gibbs Sampling

def Gibbs-Ask(X, e, bn, n):
    Z = the nonevidence variables of bn
    x = a copy of e, with the variables in Z initialized randomly
    N = vector of zeros
    for j in range(n):
        for z in Z:
            x[z] = a random sample from P(z | mb(z))  # mb(z): Markov blanket of z
            N[x[X]] += 1
    return Normalize(N)
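The loop above can be sketched concretely on the hypothetical chain Cloudy → Rain → WetGrass with WetGrass=True as evidence (CPT numbers invented). Each nonevidence variable is resampled from its conditional given its Markov blanket, and the states visited are counted.

```python
import random

def p_rain_given_blanket(cloudy):
    # P(Rain | Cloudy, WetGrass=True) ∝ P(Rain | Cloudy) * P(WetGrass=True | Rain)
    pt = (0.8 if cloudy else 0.2) * 0.9
    pf = (0.2 if cloudy else 0.8) * 0.1
    return pt / (pt + pf)

def p_cloudy_given_blanket(rain):
    # P(Cloudy | Rain) ∝ P(Cloudy) * P(Rain | Cloudy); WetGrass is not
    # in Cloudy's Markov blanket, so it drops out.
    pt = 0.5 * (0.8 if rain else 0.2)
    pf = 0.5 * (0.2 if rain else 0.8)
    return pt / (pt + pf)

def gibbs_rain_given_wet(n, seed=0):
    rng = random.Random(seed)
    cloudy, rain = True, True            # arbitrary initial state
    count_rain = 0
    for _ in range(n):
        rain = rng.random() < p_rain_given_blanket(cloudy)
        cloudy = rng.random() < p_cloudy_given_blanket(rain)
        count_rain += rain
    return count_rain / n

print(gibbs_rain_given_wet(100_000))  # ≈ 0.9
```

Unlike the previous two methods, successive samples here are correlated; the chain still converges to the true posterior as n grows.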