Docsity
Uncertainty and Probabilistic Reasoning - Artificial Intelligence - Lecture Slides, Slides of Artificial Intelligence

Some concepts covered in this Artificial Intelligence course are Agents and Problem Solving, Autonomy, Programs, Classical and Modern Planning, First-Order Logic, Resolution Theorem Proving, Search Strategies, and Structure Learning. The main points of this lecture are: Uncertainty and Probabilistic Reasoning, Graphical Models Preliminaries, Conditional Independence, Bayesian (Belief) Networks as acyclic directed graph models, Vertices, Edges, and the Markov Condition.

Typology: Slides

2012/2013

Uploaded on 04/29/2013 by shantii

Lecture 28 of 41
Uncertainty and Probabilistic Reasoning:
Graphical Models Preliminaries


Graphical Models of Probability

P(20s, Female, Low, Non-Smoker, No-Cancer, Negative, Negative)
  = P(T) · P(F) · P(L | T) · P(N | T, F) · P(N | L, N) · P(N | N) · P(N | N)

  • Conditional Independence
    – X is conditionally independent (CI) of Y given Z iff P(X | Y, Z) = P(X | Z) for all values of X, Y, and Z
    – Example: P(Thunder | Rain, Lightning) = P(Thunder | Lightning)
  • Bayesian (Belief) Network
    – Acyclic directed graph model B = (V, E, Θ) representing CI assertions
    – Vertices (nodes) V: denote events (each a random variable)
    – Edges (arcs, links) E: denote conditional dependencies
  • Markov Condition for BBNs (Chain Rule):
    P(X1, X2, ..., Xn) = Π(i = 1..n) P(Xi | parents(Xi))
  • Example BBN: the lung-cancer network shown below

[Figure: example BBN. Non-descendants: Age (X1), Gender (X2). Parents: Exposure-To-Toxins (X3), Smoking (X4). Node: Cancer (X5). Descendants: Serum Calcium (X6), Lung Tumor (X7).]
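The Markov condition above can be evaluated directly: the joint probability of a full assignment is the product of each node's conditional probability given its parents. A minimal sketch over the lecture's example network; the CPT numbers below are invented for illustration, not values from the slides:

```python
# Chain-rule evaluation for a Bayesian network:
# P(x1, ..., xn) = prod_i P(xi | parents(xi))  (Markov condition).
# Structure follows the lecture's example BBN; all probabilities are invented.

parents = {
    "Age": [], "Gender": [],
    "ExposureToToxins": ["Age"],
    "Smoking": ["Age", "Gender"],
    "Cancer": ["ExposureToToxins", "Smoking"],
    "SerumCalcium": ["Cancer"],
    "LungTumor": ["Cancer"],
}

# CPT entries keyed by (node value, tuple of parent values) -> probability.
# Only the entries needed for this one assignment are listed (illustrative only).
cpt = {
    "Age": {("20s", ()): 0.3},
    "Gender": {("Female", ()): 0.5},
    "ExposureToToxins": {("Low", ("20s",)): 0.8},
    "Smoking": {("NonSmoker", ("20s", "Female")): 0.7},
    "Cancer": {("NoCancer", ("Low", "NonSmoker")): 0.95},
    "SerumCalcium": {("Negative", ("NoCancer",)): 0.9},
    "LungTumor": {("Negative", ("NoCancer",)): 0.98},
}

def joint_probability(assignment):
    """Multiply P(node = value | parent values) over every node."""
    p = 1.0
    for node, value in assignment.items():
        parent_vals = tuple(assignment[q] for q in parents[node])
        p *= cpt[node][(value, parent_vals)]
    return p

full = {"Age": "20s", "Gender": "Female", "ExposureToToxins": "Low",
        "Smoking": "NonSmoker", "Cancer": "NoCancer",
        "SerumCalcium": "Negative", "LungTumor": "Negative"}
print(joint_probability(full))  # product of the seven CPT entries above
```

This mirrors the factorization P(20s, Female, Low, Non-Smoker, No-Cancer, Negative, Negative) on the slide, one CPT lookup per node.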


Semantics of Bayesian Networks


Markov Blanket
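The body of this slide is an image and is not preserved in the text, but the standard definition is: the Markov blanket of a node X is its parents, its children, and its children's other parents, and X is conditionally independent of every other node given its blanket. A minimal sketch, reusing the example network's structure from the earlier slide:

```python
def markov_blanket(node, parents):
    """Markov blanket of `node` in a DAG given as a child -> parent-list map:
    parents, children, and the children's other parents ("spouses")."""
    children = [c for c, ps in parents.items() if node in ps]
    blanket = set(parents[node]) | set(children)
    for c in children:                  # spouses: other parents of each child
        blanket |= set(parents[c])
    blanket.discard(node)
    return blanket

# DAG from the lecture's example BBN.
parents = {
    "Age": [], "Gender": [],
    "ExposureToToxins": ["Age"],
    "Smoking": ["Age", "Gender"],
    "Cancer": ["ExposureToToxins", "Smoking"],
    "SerumCalcium": ["Cancer"],
    "LungTumor": ["Cancer"],
}
print(markov_blanket("Cancer", parents))
# parents {ExposureToToxins, Smoking} plus children {SerumCalcium, LungTumor};
# Cancer's children have no other parents, so no spouses are added here
```

For a node like Age, the blanket also picks up Gender, because Gender is a co-parent of Smoking.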


Example: Evidential Reasoning for Car Diagnosis


BNJ Core [1]: Design


BNJ Graphical User Interface: Network

© 2004 KSU BNJ Development Team

ALARM Network

BNJ Visualization [1]: Framework


BNJ Visualization [3]: Network


Poker Network

Current Work: Features in Progress

  • Scalability
    – Large networks (50+ vertices, 10+ parents)
    – Very large data sets (10^6+)
  • Other Visualizations
    – K2 for structure learning
    – Conditioning
  • BNJ v1-2 ports
    – Guo's dissertation algorithms
  • Lazy Evaluation
    – Importance sampling (CABeN)

[Figure: Barley network]
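The CABeN sampler itself is not reproduced here, but the importance-sampling idea the slide refers to can be illustrated with likelihood weighting on a toy two-node network. All names and numbers below are invented for the illustration:

```python
import random

# Likelihood weighting (a form of importance sampling) on a toy network
# Rain -> WetGrass. Estimate P(Rain = True | WetGrass = True):
# sample non-evidence nodes from their priors, weight each sample by the
# likelihood of the observed evidence under that sample.

P_RAIN = 0.2
P_WET_GIVEN_RAIN = {True: 0.9, False: 0.1}  # P(WetGrass=True | Rain)

def estimate_p_rain_given_wet(n=100_000, seed=0):
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        rain = rng.random() < P_RAIN        # sample Rain from its prior
        w = P_WET_GIVEN_RAIN[rain]          # weight by evidence likelihood
        den += w
        if rain:
            num += w
    return num / den

print(estimate_p_rain_given_wet())
# exact posterior: 0.2*0.9 / (0.2*0.9 + 0.8*0.1) = 0.18/0.26 ≈ 0.692
```

Unlike rejection sampling, no sample is discarded; low-likelihood samples simply contribute small weights, which is what makes the approach attractive for large networks.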

Summary Points

  • Introduction to Probabilistic Reasoning
    – Framework: using probabilistic criteria to search H
    – Probability foundations
      • Definitions: subjectivist, objectivist; Bayesian, frequentist, logicist
      • Kolmogorov axioms
  • Bayes's Theorem
    – Definition of conditional (posterior) probability
    – Product rule
  • Maximum A Posteriori (MAP) and Maximum Likelihood (ML) Hypotheses
    – Bayes's Rule and MAP
    – Uniform priors: allow use of MLE to generate MAP hypotheses
    – Relation to version spaces, candidate elimination
  • Next Week: Chapter 14, Russell and Norvig
    – Later: Bayesian learning: MDL, BOC, Gibbs, Simple (Naïve) Bayes
    – Categorizing text and documents, other applications
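The MAP/ML relationship in the summary can be made concrete: by Bayes's rule, P(h | D) ∝ P(D | h) P(h), so with a uniform prior the posterior is proportional to the likelihood and the MAP hypothesis coincides with the MLE. A small sketch; the candidate coin biases and observations are invented for illustration:

```python
# MAP vs. ML over a discrete hypothesis space of coin biases.
# With a uniform prior, argmax_h P(h | D) == argmax_h P(D | h),
# since P(D) is constant and P(h) is the same for every h.

hypotheses = [0.1, 0.3, 0.5, 0.7, 0.9]           # candidate P(heads)
data = ["H", "H", "T", "H", "H", "H", "T", "H"]  # invented observations

def likelihood(h, data):
    """P(D | h) for i.i.d. coin flips with bias h."""
    p = 1.0
    for flip in data:
        p *= h if flip == "H" else (1.0 - h)
    return p

def map_hypothesis(prior):
    """argmax_h P(h) * P(D | h), proportional to the posterior."""
    return max(hypotheses, key=lambda h: prior(h) * likelihood(h, data))

uniform = lambda h: 1.0 / len(hypotheses)
mle = max(hypotheses, key=lambda h: likelihood(h, data))
assert map_hypothesis(uniform) == mle  # uniform prior: MAP reduces to MLE
print(mle)  # 0.7 maximizes h^6 * (1-h)^2 among the candidates (6 heads, 2 tails)
```

A non-uniform prior that favors fair coins (larger P(h) near 0.5) can pull the MAP estimate away from the MLE, which is exactly the version-space intuition the bullet alludes to.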