Talk:Gibbs algorithm

This article is rated Stub-class on Wikipedia's content assessment scale.
It is of interest to the following WikiProjects:
WikiProject Physics (Low-importance)
This article is within the scope of WikiProject Physics, a collaborative effort to improve the coverage of physics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This article has been rated as Low-importance on the project's importance scale.

Gibbs measure

This page mentions two things: the Gibbs algorithm and the Gibbs distribution. In my opinion, both are important and should be treated separately. According to various Markov random field literature, the Gibbs distribution takes the form:

P(f) = Z^{-1} \times e^{-\frac{1}{T} U(f)}

where

Z = \sum_{f \in F} e^{-\frac{1}{T} U(f)}

is a normalizing factor, T is a constant called the temperature, and U(f) is an energy function. For a specific choice of U(f) (a quadratic energy), this leads to the normal (Gaussian) distribution.
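For concreteness, a minimal numerical sketch of this construction (the finite configuration set F, the quadratic energy U, and T = 1 below are illustrative choices of mine, not anything from the article):

import numpy as np

# Hypothetical finite configuration set F: a grid of scalar values f.
F = np.linspace(-4.0, 4.0, 81)

def U(f):
    # Illustrative quadratic energy; with T = 1 this yields a discretised
    # Gaussian shape, matching the remark above about the normal distribution.
    return 0.5 * f**2

T = 1.0
weights = np.exp(-U(F) / T)
Z = weights.sum()   # normalising factor Z = sum over f in F of exp(-U(f)/T)
P = weights / Z     # Gibbs distribution P(f) = Z^{-1} exp(-U(f)/T)

assert np.isclose(P.sum(), 1.0)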

Maybe the Gibbs distribution should redirect to Gibbs_measure. (Unsigned, User:146.50.1.141, January 2007)

Now fixed. linas (talk) 21:16, 30 August 2008 (UTC)

Gibbs Algorithm vs Gibbs Sampler

This article states that the Gibbs algorithm is different from the Gibbs sampler. But I have encountered various interpretations of Markov random fields in terms of maximizing the entropy, which is often defined as

H = -\sum_{i} p_{i} \log p_{i}

This would make the Gibbs algorithm probably a special case of Markov chain Monte Carlo iteration. For an interpretation of Markov random fields in terms of entropy, see for example here [1], on pages 5-6. (Unsigned, User:146.50.1.141, January 2007)
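To make the MCMC connection concrete, here is a minimal Gibbs-sampler sketch for a toy target distribution (the bivariate normal target, the correlation 0.8, and the iteration count are illustrative assumptions of mine):

import numpy as np

rng = np.random.default_rng(0)
rho = 0.8          # correlation of an illustrative bivariate normal target
x, y = 0.0, 0.0
samples = []

# Gibbs sampling: alternately draw each variable from its conditional.
# For this target, x | y ~ N(rho * y, 1 - rho^2) and symmetrically for y.
for _ in range(10_000):
    x = rng.normal(rho * y, np.sqrt(1 - rho**2))
    y = rng.normal(rho * x, np.sqrt(1 - rho**2))
    samples.append((x, y))

samples = np.array(samples)
print(np.corrcoef(samples[:, 0], samples[:, 1])[0, 1])  # close to 0.8

Each update only needs a conditional distribution, which is exactly the Markov-chain iteration structure the comment above alludes to.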


On a similar note, the article states that H is the 'average log probability'. The expression given (the entropy) is the negative of that quantity. This makes the language about 'minimising the average log probability' confusing, since we should actually be maximising it. In summary, I think there is a sign error. — Preceding unsigned comment added by 62.25.109.195 (talk) 11:54, 18 December 2013 (UTC)
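For what it is worth, the sign relation can be checked from the standard definitions (the symbol \bar{\ell} for the average log probability is my own notation, not the article's):

\bar{\ell} = \sum_i p_i \log p_i = \mathbb{E}[\log p] \le 0,
\qquad
H = -\sum_i p_i \log p_i = -\bar{\ell} \ge 0,

so minimising \bar{\ell} is the same as maximising H, consistent with the suspected sign error.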
