Bayesian optimization
     

riskPremium


Total Posts: 21
Joined: Nov 2018
 
Posted: 2019-10-18 12:45
Hi,

BO is used by some people for (hyper)parameter tuning when the cost function is expensive to evaluate or the parameter space is large.

Suppose that in my case the evaluations are inexpensive and a grid search is computationally feasible. Is there any reason I should use BO instead of grid search?
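
To make the comparison concrete, here is a toy sketch of what I mean (my own made-up example, not from any library's docs): an exhaustive grid search next to a simple GP-based BO loop built on scikit-learn's GaussianProcessRegressor with an expected-improvement rule. The objective, grid and evaluation budget are arbitrary.

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # stand-in for a cheap cross-validation loss
    return np.sin(3.0 * x) + 0.3 * x ** 2

grid = np.linspace(-3.0, 3.0, 61)

# Grid search: evaluate every point; fine when each evaluation is cheap.
grid_vals = np.array([objective(x) for x in grid])
print("grid search best:", grid[grid_vals.argmin()], grid_vals.min())

# Bayesian optimization: a GP surrogate plus expected improvement,
# spending far fewer evaluations of the objective.
rng = np.random.default_rng(0)
X = list(rng.choice(grid, size=3, replace=False))  # initial design
y = [objective(x) for x in X]
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-6, normalize_y=True)

for _ in range(10):  # ten more evaluations
    gp.fit(np.array(X).reshape(-1, 1), np.array(y))
    mu, sigma = gp.predict(grid.reshape(-1, 1), return_std=True)
    sigma = np.maximum(sigma, 1e-12)
    best = min(y)
    z = (best - mu) / sigma
    ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)  # expected improvement (minimization form)
    x_next = grid[ei.argmax()]
    X.append(x_next)
    y.append(objective(x_next))

print("BO best after", len(y), "evaluations:", X[int(np.argmin(y))], min(y))

The point is just the evaluation count: the grid run calls the objective 61 times, the BO loop 13 times. When each call is cheap and the grid is exhaustive anyway, the extra machinery does not obviously buy anything, which is why I am asking.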

nikol


Total Posts: 850
Joined: Jun 2005
 
Posted: 2019-10-18 19:41
I guess it is the same.

nikol


Total Posts: 850
Joined: Jun 2005
 
Posted: 2019-10-19 15:49
I don't like leaving an embarrassing "I guess" hanging. That was just my intuition speaking.

Here it is in full:

P(H) is your prior over the grid of model parameters. Consider it flat, because you scan the parameter space with equal probability.
P(Data|H) is the usual likelihood of your Data under the model at each point of the grid.

As a result you get

P(H|Data) = P(Data|H)*P(H)

which is the probability of a certain set of model parameters given the observed Data.

Above I assumed P(Data)=1, but in general the normalization is

P(Data) = Sum_i( P(Data|H_i)*P(H_i) )

which need not be 1, for example if you already have some idea about the P(H) distribution from a previous step and now you have new data. So it becomes

P(H_i|Data) = P(Data|H_i)*P(H_i) / Sum_i( P(Data|H_i)*P(H_i) )
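
To make the arithmetic concrete, here is a small toy sketch of that normalization over a grid (the Gaussian likelihood and the flat prior are only assumptions for illustration):

import numpy as np

# toy data assumed to come from N(theta, 1); flat prior over a grid of theta
data = np.array([0.8, 1.1, 0.9, 1.3])
theta_grid = np.linspace(-2.0, 3.0, 101)                  # the hypotheses H_i
prior = np.full_like(theta_grid, 1.0 / theta_grid.size)   # flat P(H_i)

# P(Data|H_i): Gaussian likelihood of the data at each grid point
log_lik = np.array([-0.5 * np.sum((data - t) ** 2)
                    - 0.5 * data.size * np.log(2.0 * np.pi)
                    for t in theta_grid])
lik = np.exp(log_lik)

# P(H_i|Data) = P(Data|H_i)*P(H_i) / Sum_i(P(Data|H_i)*P(H_i))
posterior = lik * prior / np.sum(lik * prior)

print("posterior mode:", theta_grid[posterior.argmax()])
print("posterior sums to:", posterior.sum())

The point is that the grid itself plays the role of the prior, and the normalizing sum runs over the same grid you would scan anyway.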