College of Business and Economics, Department of Applied Statistics


Notices

Title: Research Center for Data Science Seminar Announcement
Author: Jeonghun Park | Date: 2018-10-17 | Views: 10,336
 
 
µ¥ÀÌÅÍ°úÇבּ¸¼Ò ¼¼¹Ì³ª °ø°í
 
 
 
 
 
 
■ Seminar Information
▶ Date & Time: Thursday, October 25, 2018, 4:30–5:30 PM
▶ Venue: Chung-Ang University, College of Business and Economics, Building 310, Room 410
▶ Speaker: Prof. Youngdeok Hwang (Department of Statistics, Sungkyunkwan University)
▶ Topic:
 
Bayesian Sequential Optimization for Continuous Inputs with Finite Decision Space
Abstract

Optimization using stochastic computer experiments is commonplace in engineering and industry. This article addresses an optimization problem in which the input space of the stochastic computer model is continuous, whereas the decision space in the real problem is restricted to be finite. We propose a new Bayesian sequential optimization method to tackle this problem. The knowledge gradient based on the restricted decision space is used as the criterion for choosing new design points, and fixed rank kriging or a Gaussian process is used as the surrogate. This combination exploits the continuous search space to reduce uncertainty over the finite decision space. We demonstrate the benefit of the proposed methodology compared with existing sequential optimization methods.
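To make the idea in the abstract concrete, the following is a minimal, hypothetical sketch of knowledge-gradient-style sequential optimization: a Gaussian process surrogate is fit to noisy observations taken at continuous input points, and the knowledge gradient of a candidate point is estimated by Monte Carlo as the expected improvement in the best posterior mean over a finite decision set. All function names, kernel settings, and the one-dimensional toy setup are illustrative assumptions; this does not reproduce the talk's actual method (e.g., it uses a plain GP rather than fixed rank kriging, and approximates the KG by simulation).

```python
import numpy as np

def rbf_kernel(A, B, ls=0.2, var=1.0):
    # Squared-exponential kernel between 1-D input arrays A and B.
    d2 = (A[:, None] - B[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / ls ** 2)

def gp_posterior(X, y, Xs, noise=1e-2):
    # GP posterior mean and covariance at test points Xs,
    # given noisy observations (X, y).
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    Kss = rbf_kernel(Xs, Xs)
    Kinv = np.linalg.inv(K)
    mu = Ks.T @ Kinv @ y
    cov = Kss - Ks.T @ Kinv @ Ks
    return mu, cov

def knowledge_gradient(X, y, x_cand, decisions, noise=1e-2, n_mc=200, rng=None):
    # Monte Carlo knowledge gradient: expected gain in the maximum
    # posterior mean over the FINITE decision set if the next sample
    # is taken at the CONTINUOUS candidate point x_cand.
    if rng is None:
        rng = np.random.default_rng(0)
    mu_d, _ = gp_posterior(X, y, decisions, noise)
    best_now = mu_d.max()
    # Predictive distribution of a new noisy observation at x_cand.
    mu_x, cov_x = gp_posterior(X, y, np.array([x_cand]), noise)
    sd_x = np.sqrt(max(cov_x[0, 0] + noise, 1e-12))
    gains = []
    for _ in range(n_mc):
        y_new = mu_x[0] + sd_x * rng.standard_normal()
        X2 = np.append(X, x_cand)
        y2 = np.append(y, y_new)
        mu2, _ = gp_posterior(X2, y2, decisions, noise)
        gains.append(mu2.max() - best_now)
    return float(np.mean(gains))
```

In a sequential loop one would score a dense grid of continuous candidates with `knowledge_gradient`, sample the stochastic simulator at the maximizer, refit, and finally recommend the decision with the highest posterior mean. This is how sampling in the continuous search space can reduce uncertainty over the finite decision space.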
Department of Applied Statistics &
The Research Center for Data Science