Princeton University Library Catalog
Scaling Bayesian Optimization for High-Dimensional Iterative Experimental Design
Author/Artist
Semelhago, Andrew
Format
Senior thesis
Language
English
Availability
Available Online
DataSpace (citation only)
Copies in the Library
Location: Mudd Manuscript Library - Stacks
Call Number: AC102
Status: On-site access
Location Service: Reading Room Request
Notes: none
Details
Advisor(s)
Engelhardt, Barbara
Department
Princeton University. Department of Operations Research and Financial Engineering
Certificate
Princeton University. Program in Finance
Class year
2017
Restrictions note
Walk-in Access. This thesis can only be viewed on computer terminals at the Mudd Manuscript Library.
Summary note
Bayesian optimization (BO) is an intelligent search technique for optimizing expensive, nonlinear, black-box objective functions. It is tempting to apply BO to iterative experimental design in the physical sciences, but these scenarios are often high-dimensional, which presents two problems: first, the algorithm takes a long time to generate suggestions for subsequent experiments, and, second, thoroughly searching the parameter space requires a prohibitively large number of expensive experiments. We present a new approach that mitigates both issues in high-dimensional Bayesian optimization: changing the prior model of the black-box objective function and developing a local optimization approach for the acquisition function. We evaluate these solutions on high-dimensional optimization tasks and on a computational analogue of a biological experimental design task: CRISPR/Cas9 guide RNA sequence optimization. In categorical spaces, a random forest prior model leads to fast convergence, whereas in continuous spaces the Gaussian process prior performs best; random forests, however, generate suggestions much more quickly than Gaussian processes. Local optimization (LO) improves performance across the board in exchange for a small constant increase in time. When gradient information is available, gradient descent methods with momentum accentuate this performance improvement.