Edwin V. Bonilla's Home Page
Latest
Learning Efficient and Robust Ordinary Differential Equations via Invertible Neural Networks
Optimizing Sequential Experimental Design with Deep Reinforcement Learning
Model Selection for Bayesian Autoencoders
BORE: Bayesian Optimization by Density-Ratio Estimation
SigGPDE: Scaling Sparse Gaussian Processes on Sequential Data
Quantile Propagation for Wasserstein-Approximate Gaussian Processes
Variational Inference for Graph Convolutional Networks in the Absence of Graph Data and Adversarial Settings
Distribution Regression for Continuous-Time Processes via the Expected Signature
Calibrating Deep Convolutional Gaussian Processes
Generic Inference in Latent Gaussian Process Models
Grouped Gaussian Processes for Solar Power Prediction
Scalable Grouped Gaussian Processes via Direct Cholesky Functional Representations
Sparse Grouped Gaussian Processes for Solar Power Forecasting
Structured Variational Inference in Continuous Cox Process Models
Variational Graph Convolutional Networks
Cycle-Consistent Adversarial Learning as Approximate Bayesian Inference
AutoGP: Exploring the capabilities and limitations of Gaussian process models
Gray-Box Inference for Structured Gaussian Process Models
Random Feature Expansions for Deep Gaussian Processes
Accelerating Deep Gaussian Processes Inference with Arc-Cosine Kernels
Scalable Inference for Gaussian Process Models with Black-Box Likelihoods
Automated Variational Inference for Gaussian Process Models
Collaborative Multi-Output Gaussian Processes
Distributed Bayesian Geophysical Inversions
Extended and Unscented Gaussian Processes
Decision-Theoretic Sparsification for Gaussian Process Preference Learning
Dynamic Microarchitectural Adaptation Using Machine Learning
Learning Community-Based Preferences via Dirichlet Process Mixtures of Gaussian Processes
New Objective Functions for Social Collaborative Filtering
Improving Topic Coherence with Regularized Topic Models
Sparse Gaussian Processes for Learning Preferences
A Predictive Model for Dynamic Microarchitectural Adaptivity Control
Gaussian Process Preference Elicitation
Portable Compiler Optimisation Across Embedded Programs and Microarchitectures Using Machine Learning
Multi-Task Gaussian Process Prediction
A Note on Noise-Free Gaussian Process Prediction with Separable Covariance Functions and Grid Designs
Kernel Multi-Task Learning Using Task-Specific Features
Predictive Search Distributions
Predicting Good Compiler Transformations Using Machine Learning