Dear gpytorch developers and users,

I'm trying to construct a GP model combined with deep learning. Here, a deep neural network transforms x into projected_x, and projected_x is then fed into the GP. During training, my model does not predict accurate target values even though the loss (the negative ExactMarginalLogLikelihood) is satisfyingly small.
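Here is a toy sketch of the combined model I mean. I have reconstructed it here along the lines of the standard GPyTorch deep kernel learning example; num_dims stands for the dimensionality of projected_x:

    import torch
    import gpytorch

    class GPRegressionModel(gpytorch.models.ExactGP):
        def __init__(self, train_x, train_y, likelihood):
            super(GPRegressionModel, self).__init__(train_x, train_y, likelihood)
            self.mean_module = gpytorch.means.ConstantMean()
            self.covar_module = gpytorch.kernels.GridInterpolationKernel(
                gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel(ard_num_dims=num_dims)),
                num_dims=num_dims
            )
            self.feature_transformer = Feature_transformerNN()  # the deep feature extractor
            self.scale_to_bounds = gpytorch.utils.grid.ScaleToBounds(-1., 1.)

        def forward(self, x):
            projected_x = self.feature_transformer.transform(x)  # x -> projected_x
            projected_x = self.scale_to_bounds(projected_x)      # keep features inside the grid bounds
            mean_x = self.mean_module(projected_x)
            covar_x = self.covar_module(projected_x)
            return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)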
When I trained only the deep model, Feature_transformerNN, I got a small MSE and it predicted the target values accurately.
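For reference, Feature_transformerNN is schematically an MLP like the following hypothetical stand-in (my real network is much larger); transform returns the activations just before the output layer:

    class Feature_transformerNN(torch.nn.Module):
        def __init__(self, in_dim=10, hidden=100, num_dims=2):  # hypothetical sizes
            super().__init__()
            self.body = torch.nn.Sequential(
                torch.nn.Linear(in_dim, hidden),
                torch.nn.ReLU(),
                torch.nn.Linear(hidden, num_dims),  # num_dims = dimension of projected_x
            )
            self.head = torch.nn.Linear(num_dims, 1)  # output layer used for MSE training

        def transform(self, x):
            return self.body(x)  # values just before the output layer, i.e. projected_x

        def forward(self, x):
            return self.head(self.body(x)).squeeze(-1)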
Also, I first trained only the deep model, Feature_transformerNN, and transferred its learnable parameters into self.feature_transformer of GPRegressionModel, but the results were still bad (the loss was small but the prediction was bad).
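The transfer itself was done roughly like this (a sketch; model1 is the NN pre-trained as in the code below):

    model = GPRegressionModel(x_train, y_train, gpytorch.likelihoods.GaussianLikelihood())
    model.feature_transformer.load_state_dict(model1.state_dict())  # copy pre-trained weights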
Furthermore, when I trained only the deep model and used the transformed x as the input for the GP below, i.e. when the ExactGP and Feature_transformerNN were trained separately as follows, the results were good.
    # stage 1: train the feature extractor alone with an MSE loss
    model1 = Feature_transformerNN()
    optimizer1 = torch.optim.Adam(model1.parameters())
    loss_func = torch.nn.MSELoss()

    model1.train()
    for epoch in range(50):
        optimizer1.zero_grad()
        output = model1(x_train)
        loss = loss_func(output, y_train)
        loss.backward()
        optimizer1.step()

    x_transformed = model1.transform(x_train)  # model1.transform outputs the values just before the output layer
    # stage 2: a plain ExactGP that takes the pre-transformed features as input
    class GP(gpytorch.models.ExactGP):
        def __init__(self, train_x, train_y, likelihood):
            super(GP, self).__init__(train_x, train_y, likelihood)
            self.mean_module = gpytorch.means.ConstantMean()
            self.covar_module = gpytorch.kernels.GridInterpolationKernel(
                gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel(ard_num_dims=num_dims)),
                num_dims=num_dims
            )
            self.scale_to_bounds = gpytorch.utils.grid.ScaleToBounds(-1., 1.)

        def forward(self, x):
            projected_x = self.scale_to_bounds(x)  # scale the already-transformed inputs into the grid bounds
            mean_x = self.mean_module(projected_x)
            covar_x = self.covar_module(projected_x)
            return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)
    likelihood = gpytorch.likelihoods.GaussianLikelihood()
    model2 = GP(x_transformed, y_train, likelihood)
    optimizer2 = torch.optim.Adam(model2.parameters())
    loss_func2 = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model2)

    model2.train()
    likelihood.train()
    for epoch in range(50):
        optimizer2.zero_grad()
        output = model2(x_transformed)
        loss = -loss_func2(output, y_train)
        loss.backward()
        optimizer2.step()
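For completeness, predictions in this separate setup are made with the usual GPyTorch evaluation pattern (x_test is a hypothetical held-out set):

    model2.eval()
    likelihood.eval()
    with torch.no_grad(), gpytorch.settings.fast_pred_var():
        x_test_transformed = model1.transform(x_test)  # same feature extractor at test time
        pred = likelihood(model2(x_test_transformed))
        mean, var = pred.mean, pred.variance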
Finally, when I replaced Feature_transformerNN with a simple NN that has roughly one-tenth as many learnable parameters, the results were not bad.
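By "simple NN" I mean the same architecture as the hypothetical sketch above, only much narrower, e.g.:

    simple_model = Feature_transformerNN(in_dim=10, hidden=10, num_dims=2)  # ~1/10 the parameters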
Does anyone know the reason for this, and how to fix it?
Any help would be appreciated.