I ran into some issues when trying to run examples/classification.ipynb with gpflow 2.7.0. I am looking to adapt the graph Matérn kernel to my own work and would love to use the scalable methods you have tried.
Specifically, I get an error when running the function `optimize_SVGP` using `gpflow.models.SVGP` with `GPInducingVariables`. If I instead use `GraphSVGP` with `inducing_variable=[0] * num_eigenpairs`, I get a dimension error. I have put the error messages at the end.
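To illustrate the second failure, here is a minimal sketch using plain gpflow only (I am assuming `GraphSVGP` routes its `inducing_variable` through the same gpflow wrappers; the kernel and likelihood below are stand-ins, not the notebook's):

```python
import gpflow

# Stand-in kernel/likelihood -- the notebook uses the graph Matern kernel
# and a multiclass likelihood, but any pair shows the same shape issue.
kernel = gpflow.kernels.Matern32()
likelihood = gpflow.likelihoods.Gaussian()

num_eigenpairs = 500
model = gpflow.models.SVGP(
    kernel=kernel,
    likelihood=likelihood,
    # A flat Python list becomes a rank-1 tensor; gpflow's InducingPoints
    # wrapper expects rank-2 [num_inducing, input_dim], hence the error.
    inducing_variable=[0] * num_eigenpairs,
)
```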
Both issues can be resolved by giving the inducing variable two axes: `inducing_variable = np.zeros((num_eigenpairs, 1))`. But for the `SVGP` model the accuracy then drops to 16%, so I suspect it's not a proper fix.
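For reference, the workaround in isolation (again plain gpflow; I am assuming `GPInducingVariables` follows the `InducingPoints` convention of a rank-2 `[num_inducing, input_dim]` array):

```python
import numpy as np
import gpflow

num_eigenpairs = 500
# Adding a trailing axis satisfies the shape check; whether a column of
# zeros is semantically right for the graph kernel is exactly what the
# 16% accuracy makes me doubt.
inducing_variable = np.zeros((num_eigenpairs, 1))

iv = gpflow.inducing_variables.InducingPoints(inducing_variable)
print(iv.num_inducing)  # 500
```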
By downgrading to gpflow 2.6.3 I could get the notebook to run without errors for the `SVGP` model, but the `GraphSVGP` model still hits the same dimension error. I suspect `GraphSVGP` in classification.ipynb was not tested after 3dd6b9c.
Error with `SVGP` and `GPInducingVariables`:
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
Cell In[51], line 16
13 adam_opt = tf.optimizers.Adam(0.001)
14 natgrad_opt = gpflow.optimizers.NaturalGradient(gamma=0.001)
---> 16 optimize_SVGP(model, (adam_opt, natgrad_opt), 1000, True)
17 gpflow.utilities.print_summary(model)
Cell In[48], line 35, in optimize_SVGP(model, optimizers, steps, q_diag)
33 opt_step(natgrad_opt, loss, natgrad_params)
34 if step % 200 == 0:
---> 35 likelihood = model.elbo((x_train, y_train))
36 t.set_postfix({'ELBO': likelihood.numpy()})
File ~/miniconda3/envs/bandits-gpflow/lib/python3.10/site-packages/check_shapes/integration/tf.py:76, in install_tf_integration.<locals>.TfWrapperPostProcessor.on_wrap.<locals>.wrapped_method(self, *args, **kwargs)
75 def wrapped_method(self: Any, *args: Any, **kwargs: Any) -> Any:
---> 76 return wrapped_function(self, *args, **kwargs)
File ~/miniconda3/envs/bandits-gpflow/lib/python3.10/site-packages/check_shapes/decorator.py:185, in check_shapes.<locals>._check_shapes.<locals>.wrapped_function(*args, **kwargs)
182 _check_specs(pre_specs)
184 with set_shape_checker(checker):
--> 185 result = func(*args, **kwargs)
186 arg_map[RESULT_TOKEN] = result
188 _check_specs(post_specs)
...
100 dtype = dtypes.as_dtype(dtype).as_datatype_enum
101 ctx.ensure_initialized()
--> 102 return ops.EagerTensor(value, ctx.device_name, dtype)
ValueError: Attempt to convert a value (<object object at 0x7f8bf9647dc0>) with an unsupported type (<class 'object'>) to a Tensor.
Error with `GraphSVGP` and `inducing_variable=[0] * num_eigenpairs`: