Error trying to implement hierarchical parameter in PyroSample

I have a Bayesian NN that works when I do:

self.fc1.weight = PyroSample(dist.Normal(0., 1.).expand([h1, in_features]).to_event(2))

But when I try making the variance a parameter like this:

var = PyroSample(dist.Exponential(1.))
self.fc1.weight = PyroSample(dist.Normal(0., var).expand([h1, in_features]).to_event(2))

I get the following error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-42-1960cbe051d0> in <module>
  1 features = ['net_ortg','net_drtg','lead','sec_rem_period','total_pts', 'p_1','p_2','p_3','p_4','p_5','p_6']
----> 2 model = ThreeLayerNNModel(11, 1, 24, 24, 24)
  3 losses, guide = inference(model, features=features, epochs=20)
  4 plt.plot(losses)

<ipython-input-40-19d2ca07f372> in __init__(self, in_features, out_features, h1, h2, h3)
  5         self.fc1 = PyroModule[nn.Linear](in_features, h1)
  6         var = PyroSample(dist.Exponential(1.))
----> 7         self.fc1.weight = PyroSample(dist.Normal(0., var).expand([h1, in_features]).to_event(2))
  8         self.fc1.bias = PyroSample(dist.Normal(0., 10.).expand([h1]).to_event(1))
  9         self.fc2 = PyroModule[nn.Linear](h1, h2)

~/pyro/xpm/pyro/lib/python3.7/site-packages/pyro/distributions/distribution.py in __call__(cls, *args, **kwargs)
 15             if result is not None:
 16                 return result
---> 17         return super().__call__(*args, **kwargs)
 18 
 19 

~/pyro/xpm/pyro/lib/python3.7/site-packages/torch/distributions/normal.py in __init__(self, loc, scale, validate_args)
 42 
 43     def __init__(self, loc, scale, validate_args=None):
---> 44         self.loc, self.scale = broadcast_all(loc, scale)
 45         if isinstance(loc, Number) and isinstance(scale, Number):
 46             batch_shape = torch.Size()

~/pyro/xpm/pyro/lib/python3.7/site-packages/torch/distributions/utils.py in broadcast_all(*values)
 22     """
 23     if not all(isinstance(v, torch.Tensor) or isinstance(v, Number) for v in values):
---> 24         raise ValueError('Input arguments must all be instances of numbers.Number or torch.tensor.')
 25     if not all([isinstance(v, torch.Tensor) for v in values]):
 26         options = dict(dtype=torch.get_default_dtype())

ValueError: Input arguments must all be instances of numbers.Number or torch.tensor.

Any idea what the issue is?

Hi @thecity2,

Does using

var = pyro.sample("var", dist.Exponential(1.))
self.fc1.weight = pyro.sample("fc1_weight", dist.Normal(0., var).expand([h1, in_features]).to_event(2))

solve the issue?


Getting a different error:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-50-1960cbe051d0> in <module>
      1 features = ['net_ortg','net_drtg','lead','sec_rem_period','total_pts', 'p_1','p_2','p_3','p_4','p_5','p_6']
----> 2 model = ThreeLayerNNModel(11, 1, 24, 24, 24)
      3 losses, guide = inference(model, features=features, epochs=20)
      4 plt.plot(losses)

<ipython-input-49-44a12b00ee39> in __init__(self, in_features, out_features, h1, h2, h3)
      4         self.fc1 = PyroModule[nn.Linear](in_features, h1)
      5         var = pyro.sample("var", dist.Exponential(1.))
----> 6         self.fc1.weight = pyro.sample("fc1_weight",dist.Normal(0., var).expand([h1, in_features]).to_event(2))
      7         self.fc1.bias = PyroSample(dist.Normal(0., 10.).expand([h1]).to_event(1))
      8         self.fc2 = PyroModule[nn.Linear](h1, h2)

~/pyro/xpm/pyro/lib/python3.7/site-packages/pyro/nn/module.py in __setattr__(self, name, value)
    544             return
    545 
--> 546         super().__setattr__(name, value)
    547 
    548     def __delattr__(self, name):

~/pyro/xpm/pyro/lib/python3.7/site-packages/torch/nn/modules/module.py in __setattr__(self, name, value)
    792                 raise TypeError("cannot assign '{}' as parameter '{}' "
    793                                 "(torch.nn.Parameter or None expected)"
--> 794                                 .format(torch.typename(value), name))
    795             self.register_parameter(name, value)
    796         else:

TypeError: cannot assign 'torch.FloatTensor' as parameter 'weight' (torch.nn.Parameter or None expected)

@thecity2 I believe the correct way to let one PyroSample depend on another is to make the dependent site's prior a lambda:

self.fc1.var = PyroSample(dist.Exponential(1.))
self.fc1.weight = PyroSample(
    lambda fc1: dist.Normal(0., fc1.var).expand([h1, in_features]).to_event(2))

Note that for this to work:

  1. We need to set var on self.fc1 rather than on self, since when self.fc1 accesses its .weight attribute, it only has access to itself (i.e. to self.fc1).
  2. We need to ensure self.fc1 is an actual PyroModule rather than a mere nn.Module (which it looks like you’ve already done via self.fc1 = PyroModule[nn.Linear](...)).
  3. The lambda syntax might be a little more intuitive if you inherited from PyroModule[nn.Linear]:
class BayesianLinear(PyroModule[nn.Linear]):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.var = PyroSample(dist.Exponential(1.))
        self.weight = PyroSample(
            lambda self: dist.Normal(0., self.var)
            .expand([self.out_features, self.in_features]).to_event(2))

Wow, it seems to be working. Definitely not something I would have figured out on my own! Where in the docs is this covered?

There’s a simple example of this syntax in the PyroSample docs, but it’s pretty brief. We always welcome contributions in the form of improved docstrings. It’s hard as a code author to see things with beginner’s eyes :slightly_smiling_face: