Hi!!

Thanks for your reply :slight_smile: @fehiepsi. I am trying to implement your suggestion, but something is not clear to me:

If transforms is obtained from the initialize_model function, then this part :

`transforms[name][value]`

is not working because it throws:

`TypeError: 'ComposeTransform' object is not subscriptable`

because `transforms` looks like this (showing only the invertible transformations…):

```
{'Value1': ComposeTransform(
), 'Value2': ComposeTransform(
_InverseTransform(),
_InverseTransform()
), 'Value3': ComposeTransform(
), 'Value4': ComposeTransform(
_InverseTransform(),
_InverseTransform()
)}
```
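Incidentally, an empty `ComposeTransform` is just the identity map, so sites like `Value1`/`Value3` would come through unchanged, and a `ComposeTransform` is applied by calling it rather than indexing it. A minimal sketch with plain `torch.distributions` (the `ExpTransform` here is only for illustration, not one of the actual site transforms):

```python
import torch
from torch.distributions.transforms import ComposeTransform, ExpTransform

# An empty ComposeTransform behaves as the identity function.
identity = ComposeTransform([])
x = torch.tensor([1.0, 2.0])
print(identity(x))  # same values back

# A non-empty one applies its parts in order; it is called, not subscripted.
to_log = ComposeTransform([ExpTransform().inv])  # i.e. log
print(to_log(torch.tensor([1.0])))  # log(1) = 0
```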

so I changed to this:

```
init_params = {name: transforms[name](value) for name, value in map_points.items()}
```

but that does not apply any transformation to the `map_points` items. `init_params` looks like this:

```
{'value1': tensor([[ -3.4214,  13.0987, -13.1515],
                   [ -0.6091,  10.6803, -15.8382]], grad_fn=<…>),
 'value2': tensor([-0.2674, -0.6896,  0.1402], grad_fn=<…>),
 'value3': tensor([ 0.9876, -0.7890,  0.5231], grad_fn=<…>),
 'value4': tensor([ 5.4123, -1.2537,  5.1462], grad_fn=<…>)}
```

which, if I use it when I run MCMC like this:

```
init_params, potential_fn, transforms, _ = initialize_model(
    model, model_args=(data_obs,), num_chains=chains,
    jit_compile=True, skip_jit_warnings=True)

map_points = _get_initial_trace(data_obs, average)  # this returns guide.median()

init_params = {name: transforms[name](value) for name, value in map_points.items()}

nuts_kernel = NUTS(potential_fn=potential_fn, max_tree_depth=5, target_accept_prob=0.8)

mcmc = MCMC(nuts_kernel, num_samples=samples, warmup_steps=warmup,
            num_chains=chains, initial_params=init_params, transforms=transforms)

mcmc.run(data_obs)
```

it results in this error:

```
RuntimeError: you can only change requires_grad flags of leaf variables. If you want to use a computed variable in a subgraph that doesn't require differentiation use var_no_grad = var.detach()
```

I then tried `.no_grad()`, but got this:

```
AttributeError: 'Tensor' object has no attribute 'no_grad'
```
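For what it's worth, the `RuntimeError` message itself points at the fix: `no_grad` is a context manager (`torch.no_grad()`), not a tensor method, hence the `AttributeError`, while `Tensor.detach()` returns a graph-free leaf tensor. A small sketch of the difference (the tensor here is hypothetical, standing in for a `guide.median()` output):

```python
import torch

# A computed (non-leaf) tensor, like the map_points values with a grad_fn:
value = torch.randn(3, requires_grad=True) * 2.0
print(value.is_leaf)  # False: it carries a grad_fn

# detach() returns a leaf tensor sharing the same data, with no graph attached.
leaf = value.detach()
print(leaf.is_leaf, leaf.requires_grad)  # True False
```

So the comprehension would presumably become `init_params = {name: transforms[name](value).detach() for name, value in map_points.items()}`.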

I am not sure whether I have misunderstood something, but it is not quite working…

Thank you very much for your help