ACER assumes that all the parameters of a distribution (defined by get_params_of_distribution) require grad so that the algorithm can compute the gradient wrt the parameters.
Reported in #143
pfrl/pfrl/agents/acer.py, lines 172 to 180 in 44bf2e4
pfrl/pfrl/agents/acer.py, lines 218 to 221 in 44bf2e4
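Those lines extract the distribution's parameter tensors and then assert that each of them requires grad before taking gradients with respect to them. A minimal sketch of that assumption, using plain torch.distributions (the helper below is a hypothetical stand-in for get_params_of_distribution, not the actual pfrl code):

```python
import torch
from torch.distributions import Categorical, Independent, Normal


def params_of_distribution(distrib):
    """Return the tensors that parameterize a distribution.

    Hypothetical stand-in for pfrl's get_params_of_distribution,
    shown only to illustrate the requires_grad assumption.
    """
    if isinstance(distrib, Independent):
        return params_of_distribution(distrib.base_dist)
    if isinstance(distrib, Categorical):
        return (distrib.logits,)
    if isinstance(distrib, Normal):
        return (distrib.loc, distrib.scale)
    raise NotImplementedError(type(distrib))


loc = torch.zeros(3, requires_grad=True)   # e.g. output of a policy network
scale = torch.ones(3, requires_grad=True)  # learnable scale in the usual case
distrib = Normal(loc, scale)

# ACER's assumption: every parameter tensor is part of the autograd graph,
# so gradients with respect to it can be computed.
for p in params_of_distribution(distrib):
    assert p.requires_grad
```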
However, GaussianHeadWithFixedCovariance (pfrl/pfrl/policies/gaussian_policy.py, line 96 in 44bf2e4) returns a torch.distributions.Normal whose scale parameter does not require grad, resulting in an assertion error.