Review on the JCON Paper #27

Open
kellertuer opened this issue Jan 26, 2025 · 1 comment

These are a few comments on the JCON paper review at JuliaCon/proceedings-review#179.

I marked a few things in the PDF and wrote additional notes; they are numbered so we can discuss them.
Maybe the best way to address these is in a PR.

Review ExponentialFamilyManifolds JCON

  1. I would maybe suffix every package name with .jl for clarity? (I only marked the first 3 and a few more throughout)
  2. Concerning the advanced optimization techniques: are they used directly or via Manopt.jl? If that fits, mentioning Manopt.jl in the abstract would be great (provided it is used for them, that is).
  3. Can you maybe provide a reference that the family is a popular choice?
  4. “Natural parameters” sounds like it is an established term in statistics? If so it can stay as is; otherwise a short explanation would be nice.
  5. Can you maybe add that they are implemented using the interface for manifolds provided by ManifoldsBase.jl? That fits a bit better than the mention of Manifolds.jl in the sentence afterwards.
  6. Would `NaturalParametersManifold <: AbstractManifold` be correct here? Maybe even `ManifoldsBase.AbstractManifold` (though that is a bit long-ish); see the first sketch after this list.
  7. Again, to be a bit picky: `ProductManifold` is already available from ManifoldsBase.jl; Manifolds.jl (also a bit of a heavy dependency) would not be necessary for this, see the first sketch after this list as well.
  8. How does `get_natural_manifold` work? What are its two arguments? Maybe a link to the docs would be nice here.
  9. `retract(M, p, X, m)` starting from p in direction X is missing the retraction method m here; without it, the call is equivalent to just calling `exp(M, p, X)`, unless `default_retraction_method(M)` is overwritten accordingly, but then that method should be mentioned again (see the second sketch after this list).
  10. Ah, from here on you might indeed need Manifolds.jl, since the positive numbers come from there. The footnote is a bit misleading; it might be read as saying that `PositiveNumbers` is actually implemented in ExponentialFamilyManifolds.jl as well.
  11. Oh, it gets a bit technical in Code 1. ForwardDiff computes the Euclidean gradient, but on a manifold you need the Riemannian gradient; see the note at https://manoptjl.org/stable/tutorials/AutomaticDifferentiation/#EmbeddedGradient and `change_representer`. On the positive numbers the result might only be slightly off, but it would be better to actually change the representer here as well (see the third sketch after this list).
    In principle you (implicitly) do the same in Code 4, just that there the Fisher metric is the one you aim for.
    We just do not have much Fisher-Rao in Manifolds.jl, but that change is the same one you would need in Code 1 as well, unless M is Euclidean.
  12. This explanation is too short for me, but I am not a statistics expert; maybe for them that is enough information.
  13. What kind of reference is [7]? Is it meant to be the Zenodo DOI? Then please add it; the reference seems quite incomplete in its current form.
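For points 6 and 7, a minimal sketch; the `ToyNaturalManifold` type below is hypothetical and only stands in for `NaturalParametersManifold` to illustrate the subtyping, and to show that the product construction needs nothing beyond ManifoldsBase.jl:

```julia
using ManifoldsBase
using ManifoldsBase: ℝ

# Hypothetical stand-in for NaturalParametersManifold, only to show the subtyping.
struct ToyNaturalManifold <: ManifoldsBase.AbstractManifold{ℝ} end

ToyNaturalManifold() isa ManifoldsBase.AbstractManifold  # true

# ProductManifold already comes with ManifoldsBase.jl, so Manifolds.jl is not
# needed for the product construction itself.
M = ProductManifold(ToyNaturalManifold(), ToyNaturalManifold())
```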
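For point 9, a small sketch of the difference; the Sphere from Manifolds.jl is just a stand-in manifold, and the projection retraction just one example of a method `m`:

```julia
using Manifolds, ManifoldsBase

M = Sphere(2)
p = [1.0, 0.0, 0.0]
X = [0.0, 0.2, 0.0]  # a tangent vector at p

q1 = retract(M, p, X)                          # uses default_retraction_method(M)
q2 = retract(M, p, X, ProjectionRetraction())  # explicitly passes a retraction method m
q3 = exp(M, p, X)                              # the exponential map

# q1 coincides with q3 exactly when the default retraction is the exponential map,
# which is the generic default.
default_retraction_method(M) == ExponentialRetraction()
```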
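And for point 11, a sketch of the conversion on the positive numbers; the cost `f` is a toy example, and I assume the conversion meant here is `change_representer` with respect to the `EuclideanMetric`:

```julia
using Manifolds, ForwardDiff

M = PositiveNumbers()
f(p) = (log(p) - 1)^2  # toy cost

p = 2.0
egrad = ForwardDiff.derivative(f, p)  # Euclidean gradient (a plain derivative in 1D)

# Change the representer of the differential to obtain the Riemannian gradient
# with respect to the metric of M; on PositiveNumbers() this amounts to p^2 * egrad.
rgrad = change_representer(M, EuclideanMetric(), p, egrad)
```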

Attachment: 10.21105.jcon.00172.pdf

@kellertuer (Author)

In a second part I will add a few comments on the documentation and the code in a few days.
