

Doc fixes #365

Merged: mpariente merged 32 commits into master from doc_fixes on Nov 30, 2020

Conversation

mpariente
Collaborator

Trying to fix the docstrings so they render nicely for the 0.4.0 release.

I'm struggling with functions that return multiple things; the docstring is never parsed right when it starts with the return type.

Comparing docstrings and the rendered output:

  1. Looks OK, but it doesn't have the dots like in the args:
    def trial(self, pair_wise_losses, perm_reduce=None, **kwargs):
        r"""Trial docstring

        Args:
            pair_wise_losses: there is a dot in the beginning.
            perm_reduce: And it's automatic. 

        Returns:
            :class:`torch.Tensor`: The loss corresponding to the best
            permutation of size $(batch,)$.

            :class:`torch.Tensor`: The indices of the best permutations.
        """

[screenshot of rendered output]

  2. Sphinx is joking with me:
    def trial(self, pair_wise_losses, perm_reduce=None, **kwargs):
        r"""Trial docstring

        Args:
            pair_wise_losses: there is a dot in the beginning.
            perm_reduce: And it's automatic.

        Returns:
            - :class:`torch.Tensor`: The loss corresponding to the best
              permutation of size $(batch,)$.

            - :class:`torch.Tensor`: The indices of the best permutations.
        """

[screenshot of rendered output]

  3. Looks OK, but the line is broken:
    def trial(self, pair_wise_losses, perm_reduce=None, **kwargs):
        r"""Trial docstring

        Args:
            pair_wise_losses: there is a dot in the beginning.
            perm_reduce: And it's automatic.

        Returns:
            - :class:`torch.Tensor`: The loss corresponding to the best
            permutation of size $(batch,)$.

            - :class:`torch.Tensor`: The indices of the best permutations.
        """

[screenshot of rendered output]

  4. This looks great, but it cannot start with the type:
    def trial(self, pair_wise_losses, perm_reduce=None, **kwargs):
        r"""Trial docstring

        Args:
            pair_wise_losses: there is a dot in the beginning.
            perm_reduce: And it's automatic.

        Returns:
            - The loss corresponding to the best permutation of size $(batch,)$.
              (:class:`torch.Tensor`: )

            - The indices of the best permutations (:class:`torch.Tensor`:).
        """

[screenshot of rendered output]
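A way to compare these variants without rebuilding the whole docs is to run napoleon's Google-style parser directly and inspect the reST it emits (a sketch; assumes Sphinx is installed, and the docstring text is abridged from the examples above):

```python
from sphinx.ext.napoleon import Config, GoogleDocstring

# Mirror the relevant napoleon settings from conf.py.
config = Config(napoleon_use_param=True, napoleon_use_rtype=True)

docstring = """Trial docstring

Args:
    pair_wise_losses: there is a dot in the beginning.
    perm_reduce: And it's automatic.

Returns:
    :class:`torch.Tensor`: The loss corresponding to the best permutation.
"""

# str() on the GoogleDocstring object yields the reST Sphinx will render.
rest = str(GoogleDocstring(docstring, config))
print(rest)
```

This makes it quick to see whether a given `Returns:` layout is being parsed as a typed return field or falling through as plain text.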

@jonashaag
Collaborator

Haha yes Sphinx/reST syntax really is horrible.

I think the bullets just aren't supported in Google-style docstrings. This works (NumPy-style):

def trial(self, pair_wise_losses, perm_reduce=None, **kwargs):
    r"""Trial docstring

    Returns
    -------
    :class:`torch.Tensor`
        The loss corresponding to the best permutation of size $(batch,)$.

    :class:`torch.Tensor`
        The indices of the best permutations.
    """

Btw I found this which I think would be nice for tensors: https://www.sphinx-doc.org/en/master/usage/extensions/napoleon.html#confval-napoleon_type_aliases
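As a sketch of how that could be used for tensors (the alias names here are illustrative, and per the Sphinx docs the aliases are only applied when `napoleon_use_param` / `napoleon_use_rtype` are enabled):

```python
# docs/conf.py (sketch)
extensions = ["sphinx.ext.napoleon"]

# Aliases only take effect with these enabled.
napoleon_use_param = True
napoleon_use_rtype = True

# Let docstrings write plain "Tensor" instead of the full role.
napoleon_type_aliases = {
    "Tensor": ":class:`torch.Tensor`",
}
```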

@@ -176,8 +176,8 @@ def bound_complex_mask(mask: ComplexTensor, bound_type="tanh"):
"tanh"/"bdt" (default), "sigmoid"/"bdss" or None/"bdt".

References
Collaborator

Do you know why we have : at the end of Args: but not for References?

Collaborator Author

Args: is detected by Sphinx, which handles it well.
And if I add : to References, it looks like this:
[screenshot of rendered output]

I really don't understand the tool, so when I find something that works, I superstitiously stick to it.

docs/README.md
### Writing good docstrings

- Start with [RST and Sphinx CheatSheet](https://thomas-cokelaer.info/tutorials/sphinx/rest_syntax.html)
- Linking to a method in a class: ``:func:`~mymodule.myclass.myfunc` ``
@jonashaag Nov 28, 2020

But I think you can use any tag (role) and it works anyway.

@jonashaag
Collaborator

Btw what's that $...$ syntax? Never seen it (only in LaTeX and Markdown)

@mpariente
Collaborator Author

Thanks a lot for your feedback!

Btw what's that $...$ syntax? Never seen it (only in LaTeX and Markdown)

No idea, but it works ✨ 😂

@mpariente
Collaborator Author

What's your call on the multiple-output function, then?
I'm OK with not starting with the type, but this seems like black magic.

@jonashaag
Collaborator

jonashaag commented Nov 30, 2020

I really don't know. Here's what the internet says...

https://stackoverflow.com/questions/29221551/can-sphinx-napoleon-document-function-returning-multiple-arguments
http://web.archive.org/web/20200702074210/https://bitbucket.org/birkenfeld/sphinx-contrib/issues/111/napoleon-multiple-returns-indentations

So it looks like Google style just doesn't support this, and we should use the "tuple of N elements:" workaround instead.
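For reference, a minimal sketch of that workaround (my wording, not from the linked threads): the whole return value is documented as one typed field, so Google style parses it cleanly as a single entry:

```python
def trial(pair_wise_losses, perm_reduce=None, **kwargs):
    r"""Trial docstring using the single-field tuple workaround.

    Args:
        pair_wise_losses: there is a dot in the beginning.
        perm_reduce: And it's automatic.

    Returns:
        tuple of (:class:`torch.Tensor`, :class:`torch.Tensor`): The loss
        corresponding to the best permutation of size $(batch,)$, and the
        indices of the best permutations.
    """
```

The trade-off is that the two tensors no longer get separate bullets in the rendered output.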

I guess the best way to implement it would be with lists (like in your example 2), but I'm not sure how much work it would take to get that into Sphinx.

Also found this, not sure if it's related: sphinx-contrib/napoleon#4

@mpariente
Collaborator Author

Thanks for the pointers!

Finally found a way that worked:
[screenshot of rendered output]

    def trial(self, pair_wise_losses, perm_reduce=None, **kwargs):
        r"""Trial docstring

        Args:
            pair_wise_losses: there is a dot in the beginning.
            perm_reduce: And it's automatic.

        Returns
            - :class:`torch.Tensor`:
              The loss corresponding to the best permutation of size $(batch,)$. and
              if I keep typing? It works? 

            - :class:`torch.Tensor`:
              The indices of the best permutations.
        """

@mpariente
Collaborator Author

I tried napoleon_type_aliases and couldn't make it work out of the box.

By the way, do you know how to automatically have anchor points for each class/function that is documented?

mpariente added a commit to asteroid-team/asteroid-filterbanks that referenced this pull request Nov 30, 2020
@mpariente mpariente merged commit 080a1a0 into master Nov 30, 2020
@mpariente mpariente deleted the doc_fixes branch November 30, 2020 14:03
@jonashaag
Collaborator

By the way, do you know how to automatically have anchor points for each class/function that is documented?

No, sorry :( My best advice is to look at other projects' documentation and try to copy their config :-D

Finally found a way that worked:

This is nice; if it works out OK, I think it would be great to add it to Stack Overflow.
