
[REVIEW]: PyEscape #2072

Closed · whedon opened this issue Feb 5, 2020 · 76 comments

@whedon commented Feb 5, 2020

Submitting author: @SirSharpest (Nathan Hughes)
Repository: https://github.com/SirSharpest/NarrowEscapeSimulator
Version: 1.0
Editor: @drvinceknight
Reviewer: @pdebuyl, @markgalassi
Archive: 10.5281/zenodo.3725946

Status

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/c47ec67686a14361072ed703a58bac15"><img src="https://joss.theoj.org/papers/c47ec67686a14361072ed703a58bac15/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/c47ec67686a14361072ed703a58bac15/status.svg)](https://joss.theoj.org/papers/c47ec67686a14361072ed703a58bac15)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@pdebuyl & @markgalassi, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @drvinceknight know.

Please try and complete your review in the next two weeks

Review checklist for @pdebuyl

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@SirSharpest) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

Review checklist for @markgalassi

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@SirSharpest) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@whedon commented Feb 5, 2020

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @pdebuyl, @markgalassi it looks like you're currently assigned to review this paper 🎉.

⭐️ Important ⭐️

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that, with GitHub's default behaviour, you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' at https://github.com/openjournals/joss-reviews.
  2. You may also like to change your default settings for watching repositories in your GitHub profile: https://github.com/settings/notifications

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf

@whedon commented Feb 5, 2020

Reference check summary:

OK DOIs

- 10.1007/s10955-004-5712-8 is OK
- 10.1073/pnas.0706599104 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@whedon commented Feb 5, 2020

@pdebuyl commented Feb 6, 2020

@drvinceknight I can't tick the boxes in the review checklist. From memory, I was able to edit the checklist directly in my earlier work with JOSS. Is there anything that I should do to get the authorization?

@drvinceknight commented Feb 7, 2020

> @drvinceknight I can't tick the boxes in the review checklist. From memory, I was able to edit the checklist directly in my earlier work with JOSS. Is there anything that I should do to get the authorization?

That's strange, I've just checked and I can tick them so I'd assume you can because I believe you have all the necessary authorisation. Could you double check and also perhaps try on a different browser?

@pdebuyl commented Feb 7, 2020

My bad, @drvinceknight: I needed to renew my invitation to the reviewers group. I thought that it would not be needed for reviewers who had already served.

@drvinceknight commented Feb 7, 2020

No problem, glad it's sorted 👍

@pdebuyl commented Feb 13, 2020

Hi @SirSharpest,

I had a first look at the program. I believe that the documentation needs to be improved; I have written a first review below. I suggest that you take a look at existing scientific codes on GitHub, preferably ones with a JOSS paper, to get a good impression of what is expected. The purpose of the review is to eventually tick all the boxes, so I can provide more information if needed.

Functionality

Installation: Does installation proceed as outlined in the documentation?

pip install does not work because of permissions. Instead of sudo pip install (which users could easily find by googling), I suggest proposing pip install --user instead.

Functionality: Have the functional claims of the software been confirmed?

I ran the Example notebook with success. I would like to have at least a few benchmark cases.

Documentation

A statement of need: Do the authors clearly state what problems the software is designed to
solve and who the target audience is?

The problem is well stated. The target audience is not.

Installation instructions: Is there a clearly-stated list of dependencies? Ideally these
should be handled with an automated package management solution.

A requirements.txt file is missing. This is the most standard way to state dependencies for
Python projects.

Example usage: Do the authors include examples of how to use the software (ideally to solve
real-world analysis problems).

There is one example notebook. I suggest improving the example by adding a textbook-type case with known numerical output.

Functionality documentation: Is the core functionality of the software documented to a
satisfactory level (e.g., API method documentation)?

No. There are no docstrings and no module-wide documentation page. The purpose of the routines is not documented explicitly; only the example, the names of the functions, and the source code provide that information.

Automated tests: Are there automated tests or manual steps described so that the
functionality of the software can be verified?

There is limited testing. The test checks against an upper limit, so an output time of zero would pass it. There is no theoretical estimate provided for the test. Playing with the test case repeatedly, I had a 1% failure rate.
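
One way to tie such a test to a theoretical value is sketched below. It assumes that the leading-order narrow-escape estimate τ ≈ V/(4·D·a), for a single small circular pore of radius a, is an appropriate reference for the package's geometry; the helper functions and the synthetic sample data are illustrative stand-ins, not part of PyEscape:

```python
import numpy as np

def narrow_escape_estimate(D, v, a):
    """Leading-order narrow-escape time for a single small circular pore of radius a
    on a container of volume v, with diffusion coefficient D: tau ~ v / (4 * D * a)."""
    return v / (4.0 * D * a)

def check_mean_escape_time(samples, D, v, a, rel_window=0.5):
    """Two-sided check of simulated escape times against the analytic estimate.
    A pure upper bound would also be satisfied by an (incorrect) output of zero."""
    expected = narrow_escape_estimate(D, v, a)
    mean = np.asarray(samples, dtype=float).mean()
    assert (1.0 - rel_window) * expected < mean < (1.0 + rel_window) * expected
    return mean, expected

# Example with synthetic "simulated" data; real samples would come from the package.
rng = np.random.default_rng(0)
fake_samples = rng.exponential(scale=narrow_escape_estimate(400.0, 1.0, 0.1), size=1000)
check_mean_escape_time(fake_samples, D=400.0, v=1.0, a=0.1)
```

With n independent runs the Monte Carlo error on the mean shrinks like 1/√n, so a generous two-sided window keeps the expected false-failure rate low while still rejecting a spurious zero.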

Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute
to the software 2) Report issues or problems with the software 3) Seek support

There is a contribution section in the readme. It only applies to new features and not to bug reports, other issues, or support requests.

@SirSharpest commented Feb 13, 2020

@pdebuyl Thank you for the detailed response, and for taking the time to review. I will address these issues and report back within the next week.

@SirSharpest commented Feb 14, 2020

I have just pushed a series of commits which I feel address most of the concerns:

Functionality

  • A --user flag has been added
  • Installation uses setuptools to install and set up the application and its requirements. A requirements.txt would be redundant as setuptools handles the packages
  • The examples notebook and the readme.md have been updated to reflect the software's functionality

Documentation

  • A sentence was added to the readme.md to give use-cases. However, we see this software as having use-cases that we currently do not foresee, as it can be used in various branches of research
  • Documentation relating to installation has been revised
  • Our examples notebook now uses a real-world example from published work
  • Docstrings have been added to all functions
  • Automated tests for the major modules and features have been implemented
  • Tests with an occasional failure rate due to the simulated times have been flagged with an appropriate pytest decorator (see the sketch after this list)
  • Added information on contributing and on contacting us with requests
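
The comment above does not say which decorator was used; purely as an illustrative sketch (not the project's actual test code), one core-pytest way to flag a stochastic test that occasionally fails is a non-strict xfail marker:

```python
import random

import pytest

# Hypothetical example of flagging a stochastic test: with a mean escape time of
# 0.00625, the 0.02 bound fails roughly 4% of the time, and the non-strict xfail
# marker keeps such occasional failures from breaking the test suite.
@pytest.mark.xfail(reason="simulated escape time occasionally exceeds the bound", strict=False)
def test_escape_time_upper_bound():
    simulated = random.expovariate(1 / 0.00625)  # stand-in for a simulated escape time
    assert simulated < 0.02
```

Rerun-based markers from plugins such as pytest-rerunfailures are an alternative, at the cost of an extra test dependency.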

Other

I am happy to take on board any other issues that reviewers may have and will address them promptly. Thank you again for your valuable input.

@SirSharpest commented Feb 14, 2020

I would also like to add that we are currently preparing to submit a paper which will cite this software; when it is published we will be able to provide additional real-world solutions. Until then, we are reluctant to add further examples, to avoid self-plagiarism and to avoid removing the novelty from our current research.

@drvinceknight commented Feb 17, 2020

@markgalassi apologies for the nudge, do you know when you might have a moment to carry out your review?

@pdebuyl commented Feb 20, 2020

@SirSharpest the DOI for the JOSS paper should already be known. @drvinceknight, is it OK to cite a JOSS paper before its acceptance?

@SirSharpest commented Feb 20, 2020

I don't think it would be appropriate to cite before acceptance; the DOI currently seems to be inactive.

@drvinceknight commented Feb 21, 2020

Yes, the DOI is not yet an actual representation of the software, as it might still be modified. I'm still waiting to hear back from @markgalassi, but if I don't hear soon I will look for another reviewer.

@markgalassi commented Feb 21, 2020

I just got back from travel and dove into the paper. I am afraid that it is clearly not a candidate for anything yet. The simplest of things (following the procedure in README.md) fails:
pores = fibonacci_spheres(p, v)

gives:

NameError: name 'p' is not defined

Clearly some variable p should be set, but in a minimal example, which is the first thing shown, this is not given!

Please have the author double-check that the README.md works and then we can get back to work on evaluating this software.
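
For reference, a minimal sketch of the kind of preamble the README example appears to be missing. Everything here is a guess: the meanings of p and v (number of pores and container volume) are inferred from the -p and -v command-line flags mentioned later in this thread, and fibonacci_sphere_points is a self-contained stand-in for the package's fibonacci_spheres, based on the classic Fibonacci-lattice placement of points on a sphere:

```python
import numpy as np

def fibonacci_sphere_points(n, v=1.0):
    """Stand-in for the package's fibonacci_spheres: place n approximately evenly
    spaced points on the surface of a sphere of volume v using a Fibonacci lattice."""
    r = (3.0 * v / (4.0 * np.pi)) ** (1.0 / 3.0)  # sphere radius from its volume
    i = np.arange(n)
    golden_angle = np.pi * (3.0 - np.sqrt(5.0))
    z = 1.0 - 2.0 * (i + 0.5) / n                  # evenly spaced heights in [-1, 1]
    rho = np.sqrt(1.0 - z * z)
    theta = golden_angle * i
    return r * np.stack([rho * np.cos(theta), rho * np.sin(theta), z], axis=1)

# The assignments the README example appears to need before the call that failed:
p = 1    # assumed: number of escape pores to place
v = 1.0  # assumed: container volume

pores = fibonacci_sphere_points(p, v)
```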

@markgalassi commented Feb 21, 2020

Let me also add a comment on the idea of notebooks as documentation: I know they are much in vogue, but there are people (like me) who insist on reproducible procedures for running Python code in batch. This means that I am not in the habit of loading notebooks, and if I am presented with documentation as a notebook I want a command-line invocation that will load all of that. So pointing to a directory is not enough: please give a command to load that notebook up on a standard GNU/Linux system.

@markgalassi reopened this Feb 21, 2020

@markgalassi commented Feb 21, 2020

(Sorry: I clicked the wrong button and "close"d the issue; I think that this reopens it.)

@SirSharpest commented Feb 24, 2020

Thanks for your comments.

I have added the required lines to the readme.

Re notebooks: for the purpose of documentation and providing an example of how to use the software, the integrated GitHub viewer should be enough. The notebook is presented as a method without results (beyond solving a known problem), so I wouldn't expect running it to be strictly necessary. That said, most users will want to run notebooks in their own way, and depending on your current setup you should be able to run jupyter-notebook <notebook name> from your command line.

Though, if you want the command-line experience, then running

narrow_escape -D 400 -v 1 -a 0.1 -p 1 -N 1000 -dt 2e-9

# "narrow_escape --help" can give arguments info

from bash (provided you have installed the library) will give the same results as the example notebook, just without the plots.

@pdebuyl commented Feb 26, 2020

Thank you for the update. I have already ticked many boxes in the review checklist. Some items remain to be improved, especially given the focus of JOSS on best practices (including testing).

  1. The comment "base install of python (not recommended)" is superfluous.

  2. The tests are not well defined. For instance, the test for the sphere placement routine does not check the distance of the pore but only that it falls below a much larger upper bound.

  3. The tests use floating-point equality, which is not a reliable way to perform numerical tests. The most convenient solutions are either NumPy's testing routines (which include "closeness" tests) or pytest's approx feature (see the sketch at the end of this comment).

  4. The escape test now does not include a bound. The earlier version was better, but an estimate for this time must be given; as it stands, only a value of "0" would fail the test.

  5. scipy is not used in the code; it should be removed from install_requires.

  6. There are unused imports and unused variables in the test files; they should be removed.

  7. The duplicate "%notebook" directive does not work as intended in the notebook example (the directive can only be used once per notebook).

  8. The optional argument max_steps does not limit the range of searches that exit the container. Picking a dt that is too large results in infinite loops.

  9. If max_steps is reached, the escape routine returns max_steps*dt instead of an obviously erroneous value such as 0 or np.nan (also sketched at the end of this comment).

I filed pull requests for 7 and part of 6.
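
To make items 3 and 9 concrete, here is a minimal sketch; the numeric values, the step callable, and the escape_time helper are illustrative stand-ins rather than PyEscape's actual API:

```python
import numpy as np
import pytest

# Item 3: compare floating-point results with a tolerance instead of strict equality.
def test_escape_time_close_to_estimate():
    simulated = 0.00631    # stand-in for a simulated mean escape time
    theoretical = 0.00625  # stand-in for the analytic estimate
    # Either of these is preferable to `simulated == theoretical`:
    np.testing.assert_allclose(simulated, theoretical, rtol=0.05)
    assert simulated == pytest.approx(theoretical, rel=0.05)

# Item 9: when the step budget runs out, return an obviously erroneous value
# instead of max_steps * dt, which looks like a legitimate escape time.
def escape_time(step, max_steps, dt):
    """Illustrative random-walk loop; `step()` advances the walker and returns
    True once it has escaped through a pore."""
    for n in range(max_steps):
        if step():
            return n * dt
    return np.nan  # signal failure to escape within max_steps
```

Returning np.nan has the side benefit that any mean computed over a batch containing a failed run becomes NaN, so the failure cannot silently masquerade as a fast escape.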

@markgalassi commented Mar 9, 2020

Further testing: thanks @SirSharpest for addressing the issues I reported previously. Smaller ones now:

  1. Since you document how to run before you document installation, you should slip in an "after installing (according to the instructions below)" here and there.
  2. Because of the Python community's sad failure in updating the world to Python 3, you might want to mention that on many systems users might have to run "pip3 install .".
  3. The second test Python program fails with:
$ python3 t2.py 
Traceback (most recent call last):
  File "t2.py", line 9, in <module>
    pores = fibonacci_spheres(p, v)
NameError: name 'p' is not defined

So p and dt are not defined. This is the usual problem when notebook people try to write reproducible and deliverable software and documentation :-)
So the author should definitely test all those examples.

Otherwise my earlier problems seem to have been addressed and I am continuing with the checklist.

@SirSharpest commented Mar 10, 2020

Again, thank you for the comments and suggestions, particularly regarding testing and the errors in the examples; I was running the examples sequentially, so variables were carried over between them.

I've also updated the readme to remove the unnecessary text.
I'm reluctant to mention pip3 commands, simply because I state twice that this is a Python 3 library and adding pip3 would cause errors for some setups. For example:

zsh> which pip
/Users/nathan/anaconda3/bin/pip
zsh> which pip3
/usr/local/bin/pip3
zsh> conda activate myenv; which pip
/Users/nathan/anaconda/envs/bin/pip

  • I've updated several tests to use the NumPy floating-point closeness functions.
  • Removed scipy from the setup.py.
  • Cleaned up some unused imports in the tests.
  • Added a test for the max_steps limit being hit.
  • Reaching max_steps now returns a zero value (I am also working on raising warnings for this case).
  • Some tests still need to be addressed (in progress).

Pull requests for minor fixes have been accepted.

@markgalassi commented Mar 11, 2020

I have checked most of the boxes in the paper checklist. Here is a small suggested diff (below) so you don't assume that users are steeped in markdown. I would suggest that the "statement of need" in the readme.md file could do with a bit more, and that the paper add a math citation at the point where it states "The mathematical models provided are simple and robust @iWouldCiteMySourceHere". I also have not yet checked the "state of the field" checkbox: in the paper you discuss the Schuss and Holcman papers, but there is no reference to existing software. You say yours is "novel", but you do not say that there is no other "narrow escape" software.

And another nit: when I did a "git diff" after running your notebook procedure I saw a diff on a date in the notebook. Remember: notebooks are not reproducible and they are not entirely human-written source, so it is flawed to add them to a version-control repo. I would not hold up the paper on this, but I would recommend that you put reproducibility and best version-control practices front and center: commit a .py file, then provide a simple one-liner to load it into a notebook for the notebook types.

Other than those simple suggestions, I think you meet the criteria, and I would quickly finish this off.

And if I might add a personal note: your project is interesting! It made me want to read up on narrow escape and take an interest in the topic.

diff --git a/paper.md b/paper.md
index c78409e..bd9404c 100644
--- a/paper.md
+++ b/paper.md
@@ -1,3 +1,9 @@
+---
+# process with:
+# pandoc --bibliography paper.bib -o paper.pdf paper.md
+# pandoc --standalone --bibliography paper.bib -o paper.html paper.md
+---
+
 ---
 title: "PyEscape: A narrow escape problem simulator package for python"
 tags:

@drvinceknight commented Mar 31, 2020

@whedon set 10.5281/zenodo.3725946 as archive

@whedon commented Mar 31, 2020

OK. 10.5281/zenodo.3725946 is the archive.

@drvinceknight commented Mar 31, 2020

@whedon accept

@whedon commented Mar 31, 2020

Attempting dry run of processing paper acceptance...

@whedon commented Mar 31, 2020

Reference check summary:

OK DOIs

- 10.1109/MCSE.2007.55 is OK
- 10.1109/MCSE.2011.37 is OK
- 10.1007/s10955-004-5712-8 is OK
- 10.1073/pnas.0706599104 is OK
- 10.1098/rsif.2008.0014 is OK
- 10.1016/j.jcpx.2019.100047 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@whedon commented Mar 31, 2020

👋 @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof 👉 openjournals/joss-papers#1400

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#1400, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.

@whedon accept deposit=true

@drvinceknight commented Mar 31, 2020

Sorry for the delay @SirSharpest, I've recommended acceptance. Thank you @pdebuyl and @markgalassi for your time and effort reviewing this work: it's really appreciated.

@danielskatz commented Mar 31, 2020

Thanks - I'll work on finishing this

@danielskatz commented Mar 31, 2020

@SirSharpest - please update the archive's metadata so that the title matches the paper title

@danielskatz commented Mar 31, 2020

Additionally, I've suggested some changes to the paper in SirSharpest/NarrowEscapeSimulator#6

@danielskatz commented Mar 31, 2020

And I've suggested some changes to the references in SirSharpest/NarrowEscapeSimulator#7

@danielskatz commented Mar 31, 2020

@whedon generate pdf

@whedon commented Mar 31, 2020

@SirSharpest commented Mar 31, 2020

Hi @danielskatz, thank you for the changes. I've reviewed and merged them, and have just updated the Zenodo title.

@danielskatz commented Mar 31, 2020

@whedon accept

@whedon commented Mar 31, 2020

Attempting dry run of processing paper acceptance...

@whedon commented Mar 31, 2020

Reference check summary:

OK DOIs

- 10.3233/978-1-61499-649-1-87 is OK
- 10.1109/MCSE.2007.55 is OK
- 10.1109/MCSE.2011.37 is OK
- 10.1007/s10955-004-5712-8 is OK
- 10.1073/pnas.0706599104 is OK
- 10.1098/rsif.2008.0014 is OK
- 10.1016/j.jcpx.2019.100047 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@whedon commented Mar 31, 2020

👋 @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof 👉 openjournals/joss-papers#1401

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#1401, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.

@whedon accept deposit=true

@danielskatz commented Mar 31, 2020

@whedon accept deposit=true

@whedon commented Mar 31, 2020

Doing it live! Attempting automated processing of paper acceptance...

@whedon commented Mar 31, 2020

🐦🐦🐦 👉 Tweet for this paper 👈 🐦🐦🐦

@whedon commented Mar 31, 2020

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 openjournals/joss-papers#1402
  2. Wait a couple of minutes to verify that the paper DOI resolves https://doi.org/10.21105/joss.02072
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...

@danielskatz commented Mar 31, 2020

Thanks to @pdebuyl & @markgalassi for reviewing, and @drvinceknight for editing!

And congratulations to @SirSharpest and co-authors!

@whedon commented Mar 31, 2020

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.02072/status.svg)](https://doi.org/10.21105/joss.02072)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.02072">
  <img src="https://joss.theoj.org/papers/10.21105/joss.02072/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.02072/status.svg
   :target: https://doi.org/10.21105/joss.02072

This is how it will look in your documentation:

[DOI badge]

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following:
