
[REVIEW]: LISC: A Python Package for Scientific Literature Collection and Analysis #1674

Closed
whedon opened this issue Aug 22, 2019 · 78 comments

Comments

@whedon (Collaborator) commented Aug 22, 2019

Submitting author: @TomDonoghue (Thomas Donoghue)
Repository: https://github.com/lisc-tools/lisc
Version: v0.1.1
Editor: @danielskatz
Reviewer: @timClicks, @linuxscout
Archive: 10.5281/zenodo.3462035

Status

[status badge]

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/7be5e75ec454ef0946841b02a80cd947"><img src="https://joss.theoj.org/papers/7be5e75ec454ef0946841b02a80cd947/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/7be5e75ec454ef0946841b02a80cd947/status.svg)](https://joss.theoj.org/papers/7be5e75ec454ef0946841b02a80cd947)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@timClicks & @linuxscout, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If you have any questions or concerns, please let @danielskatz know.

Please try to complete your review in the next two weeks.

Review checklist for @timClicks

Conflict of interest

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Version: v0.1.1
  • Authorship: Has the submitting author (@TomDonoghue) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the function of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Authors: Does the paper.md file include a list of authors with their affiliations?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

Review checklist for @linuxscout

Conflict of interest

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Version: v0.1.1
  • Authorship: Has the submitting author (@TomDonoghue) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the function of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Authors: Does the paper.md file include a list of authors with their affiliations?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?
@whedon (Collaborator, Author) commented Aug 22, 2019

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @timClicks, @linuxscout it looks like you're currently assigned to review this paper 🎉.

⭐️ Important ⭐️

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that with GitHub's default behaviour you will receive notifications (emails) for all reviews 😿.

To fix this, do the following two things:

  1. Set yourself as 'Not watching' at https://github.com/openjournals/joss-reviews:

[screenshot: repository watch settings]

  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications

[screenshot: notification settings]

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf
@whedon (Collaborator, Author) commented Aug 22, 2019

Attempting PDF compilation. Reticulating splines etc...
@whedon (Collaborator, Author) commented Aug 22, 2019

@danielskatz commented Aug 22, 2019

👋 @timClicks, @linuxscout - your job as a reviewer is to examine the paper and software repo, and to check off items in your review checklist as you are able to. When you find an issue where improvement is needed, either flag it here or open a new issue in the repository (mentioning this review issue). If you have any questions or problems, please let me know. (FYI, I'm mostly off-line this week and next and will be slower to respond than normal.)

@timClicks commented Aug 23, 2019

Thanks @danielskatz. I'm also away until the middle of next week. I will aim to have my review completed by next Friday.

@linuxscout commented Aug 29, 2019

I am working on the review.

@danielskatz commented Sep 1, 2019

👋 @linuxscout - thanks for your work so far. Are there issues that are preventing you from checking the remaining boxes? If so, please flag them here or with new issues in the repository (mentioning this review issue).

@danielskatz commented Sep 1, 2019

@whedon remind @timClicks in 6 days

@whedon (Collaborator, Author) commented Sep 1, 2019

Reminder set for @timClicks in 6 days

@linuxscout commented Sep 1, 2019

I am working on testing the software.
As for automated tests, there are none in the repository.

@linuxscout commented Sep 1, 2019

No automated tests
lisc-tools/lisc#22

@danielskatz commented Sep 1, 2019

As we say in the review criteria:

Authors are strongly encouraged to include an automated test suite covering the core functionality of their software.

  • Good: An automated test suite hooked up to an external service such as Travis-CI or similar
  • OK: Documented manual steps that can be followed to objectively check the expected functionality of the software (e.g. a sample input file to assert behavior)
  • Bad (not acceptable): No way for you the reviewer to objectively assess whether the software works

So we do not require automated tests (though we encourage them), but we do require some tests.

In this case, if there are manual tests that you think are sufficient "to objectively check the expected functionality of the software (e.g. a sample input file to assert behavior)", that's enough for this criterion to be checked.

@linuxscout commented Sep 1, 2019

I think that the tutorials work well as tests.

@TomDonoghue commented Sep 1, 2019

There are tests! There is a set of unit tests in the lisc/tests folder; so, from the repository home, they are in lisc/tests.

They are, I would say, often smoke tests, in that there is not always strong accuracy checking, but there are at least sanity-check tests.

They are written with pytest. If you want to run them locally, then, with pytest installed, from inside the lisc code folder (so in lisc/lisc) run:
pytest .

I realize now it's not really documented anywhere how to run the tests. Is there a convention on how / where to note tests and how to run them?

Tests are also integrated with travis:
https://travis-ci.org/lisc-tools/lisc
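
Putting the above together, the full set of local steps is something like this (just a sketch, assuming pytest is installed via pip and is not already available):

# clone the repository and move into the package folder
git clone https://github.com/lisc-tools/lisc
cd lisc/lisc

# install the test runner if needed, then run the suite
pip install pytest
pytest .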

@danielskatz commented Sep 1, 2019

You might add this info to the README, but certainly should add it to https://lisc-tools.github.io/lisc/, perhaps after installation (after you install, you test to make sure things are working...)

@linuxscout commented Sep 6, 2019

What should I do now?

@danielskatz commented Sep 6, 2019

What should I do now?

Nothing - your part is done, and thanks!

We're now waiting for @timClicks to do his review, check off his items, bring up needed changes, etc. He was away, but said he would have his review done around now.

I think we're also waiting for @TomDonoghue to add something about how to run the tests to the documentation.

@whedon (Collaborator, Author) commented Sep 7, 2019

👋 @timClicks, please update us on how your review is going.

@timClicks commented Sep 8, 2019

@danielskatz My recommendation: accept, subject to the test amendments that have been mentioned already :)

I've worked through the tutorials, and think that this is a very handy package.

In general, the API documentation is in an OK state, rather than a Good state (as per the review criteria). That reflects the relative immaturity of the project; lisc is in a much better state than other packages released at a 0.1 version.

As a side comment, the tests are written using a very common test framework for Python applications (pytest). However, pytest isn't included as a dependency in the requirements.txt file, so the test framework has to be installed manually.
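
One lightweight way to handle that (just a sketch; the file name requirements-dev.txt is hypothetical, and the project may prefer another convention) would be to list pytest in a separate development requirements file and install it before running the tests:

# contents of a hypothetical requirements-dev.txt
pytest

# install the development requirements before running the test suite
pip install -r requirements-dev.txt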


@TomDonoghue thanks for taking the time to submit this to JOSS.


@danielskatz @linuxscout @TomDonoghue I sincerely apologise for delaying this process.

@danielskatz commented Sep 8, 2019

👋 @TomDonoghue - so it seems like the only remaining action for you is to add something about how to run the tests to the documentation.

@TomDonoghue commented Sep 9, 2019

@whedon generate pdf

@whedon (Collaborator, Author) commented Sep 9, 2019

Attempting PDF compilation. Reticulating splines etc...
@whedon (Collaborator, Author) commented Sep 9, 2019

@TomDonoghue commented Sep 9, 2019

@whedon generate pdf

@whedon (Collaborator, Author) commented Sep 9, 2019

Attempting PDF compilation. Reticulating splines etc...
@danielskatz commented Sep 26, 2019

@whedon set v0.1.1 as version

@whedon (Collaborator, Author) commented Sep 26, 2019

OK. v0.1.1 is the version.

@danielskatz commented Sep 26, 2019

@whedon set 10.5281/zenodo.3462035 as archive

@whedon (Collaborator, Author) commented Sep 26, 2019

OK. 10.5281/zenodo.3462035 is the archive.

@danielskatz commented Sep 26, 2019

@whedon accept

@whedon (Collaborator, Author) commented Sep 26, 2019

Attempting dry run of processing paper acceptance...
@whedon (Collaborator, Author) commented Sep 26, 2019


OK DOIs

- 10.1145/2623330.2623667 is OK
- 10.1145/3097983.3098057 is OK
- 10.1016/j.jneumeth.2012.04.019 is OK

MISSING DOIs

- None

INVALID DOIs

- None
@whedon (Collaborator, Author) commented Sep 26, 2019

Check final proof 👉 openjournals/joss-papers#983

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#983, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.

@whedon accept deposit=true
@danielskatz commented Sep 26, 2019

@TomDonoghue commented Sep 26, 2019

@danielskatz - thanks for the wording updates / fixes! I have merged the PR.

@danielskatz commented Sep 26, 2019

@whedon generate pdf

@whedon (Collaborator, Author) commented Sep 26, 2019

Attempting PDF compilation. Reticulating splines etc...
@whedon (Collaborator, Author) commented Sep 26, 2019

@danielskatz commented Sep 26, 2019

@whedon accept

@whedon (Collaborator, Author) commented Sep 26, 2019

Attempting dry run of processing paper acceptance...
@whedon (Collaborator, Author) commented Sep 26, 2019


OK DOIs

- 10.1145/2623330.2623667 is OK
- 10.1145/3097983.3098057 is OK
- 10.1016/j.jneumeth.2012.04.019 is OK

MISSING DOIs

- None

INVALID DOIs

- None
@whedon (Collaborator, Author) commented Sep 26, 2019

Check final proof 👉 openjournals/joss-papers#984

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#984, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.

@whedon accept deposit=true
@danielskatz commented Sep 26, 2019

@whedon accept deposit=true

@whedon (Collaborator, Author) commented Sep 26, 2019

Doing it live! Attempting automated processing of paper acceptance...
@whedon added the accepted label Sep 26, 2019
@whedon (Collaborator, Author) commented Sep 26, 2019

🐦🐦🐦 👉 Tweet for this paper 👈 🐦🐦🐦

@whedon (Collaborator, Author) commented Sep 26, 2019

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 openjournals/joss-papers#985
  2. Wait a couple of minutes to verify that the paper DOI resolves https://doi.org/10.21105/joss.01674
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...

@danielskatz commented Sep 26, 2019

@TomDonoghue - this seems to be taking a bit longer than normal to have the DOI work - It should work eventually...

@danielskatz commented Sep 26, 2019

@timClicks & @linuxscout - thanks for reviewing!

@danielskatz commented Sep 26, 2019

👋 @arfon - any insight on the DOI delay here?

@danielskatz commented Sep 26, 2019

never mind - now working...

@arfon (Member) commented Sep 26, 2019

👋 @arfon - any insight on the DOI delay here?

Looks like it's resolving now.

@whedon (Collaborator, Author) commented Sep 26, 2019

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.01674/status.svg)](https://doi.org/10.21105/joss.01674)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.01674">
  <img src="https://joss.theoj.org/papers/10.21105/joss.01674/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.01674/status.svg
   :target: https://doi.org/10.21105/joss.01674

This is how it will look in your documentation:

[DOI status badge]

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following:

@TomDonoghue commented Sep 26, 2019

Thank you @danielskatz for managing this submission!
