
[REVIEW]: LISC: A Python Package for Scientific Literature Collection and Analysis #1674

Open
whedon opened this issue Aug 22, 2019 · 42 comments


@whedon commented Aug 22, 2019

Submitting author: @TomDonoghue (Thomas Donoghue)
Repository: https://github.com/lisc-tools/lisc
Version: v0.1.0
Editor: @danielskatz
Reviewer: @timClicks, @linuxscout
Archive: Pending

Status


Status badge code:

HTML: <a href="https://joss.theoj.org/papers/7be5e75ec454ef0946841b02a80cd947"><img src="https://joss.theoj.org/papers/7be5e75ec454ef0946841b02a80cd947/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/7be5e75ec454ef0946841b02a80cd947/status.svg)](https://joss.theoj.org/papers/7be5e75ec454ef0946841b02a80cd947)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@timClicks & @linuxscout, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If you have any questions or concerns, please let @danielskatz know.

Please try to complete your review in the next two weeks.

Review checklist for @timClicks

Conflict of interest

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository URL?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Version: Does the release version given match the GitHub release (v0.1.0)?
  • Authorship: Has the submitting author (@TomDonoghue) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the function of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Authors: Does the paper.md file include a list of authors with their affiliations?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

Review checklist for @linuxscout

Conflict of interest

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository URL?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Version: Does the release version given match the GitHub release (v0.1.0)?
  • Authorship: Has the submitting author (@TomDonoghue) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the function of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Authors: Does the paper.md file include a list of authors with their affiliations?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?
@whedon commented Aug 22, 2019

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @timClicks, @linuxscout it looks like you're currently assigned to review this paper 🎉.

⭐️ Important ⭐️

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that with GitHub's default behaviour you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' for https://github.com/openjournals/joss-reviews
  2. You may also like to change your default settings for watching repositories in your GitHub profile: https://github.com/settings/notifications

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf
@whedon commented Aug 22, 2019

Attempting PDF compilation. Reticulating splines etc...

@danielskatz commented Aug 22, 2019

👋 @timClicks, @linuxscout - your job as a reviewer is to examine the paper and software repo, and to check off items in your review list as possible. When you have some issue where improvement is needed, either flag it here or with a new issue in the repository (mentioning this review issue). If you have any questions or problems, please let me know. (FYI, I'm mostly off-line this week and next and will be slower to respond than normal.)

@timClicks commented Aug 23, 2019

Thanks @danielskatz. I'm also away until the middle of next week. I will aim to have my review completed by next Friday.

@linuxscout commented Aug 29, 2019

I am working on the review.

@danielskatz commented Sep 1, 2019

👋 @linuxscout - thanks for your work so far. Are there issues that are preventing you from checking the remaining boxes? If so, please flag them here or with new issues in the repository (mentioning this review issue).

@danielskatz commented Sep 1, 2019

@whedon remind @timClicks in 6 days

@whedon commented Sep 1, 2019

Reminder set for @timClicks in 6 days

@linuxscout commented Sep 1, 2019

I am working on testing the software.
As for automated tests, I did not find any in the repository.

@linuxscout commented Sep 1, 2019

No automated tests
lisc-tools/lisc#22

@danielskatz commented Sep 1, 2019

As we say in the review criteria:

Authors are strongly encouraged to include an automated test suite covering the core functionality of their software.

  • Good: An automated test suite hooked up to an external service such as Travis-CI or similar
  • OK: Documented manual steps that can be followed to objectively check the expected functionality of the software (e.g. a sample input file to assert behavior)
  • Bad (not acceptable): No way for you the reviewer to objectively assess whether the software works

So we do not require automated tests (though we encourage them), but we do require some tests.

In this case, if there are manual tests that you think are sufficient "to objectively check the expected functionality of the software (e.g. a sample input file to assert behavior)", that's enough for this criterion to be checked.

@linuxscout commented Sep 1, 2019

I think the tutorials work well as tests.

@TomDonoghue commented Sep 1, 2019

There are tests! There is a set of unit tests in the folder lisc/tests; so, from the repository home, they are in lisc/tests.

They are, I would say, often smoke tests, in that there is not always strong accuracy checking, but there are at least sanity-check tests.

They are written with pytest. To run them locally, with pytest installed, run the following from inside the lisc code folder (so in lisc/lisc):
pytest .

I realize now it's not really documented anywhere how to run the tests. Is there a convention on how / where to note tests and how to run them?

Tests are also integrated with travis:
https://travis-ci.org/lisc-tools/lisc
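For readers unfamiliar with the term, a smoke test of the kind described above checks that code runs and returns something of the right shape, without strict accuracy checking. A minimal illustrative sketch (the function count_terms is a hypothetical stand-in, not part of the lisc API):

```python
# Illustrative pytest-style smoke test. `count_terms` is a hypothetical
# stand-in for a literature-collection function; it is NOT from lisc.

def count_terms(terms):
    """Hypothetical collection function: returns a count per search term."""
    return {term: 0 for term in terms}

def test_count_terms_smoke():
    # Sanity checks only: right type, right keys, non-negative values -
    # no strong accuracy checking, in the spirit of a smoke test.
    result = count_terms(["neuroscience", "python"])
    assert isinstance(result, dict)
    assert set(result) == {"neuroscience", "python"}
    assert all(count >= 0 for count in result.values())
```

With pytest installed, saving this as e.g. test_smoke.py and running `pytest .` in its folder would discover and run it automatically.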

@danielskatz commented Sep 1, 2019

You might add this info to the README, but certainly should add it to https://lisc-tools.github.io/lisc/, perhaps after installation (after you install, you test to make sure things are working...)

@linuxscout commented Sep 6, 2019

What should I do now?

@danielskatz commented Sep 6, 2019

What should I do now?

Nothing - your part is done, and thanks!

We're now waiting for @timClicks to do his review, check off his items, bring up needed changes, etc. He was away, but said he would have his review done around now.

I think we're also waiting for @TomDonoghue to add something about how to run the tests to the documentation.

@whedon commented Sep 7, 2019

👋 @timClicks, please update us on how your review is going.

@timClicks commented Sep 8, 2019

@danielskatz My recommendation: accept, subject to the test amendments that have been mentioned already :)

I've worked through the tutorials, and think that this is a very handy package.

In general, the API documentation is in an OK state, rather than a Good state (as per the review criteria). That reflects the relative immaturity of the project; lisc is in a much better state than many other packages released at version 0.1.

As a side comment, the tests are written using a very common test framework for Python applications (pytest). However, it isn't included as a dependency in the requirements.txt file, so the test framework must be installed manually.


@TomDonoghue thanks for taking the time to submit this to JOSS.


@danielskatz @linuxscout @TomDonoghue I sincerely apologise for delaying this process.

@danielskatz commented Sep 8, 2019

👋 @TomDonoghue - so it seems like the only remaining action for you is to add something about how to run the tests to the documentation.

@TomDonoghue commented Sep 9, 2019

@whedon generate pdf

@whedon commented Sep 9, 2019

Attempting PDF compilation. Reticulating splines etc...

@TomDonoghue commented Sep 9, 2019

@whedon generate pdf

@whedon commented Sep 9, 2019

Attempting PDF compilation. Reticulating splines etc...

@TomDonoghue commented Sep 9, 2019

@whedon generate pdf

@whedon commented Sep 9, 2019

Attempting PDF compilation. Reticulating splines etc...

@TomDonoghue commented Sep 9, 2019

Test Coverage

To document the tests, I have added a section to the README (which is built into the homepage of the documentation site) that mentions the tests, how to run them, and the pytest test dependency.

I'm actually not too sure what is considered convention / best practice for documenting tests & test coverage - so please let me know if this isn't the best answer! I haven't, for example, listed 'pytest' in requirements.txt (it doesn't seem to be standard to do so?). Would best practice be to list pytest in the requirements, even if it's not needed to use the package? Pytest is noted in setup.py. And/or, more generally, if anyone has suggestions on additional or different ways to mention tests, please let me know!
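One common convention for this is to declare test-only dependencies as a setuptools "extra", so pytest can be installed on demand without being a runtime requirement. A sketch of the relevant setup() keyword arguments (package name and dependency lists here are illustrative, not taken from lisc's actual setup.py):

```python
# Sketch of setup() keyword arguments declaring pytest as a test-only
# "extra" rather than a runtime requirement. All names are illustrative,
# not taken from lisc's setup.py.

SETUP_KWARGS = {
    "name": "example-package",
    "version": "0.1.0",
    "install_requires": ["requests"],      # runtime dependencies only
    "extras_require": {
        # Installed only on request: pip install "example-package[tests]"
        "tests": ["pytest"],
    },
}

# A plain runtime install never pulls in pytest...
assert "pytest" not in SETUP_KWARGS["install_requires"]
# ...but installing the "tests" extra does.
assert "pytest" in SETUP_KWARGS["extras_require"]["tests"]
```

With this layout, a plain pip install stays lean, while `pip install "example-package[tests]"` (or `pip install -e ".[tests]"` from a clone) also pulls in pytest.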

API Documentation
I totally agree that the level of API documentation is middle ground, and it could use more examples with inputs / outputs, etc. That's definitely on the list of things to keep updating (lisc-tools/lisc#20).

Paper Notes
There weren't any specific comments on the paper itself. I did a last sweep through, with some very minor edits for wording, layout, and minor reference fixes, and rebuilt the paper above. I hope it's okay to do some last tweaks - I just wanted to make sure nothing was weird there before it gets locked in. Let me know if there are any comments on the wording, etc - happy to tweak / fix up!

Thanks!

A huge thank you to @linuxscout & @timClicks for doing the review, and @danielskatz for managing! Please let me know if there is anything else I should do!

@danielskatz commented Sep 9, 2019

👋 @linuxscout & @timClicks - Please let me know your comments on the changes described above

@timClicks commented Sep 10, 2019

@danielskatz LGTM. I've reviewed the new paper and the new README. Both look fine.

@danielskatz commented Sep 13, 2019

👋 @linuxscout - Please let me know your comments on the changes described above

@danielskatz commented Sep 17, 2019

👋 @linuxscout - Please let me know your comments on the changes described above
(last chance, then I'll assume this is ok with you based on your previous comments and your checking of all the items on your list)

@linuxscout commented Sep 18, 2019

Hi,
I think the updates are okay.
Thanks.

@danielskatz commented Sep 18, 2019

👋 @TomDonoghue - to finish up now, I need you to

  1. deposit the repository in an archive (e.g. zenodo) and to report the DOI here
  2. tell me the latest version number here
  3. confirm that the paper text and references are final
@danielskatz commented Sep 18, 2019

@whedon generate pdf

@whedon commented Sep 18, 2019

Attempting PDF compilation. Reticulating splines etc...

@danielskatz commented Sep 18, 2019

@whedon check references

@whedon commented Sep 18, 2019

Attempting to check references...
@whedon commented Sep 18, 2019


OK DOIs

- 10.1145/2623330.2623667 is OK
- 10.1145/3097983.3098057 is OK
- 10.1016/j.jneumeth.2012.04.019 is OK

MISSING DOIs

- None

INVALID DOIs

- None