
[REVIEW]: PyBIDS: Python tools for BIDS datasets #1294

Closed
whedon opened this issue Mar 3, 2019 · 87 comments

Comments

@whedon (Collaborator) commented Mar 3, 2019

Submitting author: @tyarkoni (Tal Yarkoni)
Repository: https://github.com/bids-standard/pybids
Version: 0.9.2
Editor: @cMadan
Reviewer: @grlee77
Archive: 10.5281/zenodo.3333492

Status


Status badge code:

HTML: <a href="http://joss.theoj.org/papers/2bc5802f0656395e0312f4243e4a61ce"><img src="http://joss.theoj.org/papers/2bc5802f0656395e0312f4243e4a61ce/status.svg"></a>
Markdown: [![status](http://joss.theoj.org/papers/2bc5802f0656395e0312f4243e4a61ce/status.svg)](http://joss.theoj.org/papers/2bc5802f0656395e0312f4243e4a61ce)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@grlee77, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.theoj.org/about#reviewer_guidelines. If you have any questions or concerns, please let @cMadan know.

Please try to complete your review in the next two weeks.

Review checklist for @grlee77

Conflict of interest

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository URL?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Version: 0.9.2
  • Authorship: Has the submitting author (@tyarkoni) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the function of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Authors: Does the paper.md file include a list of authors with their affiliations?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

@whedon (Collaborator, Author) commented Mar 3, 2019

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @grlee77 it looks like you're currently assigned as the reviewer for this paper 🎉.

⭐️ Important ⭐️

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that, with GitHub's default behaviour, you will receive notifications (emails) for all reviews 😿

To fix this, do the following two things:

  1. Set yourself as 'Not watching' for https://github.com/openjournals/joss-reviews.

  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications

For a list of things I can do to help you, just type:

@whedon commands

@whedon (Collaborator, Author) commented Mar 3, 2019

Attempting PDF compilation. Reticulating splines etc...

@whedon (Collaborator, Author) commented Mar 3, 2019

@cMadan (Member) commented Mar 3, 2019

@tyarkoni, I see that you have listed your authors as "[LastName], [FirstName]". I'm not sure if that's actually against our rules, but normally authors are just listed as "[FirstName] [LastName]". Is there a particular reason you decided to list them this way?

@tyarkoni commented Mar 4, 2019

I adapted it from the repo's .zenodo.json file. Convention for that seems to be [LastName], [FirstName].

@cMadan (Member) commented Mar 10, 2019

@arfon, do you have any thoughts on the author list formatting here? I don't recall any prior JOSS papers using "[LastName], [FirstName]" formatting, but you'd know better than me--and whether we want to require a specific formatting (e.g., for consistency).

@grlee77, let me know if you have any questions in starting the review!

@arfon (Member) commented Mar 10, 2019

@arfon, do you have any thoughts on the author list formatting here? I don't recall any prior JOSS papers using "[LastName], [FirstName]" formatting, but you'd know better than me--and whether we want to require a specific formatting (e.g., for consistency).

Yes, please have author names as a single string (firstname lastname) with any middle initials, e.g. Arfon M Smith

@grlee77 commented Mar 12, 2019

I opened a couple of minor issues:
bids-standard/pybids#415
bids-standard/pybids#416

and commented on (bids-standard/pybids#245) after an initial read-through of the docs and examples. I plan to finish the checklist and review by tomorrow.

@grlee77 commented Mar 13, 2019

It looks like the approach here was to include all contributors to the repository. I just had a question about whether all coauthors have already actively confirmed their desire to be included.

@cMadan: If past contributors have not been notified and have not explicitly agreed to be included in the publication, then it seems they would need to be removed from the list? The JOSS author guidelines state:

"The authors themselves assume responsibility for deciding who should be credited with co-authorship, and co-authors must always agree to be listed. In addition, co-authors agree to be accountable for all aspects of the work."

edit: On second glance, it seems that the number of authors is a little less than the total number of contributors, so perhaps the list has already been pruned? I am just curious about what the decision mechanism was for inclusion, or whether there are any unintentional omissions.

@grlee77 commented Mar 13, 2019

The submitted paper is well written and nicely summarizes the capabilities of the software. The proposed software is likely to be useful to any researchers working with data in the BIDS format and complements an increasingly diverse ecosystem of Python-based neuroimaging analysis software.

The software functions as advertised. I was able to run the included example notebook without issues and tried out some other variations on the examples given in the documentation.

Aside from the minor documentation issues already raised above, the following minor issues in relation to the review criteria above were found:

1.) In regards to the "Community guidelines" criterion, I did not see any mention about how to contribute or raise issues/get help in the documentation. This is likely self-evident to those already working with GitHub/open source, but it would be useful to include something brief to guide new contributors.

2.) Some of the references are missing a DOI in paper.bib and the rendered PDF.

3.) The software is tested using automated CI across platforms and Python versions, but there are currently a handful of test failures on the master branch (AppVeyor is listed as failing on the main project page).

@grlee77 commented Mar 13, 2019

The following are not specific to the review checklist above, but are two things that were not entirely clear to me as a new user when going through the examples in the documentation.

1.) There was an example using the path_patterns argument to the build_path method of a BIDSLayout. However, the API docs do not seem to explicitly describe the expected syntax for this argument.

2.) In the documentation section on generating reports, it is mentioned that the reports are returned as a collections.Counter, but it isn't mentioned why this is a useful format, and there is not much detail on how it should typically be used. The example shown just uses

main_report = counter.most_common()[0][0]

but does not comment on why the indexing [0][0] is needed, or when one would typically want or need to use the other elements returned by most_common.
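For context, Counter.most_common() returns a list of (element, count) pairs sorted by decreasing count, so the [0][0] indexing picks out the text of the single most frequent report. A minimal sketch with made-up report strings (not actual pybids output):

```python
from collections import Counter

# Hypothetical mapping from generated report text to the number of
# subjects that report describes.
counter = Counter({
    "Report text shared by most subjects ...": 18,
    "Report text for subjects scanned with a variant protocol ...": 2,
})

pairs = counter.most_common()  # [(report_text, count), ...], most frequent first
main_report = pairs[0][0]      # [0] -> most frequent pair, [0] -> its report text
```

Calling most_common(n) with an argument would instead limit the list to the n most frequent reports.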

@tyarkoni commented Mar 19, 2019

@grlee77, the authors did not actively confirm their desire to be included; instead, I gave people the option of opting out (and none did). Does that work? I can also repeat the request, this time requesting opt-in, but I'm concerned that there may be some people who contributed to the repo a while back and aren't paying close attention (I know from experience that there can be a mental hurdle in following up on such requests, even if it only takes a minute or two). Similarly, for some of the authors, I have no contact information aside from a GitHub handle, and a few folks don't seem to be active.

@tyarkoni commented Mar 19, 2019

1.) In regards to the "Community guidelines" criterion, I did not see any mention about how to contribute or raise issues/get help in the documentation. This is likely self-evident to those already working with GitHub/open source, but it would be useful to include something brief to guide new contributors.

Thanks for catching this. I've added a "Community guidelines" section to the top-level README (see bids-standard/pybids#418).

2.) Some of the references are missing a DOI in paper.bib and the rendered PDF.

This is fixed now; see rebuilt PDF below.

3.) The software is tested using automated CI across platforms and Python versions, but there are currently a handful of test failures on the master branch (AppVeyor is listed as failing on the main project page).

We currently don't officially support Windows, so this isn't technically an issue for us (though I realize it doesn't look great). We haven't removed the AppVeyor tests in the hopes that someone might spontaneously come along and fix this for us (it will likely require a fairly thorough dive through the codebase), but I've added a note to the README indicating the scope of official support.

There was an example using the path_patterns argument to the build_path method of a BIDSLayout. However, the API docs do not seem to explicitly describe the expected syntax for this argument.

This was documented elsewhere (in the writing module), but I've copied the relevant part into the build_path docstring.
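For illustration, a rough sketch of the kind of call being discussed; the dataset path and entity values are made up, and the pattern syntax (curly braces for entities, square brackets for optional segments) is my paraphrase of the writing-module documentation, so it should be checked against the current build_path docstring:

```python
from bids import BIDSLayout

# Hypothetical local BIDS dataset root.
layout = BIDSLayout("/data/ds000117")

# Entities describing the file we want a path for.
entities = {"subject": "01", "session": "mri", "task": "facerecognition",
            "suffix": "bold"}

# Curly braces name entities; square-bracketed chunks are dropped when the
# corresponding entity is absent.
pattern = ("sub-{subject}[/ses-{session}]/func/"
           "sub-{subject}[_ses-{session}]_task-{task}_{suffix}.nii.gz")

print(layout.build_path(entities, path_patterns=[pattern]))
```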

2.) In the documentation section on generating reports, it is mentioned that the reports are returned as a collections.Counter, but it isn't mentioned why this is a useful format, and there is not much detail on how it should typically be used.

@tsalo, could you comment on this and/or make any required changes? Thanks!

@tyarkoni referenced this issue Mar 19, 2019: Joss review #418 (merged)

@tyarkoni commented Mar 19, 2019

@whedon generate pdf from branch joss-review

@whedon (Collaborator, Author) commented Mar 19, 2019

Attempting PDF compilation from custom branch joss-review. Reticulating splines etc...

@whedon (Collaborator, Author) commented Mar 19, 2019

@tyarkoni commented Mar 19, 2019

@grlee77 I think I've now addressed all of the comments in this thread and on the pybids repo, pending one minor issue that I've asked @tsalo (who authored the code in question) to look at. The joss-review branch collects all changes. Please let me know if there's anything else. Thanks for your effort and excellent feedback!

@tsalo commented Mar 19, 2019

BIDSReport returns a Counter because missing data or variable acquisition parameters across participants can lead to different reports. With a Counter, users can access the most common report, which is likely to be the most accurate in cases with missing data, or they can access everything if different participants had different protocols. I can add this rationale to the documentation somewhere or can make BIDSReport only return the most common report, if you'd like.
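To make that concrete, here is a minimal sketch of how a user might consume such a Counter, assuming only what is described above (report text mapped to the number of subjects it covers); the helper name is invented for illustration:

```python
from collections import Counter

def summarize_reports(counter: Counter) -> None:
    """Print the most common report plus any variants.

    `counter` maps report text to the number of subjects it describes.
    """
    # The most common report is usually the best single description of the
    # dataset, especially when some subjects have missing data.
    main_report, n_subjects = counter.most_common()[0]
    print(f"Main report ({n_subjects} subjects):\n{main_report}\n")

    # Any remaining reports flag subjects acquired with a different protocol.
    for report, n in counter.most_common()[1:]:
        print(f"Variant report ({n} subjects):\n{report}\n")
```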

@tyarkoni commented Mar 19, 2019

I think adding that rationale to the docstring would work. Feel free to add to the joss-review branch (I added you as a collaborator).

@grlee77 commented Mar 19, 2019

@grlee77, the authors did not actively confirm their desire to be included; instead, I gave people the option of opting out (and none did). Does that work?

That seems fine to me, but let's check with @cMadan if that is typically okay for JOSS?

@effigies commented Mar 19, 2019

As another data point, there was a metadata gathering thread (bids-standard/pybids#308) for Zenodo. There were a couple people who never responded, but none who requested not to be authors. I recognize Zenodo isn't JOSS, but if you're looking for active consent to be considered authors on the software (if not the paper), that thread is a worthwhile one.

@grlee77 commented Mar 19, 2019

Thanks for addressing the issues @tyarkoni. I reviewed the changes in the joss-review branch and they look good to me. I just checked off the remaining items in the review checklist.

We currently don't officially support Windows, so this isn't technically an issue for us (though I realize it doesn't look great).

I suppose you could remove the AppVeyor badge from the README until support is officially added, but it's not a big deal either way.

@cMadan (Member) commented Mar 23, 2019

@grlee77, the authors did not actively confirm their desire to be included; instead, I gave people the option of opting out (and none did). Does that work?

Most of the other JOSS submissions I've reviewed had shorter author lists and everyone was aware of their inclusion.

@openjournals/joss-eics, is there any precedent for having JOSS co-authorship be opt-out as described here?

@arfon (Member) commented Mar 23, 2019

@openjournals/joss-eics, is there any precedent for having JOSS co-authorship be opt-out as described here?

Not that I know of. I'd be curious to hear how you notified them @tyarkoni?

@tyarkoni commented Mar 23, 2019

I opened a PR in the repo soliciting feedback, plus we had the earlier discussion related to the Zenodo DOI @effigies linked to above. I could have sworn I explicitly mentioned authorship, but reading the various issues and PRs over, it appears not. Sorry about that!

Happy to raise the issue more explicitly, and I will tag everyone by name this time. I still feel pretty strongly that making it opt-in will result in several people who are happy to be authors and have made meaningful contributions being left off, but if you prefer opt-in to opt-out, I'm fine with that. We could also trial the former and then make a decision based on feedback.

[Edit: FWIW, this seems like an issue that will arise again in future; e.g., I have other repos with a lot of contributors that I plan to submit to JOSS at some point. So it might make sense to address this explicitly in the guidelines one way or the other.]

@tyarkoni commented Mar 23, 2019

Okay, see the request here. I haven't indicated exactly what we'll take a non-response to mean, in the hopes of pushing people to respond, but let's see what happens.

@arfon (Member) commented Jul 31, 2019

@arfon — Are we OK with an author-list discrepancy between the JOSS paper and Zenodo archive, for reasons described above?

Yes, that’s fine.

@cMadan (Member) commented Jul 31, 2019

@effigies, thanks for the detailed response. I think the only thing remaining then is for @tyarkoni to update the title in Zenodo.

@effigies commented Aug 9, 2019

Turns out, @tyarkoni doesn't have the magic touch for Zenodo. It's @yarikoptic who started the archive (bids-standard/pybids#299), so only he has the power.

Yarik, would you mind updating the title of https://zenodo.org/record/3333492 (specifically that one) to "PyBIDS: Python tools for BIDS datasets"?

@yarikoptic commented Aug 9, 2019

  • How do I share the super powers?
  • I updated the title for that one and for 0.9.3 -- clicked Save -- but I do not see that reflected anywhere.

@effigies commented Aug 9, 2019

I haven't been able to find a way to share super powers on my repositories. We should probably open an issue with Zenodo, because this is extremely inconvenient for projects that don't have a BDFL.

Edit: This was reported by a familiar face two years ago. zenodo/zenodo#1325

Related to zenodo/zenodo#35, zenodo/zenodo#151, zenodo/zenodo#163

@effigies commented Aug 9, 2019

@yarikoptic I wonder if hitting "Publish" instead of "Save" would make a difference.

@effigies commented Aug 10, 2019

@cMadan I'm going to see if I can update the title through our metadata (bids-standard/pybids#470). In which case, we'll go with the 0.9.4 tag.

@yarikoptic commented Aug 12, 2019

@yarikoptic I wonder if hitting "Publish" instead of "Save" would make a difference.

Sorry for the delay!!! I just did it and it published. But now it doesn't show the version in the title. I wonder if I should add it (PyBIDS: Python tools for BIDS datasets (0.9.2)) and also adjust subsequent release records?

@effigies commented Aug 12, 2019

@yarikoptic The version is listed on the page, so I'm not too worried.

@cMadan WDYT?

@cMadan (Member) commented Aug 12, 2019

@yarikoptic @effigies, I don't think the version needs to be in the title since it's prominent elsewhere.

@cMadan (Member) commented Aug 12, 2019

@whedon set 0.9.3 as version

@whedon (Collaborator, Author) commented Aug 12, 2019

OK. 0.9.3 is the version.

@cMadan (Member) commented Aug 12, 2019

@whedon set 0.9.2 as version

@whedon (Collaborator, Author) commented Aug 12, 2019

OK. 0.9.2 is the version.

@cMadan (Member) commented Aug 12, 2019

@whedon set 10.5281/zenodo.3333492 as archive

@whedon (Collaborator, Author) commented Aug 12, 2019

OK. 10.5281/zenodo.3333492 is the archive.

@cMadan (Member) commented Aug 12, 2019

@openjournals/joss-eics, I think we're all set to accept here!

@danielskatz commented Aug 12, 2019

@whedon accept

@whedon (Collaborator, Author) commented Aug 12, 2019

Attempting dry run of processing paper acceptance...

@whedon (Collaborator, Author) commented Aug 12, 2019

Check final proof 👉 openjournals/joss-papers#896

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#896, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.

@whedon accept deposit=true

@danielskatz commented Aug 12, 2019

@whedon accept deposit=true

@whedon (Collaborator, Author) commented Aug 12, 2019

Doing it live! Attempting automated processing of paper acceptance...

@whedon added the accepted label Aug 12, 2019

@danielskatz commented Aug 12, 2019

Thanks to @grlee77 for reviewing and to @cMadan for editing

@whedon (Collaborator, Author) commented Aug 12, 2019

🐦🐦🐦 👉 Tweet for this paper 👈 🐦🐦🐦

@whedon (Collaborator, Author) commented Aug 12, 2019

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 openjournals/joss-papers#897
  2. Wait a couple of minutes to verify that the paper DOI resolves https://doi.org/10.21105/joss.01294
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...

@whedon (Collaborator, Author) commented Aug 12, 2019

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README, use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.01294/status.svg)](https://doi.org/10.21105/joss.01294)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.01294">
  <img src="https://joss.theoj.org/papers/10.21105/joss.01294/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.01294/status.svg
   :target: https://doi.org/10.21105/joss.01294

This is how it will look in your documentation:

[DOI badge]

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following:
