
Dark-mode to PyVista-Brain (pair-programmed with @GuillaumeFavelier) #9149

Merged — 14 commits merged into mne-tools:main from dark_brain on Mar 22, 2021

Conversation

@marsipu (Member) commented Mar 18, 2021

This follows PR #9126.

What does this implement/fix?

This adds a theme parameter to Brain with the PyVista renderer, accepting 'dark', 'light', and 'auto'. Dark mode depends on qdarkstyle, and automatic detection works with darkdetect.
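For context, a hypothetical usage sketch of the new parameter; the import path, subject name, and other Brain arguments are assumptions that vary with the MNE version, not the exact merged API:

```python
# Hypothetical sketch of the new option; 'sample', hemi, and surf are
# placeholders, and further arguments (e.g. subjects_dir) are elided.
from mne.viz import Brain

brain_dark = Brain('sample', hemi='lh', surf='pial', theme='dark')  # force dark mode
brain_auto = Brain('sample', hemi='lh', surf='pial', theme='auto')  # detect via darkdetect
```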

Comment on lines 260 to 261:

    to style the widgets. Automatic detection does not yet work on
    Linux. Can also be a path-like to a custom style sheet.
Member:

I would open an upstream issue with darkdetect for this if they don't have one. In the meantime I would use:

    gsettings get org.gnome.desktop.interface gtk-theme

and if it ends in -dark, assume it's dark. It at least works on my machine™ :)
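A minimal sketch of that fallback (the helper name is illustrative, not the code that was merged): shell out to gsettings and check for a -dark suffix.

```python
import subprocess

def _gtk_theme_is_dark():
    """Best-effort dark-mode check for GNOME; returns False if unknown."""
    try:
        out = subprocess.check_output(
            ['gsettings', 'get', 'org.gnome.desktop.interface', 'gtk-theme'],
            text=True,
        )
    except (OSError, subprocess.CalledProcessError):
        return False  # gsettings missing or errored: assume light
    # gsettings prints the value quoted, e.g. 'Adwaita-dark'
    return out.strip().strip("'").lower().endswith('-dark')
```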

@marsipu (Member, Author): @cbrnr opened an issue

Resolved review threads: mne/viz/_brain/_brain.py, mne/viz/backends/_qt.py
@GuillaumeFavelier (Contributor) commented:

@marsipu I pushed a commit to reorder theme and take #9149 (comment) into account.

@marsipu (Member, Author) commented Mar 18, 2021:

> @marsipu I pushed a commit to reorder theme and take #9149 (comment) into account.

Thank you @GuillaumeFavelier. I also thought we could maybe shorten the documentation text a bit more, like this?

@GuillaumeFavelier (Contributor) commented:

> we maybe could shorten the documentation-text a bit more like this?

Sounds good 👍

Comment on the docstring:

    theme : str | path-like
        Can be "auto" (default), "light", or "dark" or a path-like to a
        custom stylesheet. For Dark-Mode and automatic Dark-Mode-Detection,
        `qdarkstyle` respectively `darkdetect` is required.

Contributor:

CircleCI seems to have some issues with the new modules:

    /home/circleci/project/mne/viz/_brain/_brain.py:docstring of mne.viz._brain._brain.Brain:97: WARNING: py:obj reference target not found: qdarkstyle
    /home/circleci/project/mne/viz/_brain/_brain.py:docstring of mne.viz._brain._brain.Brain:97: WARNING: py:obj reference target not found: darkdetect

The docstring probably needs to be updated:

Suggested change:

    - `qdarkstyle` respectively `darkdetect` is required.
    + :mod:`qdarkstyle` respectively :mod:`darkdetect` is required.

And the intersphinx_mapping list in doc/conf.py needs to be updated too.
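A sketch of the doc/conf.py side of that change, assuming qdarkstyle publishes an intersphinx inventory at the Read the Docs URL mentioned later in this thread:

```python
# doc/conf.py (sketch): add qdarkstyle to the intersphinx inventory so that
# :mod:`qdarkstyle` cross-references resolve; darkdetect has no Sphinx docs
# (per the discussion below), so it is left out here.
intersphinx_mapping = {
    # ... existing entries ...
    'qdarkstyle': ('https://qdarkstylesheet.readthedocs.io/en/latest/', None),
}
```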

@marsipu (Member, Author) commented Mar 18, 2021:

Somehow when I test it locally, the :mod: phrases don't turn into links leading to the websites. It is supposed to do that, right? Do I maybe need to import the intersphinx references into `_brain.py`?

@larsoner (Member) commented:

The :mod: should not be necessary, I think. The problem is more likely that these modules are not in our intersphinx inventory in doc/conf.py.

https://qdarkstylesheet.readthedocs.io/en/latest/reference/qdarkstyle.html

I don't see a Sphinx documentation page for darkdetect, so I would just make that one code with double backticks rather than try to link it.

@larsoner (Member) commented:

(Or you could do a custom HTML link to the darkdetect PyPI or GitHub page if you want.)
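Putting those suggestions together, the docstring line might read something like this (illustrative wording, not necessarily what was merged):

```rst
theme : str | path-like
    Can be "auto" (default), "light", or "dark", or a path-like to a
    custom stylesheet. Dark mode requires :mod:`qdarkstyle`; automatic
    detection requires ``darkdetect``.
```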

Resolved review threads: doc/conf.py, mne/viz/_brain/_brain.py
@marsipu (Member, Author) commented Mar 19, 2021:

The failed test in mne/decoding/tests/test_transformer.py::test_scaler is probably unrelated, right?

@GuillaumeFavelier (Contributor) commented:

This is what I saw:

=================================== FAILURES ===================================
_________________________________ test_scaler __________________________________
mne/decoding/tests/test_transformer.py:67: in test_scaler
    assert_allclose(X * stds[:, np.newaxis] + means[:, np.newaxis],
E   AssertionError: 
E   Not equal to tolerance rtol=1e-12, atol=1e-20
E   mean
E   Mismatched elements: 23576 / 23576 (100%)
E   Max absolute difference: 2.00587641e-11
E   Max relative difference: 42.17865626
E    x: array([[[-1.005809e-13, -1.005809e-13, -1.005809e-13, ...,
E            -1.005809e-13, -1.005809e-13, -1.005809e-13],
E           [ 2.151313e-13,  2.151313e-13,  2.151313e-13, ...,...
E    y: array([[[ 2.454723e-12,  1.490368e-12, -4.383434e-13, ...,
E            -1.402699e-12, -2.367054e-12, -4.383434e-13],
E           [-1.681645e-12, -8.432133e-12, -6.503422e-12, ...,...

I don't think it's related, and I don't see it happening on master (82b3459). Maybe you can rebase, @marsipu?

@GuillaumeFavelier (Contributor) left a review comment:

I tried it locally and it works as expected. I'm +1 for merge once CIs come back happy 👍

[Screenshots: light, dark, and custom™ themes]

@drammock (Member) commented:

> This is what I saw:

I've seen that on another PR too; agree it's unrelated.

@larsoner (Member) commented:

Indeed, on albertosottile/darkdetect#8 and this branch, after installing qdarkstyle I get:

[Screenshot from 2021-03-19 13-28-55]

Resolved review thread: mne/viz/backends/_qt.py
@hoechenberger (Member) commented:

CIs are green – @marsipu @GuillaumeFavelier, is this good to merge?

@marsipu (Member, Author) commented Mar 22, 2021:

We also talked about inverting the icon pixmaps with the theme, and haven't done that yet. I am adding it now, but it may need another review from @GuillaumeFavelier.
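A minimal sketch of what such an inversion could look like with PyQt5 (the helper name and approach are assumptions, not the merged code):

```python
from PyQt5.QtGui import QIcon, QImage, QPixmap

def inverted_icon(icon, size=32):
    """Return a copy of ``icon`` with its RGB channels inverted for dark mode."""
    image = icon.pixmap(size, size).toImage()
    image.invertPixels()  # inverts RGB only; the alpha channel is preserved
    return QIcon(QPixmap.fromImage(image))
```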

@larsoner (Member) commented:

Yeah, I think that's reasonable, assuming it's not too difficult.

@larsoner (Member) left a review comment:

Can you add a brain_kwargs=None to SourceEstimate.plot, VectorSourceEstimate.plot, and VolSourceEstimate.plot_3d? See for example how we handle add_data_kwargs; the difference is that brain_kwargs should be passed to the Brain constructor.

Resolved review threads: doc/changes/latest.inc, mne/viz/backends/_qt.py
(Commit co-authored by Eric Larson <larson.eric.d@gmail.com>)
@marsipu (Member, Author) commented Mar 22, 2021:

> Can you add a brain_kwargs=None to SourceEstimate.plot, VectorSourceEstimate.plot, and VolSourceEstimate.plot_3d? See for example how we handle add_data_kwargs; the difference is that brain_kwargs should be passed to the Brain constructor.

I am sorry, I can't follow you: I already see the parameter brain_kwargs, with default None, documented and set up to pass kwargs on to the renderer (mne/viz/_3d.py, lines 1846 and 1857) for all the mentioned plot methods. Or do you mean something else?
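For illustration, a hedged sketch of using that existing parameter together with the new theme option; stc is assumed to be a loaded SourceEstimate and subjects_dir a FreeSurfer subjects directory:

```python
# Sketch: extra keyword arguments in brain_kwargs are forwarded to the
# Brain constructor when plotting a SourceEstimate.
brain = stc.plot(
    subject='sample',
    subjects_dir=subjects_dir,
    hemi='both',
    brain_kwargs=dict(theme='dark'),  # the theme added in this PR
)
```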

@larsoner (Member) commented:

Oh, I didn't realize we already had that param, never mind!

@larsoner merged commit acf25ef into mne-tools:main on Mar 22, 2021
@larsoner (Member) commented:

Thanks @marsipu @GuillaumeFavelier!

@marsipu deleted the dark_brain branch on Mar 22, 2021 at 20:04
@agramfort (Member) commented:

😎

@hoechenberger (Member) commented:

Super good!

@GuillaumeFavelier (Contributor) commented:

Awesome work @marsipu 👌
