Can I use Plotly graphs with a vaex dataframe? #2020
Comments
See the earlier thread "Vaex with Dash", which covers your error.
Thank you so much for your reply. I'm passing the names of my dataframe (df) columns as parameters, but it doesn't work.
@sanaeO Could you send sample data for dfvx?
You can't use Plotly Express directly with a vaex dataframe. This has been discussed elsewhere on these boards. If you'd like Plotly Express to support vaex dataframes, please raise this on the Plotly side; there were some efforts on this.
In the meantime, the easiest approach is to convert the vaex dataframe to pandas (see the docs) after the groupby operation. In most cases, data for visualization should fit in memory anyway.
As Jovan said, converting to pandas first should work.
If plotly/plotly.py#3387 is merged in the future, this won't be needed anymore.
I want to use a vaex dataframe with Plotly Express to make a Dash app, but I don't know if I can do this:
df = dfvx.groupby((dfvx.PRO, dfvx.AGE), agg='count')
scatter = px.scatter(df,
                     size="PRO", color="AGE",
                     hover_name="PRO", log_x=True, size_max=50)
The error:
ValueError: Value of 'size' is not the name of a column in 'data_frame'. Expected one of [0] but received: count
If there is a solution, let me know. Thank you!