Clarify docs and maybe refine example for larger than mem with duckdb #51

Closed
drizk1 opened this issue Aug 7, 2024 · 0 comments

drizk1 (Member) commented Aug 7, 2024

The example right now fits into memory, but interestingly, reading the files from a URL has much longer latency than copying them into a local database first.

This link outlines how to query larger-than-RAM files:

SET memory_limit = '4GB';
SET temp_directory = '/tmp/duckdb_swap';
SET max_temp_directory_size = '100GB';

If you pass a table name from the database directly to db_table after applying the settings above, it works correctly. A minimal sketch of that setup is shown below.
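For reference, a rough sketch of how the settings above could be applied from Julia with DuckDB.jl; the database file name and table name are hypothetical, and the same connection could then be handed to db_table:

using DuckDB, DBInterface

# Sketch only: assumes a local DuckDB database file "mydata.duckdb"
# that already holds a table named large_table (both names made up).
con = DBInterface.connect(DuckDB.DB, "mydata.duckdb")

# Cap DuckDB's memory use and let it spill intermediate results to disk,
# so queries over larger-than-RAM tables can still complete.
DBInterface.execute(con, "SET memory_limit = '4GB';")
DBInterface.execute(con, "SET temp_directory = '/tmp/duckdb_swap';")
DBInterface.execute(con, "SET max_temp_directory_size = '100GB';")

# Query the table by name; DuckDB spills to the temp directory as needed.
res = DBInterface.execute(con, "SELECT COUNT(*) FROM large_table;")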

I clarified the docs and added a link. We could make a larger_than_mem equivalent that does this automatically, rather than just leaving the Julia / DuckDB code in the example.

Edit: after further reading, the slowdown turned out to be reading from a URL rather than a local file. I reverted most of the changes, kept the link, and added a note about URL vs. local access.
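To illustrate the URL vs. local point, a hedged sketch (the URL and table name are made up) of copying a remote Parquet file into a local DuckDB database once, so later queries avoid the per-query network latency:

using DuckDB, DBInterface

# Sketch only; the URL and table name are hypothetical.
con = DBInterface.connect(DuckDB.DB, "local.duckdb")

# httpfs is DuckDB's extension for reading files over HTTP(S).
DBInterface.execute(con, "INSTALL httpfs;")
DBInterface.execute(con, "LOAD httpfs;")

# Materialize the remote file into a local table once...
DBInterface.execute(con, "CREATE TABLE flights AS SELECT * FROM read_parquet('https://example.com/flights.parquet');")

# ...then later queries hit the local copy and skip the URL latency.
res = DBInterface.execute(con, "SELECT COUNT(*) FROM flights;")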

drizk1 closed this as completed Aug 8, 2024