
Mongo Memory Limit #708

Closed
brianlball opened this issue Jul 3, 2023 · 1 comment

@brianlball (Contributor)

The mongo memory limit has appeared again.

[screenshot: Mongo memory-limit error]

This is because the show() method in /controller/analysis_controller calls the search() method in /model/analysis, which queries MongoDB and builds a Mongoid::Criteria via the .where() method from the Mongoid gem.

That query is lazily executed later, in the simulations.each loop of the analysis view, and can run out of memory if the .dataSize() of the simulations in the DB exceeds the MONGO_MEM environment variable, which sets the internalQueryMaxAddToSetBytes parameter in the various docker-compose.yml files used for deployments.
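The failure mode hinges on lazy evaluation: building the criteria is cheap, but iterating it is what pulls the result set and consumes memory. A minimal pure-Ruby sketch of the same pattern, using Enumerator::Lazy as a stand-in for Mongoid::Criteria (no Mongo involved, purely illustrative):

```ruby
# Sketch only: Enumerator::Lazy standing in for Mongoid::Criteria.
# Building the "query" does no work; iteration is what touches the data.

executed = []

# Analogous to Analysis.search calling .where(...) -- nothing runs yet.
simulations = (1..1_000_000).lazy.select do |id|
  executed << id      # side effect so we can observe when work happens
  id.odd?
end

puts executed.size    # 0 -- the criteria is built but not yet executed

# Analogous to the view's simulations.each -- this is where memory is used.
first_three = simulations.first(3)
puts first_three.inspect   # [1, 3, 5]
puts executed.size         # 5 -- only as much work as iteration demanded
```

This is why the controller and model look innocent in isolation: the out-of-memory failure only surfaces in the view, at iteration time.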

MongoDB's recommended fix is to set allowDiskUse: true, similar to what we already added here in the Datapoint.collection.aggregate() call. That option cannot be passed to .where() queries, however, since it is not an allowable argument there, and converting the .where() calls into .aggregate() pipelines would be a real pain and require significant changes.

Starting in MongoDB 6.0, allowDiskUse: true is the default for queries. This may cost some performance on large queries, but it will keep the analysis GUI page from crashing.
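For reference, a hypothetical docker-compose.yml fragment showing how the MONGO_MEM wiring described above might look after the bump (service name, image tag, and the MONGO_MEM default are assumptions, not taken from the actual deployment files):

```yaml
# Hypothetical fragment -- service name and MONGO_MEM default are assumptions.
services:
  mongo:
    image: mongo:6.0.7
    command: >
      mongod
      --setParameter internalQueryMaxAddToSetBytes=${MONGO_MEM:-268435456}
```

Note that on 6.0 no extra flag should be needed for the disk-spill behavior itself, since allowDiskUse defaults to true server-side; only the internalQueryMaxAddToSetBytes override is carried forward.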

So, bump Mongo from 4.0.4 to 6.0.7, which will also require bumping Mongoid from 7.2 to 7.4.3 according to the compatibility chart.
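The Gemfile side of that bump, as a sketch (the exact version constraint is an assumption to verify against the app's actual Gemfile and the compatibility chart):

```ruby
# Gemfile sketch -- constraint is an assumption per the compatibility chart.
gem 'mongoid', '~> 7.4.3'   # 7.4.x supports MongoDB 6.0
```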

Note: this issue appeared in a large analysis where the --cli_debug and --cli_verbose options were left on, which put 7 MB of OpenStudio CLI output into the sdp_log_file for each datapoint; that file is stored in MongoDB. After 50 datapoints this exceeded the query limit for the DB and crashed the webpage. The analysis still completed, but the user experience was not great. On debug settings, the simulate datapoint log contains the 'registry' information for each Measure; in this case a discrete variable with 15,000 values was written to the log several times, once per measure, which is why the datapoint entry was so large. We should look at ways to move the simulate datapoint log out of the database and provide it to the user a different way, but that can be a new issue.

@brianlball brianlball self-assigned this Jul 3, 2023
@brianlball (Contributor, Author)

#709
