Restore prover keys #87
Verification key is done in the tx construction branch; TODO: prover key.
The prover key is only necessary as a performance improvement. Proof transactions can also be constructed by computing the prover key on the fly, which currently doubles the time to generate proofs. Removing this from the tx construction epic, to be addressed in a (less-high-priority) performance improvement epic.
This is actually harder than I first thought, because the JS code needs to be bundled in as well (it's called by the prover). Bumping the estimate to 3 days.
The JS code doesn't need to be bundled after all; pickles can be modified to return the parts it needs in addition to the user code, which is just imported as usual. However, I learned that the prover key is huge (hundreds of MBs). We should investigate whether some parts of it can be recomputed quickly on demand.
Users are asking for this feature ("dramatic improvement to UX"): https://discord.com/channels/484437221055922177/1070570936799084554
Needs research & an RFC or mini-RFC.
Hundreds of MBs were mentioned, but is that the size of the raw data, or do we have a way to compress it? That is, if compression is possible at all and makes sense here.
Answering that is part of this task. I imagine the data is mostly field elements. If those are very structured, e.g. lots of 0s and 1s each stored in 32 bytes, then general-purpose compression could indeed make a huge difference. But I don't recommend that the developer doing this treat the prover index as a black-box blob of data.
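As a quick sanity check of the compression hypothesis above, here is a minimal sketch using Node's built-in `zlib`. It does not use the real prover index; it fabricates a stand-in buffer of 32-byte "field elements" that are mostly zero bytes (the structured case described in the comment), and measures how well general-purpose DEFLATE compresses it. The constants `FIELD_SIZE` and `numElements` are purely illustrative, not taken from the actual key format.

```typescript
import { deflateSync } from "node:zlib";

// Hypothetical stand-in for a serialized prover index:
// many 32-byte field elements, each mostly zero padding.
const FIELD_SIZE = 32;
const numElements = 100_000;

const raw = Buffer.alloc(FIELD_SIZE * numElements); // zero-initialized
for (let i = 0; i < numElements; i++) {
  // pretend each field element encodes a small value (0 or 1)
  // in its last byte, the rest being zeros
  raw[i * FIELD_SIZE + FIELD_SIZE - 1] = i % 2;
}

const compressed = deflateSync(raw);
console.log(`raw: ${raw.length} bytes, compressed: ${compressed.length} bytes`);
```

If the real data looked anything like this, the ratio would be dramatic; if instead the field elements are close to uniformly random (as cryptographic data often is), DEFLATE would gain almost nothing, which is exactly why the comment suggests not treating the index as an opaque blob.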
Motivation is described well here: #951