
perf(diagnostic): avoid first send #1075

Merged · 1 commit merged into LuaLS:master from perf-diagnostic on Apr 19, 2022

Conversation

kevinhwang91
Contributor

If the cache is equal to the full result, we should avoid notifying the client during the first stage.

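In other words, when the cached diagnostics already equal the newly computed full result, the first-stage notification can be skipped. A minimal sketch of that check, assuming hypothetical names (`sendDiagnostics`, `sameDiagnostics`) rather than the server's actual diagnostic code:

```lua
-- Rough illustration only: cache the last result that was pushed and skip
-- the first-stage send when the new full result is identical.
local cache = {}

local function sameDiagnostics(a, b)
    if #a ~= #b then
        return false
    end
    for i = 1, #a do
        local x, y = a[i], b[i]
        if x.message ~= y.message
        or x.code ~= y.code
        or x.range.start.line ~= y.range.start.line then
            return false
        end
    end
    return true
end

local function sendDiagnostics(uri, fullResult)
    -- If the cache is equal to the full result, avoid notifying the client.
    if cache[uri] and sameDiagnostics(cache[uri], fullResult) then
        return
    end
    cache[uri] = fullResult
    -- the real server would push textDocument/publishDiagnostics here
end
```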
@sumneko merged commit e81ffc1 into LuaLS:master Apr 19, 2022
@kevinhwang91 deleted the perf-diagnostic branch April 19, 2022 16:56
@sumneko
Collaborator

sumneko commented Apr 22, 2022

I have reverted this PR; it conflicts with feature #946.

@kevinhwang91
Contributor Author

kevinhwang91 commented Apr 22, 2022

Why? As of 30f1cea, I can't reproduce an issue like #946 in coc.nvim or nvimlsp.

@sumneko
Collaborator

sumneko commented Apr 22, 2022

Hmm, that was a problem with my description.
It's actually because this commit introduced a bug:
when a file previously contained diagnostics, but the diagnostics were later fixed,

  1. Line 259 first sends the previous diagnostic result (but it is blocked by the cache)
  2. Line 263 empties the cache
  3. Line 288 sends the clear-diagnostics notification (but it is blocked by the cache)

Since VSCode caches the previous diagnostic result on the client side, and for this file change the server never sends a new diagnostic result at all, the stale diagnostics for this file never go away.

The root cause is that the server-side "cache" carries two meanings at once, the server cache and the client cache, and checking the current state inside the diagnostics code to pre-clear the cache is rather confusing. So my follow-up commits made two changes:

  1. Added a third state, `false`, to the cache, meaning it blocks neither sending diagnostics nor clearing them (see the sketch below).
  2. Changed the file open event to set the cache to the third state. This way only opening a file is slightly wasteful, since one extra full diagnostic result may be sent (wasteful only for VSCode), and the implementation is simplified. However, this relies on the assumption that every client keeps diagnostics after the text changes; if a client exists that clears diagnostics immediately after a text change, the cache would also have to be set to the third state in the update event.
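A minimal sketch of that three-state cache, assuming hypothetical names (`publish`, `onFileOpen`, `sameDiagnostics`, `notifyClient`); this is a simplification of the idea, not the actual lua-language-server implementation:

```lua
-- cache[uri] == nil    -> nothing was ever sent; an empty clear can be skipped
-- cache[uri] == table  -> last diagnostics pushed; identical results are skipped
-- cache[uri] == false  -> third state: block neither a send nor a clear
local cache = {}

local function sameDiagnostics(a, b)
    -- Simplified for the sketch; the real check would compare each entry.
    return #a == #b
end

local function notifyClient(uri, diagnostics)
    -- Stand-in for the LSP textDocument/publishDiagnostics notification.
    print(('publish %d diagnostics for %s'):format(#diagnostics, uri))
end

local function onFileOpen(uri)
    -- Third state: the first full result after opening always reaches the
    -- client, even if it looks identical to what the server last sent.
    cache[uri] = false
end

local function publish(uri, diagnostics)
    local last = cache[uri]
    if last == nil and #diagnostics == 0 then
        return -- nothing was ever sent, so there is nothing to clear
    end
    if last and sameDiagnostics(last, diagnostics) then
        return -- unchanged result, skip the redundant notification
    end
    cache[uri] = diagnostics
    notifyClient(uri, diagnostics)
end
```

Here `false` lets both the next send and the next clear through, while `nil` still suppresses a pointless clear for files that never had diagnostics sent.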

@kevinhwang91
Contributor Author

Understood. I originally opened this PR because the coc client renders diagnostics in batches when there are a lot of them.

When a single change clears all diagnostics, the server sends the previous n diagnostics and then 0 diagnostics. If n is large, the first notification is rendered in batches, and the second notification with 0 diagnostics immediately clears everything. But coc's rendering is done in VimScript, where cancellation is hard to handle, so diagnostics from the first notification that haven't finished rendering show up after the 0-diagnostics clear, i.e. stale diagnostic rendering. Since very few servers send diagnostics twice for a single change like this, I fixed it on the lua server side.

After your changes the problem is gone, thanks.
