Consider adding an offline cache? #26
This plugin already maintains an offline "cache" of the catalog itself, although it does not cache individual schemas. Currently, we use a GitHub action to poll the SchemaStore catalog on an hourly basis; if the catalog has changed, we re-generate the cached catalog. Off the top of my head, caching individual schemas could work in one of two ways:
Each approach has pros and cons:
Approach 1:
Approach 2:
Happy to hear any of your thoughts.
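Whichever of the two approaches above wins out, both would need the same basic primitive: fetch a schema once and keep a copy on disk. A rough, minimal sketch of that primitive (assuming `curl` is available; the cache directory and the `cache_schema` name are made up for illustration and are not part of the plugin):

```lua
-- Illustrative only: a per-schema download cache under Neovim's cache dir.
local cache_dir = vim.fn.stdpath('cache') .. '/schemastore'

local function cache_schema(url)
  vim.fn.mkdir(cache_dir, 'p')
  -- Derive a stable filename from the schema URL (naive scheme for the sketch).
  local fname = cache_dir .. '/' .. url:gsub('[^%w%._-]', '_') .. '.json'
  if vim.fn.filereadable(fname) == 0 then
    local body = vim.fn.system({ 'curl', '-sfL', url })
    if vim.v.shell_error ~= 0 then
      return nil -- download failed; the caller can fall back to the remote URL
    end
    vim.fn.writefile(vim.split(body, '\n'), fname)
  end
  return fname
end
```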
What if we added an option in setup that lets users list their commonly used schema ids?
SchemaStore could then download and cache those schemas during the setup stage. In my case, I need to run Neovim on a fully offline machine: we pack up the Neovim data and config, move it over, and work with it on the offline machine.
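To make this suggestion concrete, the option might look something like the following. This is purely hypothetical: the plugin has no such `setup()`/`cache` option today, and the schema ids are just examples of entries a user might pre-download while the machine is still online.

```lua
-- Hypothetical configuration; none of these options exist in the plugin today.
require('schemastore').setup {
  cache = {
    dir = vim.fn.stdpath('data') .. '/schemastore', -- where cached schemas would live
    -- schema ids the user cares about, downloaded once during setup
    schemas = { 'package.json', '.eslintrc', 'GitHub Workflow' },
  },
}
```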
Thinking about it some more, since we don't have the cooperation of the language servers, the only feasible method is to download some set of schemas ahead of time and present them to the LSPs. For the sake of avoiding a lot of unnecessary disk usage, I think which schemas are downloaded should be left to the user. Maybe we can identify a few very common ones as an initial seed? (The YAML LSP basically does that for an ancient version of Kubernetes.)
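As a sketch of that "download ahead of time and present to the LSPs" idea: the `select` filter below is schemastore.nvim's existing API, the `cache_schema()` helper is the hypothetical one from the earlier sketch, and whether jsonls will resolve `file://` schema URIs is an assumption that would need verifying.

```lua
-- Pre-download a user-chosen subset and point jsonls at the local copies.
local schemas = require('schemastore').json.schemas {
  select = { 'package.json', '.eslintrc' }, -- user-chosen subset
}

for _, schema in ipairs(schemas) do
  local cached = cache_schema(schema.url) -- hypothetical helper from the sketch above
  if cached then
    -- Assumption: the JSON language server can read file:// schema URIs.
    schema.url = vim.uri_from_fname(cached)
  end
end

require('lspconfig').jsonls.setup {
  settings = {
    json = { schemas = schemas, validate = { enable = true } },
  },
}
```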
The schemastore.org website is offline today, returning 503s, and I experienced a couple of issues with it last week (timeouts).
It seems like clients should improve and cache things offline. The YAML and JSON language servers should probably handle this themselves, but I think it would be better if there were a central cache that Neovim manages, and your plugin seems like a natural fit for providing that API and managing the cache.
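The same cache could in principle serve the YAML language server too. A sketch, again reusing the hypothetical `cache_schema()` helper and assuming schemastore.nvim's `yaml.schemas()` returns the url-to-glob table yaml-language-server expects:

```lua
-- Serve yamlls from the same on-disk cache, falling back to the remote URL
-- when no cached copy could be produced (e.g. during 503s or timeouts).
local yaml_schemas = {}
for url, globs in pairs(require('schemastore').yaml.schemas()) do
  local cached = cache_schema(url)
  yaml_schemas[cached and vim.uri_from_fname(cached) or url] = globs
end

require('lspconfig').yamlls.setup {
  settings = {
    yaml = {
      schemaStore = { enable = false, url = '' }, -- don't re-fetch the catalog remotely
      schemas = yaml_schemas,
    },
  },
}
```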