I set up an LSP server for OpenAI’s GPT-3/Codex, or any LM via your favourite NLP service provider.
It is, as far as I know, the first LSP server for LMs in the world.
It provides documentation and refactoring for any language, including world languages and fictional ones. Since it is a standard LSP server, it works with VSCode too.
The LSP server can provide hover documentation, code actions and linting for any programming language, world language, fictional language or computing context.
See it in action
I provide some introspection into the LSP protocol stream.
Documentation is instantaneous with caching enabled.
Obviously, this means that a centralised cache of truthful documentation is now a necessity.
But how do we reach consensus on what counts as a high-quality suggestion or generation from an LM?
- enter blockchain, or the Ministry of Truth (you decide)
How does it work?
Pen.el utilises efm-langserver along with its shell interop.
Essentially, you design documentation and refactoring functions (prompt-functions) like so.
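Since efm-langserver's shell interop lets any command act as a tool, a documentation prompt function can be fronted by a script that reads code on stdin and writes text to stdout. The script below is a hypothetical stand-in (the real Pen.el prompt functions are elisp and query an LM); it only illustrates the shape of the contract:

```shell
#!/bin/sh
# pen-doc.sh -- hypothetical stand-in for a Pen.el documentation
# prompt function: read the code on stdin, print documentation on
# stdout. A real prompt function would send the code to an LM here.
code=$(cat)
printf 'Documentation for:\n%s\n' "$code"
```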
Then you configure EFM langserver like so:
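The matching efm-langserver configuration could look roughly like this. This is a sketch: the command names are made up, while `hover-command`, `hover-stdin` and `completion-command` are efm-langserver's generic-tool settings:

```yaml
# ~/.config/efm-langserver/config.yaml -- sketch only
version: 2
languages:
  python:
    - hover-command: 'pen-doc.sh'
      hover-stdin: true
    - completion-command: 'pen-complete.sh'
```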
The LSP client communicates with the LSP server.
Hovering over “The Anatomy of a Monad”.
Ensure pen can receive the POSITION (the cursor location) from the LSP client.
Then remove the parts of the text which are not required, and forward the rest to the prompt function.
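For example, assuming the position is handed over as LINE and COL arguments (the names and plumbing here are hypothetical), the trimming step might look like:

```shell
#!/bin/sh
# trim-to-point.sh LINE COL FILE -- hypothetical helper: keep only the
# text up to the cursor position; the output would then be piped into
# the prompt function.
LINE=$1 COL=$2 FILE=$3
awk -v line="$LINE" -v col="$COL" '
  NR <  line { print }                           # lines above the cursor
  NR == line { print substr($0, 1, col); exit }  # cursor line, up to COL
' "$FILE"
```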
Study the racket language server
- see how output is formed
Examining the racket langserver output:
The following JSON was sent by the client to the server:
The following JSON was returned by the server:
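The captured traffic is not reproduced here, but a completion round-trip has this general shape under the LSP spec (illustrative values, not the actual racket-langserver capture). The client asks:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "textDocument/completion",
  "params": {
    "textDocument": { "uri": "file:///tmp/demo.rkt" },
    "position": { "line": 3, "character": 7 }
  }
}
```

and the server answers with a list of candidate items:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "isIncomplete": false,
    "items": [
      { "label": "define", "kind": 14 },
      { "label": "define-syntax", "kind": 14 }
    ]
  }
}
```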
The JSON that comes back isn’t really the completions yet. I think they’re what’s considered in the same scope as the identifier being completed.
The original list was very long. I decided that I wouldn’t try to write the jq for this myself, but rather rely on Codex.
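For reference, the kind of filter needed (one label per line, pulled out of a completion response) can be quite short; this is my own sketch, not the Codex output:

```shell
# Print one completion label per line from an LSP completion response.
printf '%s' '{"result":{"items":[{"label":"define"},{"label":"define-syntax"}]}}' |
  jq -r '.result.items[].label'
# -> define
#    define-syntax
```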
I want to create something like the following. However, I can’t: efm-langserver expects one completion per line from the shell script it uses (rather than JSON), so I had to make a conversion. When I provide completion candidates, I one-linerize them, replacing the literal newlines so that each candidate occupies a single line.
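A minimal sketch of that one-linerization (awk here stands in for whatever Pen.el actually uses; the point is encoding each real newline as the two-character sequence backslash-n):

```shell
# Join a multi-line completion candidate into a single line by encoding
# each real newline as the two characters backslash-n, so that
# efm-langserver sees exactly one candidate per line.
onelinerize() {
  awk 'NR > 1 { printf "\\n" } { printf "%s", $0 } END { print "" }'
}

printf 'first line\nsecond line\n' | onelinerize
# -> first line\nsecond line
```

The client then has to reverse the substitution to restore the real newlines, which is the un-one-linerizing discussed below.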
Since efm-langserver is not providing the textEdit needed to un-one-linerize, I have to do it myself in company-lsp.
company-lsp uses lexical scope, so I had to copy the entire file across and rename it to add the functionality.
This is a demo of both documentation and completion using the Pen.el language server.
Built into the docker image
EFM Langserver is also built into the Pen.el docker image.
If this article appears incomplete, it may be intentional. Try prompting for a continuation.