This is a demonstration of using an LM to search the internet.
This is a very effective search engine, actually. It pretty much replaces Google Search. The approach has tighter integration with tooling, faster look-up times, and (with free models) can even be used offline.
Select any text, code or prose, whether or not it contains the keywords you would typically search for. The model then generates a batch of candidate URLs and you filter the results: first extract the URLs from the generated text with a tool such as ‘xurls’, then validate them by pinging each server. What remains is a list of valid URLs related to the text you selected. It’s like the World Wide Web but without the hyperlinks.
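The extract-then-validate step above can be sketched in a few lines. This is a minimal, hedged illustration, not the exact pipeline: the regex is a rough stand-in for a dedicated extractor like xurls, and "pinging the server" is approximated here with an HTTP HEAD request.

```python
# Sketch of the filtering pipeline: extract candidate URLs from
# LM-generated text, then keep only those whose servers respond.
import re
import urllib.request

# Rough URL pattern; a real extractor like xurls is more careful.
URL_RE = re.compile(r"https?://[^\s)\"'>\]]+")

def extract_urls(text: str) -> list[str]:
    """Pull candidate URLs out of raw generated text."""
    return URL_RE.findall(text)

def is_live(url: str, timeout: float = 5.0) -> bool:
    """Validate a candidate URL by asking the server for its headers."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except Exception:
        return False

def filter_generated(text: str) -> list[str]:
    """Extract, then validate: the survivors relate to the selected text."""
    return [u for u in extract_urls(text) if is_live(u)]
```

In practice you would pipe the generated text through xurls instead of the regex, but the shape of the pipeline is the same.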
GPT-3 can do this too. But GPT-3 can be thought of as running at too high a temperature: the URLs it generates are chimeric, and a chimeric URL is usually a wrong URL.
This is the validator I have chosen to use.
- Select arbitrary text and gather URLs on it
- Select any raw text (e.g. code with tricky, unsearchable syntax)
- Select large amounts of text and gather URLs on it
- Much longer queries taking into account more context
- Much more intelligent results
- Much more specific results
- Much more accurate results
- Full control
- No ads
If this article appears incomplete, it may be intentional. Try prompting for a continuation.