A prototype is available, though it’s Chrome-only and English-only at the moment. You select some text and then click the extension, which will try to “return the relevant quote and inference for the user, along with links to article and quality signals”.
Under the hood, it uses ChatGPT to generate a search query, runs that query against WP’s search API to find relevant article text, and then uses ChatGPT again to extract the relevant part.
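Roughly, that pipeline might look something like the sketch below. This is not the extension’s actual code; the model name, the prompts, and the choice to use only the article intro are my own assumptions, just to make the three steps concrete.

```python
import requests

OPENAI_URL = "https://api.openai.com/v1/chat/completions"
WIKI_API_URL = "https://en.wikipedia.org/w/api.php"

def ask_llm(prompt, api_key, model="gpt-4o-mini"):
    """Send a single prompt to the chat completions API and return the reply text."""
    resp = requests.post(
        OPENAI_URL,
        headers={"Authorization": f"Bearer {api_key}"},
        json={"model": model, "messages": [{"role": "user", "content": prompt}]},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"].strip()

def find_supporting_quote(selected_text, api_key):
    # Step 1: have the LLM turn the user's selection into a short search query.
    query = ask_llm(
        f"Write a short Wikipedia search query (a few keywords only) for this claim:\n{selected_text}",
        api_key,
    )

    # Step 2: run that query against Wikipedia's public search API.
    search = requests.get(
        WIKI_API_URL,
        params={"action": "query", "list": "search", "srsearch": query,
                "format": "json", "srlimit": 1},
        timeout=30,
    ).json()
    hits = search["query"]["search"]
    if not hits:
        return None
    title = hits[0]["title"]

    # Step 3: pull the article's plain-text intro (kept small so it fits in the prompt).
    page = requests.get(
        WIKI_API_URL,
        params={"action": "query", "prop": "extracts", "explaintext": 1,
                "exintro": 1, "titles": title, "format": "json"},
        timeout=30,
    ).json()
    extract = next(iter(page["query"]["pages"].values())).get("extract", "")

    # Step 4: have the LLM quote the passage that supports (or refutes) the claim.
    quote = ask_llm(
        f"Claim: {selected_text}\n\nArticle text:\n{extract}\n\n"
        "Quote the sentence(s) from the article that best support or refute the claim.",
        api_key,
    )
    return {"article": title, "quote": quote,
            "url": f"https://en.wikipedia.org/wiki/{title.replace(' ', '_')}"}
```

Note this makes two LLM calls per lookup (query generation and quote extraction), which is relevant to the cost comparison below.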
Based on this article, it seems that on average an LLM query costs about 10x as much as a search engine query.
Man - that’s wild. Thank you for coming through with a citation - I appreciate it!