GPT3 as HyperObject

The best public example I've seen of this:

https://gpt-3-explorer.vercel.app/

https://share.getcloudapp.com/YEuojdmm?embed=true

https://gist.github.com/un1crom/defa0396178a6b30757c5bb6e78c4f83

Scratch pad of thoughts from July 22, 2020

THOUGHT 1:

It is very interesting to see how quickly humans will curate, especially early in a new system's life. I think as the low-hanging fruit is plucked we will need a second generation of toolsets to mine this.

THOUGHT 2:

I could be wrong, but mining GPT3 (and 4, 5, 6) is actually going to end up being cheaper than web crawls. Web crawls contain so much noise and redundancy that they aren't really worth it. Once GPT3 has been "curated" enough, it will just be orders of magnitude cheaper to hit GPT3 and skip web searches, and only when GPT3 has been unable to produce useful completions for your task do you send the machine out to forage via searching. I mean, obviously the OpenAI team saw that, hence the Search API.

THOUGHT 3:

It would be a very interesting experiment to EXCLUDE Wikipedia from the model. I suspect Wikipedia is a huge chunk of glue in the network, as it links tons of concepts and is well "curated" for language usage and factoids.

THOUGHT 4:

The success of prompt curation is going to lead to an effort to just flat out produce better full datasets by domain. E.g., based on the success of generating programming-language templates, each language designer and language community should just publish a very clean prompt of all prompts, identify the canonical programs/code bases, and expose them. Better raw data for the model will greatly reduce the prompt curation AND reduce the overall retraining for future GPTs...

THOUGHT 5:

Based on these thoughts, I returned to GPT2 training to see how much further I can push fine-tuning GPT2 by domain, building on the success of prompts for GPT3.

THOUGHT 6:

I bet, strong bet, that networks of tiny domain-specific GPTs able to federate requests to each other might provide their own ability to answer queries directly, and when they can't reach good enough coherence, call into the Mothership (GPT3 and beyond). In a sense, GPT2s/mini GPT3s can just be prompt generators, etc.

Thanks for listening. I've been in the GPT3 void long enough that I now speak like it, whatever it is.
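THOUGHT 2 and THOUGHT 6 describe the same tiered pattern: try the cheap local source first and escalate (to a web search, or to the Mothership) only when nothing local is coherent enough. A minimal sketch of that routing, where every class, function, and scoring rule is a hypothetical toy stand-in rather than a real API:

```python
# Toy sketch of tiered federation: small domain models answer
# directly when coherent enough, else the query escalates.
# All names and the keyword "coherence" scoring are stand-ins.

class TinyDomainModel:
    """Stand-in for a small domain-specific GPT2/mini-GPT3."""

    def __init__(self, domain, canned):
        self.domain = domain
        self.canned = canned  # keyword -> canned completion

    def complete(self, prompt):
        # Return (completion, coherence score); toy keyword match.
        for keyword, text in self.canned.items():
            if keyword in prompt:
                return text, 0.9
        return "", 0.1


def mothership(prompt):
    """Stand-in for calling the big general model (GPT3 and beyond)."""
    return f"[mothership completion for: {prompt}]"


def federated_answer(prompt, models, threshold=0.5):
    # Take the most coherent local completion; escalate when
    # no tiny model clears the coherence bar.
    best_text, best_score = "", 0.0
    for model in models:
        text, score = model.complete(prompt)
        if score > best_score:
            best_text, best_score = text, score
    if best_score >= threshold:
        return best_text
    return mothership(prompt)


models = [
    TinyDomainModel("cooking", {"whisk": "Whisk until stiff peaks form."}),
    TinyDomainModel("python", {"list": "Use a list comprehension."}),
]
print(federated_answer("how do I whisk egg whites?", models))
print(federated_answer("explain quantum chromodynamics", models))
```

The `threshold` knob is the whole design: set it high and nearly everything escalates to the big model; set it low and the tiny models answer cheaply but less reliably.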

Scratch pad of thoughts from July 16/17

Point 1: Thought 1

All of us playing really do need to get a wiki or GitHub set up so we can have all these "maps between systems" easily findable.

(think of what artbreeder.com or DeepDream did for GANs etc)