I’m not a hundred percent sure, but afaik it has to do with how random the output of the GPT model is. At 0 it will always pick the most probable next continuation of a piece of text according to its own prediction. The higher the temperature, the more chance there is for less probable outputs to get picked. So it’s most likely to pick 42, but as the temperature increases, the chance of numbers the model considers less likely goes up.
This is how temperature works in the softmax function, which is often used in deep learning.
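To make that concrete, here’s a rough sketch of temperature-scaled softmax in plain Python (toy logits, not real GPT outputs). The logits get divided by the temperature before softmax, so a low temperature sharpens the distribution toward the top token and a high one flattens it. Note that temperature 0 is really the limiting case (just take the argmax), since you can’t divide by zero:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Turn raw model scores (logits) into probabilities.

    Dividing the logits by the temperature before softmax sharpens
    the distribution (low temperature) or flattens it (high temperature).
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Toy logits for three candidate tokens; the first ("42") is the favorite.
logits = [4.0, 2.0, 1.0]

for t in (0.1, 1.0, 2.0):
    probs = softmax_with_temperature(logits, t)
    print(f"T={t}: {[round(p, 3) for p in probs]}")
```

At T=0.1 the top token gets nearly all the probability mass; at T=2.0 the other options become much more competitive, which is exactly the spread you see in the plot.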
What does “temperature” on the Y-axis refer to?
https://youtu.be/wjZofJX0v4M answers your question from the 22:00 mark on.
Super helpful, thanks!