- cross-posted to:
- [email protected]
cross-posted from: https://lemmy.zip/post/27030131
The full repo: https://github.com/vongaisberg/gpt3_macro
Is this the freaking antithesis of reproducible builds‽ Sheesh, just thinking of the implications in the build pipeline/supply chain makes me shudder
Just set the temperature to zero, duh
Looking at the source, they thankfully already use a temp of zero, but max tokens is 320. That doesn’t seem like much for code, especially since most symbols are a whole token.
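For anyone curious, here’s a rough sketch of what a zero-temperature request with that 320-token cap could look like (field names follow the OpenAI completions API; the model name and prompt are placeholders, this is not the crate’s actual code):

```rust
// Hedged sketch only: a deterministic-ish (temperature = 0) completion
// request body. Model name and prompt are made up for illustration.
use serde_json::json;

fn main() {
    let request = json!({
        "model": "text-davinci-003",               // placeholder model name
        "prompt": "fn is_prime(n: u64) -> bool {", // made-up prompt
        "temperature": 0,  // greedy decoding: repeated calls should mostly agree
        "max_tokens": 320, // the cap mentioned above; not much room for code
    });
    println!("{}", serde_json::to_string_pretty(&request).unwrap());
}
```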
When your CPU is at 0 Kelvin, nothing unpredictable can happen.
>cool CPU to 0 Kelvin
>CPU stops working
yeah I guess you’re right
CPUs work faster with better cooling.
So at 0K they are infinitely fast.
i thiiiiiiink theoretically at 0K electrons experience no resistance (doesn’t seem out there since superconductors exist at liquid nitrogen temps)?
And CPUs need some amount of resistance to function i’m pretty sure (like, how does a 0-resistance transistor work, wtf), so following this logic a 0K CPU would get diarrhea.
this is how we end up with lost tech a few decades later
someone post this to the guix mailinglist 😄
You’d have to consider it somewhat of a black box, which is what people already do.
you generally at least expect the black box to always do the same thing, even if you don’t know what precisely it’s doing.
Just hash the binary and include it with the build. When somebody else compiles, they can check the hash and just recompile until it is the same. Deterministic outcome in presumably finite time. Until the weights of the model change, then all bets are off.
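Something like this, I guess (sketch only; it uses the `sha2` crate, and the artifact path is made up):

```rust
// Sketch of the "hash the binary and compare" idea: compute a SHA-256
// of the built artifact so a rebuilder can check it against the hash
// shipped with the build. Path and crate choice (sha2) are assumptions.
use sha2::{Digest, Sha256};
use std::fs;

fn main() -> std::io::Result<()> {
    let bytes = fs::read("target/release/my_binary")?; // hypothetical artifact path
    let digest = Sha256::digest(&bytes);
    let digest_bytes: &[u8] = digest.as_ref();
    let hex: String = digest_bytes.iter().map(|b| format!("{b:02x}")).collect();
    println!("sha256 = {hex}");
    // A rebuilder compares this against the published hash and recompiles
    // until they match; all bets are off once the model weights change.
    Ok(())
}
```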