Let's say you live in a world where the world government has decided people are getting too addicted to the internet and has ordered it shut down for 5 years. 100 GB of storage is all you have (excluding essential system files for your operating system), and you have 24 hours before the shutdown. What do you download?
Wikipedia
I think the full dump of Wikipedia, complete with multimedia, is like 5 GB. The text-only version is something like 100 MB.
English only, with media, is a bit over 100 GB; at least the Kiwix package is.
I wonder where I got the low number from, then :/ Can't find it now.
22 GB without media.
https://en.m.wikipedia.org/wiki/Wikipedia:Database_download
Is that for all languages, and with history?
I checked yesterday; English Wikipedia is 86 GB.
Huh. Wonder where I got that from, then.
If you were smart about it and had time to prepare, you could shrink the Wikipedia download by eliminating articles that are likely to also exist in print books and are therefore much less important to preserve. It might have to involve some sort of machine learning element; see the sketch below.
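Something like this, maybe: stream the dump and keep only pages a filter flags as worth preserving. This is just a sketch of the idea, assuming a decompressed pages-articles XML dump on disk; the file path, the namespace URI, and the whole keep_page() heuristic are placeholder assumptions, not a real trained classifier.

```python
# Sketch: filter a Wikipedia XML dump down to pages unlikely to exist in print.
# DUMP_PATH and the keep_page() rule are hypothetical stand-ins.
import xml.etree.ElementTree as ET

DUMP_PATH = "enwiki-latest-pages-articles.xml"  # hypothetical local path
NS = "{http://www.mediawiki.org/xml/export-0.10/}"  # namespace varies by dump version

# Hypothetical heuristic: prefixes that smell like classic print-encyclopedia
# material. A real version would swap this for a trained classifier.
PRINT_LIKELY = ("History of", "Geography of", "List of monarchs")

def keep_page(title: str, text: str) -> bool:
    if any(title.startswith(p) for p in PRINT_LIKELY):
        return False
    # Pages citing mostly web sources lean on online-only material,
    # so they're more worth keeping than book-sourced ones.
    return text.count("{{cite web") >= text.count("{{cite book")

kept = dropped = 0
for event, elem in ET.iterparse(DUMP_PATH):
    if elem.tag == NS + "page":
        title = elem.findtext(NS + "title") or ""
        text = elem.findtext(f"{NS}revision/{NS}text") or ""
        if keep_page(title, text):
            kept += 1  # in a real run, write the page out here
        else:
            dropped += 1
        elem.clear()  # free memory as you go; these dumps run to tens of GB
print(f"kept {kept}, dropped {dropped}")
```

The streaming iterparse part is the point: you can't load an 86 GB dump into memory, so you decide page by page and throw each one away as you pass it.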