This is only the size of the text at one moment in time. The edit history is dozens of terabytes and the media is hundreds. All of this has to serve billions of requests a day, and almost every request involves executing arbitrary Lua and/or template syntax that was transcluded into the page. The media in particular introduces huge copyright problems. Etc., etc. The WMF is undoubtedly bloated, but running Wikipedia is not a walk in the park.
I would be extremely surprised if most regularly viewed articles aren't pre-rendered and cached, and thus served directly without involving any code execution.
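The caching being described is essentially the cache-aside pattern: render a page once, key the result by title and revision, and serve repeat views from the cache without touching templates or Lua again. A minimal sketch of that idea (the class and method names here are illustrative, not MediaWiki's actual internals):

```python
import hashlib

class RenderCache:
    """Hypothetical sketch of caching rendered pages by (title, revision)."""

    def __init__(self):
        self._store = {}       # in production this would be memcached/Varnish/etc.
        self.render_calls = 0  # how often we fell back to a real render

    def _key(self, title, revision):
        return hashlib.sha256(f"{title}@{revision}".encode()).hexdigest()

    def _render(self, title, wikitext):
        # Stand-in for the expensive step: expanding templates and
        # running any transcluded Lua modules.
        self.render_calls += 1
        return f"<h1>{title}</h1><p>{wikitext}</p>"

    def get_page(self, title, revision, wikitext):
        # Cache-aside: render only on a miss, then serve the stored HTML.
        key = self._key(title, revision)
        if key not in self._store:
            self._store[key] = self._render(title, wikitext)
        return self._store[key]

cache = RenderCache()
cache.get_page("Example", 1, "Hello")
cache.get_page("Example", 1, "Hello")  # second view: served from cache
print(cache.render_calls)              # -> 1
```

An edit bumps the revision, which changes the key, so stale HTML is never served; only the first view after an edit pays the rendering cost.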
https://meta.wikimedia.org/wiki/Data_dump_torrents#English_W...