"A former senior U.S. intelligence agent described Alexander's program: "Rather than look for a single needle in the haystack, his approach was, 'Let's collect the whole haystack. Collect it all, tag it, store it ... And whatever it is you want, you go searching for it.""
What you're describing is a program from 20 years ago, designed to surveil limited parties in a limited geographic region overseas, during a war, in a place with Stone Age information systems. That is not what the people in this discussion mean by blanket surveillance. They are talking about broad interception of all communications by U.S. persons, an undertaking that, as should be obvious to anyone in this industry, would be economically if not thermodynamically impossible.
"After 9/11, they took one of the programs I had done, or the backend part of it, and started to use it to spy on everybody in this country. That was a program I created called Stellar Wind. That was separate and compartmented from the regular activity which was ongoing because it was doing domestic spying. All the equipment was coming in, I knew something was happening, but then when the contractors I had hired came and told me what they were doing, it was clear where all the hardware was going and what they were using it to do. It was simply a different input, instead of being foreign it was domestic." - William Binney
Civilian information systems have radically expanded since 2001, even if we take that dated objection at face value. In 2025, it is absurd to believe that, while every newspaper is shouting that civilian data centers are destabilizing the national power grid and drying up the water table, the government possesses a larger, far more capable information system that paradoxically has no observable physical presence.
"The Utah Data Center (UDC), also known as the Intelligence Community Comprehensive National Cybersecurity Initiative Data Center, is a data storage facility for the United States Intelligence Community that is designed to store data estimated to be on the order of exabytes or larger."
"The structure provides 1 to 1.5 million sq ft (93,000 to 139,000 m2), with 100,000 sq ft (9,300 m2) of data center space and more than 900,000 sq ft (84,000 m2) of technical support and administrative space."
"The completed facility is expected to require 65 megawatts of electricity, costing about $40 million per year. Given its open-evaporation-based cooling system, the facility is expected to use 1.7 million US gal (6,400 m3) of water per day."

"An article by Forbes estimates the storage capacity as between 3 and 12 exabytes as of 2013, based on analysis of unclassified blueprints, but mentions Moore's Law, meaning that advances in technology could be expected to increase the capacity by orders of magnitude in the coming years."
According to Sandvine, the vast majority of internet traffic in 2013 (chosen to coincide with the Forbes storage estimate) was video such as Netflix and YouTube[1], and that remains true today[2]. Assuming the NSA is aware of industry-standard techniques such as data de-duplication and compression, Forbes's estimate of 3-12 exabytes in 2013 would have been sufficient to store the entire year's worldwide internet traffic in full.
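The leverage de-duplication gives is easy to sketch. The ratios below are illustrative assumptions, not measured figures: the point is that popular video, which dominates traffic, de-duplicates almost perfectly because millions of viewers fetch identical bytes, so only the non-video remainder costs real disk.

```python
# Back-of-envelope: how many exabytes of observed traffic a raw store
# can retain, given de-duplication of video and compression of the rest.
def effective_capacity_eb(raw_eb, video_fraction, video_dedup, other_compression):
    # Disk cost to retain one exabyte of observed traffic:
    # video is stored once per unique title (dedup ratio),
    # everything else is merely compressed.
    cost_per_eb = video_fraction / video_dedup + (1 - video_fraction) / other_compression
    return raw_eb / cost_per_eb

# Hypothetical parameters: 66% of traffic is video that de-duplicates
# 1000:1, and the remaining traffic compresses 2:1.
print(effective_capacity_eb(12, 0.66, 1000, 2))
```

Under these assumed ratios, 12 raw exabytes retains roughly 70 exabytes of observed traffic; the conclusion is dominated by the video fraction, not the exact dedup ratio.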
In 2025 the Internet Archive holds approximately 100 exabytes[3], with data dating back to 1995[4]. Adjusting the 2013 Forbes numbers for 2025 storage density (4 TB drives in 2013 versus 36 TB drives in 2025) yields 27-108 exabytes. A datacenter on the scale of the Utah Data Center is therefore capable of storing and retaining a versioned history of a significant fraction of the world's internet over a significant period of time.
Assuming the operators prioritize metadata and unique traffic over bulk video would further extend how much can be stored and for how long.
https://en.wikipedia.org/wiki/Keith_B._Alexander#NSA_appoint...