This is great, but you should include some formatting corrections (superscript didn't translate from wherever you got this):
One of the consequences of the second law of thermodynamics is that a certain amount of energy is necessary to represent information. To record a single bit by changing the state of a system requires an amount of energy no less than kT, where T is the absolute temperature of the system and k is the Boltzmann constant. (Stick with me; the physics lesson is almost over.)
Given that k = 1.38×10^-16 erg/°Kelvin, and that the ambient temperature of the universe is 3.2°Kelvin, an ideal computer running at 3.2°K would consume 4.4×10^-16 ergs every time it set or cleared a bit. To run a computer any colder than the cosmic background radiation would require extra energy to run a heat pump.
Now, the annual energy output of our sun is about 1.21×10^41 ergs. This is enough to power about 2.7×10^56 single bit changes on our ideal computer; enough state changes to put a 187-bit counter through all its values. If we built a Dyson sphere around the sun and captured all its energy for 32 years, without any loss, we could power a computer to count up to 2^192. Of course, it wouldn't have the energy left over to perform any useful calculations with this counter.
But that's just one star, and a measly one at that. A typical supernova releases something like 10^51 ergs. (About a hundred times as much energy would be released in the form of neutrinos, but let them go for now.) If all of this energy could be channeled into a single orgy of computation, a 219-bit counter could be cycled through all of its states.
These numbers have nothing to do with the technology of the devices; they are the maximums that thermodynamics will allow. And they strongly imply that brute-force attacks against 256-bit keys will be infeasible until computers are built from something other than matter and occupy something other than space.
You should probably clarify that an erg is equal to 10^-7 joules, too. That clears the numbers up, I hope. Still, those are theoretical limits; practical ones are, as always, lower. Classical computers are running into thermodynamic barriers right now. High-performance gaming systems, for instance, run into incredible heat dissipation problems. Intel's latest CPU line in particular: I recall it being not much of an improvement, but a real devil to keep cool.
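The quoted numbers are easy to sanity-check, by the way. A quick sketch using the quote's own values (k in erg/K, the 3.2 K background temperature, and the sun's annual output):

```python
import math

k = 1.38e-16        # Boltzmann constant, erg per kelvin (value from the quote)
T = 3.2             # cosmic background temperature, kelvin
e_bit = k * T       # minimum energy per single bit change, ~4.4e-16 erg

sun_year = 1.21e41  # annual solar energy output, erg (value from the quote)
flips = sun_year / e_bit

print(f"{flips:.2e} bit changes per year")  # ~2.74e+56
print(math.floor(math.log2(flips)))         # 187 -> a 187-bit counter
print(math.log2(32 * flips) >= 192)         # True: 32 years covers a 2^192 count
```

All three figures in the quote (2.7×10^56 state changes, the 187-bit counter, and 2^192 in 32 years) check out.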
Also, there is an entire class of problems which could only be solved by quantum computing. They were called "P-NP problems" (at least IIRC), and this was mathematically proven. I don't know if you could design a cryptographic algorithm that would use such a problem, but if you could, trial and error would probably be the only way to approach it with a classical computer (and, as shown in the quote, brute-forcing this would be futile).

Also, by "taking millennia to solve" I meant "a classical computer would need to run for millennia before solving it", not that someone wouldn't be able to break it eventually. Either way, quantum computing will someday shake cryptography up quite a bit, and will probably have some interesting influence on mathematics in general. It's not just a matter of Shor's algorithm and resistance to it (you don't need a QC to theorize about it and its algorithms); quantum computers would open up completely new avenues in algorithm design.
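On the "millennia" point, a back-of-the-envelope figure helps. The 10^12 trials per second rate below is an arbitrary assumption just to set the scale, not a claim about any real machine:

```python
SECONDS_PER_YEAR = 3.156e7
rate = 1e12  # assumed key trials per second (hypothetical, generous)

# Exhaustive search time for a few common key sizes at that rate
for bits in (64, 128, 256):
    years = 2**bits / rate / SECONDS_PER_YEAR
    print(f"{bits}-bit keyspace: ~{years:.1e} years to exhaust")
```

Even at that wildly optimistic rate, 128 bits already takes on the order of 10^19 years, and the thermodynamic argument in the quote rules out 256 bits no matter how fast the trials run.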
No, my bull**** call is straight on; you're just taking it too personally. About three quarters of the stuff you said was accurate, and a quarter of it needs to be chiseled back to a reasonable position with sane qualifications.
infinite bandwidth
Bull****. Information theory provides ceilings on bandwidth. (You've correctly recognized that this is bull****, and stepped back from the claim.)
Every time I write "infinite" replace that with "infinite for all intents and purposes" (more than we could possibly use/not a bottleneck/not gonna run out anytime soon). In the same vein, "infinitely small" means "finitely small, but irrelevant to the calculations".

Few things in the universe are truly infinite, but the word makes for a nice rhetoric (works well enough for laymen, at least). I should've been more specific.
Here you're conflating two developments: optical communication networks (which we've had for ages in the form of fiber optics) and optical computation. You fail to identify the major challenges facing optical computation: shot noise and the weak coupling of photons compared to electrons. Compounding the error, you're "imagining" that optical computation will be a great partner for photon-mediated electron spin entanglement without addressing the OEO problem, which seems like it could be pretty necessary. You need to be more rigorous about delineating communication problems from computation problems.
In all your excitement over global computational meshes firing off Shor's algorithm, you're losing focus on what's really exciting here, which you correctly identified in the very first sentence you posted! One of the biggest hurdles in QC is the problem of quantum error. By achieving full determinism, these researchers have cleared a huge hurdle on the way to a scalable quantum processor, which is necessary before we can build useful quantum computers (let alone start networking them). And if the diamond trap substrate is robust and economical, then that would finally be a reasonable consensus architecture to start from.
That's what's super cool here, and what you're right to be excited about.
Well, I thought the matter of implications for the very existence of a usable QC had been settled already.

Yes, it's nice that we can actually make a quantum processor now. But I see little to discuss in that matter.
Other implications are a bit more far-fetched. I must say that I'm not that much into optics right now, but it's definitely a thing. We'll definitely have to go optical to overcome the aforementioned thermodynamic barriers. Oh, and I was talking about computation the whole time, which requires communication between a computer's subsystems. Perhaps I wasn't clear, but I wasn't referring to bandwidth as in "internet bandwidth", but rather to bandwidth between computer subsystems (at least I think so; I posted that late at night). Ordinary electricity moves at the speed of light (according to one interpretation), and we've got fiber optics, so light-speed comms are nothing new in that field. As for the OEO problem, my whole idea was to sort of go around it by having an all-optical classical system directly interfacing with the quantum PMESE system. Again, I "imagine", because I'm not sure if that's possible. I should've stated that.
Or perhaps it was late at night and I got those two subjects confused, or tried to discuss them both at the same time. I've been sleep-deprived lately (in no small part thanks to learning physics).
