
Imagine you want to simulate a cubic meter of space down to the particle level. Following the Standard Model and other basic physical equations, how much computing power would be required to do this in, say, a day?

Would a quantum computer help you in this task? Could you somehow directly simulate the particles?

Thanks

APCoding
  • Seems like a question for https://what-if.xkcd.com/ – Denis Sep 17 '15 at 12:48
  • while currently nonmainstream/ sideline, I think this is a valid research question/ problem crossing physics & CS; it relates to questions about the intrinsic computational aspects of the universe/ physical laws/ reality associated with digital physics. fyi, the question "what is the volume of information" is somewhat related – vzn Sep 18 '15 at 05:34
  • This article touches on the issue, if anyone's interested: http://www.pbs.org/wgbh/nova/blogs/physics/2014/04/is-there-anything-beyond-quantum-computing/ – APCoding Sep 19 '15 at 00:32
  • A relevant wiki article: https://en.wikipedia.org/wiki/Quantum_simulator – APCoding Sep 19 '15 at 00:36
  • Cross-posted to physics.se: http://physics.stackexchange.com/questions/207796/how-much-computing-power-would-be-required-to-fully-simulate-a-cubic-meter – Norbert Schuch Sep 19 '15 at 13:46
  • The answers at Physics.SE are fairly complete. In short: it is not likely to be feasible in the foreseeable future, no, and no. – András Salamon Sep 20 '15 at 14:21
  • somewhat related: wikipedia on limits to computation. one way to estimate this would be to estimate the max # of smallest particles per unit of space and the cost of simulating each particle. but note that "real" physics simulation is far more complex, eg QFT involving virtual particles. the quantity seems finite but nearly inestimable. note that in a sense it is not even really possible to simulate a unit of space exactly, due to the complexity of interactions of virtual particles etc.; physics gets more nondeterministic the deeper the theory. – vzn Sep 21 '15 at 02:47
  • so there is some problem with the phrase "fully simulate". one could use the standard model, which is indeed a computational/ algorithmic process, but it is widely admitted even by physicists to be incomplete wrt some crucial elements of reality at the boundaries of known physics, eg black holes, virtual particles, dark matter, the large-scale vs small-scale incompatibility of GR vs QM, etc! also note the Higgs particle was "newly" discovered only within the last few years, and physicists do not rule out new particle discoveries; some even expect them! – vzn Sep 21 '15 at 04:12

1 Answer


tricky question! there is some diverse, crosscutting research into this question, and I will attempt to outline it, but will in the end take the position that the question is contradictory/ impossible at heart (and, anticipating reactions, this may be a controversial conclusion). here are some key recent references from a physics pov addressing your question.

a rough complexity theory estimate of "simulating particle physics" is to count the number of particles: for $n$ particles there are $O(n^2)$ pairwise interactions, and many supercomputer simulations of particle dynamics fit this pattern. so as a rough bound one would pick the smallest particles... but wait! the standard model has many subatomic particles. one might count neutrinos, among the lightest known stable particles, but then what about quarks, which are confined inside hadrons, or unstable leptons such as muons? (a back-of-the-envelope sketch of the scale follows.)
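
to get a feel for the scale, here is a back-of-the-envelope sketch in python. all the constants are assumptions chosen purely for illustration: Loschmidt's constant for molecules of air at standard conditions (not even subatomic particles!), an assumed 10 flops per pairwise interaction, and an assumed femtosecond time step.

```python
# back-of-the-envelope estimate: cost of a naive O(n^2) pairwise
# simulation of one cubic meter of air for one simulated day.
# all constants are illustrative assumptions, not authoritative figures.

n = 2.7e25                    # molecules per m^3 of air at STP (Loschmidt constant)
pairs = n * (n - 1) / 2       # pairwise interactions per time step, O(n^2)

flops_per_pair = 10.0         # assumed flops to evaluate one interaction
dt = 1e-15                    # assumed time step: one femtosecond
steps = 86400.0 / dt          # steps to cover one simulated day

total_flops = pairs * flops_per_pair * steps
print(f"molecules:              {n:.1e}")
print(f"interactions per step:  {pairs:.1e}")
print(f"total flops (one day):  {total_flops:.1e}")

# compare against a ~100 petaflop/s supercomputer (1e17 flops/s)
machine_flops_per_day = 1e17 * 86400.0
print(f"runtime in years:       {total_flops / machine_flops_per_day / 365:.1e}")
```

under these assumptions the total comes out around $10^{71}$ flops, ie roughly $10^{47}$ years on a top supercomputer, and that is for classical molecules with pairwise forces, not anything close to the full standard-model state of the cube.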

there is also the whole other problematic accuracy/ precision issue of the butterfly effect, long known in computational physics, aka "sensitive dependence on initial conditions": any finite-precision simulation of a chaotic system diverges exponentially from the exact trajectory, as the toy demo below illustrates.
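
a minimal toy demonstration, assuming nothing beyond the textbook logistic map (not a physics model): two trajectories starting $10^{-15}$ apart, ie at the limit of double precision, decorrelate completely within a few dozen iterations.

```python
# toy demonstration of sensitive dependence on initial conditions
# using the logistic map x -> r*x*(1-x) in its chaotic regime (r = 4).
# the separation roughly doubles each step, so a 1e-15 perturbation
# (the limit of double precision) reaches O(1) within ~50 iterations.

r = 4.0
x, y = 0.4, 0.4 + 1e-15       # two nearly identical initial conditions

for step in range(1, 61):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: separation = {abs(x - y):.3e}")
```

so even with a perfect model, any finite-precision state quickly stops tracking the "true" trajectory, which already undercuts "fully simulate" in the strong sense.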

Aaronson writes in his essay Is There Anything Beyond Quantum Computing?:

Is there any such problem that couldn’t be solved efficiently by a quantum computer, but could be solved efficiently by some other computer allowed by the laws of physics?

so he sketches out exotic physics, such as black holes or quantum gravity, that might not be simulable by a quantum computer. but flipping this whole essay on its head (in a manner in which it was not intended), what he is describing are frontier areas of physics that currently do not have complete/ definite physical theories known to humans (at best, only plausible candidates/ approximations floating around), eg:

  • black holes
  • closed timelike curves
  • quantum gravity

(adding to his list)

  • dark matter/ energy
  • virtual particles, vacuum energy

so the problem comes in your title: you literally want to fully simulate a cubic meter of space. but the standard model is widely acknowledged/ conceded by physicists themselves (if pressed!) to be an incomplete model of reality, due to all the extreme/ problematic "edge cases" listed above, and others not yet known. the standard model itself is also a moving target: new particles/ tiny forces and revisions are added nearly "regularly", most recently the famous/ celebrated Higgs particle.

so it is actually a major open question of physics whether the basic rules of reality are computable at all. this ties in with the digital physics research agenda.

so physicists generally regard the computability of physics as something of an "approximation": physical theories are in a continual process of evolving/ finetuning, and even "very thorough/ precise models" are ultimately not to be confused with exact simulations of reality.

also relevant to/ impinging on this question: there is new "cutting edge/ borderline" physics theory/ research, with experiments attempting to detect whether we are "living in a hologram", ie some kind of "simulation". it is theorized that cosmic rays might have detectable properties indicating this, or that other very sensitive experiments could reveal kinds of "digitization artifacts", etc.

vzn