
Computronium Quotes

Quotes tagged as "computronium"
“It might not be immediately obvious to some readers why the ability to perform 10^85 computational operations is a big deal. So it's useful to put it in context. [I]t may take about 10^31-10^44 operations to simulate all neuronal operations that have occurred in the history of life on Earth. Alternatively, let us suppose that the computers are used to run human whole brain emulations that live rich and happy lives while interacting with one another in virtual environments. A typical estimate of the computational requirements for running one emulation is 10^18 operations per second. To run an emulation for 100 subjective years would then require some 10^27 operations. This would mean that at least 10^58 human lives could be created in emulation even with quite conservative assumptions about the efficiency of computronium. In other words, assuming that the observable universe is void of extraterrestrial civilizations, then what hangs in the balance is at least 10,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 human lives. If we represent all the happiness experienced during one entire such life with a single teardrop of joy, then the happiness of these souls could fill and refill the Earth's oceans every second, and keep doing so for a hundred billion billion millennia. It is really important that we make sure these truly are tears of joy.”
Nick Bostrom, Superintelligence: Paths, Dangers, Strategies
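The arithmetic in the quote above can be checked directly. The sketch below (not part of Bostrom's text) uses his rounded figures: 10^18 operations per second per emulation, and roughly 10^9 seconds in a subjective century, giving his "some 10^27 operations" per emulated life.

```python
# Sanity check of the quote's arithmetic, using Bostrom's rounded figures.
total_ops = 10**85            # operations assumed available in the quote
ops_per_second = 10**18       # typical estimate per whole brain emulation
seconds_per_century = 100 * 365 * 24 * 3600   # ~3.15e9 seconds

# Exact product is ~3.15e27; the quote rounds this to "some 10^27".
ops_per_life = 10**27

lives = total_ops // ops_per_life
print(lives)                  # 10^58, matching the 58-zero figure quoted
```

Python's arbitrary-precision integers make the exact division trivial here; with the unrounded ~3.15e27 operations per life the count drops to about 3 x 10^57, which is why the quote hedges with "at least."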

“Consider an AI that has hedonism as its final goal, and which would therefore like to tile the universe with “hedonium” (matter organized in a configuration that is optimal for the generation of pleasurable experience). To this end, the AI might produce computronium (matter organized in a configuration that is optimal for computation) and use it to implement digital minds in states of euphoria. In order to maximize efficiency, the AI omits from the implementation any mental faculties that are not essential for the experience of pleasure, and exploits any computational shortcuts that according to its definition of pleasure do not vitiate the generation of pleasure. For instance, the AI might confine its simulation to reward circuitry, eliding faculties such as memory, sensory perception, executive function, and language; it might simulate minds at a relatively coarse-grained level of functionality, omitting lower-level neuronal processes; it might replace commonly repeated computations with calls to a lookup table; or it might put in place some arrangement whereby multiple minds would share most parts of their underlying computational machinery (their “supervenience bases” in philosophical parlance). Such tricks could greatly increase the quantity of pleasure producible with a given amount of resources.”
Nick Bostrom, Superintelligence: Paths, Dangers, Strategies