exceeds quantum limit of Earth-based storage

Doer

Well-Known Member
This phrase is coming up in the marketing literature in my job space. I try to ignore the hype, but at some point I need to figure out what they mean. So I wait until I can find an explanation... well, not one for the common man perhaps, but this one suits the common geek, like me.

Just thought I'd share this interesting concept. Do you know about Moore's Law? Moore observed that the tech that enables computing doubles its capability roughly every 18 months. That was a long time ago. For a while in the early 90s we thought it had stalled, but instead it was an emerging breakout. For storage, we now see Mr. Moore every 9-12 months. Of course, we see Mr. Murphy every day. :)

We are doing 64-bit computing, but storage has moved to 128 bits, to honor Moore's Law (or Moore's Imperative, in the commercial sense). I doubt we will ever need to exceed the quantum limit, since it would take more energy than boiling the oceans, but we will think of something. Enjoy. :roll:
--------------

128-bit storage: are you high?
Some customers already have datasets on the order of a petabyte, or 2^50 bytes. Thus the 64-bit capacity limit of 2^64 bytes is only 14 doublings away. Moore's Law for storage predicts that capacity will continue to double every 9-12 months, which means we'll start to hit the 64-bit limit in about a decade. Storage systems tend to live for several decades, so it would be foolish to create a new one without anticipating the needs that will surely arise within its projected lifetime.
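A quick check of that arithmetic in Python (my addition, not part of the quoted post):

    # from 2^50 bytes (1 PB) to the 2^64-byte ceiling
    doublings = 64 - 50
    print(doublings)                                 # 14 doublings

    # at one doubling every 9 to 12 months
    print(doublings * 9 / 12, doublings * 12 / 12)   # 10.5 to 14.0 years, about a decade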

If 64 bits isn't enough, the next logical step is 128 bits. That's enough to survive Moore's Law until I'm dead, and after that, it's not my problem. But it does raise the question: what are the theoretical limits to storage capacity?

Although we'd all like Moore's Law to continue forever, quantum mechanics imposes some fundamental limits on the computation rate and information capacity of any physical device. In particular, it has been shown that 1 kilogram of matter confined to 1 liter of space can perform at most 10^51 operations per second on at most 10^31 bits of information [see Seth Lloyd, "Ultimate physical limits to computation," Nature 406, 1047-1054 (2000)]. A fully populated 128-bit storage pool would contain 2^128 blocks = 2^137 bytes (at the minimum block size of 512 = 2^9 bytes) = 2^140 bits; therefore the minimum mass required to hold the bits would be (2^140 bits) / (10^31 bits/kg) = 136 billion kg.
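Checking those numbers myself (a sketch, assuming the 512-byte block size noted above):

    # minimum mass needed to hold a fully populated 128-bit pool,
    # using Lloyd's bound of 10^31 bits per kg
    bits = 2**140                # 2^128 blocks * 2^9 bytes/block * 8 bits/byte
    mass_kg = bits / 1e31        # kg
    print(f"{mass_kg:.3g} kg")   # ~1.39e+11 kg, the same ballpark as the 136 billion kg quoted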

That's a lot of gear.

To operate at the 10^31 bits/kg limit, however, the entire mass of the computer must be in the form of pure energy. By E=mc^2, the rest energy of 136 billion kg is 1.2×10^28 J. The mass of the oceans is about 1.4×10^21 kg. It takes about 4,000 J to raise the temperature of 1 kg of water by 1 degree Celsius, and thus about 400,000 J to heat 1 kg of water from freezing to boiling. The latent heat of vaporization adds another 2 million J/kg. Thus the energy required to boil the oceans is about 2.4×10^6 J/kg × 1.4×10^21 kg = 3.4×10^27 J. Thus, fully populating a 128-bit storage pool would, literally, require more energy than boiling the oceans.
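And the energy comparison, again as my own sanity check:

    c = 3.0e8                              # speed of light, m/s
    rest_energy = 1.36e11 * c**2           # E = mc^2 for 136 billion kg: ~1.2e28 J

    heat = 4.0e5                           # J/kg, water from freezing to boiling
    vaporize = 2.0e6                       # J/kg, latent heat of vaporization
    ocean_mass = 1.4e21                    # kg
    boil = (heat + vaporize) * ocean_mass  # ~3.4e27 J
    print(rest_energy / boil)              # ~3.6: more energy than boiling the oceans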
 

Toolage 87

Well-Known Member
I think some people have already exceeded those 2TB HDDs. IMO the problem is that they haven't perfected power efficiency for desktops, and they haven't perfected desktop operating systems either. If companies actually sat down and mastered the programming, we could support 128-bit storage.

e.g. the PS Vita has a quad-core CPU clocked at 2GHz and draws around 2W of power, while a desktop quad-core at 2GHz draws around 50W to 100W. It shouldn't be hard to make 64-bit support 1PB+; there are programs out there that can make, say, XP handle 2TB hard drives. So overall, moving to 128-bit isn't a must-do thing; it's about programming the operating system and such so 64-bit can handle 128-bit stuff.
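For reference, the raw 64-bit ceiling is already way past 1PB; a quick Python check (my addition):

    # how far past a petabyte does 64-bit addressing already go?
    print(2**64 / 2**50)   # 16384.0 PB, i.e. 16 EiB per 64-bit address space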
 

Doer

Well-Known Member
Yeah, I see your point. This is more about the big-iron storage pools: IBM, Oracle, the like. They are already accessing petabytes. It is about the coding, sure enough. But it's also about storing the complete visualization of an entire Airbus A300, at any and every layer of detail, as one big file, then gating it very quickly indeed, through optical Fibre Channel, into super-fast memory. Load the entire thing in minutes instead of days.

And it's about eventually having processors that can crunch 128 bits. In this world we don't talk about individual HDDs. It's all virtualized storage.

The "spindles" are hot swapped, RAID stripped. Millions of $$$ to have 99.9 uptime. Time is money. :)
 

godtowers

Member
What do you think about Quantum Computing?

According to that article, a 500-qubit quantum computer can in theory be simulated with 2**500 complex amplitudes in a normal computer. A complex number needs two 64-bit floats on a 64-bit machine, i.e. 2 * 8 bytes for each member of that 2**500-element state vector. For example, you could use it to simulate a 500-body quantum system (Feynman theorized about them: http://en.wikipedia.org/wiki/Quantum_algorithm#Quantum_simulation), which takes a classical computer an exponential amount of time to simulate (i.e. 2**500 on/off states). I think what that means (my knowledge is limited) is that you have 500 entangled atoms you can play with according to the rules of quantum mechanics. It's a weird fucking thing to wrap your head around, because QM computers are like non-deterministic Turing machines. And then there's tripped-out stuff like the wave equation: it's tripped out for a single particle, and this is a 500-particle entangled system.
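To put numbers on that, here's my own rough sketch, assuming 16 bytes per complex amplitude:

    from decimal import Decimal

    # memory to hold a full 500-qubit state vector on a classical machine
    n = 500
    amplitudes = Decimal(2) ** n         # one complex amplitude per basis state
    bytes_needed = amplitudes * 16       # two 64-bit floats per amplitude
    print(f"{bytes_needed:.3e} bytes")   # ~5.2e+151 bytes: hopeless classically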

What I'm trying to say is, perhaps there is a way to represent storage using quantum mechanical wizardry. I'm pretty ignorant of all that stuff; I just do computer stuff.
 

godtowers

Member
Well, my point was: what if you could develop a QM-based memory module, where everything is encoded into probability amplitudes and stored in a quantum system with 500 qubits, and it, say, fucking stores the internet for eternity?
 

godtowers

Member
Haha, this page goes into it: http://en.wikipedia.org/wiki/Quantum_information. So basically, when you read the computer, the wave function collapses and all you have is 500 classical bits. So the trick is to do your computation with quantum mechanics and have the answer in the final state. I can see why you can only do certain types of problems faster than classical computers.
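A toy illustration of that collapse (my own sketch, shrunk to a 3-qubit register so the state vector fits in memory; needs NumPy):

    import numpy as np

    n = 3
    state = np.ones(2**n, dtype=complex) / np.sqrt(2**n)   # uniform superposition

    probs = np.abs(state) ** 2                  # Born rule: probability = |amplitude|^2
    outcome = np.random.choice(2**n, p=probs)   # "reading" the register collapses it
    print(format(int(outcome), f"0{n}b"))       # all you get back: n classical bits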

I dunno...
 

godtowers

Member
It's Saturday and I'm still up. Oh, I actually read the quantum computing wiki article. A quantum computer is just a computer: it's operations over quantum data. A quantum computer computes in a space that involves 2^n states (n is the number of qubits in the system), hence the keyword "arbitrary superposition of the states." Obviously your algorithm has to exploit that. The computation is done through quantum logic gates, analogous to classical logic gates. How would you come up with the algorithms?
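Here's roughly what those gates look like on a classical simulator (my own NumPy sketch; real hardware applies them physically):

    import numpy as np

    # gates are unitary matrices; Hadamard then CNOT turns |00> into a Bell state
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.array([1, 0, 0, 0], dtype=complex)   # |00>
    state = np.kron(H, I) @ state                   # superpose the first qubit
    state = CNOT @ state                            # entangle it with the second
    print(state.round(3))                           # 0.707|00> + 0.707|11>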
 

Doer

Well-Known Member
Well, I see you are doing your homework. Self-assigned is the only kind. :) To me, it is one of those vast game-changers, like social media, that is very unpredictable. I have not delved very deeply into it. I think there are only 2-qubit machines now. Maybe only two of those. :) Like ENIAC, it takes rooms of supporting equipment. I read something recently that work in string "hardly-a-theory" has at last found a possible experiment, having to do with a 3rd or 4th qubit. String math has the curious distinction of being "not even wrong." So they dearly need an experiment.

All this reminds me of the story of Bill Gates and the first desktop computer, the Altair: just a box with bit-flip switches and lights. If you knew register math, with bitwise shifting and so on, you could flip and shift and read the answer in the lights. No keyboard or monitor, of course, and no BASIC; that was a few years off, still.

Someone had a small transistor radio and noticed how the static on the radio would change when they flipped the bit switches on the Altair. RF energy was understood, so they knew what it was.

The AH-HA!! moment that changed our lives: Bill and crew set out to play "Mary Had a Little Lamb" in RF static, and the personal computer was born that day. The fancy calculator was meaningless. Music!! Now, that's something.

I saw my first computer at a Heathkit store; by then it was the Apple I. A keyboard and a monitor. "What does it do?" The clerk didn't know. "It runs BASIC...?"

Well, I found out what that was when I got a TI-99/4A, with polyphonic sound, programmed in BASIC. I could set up randomized loops within rules of interval and diatonic harmony and let it rip. It could play its own musical creations for hours.
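For anyone curious, the idea looks something like this, a loose Python sketch of my own, not the original TI BASIC:

    import random

    # random walk constrained to a diatonic scale, in the spirit of those loops
    c_major = [262, 294, 330, 349, 392, 440, 494, 523]   # C4..C5 in Hz
    note = 0
    for _ in range(16):
        step = random.choice([-2, -1, 1, 2])             # small melodic intervals
        note = max(0, min(len(c_major) - 1, note + step))
        print(c_major[note])   # feed each frequency to whatever makes sound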

So it is very likely, in this wonderful way we call emergence, that we don't know what the qubit will bring. :)
 

OGEvilgenius

Well-Known Member
Hmm, I recently read some basics on chaos theory, pretty interesting stuff. One thing I will say is we think we know it all, and then we don't. It'll be interesting to see how quantum computing develops. What's up with organic computing? DNA-based?
 

OGEvilgenius

Well-Known Member
You would think quantum might have even more potential than just DNA's 4 characters, though. 1s and 0s definitely limit us.

I just know I read some theoretical stuff years ago about DNA-based computing, but I haven't seen much lately, and I figure I might get more out of someone who obviously works closely in the field.
 

godtowers

Member
OGEvilgenius said:
You would think quantum might have even more potential than just DNA's 4 characters, though. 1s and 0s definitely limit us.

I just know I read some theoretical stuff years ago about DNA-based computing, but I haven't seen much lately, and I figure I might get more out of someone who obviously works closely in the field.
All interesting stuff. Yeah, a quantum computer can have a huge number of equivalent classical bits: just 500 qubits in superposition have an operating space of 2**500 classical states. You can only do these quantum-mechanical computations blind, though; once you look at it, you settle on a single 500-bit element in that 2**500 state space. But you can run a program on a QC and have it spit out the result you want.

I think other types of exotic computational methods have equivalent things, due to the sheer massiveness of the parallelism these computers can achieve. I don't know too much about it. Is it even practical? What kinds of problems can they solve?
 

Toolage 87

Well-Known Member
One thing I've found interesting and crazy is that there have been discs out there that can hold 6TB of data for a while now, and that got me wondering why they haven't released them to the public, besides the cost. We all know the cost would come down the more people demand them, since they wouldn't have to make a very small number of discs at a time, or one at a time.
 

Doer

Well-Known Member
The big rock in the road for parallel processing is the data race condition. It is not theoretically unsolvable, but I don't see it being solved, either.

We can easily have many processor cores working in close concert through their local fast cache memory. And if you are very careful, you can identify these race conditions as more or less likely, but a perfect data race detector is a Grail not yet found. A data race, as the name implies, is when two threads access the same variable and the race is over which thread got the current data before it was changed. The losing thread didn't get the data it expected, but something from after the change.

You can tell which thread was there first, but you can't say why. And although locking techniques have been tried, locking defeats the idea of working in parallel.
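The classic demonstration, sketched in Python (my illustration; the same hazard exists in any threaded language):

    import threading

    counter = 0
    lock = threading.Lock()

    def racy(n):
        global counter
        for _ in range(n):
            counter += 1            # read-modify-write: two threads can read the
                                    # same old value, so one update gets lost

    def locked(n):
        global counter
        for _ in range(n):
            with lock:              # correct, but the threads now take turns,
                counter += 1        # which defeats the point of running in parallel

    threads = [threading.Thread(target=racy, args=(100_000,)) for _ in range(2)]
    for t in threads: t.start()
    for t in threads: t.join()
    print(counter)                  # often less than 200000: updates lost in the race

    counter = 0
    threads = [threading.Thread(target=locked, args=(100_000,)) for _ in range(2)]
    for t in threads: t.start()
    for t in threads: t.join()
    print(counter)                  # always 200000, but the increments serialized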

So the qubit gives us instant parallel processing, but... what was the question, again? :)
 

fb360

Active Member
Doer, it's a cool topic, but it is limited by our current knowledge of electronics and biology, which is exceeding Moore's Law of transistor quantity. I see this every day in my circuit design life. Just wait until we start making 3D transistors common, or better yet superconductors and bioconductors, which will blow the pants off our current wildest dreams. A few great examples of this: stick memory (DDR3), and flash/other external memory. If you told someone 10 years ago that they could go to Best Buy and purchase a 32GB thumb drive for $8.99, they would have told you to fuck off. If you told another individual, 10 years ago, that a 4GB stick of DDR3 memory would be $12.50, they also would have told you to fuck off.

My guess is that in the next 25 years, the amount of computing, and computing power, will far exceed any limits we can think of today. I also hope we work on some cosmos shit.
 

Toolage 87

Well-Known Member
I still believe our best bet is to turn around and stop using one CPU in desktop computers, because we already have a CPU in the PS Vita that is a quad core clocked at 2GHz and draws only around 2W of power. So in reality we could have way more powerful computers than what we already have, and have them draw 1/4 or, heck, even almost 3/4 of the power that they do now. We also have solid state drives, which use flash memory, so we can pretty much eliminate hard disk drives. As for those big sticks of RAM, we don't need them either, tbh. We can make them smaller, faster, higher capacity, and cheaper with the tech we have right now.

We have 64GB memory cards, so all they would have to do is mod one to have a lot of connection surface (within reason) and put it onto a small board, and that's our RAM; or even 2x 32GB memory cards for one RAM stick. If I had the money and know-how, I would turn around and advance the computer industry many years into the future at the rate we are going.

But sadly, as we know, companies don't wanna advance things too fast, because they wanna milk every single penny they can from people before people stop buying that stuff as much, thus holding it all back.
 