
Single-atom bits: Future storage solutions...?

DQW Bureau






Speaking of things tiny, we learned last issue that 360 gigabyte notebook hard drives will be here in a few years, and that new magnetic sensing techniques and the use of the spin of electrons (spintronics) hold the promise of a terabit of data in something the size of a credit card. But of course science marches on, and innovative people who just insist on asking ‘Why?’ continue to push the bar ever higher -- or ever smaller, in this case.

Consider the picture -- we're looking at individual atoms of gold on a silicon substrate which have been convinced, by Dr Himpsel and his colleagues at the University of Wisconsin at Madison, to self-assemble into regular rows 1.7 nanometers apart, with a consistent 1.5 nanometer spacing between each atom within a row. According to Dr Himpsel, "We can actually get atoms to assemble themselves... precisely, without any type of lithography. It is actually quite simple, and my graduate students make the surfaces routinely now."

But why are some atoms missing in the picture? Because they were intentionally ‘written out’ using an atomic force microscope, which of course is the same thing as writing data INTO this tiny array. At each spot where an atom ought to be, the data is a ‘one’ if the atom is there, and a ‘zero’ if it's not.
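To make that encoding concrete, here's a minimal sketch in Python (the row contents are made up for illustration; the real readout is done with the microscope, not software):

# Each lattice site in a row either holds a gold atom (True) or is vacant (False).
# Reading the row is just mapping presence/absence onto ones and zeros.
row_of_sites = [True, True, False, True, False, False, True, True]

bits = [1 if atom_present else 0 for atom_present in row_of_sites]
print(bits)   # [1, 1, 0, 1, 0, 0, 1, 1]

# Packed together, this eight-site row stores one byte:
value = int("".join(str(b) for b in bits), 2)
print(value)  # 211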


That's a data density of 250 trillion bits per square inch. Or 7,800 DVDs full of movies -- in one square inch. Which is storage that is denser than the DNA that defines -- us!
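Those numbers check out against the row and atom spacings quoted earlier. Here's a back-of-envelope sketch in Python (assuming a simple rectangular grid of 1.7 nm by 1.5 nm per bit site, and roughly 4 GB per DVD):

# Area claimed by one bit site: row pitch x atom pitch, in nanometers.
area_per_bit_nm2 = 1.7 * 1.5        # 2.55 nm^2 per bit

# One inch = 2.54e7 nm, so one square inch in nm^2:
sq_inch_nm2 = (2.54e7) ** 2         # ~6.45e14 nm^2

bits_per_sq_inch = sq_inch_nm2 / area_per_bit_nm2
print(f"{bits_per_sq_inch:.2e}")    # ~2.53e+14, i.e. ~250 trillion bits

# At roughly 4 GB per DVD, one square inch holds about:
dvds = bits_per_sq_inch / 8 / 4e9
print(round(dvds))                  # ~7900 DVDs, close to the 7,800 in the text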

"The density and readout speed of DNA quite similar to our silicon

memory," While DNA uses 32 atoms to store one bit using one

of four base molecules, silicon memory uses 20 atoms including the atoms

between the individual atoms that store the bits."

What's particularly fascinating about this is that 250 terabits/square inch is nowhere near the ultimate density for this type of storage. In order to gain room-temperature stability, these researchers trod a careful middle ground between ultimate density and practicality. Although this process is "impractically slow at present," Georgia Tech professor Phillip First indicates that this work is


"... a realistic analysis of bit stability, which is good;

recording density, which is high; and readout speed... It is a very

impressive demonstration of the practical limits of two-dimensional data storage

using single-atom bits."


‘Single-atom bits.’ That's pretty impressive, indeed.

And they're working on that slow "read speed"; they figure they have headroom of about 100,000 times, which means that this technique might eventually rival the speed of today's magnetic disks.


Of course this is currently just a laboratory prototype, and there might well be practical issues that prevent its eventual commercialization.

But this is the same way that all of the storage technologies we now take for granted began. Even if this particular technique gets sidelined, the idea of our storing data at the same scale as Nature does now seems a likely, er, evolution. And might that not change more than a few rules...?


Again, Don't Blink!

CPU Update - Want a Teraflop?


It's not a new cuddly toy, but the term "teraflop" (one TRILLION floating-point operations per second, a computing speed often associated with supercomputers) will soon be IN a toy, when the joint work of Sony, IBM, and Toshiba hits the market. A chip called the "Cell" is due in 2005, perhaps in a coming generation of Sony video game consoles and other "computing devices." What makes the Cell different from today's typical microprocessors or Digital Signal Processors (DSPs) is that it contains several different types of computing cores (or cells), each optimized to its own task (such as video processing, high-bandwidth communications processing, and more), and these processing cells can be interconnected in different ways under program control to optimize for the task at hand.
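The Cell's actual programming interface isn't public, but the core idea, specialized cores chained together under program control, can be sketched in a few lines of Python (everything here, function names included, is a hypothetical illustration, not the Cell's real API):

# Hypothetical illustration: several specialized cores, connected
# into a pipeline chosen at run time by the program.

def video_core(frame):
    """Core optimized for video processing (stand-in: invert pixel values)."""
    return [255 - px for px in frame]

def comms_core(frame):
    """Core optimized for high-bandwidth I/O (stand-in: chunk data for transmission)."""
    return [frame[i:i + 4] for i in range(0, len(frame), 4)]

# "Program control" here is simply choosing how the cores are chained;
# a different task could interconnect the same cores differently.
pipeline = [video_core, comms_core]

frame = [0, 64, 128, 255, 32, 96, 160, 224]
for core in pipeline:
    frame = core(frame)
print(frame)  # [[255, 191, 127, 0], [223, 159, 95, 31]]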

If the Cell does come out of the manufacturing process as intended, and if software developers succeed at the tricky task of optimizing for the Cell's capabilities, then we may hit a new high in what consumer devices can do. And what particularly fascinates me about this development is that its prime motivation isn't the next generation of supercomputers or PCs, but the PlayStation -- a game.


Which just underscores what I'm continually re-learning from the evolving history of computing: that games often drive the advances that eventually become de rigueur for business computing (remember color monitors, sound cards, 3D graphics acceleration, and more...). "Gaming" is not a game that any of us who are interested in how we and our businesses will be using technology dare ignore.

It's "your turn..."
