From the universe to the dataverse. How light becomes information, and information becomes light.
Source: The Economist
America’s biggest spread of silicon for turning photons into power sits in the Mojave desert about 100km north of downtown Los Angeles. The Solar Star facility consists of 1.7m solar modules on 13 square kilometers of land. On sunny days it can feed more than half a gigawatt into the grid. In a nice inversion of the old solar-booster’s saw that the Sun provides enough energy in an hour to power civilization for a year, Solar Star provides enough energy in a year to power civilization for about an hour.
Such installations are changing the world. Other photon-driven technologies, those concerned with the gathering and transmission of information, have already done so.
Five-and-a-half hours’ drive north-west from Solar Star, at a national laboratory in Silicon Valley called SLAC, you can find the world’s biggest spread of silicon for turning photons into information. It consists of 189 specialized chips arranged in a disk about 64cm across which will occupy the focal plane of the world’s largest digital camera. Over ten years it will produce a database in which the positions and behaviors of hundreds of billions of celestial objects will be stored.
This array is remarkable in various ways, including its size, the fidelity of its electronics and the precision of its alignment. But in its essence it is not that different from the heart of a 1980s camcorder. Just as it is possible to produce a chip with millions of transistors on it, it is possible to produce a chip with millions of photon receivers on it. Fitted with the right lenses and mirrors, such chips can take pictures.
Later this year the array now at SLAC will be shipped to the Vera C. Rubin Observatory, a new facility in the Chilean Andes named after an American astronomer who pioneered research into “dark matter”. There light from distant galaxies will bounce off the three mirrors of the observatory’s telescope and pass through three huge camera lenses so as to form a pin-sharp image on the array’s perfectly flat 189-chip surface. On each of those chips sit 16m “charge-coupled devices” (CCDs), each containing a p-n junction where incoming photons can knock loose electrons. Each time a far-flung photon does so, the liberated electron is stored in a tiny capacitor. During the 15 seconds of a typical exposure some of these capacitors will store dozens of electrons. Some will store none.
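That accumulation step can be sketched in a few lines of code (the numbers below are invented for illustration and are not the camera’s real specifications): each pixel’s photon arrivals are drawn at random, and each absorbed photon adds one electron to that pixel’s capacitor.

```python
import numpy as np

# Toy sketch of charge accumulation during a CCD exposure. The sensor size,
# photon rates and quantum efficiency are invented for illustration; a real
# Rubin chip has roughly 16m pixels.
rng = np.random.default_rng(42)

n_rows, n_cols = 8, 8                 # tiny toy sensor
exposure_s = 15                       # typical exposure length from the article
mean_photons_per_s = rng.uniform(0.0, 3.0, size=(n_rows, n_cols))  # hypothetical sky
quantum_efficiency = 0.9              # assumed fraction of photons that free an electron

# Electrons held in each pixel's capacitor at the end of the exposure:
# some pixels store dozens, some store none.
electrons = rng.poisson(mean_photons_per_s * exposure_s * quantum_efficiency)
print(electrons)
```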
At the end of the exposure, each of the array’s 3.2bn CCDs will pass its electrons on to the element next door like a string of emergency workers passing sandbags. Circuits at the end of the sandbag line will count the electrons from each element, and use their number to establish the brightness of the corresponding pixel in the resultant 3.2 gigapixel image. These images will contain 50 times more data than those produced by the best digital cameras used in cinema. They will capture patches of sky 40 times the apparent size of the Moon at a level of detail that would pick out a golf ball 25km away; the faintest of the millions of things seen in each frame will be 25m times fainter than the dimmest stars that can be seen with the naked eye.
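The sandbag-line readout can likewise be mimicked in miniature. The function below is a simplification, not the observatory’s electronics: each step measures the charge packet at the end of the line, then passes every remaining packet one element along.

```python
import numpy as np

def read_out(electrons: np.ndarray) -> np.ndarray:
    """Toy 'sandbag line' readout: shift charge packets one element at a
    time towards the output circuit and count each packet as it arrives."""
    n_rows, n_cols = electrons.shape
    counts = np.zeros_like(electrons)
    charge = electrons.copy()
    for step in range(n_cols):
        counts[:, n_cols - 1 - step] = charge[:, -1]  # measure the packet at the end of the line
        charge[:, 1:] = charge[:, :-1]                # pass every other packet one place along
        charge[:, 0] = 0
    return counts

pixels = np.array([[3, 0, 12], [7, 1, 0]])
assert np.array_equal(read_out(pixels), pixels)       # the counts reproduce the stored charge
```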
But single frames are not enough. The telescope will scan the whole sky every few days for a decade, producing hundreds of images of every part of it. Comparing each new image with its predecessors will reveal celestial change: bodies moving, brightening, vanishing. Unusual changes will need to be swiftly followed up to see if they reveal something fundamentally new, which means data must be got off the mountain as fast as possible. That will be done with photon-based technology as remarkable, in its way, as the CCD array—but as ubiquitous as the great camera is unique.
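The comparison itself can be caricatured in a few lines: subtract a reference image from the new exposure and flag pixels whose difference stands well above the noise. Real alert pipelines add calibration, alignment and point-spread-function matching, none of which is modelled in this sketch.

```python
import numpy as np

def flag_changes(new: np.ndarray, reference: np.ndarray, threshold_sigma: float = 5.0):
    """Toy difference imaging: return (row, col) positions where the new
    exposure departs from the reference by more than threshold_sigma times
    the scatter of the difference image."""
    diff = new.astype(float) - reference.astype(float)
    noise = diff.std() or 1.0          # avoid dividing by zero for identical images
    return np.argwhere(np.abs(diff) > threshold_sigma * noise)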
The information that comes into the observatory as a drizzle of photons from the far reaches of the universe will leave it encoded on a stream of photons pulsing down an optical fibre. A Chilean outfit which provides connectivity for science and education, REUNA, gets the data from the observatory to Santiago. Another data-service provider, Ampath, provides a link to Miami using either an undersea cable in the Pacific or one in the Atlantic, depending on traffic. From Miami the data flash to SLAC. If the software which checks for changes sees something exciting the world will receive breaking news from the most distant of elsewheres less than a minute after the relevant photons arrived at the camera.
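A back-of-envelope calculation shows why a minute is plenty. Assuming 16-bit pixels and, purely for illustration, an end-to-end path of 100 gigabits per second (neither figure is given in the article), a single exposure would move in well under a second:

```python
# Rough timing for moving one exposure off the mountain. The pixel count
# comes from the article; bytes per pixel and link speed are assumptions
# made purely for illustration.
pixels = 3.2e9                    # 3.2 gigapixels per exposure
bytes_per_pixel = 2               # assumed 16-bit readout
link_gbps = 100                   # assumed end-to-end throughput

image_gb = pixels * bytes_per_pixel / 1e9
seconds = image_gb * 8 / link_gbps
print(f"~{image_gb:.1f} GB per image, ~{seconds:.1f} s in transit")   # ~6.4 GB, ~0.5 s
```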
And while en route to and from California, the data will be surrounded, within those optical fibres, by vastly more, and vastly more mundane, data travelling from person to person or from device to cloud. In terms of astronomy the Rubin Observatory’s 60,000 terabyte database will be the biggest thing ever. In terms of the data that today’s world produces and moves around it is a drop in the ocean.
Optical-fibre networks are the backbones of every national telecoms system; they connect six of the seven continents to each other and link the phone masts which serve billions of smartphones to the clouds where their data can be processed and stored. Big data can only be usefully big because these narrow highways provide so much data-transfer capacity.
Read the full article at:
https://www.economist.com/technology-quarterly/2021/01/07/from-the-universe-to-the-dataverse