Gutenberg and Automation – February 20, 2011

It’s hard to believe that, up until the late 1700s, every item you could buy—with the one exception described below—was made individually by hand. From a piece of pottery or textile to a complex mechanical device like a firearm, every item was planned, shaped, and assembled by the hands of a craftsman. Each was unique. Even the screws in the gunlock were cut by the hands of a mechanic. This was the pre-automation age.

The exception, of course, was books, pamphlets, and newspapers. Beginning with Johannes Gutenberg in the mid-1400s, printers were turning out paper documents with an entirely new system. Rather than hand-copy each document letter by letter on parchment or vellum, a printer set up a master image of the page in lead type, placed it in a press, and copied out whole pages and folios one after the other. The printer was essentially creating a mold or a prototype and then making faithful copies of it as needed.

In 1778, at the suggestion of a French general, Honoré Blanc began producing firearms from interchangeable parts. This required parts made to exacting specifications, so that they could be assembled into a workable musket without artisan-style filing and fitting. That was the beginning of an industrial mindset.

In 1803, Joseph Marie Jacquard saw that the patterns of woven cloth, which tend to be repetitive, could be described by a number of steps that manipulated the threads of warp and weft. He set up an attachment to a loom that read those steps from a series of punched cards and so automated the weaving process. Like Gutenberg, Jacquard used a complicated setup procedure to turn manufacturing into a simple, repetitive process. He also foreshadowed the concept of the executable “program,” which was later adopted by Charles Babbage.
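To make the analogy concrete, here is a minimal sketch, in modern Python rather than punched cardboard, of the idea Jacquard mechanized: encode the pattern once as a deck of cards, then execute that deck row after row. The eight-thread loom and the sample pattern below are invented for illustration only, not taken from any historical card set.

```python
# A minimal sketch of the Jacquard idea: costly one-time setup (encoding the
# pattern), then cheap, repeatable execution. The loom size and pattern are
# invented for illustration.

# Each "card" is one row of the pattern: 1 = hole (raise this warp thread),
# 0 = no hole (leave it down). Eight warp threads, four cards.
CARD_DECK = [
    [1, 0, 1, 0, 1, 0, 1, 0],
    [0, 1, 0, 1, 0, 1, 0, 1],
    [1, 1, 0, 0, 1, 1, 0, 0],
    [0, 0, 1, 1, 0, 0, 1, 1],
]

def weave(deck, rows):
    """Run the same card deck over and over to produce `rows` rows of cloth."""
    cloth = []
    for row in range(rows):
        card = deck[row % len(deck)]  # the deck loops, like a stored program
        # Render the row: '#' where the card has a hole, '.' where it does not.
        cloth.append("".join("#" if hole else "." for hole in card))
    return cloth

if __name__ == "__main__":
    for line in weave(CARD_DECK, 8):
        print(line)
```

All of the craft lives in the card deck; the loop that executes it is trivial, which is the whole point of the setup-once, run-forever approach.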

Gutenberg-type thinking—complicated setup, easy execution—came to dominate the world of manufacturing through reusable molds and injectable substances like plaster and various resins (also known as “plastics”), or through the dies that shape hot metal in the forging process. A master craftsman designs and makes the prototype to be cast into a mold or cut into a die, and machines do the rest.1

More than that, molding led from Edison’s first phonograph in 1877—in which the sound waves were mechanically inscribed for immediate playback on tinfoil sheets wrapped on cylinders—to master disks on which those waves were captured only so that they could be stamped into wax or vinyl copies and sold cheaply. And so reproduced sounds came into the home and the broadcast studio.2

With this running head start, the trend toward reproducibility and automation has grown to the point that, today, virtually anything you buy is machine processed or machine made. The only exception would be high-status items that you might buy precisely because they are hand made and reflect the craftsmanship or artistry of an individual. For everyday consumables, it’s machine made all the way.3

Automation implies many things for the industrial process in addition to machine manufacture. Usually, with the standardization of parts, it means the standardization of products—that is, more sameness. But with a few programming changes and a few extra steps in the processing line, remarkably customized products are also possible.

Automation also trends toward fewer and more modular parts, as engineers rework the manufacturing process again and again. When I worked at Northern California’s major energy utility, Pacific Gas & Electric, in the 1980s, I had the chance to see this in action when I toured the plant of one of our newest customers, New United Motor Manufacturing, Inc. (NUMMI). This manufacturer had just set up in the old General Motors assembly plant in Fremont, California, to make Chevrolets and Toyotas.

The first thing I noticed was that the parking lot, big enough for a thousand employees’ cars, was only a third full. I asked the manager if it was a holiday or a reduced shift, and he said no, they were at full capacity. In the body assembly area, workers pushed pallets of frame parts, fenders, and doors into position. Robot arms reached out to place them on a jig, and welding heads dipped down to make the hundreds of spot-welds that hold the car together. In the paint shop, whole car bodies were dipped in primer and paint and dried in thermal tunnels. In final assembly, workers added the dashboard instruments by clicking a module into place and connecting a multiplug. (In earlier days, a car’s instruments were separate gauges and dials, which required someone to drill holes in the dashboard, insert and secure each instrument, and connect each one to separate wiring harnesses for signals and power.) Partly, this modularization was due to digital advances—the new instrument cluster was essentially a computer display with various readouts from the car’s engine management system—but partly it was due to engineers redesigning the product to fit a streamlined assembly process. All of this assembly work employed complex machines in place of human workers, which explains the much smaller parking lot.

Automation also enables the manufacture of parts that human hands simply cannot make. When I worked for Read-Rite, maker of the read-write heads for computer disk drives, in the 1990s, I learned that people could assemble the heads when the transducers were fairly large, about the size of a grain of rice. But when the transducer became too small, about the size of a grain of sand, machines had to take over. More recently, Lexar, a maker of the memory chips used in flash drives and cards, released a video describing its plant in Utah as employing thousands of people. But most of them are engaged in testing, final case assembly, and retail packaging. Making the chips themselves—described as requiring 800 different process steps and taking a month to complete—is basically a Gutenberg-type printing and etching operation performed by machines in clean rooms. Humans are simply too imprecise, not to mention too dirty, to be allowed to do this detailed work.

As we move into the 21st century, which has been billed as the century of the life sciences, Gutenberg raises his head again. Many of the manufacturing processes of the future will involve the creation of complex chemicals—new pharmaceuticals, refinable oil from the lipids in algae, exotic fibers like spider silk, vulnerable commodities like raw latex4—from animal, plant, and microbial cells with modified genomes. Once again, scientists will put their creative energy into the up-front work of programming these genomes; then they will simply turn the cells loose in vats of nutrient broth, or expose them to sunlight, so that they can multiply and make product.

Automation is the way of the future—not because managers and factory owners are greedy and would rather employ tireless machines than people who constantly demand better wages, benefits, and perks—but because humans simply can’t make these things. For anything more complicated than a hammer, you need a precision machine. For something as complicated as a jet engine, you need vanes, shafts, and bearings all manufactured to micron tolerances. A blacksmith or even a master tool and die maker can’t work this reliably, over and over again, to make the thousands of parts needed for one engine. You need the craftsman to shape the first vane, perhaps creating the design by following a computer simulation of the airflow. After that, computer-controlled machines will forge, cut, and finish the thousands of actual working vanes in the engine.

We in the United States feel bad because we let the Chinese make our hammers and our pots and pans, and do the final case assembly on our iPhones and iPads. We let them do this work because China has lots of hands willing to work for a fraction of U.S. prevailing wages. But the guts of the iPhone or iPad, the chips, displays, and other high-tech goods—whether made in Utah or Guangzhou—are still made by machines. And when Chinese hands become too expensive to assemble them, that assembly will move to Bangladesh or the Sudan. Or, more likely, to a really fast machine located anywhere in the world, even back in the U.S. if the tax and infrastructure conditions are right.

That is the way of the future. More on what this means for the economy next week.

1. It’s ironic that in the early 2000s the world of documents—books, pamphlets, and newspapers—is now moving away from the printed word and toward direct display of the electronic word on screens. Digital electronic bits and bytes, being more stable and reproducible than analog waves, lend themselves better to copying and transmission.

2. See “Coming at You” from October 24, 2010 in Science and Religion.

3. If you don’t believe this, watch any episode of How It’s Made on the Science Channel. What you see is machines making things and dropping them in bins. Human hands are generally visible only loading the raw materials and transferring finished products.

4. The industrial world’s dirty little secret is that natural rubber, or latex, has advantages over the best synthetics but is susceptible to a leaf blight common to its native South America. That’s why the densely planted acres of rubber trees at Fordlandia in Brazil failed. Most of the world’s rubber now comes from plantations in Southeast Asia, where the blight has not yet arrived. A future infestation—all too possible, given the travel and transport opportunities of the modern world—could wipe out our supply of this valuable commodity. Production with modified cells in a vat would change all that.