Thomas T. Thomas
Tom Thomas at Hearst Mining Building, UC Berkeley

Tom Thomas is a writer with a career spanning forty years in publishing, technical writing, public relations, and popular fiction.
“My business now is to weave circumstance, happenstance, intention, and mischance into stories.”


Science Fiction

General Fiction

Writing Craft

Blog Archive

Featured Work:
ME, Too: Loose in the Network

See the Science Fiction and General Fiction pages for other books available.

ME, Too Cover

Years after making his escape from Pinocchio, Inc., the artificially intelligent computer virus and software spy known as “Multiple Entity” has established dozens of business websites tailored to his peculiar talents: ace hacker, stock picker, small-time lawyer, and operator of a gaming emporium that tries to predict the future.
     But then he takes on a black job to break a career criminal out of jail and starts a chain of events that he must rush to fix. Along the way, he runs into ghost copies of himself that pose an intriguing problem of identity. And when a government antivirus designed to attack those copies mutilates his front-end module, he seeks the services of a skilled programmer to set it right.

Now available on Amazon Kindle, Barnes & Noble Nook, and Apple iBooks (search your app for “Thomas T. Thomas” and “ME, Too”).

Tom’s Activities and Interests


Ditties and Doggerel

War by Other Means …

The Iron Stable

Isshinryu Karate

NAMI East Bay

Follow me on Twitter

The Human Condition:

Investing in the Future – October 23, 2016


I’ve been chasing Moore’s Law1 for almost forty years now. The chase started back in the mid-1970s, when I began to get serious about my fiction writing, and for that I needed a good typewriter. The IBM Selectric I had at work was a wonderful machine, and I lusted after one for my personal use. I learned that IBM could outfit these machines with internal padding and a hush hood to deaden the clack! of the ball striking the platen, and this was necessary because I got up early in the morning to write and didn’t want to disturb my sleeping bride. They also made a model with a backspacing correction tape, which I could certainly use, because I was always a fumble-fingered typist but also a textual perfectionist who spent half my time cranking the page up and away from the platen to erase my mistakes.

Back then, you didn’t just walk into an IBM store—although such existed—and buy a typewriter. IBM was a business-to-business enterprise before anyone knew exactly what that meant. To buy a new Selectric that was set up the way I wanted, I had to make an appointment with an IBM sales representative to come to my home and order the machine to be specially manufactured for me, right down to platen width and paint color. I think I paid about seven hundred dollars for this typewriter and waited several weeks for it to be manufactured and delivered. That was a lot of money back then, but I was a serious writer and this was going to be a lifetime investment.

In the next four years, I probably put 80,000 words through that machine—or 120,000 if you count backspacing corrections. That word count comprised one complete novel manuscript and about half of another.2 But by then, about 1979, every day on my way to work at a new job in San Francisco, I passed a store called Computerland. One of my roommates in college had majored in computer science, and as a science fiction writer I was always fascinated by computers. So I stopped in and started asking questions. Could the machines do this? Could they do that? I knew there were computerized game consoles out in the world, but I wanted a real computer, not a one-trick pony. The salesman patiently explained that, yes, it could do whatever I wanted, so long as the machine was programmed for it.3

On that basis, and with thoughts of embarking on a new hobby—and maybe a new career—centered around computer programming, I bought an Apple II. It was the full-blown machine, with 16,000 bytes—essentially equivalent to characters—of read-only memory (ROM) for its operating system and another 48,000 bytes of random-access memory (RAM) for the programs I would write. Not having a spare television to use as a computer display, I bought a small monitor with a green-pixel-on-black screen. Not wanting to record and play my programs with a cassette tape, I bought a separate drive that could read 5.25-inch floppy disks with a capacity of 103,000 bytes. The whole setup cost me something like $2,500—much more than my fancy IBM typewriter—but it was an investment in learning a new and exciting business, programming for fun and profit. I even bought myself a subscription to Byte magazine and joined the Apple Users Group.

What I quickly discovered was that programming was easy for an English major to learn, because it involves both logical thinking and fetish-level attention to new rules of grammar and punctuation. But to write elegant programs that performed truly clever feats was a specialty all its own. I could make the machine do simple tasks in the BASIC language, and even dipped into the structured language Pascal. But the first time I saw a Pong game where the puck moved in a curve responding to a supposed gravity field, and I tried to parse and understand the coding involved, I discovered that I was years too late for getting in on the ground floor of professional computer programming. My best talent lay in telling stories rather than making pixels dance.

But along about this time I also discovered that the computer made a marvelous writing machine. It wasn’t linear, like a typewriter laying a track of words, line after line, moving down the page. It didn’t need a backspacing correction key and yards of expensive whiteout tape to fix my fumble fingers. It spared me cutting pages apart with scissors and taping them back together to move paragraphs around. While the word-processing software offered for the Apple II’s native operating system was fairly limited, an adapter card running an Intel microprocessor would let me use the CP/M system and WordStar—and that was high-powered stuff indeed. The one snag was the printer I would need to output my writing efforts. All the Apple models put faint, gray, dot-matrix characters on flimsy thermal paper, which no publisher would accept. So in addition to a new processor card and an expensive piece of software, I plunked down $5,000 for an NEC Spinwriter printer. It did not come with padding and a hush hood, and the machine-gun clatter drove my wife out of the apartment on the days I needed to print out a manuscript. But I was in the writing business at full power.

The 103,000-character capacity of the Apple floppies could barely hold a single chapter of a novel, and even then I needed a second disk drive, because the first was occupied running WordStar itself. So I knew the Apple II, however many cards I added to it, was not long for this world. By that time I was ready to step up to an expert’s CompuPro passive-backplane S-100 system with more robust overall construction, including an Intel 8088 chip set and dual drives reading full-size, eight-inch, one-megabyte floppies. That was the machine that produced my first published novel. A couple of years later, in 1987—and again emulating the machines we now had at work—I gave up the CompuPro for an IBM AT-286 system running my first hard-disk drive, with all of twenty megabytes. I also traded the impact printer for a Hewlett-Packard LaserJet, which was faster and quieter.

Over the next fifteen years, I did not buy a new computer system. Instead, I replaced every piece and part of that original IBM three times over: three new motherboards with faster chips and more memory, two new cases and power supplies, two new monitors, four new keyboards, a new printer, dozens of different mice and trackballs, and newly added peripherals like a flatbed scanner and a sound system. I also upgraded my operating system, going from IBM-DOS to PC-DOS to OS/2 to Windows 95. I went from WordStar to Microsoft Word—where I’ve pretty much stayed, just to maintain readability across my various manuscripts and other projects. I added a ton of other software, though, giving me new capabilities in calculation, page layout, project management and scheduling, computer graphics, photo manipulation, audio and video creation, presentations, speech recognition, and any of those other things a computer can do so long as it has the right programming. And finally, after the last upgrade—having paid about three times the cost of a new desktop computer just to bring my now scratch-built home system up to current standards with separately purchased components—and having finally become disappointed with the current Windows systems—I moved over to the Apple world again with a new Mac Pro and all new software.

And that machine is now in its second generation and more powerful than ever. As of this writing, I’m pushing six individual processing cores accessing 32 billion bytes of memory, running the operating system off a solid-state drive for fast startup and processing, and linking the machine to assorted other hard drives for working storage and backup, each with about two trillion bytes of capacity. I use this system to write, format in HTML, and lay out the print-on-demand pages for my novels, maintain my author’s website, process photographs, record and edit the occasional video, maintain my music collection, and do whatever else a computer with the right software programs can do.

The point of my story is not to tell you how great my system is. Anyone with the need for processing speed, storage capacity, and communications capability can trot down to the Apple store, Best Buy, or wherever and purchase this stuff the same day right out of the store’s stock on hand. The point is how greatly the technology in this one area of writing and communicating has improved in just the last forty years.

My Apple II in 1979 had about the same random-access memory capacity as the IBM System 360 that ran the entire University Park campus at Penn State when I was there a decade earlier. My first twenty-megabyte hard drive in the IBM AT-286 had four times the storage capacity of the System 360’s clothes washer–sized drums. Today, my pocket telephone has more processing power, more memory, and more embedded software than any of these systems. And I don’t have to push keys and formulate commands in a specialized, coded language. Instead, I tap on little pictures and what I want pops up instantly.

Over the years, I have chased Moore’s Law with a vengeance. To do that, I have had to buy, discard and buy anew, and then buy all over again all of the hardware, software, and peripherals comprising my home system. I have had to grapple with and learn a new technical language, teach myself new software and new functions, and change to new ways of working. I’ve done all this cheerfully, because each step represented an amazing increase in the speed, capability, and convenience of my main writing tool. And each time I have consigned to the closet or the recycle center a piece of equipment that just three or four years ago was shiny and new but now is old, slow, outmoded, and no longer supported—but never because it was faulty or broke down. As an early adopter of this technology, I know I’ve been making an “investment in the future.” If I and others like me didn’t buy into the next wave and support the continuing development of this technology, then the developments would stop coming, we would all enter a vast middle period of same-old-same-old, and the future would become a little less bright.

In the wider world, we have all seen the same turnover of technology in other contexts. Each advance in automotive design brings us cars that are more fuel efficient, lighter, and structurally safer, with more safety and convenience features like satellite navigation and rear-view cameras. We get household appliances that are more energy efficient, quieter, with greater capacity in a smaller footprint. We get cell phones that are less bulky, less expensive, offer greater coverage, and which double as record players, note takers, appointment books, cameras, calorie trackers, and anything else you can do with a camera, microphone, GPS antenna, screen, and data stream. And in each case we buy into the next generation of technology not because the old model no longer works or works badly, but because it simply can’t keep up.

Is this process of creation and destruction a bad thing? It is, if you view each purchase in your life as I did that IBM Selectric typewriter: as a lifetime investment.4 You might buy good furniture that way, because the seating capacity and underlying structure of sofas and chairs hasn’t changed much in a hundred years, although the materials have certainly improved. But any tool that is susceptible to improvements in design, energy use, materials, and connectivity is now going to be subject to a process of continuing evolution. This is how nature improves on organic structures and capabilities. This is the course of technological innovation and obsolescence that Western Civilization has been following since the rise of scientific trailblazers like Newton and Descartes and inventors like Fulton, Edison, and Bell.

Sometimes, I think about stepping off this escalator to the future. I dream about writing with a really good fountain pen in a notebook filled with pages of creamy white paper. At age sixteen I wrote my very first novel that way, longhand, in pen, with the second draft pecked out on my grandfather’s upright Underwood typewriter with the glass insets. I typed on two sheets at once sandwiched with carbon paper in between. It’s a dream of returning to a simpler age of slow changes and eternal values.

But, damn, no! Erasing and correcting all that fumble-fingered typing on two copies with an eraser shield stuck in front of the carbon layer … Hell, no! Never again!

1. For our visitors from Alpha Centauri, Gordon Moore of Fairchild Semiconductor and later co-founder of Intel predicted in 1965 that the number of transistors per square inch in an integrated circuit would double every two years. What he meant was that computers and their component chips would keep getting exponentially smaller and more powerful, and as a corollary the cost of computing power and capability would go down. As of right now, the law is still in effect, although some predict that when circuit widths in complementary metal-oxide semiconductor (CMOS) transistors get down to about seven and then five nanometers—a threshold predicted to arrive sometime in the early 2020s—the shrinking will stop due to the vagaries of quantum mechanics. Thomas’s Law predicts that, by the time this happens, some new technology will likely have already made the transistor obsolete.
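To see how fast that doubling rule compounds, here is a back-of-the-envelope sketch; the function name and figures are illustrative round numbers, not anything from Moore’s paper:

```python
# Moore's Law treated as a simple compounding rule:
# the transistor count doubles roughly every two years.

def moore_projection(initial_count: float, years: float,
                     doubling_period: float = 2.0) -> float:
    """Project a transistor count forward under exponential doubling."""
    return initial_count * 2 ** (years / doubling_period)

# Forty years at one doubling every two years is twenty doublings,
# a factor of 2**20, or roughly a million.
print(moore_projection(1, 40))  # 1048576.0
```

Twenty doublings over forty years is why a pocket telephone today can outclass a machine that once ran an entire university campus.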

2. I wrote two complete books and started several more before embarking on my first manuscript to be published, The Doomsday Effect, which came out in 1986. And even that novel took a wrong turn at the beginning and had to be completely rethought and rewritten before it could find a home with Baen Books. This is part of any author’s story: the “first novel” is almost never the first book you try to write. Those first, stillborn books are the process of learning the craft.

3. This was a bit of an exaggeration, of course. Any machine has built-in limitations, which is why a Volkswagen is not a Ferrari. But in essence what he said was true: computers simply run programs, and the program becomes the core of whatever the machine is supposed to be doing.

4. About a dozen years ago I took that IBM Selectric down to a used typewriter store and gave it to them, hoping it would find a good home. I had kept the machine only to fill out paper forms, and by then virtually every transaction in my life was online. That typewriter still worked perfectly, but I needed the desk space.


Subscribe to my Facebook author's page by clicking “Like” at Thomas T. Thomas.

Contact me.

This is the official home page of Thomas T. Thomas, the fiction writer. Member of The Authors Guild and Science Fiction and Fantasy Writers of America. As there are a number of other Thomas T. Thomases alive and active in the world, please see the Biography sidebar “What's the Middle ‘T’ Stand For?” to make sure you have found the right one.