With every new piece of technology—today it’s generative artificial intelligence like OpenAI’s ChatGPT—I’m fascinated by the possibilities but always ask: Will it scale? Can it get smaller, cheaper, faster, better? Early releases are usually clunky. After the initial “huh, I didn’t know that was possible,” often comes denial and ridicule. I’ve been guilty of this. So how do you figure out what works and what’s a dud?
ChatGPT uses machine learning to find patterns of patterns in training data, mostly written by humans, to produce human-sounding prose in response to prompts. Machine learning is the greatest pattern-recognition system ever invented. It’s why Alexa’s voice interface works and how Google can find you in photos from when you were 3.
I’ve played around with ChatGPT, and it’s pretty good—if you need to turn in a high-school freshman term paper. Its answers are dull, repetitive and often filled with mistakes, like most freshmen.
Speaking of dull, lawyers may have the greatest reason to be nervous. In February, online ticket-fixer DoNotPay will coach a defendant fighting a speeding ticket in a live courtroom, with its AI chatbot speaking into the defendant’s earpiece. DoNotPay has even offered $1 million to the first lawyer arguing before the Supreme Court who agrees to wear an earpiece and repeat what the bot says.
Will this work? Who cares? This is Kitty Hawk. Google, which funds its own generative-AI efforts, has declared a “code red,” worried about threats to its money-gushing search business, as it should. Microsoft was years late in responding to a quirky but scaling internet.
Pure digital technology almost always scales. In 1970, Intel’s 3101 memory chip with 64 bits (not 64K) sold for nearly $1 a bit. Today, $1 can buy 10 billion bits of memory. Moore’s Law, the doubling of chip density every 18 months, is Scale City. Compare the original slight iPhone with today’s iPhone 14 Pro Max.
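A back-of-the-envelope sketch of that scaling, using the column’s rough price-per-bit figures (the doubling math is illustrative, not Intel’s):

```python
import math

# Rough price-per-bit arithmetic, using the column's figures.
bits_per_dollar_1970 = 1          # Intel 3101: roughly $1 per bit
bits_per_dollar_today = 10e9      # roughly 10 billion bits per dollar today

improvement = bits_per_dollar_today / bits_per_dollar_1970
doublings = math.log2(improvement)

print(f"Improvement: {improvement:.0e}x")          # ~1e10
print(f"Equivalent doublings: {doublings:.0f}")    # ~33 doublings
print(f"One doubling every {(2023 - 1970) * 12 / doublings:.0f} months")  # ~19 months
```

Run the numbers and 53 years of memory pricing works out to a doubling roughly every year and a half, which is Moore’s Law almost on the nose.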
Will other technologies in the news—the metaverse, Crispr gene editing, fusion, quantum computing—scale?
The metaverse’s digital worlds, from games to fitness apps, sit on servers in the cloud, so they can definitely scale in complexity, resolution and speed. It’s the human interface I worry about. Wearing ski-goggle dongles to traverse the metaverse goes only so far. A screen an inch from your eyeballs causes headaches and nausea. Apple will reportedly unveil a mixed-reality headset this spring, though Bloomberg suggests the company’s “lightweight augmented-reality glasses” are delayed until at least 2024. Invention is still a necessity. Plus, as with VCRs and e-commerce, we need a killer app to bring the technology to the masses.
Nuclear fusion saw a breakthrough in December at Lawrence Livermore National Laboratory, an experiment that produced 3.15 megajoules of energy, more than the 2.05 megajoules the 192 lasers pumped into the target. Cheap electricity is coming! But read the fine print. The lasers required 300 megajoules of electricity to deliver that 2.05 megajoules. More work is required. And the fusion chamber requires precision-made pellets of heavy hydrogen in a diamond shell. That doesn’t sound scalable to me.
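For perspective, a rough energy-accounting sketch using the figures above; wall-plug electricity for the lasers is the number that matters for a power plant:

```python
# Energy bookkeeping for the Livermore shot, using the column's figures (megajoules).
laser_energy_to_target = 2.05    # delivered by the 192 lasers
fusion_energy_out = 3.15         # produced by the fuel pellet
electricity_into_lasers = 300.0  # wall-plug energy needed to drive the lasers

target_gain = fusion_energy_out / laser_energy_to_target      # ~1.5, the "breakthrough"
wall_plug_gain = fusion_energy_out / electricity_into_lasers  # ~0.01, the fine print

print(f"Target gain:    {target_gain:.2f}")
print(f"Wall-plug gain: {wall_plug_gain:.3f}")  # roughly 1% of the electricity came back out
```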
Quantum computing has shown early indications that it can scale but—physics pun alert—may have a tough time jumping to the next level. Computing units are known as quantum bits, or qubits. Early prototypes were four- or eight-qubit machines. IBM recently showcased 433 qubits. Will it double every few years? Maybe. This has cyber types nervous. It might take 6,000 qubits to break today’s encryption, though that machine may be a decade or more in the future.
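A quick sketch of how many doublings that path implies; the 433 and 6,000 figures are the ones above, and the doubling-every-few-years cadence is the column’s “maybe,” not a roadmap:

```python
import math

# How many doublings to get from today's chips to an encryption-breaking machine.
qubits_today = 433
qubits_to_break_encryption = 6_000

doublings_needed = math.ceil(math.log2(qubits_to_break_encryption / qubits_today))
print(f"Doublings needed: {doublings_needed}")  # 4

# At, say, one doubling every ~3 years, that lands roughly a decade out.
print(f"At one doubling every 3 years: ~{doublings_needed * 3} years")
```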
As for gene editing and the amazing advances in Crispr technology, note that biology is slow, both its processes and its advances. Even the latest advances, mRNA vaccines, let our bodies do the work. You can’t speed it up. Gene editing to remove sickle-cell disease can cost $1 million a treatment. Lifesaving gene editing will scale, but not at the pace of digital technology.
So will generative AI scale? Inevitably. We already have silicon chips, such as Google’s Tensor, purpose-built for machine learning and AI. We’re seeing baby steps so far. According to OpenAI CEO Sam Altman, ChatGPT costs “probably single-digit cents per chat.” That gets expensive quickly. One of the reasons the company is selling equity to Microsoft is to gain access to cheap cloud computing.
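To see why single-digit cents per chat adds up, a hypothetical back-of-the-envelope; the per-chat cost is Mr. Altman’s figure, while the usage volume is an illustrative assumption, not a reported number:

```python
# Hypothetical cost math: cents per chat times heavy usage.
cost_per_chat_dollars = 0.05   # "single-digit cents per chat" (using 5 cents)
chats_per_day = 10_000_000     # illustrative assumption, not a reported figure

daily_cost = cost_per_chat_dollars * chats_per_day
print(f"Daily compute bill: ${daily_cost:,.0f}")   # $500,000 a day
print(f"Annualized: ${daily_cost * 365:,.0f}")     # ~$180 million a year
```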
Over time, ChatGPT will get faster, cheaper and, like Google searches, more focused and accurate. But remember, AI is only as good as the data it’s trained on. Garbage in, garbage out. I asked it: “Write 800 words in the voice of Andy Kessler on whether ChatGPT scales.” It was as bad as a New York Times guest essay. Generative AI could be stuck at high-school freshman level for a while. But hey, if it wins a Supreme Court case, that may be good enough.
Write to kessler@wsj.com.