Can we talk about the cables in our lives? I’ll start: I have a circa-2020 iPhone, which features a Lightning port for charging. My monitor, laptop, and e-reader all have ports for USB-C, the connector that looks like a pill; my car has USB-A, which is the older, rectangular design that is somehow always upside-down. My fancy webcam uses something called micro-HDMI, which is not the same as mini-HDMI or standard HDMI, and to get it to work with my computer, I have to plug its cable into a pair of daisy-chained adapters. I have two sets of wireless earbuds, and they, too, take different cables. If I upgraded to the newest iPhone, which uses USB-C, I’d be somewhat better off, but what about my family, and all of their devices with different ports? Let them eat cable, I suppose.
This chaos was supposed to end, with USB-C as our savior. The European Union even passed a law to make that port the charging standard by the end of this year. I do not live in Europe, and you might not either, but the requirement helped push Apple, which has long insisted on its own proprietary plugs, to get on board. As part of that transition, Apple just put USB-C connectors in its wireless mice and keyboards, which previously used Lightning. (Incredibly, its mice will still charge dead-cockroach-style, flipped on their backs.)
People think the shape of the plug is the only thing that matters in a cable. It does matter: If you can’t plug the thing in, it’s useless. But the mere joining of a cable’s end with its matching socket is just the threshold challenge, and one that leads to other woes. In fact, a bunch of cables that look the same—with matching plugs that fit the same-size holes—may all do different things. This is the second circle of our cable hell: My USB-C may not be the same as yours. And the USB-C you bought two years ago may not be the same as the one you got today. And that means it might not do what you now assume it can.
I am unfortunately old enough to remember when the first form of USB was announced and then launched. The problem it was meant to solve was the same one we face today: “A rat’s nest of cords, cables and wires,” as The New York Times described the situation in 1998. Individual gadgets demanded specific plugs: serial, parallel, PS/2, SCSI, ADB, and others. USB longed to standardize and simplify matters—and it did, for a time.
But then it evolved: USB 1.1, USB 2.0, USB 3.0, USB4, and then, irrationally, USB4 2.0. Some of these cords and their corresponding ports looked identical but had different capabilities for transferring data and powering devices. I can only gesture to the depth of absurdity that was soon attained without boring you to tears or lapsing into my own despair. For example, the Thunderbolt standard, commonly used by Apple and now on its fifth iteration, looks just like USB-C. But to use it at full capacity, you need to plug it into a Thunderbolt-compatible port, which is identical in appearance to any other port that accepts a USB-C connector. Meanwhile, today’s Thunderbolt cable will probably charge your Android phone, but an older one might not effectively power your current laptop, or some future device. As one manufacturer explains, “For charging most devices including laptops, Thunderbolt 3 will provide virtually identical speeds to USB-C. However, Thunderbolt 4 requires PC charging on at least one port, whereas USB-C charging is optional.” Which … what does that even mean? It means that Thunderbolt is a kind of USB-C that is also not USB-C.
Muddled charging capabilities are not particular to Thunderbolt. If you have ever plugged a perfectly USBish USB cable into a matching USB power brick and found that your device doesn’t charge, or takes forever to do so, that’s because the power your brick supplies might not be supported by the USB-shaped cable and the USB standard underlying it, or it might simply be less than your device demands. (A brick that puts out 5 volts at 1 amp delivers just 5 watts; a laptop may want 60 or more.) Such details are usually printed on the brick in writing so tiny that nobody can read it—but even if you could, you would still have to know what it means, like some kind of USB savant.
This situation is worsened by the fact that many manufacturers now ship devices without a charging brick. Some, like Apple, say they do this for ecological reasons. But more cost-conscious manufacturers do so to save money, and also because forgoing a brick allows them to avoid certifications related to AC power plugs, which vary around the world.
A lack of standardization is not the problem here. The industry has designed, named, and rolled out a parade of standards that pertain to USB and all its cousins. Some of those standards live inside other standards. For example, USB 3.2 Gen 1 is also known as USB 3.0, even though it’s numbered 3.2. (What? Yes.) And both of these might be applied to cables with USB-A connectors, or USB-B, or USB-Micro B, or—why not?—USB-C. The variations stretch on and on toward the horizon.
Hope persists that someday, eventually, this hell can be escaped—and that, given sufficient standardization, regulatory intervention, and consumer demand, a winner will emerge in the battle of the plugs. But the dream of having a universal cable is always and forever doomed, because cables, like humankind itself, are subject to the curse of time, the most brutal standard of them all. At any given moment, people use devices they bought last week alongside those they’ve owned for years; they use the old plugs in rental cars or airport-gate-lounge seats; they buy new gadgets with even better capabilities that demand new and different (if similar-looking) cables. Even if Apple puts a USB-C port in every new device, and so does every other manufacturer, that doesn’t mean those ports, or the cables that fit them, will do everything you expect cables to do in the future. Inevitably, you will find yourself needing new ones.
Back in 1998, the Times told me, “If you make your move to U.S.B. now, you can be sure that your new devices will have a port to plug into.” I was ready! I’m still ready. But alas, a port to plug into has never been enough.