The term “internet of things” (IoT) was coined in 1999, but it didn’t start gaining real traction until the early 2010s. Since then, big tech companies, futurists, and industry analysts have promised consumers a future of fully connected households, where every appliance is stitched together in a web of interconnectivity and high-tech convenience.
But the IoT prophecies spun by those optimists have consistently failed to come to fruition. IoT is a hot field, with lots of talented people striving to create innovative new devices or set firm ethical standards for development, but that doesn’t change the fact that we aren’t seeing nearly as much progress as we’ve come to expect.
In 2010, when IoT was just starting to captivate the attention of futurists, industry analysts were predicting that there would be more than 50 billion connected devices in play by 2020. Now that we’re a great deal closer to that date, we know reality will fall far short of that prediction: experts now expect something closer to 20 billion connected devices by then. That’s only 40 percent of the original estimate.
On top of that, for the past several years, year-end tech industry retrospectives have predicted that the coming year will be the year IoT finally takes over. In 2014, it was going to be 2015; in 2015, it was going to be 2016; and so on.
And while it’s true that we’ve seen some great progress—especially as virtual reality, artificial intelligence, and IoT start working together—we aren’t growing in this area nearly as quickly as we once expected. So why is this the case?
The first problem is the excessive optimism of those initial predictions, along with the continued optimism of IoT’s proponents. The late 1990s and early 2000s were a period of extraordinary technological growth. The world was introduced to the possibilities of search engines and internet connections, and eventually the capabilities of the modern smartphone. Moore’s Law was in full force, and both consumers and engineers got used to the idea that technology would always grow exponentially, overcoming new challenges at a consistent, reliable rate.
But by the late 2000s, Moore’s Law was slowing (it has now all but come to an end), and technology was beginning to splinter. Some expected technologies failed to catch on with users (like Google Glass and other smart glasses), and others kept running into logistical challenges that prevented them from ever taking off. That mismatch between expectation and reality led tech pundits to sensationalize IoT before engineers had worked out the kinks, projecting exponential growth prematurely.
The idea behind IoT is both fascinating and exciting for tech enthusiasts: someday, an orchestra of connected devices could communicate with one another to make your life more convenient. But conceptual brilliance doesn’t lead to widespread adoption; for that, you need a strong value proposition to pitch to your customers.
And for the average customer, that value isn’t there. Sure, to a tech enthusiast, it’s exciting to be able to adjust your thermostat with your smartphone from 1,000 miles away, but a simple programmable thermostat can give you the same level of control. And while having a refrigerator that talks to you and keeps track of your groceries sounds like it came out of a best-case-scenario Philip K. Dick novel, to many consumers it’s just a more expensive refrigerator with a few extra bells and whistles. It isn’t life-changing, nor are its features interesting enough to justify the extra cost.
There’s also a lack of defined, consistent standards and protocols for IoT devices, covering how they connect to each other, how and when they gather data, and how (or whether) they can integrate with a central hub. Motivated in part by the overoptimistic projections of futurists in the early 2010s, app developers, engineers, and CEOs at thousands of different startups began working on IoT projects independently, with no agreed-upon standard for how IoT devices should be developed, or even an agreed-upon definition of IoT.
As a result, the market has been bombarded with hundreds, if not thousands, of different devices, only some of which are compatible with one another. For developers, this has become a nightmare: getting your device to communicate with others is a struggle, and getting the entire IoT community to agree on one set of standards is practically impossible.
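To make that integration burden concrete, here is a minimal sketch (in TypeScript) of the adapter-per-vendor pattern hub developers commonly fall back on when no shared standard exists. Everything here is hypothetical and for illustration only; SmartDevice, VendorAThermostatAdapter, and VendorBLockAdapter are invented names, not real vendor APIs.

```typescript
// Hypothetical sketch: with no shared standard, a hub developer writes one
// bespoke adapter per vendor protocol just to expose a uniform interface.
// All names below are invented for illustration.

// The uniform surface the hub wants to program against.
interface SmartDevice {
  id: string;
  readState(): Promise<Record<string, unknown>>;
  sendCommand(name: string, args?: Record<string, unknown>): Promise<void>;
}

// Vendor A: a thermostat speaking a proprietary local-network protocol.
class VendorAThermostatAdapter implements SmartDevice {
  constructor(public id: string, private host: string) {}

  async readState(): Promise<Record<string, unknown>> {
    // Real code would open a socket to this.host and decode a vendor frame.
    return { temperatureC: 21.5, mode: "heat" };
  }

  async sendCommand(name: string, args?: Record<string, unknown>): Promise<void> {
    // Real code would translate the generic command into Vendor A's wire format.
    console.log(`VendorA(${this.host}): ${name}`, args ?? {});
  }
}

// Vendor B: a door lock reachable only through the vendor's cloud REST API.
class VendorBLockAdapter implements SmartDevice {
  constructor(public id: string, private apiToken: string) {}

  async readState(): Promise<Record<string, unknown>> {
    // Real code would send this.apiToken as a bearer token on an HTTPS call.
    console.log(`VendorB(cloud): GET /lock (token: ${this.apiToken.length} chars)`);
    return { locked: true, battery: 0.82 };
  }

  async sendCommand(name: string): Promise<void> {
    console.log(`VendorB(cloud): ${name}`);
  }
}

// The hub itself only ever sees SmartDevice, but every new vendor protocol
// still costs another hand-written adapter.
async function pollAll(devices: SmartDevice[]): Promise<void> {
  for (const device of devices) {
    console.log(device.id, await device.readState());
  }
}

pollAll([
  new VendorAThermostatAdapter("thermostat-1", "192.168.1.40"),
  new VendorBLockAdapter("front-door", "example-token"),
]).catch(console.error);
```

Multiply those adapters by hundreds of vendors and several competing radio and cloud protocols, and the maintenance cost of any “universal” hub becomes clear.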
There are some practical and cost limitations to IoT as well. Consumers want devices that are affordable but that also offer a significant upgrade over the features and services they currently enjoy. That’s hard to deliver when you have to pay for new connectivity, dedicated servers, extended-life batteries, cloud storage, and regular software updates. For many IoT companies, the pursuit just isn’t profitable, or there isn’t enough consumer interest to justify these costly upgrades.
We also need to think carefully about the average consumer’s privacy in a world dominated by IoT. One of the easiest ways for a big tech company to circumvent the high costs of consumer adoption is to make the consumer the product; instead of charging $300 for a smart speaker device, they can afford to charge $100 or less, if it means gathering data on those consumers for advertisers, or using the technology to encourage more purchases.
For now, consumers don’t seem especially worried about the data their IoT devices are gathering, but for developers, this is a legal and ethical concern. How much consumer data should you gather? How much money can you make from it? And where will consumers and lawmakers draw the line in the future? It’s a complicated battleground, and because there’s no solid answer, some IoT development has stalled indefinitely.
One of the most attractive and ambitious promises of early IoT predictions was the development of the “smart home”—a fully integrated house, with almost every appliance or device working together as a coherent, self-contained system. The idea was to have a central hub or operating system where a consumer could control everything from the temperature settings in the refrigerator to the volume on the TV.
But in practice, this has proved difficult to accomplish. Part of the difficulty stems from the lack of a dominant player: no big tech company has established itself as enough of a leader in the IoT field to offer both a universal hub and a fleet of in-brand appliances and devices that sync with it. Part of it is the sheer expense of buying or converting an entire smart home; most consumers aren’t willing to take the plunge, especially this early in the technology’s development.
That said, there have been some key points of success to note in the IoT world.
IoT adoption continues to be weaker than expected, but that doesn’t mean the technology is doomed. It just means our initial expectations were more optimistic than they should have been, and that the tech may not progress along the lines we predicted. There are tremendous opportunities for growth and development here, but we need to start aligning our expectations with reality if we want to be successful.
Source: readwrite