How edge computing has become a $250 billion ‘perfect complement’ to cloud computing

Hivecell is a startup focused on edge computing that helps companies avoid the cost of storing massive amounts of data in the cloud.

  • Edge computing places processing power closer to where data is being created in the physical world.
  • It has received increased investment thanks to connected devices like factory robots or autonomous vehicles.
  • It complements cloud computing by solving issues of latency, bandwidth, autonomy, and compliance.
  • This article is part of a series about cloud technology called At Cloud Speed.

As technologies like autonomous vehicles, factory robots, and remote monitoring systems become more commonplace, a concept called edge computing is receiving increased attention and investment.

Edge computing refers to a model in which processing power is placed closer to where data is created in the physical world: While cloud computing platforms like Amazon Web Services run workloads in massive data centers scattered across the world, edge computing builds the intelligence into the car, robot, or other system itself, or puts a processor in close physical proximity to it.

It’s a concept that’s only become more popular as a surge in connected devices – like Tesla’s semi-autonomous cars or camera-laden robots in Amazon’s warehouses – collides with the rise of cloud computing, presenting an opportunity for both.

“Edge computing is actually a counterbalance to the cloud,” Gartner analyst Bob Gill told Insider. “It’s a perfect complement to the cloud that solves for the weakness of the cloud.”

As the flexibility, efficiency, and pricing of cloud computing have led firms to abandon their in-house data centers, the shift has created a new set of technical challenges. While the cloud offers immense raw computing power, relying on it comes with trade-offs, too.

“People realized that not all the things that they want to do in the cloud worked well in the cloud,” IDC analyst Dave McCarthy told Insider.

Specifically, edge computing can help solve issues of latency (where systems need to be able to process data incredibly fast), bandwidth (where machines are generating vast amounts of data that would be inefficient to send to a distant data center), autonomy (where systems need to be able to function without a network connection), or compliance (like when information needs to remain within a specific country to comply with local regulations).
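
As a rough sketch of how those four criteria might be weighed in practice, the short Python example below decides whether a hypothetical workload should run at the edge or in the cloud. The function, fields, and thresholds are illustrative assumptions, not any vendor's actual product logic.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    """Hypothetical description of a data-processing job (illustrative fields only)."""
    max_latency_ms: float            # how quickly a result is needed
    data_volume_gb_per_day: float    # how much raw data the device generates
    needs_offline_operation: bool    # must keep working without a network link
    data_must_stay_in_country: bool  # data-residency / compliance requirement

def choose_placement(w: Workload,
                     cloud_round_trip_ms: float = 100.0,
                     uplink_budget_gb_per_day: float = 50.0) -> str:
    """Return 'edge' or 'cloud' based on the four concerns named in the article:
    latency, bandwidth, autonomy, and compliance. Thresholds are made up."""
    if w.max_latency_ms < cloud_round_trip_ms:                # latency
        return "edge"
    if w.data_volume_gb_per_day > uplink_budget_gb_per_day:   # bandwidth
        return "edge"
    if w.needs_offline_operation:                             # autonomy
        return "edge"
    if w.data_must_stay_in_country:                           # compliance
        return "edge"
    return "cloud"

# Example: a factory-robot vision loop that needs answers within ~20 ms
print(choose_placement(Workload(20, 500, True, False)))  # -> "edge"
```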

Gartner expects that by 2022 more than 50% of enterprise-generated data will be created and processed outside the traditional data center or cloud.

Why edge computing is on the rise

One of the big reasons for the move towards edge computing is the explosion of the so-called Internet of Things: connected devices that collect vast amounts of data for analysis, like smart factory equipment that can flag machines that may break down, or restaurant point-of-sale systems that can predict which ingredients will run out first.

“Edge computing and IoT devices go together like peanut butter and jelly,” Forrester’s Glenn O’Donnell said.

But the amount of data these devices collect can be so vast that sometimes it doesn’t make sense financially to store it all, or moving it to the cloud for analysis simply takes too long.

“You see this in connected cars, in smart factories, even in retail environments where a piece of data is generated, and you want to take action on that data, but you need to do it really fast,” said IDC’s McCarthy. “The whole round trip – like the time it takes to send it to the cloud, have the cloud make a decision, and then send the answer back so you can do something – is too long.”
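
To put rough, purely illustrative numbers on that round trip (real figures vary widely by network and workload), the cloud path adds two network legs on top of the computation itself, while the edge path does not:

```latex
t_{\text{cloud}} \approx t_{\text{uplink}} + t_{\text{compute}} + t_{\text{downlink}}
                 \approx 40\,\text{ms} + 20\,\text{ms} + 40\,\text{ms} = 100\,\text{ms},
\qquad
t_{\text{edge}} \approx t_{\text{local compute}} \approx 5\text{--}10\,\text{ms}
```

At highway speed, roughly 30 meters per second, a 100-millisecond round trip means a car travels about three meters before the answer comes back, which is why time-critical decisions are made on the vehicle itself.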

Startups and major public firms alike are focused on different sides of the problem, from services that automatically route workloads through the closest possible data centers to chips better suited to running machine-learning tasks on the device itself. Intel, for example, expects the silicon-related side of edge computing to be a $65 billion market by 2024, and analyst firm IDC predicts that the global market for edge computing-related products and services will reach $250 billion by 2024.

And while edge computing provides a counterbalance to cloud computing, the hyperscale cloud firms like Amazon Web Services, Microsoft Azure, and Google Cloud are all making big investments too, especially by partnering with telecom companies to extend their networks – with a major push towards helping the carriers adopt 5G, which would allow their respective platforms to reach devices far beyond the range of a traditional Wi-Fi connection.

“The cloud companies are making huge bets on edge technology, which almost seems like an oxymoron, but they recognize that there are a lot of scenarios that don’t work in the cloud,” said IDC’s McCarthy. “They realized that they had two choices: They could ignore that business or they could find a way of extending their cloud technology to these edge locations. And they’ve chosen to do the latter.”


Here’s how quantum computing could transform the future

Sundar Pichai (left) and Daniel Sank pose with one of Google’s quantum computers in the company’s Santa Barbara lab in California, October 2019.

  • Quantum computers can, for certain problems, process information millions of times faster than classical computers.
  • The quantum computing market is projected to reach $64.98 billion by 2030.
  • Companies like Microsoft, Google, and Intel are racing to build quantum computing tools.
  • This article is part of a series about cloud technology called At Cloud Speed.

Today, our phones are millions of times more powerful than the computers that landed Apollo 11 on the moon.

Technologists are now exploring quantum computers that could, in theory, be 100 million times faster than any classical computer and solve computational problems deemed impossible today. The appeal of quantum computers is the promise of quickly answering questions so difficult that today’s computers would need decades to solve them.

“The differences between quantum computers and classical computers are even more vast than those between classical computers and pen and paper,” Peter Chapman, CEO of quantum startup IonQ, told Insider. “Because quantum computers process information differently, they are expected to be able to address humanity’s greatest challenges.”

Companies and researchers are racing to conquer the massive opportunity, but what exactly is a quantum computer?

Regular computers use bits to store information, and each bit can hold only one of two states: zero or one. Quantum computers, however, exploit the ability of subatomic particles to exist in more than one state simultaneously, so a quantum bit can represent a zero, a one, or both at the same time.

Quantum bits, called “qubits,” can thus encode far more information than ordinary bits, and certain calculations on that information can run far faster than on a normal computer.
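
In the standard notation physicists use, a single qubit’s state is a weighted combination of the two classical values, where the complex amplitudes α and β set the probabilities of measuring a zero or a one:

```latex
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1
```

A register of n qubits is described by 2^n such amplitudes at once, which is the sense in which quantum machines can work with far more information than n classical bits; measuring the register, however, still yields only a single n-bit result.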

Quantum computers are not meant to replace typical computers. In practice, they will be separate instruments used to solve complex, data-heavy problems, particularly those that make use of machine learning, where the system can make predictions and improve over time. 

Big companies are investing in quantum tech 

Quantum computing has progressed from a research experiment to a tool on the brink of transforming a variety of industries, including medicine – where researchers are using quantum computers to explore faster DNA sequencing – and transportation – where they have been used to predict future traffic volumes.

Experts expect quantum computing to help us understand biology and evolution, cure cancer, and even take steps to reverse climate change. The quantum computing market is projected to reach $64.98 billion by 2030 from just $507.1 million in 2019.

A handful of big tech companies have been investing heavily in the space. Microsoft’s Azure cloud has released quantum tools, as have Google and Amazon’s respective cloud platforms.

Additionally, AT&T partnered with the California Institute of Technology to form the Alliance for Quantum Technologies (AQT) with the goal of bringing “industry, government, and academia together to speed quantum technology development and emerging practical applications.” Meanwhile, quantum-focused startups D-Wave and IonQ have raised $199.69 million and $192 million respectively, per PitchBook.

One of the major goals companies are currently striving for is so-called quantum supremacy, when a quantum computer performs a calculation that no classical computer can perform in a reasonable amount of time. In October 2019, Google claimed it reached quantum supremacy, though this claim was disputed. 

Some experts, like Intel’s director of quantum hardware, Jim Clarke, think that quantum supremacy is beside the point: The real goal should be “quantum practicality,” he told IEEE, referring to the point when quantum computers can actually do something life-changing and unique. Yes, quantum computers have begun completing basic tasks, but researchers are still inching towards the threshold at which quantum computers can do something truly game-changing.

Experts say there’s plenty of work ahead. 

“Quantum computing is easily five to 10 years out before it can actually deliver any sort of meaningful value,” Chirag Dekate, a VP analyst at Gartner, told Insider. That’s in part because there are still so many problems to be solved around the physics of quantum computing, like stabilizing the qubits in a system.

Still, Forrester principal analyst Brian Hopkins told Insider that because there will be an exponential curve of capabilities once quantum takes off, the time for investment is now.

“That’s why smart companies are investing today,” Hopkins said. “They know when we’re going to be able to do something useful, how it’s going to impact their industry, and when we might get to that kind of point in the curve where things really take off.”


The 5 things everyone should know about cloud AI, according to a Sequoia Capital partner

Many people encounter cloud AI through their smart speaker.

  • Konstantine Buhler, a partner at Sequoia Capital, believes “cloud is going to become AI.”
  • Buhler insists that AI is not “magic,” and that it should be demystified and measured.
  • Buhler says companies can bake AI into their processes “horizontally.” 
  • This article is part of a series about cloud technology called At Cloud Speed.

If you ask Sequoia Capital partner and early-stage investor Konstantine Buhler about the role of artificial intelligence in cloud computing, his answer is unequivocal: “Cloud is going to become AI,” he told Insider. “I mean, all of the cloud will be based on AI.”

Snowflake’s $3.4 billion initial public offering and Databricks’ $1 billion funding round over the past year suggest big things ahead for AI in the cloud, and the industry is estimated at $40 billion and climbing. Major platforms like Amazon’s AWS, Microsoft Azure, and Google Cloud – as well as a host of startups – sell cloud-based tools and services for data labeling, automation, natural language processing, image recognition, and more, making it more affordable than ever before for firms to dabble in AI.

Buhler, who has a master’s degree in artificial intelligence engineering from Stanford, revels in AI’s contributions, but also insists that the sector be demystified, and basic business fundamentals applied to it. 

His investments include CaptivateIQ, which automates business commissions, and Verkada, a security camera company that uses AI to recognize information like license plate numbers. Sequoia in general is an investor in some of the biggest names in AI, including Snowflake and Nvidia. 

“This next wave of enterprise and consumer technologies will all need AI built in,” Buhler said. “That’s going to be the standard going forward.”

AI’s ubiquity in the future is the first of a few basic lessons Buhler believes everyone should understand about AI’s impact over the next decade in the cloud. Here are the rest:

AI is not magic – it’s math

There is an (unwarranted) aura around artificial intelligence that ascribes to it supernatural brilliance.

“It seems complicated – it seems like magic of some sort, so people get intimidated and awed by it,” Buhler said. “Artificial intelligence is just more and more mathematical computations done rapidly, which at some point, for a moment, seems ‘magical.’ But it never is.”

Ordinary people should seek to understand it, because it affects their lives. If you talk to Apple’s Siri or Amazon’s Alexa, you are conversing with AI. If your cat hops aboard a Roomba vacuum, both of you can appreciate how it “learns” to avoid objects in its path. On the other hand, a red-light camera that zooms in to read your license plate when you run a red light and automatically fines you might not be such a welcome innovation.

AI should learn from the internet revolution

Buhler believes AI is at an inflection point similar to the one the internet reached 20 years ago. “Let’s learn a lesson from the dot-com boom,” when many overvalued companies imploded because they never materialized into real businesses, Buhler said. “Everybody had that mentality of, ‘let’s stick internet on this thing.’”

While cloud-based tools allow companies to spin up AI models with relative ease, not every problem needs to be solved with these kinds of algorithms. 

The business case must always be there – with the customer at the center – or AI will not be practical.

“When you build an artificial intelligence model, it is not about the AI: It is about the customer,” Buhler said. “The internet was a communication revolution, and AI is a computation revolution. This is a new mechanism to serve people, and you have to understand their needs, or you’re going to spend years building the wrong thing.”

Konstantine Buhler is a partner and early-stage investor at Sequoia Capital.

Every company has a ‘horizontal’ AI opportunity

Buhler believes every company can bake AI into its business using the same basic “horizontal stack,” or processes that take raw data and turn it into actionable intelligence that can be used in different ways across business units. Buhler says companies like Databricks, Dataiku, DataRobot, and Domino Data Lab (“they all start with D for some reason”) help enterprises do this.

Horizontal data processes can include data preparation (sorting text from image files, for instance), data labeling, data storage, creating algorithms that process the data, and, finally, applying the algorithms to specific business processes to help guide decision making. 

“It should be laid out that simply,” he says. That process “is all about enabling enterprises to bake artificial intelligence directly into their systems.”
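
As a minimal sketch of that horizontal flow, the Python below chains the stages Buhler describes: preparation, labeling, storage, model building, and application to a business decision. The function names and the trivial rule-based “model” are hypothetical stand-ins, not the APIs of Databricks, Dataiku, DataRobot, or Domino Data Lab.

```python
def prepare(raw_records):
    """Data preparation: keep only records that actually contain text."""
    return [r for r in raw_records if r.get("text")]

def label(records):
    """Data labeling: attach a label to each record (a trivial rule stands in
    for a human labeler or a labeling service)."""
    return [{**r, "label": "urgent" if "refund" in r["text"].lower() else "routine"}
            for r in records]

def store(records, warehouse):
    """Data storage: append labeled records to a shared store (a plain list
    stands in for a real warehouse or lakehouse)."""
    warehouse.extend(records)
    return warehouse

def train(warehouse):
    """Model building: a stand-in 'model' that simply counts labels."""
    counts = {}
    for r in warehouse:
        counts[r["label"]] = counts.get(r["label"], 0) + 1
    return counts

def apply_to_business(model):
    """Decision support: turn the model's output into an action for a business unit."""
    return "staff_up_support" if model.get("urgent", 0) > model.get("routine", 0) else "steady_state"

warehouse = []
raw = [{"text": "Where is my refund?"}, {"text": "Love the product"}, {"text": ""}]
model = train(store(label(prepare(raw)), warehouse))
print(model, apply_to_business(model))  # {'urgent': 1, 'routine': 1} steady_state
```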

AI startups can also focus on verticals 

Buhler says there are also AI startups that are providing products tailored to more specific business needs. Gong, for example, helps salespeople evaluate opportunities, while competitor Chorus turns sales conversations into data. In the financial world, the startup Vise automates investment management, while in the legal world, Ironclad helps attorneys build contracts faster. Gong, Vise, and Chorus are Sequoia portfolio companies. 

The key in picking great AI startups, Buhler says, is being able to measure how a company is helping its customers: “It has to be a real business with outputs that can be quantified.” 
