Scrubbin' away day and night, low and nigh, far and high, to and fro.
Is my Roomba computing on the edge?

My Roomba definitely computes. With enough ignorance we might even say that it is intelligent, to an extent. Of course, it might not be the cleverest of the bunch, but it maps my floor, learns about changes and avoids obstacles with digital premonition. And the jargon of the day puts it right on the edge, which vaguely means, well, wherever is closest to you.

The edge. Where many marketers, service providers and product managers long to be, delivering the service right next to you. It is where they would put a human, if humans were disposable.

So, does my Roomba's set of actions fall into the realm of edge computation or edge intelligence?

What is edge computing anyway, and why do my Roomba, Game Boy or even dear Alexa not seem to qualify for the definition? Is it a buzzword meant to fool us, or shall we start digging our doomsday bunkers? We would like to share our perception of edge computation and intelligence.

As with many terms, computation on the edge requires context, so bear with me; we might talk about some culture, history and language. People coin names to differentiate, and markets care to popularize only the concepts that relieve pain points, so I would like to go into what the pains were and what they will be.

An example of connected consumer hardware in the era of cloud computing: the Amazon Dash. Now discontinued, it is still used by many for their own dire personal needs.

Good old days of IoT and the Cloud

And their love child, the Amazon Dash.

Well, let's not dash ahead of the company quite yet. IoT is still a thing-to-be, and the Cloud is now a daily reality. But we can still talk about how this pair was initially dreamt up and how it has been conceived since.

In the early days, companies would develop software for their clients' computers, and/or deliver hardware with well-defined functional and operational boundaries. Interoperability, upgradability, reproducibility and security were serious pains for the software of the time.

Also, the development effort for hardware was extremely high, and the service industry changes with ever-increasing momentum. The relatively rigid services delivered by such hardware could no longer justify their development effort. And so the winds of change blew us towards the clouds.

And came our savior

With the advent of Amazon's cloud (AWS), the pains subsided a bit. We can say that AWS leveraged a paradigm shift that Linux, Git, easy virtualization and virtual environments had prepared in advance.

With time, the services that are core to the business moved to the cloud, the interfaces shifted to web browsers, and hardware dumbed down to the point where its rigidity was justified by its relatively easy development.

Handling was left to the clouds, where systems were heavenly, upgrades were instant, services were not disrupted, development was easy and everything was scalable. Plus, it was safe to occasionally trip over things, since the cloud is not as rigid as a product recall.

The processors were cool and powerful. There seemed to be nothing that could challenge millions of Intel processors submerged in their coolants. Some pains were relieved somewhere, and it was good.

In the meantime, development on the cloud seemed so straightforward that the hardware essentially turned into connected sensors, and very simple ones at that.

As an example, Dash is a connected button and Alexa is a microphone/speaker.

Amazon AWS's current webpage doesn't even have a comprehensive list of its products anymore, and even the number of categories is exhausting. It has grown to a level of obfuscation not seen before. The number of products is inching towards 200 (depending on how you count), and the possible architectural configurations are countless.

The cloud became a panacea, a universal remedy. As a result, it has become as complicated as the things it had committed to replace.

Therefore, it is now legitimately an ecosystem, where many who were once demigods are now mere mortals. And whereas software development is still easier than it used to be, hardware development for this ecosystem has been reduced to creating devices that send the right information, so that it can be serviced right.

But the cloud is a long-distance relationship

And nothing good comes from that.

Interestingly, system development follows a cyclic pattern of centralization and decentralization whenever systems are posed a challenge. We tend to distribute systems, look at the resulting mess and recentralize them. Then we can't take the load anymore and find ways to redistribute parts of it.

And we don't just apply this on a case-by-case basis; we find a way to apply it everywhere. It might be due to our talent for finding easily applicable analogies.

This talent is why "sushirritos" are a thing, and why we thought we might also have baby clouds, well, simply put, wherever they are actually needed.

Not that every invention of ours is actually needed.

There are many forces of development that led to this, and the ground was ripe. Some honorable mentions go to hobbyist electronics like the $30 Raspberry Pi, the proliferation of machine learning, our unappeasable thirst for cybernetics and the speed of light.

Because lag is a buzzkill

Let's start with the reasons driven by the market, exemplified by a quintessential cloud service: voice assistants.

Some services and product uses have become so ingrained in our behaviors that our expectations are very high. The near one-second latency of interacting with a voice assistant is one of the things holding the application back. Even the speed of light challenges us here, but the delay is predominantly due to other things.

And it seems that regardless of how effectively we use the cloud, the response arrives prohibitively late for the application.
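To see why, here is a back-of-envelope latency budget. Every number in it is an illustrative assumption of mine, not a measurement, but the proportions tell the story: the physics is only a small slice of the total.

```python
# Back-of-envelope latency budget for a cloud voice assistant.
# Every number here is an illustrative assumption, not a measurement.

SPEED_OF_LIGHT_IN_FIBER_KM_S = 200_000  # light travels at roughly 2/3 c in fiber

def round_trip_ms(distance_km: float) -> float:
    """Physical lower bound for a request/response over optical fiber."""
    return 2 * distance_km / SPEED_OF_LIGHT_IN_FIBER_KM_S * 1000

budget_ms = {
    "speed of light (2,000 km each way)": round_trip_ms(2_000),  # ~20 ms
    "network hops and queueing (assumed)": 80,
    "cloud speech-to-text (assumed)": 200,
    "query handling + text-to-speech (assumed)": 300,
}

for stage, ms in budget_ms.items():
    print(f"{stage:45s}{ms:6.0f} ms")
print(f"{'total':45s}{sum(budget_ms.values()):6.0f} ms")
```

Under these assumptions the round trip at the speed of light costs about 20 ms out of roughly 600 ms; the rest is everything else piled on top of it.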

Users long for a voice assistant to be a flawless cybernetic extension of themselves, so today's clunky assistants feel like computer mice with a substantial delay. The big vendors are aware of this problem. Google, for example, has realized that some voice assistant queries are very prevalent, so the assistant is tuned to recognize and answer those queries on the phone rather than in Google Cloud. This eliminates the round trip and its latency, and provides a smooth interaction.
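A toy sketch of that local-first routing idea follows; the intents, their phrasings and the ask_the_cloud fallback are all hypothetical, not Google's actual implementation.

```python
# A toy sketch of local-first query routing: prevalent queries are answered
# on the device, everything else falls back to the cloud.
# The intents and the cloud stub are hypothetical illustrations.

LOCAL_INTENTS = {
    "what time is it": lambda: "It is 9:41.",
    "set a timer": lambda: "Timer set.",
    "turn on the lights": lambda: "Lights on.",
}

def ask_the_cloud(query: str) -> str:
    # Stand-in for the long-distance network call.
    return f"(cloud answer for: {query})"

def answer(query: str) -> str:
    handler = LOCAL_INTENTS.get(query.lower().strip())
    if handler is not None:
        return handler()  # no round trip at all
    return ask_the_cloud(query)  # still pays the long-distance tax

print(answer("What time is it"))        # answered locally, instantly
print(answer("Will it rain tomorrow"))  # goes to the cloud
```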

Alexa keeps you waiting after every single sentence because its mind is literally someplace else.

Also, robots want to get personal

Machine learning inference, and occasionally training, is also planned to move to the edge.

Google has several hardware endeavors, like Coral and the Edge TPU, so that machine learning can not only be initiated, called or queried from anywhere, but can also simply be anywhere, in the full sense of being present somewhere.

These devices will not only know about you, but will also be aware of the immediate spatio-temporal context of the service they are built to give. In my opinion, these devices showcase the department where Google thinks it is lacking as a cloud provider. And it is investing in it.

Intel, Nvidia and many others, including exciting ASIC startups, are pushing towards cheaper, faster and more power-efficient machine learning chips in a land grab for this machine learning processor market.
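To make that concrete, here is a minimal sketch of on-device inference with TensorFlow Lite's standalone interpreter, the runtime that Coral's Edge TPU builds on. The model file is a placeholder of my own, not a shipped artifact.

```python
# Minimal on-device inference sketch with the standalone TensorFlow Lite
# interpreter (the runtime Coral's Edge TPU builds on).
# "model.tflite" is a placeholder; on a Coral board you would also pass
# experimental_delegates=[load_delegate("libedgetpu.so.1")] to use the TPU.
import numpy as np
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# A dummy frame standing in for a local camera capture.
frame = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])

interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()  # runs entirely on the device: no network round trip
scores = interpreter.get_tensor(output_details[0]["index"])
print("top class:", int(np.argmax(scores)))
```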

And we need to ration our bandwidth

It seems that every time, we were able to find a way to circumvent the next crisis looming over telecommunications. But this time we are choosing to adapt in an unconventional way.

Our demand forecasts for wireless communication are so high that the telecom operators have made plans to process as much as they can, as close as they can get to where the data is needed.

Quite simply, they don't want the extra traffic anymore. Squeezing bandwidth out of thin air has become so prohibitively expensive that carrying only the worthwhile data is a serious trend on the road to 5G networks.

Thus, from the viewpoint of network operators, the edge seems to be the base station, maybe even your wireless router.
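As a toy illustration of carrying only the worthwhile data, a device might summarize its sensor stream locally and upload nothing but the anomalies. The threshold and the readings below are made up for the example.

```python
# A toy sketch of "carrying only the worthwhile data": the device keeps
# routine readings to itself and ships only the anomalies upstream.
# The threshold and the sample readings are illustrative assumptions.
from statistics import mean

THRESHOLD = 3.0  # assumed: deviation from the mean that counts as "worthwhile"

def filter_at_the_edge(readings: list[float]) -> list[float]:
    baseline = mean(readings)
    return [r for r in readings if abs(r - baseline) > THRESHOLD]

readings = [20.1, 20.3, 19.9, 31.5, 20.2, 20.0, 8.4]
worthwhile = filter_at_the_edge(readings)
print(f"kept {len(worthwhile)} of {len(readings)} readings:", worthwhile)
# Only these two outliers would cross the operator's network.
```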

Also, the silicon is cheap and geeks love it

Yes, the processing demands of our businesses are growing. But embedded processors (mostly with multiple ARM cores) are now exhibiting performance that is starting to tackle such workloads.

After all, embedded processors are the ones powering almost all modern-day cell phones and, evidently, many future laptops.

I think the ubiquitous $30 Raspberry Pi fired the imagination of many by presenting an uncontested simplicity for personal projects.

Even the Google AIY Vision Kit, an object recognition development kit by Google, relies on a Raspberry Pi Zero and its camera. For many developers who need a single environment and a single server, Ubuntu on a Pi with Python and Git is all that is needed.
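In that spirit, here is a sketch of the kind of single-file endpoint such a Pi might serve, using only Python's standard library; the port and the fake sensor reading are placeholders of mine.

```python
# A minimal sketch of the "single environment, single server" setup: one
# Python file on a Pi serving a reading over HTTP. The port and the fake
# sensor value are placeholder assumptions.
import json
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

def read_sensor() -> float:
    return 20.0 + random.random()  # stand-in for real sensor I/O

class EdgeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"temperature_c": read_sensor()}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8000), EdgeHandler).serve_forever()
```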

I think the popularity of this setup is magnetically pulling services to the edge.

People had already realized they could do so many things so easily and cheaply on the cloud, and that realization drove the rise of the cloud's popularity.

Now a portion of the community is realizing the bells and whistles of creating independent, intelligent endpoints. And they are driving the edge's growth.

So the growth is due not only to the market, but also to the community.

Even the Google AIY Vision Kit includes a Raspberry Pi Zero in it.

TL;DR

In short, once upon a time, problems were solved with products, and people simply bought them from companies.

Companies would bear the many burdens of being separated from the product.

With time, people realized that they would rather not pay for products, but subscribe to services that solve their problems.

It was also reasonable to position the service in such a way that the burdens I mentioned were eliminated: in the cloud. After this became the norm, the complicated systems we created turned into complex ones, privacy started to become an issue and physical proximity to the problem gained importance.

Because computation became cheaper and more power-efficient, it became possible to bud off from the cloud, with all of its advantages and perks, and land where the service is.

So far this is the best of both worlds. Until we start complaining again.

When our world of centralized systems is challenged, it gives way to distributed systems. This change does not occur in unison; it happens eventually, since we are constantly challenging ourselves.

Because of this change, the jumbo jet Airbus A380 is being retired, modern armies had to relearn combat when faced with terrorism, and we are trying to build intelligent machines.

This is our way of dealing with complexity. We divide and outsource it.