Saturday, January 12, 2019

To compute on the edge or on the cloud, that is the question.

It’s interesting. When I got my first computer in 1995, I was so excited to buy PC Magazine and get a CD with a bunch of video game demos, trial software, and so on. What all that software had in common back then is that it ran completely independently of a server. Plug, install, and play. That came with a limitation: storage and compute. Developers were constrained to PC machines with limited resources, and writing compute-heavy software was difficult because PCs couldn’t run it.

Slowly, software started to take a different shape as client-server architecture emerged. You would install the “client” piece of the software on your PC, and the rest of it lived on a server somewhere. This let developers offload compute and storage to the server, making the client code much “thinner”. It also allowed vendors to mass-produce affordable end-user client hardware (laptops, PCs, phones), since the compute required to run client software is minimal. This pattern is still dominant, especially with the emergence of the cloud ☁️.
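To make that concrete, here is a minimal sketch of what a “thin” client looks like: it just ships the input to a server and displays whatever comes back, with all the heavy lifting happening remotely. The endpoint and payload shape below are made up purely for illustration.

```typescript
// A thin client: all it does is send the input to a server and show the result.
// The URL and the job shape are hypothetical, just to illustrate the pattern.
async function renderReport(userId: string): Promise<void> {
  const response = await fetch("https://api.example.com/reports", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ userId }), // the heavy aggregation happens server-side
  });
  const report = await response.json();
  console.log("Report ready:", report); // the client only displays what came back
}

renderReport("user-123");
```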

These days, client hardware is so much more powerful that it is a waste not to utilize it. That is why developers started running compute jobs on the “edge” (another word for the client) to avoid the latency of sending a job to the cloud and waiting for the result. This also proves beneficial when the edge network is intermittent.
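A rough sketch of that edge-first idea: run the job locally when the device can handle it, and only reach out to the cloud for bigger work, falling back to the edge again if the network drops. The size threshold and cloud endpoint here are assumptions for the sake of the example, not anything standardized.

```typescript
// Edge-first strategy: compute locally when the job is small enough,
// fall back to the cloud otherwise. Threshold and URL are hypothetical.
const LOCAL_LIMIT = 1_000_000; // assume the device comfortably handles ~1M numbers

function sumLocally(values: number[]): number {
  return values.reduce((acc, v) => acc + v, 0);
}

async function sumInCloud(values: number[]): Promise<number> {
  const res = await fetch("https://compute.example.com/sum", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ values }),
  });
  const { sum } = await res.json();
  return sum;
}

async function sum(values: number[]): Promise<number> {
  if (values.length <= LOCAL_LIMIT) {
    return sumLocally(values); // no network round trip, no latency
  }
  try {
    return await sumInCloud(values);
  } catch {
    return sumLocally(values); // network is intermittent: the edge still works
  }
}
```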

Will we eventually move back to running everything on the client? Will sending jobs to the cloud become more expensive than executing them locally?

It's interesting.

