
Serverless computing is the final frontier in scalable computing

Warning: this post is forward looking and does not give easy answers.

Cloud computing continues to enable innovation in the way we develop and deploy software.

Service Oriented Architecture (or SOA) is a software development paradigm for breaking up large systems into more manageable, independent components. It took off in the nineties; as a matter of fact, I worked on the architecture of a travel information system along those lines at the time.

Then we moved into multi-tier client server architectures, where a farm of web servers talks to a farm of application and database servers.

These days micro-services are all the rage, each implementing a specific API to enable event-driven, fully responsive software.

However, micro-services still need to be deployed on compute units. This is one of the reasons why containers such as Docker’s are becoming so popular: they are much quicker to scale than virtual machine instances.

Yet you are still deploying on discrete compute units. And while being able to scale up your resources is key to the benefits of cloud computing, scaling down below the level of a compute unit makes it possible to think in terms of lots of smaller micro-services. Maybe we should call them nano-services.

The notion of the server, even if it is a container, then disappears, and the minimum cost for running a service drops from a dollar a day or so to millicents.

AWS Lambda is a new platform that enables this serverless computing style. The basic unit of deployment is a piece of code that handles one API request and can run no longer than a few seconds. The AWS platform takes care of the replication and scaling. And billing, of course; you can trust Amazon to do that :-).
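To make the deployment model concrete, here is a minimal sketch of what such a unit of deployment looks like in Python. The event shape (a `"name"` field) and the local invocation at the bottom are assumptions for illustration; in production the platform, not your code, invokes the handler.

```python
# Minimal sketch of a Lambda-style handler: one function, one request.
# The "name" field in the event is a made-up example; the platform
# handles provisioning, replication, and scaling around this code.

import json

def handler(event, context):
    """Handle one API request; there is no server to manage."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Invoked locally here for illustration only.
print(handler({"name": "cloud"}, None))
```

Note that the function holds no state between invocations; that is what lets the platform scale it from zero to thousands of copies, and bill in millicents.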

The idea is not entirely new. Google App Engine did something similar five years ago, when I started measuring the cloud. At the time, however, it was too far ahead of the market and lacked certain features that would have made it easy to use and deploy. It has made big steps since, and Google App Engine certainly remains an option in this space.

What is the strategic implication? This serverless computing style revolutionizes application architecture. We can now really start thinking in terms of service orientation and grid style computation, without being distracted by our target server architecture. At the same time, this requires significant reengineering of even the most advanced DevOps continuous deployment toolchains. And finally, security and risk management need to make some ‘adjustments’ in their thinking, to say the least.

Cloud computing can make you more secure

The number one concern cited for avoiding cloud computing is security. And there is a reason for that. Cloud providers have demonstrated some spectacular failures in the past, including Amazon's near-total shutdown of an entire region, Dropbox's authentication snafu, and the countless cloud providers that have gone belly-up.

However, in the long run, cloud computing is destined to become more secure than in-house IT. I will briefly describe two dynamics in the industry that point in that direction, with substantiating evidence.

First, good cloud providers are getting better: they have more staff available to do security, and bigger economies of scale, allowing them to sustain more security processes. Here is a case in point. Security people are, by nature, pretty paranoid, but some are more paranoid than others. At a cloud security training I recently conducted, one of the attendees had created an Amazon Web Services account solely for the training. He terminated the entire account on the last afternoon. Just before the training was over, he showed me a message on his smartphone: within an hour after he terminated the account, his LinkedIn profile had been visited by somebody from the Amazon compliance department. Apparently his behavior was suspect. Either that, or they were playing a game of who can be the most paranoid.

Take a look at my Cloud Security training and get certified by the leading industry coalition and make cloud computing more secure!

As another example, does your IT department track rogue resource usage and credential leakage on a systematic basis? Some cloud providers do this for you, as this story of API credential leakage demonstrates.
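To illustrate what systematic credential-leakage tracking can look like, here is a hedged sketch of the simplest form of such a scan: matching the documented AWS access key ID format (the `AKIA` prefix followed by 16 uppercase alphanumerics) against text from public repositories. The sample line below uses Amazon's published example key, not a real credential.

```python
# Sketch of the kind of scan a provider (or your own IT department)
# could run to spot leaked credentials in public code. The pattern
# matches the documented AWS access key ID format; the sample text
# uses Amazon's well-known documentation example key.

import re

ACCESS_KEY_RE = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

def find_leaked_keys(text):
    """Return any strings that look like AWS access key IDs."""
    return ACCESS_KEY_RE.findall(text)

sample = 'aws_access_key_id = "AKIAIOSFODNN7EXAMPLE"  # from a public repo'
print(find_leaked_keys(sample))  # → ['AKIAIOSFODNN7EXAMPLE']
```

A real scan would of course cover secret keys, tokens, and many providers, but the principle is the same: the cloud provider can run this continuously, at scale, for all customers.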

Second, while the previous examples show that cloud providers can become better than the average IT department across the board, in specific areas specialized services are already way ahead of the competence and resources of the average IT department. This is nowadays called 'Security as a Service', or SecaaS (another of the cloud's acronymic neologisms), but the trend has roots that go back quite a while. Basically, the idea is that a lot of security functionality can be done better by taking advantage of essential cloud computing characteristics such as elastic scalability and resource pooling.

Examples of SecaaS that you may be familiar with, or are actually using, include email spam and malware filtering, blacklists and other reputation services, DDoS mitigation, and (performance) monitoring. We are also seeing companies use cloud services as a component of a disaster recovery strategy. Innovation in this field is strong.

So, in conclusion, the market is nearing a ‘tipping point’ where the cloud may actually be more secure than on-premise IT.

For more information, visit CCSK Cloud Security Training.

The cloud of things

Cloud computing also promises to be a great tool for the ‘Internet of Things’.

Here is an example of how that is applied right now.

The heating system in my house has a bit of a challenge: the thermostat does not seem to work well under certain circumstances. So I built a small sensor device, consisting of a few temperature sensors strapped to the incoming and the outgoing pipe of the main radiator. The sensors are read by an Electric Imp, which is basically a WiFi-enabled micro-controller.

The Electric Imp is programmed through a cloud service on which I developed a simple temperature logging application. That application logs its data through an API on Xively. Xively hosts the data, and provides some management and visualization features.
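To show the shape of such an API call, here is a hedged sketch of logging one sensor reading to a Xively-style REST service. The endpoint URL, the `X-ApiKey` header, and the JSON payload shape are assumptions for illustration (the request is built but deliberately not sent); consult the provider's API reference for the real interface.

```python
# Sketch of pushing one datapoint to a Xively-style data-hosting API.
# The URL (api.example.com), header name, and payload layout are
# assumptions for illustration; the request is constructed, not sent.

import json
import urllib.request

def build_update(feed_id, stream_id, value, api_key):
    """Build (but do not send) an HTTP request updating one datastream."""
    payload = json.dumps({
        "version": "1.0.0",
        "datastreams": [{"id": stream_id, "current_value": str(value)}],
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"https://api.example.com/v2/feeds/{feed_id}.json",
        data=payload,
        headers={"X-ApiKey": api_key, "Content-Type": "application/json"},
        method="PUT",
    )

req = build_update(123456789, "radiator_delta", 4.2, "MY_API_KEY")
print(req.get_method(), req.full_url)
```

On the Electric Imp itself this runs as Squirrel code through the Imp cloud, but the HTTP pattern is the same.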

(Figure: temperature sensor delta example)

Here is an example graph. It displays the difference between the two pipe temperatures, which is a measure of the amount of heat transferred into the room. Notice the high peaks in the morning, when we have set the thermostat to heat up the room, after which it settles into a more or less stationary pattern.

One of the features is notification through a Web API. I set up notifications for when the data stops coming in, or when we appear to be experiencing sub-zero temperatures; in either case, there is probably something wrong with the sensor hardware or software. Xively basically fires a webhook with the data. This webhook is caught by another service: Zapier.
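The two alert conditions just described can be sketched as a single check. The staleness threshold and the field names here are assumptions for illustration; the real triggers live in the Xively configuration, not in my own code.

```python
# Sketch of the two alert conditions described above: the feed going
# silent, or an implausible sub-zero reading. The 15-minute threshold
# and argument names are assumptions for illustration.

import time

STALE_AFTER_S = 15 * 60  # no data for 15 minutes -> hardware/WiFi problem

def needs_alert(last_timestamp, last_value, now=None):
    """Return a reason string if a notification should fire, else None."""
    now = time.time() if now is None else now
    if now - last_timestamp > STALE_AFTER_S:
        return "no recent data"
    if last_value < 0.0:
        return "sub-zero reading (sensor fault?)"
    return None

print(needs_alert(last_timestamp=0, last_value=21.5, now=3600))
```

When such a condition fires, the service throws the webhook that Zapier then turns into an email.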

Zapier is like a message queue programmable through an API. I let it send emails in response to events.

The final component is more customizable presentation. As a matter of fact, that can be done directly from a webpage with JavaScript, and I have the webpage hosted on GitHub. This page allows me to monitor the room temperature and other data from anywhere in the world on my smartphone. All in all, I spent a few hours and the equivalent of around 50 dollars on this. Google spent 3 billion on Nest.

Wrapping it up: I took a bunch of cloud services, hooked them together, and I have a live system. All the cloud services promise to scale very well. And maybe in a next post I will elaborate on the cloud security features that these services have, or should have.

And I can now show on which days the heating starts and stops way too early. Time for the repairman.


Take your backups to places

I finally got around to working on one of my New Year’s resolutions (I am not saying which New Year!): doing better backups for the computers under my administration.

I used to do my own backups, but it is a hassle. CDs need administration, hard disks need attending to. I must have spent hours keeping track of stuff, fixing problems, and what have you. I am not alone in this.

To name one problem: Outlook personal store (.pst) files get big, so a full backup takes ages, and if Outlook is still open, a straight copy fails.

So, if you have a broadband internet connection, remote backup is the way to go. I researched a number of services and finally settled on Mozy. It has all the features you want, and no others. Installation is just a 1.8-megabyte download, the first 2 gigabytes of storage are free, and after that it is $5 a month. I won't go into all the features: if Mozy does not have it, you probably do not need it for backup purposes. The client is a simple, user-friendly, small, and efficient application. The only downside: Windows XP only; Mac support is in the works.

The alternatives I looked at include one service with more features for sharing big files, and another that is also quite nice. Unix/Linux people have options of their own. For local backups, you may want to have a look at SyncBack.

So, do yourself (and me) a favor and go to mozy now, using this link.