I am not a formally trained economist. I am a software engineer, and economics (and the economics of IT in particular) is merely a hobby of mine.
I have now seen many mentions of Jevons paradox in the context of cloud computing. When I first learned about it, it sounded interesting and exciting, as does any proposition that on the surface defies common sense yet explains something fundamental in one’s area of professional interest. But as I read how other people apply Jevons paradox in their lines of reasoning, I realized that it’s not as clear-cut as one might imagine.
What is Jevons Paradox?
While you can read the entire paper here, Jevons paradox states, in short, that “improvements in fuel efficiency tend to increase, rather than decrease, fuel use” (both the link and the interpretation are from Wikipedia).
Let me first outline a hypothetical scenario in the cloud-computing domain where a phenomenon similar to the one described by Jevons could be observed.
Imagine you are developing a model that can predict future spot prices in Amazon EC2. To obtain results with a 50% confidence level, your model needs to run for 5 hours on 100 cloud instances (500 machine-hours in total).
Now imagine that, thanks to enhanced orchestration and coordination between the various parallel tasks of your model, you no longer need to run all 100 instances for the entire duration - you can stop and start instances on demand at the right times, so that your overall model now takes only 220 machine-hours. In other words, you have switched to a more efficient use of the resource (compute instances).
Looking at the significant savings, you start contemplating whether to improve the results by raising the desired confidence level, which will obviously increase the amount of computation and hence drive up the number of required machine-hours.
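The machine-hour accounting in this scenario is simple enough to sketch in a few lines. This is a toy illustration: all numbers are the made-up figures from the hypothetical above, and the function name is my own.

```python
def machine_hours(instances: int, hours_per_instance: float) -> float:
    """Total machine-hours consumed by a run."""
    return instances * hours_per_instance

# Baseline: 100 instances running for the full 5 hours.
baseline = machine_hours(instances=100, hours_per_instance=5)  # 500 machine-hours

# After smarter orchestration, the same result takes only 220 machine-hours
# (the illustrative figure from the scenario, not a computed one).
efficient = 220

savings = baseline - efficient
print(f"baseline={baseline}, efficient={efficient}, potential savings={savings}")
```

Those 280 machine-hours of potential savings are what the rest of the argument turns on.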
Predict the future vs Explain the past
The last paragraph above describes what’s known as the rebound effect. If you choose not to require a higher confidence level, your rebound effect is 0: you fully realize all the savings made possible by the increased efficiency of your resource use. This case is not Jevons paradox.
If you choose to require, say, a 60% confidence level and it leads to 300 machine-hours, you still realize some savings (300 machine-hours vs the original 500), just not as much as you would have had you stayed with the 50% level. This case is not Jevons paradox either.
Furthermore, if you choose to require an 80% confidence level, which happens to make your enhanced model require 500 machine-hours, your overall spend remains the same (500 vs 500). Also not Jevons paradox.
Only if you push the confidence level higher still - say to 90% - will your model end up taking more than 500 machine-hours, and only then do you find yourself in the situation described by Jevons.
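The four cases above can be condensed into a tiny classifier. The rebound ratio here measures what fraction of the potential savings was eaten by increased demand; the labels and threshold framing are my own shorthand for the text’s cases, not standard economics terminology.

```python
def rebound_ratio(original: float, efficient: float, actual: float) -> float:
    """Fraction of potential savings consumed by increased usage.
    0 = all savings kept; 1 = spend unchanged; >1 = Jevons territory."""
    potential_savings = original - efficient
    return (actual - efficient) / potential_savings

def classify(original: float, efficient: float, actual: float) -> str:
    r = rebound_ratio(original, efficient, actual)
    if r <= 0:
        return "no rebound: all savings realized"
    if r < 1:
        return "partial rebound: some savings realized"
    if r == 1:
        return "full rebound: spend unchanged"
    return "backfire: usage grew - the Jevons paradox case"

# The four confidence-level choices from the text (machine-hours):
for confidence, actual in [(50, 220), (60, 300), (80, 500), (90, 700)]:
    print(f"{confidence}% -> {classify(500, 220, actual)}")
```

Only the last case, where actual usage exceeds the original 500 machine-hours, corresponds to Jevons paradox; everything below that is an ordinary rebound effect.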
All in all, Jevons paradox is a special case of the rebound effect, and in the general case there is nothing that can tell you in advance how big the rebound effect is going to be. Hence, you can’t tell in advance whether a rebound effect will be big enough to turn into the Jevons effect. Therefore, predicting in the general case that cloud computing will not save you money because of Jevons paradox alone is logically inaccurate.
On the other hand, if you saw me going from 500 machine-hours at a 50% confidence level to 700 machine-hours at a 90% level, you could explain what happened to me as Jevons paradox. Post factum.
Macro vs Micro
Jevons formulated his statement at the macro level, and the rebound effect is also usually studied at the macro level. Macroeconomics focuses on the economy as a whole - or, in other words, on a collection of individual entities (firms and households), each of which is guided by its own self-interest, independent of that of the others.
What happens to a set of entities as a cumulative result of their independent actions can’t tell us how a given individual household or firm will behave - that is the micro level.
Automatically assuming that whatever happens at the macro level happens to every participant at the micro level is the fallacy of division.
Therefore, even if cloud computing does not lead to a reduction in overall IT spend, saying that a given company can’t lower its IT spend by moving to the cloud because of Jevons paradox alone is inaccurate.