The idea of open source should by now be familiar to most folks in the industry. When an individual open sources a new technology, the situation seems well understood. But when a corporation opens the source code of one of its products, it seems to me that not everyone is fully aware of the peculiarities involved.
I can think of three common situations.
Something is developed for internal needs and then open sourced after some internal use, once it becomes obvious that others may benefit. I perceive Google and Facebook as favoring this approach. It is the closest a corporation can get to the motives that drive individuals to open source their work: sharing knowledge, earning street cred, recruiting, and so on. The key differentiator here is that the owner largely doesn't care what others do with the technology; its internal plans more or less do not depend on whether the open sourced project becomes popular.
Something is open sourced while still being developed, or at an early stage (say, right after the first shippable milestone), without the owner aspiring to build an ecosystem or get it adopted as a standard, at least initially. This is a business move: it's conceivable that a company's product needs to have its source code publicly available in order to help sales. How permissive the license is may or may not matter here, and neither does whether contributions are accepted or even encouraged.
Something is open sourced while being developed, and the goal from the very start is to build an ecosystem or for the technology to become a standard. In this case, the main driver for open sourcing was the ecosystem, or the aspiration to become a standard or some sort of fundamental building block (such as a "Linux kernel of the cloud"). Talk of licensing, code ownership, foundations, governance, commoditization, contributions, and so on is found almost exclusively in this approach.
In all three scenarios, open sourcing is great, and I applaud every company that does it, no matter what its motives are. But #3 is extremely difficult. In fact, of all the open source tools and technologies we commonly use today (Linux, Python, Ruby, any project in the Apache Incubator, and many thousands of others), I can't think of any that followed scenario #3.
For those who have not been following the industry lately: the Open Compute Project, initiated by Facebook, seems to follow approach #1. Facebook has a finished datacenter built to this spec (in Prineville, OR) and shared its findings after the datacenter was completed. Facebook seems to like the design and looks set to continue building to this spec, regardless of whether it is adopted by others.
OpenStack and Cloud Foundry (led by Rackspace and VMware, respectively) seem to follow approach #3. It's a great idea; all I am saying is that their job is more difficult, and they are truly blazing a trail. It hasn't been attempted before, or if it has, I can't think of an example or it didn't work out. If they make it in their respective areas, they will be the first.
This post on GigaOm could be an interesting read in this context.
Can OpenStack and Cloud Foundry both succeed, at the same time, at something that has not worked out up until now?