What would you do with a billion core-hours of compute power?

I was interested to read that Google is donating a billion core-hours of CPU time to science as part of their Visiting Faculty Program. Their description of the kinds of workloads that would be appropriate for their Exacycle program reads:

“The best projects will have a very high number of independent work units, a high CPU to I/O ratio, and no inter-process communication (commonly described as Embarrassingly or Pleasantly Parallel). The higher the CPU to I/O rate, the better the match with the system. Programs must be developed in C/C++ and compiled via Native Client.”
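To make "Embarrassingly Parallel" concrete, here's a rough C sketch of the kind of work unit Google is describing: nearly all CPU, almost no I/O, and no communication with other processes. The Monte Carlo pi estimate and the work-unit-id/seed scheme are placeholders of my own, not anything from Google's spec.

```c
/*
 * A minimal sketch of an "embarrassingly parallel" work unit: pure CPU
 * work, one line of output, and no inter-process communication. Each
 * instance is parameterized only by its work-unit id, so millions of
 * copies can run completely independently.
 */
#include <stdio.h>
#include <stdlib.h>

/* Tiny linear congruential generator so the example stays portable C. */
static unsigned int lcg_next(unsigned int *state) {
    *state = *state * 1664525u + 1013904223u;
    return *state;
}

int main(int argc, char **argv) {
    /* The work-unit id (passed on the command line) doubles as the RNG seed. */
    unsigned int work_unit_id = (argc > 1) ? (unsigned int)atoi(argv[1]) : 0;
    unsigned int seed = work_unit_id * 2654435761u + 1u;

    const long samples = 100000000L;   /* heavy CPU, essentially no I/O */
    long inside = 0;

    for (long i = 0; i < samples; i++) {
        double x = (double)lcg_next(&seed) / 4294967296.0;
        double y = (double)lcg_next(&seed) / 4294967296.0;
        if (x * x + y * y <= 1.0) {
            inside++;
        }
    }

    /* The only I/O: a single result line per work unit. */
    printf("work_unit=%u pi_estimate=%.8f\n",
           work_unit_id, 4.0 * (double)inside / (double)samples);
    return 0;
}
```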

This is precisely the type of HPC workload that should run well in a virtual environment, so I wonder whether Google plans to take advantage of the encapsulation and mobility offered by a virtualization layer or will simply allocate physical boxes to the effort. Virtualization would also let each research team deploy whatever stack it needs for its computations rather than being restricted to a specific operating environment.

I’m just sayin’…
