BARCELONA - The OpenStack Summit keynotes got underway the morning of October 25, with Mark Collier, Chief Operating Officer of the OpenStack Foundation, declaring that the world runs on OpenStack.
Collier's claims were not exactly bravado, as they were backed by a conga line of large operators all using OpenStack to power their cloud services. Collier emphasized that OpenStack is different from other software projects for a number of reasons.
The core design approach around OpenStack is driven by what Collier referred to as the four opens: open source, open community, open development and open design.
Today open-source development is relatively commonplace; what isn't common is complete openness through the development cycle, which is what differentiates OpenStack in Collier's view.
By having an open community where anyone can participate, Collier argued, OpenStack is better at solving problems and draws in a broader set of contributors.
When You Think OpenStack, Think Globally
Some of those contributors joined Collier on stage, including Banco Santander, which is a large Spanish banking conglomerate. Luis Enriquez, Head of Architecture at Santander Group, explained that his company started its cloud journey in 2014, with automation and open source being critical requirements.
Today Banco Santander runs OpenStack for Big Data workloads across data centers, including four in Spain, one in Mexico, one in the U.K. and one in Brazil. All told, Enriquez stated, Banco Santander has over 1,000 compute nodes running OpenStack.
Behind the OpenStack compute nodes sits 1.8 petabytes of data in Big Data storage. Enriquez said the different Big Data analytics use cases include risk management, customer experience and operational excellence.
Sky, which is one of Europe's leading content companies, is also an OpenStack user. Matt Smith, Infrastructure Design Manager at Sky, said his company decided it needed to pivot to a Software-Defined Data Center approach to be more agile.
Today the company has some 7,000 cores running OpenStack, enabling the company's set-top boxes as hosted applications. Smith noted that Sky also does a lot of video transcoding, which is a good workload for OpenStack.
Chinese vendor Huawei is also heavily invested in OpenStack. Anni Lai, VP of Huawei's IT Product Line, told the OpenStack audience that Huawei has worked with the Jiangxi province in China to enable a government cloud that provides services to 45 million citizens. Lai noted China is a big player in OpenStack overall, with 23 different Chinese companies contributing code to the recent OpenStack Newton release.
"When you write code or write a process for OpenStack, think globally," Lai encouraged the audience.
In Lai's view, developers and users of OpenStack shouldn't limit themselves by constantly comparing it to proprietary platforms.
"Our goal is to create something amazing and to deliver to customers' requirements instead of trying to play catch-up with proprietary vendors," Lai said.
OpenStack Use Surges in Scientific Computing Sector
Another rapidly growing area for OpenStack is in scientific computing. Among the early adopters of OpenStack was CERN, which started using OpenStack in 2013 to help analyze data coming from the Large Hadron Collider.
Tim Bell, who leads the OpenStack effort at CERN, said that today he has 190,000 compute cores running OpenStack and he's looking to add another 100,000 cores in the next six months. CERN's storage needs are also massive with 0.5 petabytes per day of data now being collected.
For Bell, it has become obvious that CERN will not be able to meet all of its compute needs on its own, which is why it's also looking to use the public cloud to help scale as needed.
That's where the use of containers comes into play and specifically the OpenStack Magnum container project. The basic promise of containers for Bell is that he can now take the same in-house tooling that he uses on-premises at CERN and use it on the public cloud.
While CERN's LHC today is the largest science project in the world, OpenStack is set to help enable the next big science effort, known as the Square Kilometre Array (SKA) radio telescope.
Rosie Bolton, SKA Science Data Processor Consortium Project Scientist, told the OpenStack audience that the SKA, when fully operational, will generate 1 petabyte of data per day and will require up to half an exaflops of computing power, and she expects that OpenStack will be well suited for the task.
Sean Michael Kerner is a senior editor at ServerWatch and InternetNews.com. Follow him on Twitter @TechJournalist