
30-09-2013

OpenNebula Conference


And then it’s over: three days at the first OpenNebula Conference! I took part in 18 talks, got my hands into the OpenNebula core, met a lot of interesting people, enjoyed a really nice German dinner in the capital and filled my brain with tons of information and inspiration.

What’s OpenNebula?
OpenNebula is an open-source cloud computing solution that lets you pilot your virtual infrastructure independently of the hypervisor you’re running, from VMware to KVM to Xen.

When I registered for the conference, I was a bit nervous because it was the first edition: “Who will the speakers be? Will they be good? Will I learn interesting things? Will it benefit me and EURA NOVA?” To all these questions, I have one single answer: YES!

Let’s start this story at the beginning:
On Tuesday, after arriving in Berlin, I joined the hands-on session. A room full of laptops, strangers saying “Hallo!” to each other and introducing themselves. We finally started, guided by the OpenNebula developers themselves. Objective: set up an OpenNebula cloud on our laptops in less than three hours. We managed to start the front end, create a worker node, use the CLI and the “Sunstone” user interface, start two VMs and “live migrate” them, not forgetting to configure bridges and virtual networks…
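To give you a flavour, here is a minimal sketch of the kind of commands we ran that afternoon. Host names, template names and IDs are made up for illustration, and exact flags vary between OpenNebula releases, so treat this as a rough outline rather than a copy-paste recipe:

```shell
# Register a KVM worker node with the front end
onehost create worker01 --im kvm --vm kvm

# Create a virtual network from a template file, then launch two VMs
onevnet create priv.net
onetemplate instantiate ttylinux
onetemplate instantiate ttylinux

# Check VM states, then live-migrate VM 0 to host 1
onevm list
onevm migrate --live 0 1
```

The same operations are also available point-and-click in Sunstone, which is what several of us fell back on when the bridges misbehaved.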

On Wednesday the conference started for real:


We met Dr. Ignacio M. Llorente (@imllorente), one of the founders of C12G Labs, the company backing OpenNebula in the industrial world. He told us that 1,400 people were registered on the OpenNebula mailing list. What a big community!

Following Ignacio’s presentation, Daniel Concepción – IT manager at Produban – gave his feedback on the bumpy road he and his team took to go from a legacy banking infrastructure, full of mainframes and big UNIX systems, toward “The Cloud”. They tried public clouds, but banking regulators told them it wasn’t allowed, so they built their own private cloud to support 14,400 branches around the world. Along the way they tried commercial solutions, but rapidly ran into vendor lock-in and a lack of flexibility. As Daniel put it: “It was easy to find cloud providers; it was harder to find good cloud providers!” So Produban chose the open-source world and OpenNebula. “Open source isn’t about cheaper or free, it’s all about knowledge, freedom, community and collaboration,” Daniel said.

But what triggers push an established company toward the cloud? The rising efficiency of virtualization, and the high number of manual interventions in their processes, which they expected the cloud to automate almost entirely – and they succeeded. For the implementation they chose OpenNebula on OpenCompute hardware, bringing their PUE (Power Usage Effectiveness) down to 1.07 (versus the 1.7 of legacy industry standards), with full enterprise support from C12G Labs and Viapps, so they’re backed up, stable and happy. What a great success story! The cost of the whole operation? Around €1M, VAT included.
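To put those PUE numbers in perspective: PUE is total facility power divided by IT equipment power, so going from 1.7 to 1.07 slashes the cooling and power-distribution overhead. A quick back-of-the-envelope calculation (the 100 kW IT load is a made-up figure, purely for illustration):

```python
# PUE = total facility power / IT equipment power.
it_load_kw = 100.0  # hypothetical IT load, for illustration only

legacy_total_kw = 1.7 * it_load_kw        # legacy data centre: 70 kW of overhead
opencompute_total_kw = 1.07 * it_load_kw  # OpenCompute setup: only 7 kW of overhead

overhead_saved_kw = legacy_total_kw - opencompute_total_kw
print(overhead_saved_kw)  # roughly 63 kW less overhead per 100 kW of IT load
```

In other words, for every 100 kW of servers, the legacy facility burns ten times more power on overhead than Produban’s OpenCompute one.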

After Daniel’s talk we heard about the experience of Akamai’s Thomas Higdon. Akamai runs 10⁵ production clusters (10² in QA and test) in a private cloud with OpenNebula, supporting 30% of the Web’s traffic every day.

We also heard about Fermilab’s adventure from Dr. Steven Timm. Fermilab is a high-performance computing research centre that holds 60 petabytes of data (yes, 60 petabytes: 60,000 terabytes, 60,000,000 gigabytes and so on…). Its scientists need high availability, fault tolerance, high-speed computing and the other requirements familiar to anyone working in a critical environment. They achieved this with a private cloud split across two physical sites, and are aiming for 4,500 VMs, as at CERN. Of course they didn’t forget about security, and worked hard to protect the credentials in the thousands of VMs they run. Amazing!

The success stories kept coming. Cloudweaver’s Dr. Carlo Daffara (@cdaffara) gave an inspiring, Steve Jobs-style pitch of their cloud-on-a-USB-stick for SMEs: plug a USB stick into each of your physical nodes and the whole stack is deployed, ready to go, including OpenNebula, hypervisors, monitoring and backup. Carlo also explained why he prefers the private cloud over, say, Amazon’s public cloud: four instances running 100% of the time would cost €3,500/month, three times the cost of the same infrastructure hosted on premises.

A great Canadian, Simon Boulet from CloudNorth, explained to us how he monitors 4,000 VMs with a sub-minute polling interval, resulting in more than 4,400 events per minute, using OpenNebula, Ganglia and a simple MySQL database…
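The arithmetic checks out: any sub-minute interval across 4,000 VMs yields more than 4,000 events per minute, and a modestly aggressive one clears the 4,400 mark. The 54-second interval below is my own guess, not a figure from Simon’s talk:

```python
vms = 4000
poll_interval_s = 54  # hypothetical sub-minute polling interval

# Each VM emits one monitoring event per polling cycle.
events_per_minute = vms * 60 / poll_interval_s
print(events_per_minute)  # just over 4,400 events/minute, in line with the figure quoted
```

At that rate, a single well-indexed MySQL table really is enough: ~74 inserts per second is well within its comfort zone.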

I can hear your brains thinking: “Cloud setups aren’t secure enough”… You’re wrong! The usual best practices still apply, as Nils Magnus from Inovex (@nilsmagnus) demonstrated, from SSH key hardening to physical segregation of networks.

Netways’ experience with big customers like Audi showed that you can host the infrastructure of big players in a cloud and still give them the best service level.

Thursday was a great day as well. The first talk of the day was by Dr. Jordi Farrès of the European Space Agency! He showed us how ESA’s biggest programme handles heavy computation and processing of Earth Observation data, in bundles of dozens of terabytes…

It was much the same for SURFsara, the Netherlands’ high-performance computing centre, which hosts around 7,000 CPU cores. Numbers like these leave me dreamy. They managed to build a user wizard that lets non-technical people order VMs fitting their needs by themselves, which, according to Ander Astudillo, reduced administrative overhead and time to market while increasing productivity and customer satisfaction.

And last but not least, Joel Merrick from the BBC R&D department showed us how OpenNebula boosted their capacity to capture and stream tons of content to end users. Need I remind you that BBC R&D started in 1922 in Great Britain and has been part of major technological evolutions? (Like something called “microphones”…)

Of course, this is only a short excerpt of an amazing event. The full set of presentations will be available soon, so keep coming back; it will be posted here as soon as it’s out of the oven.

To conclude, I must confess that I loved this conference. I found great value in the transition to the cloud and in the power of community, and I’m more confident than ever that every company, whatever its size, can go to the next level, increase its profits and productivity, and cut its time to market by optimizing its infrastructure, leaning out its processes and reducing its IT costs.

I’d like to give a big “thank you” to OpenNebula, Netways and all the speakers who made the conference a real success and gave me a lot of ideas to enhance our OpenNebula experience at EURA NOVA.

Latest News: Discover the presentations & slides of the OpenNebula Conference.

And you, are you ready for the cloud? Would you go for open source solutions? Why?

 

Still in the clouds, Cyrille
Twitter: @CydsWorld


 

2 responses to “OpenNebula Conference”

  1. Steven Timm says:

    Cyrille–just one correction on the comments above. The FermiCloud project has not yet been able to run 4500 simultaneous VM’s but that is mainly due to lack of hardware and network address space. In my talk I made mention of the CMS experiment at CERN/LHC (not affiliated with the FermiCloud project) which has run at approximately that scale in their High Level Trigger farm with GlideinWMS and OpenStack. Wanted to be sure there was no confusion on this. We will let you know when our project also succeeds to run at that scale.

    Steven Timm
    Fermilab, FermiCloud Project

  2. […] with very interesting OpenNebula use cases. And it is not only us saying so! Check these GigaOM and euronova […]
