November 16, 2011
At Hadoop World, Facebook’s Jonathan Gray gave two talks: HBase Roadmap, and Building Realtime Big Data Services at Facebook with Hadoop and HBase. While I wasn’t able to attend the sessions, at the end of the conference I was able to catch up with the man himself.
Here’s what he had to say:
Some of the ground Jonathan covers
- How Jonathan got involved with HBase and how Facebook uses it
- (1:00) Where does HBase fit in the big data ecosystem
- (1:54) HBase vs MySQL
- (2:44) The HBase community and where the committers reside
- (5:35) What is Jonathan looking forward to in HBase (the “HBase DBA”)
Pau for now…
November 3, 2011
One of the interviews I did at Dell World was a video with IT in Canada. I did the video with Paul Cooper, Dell’s country manager for Canada.
In the first half of the video I talk about how Dell got into the cloud and where we play in the space. In the second half Paul talks about the role the telcos will play in the delivery of cloud services in Canada, as well as issues around privacy and data sovereignty.
Check it out.
From the article itself, here’s a great summary of our cloud participation and shows how we have built, bought and partnered along the way:
Dell’s excursion into cloud began with organic development of server and data centre capability in specialized systems to meet the needs of large cloud providers (Facebook, Microsoft Azure and Bing), progressed through modification of these systems for marketing to the “next 1,000”, and shifted to partnership with software makers such as Joyent to develop complete cloud solutions, and with companies such as VMware for the creation of a full service public cloud offering.
Supporting acquisitions along the way include companies with specific capabilities such as SecureWorks, which was purchased to address web security concerns that continue to dog broader cloud adoption, and BOOMI, a specialist in cloud integration, which enables Dell to better service customers who adopt a hybrid cloud approach to sourcing compute resources.
September 26, 2011
Dell has been working for the last four-plus years outfitting the biggest of the big web superstars like Facebook and Microsoft Azure with infrastructure. More recently we have been layering software such as Hadoop, OpenStack and Crowbar on top of that infrastructure. This has not gone unnoticed by web publication GigaOM:
Want to become the next Amazon Web Services or Facebook? Dell could have sold you the hardware all along, but now it has the software to make those servers and storage systems really hum.
They also made the following observation:
Because [Dell] doesn’t have a legacy [software] business to defend, it can blaze a completely new trail that has its trailhead where Oracle, IBM and HP leave off.
Letting customers focus on what matters most
It’s a pretty exciting time to be at Dell as we continue to move up the stack outfitting web players big and small. The idea is to get these players established and growing in an agile and elastic way so they can concentrate on serving customers rather than building out their underpinning software and systems.
Stay tuned for more!
Pau for now…
April 12, 2011
Last Thursday a group of us from Dell attended and participated in the unveiling of Facebook’s Open Compute project.
Much the way open source software shares the code behind the software, the Open Compute project has been created to provide the specifications behind the servers and the data center. By releasing these specs, Facebook is looking to promote the sharing of data center and server technology best practices across the industry.
The unassuming entrance to Facebook's Palo Alto headquarters.
The Facebook wall.
Facebook headquarters at 8am. (nice monitors! :)
Words of wisdom on the wall.
Founder and CEO Mark Zuckerberg kicks off the Open Compute event.
The panel moderated by Om Malik that closed the event. Left to right: Om, Graham Weston of Rackspace, Frank Frankovsky of Facebook, Michael Locatis of the DOE, Alan Leinwand of Zynga, Forrest Norrod of Dell (with the mic) and Jason Waxman of Intel.
Post-event show & tell: Drew Schulke of Dell's DCS team being interviewed for the nightly news and showing off a Dell DCS server that incorporates elements of Open Compute.
Extra credit reading
- GigaOM: Bringing Facebook’s Open Compute Project Down to Earth
- The Register: Facebook’s open hardware: Does it compute?
Pau for now…
April 7, 2011
This morning at Facebook’s headquarters in Palo Alto the company announced their Open Compute project. Partners and kindred spirits were there to tell the story behind Open Compute and explain what they think it means to the industry. One group of kindred spirits were the individuals from Rackspace. I got some time with Jim Curry, who heads up OpenStack at Rackspace, after the event officially ended.
Here is what Jim had to say:
Some of the topics Jim covers:
- Driving efficiencies in data center design requires looking at the issue holistically.
- Learning from Facebook’s successes and failures.
- Looking forward to collaboration in an area that hasn’t historically had a lot of collaboration.
- Engagement with Facebook engineers on how to run OpenStack on their hardware.
Pau for now…
April 7, 2011
Former Dell DCS dude Frank Frankovsky has been at Facebook for about 18 months. Frank is Facebook’s Director, Hardware Design and Supply Chain and since he arrived, he has been heavily involved in the Open Compute project. Today was the big day when Open Compute made its worldwide debut.
Frank represented Facebook on the panel discussion which was moderated by GigaOM’s Om Malik. After the panel I was able to grab a few minutes with Frank, between press interviews, and learn first hand about the project.
Some of the topics Frank covers:
- What he and his team do at Facebook
- Their brand new data center which is running Open Compute infrastructure
- Opening up the details and specs of their data center and the systems they are running
- The genesis of the Open Compute project
- What are the next steps for the Open Compute project
Pau for now…
April 7, 2011
This morning, at Facebook’s headquarters in Palo Alto, the company unveiled the Open Compute project. Also on hand to support the announcement were partners such as Dell and Intel, who served on a panel alongside representatives from Rackspace, the Department of Energy, Zynga and Facebook. Forrest Norrod, GM of Dell’s server platform division represented Dell on the panel.
I caught up with Forrest after the event to get his take on the Open Compute project and what it means for Dell.
Pau for now…
April 7, 2011
Today at its headquarters in Palo Alto, Facebook and a collection of partners such as Dell, Intel and AMD — as well as kindred spirits like the founder of Rackspace (the company behind OpenStack) and the CIO of the Department of Energy — are on hand to reveal the details behind Facebook’s first custom-built data center and to announce the Open Compute project.
Efficiency: saving energy and cost
The big message behind Facebook’s new data center, located in Prineville, Oregon, is one of efficiency and openness. The facility will use servers and technology that deliver a 38 percent gain in energy efficiency. To bring the knowledge that the company and its partners have gained in constructing this hyper-efficient hyper-scale data center to the industry, Facebook is announcing the Open Compute project.
Much the way open source software shares the code behind the software, the Open Compute project has been created to provide the specifications behind the hardware. As a result, Facebook will be publishing the specs for the technology used in their data center’s servers, power supplies, racks, battery backup systems and building design. By releasing these specs, Facebook is looking to promote the sharing of data center and server technology best practices across the industry.
How does Dell fit in?
Dell, which has a long relationship with Facebook, has been collaborating on the Open Compute project. Dell’s Data Center Solutions group has designed and built a data center solution using components from the Open Compute project, and the server portion of that solution will be on display today at Facebook’s event. Additionally, Forrest Norrod, Dell’s GM of server platforms, will be a member of the panel at the event, talking about the two companies’ common goal of designing the next generation of hyper-efficient data centers.
A bit of history
Dell first started working with Facebook back in 2008 when they had a “mere” 62 million active users. At that time, the three primary areas of focus with regard to the Facebook IT infrastructure were:
- Decreasing power usage
- Creating purpose-built servers to match Facebook’s tiered infrastructure needs
- Having tier 1 dedicated engineering resources to meet custom product and service needs
Over the last three-plus years, as Facebook has grown to over 500 million active users, Dell has specifically helped address these challenges by:
- Building custom solutions to meet Facebook’s evolving needs, from custom-designed servers for their web cache, to memcache systems to systems supporting their database tiers.
- Delivering these unique servers quickly and cost effectively via Dell’s global supply chain. Our motto is “arrive and live in five”, so within five hours of the racks of servers arriving at the dock doors, they’re live and helping to support Facebook’s 500 million users.
- Achieving the greatest performance with the highest possible efficiency. In one year, as a result of Dell’s turnkey rack integration and deployment services, we were able to save Facebook 84,000 pounds of corrugated cardboard and 39,000 pounds of polystyrene.
Congratulations Facebook! And thank you for focusing on both open sharing and on energy efficiency from the very beginning!
Pau for now…
January 17, 2011
Earlier this month an interview I did with Robert Duffner, Director of Product Management for Windows Azure, went live on the Windows Azure team blog. Robert asked me a variety of questions about cloud security, how I see the cloud evolving, the pitfalls of the cloud, where Dell plays, and more.
I was pleasantly surprised to see that my ramblings actually turned out coherent :) Here is a section from the interview (you can check out the whole piece here):
Cloud computing is a very exciting place to be right now, whether you’re a customer, an IT organization, or a vendor. As I mentioned before, we are in the very early days of this technology, and we’re going to see a lot happening going forward.
In much the same way that we really focused on distinctions between Internet, intranet, and extranet in the early days of those technologies, there is perhaps an artificial level of distinction between virtualization, private cloud, and public cloud. As we move forward, these differences are going to melt away, to a large extent.
That doesn’t mean that we’re not going to still have private cloud or public cloud, but we will think of them as less distinct from one another. It’s similar to the way that today, we keep certain things inside our firewalls on the Internet, but we don’t make a huge deal of it or regard those resources inside or outside as being all that distinct from each other.
I think that in general, as the principles of cloud grab hold, the whole concept of cloud computing as a separate and distinct entity is going to go away, and it will just become computing as we know it.
Pau for now…
October 19, 2010
Timothy Prickett Morgan of everyone’s favorite vulture-branded media site The Register attended a round table discussion we held a few weeks ago in New York. His piece from that event, which was focused around the cloud, was posted yesterday.
You should check out the whole article but here are some snippets to whet your appetite:
What DCS is all about
For the past several years – and some of them not particularly good ones – Dell’s Data Center Services (DCS) bespoke iron-making forge down in Round Rock, Texas, has been a particularly bright spot in the company’s enterprise business.
The unit has several hundred employees, who craft and build custom server kit for these picky Webby shops, where power and cooling issues actually matter more than raw performance. The high availability features necessary to keep applications running are in the software, so you can rip enterprise-class server features out of the boxes – they are like legs on a snake.
How we’re working with web-based gaming company OnLive
“These guys took a bet on Facebook early, and they benefited from that,” says Perlman [OnLive Founder and CEO]. “And now they are making a bet on us.”
OnLive allows gamers to play popular video games on their PCs remotely through a Web browser and soon on their TVs with a special (and cheap) HDMI and network adapter. The games are actually running back in OnLive’s data centers, and the secret sauce that Perlman has been working on to make console games work over the Internet and inside of a Web browser is what he called “error concealment”.
DCS had to create a custom server to integrate their video compression board into the machine, as well as pack in some high-end graphics cards to drive the games. Power and cooling are big issues. And no, you can’t see the servers. It’s a secret.
Pau for now…
June 11, 2010
Dell’s Data Center Solutions (DCS) group has both custom offerings and, as we announced a couple of months ago, a new line of systems and solutions targeted at a wider audience.
One of the key markets we are looking at for our new line is gaming. To get up to speed on the market I took a look at the report that the PC Gaming Alliance put together for its members. It was a very cool read. Here are a few things I learned:
Some fun facts to know and tell:
- Last year the global PC game software market was just over $13B while the global console software market was nearly $20B.
- The revenue from PC games is expected to pass the revenue from console software in 2012.
- Last year China was the leading country for PC game revenue, 99+% of which came from non-retail sources, e.g. subscriptions and digital distribution.
- Worldwide piracy is decreasing as PC games move from packaged software to a service-based business where users pay per usage.
- On a revenue basis, the majority of leading PC game companies come from China or South Korea.
- Biggest growth last year came from the free-to-play (F2P) games where delivery of these games on social networks like Zynga’s Farmville on Facebook took off.
Dell has publicly been a big player in the PC gaming market through our line of Alienware systems (in fact we had an announcement yesterday). Where we have been a lot quieter however is talking about how our Data Center Solutions (DCS) group fits in. Next week at E3 we will be making an announcement to explain just what we’ve been up to. So stay tuned next week and see how DCS “plays” in gaming :)
Pau for now…