Yahoo! cuddles Google's bastard grid-child

Gets 'butt kicking' from Microsoft


Stuffed Elephant Summit Sometime after the New Year, Yahoo! flipped the switch on what it calls the world’s largest Hadoop application, using the much-hyped open-source grid computing platform to tackle a task no smaller than the web itself.

Known as Yahoo! Search Webmap, this Hadoopified mega-app provides the world’s second most popular search engine with a database of all known web pages – complete with all the metadata needed to understand those pages. Yes, Yahoo! has crunched such data for years now, but thanks to Hadoop – an Apache project that mimics the GFS and MapReduce grid technologies developed at Google – Webmap can deliver the goods significantly faster than the company’s old-school setup.

"When building our web index, one of the things we do is build a graph of all the links on the web. We start with all the web pages we know of. We extract links and other metadata. And then we aggregate up a big system-wide view of the Web," Yahoo! Grid Computing Pooh-Bah Eric Baldeschwieler told The Reg at the Yahoo!-sponsored Hadoop developer’s summit in Santa Clara, California. "With Webmap, we can do this 33 per cent faster on the same hardware."

According to Baldeschwieler, that speed-up far exceeded expectations. "The previous system – which we built in 2000 – was all C++. And with the new system we moved to Java," he explained. "The belief was that moving to Java would slow everything down and that we would pay a penalty in moving to the new framework.

"When it’s running perfectly, the old system does outperform the new one. But of course hardware fails and there are all sorts of scenarios under which the old system doesn’t perform perfectly. Hadoop gives us much more flexibility. It’s built around the idea of running commodity hardware that runs all the time."

Hadoop is the bastard brainchild of Google and a man named Doug Cutting. Back in 2004, while developing Nutch, an open source search engine, Cutting realized that his engine wouldn't purr unless it was juiced with some sort of distributed computing platform. And for reasons unknown, Google had just published a pair of research papers that detailed GFS, its distributed file system, and MapReduce, a means of pooling processing power.

So Cutting and his open-source pals went to work on a project that would duplicate Google’s technologies – and maybe even (cough) improve them. He dubbed the project Hadoop after his son’s yellow stuffed elephant.

By early 2006, Yahoo! was flirting with the project, and the company soon gave Cutting a job. At the time, Hadoop and Nutch ran on just 20 nodes, indexing about 100m web pages. Two years later, Hadoop and Yahoo! Search Webmap run on 10,000 processor cores, indexing, um, many more web pages. "I can’t say exactly how many," Cutting told the Stuffed Elephant Summit. "Let’s just say it’s far in excess of 100 million."

But Yahoo! isn’t the only one that’s fallen for Hadoop. IBM Research turned up at the summit to show off JAQL, a query language suited to building JSON (JavaScript Object Notation) apps atop Hadoop. Amazon, a summit co-sponsor, discussed the benefits of running Hadoop atop its EC2 compute service. And more than 350 developers turned up to listen – though Yahoo! had originally expected fewer than 100.
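JAQL's pitch, for what it's worth, is brevity: a query language means not hand-writing the JSON plumbing yourself. By way of illustration – this is not IBM's code, and the record fields are invented – here's roughly what that plumbing looks like as a bare Hadoop mapper in Java, using the Jackson parser to pick failed requests out of a pile of JSON log records:

    // Hypothetical filter: emit the URL of every request that returned
    // a 5xx status. A JAQL query would express this in a line or two.
    import java.io.IOException;

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class ErrorFilterMapper
            extends Mapper<LongWritable, Text, Text, NullWritable> {
        private final ObjectMapper json = new ObjectMapper();

        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            // One JSON record per input line, e.g.
            // {"url": "http://example.com/", "status": 503}
            JsonNode rec = json.readTree(value.toString());
            JsonNode status = rec.get("status");
            JsonNode url = rec.get("url");
            if (status != null && url != null && status.asInt() >= 500) {
                ctx.write(new Text(url.asText()), NullWritable.get());
            }
        }
    }

The point of JAQL – like Pig – is that nobody should have to write that class just to filter a log file.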

Baldeschwieler also pointed out that 28 separate developers have trumpeted their Hadoop clusters on the official Hadoop wiki. "And that’s just a small fraction of the people using it," he said. "I would guess hundreds of organizations have adopted the platform. There’s definitely a lot of interest – and a lot of discussion."

Microsoft is not one of those organizations. Redmond’s research arm is building its own grid-computing platform, Dryad. And Dryad is not open source. But that didn’t stop the company from attending a conference dedicated to all things Hadoop.

Microsoft’s Michael Isard used his half hour to trumpet DryadLINQ, a LINQ-based programming layer that mirrors IBM’s JAQL and a Yahoo!-led open source initiative called Pig. Except that it doesn’t run on Hadoop. It runs on Dryad.

At least one developer was mighty impressed with the presentation. But he still wondered whether DryadLINQ was already irrelevant. "I think you’re kicking everyone’s butt. You’re already working on a higher level of abstraction than anyone else," he told Isard. "But since yours is proprietary technology, we’ll have to wait and see how effective you’ll be."

Of course, Google’s grid computing technologies are also proprietary. But that’s a different matter. You can debate the merits of Hadoop and Dryad all you want – but they’re both playing catch-up. ®
