The tablet of Doom, or Riptide

Benchmarking computer performance has always been a complex mix of science and art. Coming up with a representative workload that reflects both single and multiple users, tuning compilers fairly, and genuinely evaluating a mix of processor, memory, and I/O performance across architectures is tough.

The one slam on benchmarking has always been that the workloads are synthetic. Many a developer has selected a computer and OS based on a benchmark, only to find that their real-world application mix was nothing like the benchmark and performance was wildly different. In the PC and server space, SPEC does a pretty good job of coming up with a mix that stresses different subsystems, and for embedded processors there’s EEMBC with its suite of tests.

There was a day when, to test a PC, one fired up Flight Simulator, and eventually, as graphics effects became even more complex, Doom became the standard. By tweaking the effects knobs, one could get a reproducible feel for how good a machine was at certain settings, and where it would roll over like the Lusitania as the complexity was cranked up. A couple of FPS in multiplayer gaming is a big deal.

I’m writing this series over at SemiWiki.com on mobile processors, trying to answer whether what’s inside the tablet or phone really matters, at least when comparing new processors and roadmaps. We are finding no good absolute way to compare performance, and that is setting off some interesting debate.

Apple went on air claiming the graphics in the new iPad were 4x faster than anything based on an NVIDIA Tegra 3. NVIDIA shot back that 4 cores are a whole lot faster than 2, and that Apple’s graphics aren’t nearly that much faster. There’s a whole line of thinking that the benchmarks aren’t even exercising the multicore CPU and GPU features on either side. To make matters worse, evaluating iOS and Android on an even footing is difficult, and we may really be opening a can of worms when Windows Phone 8 and (possibly) BlackBerry 10 step into the mix.

The widely accepted visual benchmark is GLBenchmark, which can be run on both iOS and Android devices. There is a whole raft of CPU tests for Android, including AnTuTu, Quadrant, Rightware’s BasemarkOS, and EEMBC’s latest AndEBench. One of the few CPU benchmarks that actually runs on both iOS and Android is GeekBench.

But the debate is on. Does a benchmark that renders 720p frames off-screen really test what a tablet does? Is there a point to getting all 4 cores to turn on when 99% of the day is spent with only 1 or 2 cores running? These aren’t multiuser machines we’re testing. Users are doing things like syncing email and social updates in the background. When one plays a movie, one’s attention is fixed there, and past a certain frame rate and without network lag it all looks good, at least until you compare the Retina display of a new iPad or the AMOLED screen of a Samsung device to something else.
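For readers who haven’t dug into what these GPU tests report, here is a minimal, self-contained sketch in C of the idea behind an off-screen frame-rate benchmark: render a fixed number of frames with nothing going to the display and no vsync cap, then report frames per second. This is not GLBenchmark’s code; fill_720p_frame() is a hypothetical placeholder for the real OpenGL ES work such a tool would do.

    /* Sketch of an off-screen frame-rate test: no display, no vsync cap. */
    #define _POSIX_C_SOURCE 199309L
    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    enum { WIDTH = 1280, HEIGHT = 720, FRAMES = 200 };

    static uint32_t framebuffer[WIDTH * HEIGHT];   /* 720p off-screen buffer */

    /* Placeholder "render": touches every pixel of the 720p buffer.
       A real benchmark would issue OpenGL ES draw calls here instead. */
    static void fill_720p_frame(uint32_t seed)
    {
        for (uint32_t i = 0; i < WIDTH * HEIGHT; i++)
            framebuffer[i] = seed + i;
    }

    int main(void)
    {
        struct timespec t0, t1;

        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (uint32_t f = 0; f < FRAMES; f++)
            fill_720p_frame(f);                    /* nothing ever reaches the screen */
        clock_gettime(CLOCK_MONOTONIC, &t1);

        double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
        printf("%.1f frames per second (checksum %u)\n",
               FRAMES / secs, framebuffer[0]);
        return 0;
    }

The point of the sketch is the measurement structure, not the workload: real tools swap heavy GPU shading in for the fill loop, which is exactly why critics ask whether the resulting number reflects how a tablet is actually used.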

Tablets are also moving to become the new gaming platform. The Sony PlayStation Vita is very likely the last of its kind: a dedicated handheld game console. We haven’t yet seen games optimized for the current crop of tablets and their GPU capability.

While Angry Birds is the mobile acceptance benchmark, its pig explosions don’t stress a system. New games are starting to show up that do. Riptide GP has been tossed out there as one possible contender that runs on several platforms.

Riptide GP (image courtesy Vector Unit)

I’m still grappling with these questions:

What will be the Doom of the tablet age, the standard for evaluating a tablet that everyone will look to? Does Riptide or something like it test a tablet well?

Does tablet or superphone performance even matter, given new devices are all fairly fast now and network lag doesn’t hold things up?

Or is this all the realm of hardcore gamers looking for a slight edge?

Part of the appeal of the PC was that you, the semi-average user, could actually tune a system with a bit of aptitude: selecting a graphics card, picking memory and hard disks, overclocking and cooling. Tablets don’t offer the semi-average user much of that opportunity.

I’m looking for where the “wow” factor is going to be, and whether all this advanced and very expensive mobile SoC development effort is worth the trouble. History tells us performance needs to keep advancing, but at some point graphics performance moves past our acuity to tell the difference.

As my 11th grade chemistry teacher used to say: “Comments? Questions? Criticisms? Complaints?” I’d enjoy hearing your ideas.
