

When we shot this video of Intel Infoscape at CES 2010, we couldn't believe what we were seeing: a mere laptop driving two 7-foot screens, displaying 576 cubes hooked up to 20,000 info sources, including 20 live video feeds. Seems impossible. Touch one of the cubes, and an infobox displaying that content tumbles forward. Wow.
"Take THAT, NVIDIA," says Intel to its chiphead nemesis. The techno-tour de force is "powered by a single all new 2010 Core i7 processor w/ Intel Hyper-Threading technology and Intel HD graphics," crows Intel. Of course, graphics giant NVIDIA has processors that can do this kind of stuff, too, but it's never been displayed with such drama.
This was a mind-blowing demo of the Core i7 processor, handling boatloads of data and graphics at the same time. The graphics on the giant screens were tons of fun to move around, with uncanny quickness and smooth motion, and the whole thing felt super responsive. Giving us a peek into the future, it seemed a lot like the computer screen in the movie Minority Report. It was the most spectacular demo we saw at CES 2010.
By roshinobi at 8:04 PM ON 01/09/10
way to make me feel bad about going with amd
By LostMK at 8:59 PM ON 01/09/10
So... this is what they killed Project Offset and so many other potential projects for?... After years of resentment, I must admit - it was a fair trade.
By Zibri at 9:59 PM ON 01/09/10
Literally jaw dropping... I was watching this video with my mouth open and unresponsive (which is very difficult to achieve)... and for the first time I wish I was there.
By TH4T6UY at 1:10 AM ON 01/10/10
There's no way that'll be better than an ATI or nVidia card for gaming. Discrete cards will always be better for that.
By Xultima226 at 11:32 AM ON 01/10/10
@TH4T6UY
As of now it is better, much better. NVIDIA will release something that can stack up, I'm sure, but that processor is like no other processor before it, including GPUs.
By Mikeykrug at 12:06 PM ON 01/10/10
So, does anyone else see the potential behind this? Pair this with 3D glasses, a form of 4D tech to trick the senses, especially touch, and surround a room, or at least 3 walls, with these panels, and you would have a completely immersive atmosphere, similar to 1984 and possibly the holodeck from Star Trek.
By MorituriMax at 1:17 PM ON 01/10/10
Take THAT, Microsoft: this thing screams vertical Surface!
Wow.
By Paul at 3:05 PM ON 01/10/10
The Core iX chips, along with the 5500 and 3500 series Xeons, are really impressive. Once they have enough cores that are flexible enough to mimic stream processors and include the onboard HD GPUs, I think they'll be able to rival or even beat dedicated graphics cards.
The Core i3s are already going this route. The only issue will be that you can't swap out just the GPU or just the CPU; you'll have to do both as an integrated bundle. Not the best option for workstations, but it could be great for the 'average' user looking for an upgrade (bi)annually.
By Think realistically at 3:54 PM ON 01/10/10
OK, I saw this and it looked amazing, but then I realized it was on a 7-foot screen. I don't know how many people have 7-foot screens in their house, but I've never met anyone who does. I just wonder how tight and crammed this would look on my 42" TV at home.
By TH4T6UY at 4:48 PM ON 01/10/10
@Xultima226:
No...it's capable of being done on just about any GPU since...07ish. The thing is that modern nVidia and ATI GPUs are so massively parallel in their design that they just blow away anything a processor has to offer. Take the Fermi architecture, for example: it has 512 processing cores. An i7 has 8. Intel is just now playing catch-up to what nVidia and ATI have been doing for so long now. Discrete still rules the GPU arena in terms of performance and capability.
By RG at 6:22 AM ON 01/11/10
Intel tried to make a high-performance GPU (Larrabee) but was unable to achieve the performance of NVIDIA or ATI, so they had to abandon it.