Larrabee @ SIGGRAPH

Intel has made their paper available ahead of time. Pretty interesting read, I found. It's a shame that the final chip hasn't been produced yet, so we don't know how many Larrabee units there'll be in the consumer models. Hopefully it'll be enough to be competitive with the chips that AMD/NV put out in 2010. I mean, it's been a while since I've run it, but I think my 8800GTS was able to render Gears PC at 1280x1024 locked at 60fps, which is equivalent to about a 24-Larrabee-unit chip. To be able to match that in 2010, won't Intel have to have over 60 units on the high end?
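
Rough math behind that guess, as a C++ back-of-envelope (the 24-unit figure is from the paper's scaling results; the 2.5x speedup for 2010 high-end parts is purely my assumption):

    // Back-of-envelope estimate, not a benchmark: assumes performance scales
    // roughly linearly with unit count, as the paper's scaling graphs suggest.
    #include <cstdio>

    int main() {
        const double unitsForGearsAt60fps = 24.0; // ~24 Larrabee units for Gears @ 1280x1024, 60fps
        const double gpuSpeedupBy2010     = 2.5;  // assumption: 2010 AMD/NV high end ~2.5x an 8800GTS
        std::printf("Estimated units needed in 2010: %.0f\n",
                    unitsForGearsAt60fps * gpuSpeedupBy2010);
        return 0;
    }

That's how you land right around the 60-unit mark.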
Yeah it's interesting stuff alright. My questions are...

-Is anybody going to bother writing a custom software rasterizer for Larrabee? Obviously the potential is there, but I don't see anyone ditching DX at any point in the foreseeable future. Hopefully Intel actually comes up with DX drivers that are worth a damn.

-Is it going to get used in a console? I'd imagine the landscape would change dramatically if someone like Microsoft was working with Larrabee and developing tools for it.

-Is Larrabee going to replace Intel's integrated solutions? That could also have some interesting effects on the PC graphics market if the low-end got boosted up a bit.
Quote:-Is anybody going to bother writing a custom software rasterizer for Larrabee? Obviously the potential is there, but I don't see anyone ditching DX at any point in the foreseeable future. Hopefully Intel actually comes up with DX drivers that are worth a damn.

Well, the softrast developed as part of the paper will be part of the DX/OGL driver. Since Larrabee is supposed to be competing directly against AMD/NV on the high end, they'll have no choice but to ship a high-quality driver. Besides, they've already demonstrated via simulation that they get correct images in HL2: Ep2, Gears, and FEAR.
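
For anyone wondering what the softrast actually looks like, the paper describes a binned (sort-middle) approach: a front end bins triangles into screen-space tiles, then each core shades a whole tile out of its local L2. Here's a toy C++ sketch of the binning step; the structure and names are mine, not Intel's actual driver code:

    // Toy sketch of the binned ("sort-middle") approach the paper describes.
    // Structure and names are illustrative, not Intel's actual driver code.
    #include <vector>
    #include <algorithm>

    struct Triangle { float x[3], y[3]; /* plus vertex attributes */ };

    const int TileSize = 64; // pixels per tile side; real tile sizes are chosen to fit a core's L2

    struct Tile {
        std::vector<const Triangle*> bin; // triangles overlapping this tile
    };

    void BinTriangles(const std::vector<Triangle>& tris, std::vector<Tile>& tiles,
                      int tilesX, int tilesY)
    {
        for (const Triangle& t : tris) {
            // conservative screen-space bounding box of the triangle
            float minX = std::min({t.x[0], t.x[1], t.x[2]});
            float maxX = std::max({t.x[0], t.x[1], t.x[2]});
            float minY = std::min({t.y[0], t.y[1], t.y[2]});
            float maxY = std::max({t.y[0], t.y[1], t.y[2]});

            int tx0 = std::max(0, (int)minX / TileSize);
            int tx1 = std::min(tilesX - 1, (int)maxX / TileSize);
            int ty0 = std::max(0, (int)minY / TileSize);
            int ty1 = std::min(tilesY - 1, (int)maxY / TileSize);

            // add the triangle to every tile its bounding box touches
            for (int ty = ty0; ty <= ty1; ++ty)
                for (int tx = tx0; tx <= tx1; ++tx)
                    tiles[ty * tilesX + tx].bin.push_back(&t);
        }
        // Back end (not shown): each core grabs a tile, rasterizes and shades its bin
        // entirely out of its local L2, and writes the finished tile out once.
    }

The appeal of binning is that each framebuffer tile gets written out once, which is part of how the paper keeps its bandwidth numbers reasonable.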

Quote:-Is Larrabee going to replace Intel's integrated solutions? That could also have some interesting effects on the PC graphics market if the low-end got boosted up a bit.

I think it possibly will be. Some Intel docs even go so far as to suggest that Larrabee(-like logic) will also be part of the CPU in the future, in addition to/instead of using a QPI socket or PCIe board.
For understanding the long-term merits of Larrabee, it's interesting to look at Matt Pharr's slides from his presentation, Interactive Rendering In The Post-GPU Era.
Quote:Original post by Cypher19
Intel has made their paper available ahead of time.

By the way, you can get a free copy here for those without an ACM account.


I thought I... hm, I'd changed the link, but it seems the edit didn't go through. It's fixed now, though.
I think that RAD Game Tools is adapting their Pixomatic software renderer for Larrabee. Check out Tom Forsyth's blog for some more information.
Quote:Original post by MJP
Yeah it's interesting stuff alright. My questions are...

-Is anybody going to bother writing a custom software rasterizer for Larrabee? Obviously the potential is there, but I don't see anyone ditching DX at any point in the foreseeable future. Hopefully Intel actually comes up with DX drivers that are worth a damn.


I already have a software renderer that supports shaders. This should also mean a lot more games going cross-platform.
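
For what it's worth, in a software renderer a "shader" is basically just a function the rasterizer calls per fragment. A toy C++ sketch of the kind of hook I mean (all names here are illustrative, not from any shipping renderer):

    // Toy sketch: in a software renderer, a "pixel shader" is just a function
    // the rasterizer calls per fragment. All names here are illustrative.
    #include <stdint.h>

    struct Fragment { float u, v; float nx, ny, nz; }; // interpolated vertex attributes

    // any function with this signature can be plugged in as a pixel shader
    typedef uint32_t (*PixelShader)(const Fragment&);

    uint32_t LambertShader(const Fragment& f) {
        float ndotl = f.nz > 0.0f ? f.nz : 0.0f;        // light pointing straight down +z
        uint8_t c = (uint8_t)(ndotl * 255.0f);
        return 0xFF000000u | (c << 16) | (c << 8) | c;  // opaque ARGB grey
    }

    void ShadePixel(uint32_t* framebuffer, int pitch, int x, int y,
                    const Fragment& f, PixelShader shader) {
        framebuffer[y * pitch + x] = shader(f);         // rasterizer invokes whatever shader is bound
    }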

Quote:Original post by supagu
Quote:Original post by MJP
Yeah it's interesting stuff alright. My questions are...

-Is anybody going to bother writing a custom software rasterizer for Larrabee? Obviously the potential is there, but I don't see anyone ditching DX at any point in the foreseeable future. Hopefully Intel actually comes up with DX drivers that are worth a damn.


I already have a software renderer that supports shaders. This should also mean a lot more games going cross-platform.


Sure but is it written for Larrabee? [wink]

The problem isn't writing one, it's the fact that the userbase is going to be so fractured. If a studio decided to go with a software renderer, they'd be ignoring all the existing DX-based GPUs. Like I said before... I think uptake will depend a lot on whether Intel can get MS to stick a Larrabee in their next box.

Also what does Larrabee have to do with games being cross-platform?

Well, if more developers implement software renderers for Larrabee, then they aren't tied to DX, and would have an easier time porting their game to other platforms.

