Larrabee @ SIGGRAPH

18 comments, last by MJP 15 years, 8 months ago
All the developers that are cross-platform have no problem staying cross-platform across 360, PS3, PC and the other platforms. Depending on the quality of the DX driver, Larrabee might require a dedicated software driver. As far as I know they will provide lots of source code to make the transition easier.
If Larrabee only comes out for PC, most of us won't care anyway, because at the moment this platform does not make much money... this might change in 2-3 years though.
Quote:Original post by jkleinecke
I think that RAD Game Tools is adapting their Pixomatic software renderer for Larrabee. Check out Tom Forsyth's blog for some more information.


Tom doesn't work for RAD Game Tools anymore; he's working for Intel themselves on Larrabee's software renderer, which is in turn used as the back-end for the Direct3D and OpenGL drivers.

Michael Abrash, who was still at RAD Game Tools last I heard, is not working for Intel, but was contracted to help determine the instruction set extensions that should be included in Larrabee.

I don't see Pixomatic being ported in any commercial sense, unless Abrash believes he can write a faster back-end than Intel, because it defeats the purpose of Pixomatic, which is to provide a fallback when no graphics card is present... obviously, if you have a Larrabee GPU, you have the hardware to run 3D apps through the Direct3D or OpenGL APIs.

Everything I've read in that white-paper indicates to me that the Larrabee renderer is state-of-the-art in software rendering. Pixomatic is still only a DX-7 class renderer (no shaders) and there are plenty of software renderers, such as SwiftShader, that are DX-9 class.
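To make the "DX-7 class" vs "DX-9 class" distinction concrete: a fixed-function software renderer hard-codes its per-pixel math, while a shader-capable one lets the application plug in an arbitrary per-pixel function. A toy C++ sketch (hypothetical types, not taken from Pixomatic or SwiftShader):

```cpp
#include <cstdint>
#include <functional>
#include <vector>

// Toy framebuffer: 32-bit pixels.
struct Framebuffer {
    int width, height;
    std::vector<uint32_t> pixels;
    Framebuffer(int w, int h) : width(w), height(h), pixels(size_t(w) * h, 0) {}
};

// "DX-7 class": the per-pixel operation is hard-coded into the renderer
// (here: modulate a texel by the vertex color, one channel shown for brevity).
void fill_fixed_function(Framebuffer& fb, uint32_t texel, uint32_t vertex_color) {
    for (auto& p : fb.pixels)
        p = ((texel & 0xFF) * (vertex_color & 0xFF)) / 255;
}

// "DX-9 class": the application supplies an arbitrary per-pixel function,
// which is exactly what a pixel shader is.
using PixelShader = std::function<uint32_t(int x, int y)>;

void fill_shaded(Framebuffer& fb, const PixelShader& shader) {
    for (int y = 0; y < fb.height; ++y)
        for (int x = 0; x < fb.width; ++x)
            fb.pixels[size_t(y) * fb.width + x] = shader(x, y);
}
```

A real DX-9-class software renderer would JIT-compile the shader rather than call through std::function per pixel, which is exactly where most of the engineering effort goes.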

The one chance that it might come to pass is if Pixomatic and Abrash have the will and the means to best Intel's software driver, and to ride the wave of disappointment surrounding OpenGL 3.0 to become the de-facto cross-platform graphics API. Even so, it would be limited to the Larrabee cards (and possibly x86 CPUs at much-reduced quality), and it will be some time (if ever) before Larrabee represents a significant portion of the market.

throw table_exception("(╯°□°)╯︵ ┻━┻");

One thing that I haven't heard anyone comment on is the fact that by the time that Larrabee comes out we will be up to our necks in DX11. It seems pretty obvious that OpenGL is losing ground daily, and unfortunately DX10 hasn't really picked up quite as much popularity as it should have. These two facts seem to make the success of DX11 a bit more important in the long run.

As far as how Larrabee is accepted - it will need to be efficient enough to compete with the other discrete GPUs. Personally, I don't think the first generation is going to keep pace with NV and AMD. However, Intel is a big company and continually improves their manufacturing processes which makes future generations of the chip (with their corresponding increases in real estate) look promising.

Of course, this is all somewhat premature since the hardware hasn't even been tested really. I'm sure Intel has something up and running, but there is a lot of time between now and the release of the chip. We'll see where it goes...
IMO Larrabee could be huge, if a few things take place:

1. Apple adopts it, which they may, since they're already in Intel land and it could eliminate the need for multiple GFX vendors... which Apple likes to do.

2. IIRC I heard MS, Sony and Nintendo are all talking to Intel about Larrabee. If they all adopt it and use it to code their own software rendering, rather than "I've got GeForce X or Radeon X and they support only these features", Larrabee will let them break the chains, so to speak.

3. If all the above happens and DX/GL performance is good and IQ is up to par with the ATI/Nvidia cards out at the time, this could very well be a serious threat to ATI/Nvidia. ATI could survive this, as parent AMD makes x86 CPUs and could follow Intel's approach, but Nvidia would be hosed.

We shall see what happens, but one thing is for sure: it's a great time to be in graphics programming right now.
Quote:Original post by MARS_999
We shall see what happens, but one thing is for sure: it's a great time to be in graphics programming right now.

Well spoken [grin]
Quote:Original post by MARS_999
3. If all the above happens and DX/GL performance is good and IQ is up to par with the ATI/Nvidia cards out at the time, this could very well be a serious threat to ATI/Nvidia. ATI could survive this, as parent AMD makes x86 CPUs and could follow Intel's approach, but Nvidia would be hosed.


Well, that presupposes that AMD have the resources to create a Larrabee-like core, which is a pretty dim prospect at the moment -- but it is an out for them if Larrabee becomes a dominant force.

I'm still unclear on the x86 license arrangement that would reportedly prevent nVidia from acquiring an x86-licensed company, but I've heard it reported that the agreement is that the license cannot be transferred out of the country by sale, not that it cannot be transferred at all. If that information is correct, I think both nVidia and IBM would be able to acquire someone like AMD, if they go on the outs, and pick up the x86 license.

That said, nVidia has made it well clear that they do not like the x86 architecture and would rather do away with it entirely. They do work with ARM cores, however, and there's no reason a Larrabee-like card couldn't be made with ARM cores (which are inherently lower-wattage and have small dies anyway, both huge advantages for cramming more of them into a single piece of silicon). While it's not x86 and gives up the more-direct porting advantage, the ARM instruction set is probably the second-most popular ISA in the consumer space, followed by PPC and MIPS. ARM also has the distinct advantage of being the second-largest Linux arch, and licenses are easily had. Throw in extensions and a wide vector unit to match Larrabee 1-to-1 in ISA features and you've got something just as programmable as Larrabee, and potentially with greater density and less power consumption.

Call it crazy if you will, I call it just crazy enough to work.

throw table_exception("(╯°□°)╯︵ ┻━┻");

Quote:Original post by Jason Z
One thing that I haven't heard anyone comment on is the fact that by the time that Larrabee comes out we will be up to our necks in DX11. It seems pretty obvious that OpenGL is losing ground daily, and unfortunately DX10 hasn't really picked up quite as much popularity as it should have. These two facts seem to make the success of DX11 a bit more important in the long run.

As far as how Larrabee is accepted - it will need to be efficient enough to compete with the other discrete GPUs. Personally, I don't think the first generation is going to keep pace with NV and AMD. However, Intel is a big company and continually improves their manufacturing processes which makes future generations of the chip (with their corresponding increases in real estate) look promising.

Of course, this is all somewhat premature since the hardware hasn't even been tested really. I'm sure Intel has something up and running, but there is a lot of time between now and the release of the chip. We'll see where it goes...


Larrabee doesn't have to compete with the high-end at all.

If Intel can put something together that's reasonable enough to compete at the mid-range, but with the flexibility of a CPU, then they could easily plant themselves a nice huge slab of market share.
Add to that the fact that Intel already OWN the IGP space for PCs, and I don't see why they couldn't ship 32nm 4-8 core Larrabee IGP mobos (with a nice fast bus between the chip and the CPU), with which they could solidify a HUGE market segment, capturing practically ALL of the low-end market. With that kind of penetration I don't see why it wouldn't be profitable for software vendors to target the hardware, providing even more of an incentive for heavily optimized software renderers and making DX & GL even less relevant.

If Intel can do all of these things and really grab a hold of the low-end IGP & mid-range PCIe markets, it really wouldn't leave much room for the competitors IMO.
Quote:Original post by MJP
-Is anybody going to bother writing a custom software rasterizer for Larrabee? Obviously the potential is there, but I don't see anyone ditching DX at any point in the foreseeable future. Hopefully Intel actually comes up with DX drivers that are worth a damn.


With Pixomatic maybe they can just ignore the whole driver nonsense and use a Larrabee-optimized Pixomatic as a drop-in replacement for the DX HAL device. Pixomatic 3 supported the full DX9 feature set (Pixo 2 was DX7-class), and since Intel reportedly acquired it back in 2005, hopefully they'll be able to bring a full D3D10/11 feature set to bear.

Hopefully, not because I'm much of an Intel fan, but because I sincerely hope Larrabee (and to a lesser extent the OpenGL debacle) isn't the herald of all kinds of splintered APIs popping up.


Quote:Original post by MARS_999
ATI could survive this, as they make x86 CPUs and could follow Intel's approach, but Nvidia would be hosed.


I'm not sure. NVidia already has their Tesla thing working, which is basically Larrabee the other way around. In some ways, I think the Larrabee concept is a reaction to the (arguable) threat GPGPU posed to x86 architectures. Intel obviously holds the best cards for the consumer market and I think Larrabee will prove successful, but NVidia is coming in from the other flank. It may be the long way around, but they do already have a market-ready system.

Interesting times indeed [smile]
Rim van Wersch [ MDXInfo ] [ XNAInfo ] [ YouTube ] - Do yourself a favor and bookmark this excellent free online D3D/shader book!
Does it really matter if Intel can keep up with the separate GPU boards? Intel owns the GPU market, because most people get Intel GPUs in their pre-built machines.

If everyone just starts getting a Larrabee with their new machines, and it is at least somewhat capable, then people just need to start targeting their games at the performance of the Larrabee.
Quote:Original post by remigius

Hopefully, not because I'm much of an Intel fan, but because I sincerely hope Larrabee (and to lesser extent the OpenGL debacle) isn't the herald of all kinds of splintered APIs popping up.



Yeah, I keep hearing people suggest that in the future every graphics programmer is going to suddenly start writing right to the metal on Larrabee using straight-up C++. I really don't see this as likely, because

A) ATI and Nvidia GPUs are very widespread and they're not going anywhere

and

B) Writing the kind of parallelized code you'd need for Larrabee in straight C++ is a nightmare.
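To illustrate point B: even a trivial full-screen operation, hand-parallelized in plain C++, already drags in tile partitioning and thread-lifetime boilerplate that a D3D/GL driver normally hides. A minimal sketch using standard C++ threads (hypothetical code; Larrabee's actual native API is a different question entirely):

```cpp
#include <algorithm>
#include <cstdint>
#include <thread>
#include <vector>

// Clear a framebuffer by splitting the rows into one horizontal band per thread.
// All of this partitioning and join boilerplate is implicit when you just call
// Clear() through Direct3D and let the driver schedule the cores for you.
void parallel_clear(std::vector<uint32_t>& pixels, int width, int height,
                    uint32_t color, unsigned num_threads) {
    std::vector<std::thread> workers;
    int rows_per_thread = (height + int(num_threads) - 1) / int(num_threads);
    for (unsigned t = 0; t < num_threads; ++t) {
        int y0 = int(t) * rows_per_thread;
        int y1 = std::min(height, y0 + rows_per_thread);
        if (y0 >= y1) break; // more threads than bands
        workers.emplace_back([&pixels, width, color, y0, y1] {
            for (int y = y0; y < y1; ++y)
                for (int x = 0; x < width; ++x)
                    pixels[size_t(y) * width + x] = color;
        });
    }
    for (auto& w : workers) w.join();
}
```

And this is before SIMD, load balancing or synchronization between passes enter the picture.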

I think, as always, developers are going to want to get behind something that lets them target all the platforms with a reasonable level of performance and abstraction. Whether that's D3D, or CUDA, or Ct, or whatever.

This topic is closed to new replies.
