Larrabee @ SIGGRAPH


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

19 replies to this topic

#1 Cypher19   Members   -  Reputation: 768


Posted 08 August 2008 - 04:59 AM

Intel has made their paper available ahead of time. Pretty interesting read, I found. It's a shame that the final chip hasn't been produced yet, so we don't know how many Larrabee units there'll be in the consumer models. Hopefully it'll be enough to be competitive with the chips that AMD/NV put out in 2010. I mean, it's been a while since I've run it, but I think my 8800GTS was able to render Gears PC at 1280*1024 locked at 60fps, which is equivalent to having a 24-Larrabee-unit chip. To be able to match that in 2010, won't Intel have to have over 60 units on the high end? [Edited by - Cypher19 on August 8, 2008 6:24:08 PM]
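(For anyone who wants to play with that arithmetic, here's the back-of-envelope version. The 24-unit baseline is the figure above; the doubling period for competing GPUs is purely my guess, and linear scaling with unit count is assumed.)

```python
# Back-of-envelope only: 24 units is the baseline for Gears at
# 1280x1024/60fps on 2006-class hardware; the doubling period for
# competing GPUs is a guess, and perfect scaling with unit count
# is assumed.
def units_needed(baseline_units, years_elapsed, doubling_period):
    """Units needed to match competitors whose performance doubles
    every `doubling_period` years, assuming linear unit scaling."""
    return baseline_units * 2 ** (years_elapsed / doubling_period)

print(units_needed(24, 4, 2))  # 2006 -> 2010 at a 2-year doubling: 96.0
```

Under a gentler 3-year doubling you get roughly 60 units, which is where my "over 60 on the high end" comes from.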


#2 MJP   Moderators   -  Reputation: 11339


Posted 08 August 2008 - 07:23 AM

Yeah it's interesting stuff alright. My questions are...

-Is anybody going to bother writing a custom software rasterizer for Larrabee? Obviously the potential is there, but I don't see anyone ditching DX at any point in the foreseeable future. Hopefully Intel actually comes up with DX drivers that are worth a damn.

-Is it going to get used in a console? I'd imagine the landscape would change dramatically if someone like Microsoft was working with Larrabee and developing tools for it.

-Is Larrabee going to replace Intel's integrated solutions? That could also have some interesting effects on the PC graphics market if the low-end got boosted up a bit.

#3 Cypher19   Members   -  Reputation: 768


Posted 08 August 2008 - 08:27 AM

Quote:
-Is anybody going to bother writing a custom software rasterizer for Larrabee? Obviously the potential is there, but I don't see anyone ditching DX at any point in the foreseeable future. Hopefully Intel actually comes up with DX drivers that are worth a damn.

Well, the softrast developed as part of the paper will be part of the DX/OGL driver. Since Larrabee is supposed to be directly competing against AMD/NV on the high end, they'll have no choice but to have a high quality driver. Besides, they already managed to demonstrate via simulation that they are getting correct images in HL2Ep2, Gears, and Fear.

Quote:
-Is Larrabee going to replace Intel's integrated solutions? That could also have some interesting effects on the PC graphics market if the low-end got boosted up a bit.

I think it possibly will be. Some Intel docs even go so far as to suggest that Larrabee(-like logic) will also be part of the CPU in the future, in addition to/instead of using a QPI socket or PCIe board.

#4 DonnieDarko   Members   -  Reputation: 251


Posted 08 August 2008 - 10:20 AM

For understanding the long-term merits of Larrabee, it's interesting to look at Matt Pharr's slides from his presentation Interactive Rendering In The Post-GPU Era.

#5 AndyTX   Members   -  Reputation: 802


Posted 08 August 2008 - 12:00 PM

Quote:
Original post by Cypher19
Intel has made their paper ready ahead of time.

By the way, you can get a free copy here for those without an ACM account.




#6 Cypher19   Members   -  Reputation: 768


Posted 08 August 2008 - 12:23 PM

I thought I-...hm, I changed the link, but it seems the edit didn't go through. I changed it now, though.

#7 jkleinecke   Members   -  Reputation: 251


Posted 11 August 2008 - 05:02 AM

I think RAD Game Tools is adapting their Pixomatic software renderer for Larrabee. Check out Tom Forsyth's blog for more information.

#8 supagu   Members   -  Reputation: 148


Posted 11 August 2008 - 11:25 PM

Quote:
Original post by MJP
Yeah it's interesting stuff alright. My questions are...

-Is anybody going to bother writing a custom software rasterizer for Larrabee? Obviously the potential is there, but I don't see anyone ditching DX at any point in the foreseeable future. Hopefully Intel actually comes up with DX drivers that are worth a damn.


I already have a software renderer that supports shaders. This should also mean a lot more games going cross platform.
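For the curious, "supports shaders" in a software renderer just means the per-pixel work is an ordinary function the rasterizer calls, rather than fixed-function logic. A toy sketch of the idea (hypothetical Python, not my actual renderer and nothing Larrabee-specific):

```python
# Toy software rasterizer with a pluggable "pixel shader": the shader is
# just a callable taking barycentric weights, which is the essence of a
# shader-capable software renderer (real ones vectorize all of this).
def rasterize(w, h, tri, shader):
    """Fill `tri` ((x,y) vertices) into a w*h grid, calling `shader`
    with barycentric weights for every covered pixel center."""
    (x0, y0), (x1, y1), (x2, y2) = tri

    def edge(ax, ay, bx, by, px, py):
        # Signed area of the triangle (a, b, p); sign tells the side.
        return (px - ax) * (by - ay) - (py - ay) * (bx - ax)

    area = edge(x0, y0, x1, y1, x2, y2)
    frame = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            px, py = x + 0.5, y + 0.5
            w0 = edge(x1, y1, x2, y2, px, py)
            w1 = edge(x2, y2, x0, y0, px, py)
            w2 = edge(x0, y0, x1, y1, px, py)
            # Inside if all edge functions agree in sign.
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                frame[y][x] = shader(w0 / area, w1 / area, w2 / area)
    return frame

# "Shader": interpolate a per-vertex red value, i.e. Gouraud-style shading.
frame = rasterize(8, 8, [(1, 1), (7, 1), (1, 7)],
                  lambda a, b, c: round(255 * a))
```

Swapping the lambda swaps the shading model with no change to the rasterizer, which is the flexibility argument for Larrabee-style rendering.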



#9 MJP   Moderators   -  Reputation: 11339


Posted 12 August 2008 - 03:41 AM

Quote:
Original post by supagu
Quote:
Original post by MJP
Yeah it's interesting stuff alright. My questions are...

-Is anybody going to bother writing a custom software rasterizer for Larrabee? Obviously the potential is there, but I don't see anyone ditching DX at any point in the foreseeable future. Hopefully Intel actually comes up with DX drivers that are worth a damn.


I already have a software renderer that supports shaders. This should also mean a lot more games going cross platform.


Sure but is it written for Larrabee? [wink]

The problem isn't writing one, it's that the userbase is going to be so fractured. If a studio decided to go with a software renderer, they'd be ignoring all the existing DX-based GPUs. Like I said before... I think uptake will depend a lot on whether Intel can get MS to stick a Larrabee in their next box.

Also what does Larrabee have to do with games being cross-platform?



#10 DeathRay2K   Members   -  Reputation: 100


Posted 12 August 2008 - 06:30 AM

Well, if more developers implement software renderers for Larrabee, then they aren't tied to DX, and would have an easier time porting their game to other platforms.

#11 wolf   Members   -  Reputation: 848


Posted 12 August 2008 - 07:19 AM

All the developers that are cross-platform have no problem staying cross-platform across 360, PS3, PC and the other platforms. Depending on the quality of the DX driver, Larrabee might require a dedicated software driver. As far as I know they will provide lots of source code to make the transition easier.
If Larrabee only comes out for PC, most of us won't care anyway, because at the moment this platform does not make much money... that might change in 2-3 years though.

#12 Ravyne   GDNet+   -  Reputation: 7373


Posted 12 August 2008 - 11:31 AM

Quote:
Original post by jkleinecke
I think RAD Game Tools is adapting their Pixomatic software renderer for Larrabee. Check out Tom Forsyth's blog for more information.


Tom doesn't work for RAD Game Tools anymore; he's working for Intel themselves on Larrabee's software renderer, which is in turn used as the back-end for the Direct3D and OpenGL drivers.

Michael Abrash, who last I heard continues to work at RAD Game Tools, is not working for Intel but was contracted to help determine the instruction set extensions that should be included in Larrabee.

I don't see Pixomatic being ported in any commercial sense unless Abrash believes he can write a faster back-end than Intel's, because porting it would defeat the purpose of Pixomatic, which is to provide a fallback when no graphics card is present. Obviously, if you have a Larrabee GPU, you have the hardware to run 3D apps using the Direct3D or OpenGL APIs.

Everything I've read in that white-paper indicates to me that the Larrabee renderer is state-of-the-art in software rendering. Pixomatic is still only a DX-7 class renderer (no shaders) and there are plenty of software renderers, such as SwiftShader, that are DX-9 class.

The one chance it might come to pass is if Pixomatic and Abrash have the will and the means to best Intel's software driver, and to ride the wave of disappointment surrounding OpenGL 3.0 to become the de facto cross-platform graphics API. Even then, it would be limited to Larrabee cards (and possibly x86 CPUs at much-reduced quality), and it will be some time (if ever) before Larrabee represents a significant portion of the market.

#13 Jason Z   Crossbones+   -  Reputation: 5062


Posted 13 August 2008 - 04:10 PM

One thing that I haven't heard anyone comment on is the fact that by the time that Larrabee comes out we will be up to our necks in DX11. It seems pretty obvious that OpenGL is losing ground daily, and unfortunately DX10 hasn't really picked up quite as much popularity as it should have. These two facts seem to make the success of DX11 a bit more important in the long run.

As far as how Larrabee is accepted - it will need to be efficient enough to compete with the other discrete GPUs. Personally, I don't think the first generation is going to keep pace with NV and AMD. However, Intel is a big company and continually improves their manufacturing processes which makes future generations of the chip (with their corresponding increases in real estate) look promising.

Of course, this is all somewhat premature since the hardware hasn't even been tested really. I'm sure Intel has something up and running, but there is a lot of time between now and the release of the chip. We'll see where it goes...

#14 MARS_999   Members   -  Reputation: 1280


Posted 13 August 2008 - 04:45 PM

IMO Larrabee could be huge, if a few things take place:

1. Apple adopts it, which they may, since they're already in Intel land and it could eliminate the need for multiple GFX vendors... which Apple likes to do.

2. IIRC, MS, Sony, and Nintendo are all talking to Intel about Larrabee. If they all adopt it, they can use it to write their own software rendering instead of being stuck with "I got GeForce X or Radeon X and they support only these features"; Larrabee would let them break those chains, so to speak.

3. If all the above happens and DX/GL performance is good and IQ is up to par with ATI/Nvidia cards out at the time, this could very well be a serious threat to ATI/Nvidia. ATI could survive this, as they make x86 CPUs and could follow Intel's approach, but Nvidia would be hosed.

We shall see what happens, but one thing's for sure: it's a great time to be in graphics programming right now.

#15 Jason Z   Crossbones+   -  Reputation: 5062


Posted 13 August 2008 - 11:48 PM

Quote:
Original post by MARS_999
We shall see what happens, but one thing's for sure: it's a great time to be in graphics programming right now.

Well spoken [grin]


#16 Ravyne   GDNet+   -  Reputation: 7373


Posted 14 August 2008 - 08:37 AM

Quote:
Original post by MARS_999
3. If all the above happens and DX/GL performance is good and IQ is up to par with ATI/Nvidia cards out at the time, this could very well be a serious threat to ATI/Nvidia. ATI could survive this, as they make x86 CPUs and could follow Intel's approach, but Nvidia would be hosed.


Well, that presupposes that AMD has the resources to create a Larrabee-like core, which is a pretty dim prospect at the moment -- but it is an out for them if Larrabee becomes a dominant force.

I'm still unclear on the x86 license arrangement that would reportedly prevent nVidia from acquiring an x86-licensed company, but I've heard it reported that the license cannot be transferred out of the country by sale, not that it cannot be transferred at all. If that information is correct, I think both nVidia and IBM would be able to acquire someone like AMD, if they go on the outs, and pick up the x86 license.

That said, nVidia has made it quite clear that they do not like the x86 architecture and would rather do away with it entirely. They do work with ARM cores, however, and there's no reason a Larrabee-like card couldn't be made with ARM cores (which are inherently lower-wattage and have small dies anyway, both huge advantages when cramming more of them onto a single piece of silicon). While it's not x86 and gives up the more direct porting advantage, the ARM instruction set is probably the second-most popular ISA in the consumer space, followed by PPC and MIPS. ARM also has the distinct advantage of being the second-largest Linux arch, and licenses are easily had. Throw in extensions and a wide vector unit to match Larrabee 1-to-1 in ISA features and you've got something just as programmable as Larrabee, potentially with greater density and less power consumption.

Call it crazy if you will, I call it just crazy enough to work.

#17 ArchangelMorph   Members   -  Reputation: 262


Posted 15 August 2008 - 02:08 AM

Quote:
Original post by Jason Z
One thing that I haven't heard anyone comment on is the fact that by the time that Larrabee comes out we will be up to our necks in DX11. It seems pretty obvious that OpenGL is losing ground daily, and unfortunately DX10 hasn't really picked up quite as much popularity as it should have. These two facts seem to make the success of DX11 a bit more important in the long run.

As far as how Larrabee is accepted - it will need to be efficient enough to compete with the other discrete GPUs. Personally, I don't think the first generation is going to keep pace with NV and AMD. However, Intel is a big company and continually improves their manufacturing processes which makes future generations of the chip (with their corresponding increases in real estate) look promising.

Of course, this is all somewhat premature since the hardware hasn't even been tested really. I'm sure Intel has something up and running, but there is a lot of time between now and the release of the chip. We'll see where it goes...


Larrabee doesn't have to compete with the high-end at all.

If Intel can put something together that's reasonable enough to compete at the mid-range but with the flexibility of a CPU, then they could easily claim a nice huge slab of marketshare.
Add to that the fact that Intel already OWNS the IGP space for PCs: I don't see why they couldn't ship 32nm 4-8 core Larrabee IGP mobos (with a nice fast bus between the chip and the CPU) and solidify a HUGE market segment, capturing practically ALL of the low-end market. With that kind of penetration, I don't see why it wouldn't be profitable for software vendors to target the hardware, providing even more incentive for heavily optimized software renderers and making DX & GL even less relevant.

If Intel can do all of these things and really grab a hold of the low-end IGP & mid-range PCIe markets, it really wouldn't leave much room for the competitors IMO.

#18 remigius   Members   -  Reputation: 1172


Posted 15 August 2008 - 04:05 AM

Quote:
Original post by MJP
-Is anybody going to bother writing a custom software rasterizer for Larrabee? Obviously the potential is there, but I don't see anyone ditching DX at any point in the foreseeable future. Hopefully Intel actually comes up with DX drivers that are worth a damn.


With Pixomatic maybe they can just ignore the whole driver nonsense and use a Larrabee-optimized Pixomatic as a drop-in replacement for the DX HAL device. Pixomatic 3 supported the full DX9 featureset (Pixo 2 was DX7-class), but since Intel reportedly acquired it back in 2005, hopefully they'll be able to bring a full D3D10/11 featureset to bear.

Hopefully, not because I'm much of an Intel fan, but because I sincerely hope Larrabee (and to lesser extent the OpenGL debacle) isn't the herald of all kinds of splintered APIs popping up.


Quote:
Original post by MARS_999
ATI could survive this, as they make x86 CPUs and could follow Intel's approach, but Nvidia would be hosed.


I'm not sure. NVidia already has their Tesla thing working, which is basically Larrabee the other way around. In some ways, I think the Larrabee concept is a reaction to the (arguable) threat GPGPU posed to x86 architectures. Intel obviously holds the best cards for the consumer market and I think Larrabee will prove successful, but NVidia is coming in from the other flank. It may be the long way around, but they do already have a market-ready system.

Interesting times indeed [smile]


Rim van Wersch [ MDXInfo ] [ XNAInfo ] [ YouTube ] - Do yourself a favor and bookmark this excellent free online D3D/shader book!

#19 MrDaaark   Members   -  Reputation: 3555


Posted 15 August 2008 - 04:30 AM

Does it really matter if Intel can keep up with the separate GPU boards? Intel owns the GPU market, because most people get Intel GPUs in their pre-built machines.

If everyone just starts getting a Larrabee with their new machines, and it is at least somewhat capable, then people just need to start targeting their games to the performance of the Larrabee.

#20 MJP   Moderators   -  Reputation: 11339


Posted 15 August 2008 - 05:32 AM

Quote:
Original post by remigius

Hopefully, not because I'm much of an Intel fan, but because I sincerely hope Larrabee (and to lesser extent the OpenGL debacle) isn't the herald of all kinds of splintered APIs popping up.



Yeah, I keep hearing people suggest that in the future every graphics programmer is going to suddenly start writing right to the metal on Larrabee using straight-up C++. I really don't see this as likely because

A) ATI and Nvidia GPUs are very widespread and they're not going anywhere

and

B) Writing the kind of parallelized code you'd need for Larrabee in straight C++ is a nightmare.

I think, as always, developers are going to want to get behind something that lets them target all the platforms with a reasonable level of performance and abstraction. Whether that's D3D, or CUDA, or Ct, or whatever.
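To give a feel for what (B) involves: the Larrabee paper describes a binned/tiled software pipeline, and even a toy tiled-shading sketch (hypothetical Python here; the real thing would be hand-vectorized C++, and nothing below is actual Larrabee API) shows the scheduling bookkeeping you'd be taking on yourself:

```python
# Sketch of screen-space binning: split the frame into tiles and shade
# each tile on an independent worker, with a thread pool standing in
# for Larrabee's cores. A real renderer also has to bin primitives per
# tile, balance load, and vectorize the inner loop by hand.
from concurrent.futures import ThreadPoolExecutor

def shade_tile(tile):
    """Shade one tile independently; a dummy checkerboard stands in
    for the real per-pixel work."""
    x0, y0, x1, y1 = tile
    return [(x + y) % 2 for y in range(y0, y1) for x in range(x0, x1)]

def render(width, height, tile_size=16):
    # Bin the screen into tile rectangles (clamped at the edges).
    tiles = [(x, y, min(x + tile_size, width), min(y + tile_size, height))
             for y in range(0, height, tile_size)
             for x in range(0, width, tile_size)]
    with ThreadPoolExecutor() as pool:
        return list(pool.map(shade_tile, tiles))

frame_tiles = render(64, 64)  # 16 independent 16x16 tiles
```

And that's the easy part; once the working set, synchronization, and vector lanes are yours to manage, most teams will happily pay for an abstraction layer.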




