
Do You Think Any REAL Start-Up Competition Could Arise For The Intel/AMD Empires?



#1 Toothpix   Crossbones+   -  Reputation: 810

Posted 19 April 2013 - 06:58 PM

In your opinion, do you think that any type of new start-up company could become serious competition for Intel and AMD in the consumer PC CPU market, all while creating a new architecture and instruction set (i.e. not ARM or x86, but a new architecture)? I understand that ARM and mobile processors are the place to be for any CPU company right now, but do you think it is economically possible for a startup to garner enough resources and staff to create a cutting-edge 10nm chip (probably 5nm in the near future) for the average PC/server market that seriously competes with Intel and certain AMD products in 6-7 years or less?


C dominates the world of linear procedural computing, which won't advance. The future lies in MASSIVE parallelism.



#2 Cornstalks   Crossbones+   -  Reputation: 6882

Posted 19 April 2013 - 06:59 PM

Nope.
[ I was ninja'd 71 times before I stopped counting a long time ago ] [ f.k.a. MikeTacular ] [ My Blog ] [ SWFer: Gaplessly looped MP3s in your Flash games ]

#3 SiCrane   Moderators   -  Reputation: 9124

Posted 19 April 2013 - 07:19 PM

Hell, even Intel has fallen flat on its face when it's tried to compete with x86. iAPX 432 and i860 come to mind. Ok, you can maybe argue Itanium is successful, but I wouldn't categorize it as a serious competitor with x86, even in the server space.

#4 Vortez   Crossbones+   -  Reputation: 2194

Posted 19 April 2013 - 07:36 PM

A new architecture would be refreshing...

 

5nm is pretty small. I believe the width of an atom is 1nm, so things can't really get much smaller than maybe 3nm, imo.

What I hate most is that since we can't get transistors much smaller, we keep adding cores. But in the future, who will need 32-64 cores?

Except for some highly parallelizable applications like 3D rendering, video encoding and maybe compression, it's a lot of wasted power.

Even 4-8 cores today sleep most of the time under normal average user usage.

And processor speeds have been pretty much stagnant at 3-4GHz for a while now.

Is it just me, or do hard drives seem to suffer from the same syndrome too? I bought a 2TB drive 1.5 years ago and it's still the same price now...


Edited by Vortez, 19 April 2013 - 07:46 PM.

My 3D Engine.

#5 slicer4ever   Crossbones+   -  Reputation: 2600

Posted 19 April 2013 - 08:04 PM

A new architecture would be refreshing...

 

5nm is pretty small. I believe the width of an atom is 1nm, so things can't really get much smaller than maybe 3nm, imo.

What I hate most is that since we can't get transistors much smaller, we keep adding cores. But in the future, who will need 32-64 cores?

Except for some highly parallelizable applications like 3D rendering, video encoding and maybe compression, it's a lot of wasted power.

Even 4-8 cores today sleep most of the time under normal average user usage.

And processor speeds have been pretty much stagnant at 3-4GHz for a while now.

Is it just me, or do hard drives seem to suffer from the same syndrome too? I bought a 2TB drive 1.5 years ago and it's still the same price now...

1nm is still relatively large compared to an atom (http://hypertextbook.com/facts/MichaelPhillip.shtml), and the nucleus itself is far smaller than the electron cloud surrounding it.

 

Also, you might want to read up on electron spin gates (I forget the exact name, so someone please point it out), followed by quantum computing. We still have plenty of room for development when it comes to making things smaller.

 

As for the OP, there might be a sliver of a chance with Intel's stance of soldering their future chips onto motherboards (http://semiaccurate.com/2012/11/26/intel-kills-off-the-desktop-pcs-go-with-it/), but you'll likely need serious capital for a startup, and to be competitive you'll likely be selling at a loss until you've got your own manufacturing lineup. Let's also not forget that creating a new architecture means you need to get Microsoft on board to make Windows available to your potential customers. You might be sitting there willing to say "screw Windows", but then you might as well just give away your chips if you don't have the OS with the biggest market share on your side.

 

Alternatively, you might be able to swing a deal with Google and their Chromebook laptops; if you could gain exclusivity for Chrome OS being built on top of your processor, you might have a foot in the door that way.


Edited by slicer4ever, 19 April 2013 - 08:10 PM.

Check out https://www.facebook.com/LiquidGames for some great games made by me on the Playstation Mobile market.

#6 Shannon Barber   Moderators   -  Reputation: 1373

Posted 19 April 2013 - 10:11 PM

ARM is already outselling them and Intel is terrified. ARM has ~90% of the mobile market (everything that isn't an iPhone). Even Windows 8 runs on ARM.

 

Intel built an expensive fab in the US and is so over capacity that they were looking to resell fab time.

 

Actually, thank you, I was just trying to figure out what company besides Dell to short.


Edited by Shannon Barber, 19 April 2013 - 10:11 PM.

- The trade-off between price and quality does not exist in Japan. Rather, the idea that high quality brings on cost reduction is widely accepted.-- Tajima & Matsubara

#7 Hodgman   Moderators   -  Reputation: 24004

Posted 19 April 2013 - 11:02 PM

What I hate most is that since we can't get transistors much smaller, we keep adding cores. But in the future, who will need 32-64 cores?
Except for some highly parallelizable applications like 3D rendering, video encoding and maybe compression, it's a lot of wasted power.
Even 4-8 cores today sleep most of the time under normal average user usage.
And processor speeds have been pretty much stagnant at 3-4GHz for a while now.

In the future, everyone is going to need 64 cores, because future software is going to be written to run well on 64 simple, low clock-speed cores.

The only reason we need complicated, high-clock-speed individual cores at the moment is because we all suck at writing software. It's a myth that only certain types of problems can be parallelized...

Look at the brain: a trillion simple processors, running at clock speeds measured in Hz instead of GHz, using only 20W. That's pretty energy efficient.

GPUs are the same -- their evolution into using many cores has been driven largely by energy efficiency. Running lots of simpler cores is more efficient than a single super-fast and complicated one. The problem right now is a software one -- we need to unlearn a lot of engineering knowledge and re-learn it for a better type of computer.
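
As a rough illustration of that point (this sketch is mine, not Hodgman's, and assumes nothing beyond standard C++11 threads): a trivially data-parallel sum that gets its speed from adding workers rather than raising clock speed. Each core works on its own slice, with no shared mutable state and therefore no locks:

// Minimal sketch: split a sum across however many cores the machine reports.
// No locks are needed because each worker writes only its own partial result.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main()
{
    const std::size_t n = std::size_t(1) << 24;   // ~16M elements
    std::vector<std::uint32_t> data(n, 1);

    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::uint64_t> partial(workers, 0);
    std::vector<std::thread> pool;

    for (unsigned w = 0; w < workers; ++w)
    {
        pool.emplace_back([&, w]
        {
            // Each worker sums a contiguous slice of the data.
            const std::size_t begin = n * w / workers;
            const std::size_t end   = n * (w + 1) / workers;
            partial[w] = std::accumulate(data.begin() + begin,
                                         data.begin() + end,
                                         std::uint64_t(0));
        });
    }
    for (std::thread& t : pool)
        t.join();

    // Reduce the per-worker partial sums into the final result.
    const std::uint64_t total = std::accumulate(partial.begin(), partial.end(),
                                                std::uint64_t(0));
    std::cout << "sum = " << total << " using " << workers << " workers\n";
}

The same shape of code (map over slices, then reduce the partial results) is roughly what a 64-core-friendly program looks like, whether the slices are pixels, vertices or audio blocks.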


Edited by Hodgman, 19 April 2013 - 11:03 PM.


#8 BGB   Crossbones+   -  Reputation: 1458

Posted 19 April 2013 - 11:47 PM

If Intel screws things up too badly, there is still AMD, and if AMD is also screwing things up, that means more room for VIA.

Third parties also have more of an opening now, since if things like MMX and SSE are omitted, most of the rest of the x86 ISA could be implemented without raising any patent issues.

The rest would basically be pulling a Zilog, maybe formally renaming the registers and many of the instruction mnemonics to sidestep possible copyright issues (though we may not even need this much).

say:
AL -> R0B, CL -> R1B, ...
AX -> R0W, CX -> R1W, ...
EAX -> R0D, ECX -> R1D, ...

LD R0D, [R5D+0x2C]
ST [R5D-0x18], R0D

Then assemblers would just "quietly" accept the original names.

(This would be binary-compatible with pretty much all 32-bit code compiled with default compiler settings, basically representing a 486DX- or Pentium 1-like subset.)

(64-bit is a little more of an issue at this point, since a binary-compatible x86-64 would require licensing...)
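
As a loose sketch of the "assemblers would quietly accept the original names" idea (my illustration, not BGB's; the R0B/R0W/R0D names are the hypothetical scheme above, and the ESP..EDI entries follow x86's register encoding order as an assumption), an assembler front end could simply run every register token through an alias table before encoding:

// Minimal sketch: canonicalize legacy x86 register names to the renamed
// R-register scheme before instruction encoding. Unknown tokens pass through.
#include <iostream>
#include <string>
#include <unordered_map>

// Alias table: legacy name -> renamed register (hypothetical scheme from above).
static const std::unordered_map<std::string, std::string> kRegisterAliases = {
    {"al", "r0b"}, {"cl", "r1b"}, {"dl", "r2b"}, {"bl", "r3b"},
    {"ax", "r0w"}, {"cx", "r1w"}, {"dx", "r2w"}, {"bx", "r3w"},
    {"eax", "r0d"}, {"ecx", "r1d"}, {"edx", "r2d"}, {"ebx", "r3d"},
    {"esp", "r4d"}, {"ebp", "r5d"}, {"esi", "r6d"}, {"edi", "r7d"},
};

std::string canonicalize(const std::string& token)
{
    const auto it = kRegisterAliases.find(token);
    return it != kRegisterAliases.end() ? it->second : token;
}

int main()
{
    // Old-style source such as "mov eax, [ebp+0x2c]" would still assemble,
    // since eax/ebp resolve to the same registers as r0d/r5d.
    for (std::string reg : {"eax", "ebp", "r3d", "cl"})
        std::cout << reg << " -> " << canonicalize(reg) << "\n";
}

A real assembler would do this at the tokenizer level; the point is just that accepting both naming schemes is a table lookup, not a new ISA.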


But, FWIW, x86 is kind of hard to displace at this point.
It has yet to be seen whether ARM will (actually) make any inroads into traditional x86-dominated spaces (laptops, desktops, servers, ...), nor, for that matter, whether laptops or desktops will "actually" die off.

I suspect most people are doing the whole thing of having a desktop, a laptop, and a tablet. Tablets, being the new thing, will have higher sales, since pretty much everyone already has a desktop, and in recent years there hasn't been much reason to buy a new desktop every few years (say, when 3 or 4 years later the new chips are only marginally faster than they were before...).

Like, people go "well, sales are dropping, people must be moving away from desktops", rather than, say, "sales are dropping, maybe the market is saturated...".

Edited by cr88192, 19 April 2013 - 11:51 PM.


#9 phantom   Moderators   -  Reputation: 5715

Posted 20 April 2013 - 02:38 AM

ARM is already outselling them [in the mobile space] and Intel is terrified.

"Terrified" is a bit of an over statement.

Intel admit they made a mistake with a lack of focus on mobile but since they started focusing on the power issue (an engineer at GDC mentioned to a guy I work with that they simply hadn't considered power until recently) they have made great strides. If anything I would say that ARM/Apple should probably looking over their shoulder as they won't have it their own way for too much longer...

#10 Sik_the_hedgehog   Crossbones+   -  Reputation: 1411

Posted 20 April 2013 - 03:39 AM

What I hate most is that since we can't get transistors much smaller, we keep adding cores. But in the future, who will need 32-64 cores?
Except for some highly parallelizable applications like 3D rendering, video encoding and maybe compression, it's a lot of wasted power.
Even 4-8 cores today sleep most of the time under normal average user usage.
And processor speeds have been pretty much stagnant at 3-4GHz for a while now.

In the future, everyone is going to need 64 cores, because future software is going to be written to run well on 64 simple, low clock-speed cores.

The only reason we need complicated, high-clock-speed individual cores at the moment is because we all suck at writing software. It's a myth that only certain types of problems can be parallelized...

Look at the brain: a trillion simple processors, running at clock speeds measured in Hz instead of GHz, using only 20W. That's pretty energy efficient.

GPUs are the same -- their evolution into using many cores has been driven largely by energy efficiency. Running lots of simpler cores is more efficient than a single super-fast and complicated one. The problem right now is a software one -- we need to unlearn a lot of engineering knowledge and re-learn it for a better type of computer.

Honestly, if I recall correctly, one of the biggest issues is that current multicore CPUs still have a single MMU shared among all cores. This prevents running multiple processes at the same time. If each core had its own MMU (or at least something that allowed separate processes to run simultaneously), that alone would improve performance by a lot, because all modern OSes are multitasking and run quite a large number of processes. Heck, maybe that could encourage sticking to single threading for simple stuff, because it would give more room to other processes that need the time.

 

But yes, we do have a software problem. Even with current hardware it could be a lot better... I don't get it: calculations became much faster, but interfaces became less responsive over time. At least that's the impression I get.


Don't pay much attention to "the hedgehog" in my nick, it's just because "Sik" was already taken =/ By the way, Sik is pronounced like seek, not like sick.

#11 Luckless   Crossbones+   -  Reputation: 1466

Posted 20 April 2013 - 07:24 AM

The other barrier to entry into the industry is the general issue of patents. One of my friend's families are owning partners in a chip manufacturing business whose name escapes me. They are one of the largest IC producers in the world for things like signal converters and parts for radio and audio equipment, a major industry player whose products most people likely own without knowing it. When I asked them why they hadn't gotten into the processor and general computer market ages ago, I was told that they had calculated the risks as far too high and the market too costly to enter. One patent issue and all their investment goes down the drain, so they sit in the market they're in and 'play nice'.


Old Username: Talroth
If your signature on a web forum takes up more space than your average post, then you are doing things wrong.

#12 Bregma   Crossbones+   -  Reputation: 4358

Posted 20 April 2013 - 08:15 AM

In your opinion, do you think that any type of new start-up company could become serious competition for Intel and AMD in the consumer PC CPU market, all while creating a new architecture and instruction set (i.e. not ARM or x86, but a new architecture)? I understand that ARM and mobile processors are the place to be for any CPU company right now, but do you think it is economically possible for a startup to garner enough resources and staff to create a cutting-edge 10nm chip (probably 5nm in the near future) for the average PC/server market that seriously competes with Intel and certain AMD products in 6-7 years or less?

I don't think it's really a pointed question.

 

All ARM chips are made in China by fabs you would never recognize the name of, owned by Taiwanese ODMs you would never recognize the names of, generally working for Taiwanese OEMs you're less likely to have heard of, who supply predominantly American (and a couple of Scandinavian) labels. These guys already outsell Intel and AMD in the consumer computer market.

 

In 5 or 6 years, I see no reason why the consumer computer market should not converge with the desktop/notebook PC market and the server market. Already a typical ARM-based handheld is more powerful than a typical consumer or server blade needs, and its power-saving features are far superior.

 

So, is your question about an IP company like ARM having its property in more consumer devices than Intel (because it already does), or about fabs having more silicon in the hands of consumers than Intel (because Intel is already way down the list)? Or are you talking about an entirely new CPU design becoming dominant, the way things happened in the 1970s and 1980s?


Stephen M. Webb
Professional Free Software Developer

#13 Toothpix   Crossbones+   -  Reputation: 810

Posted 20 April 2013 - 08:56 AM

Or are you talking about an entirely new CPU design becoming dominant

Yes, that was what I was talking about.


C dominates the world of linear procedural computing, which won't advance. The future lies in MASSIVE parallelism.




