### #Actualsooner123

Posted 07 December 2012 - 02:50 AM

> I didn't "hear" a definition of prime

I think you're mincing words here. I wasn't implying you were a layman to mathematics, just that the definition you might have been basing that statement on was a layman's definition. I know, from years of experience dealing with you, that you have an awesome grasp of mathematics and computer science. And probably much more.

> (Fundamental Theorem of Algebra)

Actually it's the Fundamental Theorem of Arithmetic, which is what I was referring to. The Fundamental Theorem of Algebra is something very different.
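
The point about the Fundamental Theorem of Arithmetic can be made concrete. Here's a small sketch (the function names are my own, not anything from the thread) showing that factorization into primes greater than 1 is unique, and that admitting 1 as a prime would immediately break that uniqueness:

```python
from math import prod

def prime_factorization(n: int) -> list[int]:
    """Return the prime factors of n > 1 in nondecreasing order (trial division)."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

# Unique factorization: 12 = 2 * 2 * 3, and no other multiset of primes works.
assert prime_factorization(12) == [2, 2, 3]

# If 1 were a prime, [1, 2, 2, 3] and [1, 1, 2, 2, 3] would also be
# "prime factorizations" of 12 -- the uniqueness clause of the theorem fails.
assert prod([2, 2, 3]) == prod([1, 2, 2, 3]) == prod([1, 1, 2, 2, 3]) == 12
```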

> The modern definition of prime is an element of a commutative ring that is not 0, not a unit, and generates a prime ideal (that's the condition I just described in my previous post). The "not a unit" excludes 1. But this is a convention that people have found convenient, and has no deeper meaning.

Except that this modern definition, with which I'm familiar, is a consequence of its equivalence to the definition that satisfies the Fundamental Theorem of Arithmetic, together with a desire to merge abstract algebra with number theory.
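
For what it's worth, the ideal-theoretic condition quoted above can be checked by brute force in Z. This sketch (my own illustration) tests "p divides a·b implies p divides a or p divides b" over a small range; note that a unit like 1 satisfies the divisibility condition vacuously, which is exactly why the "not a unit" clause has to be written into the definition by hand:

```python
def satisfies_prime_condition(p: int, limit: int = 30) -> bool:
    """Check 'p | a*b implies p | a or p | b' for all 1 <= a, b < limit."""
    return all(
        a % p == 0 or b % p == 0
        for a in range(1, limit)
        for b in range(1, limit)
        if (a * b) % p == 0
    )

assert satisfies_prime_condition(7)      # a genuine prime passes
assert satisfies_prime_condition(1)      # the unit 1 passes vacuously (1 divides everything)
assert not satisfies_prime_condition(6)  # 6 divides 2*3 but divides neither factor
```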

> If we were to consider 1 a prime, many theorems would have to be reworded to exclude it. But some other theorems would be simplified. For instance, see Wilson's theorem. There are also theorems that need to exclude 2 (example), but that doesn't mean that 2 isn't prime.

Yep. But this isn't really related to what we're talking about, which is the definition itself and whether 1's exclusion is a convention or a consequence of something deeper, not the consequences of things being defined differently.
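
The Wilson's theorem simplification mentioned above can be sketched quickly (my own example, not from the thread). The criterion "n is prime iff (n−1)! ≡ −1 (mod n)" holds for n = 1 without modification, since every residue is 0 mod 1, so this particular statement would need no rewording if 1 were called prime:

```python
from math import factorial

def wilson_prime(n: int) -> bool:
    """Wilson's criterion: (n-1)! is congruent to -1 modulo n."""
    return factorial(n - 1) % n == (-1) % n

assert wilson_prime(1)  # holds trivially: both sides are 0 mod 1
assert [k for k in range(2, 20) if wilson_prime(k)] == [2, 3, 5, 7, 11, 13, 17, 19]
```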
