Generating all primes from 2 to N the fast way


Recommended Posts

Hello all, I've been doing some primality generating recently, and I've been wondering: how many people have lists of all primes < 2^64?

How long would it take to compute this list? (Assuming one computer: 1 GHz, with 256 MB of RAM.)

How would you compute it efficiently?

From,
Nice coder

I found a list of ways to test if numbers are prime @ Dr. Math; using some of the properties of primes, it isn't hard to generate reasonably fast tests.

A simple way to test whether an odd number n is prime is the following:


bool prime = true;
// a * a <= n avoids the floating-point sqrt call and, unlike a < sqrt(n),
// correctly catches perfect squares of primes such as n = 9
for (int a = 3; a * a <= n && prime; a += 2)
{
    if (n % a == 0)
    {
        prime = false;
    }
}

Quote:
Original post by Nice Coder
And i've been wondering, how many people have lists of all primes < 2^64?


I only have primes less than 10,006,721 - in a handbook.

Quote:
How long would it take to compute this list? (Assuming one computer: 1 GHz, with 256 MB of RAM.)


Depends on the algorithm you use.

Quote:
How would you compute it efficiently?


Primality testing @ Mathworld.

Quote:
Original post by Nice Coder
Hello all, I've been doing some primality generating recently, and I've been wondering: how many people have lists of all primes < 2^64?

How long would it take to compute this list? (Assuming one computer: 1 GHz, with 256 MB of RAM.)

How would you compute it efficiently?

From,
Nice coder

If you are working in C++ and you are worried about the cost of calculation at runtime, you can compute the list at compile time with the algorithm of your choice, using template (or even preprocessor) metaprogramming. That way it has literally no runtime calculation cost, regardless of the method you use (though you'd still want a fast one, or your compile time will go through the roof). As a bonus, you also get to use the list at compile time if you happen to need it.
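As a side note (not from the original post): in modern C++ the compile-time approach is much simpler with constexpr (C++17 here) than with classic template metaprogramming. A sketch, with names of my own choosing:

```cpp
#include <array>
#include <cstddef>

// Compile-time trial-division primality check.
constexpr bool is_prime(int n)
{
    if (n < 2) return false;
    for (int d = 2; d * d <= n; ++d)
        if (n % d == 0) return false;
    return true;
}

// Build the first N primes entirely at compile time (C++17 constexpr loops).
template <std::size_t N>
constexpr std::array<int, N> first_primes()
{
    std::array<int, N> primes{};
    int candidate = 2;
    for (std::size_t i = 0; i < N; )
    {
        if (is_prime(candidate))
            primes[i++] = candidate;
        ++candidate;
    }
    return primes;
}

// The table costs nothing at runtime; it is baked into the binary.
constexpr auto primes10 = first_primes<10>();
static_assert(primes10[0] == 2 && primes10[9] == 29, "compile-time check");
```

Because the table is constexpr, it can also be used in other compile-time contexts (array sizes, further static_asserts), which matches the "added benefit" the post mentions.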

libgmp has a nextprime() function, which might suit your needs well. [it actually doesn't -guarantee- that the numbers are prime, but gives ~7 9's worth of likelihood]

Though, 2^64 is fairly large, so sieving through the entirety of the range should prove impractical.

There was a recent breakthrough in India [?] regarding primality testing, which might help.

Quote:
Original post by Telastyn
libgmp has a nextprime() function, which might suit your needs well. [it actually doesn't -guarantee- that the numbers are prime, but gives ~7 9's worth of likelihood]

Though, 2^64 is fairly large, so sieving through the entirety of the range should prove impractical.

There was a recent breakthrough in India [?] regarding primality testing, which might help.


Sieving up to 2^32 is also impractical.

I've got lists to about a billion, but it took a few hours to generate them (a version of trial division, in VB).

From,
Nice coder

Isn't there a formula, like 6x +-1 or something, which dictates most of the primes?

What is that formula?

From,
Nice coder

Sieving up to 2^32 is impractical, but possible for some. With 768 MB DDR RAM, and only about 26 MB used by the OS and background software, I easily have room for 256 2-MB unpaged chunks. All those together give 512 MB, which is exactly 2^32 bits. I've used this technique a couple of times before, primarily for testing 32-bit digest algorithms.

Using trial division, my old 266MHz Pentium-1 laptop could spit all the primes up to ~200,000,000 to a human-readable text file in eight hours, but could get the first 20,000,000 in only about three minutes.

How would the code go with that (one bit per number)?
(Please bear in mind that I'm working in VB, where Mod only goes up to 2^16, and everything is really, really slow.)

I've only got 256 MB of RAM, so I can't do that :(.

Oh well, I'll go up to maybe 2B-3B, then I'll trial-divide the rest.

From,
Nice coder

Quote:
Original post by Wyrframe
Sieving up to 2^32 is impractical, but possible for some. With 768 MB DDR RAM, and only about 26 MB used by the OS and background software, I easily have room for 256 2-MB unpaged chunks. All those together give 512 MB, which is exactly 2^32 bits. I've used this technique a couple of times before, primarily for testing 32-bit digest algorithms.

Using trial division, my old 266MHz Pentium-1 laptop could spit all the primes up to ~200,000,000 to a human-readable text file in eight hours, but could get the first 20,000,000 in only about three minutes.


Mine's 48 secs to 100K (that's just a prototype).
Edit: 5 secs at 100K;
107 secs to 1M.

How fast is yours?
From,
Nice coder

The Sieve of Eratosthenes comes to mind.

Make an array of bools of length n (n being the largest number to check).
Set all entries to true.
Then make a for loop from 2 up to sqrt(n). In the loop (2 comes first):

if(array[i])
{
    // set all multiples of i to false
}

That's it - very effective.

-CProgrammer
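For concreteness, the sieve described above might be fleshed out like this (my own sketch, with my own names):

```cpp
#include <cassert>
#include <vector>

// Sieve of Eratosthenes: returns all primes <= n in increasing order.
std::vector<int> sieve(int n)
{
    std::vector<bool> composite(n + 1, false);
    std::vector<int> primes;
    for (int i = 2; i <= n; ++i)
    {
        if (!composite[i])
        {
            primes.push_back(i);
            // Start marking at i*i: smaller multiples of i were already
            // marked when their smaller prime factors were processed.
            for (long long j = (long long)i * i; j <= n; j += i)
                composite[j] = true;
        }
    }
    return primes;
}
```

Note that only the marking loop needs to stop at sqrt(n); the outer scan still runs to n so that the large primes get collected.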

Nice, so you don't have to do any primality checking on those numbers?

Sweet, now I'm at 30 seconds for 20 million.
2 min 53 seconds for 100M.

But how can I use it for much larger numbers...

Now, I am trying to sieve up to about 3.2G (using 200 MB of RAM; how much memory should I be able to use? I have 256 MB of RAM, Win2K).

I was thinking about 1. using bit tables (how slow are they?) and 2. only storing every second number, so I use half the space (then I can cover 2*8*200M = 3.2G numbers).

I would need to search the remaining 1,094,967,296 numbers using trial division later.
That would take ages to do...

Now, to search the whole thing, I would need 268,435,456 bytes, or 268 MB; I only have 256 MB... I'm going to see if I can find a comp with more memory...

How would I use the bitfields and the every-second-number thing? (Keep in mind that I'm using VB, so bitshifts = mults.)

Am I going quickly, given programming language and experience?

From,
Nice coder

[Edited by - Nice Coder on March 5, 2005 4:38:38 AM]
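Combining the two ideas (one bit per entry, and storing only the odd numbers) could look like the sketch below - in C++ rather than VB, and with names of my own; in VB the shifts would become \ and Mod, which is where the cost goes:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Odd-only, bit-packed sieve: bit i of the table represents the odd number
// 2*i + 3, so N bytes of table cover 16*N numbers (8 bits * every 2nd number).
std::vector<uint32_t> odd_sieve(uint32_t limit, std::vector<uint8_t>& bits)
{
    uint32_t count = (limit < 3) ? 0 : (limit - 1) / 2; // odd candidates 3..limit
    bits.assign((count + 7) / 8, 0);                    // 0 = "possibly prime"
    std::vector<uint32_t> primes;
    if (limit >= 2) primes.push_back(2);                // 2 is handled separately
    for (uint32_t i = 0; i < count; ++i)
    {
        if (!(bits[i >> 3] & (1u << (i & 7))))          // bit test
        {
            uint64_t p = 2ull * i + 3;
            primes.push_back((uint32_t)p);
            for (uint64_t m = p * p; m <= limit; m += 2 * p) // odd multiples only
            {
                uint64_t j = (m - 3) / 2;
                bits[j >> 3] |= (uint8_t)(1u << (j & 7));    // bit set
            }
        }
    }
    return primes;
}
```

With this layout, 200 MB of table covers 200M * 8 * 2 = 3.2G numbers, matching the arithmetic in the post.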

I've been thinking of making a little test program: it would sieve up to x, where x is a user-specified number, before going on to resolve queries.

Queries would be things like: what is the closest prime number to x, is x prime, how many primes do you have, how fast did you search, what is a random prime number, multiply x prime numbers together and give me the result, etc.

From,
Nice coder

Guest Anonymous Poster
I've been thinking about something like dynamic programming.
If you're generating primes from 2 up to a certain number, your primality testing can be different and much more efficient than "the usual way".
That is because for every number, you have already generated all the primes smaller than it.

I thought about something like this (algorithm):

primes = {2}
currentNum = 3

while currentNum <= n:
    if no p in primes with p*p <= currentNum divides currentNum:
        add currentNum to primes
    currentNum = currentNum + 2

Eventually, the set "primes" will contain all the prime numbers smaller than or equal to n.
The set "primes" can be implemented as a vector or a list; it does not really matter.


There is another, much simpler algorithm, but it uses more memory.
You make a big boolean vector V of size n-1, where V[i] represents the number i+2. That is, V[0] represents 2, V[1] represents 3, and so on.
All the values in the vector are initialized to false.
For each index i in the vector, if V[i] is false, you mark as true every index that represents a multiple of i+2 (the indices i+(i+2), i+2(i+2), ...).
Eventually, the array is scanned sequentially, and all indices whose values remained false represent prime numbers (they are not multiples of any smaller number). This algorithm can be improved to work faster still.
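The first (incremental) algorithm might be sketched like this in C++ (my own names, not the poster's):

```cpp
#include <cassert>
#include <vector>

// Build all primes <= n by trial-dividing each odd candidate only by the
// primes already found, stopping once p*p exceeds the candidate.
std::vector<int> primes_up_to(int n)
{
    std::vector<int> primes;
    if (n >= 2) primes.push_back(2);
    for (int candidate = 3; candidate <= n; candidate += 2)
    {
        bool is_prime = true;
        for (int p : primes)
        {
            if ((long long)p * p > candidate) break; // no divisor can exist past sqrt
            if (candidate % p == 0) { is_prime = false; break; }
        }
        if (is_prime) primes.push_back(candidate);
    }
    return primes;
}
```

Dividing only by primes (rather than by all odd numbers) is exactly the saving the poster describes: each candidate is tested against far fewer divisors.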

That's the sieve.

I'm trying to figure out how to use bit tables, so that I can work with individual bits quickly...

Has anybody got any experience with this?

From,
Nice coder

Actually, that's the type of thing that I tried (really slow).

All that was needed was


open "C:\Outputfile.txt" for output as #1
dim primes() as boolean
redim primes(number)
for x = 3 to number
    if primes(x) = false then
        print #1, x
        for y = x to number step x
            primes(y) = true
        next y
    end if
next x
close #1


which searches a bit over half a million nums/second on my machine (P3 1 GHz, 256 MB RAM).
Without needing to print off the data, it searches about 3/4-1 million nums/second.

Now, the thing that I'm a bit clueless about is using bit tables...

From,
Nice coder
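On the bit-table question: a bit table is just an array of bytes where bit k lives in byte k \ 8 at position k Mod 8. A C++ sketch of the three basic operations (names are mine):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// A minimal bit table: bit k lives in byte k / 8, at position k mod 8.
// In VB the shifts below become \ 8 and Mod 8 (or a precomputed lookup table
// of the eight masks, to avoid the expensive arithmetic the poster mentions).
struct BitTable
{
    std::vector<uint8_t> bytes;

    explicit BitTable(std::size_t nbits) : bytes((nbits + 7) / 8, 0) {}

    void set(std::size_t k)   { bytes[k >> 3] |= static_cast<uint8_t>(1u << (k & 7)); }
    void clear(std::size_t k) { bytes[k >> 3] &= static_cast<uint8_t>(~(1u << (k & 7))); }
    bool test(std::size_t k) const { return (bytes[k >> 3] >> (k & 7)) & 1u; }
};
```

The payoff is eight numbers per byte: the flags for a 200-million-entry sieve fit in 25 MB instead of 200 MB.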

If you're worried about speed, then why don't you consider switching away from VB? (Sorry I can't be more helpful... [smile]) (P.S. Using the paging algorithm from the page that fruny linked to, I just generated primes to over 200 million... so now I have a bunch of useless files full of primes sitting on my hard drive... stupid impulses.)

[Edited by - lucky_monkey on March 6, 2005 8:33:19 AM]
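The "paging" approach mentioned here is usually called a segmented sieve: compute the base primes up to sqrt(N) once, then sieve the rest of the range one memory-sized window at a time. A sketch of my own (not the algorithm from the linked page):

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <vector>

// Segmented ("paged") sieve: counts primes in [2, limit] using only the base
// primes up to sqrt(limit) plus one fixed-size segment buffer.
uint64_t count_primes_segmented(uint64_t limit, uint64_t segment_size = 1 << 16)
{
    // 1. Base primes up to sqrt(limit), via a plain sieve.
    uint64_t root = 2;
    while (root * root <= limit) ++root;      // root = floor(sqrt(limit)) + 1
    std::vector<bool> comp(root + 1, false);
    std::vector<uint64_t> base;
    for (uint64_t i = 2; i <= root; ++i)
        if (!comp[i])
        {
            base.push_back(i);
            for (uint64_t j = i * i; j <= root; j += i)
                comp[j] = true;
        }

    // 2. Sieve each window [low, high) with the base primes.
    uint64_t count = 0;
    std::vector<bool> seg;
    for (uint64_t low = 2; low <= limit; low += segment_size)
    {
        uint64_t high = std::min(low + segment_size, limit + 1);
        seg.assign(high - low, false);
        for (uint64_t p : base)
        {
            if (p * p >= high) break;         // larger primes have no work here
            // First multiple of p in the window that is >= p*p.
            uint64_t start = std::max(p * p, (low + p - 1) / p * p);
            for (uint64_t m = start; m < high; m += p)
                seg[m - low] = true;
        }
        for (uint64_t i = 0; i < seg.size(); ++i)
            if (!seg[i]) ++count;
    }
    return count;
}
```

This is how the "useless files full of primes" get written with a few megabytes of working memory: each finished segment is dumped to disk before the next one is sieved.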

I would get away from VB, but then again, I want a fast algorithm... (I also have a hate-hate relationship with my C++ compiler.) Also, where's the challenge of programming something fast in a language which has no optimisation and does everything the slow way, where a division can cost a few hundred times what it should and every instruction needs to be carefully hand-optimised?

Maybe I could make all k primes to 2^32, k being 101? (After I check the first few billion numbers, using a sieve that I'm still looking forward to building - as well as being completely stumped on how to build it...)

There would be only a very small chance that the numbers that are k-prime over 101 rounds are actually composite.

I'm making a smallish adjustment to the sieve:
I'm going to make a sieve up to 200 million (assuming that nobody can tell me how to implement the changes, and I cannot figure them out myself).
After that I go into a k-prime mode, where I use an isprime of the form

dim j as double
j = x * 0.5
j += x * 0.3333333333333333333
j += x * 0.2
....
isprime = (int(j) <> j)


Is this a nice way to do it?
From,
Nice coder
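On the k-prime idea: the standard probabilistic tool for this is the Miller-Rabin test (presumably also what libgmp's next-prime routine uses internally, though that is an assumption). A C++ sketch with my own names; for 32-bit inputs, a small fixed set of bases is even known to make the test exact:

```cpp
#include <cassert>
#include <cstdint>
#include <initializer_list>

// Modular exponentiation; 64-bit products are safe for 32-bit moduli.
static uint64_t pow_mod(uint64_t b, uint64_t e, uint64_t m)
{
    uint64_t r = 1;
    b %= m;
    while (e)
    {
        if (e & 1) r = r * b % m;
        b = b * b % m;
        e >>= 1;
    }
    return r;
}

// Miller-Rabin test. With the bases {2, 7, 61} it is known to give no false
// positives for any n below 4,759,123,141, which covers all 32-bit inputs -
// so this particular version is deterministic rather than "k-prime".
bool is_prime(uint32_t n)
{
    if (n < 2) return false;
    for (uint32_t p : {2u, 3u, 5u, 7u, 11u, 13u})
        if (n % p == 0) return n == p;
    uint64_t d = n - 1;
    int s = 0;
    while ((d & 1) == 0) { d >>= 1; ++s; }       // n - 1 = d * 2^s, d odd
    for (uint64_t a : {2ull, 7ull, 61ull})
    {
        if (a % n == 0) continue;                // skip bases equal to n
        uint64_t x = pow_mod(a, d, n);
        if (x == 1 || x == n - 1) continue;      // a does not witness compositeness
        bool witness = true;
        for (int i = 1; i < s && witness; ++i)
        {
            x = x * x % n;
            if (x == n - 1) witness = false;
        }
        if (witness) return false;               // a proves n composite
    }
    return true;
}
```

So for the 2^32 range, three well-chosen rounds already give certainty; the 101-round plan only becomes relevant for much larger numbers.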

Quote:
Original post by Telastyn
There was a recent breakthrough in India [?] regarding primality testing, which might help.


The new algorithm (the AKS primality test) was developed by a team of researchers from the Indian Institute of Technology. It is, however, an "accurate primality" algorithm rather than a speedy one: it tests primality with NO errors, and is oriented more towards encryption and security applications.
