Archived

This topic is now archived and is closed to further replies.

Huge arrays


Recommended Posts

Hi. How could I use the following array: DWORD ary[4294967296]; // 2^32 elements. I could not make it work in Visual C++ 6 because the size of the array was too big. Is there any way to use very big arrays? Floru

Share on other sites
You have to use new or malloc to allocate the memory for the array.

A.B.C.

A - Always
B - Be
C - Coding

Share on other sites
WOOOOAAAA!!! That's 16 gigabytes of memory you're trying to allocate there. I know memory is cheap these days, but I don't believe it's that cheap; 128MB-512MB is standard for most new PCs now. Okay, newer PCs may have to have 2GB of RAM to run XP...LOL. But unless you really want to be swapping like mad, I think you'll have to find another way to write your program!

If you just want it to compile, look at something like malloc() (or new in C++), but I'm still not sure they'll take numbers that big!

BTW: I think that even if you do have 16GB of RAM, there are issues with accessing the areas over 4GB on 32-bit processors - correct me if I'm wrong.

Share on other sites
You can also increase the size of the stack (it fails beyond ~128MB, though).

Share on other sites
As far as I know, Windows NT/2K doesn't give more than 2 GB of address space to one process...

Anyway, what are you trying to do ?

Share on other sites

I'm reading a file from beginning to end. I read the file in 32-bit (DWORD) chunks, and I want to know which numbers occur least and most often in the file.

In very simple way:

DWORD ary[4294967296]; // one counter per possible DWORD value
DWORD num;

ZeroMemory(ary, sizeof(ary));

while (!feof(file))
{
    fread(&num, sizeof(num), 1, file);
    ary[num]++;
}

That way I would know exactly how many occurrences of each number are in the file. But it's not necessary to know all the occurrences - perhaps only the 20 biggest and smallest counts.

Floru

Share on other sites
Why don't you instead use a linked list of a structure that holds the number and a counter for how many times you have found it? On each read in the loop you walk the list: if you find the newly loaded number, you ++ the counter on that node; if you don't, you add a new node at the end and set its counter to 1. When you reach EOF you walk the list again to harvest the results.

Hope it helps.
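A minimal sketch of that idea (the names Node and record are mine, and it counts values from memory rather than a file, just for brevity):

```cpp
typedef unsigned long DWORD;

// One node per distinct value seen so far.
struct Node {
    DWORD value;  // the number
    DWORD count;  // how many times it has occurred
    Node *next;
};

// Walk the list looking for num; bump its counter if found,
// otherwise append a new node with count = 1. Returns the head.
Node *record(Node *head, DWORD num)
{
    Node **link = &head;
    while (*link) {
        if ((*link)->value == num) {
            (*link)->count++;
            return head;
        }
        link = &(*link)->next;
    }
    *link = new Node;
    (*link)->value = num;
    (*link)->count = 1;
    (*link)->next = 0;
    return head;
}
```

Harvesting the results is then just one more walk over the list. Note the search is linear in the number of distinct values, which matters later in the thread.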

Share on other sites

Sounds good. I'll have to try. Thanks.

Share on other sites
This may not be the most efficient way of doing it, but it will certainly be more efficient than trying to allocate 16GB:

struct info
{
    DWORD value;
    DWORD count;
};

int main(int argc, char **argv)
{
    std::vector<info> infovec;
    DWORD num;

    while (!feof(file))
    {
        fread(&num, sizeof(num), 1, file);

        bool found = false;
        for (std::vector<info>::iterator it = infovec.begin();
             it != infovec.end(); ++it)
        {
            if (it->value == num)
            {
                it->count++;
                found = true;
                break;
            }
        }

        if (!found)
        {
            info i;
            i.value = num;
            i.count = 1;
            infovec.push_back(i);
        }
    }
}

Share on other sites
Don't use a linked list!! (Linear search time, ARGH.)
Use a hash table (constant search time), or if you can't, a std::map (logarithmic search time).
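A std::map version of the counting loop might look like this (a sketch only - the countValues helper and the in-memory buffer are illustrative, not from the thread):

```cpp
#include <cstddef>
#include <map>

typedef unsigned long DWORD;

// Count how often each value appears in buf[0..len).
// std::map only stores values actually seen, so memory grows
// with the number of distinct values, not with the 2^32 range.
std::map<DWORD, DWORD> countValues(const DWORD *buf, std::size_t len)
{
    std::map<DWORD, DWORD> counts;
    for (std::size_t i = 0; i < len; ++i)
        counts[buf[i]]++;  // operator[] default-constructs the count to 0
    return counts;
}
```

Each lookup/insert is O(log n) in the number of distinct values, versus O(n) for the linear scan of a list or vector.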

Share on other sites
16 gigs of mem. thought you were coding the Matrix.

Share on other sites

Doesn't the hash table need the array to plug the numbers into? I think you are suggesting the solutions this person already had, but with possibly faster search/indexing.

I think you need to sample the file and go with the linked list option. I don't think you can reasonably read in the entire file and then parse it. The linked list could choke too if too many of the numbers are unique: too many nodes would be allocated and the whole memory problem pops up again.

Maybe I'm not thinking about these problems the right way?

Share on other sites
Sounds like a homework problem to me. Surely with something like this the teacher has told you that there is a low value the numbers won't go below and a high value they won't go above.
Or perhaps there will be no more than x hundred or x thousand numbers in the file.
You really need to provide details like that if they exist.

"I pity the fool, thug, or soul who tries to take over the world, then goes home crying to his momma."
- Mr. T

Share on other sites
Like I said, it was just a starting point - not necessarily the most efficient.

Share on other sites

I tried the vector (list) approach and it was already too slow for a 1MB file. With std::map I got quite good results.

Thanks.
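Since only the ~20 biggest and smallest counts are needed, one possible follow-up sketch (topCounts is a made-up helper, not from the thread) copies the map into a vector of pairs and partially sorts it by count:

```cpp
#include <algorithm>
#include <cstddef>
#include <map>
#include <utility>
#include <vector>

typedef unsigned long DWORD;

// Comparator: higher count first.
static bool byCountDesc(const std::pair<DWORD, DWORD> &a,
                        const std::pair<DWORD, DWORD> &b)
{
    return a.second > b.second;
}

// Return up to n (value, count) pairs with the highest counts.
// Flipping the comparator gives the least frequent values instead.
std::vector<std::pair<DWORD, DWORD> >
topCounts(const std::map<DWORD, DWORD> &counts, std::size_t n)
{
    std::vector<std::pair<DWORD, DWORD> > v(counts.begin(), counts.end());
    if (n > v.size())
        n = v.size();
    std::partial_sort(v.begin(), v.begin() + n, v.end(), byCountDesc);
    v.resize(n);
    return v;
}
```

std::partial_sort only orders the first n elements, so for n = 20 out of many distinct values it does less work than a full sort.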

Share on other sites
quote:
Original post by Anonymous Poster

Doesn't the hash table need the array to plug the numbers into? I think you are suggesting the solutions this person already had, but with possibly faster search/indexing.

The idea behind using a hash table or a map is that you aren't likely to encounter all of the numbers (as pointed out, the full array would be 16GB...). Here, you only use the storage you really need. A hash table is typically smaller than the potential number of entries: that's what they're for, storing sparse data.