Loading Images - Urgent

Started by gaurav khanduja; 15 comments, last by rick_appleton 19 years, 2 months ago
Hi, I am trying to load (and render) 5000 images (TIF format). Can anybody suggest the best way to load all of them together as textures? (I am using 2D textures as I want to provide interactivity.) Please keep in mind that interactivity is very important for me. I will be loading nearly 500 images in one viewport (in all I will have 10 viewports at a time). Thanks, GK
Could you give us some background information on what you're doing? There may be ways to work around loading 5000 images.
Hi,


We are carrying out a simulation of atoms and charge density. The result of the simulation is a 500 x 500 x 500 grid of points. From these I am generating 500 images (for each simulation), assigning colors based on the difference in charge at a point (between the new set and the original set).

I have to render the images generated from the simulation data and provide some operations on them. 500 images constitute a set. We have 10 sets showing the data in different states, which I plan to show in different viewports for comparison. So what do you suggest, how is this possible? I have been trying it, but after 500 images or so the system can't load any more images due to texture memory constraints.
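Roughly, the step that turns one z-slice of the grid into an image looks like the sketch below (a simplified illustration only; the grid size, array names, and the blue-to-red ramp are placeholders, not the actual code):

#include <vector>

const int GRID = 500;

struct RGB { unsigned char r, g, b; };

// Map a signed charge difference onto a simple blue-to-red ramp.
RGB deltaToColor(float delta, float maxAbsDelta)
{
    float t = 0.5f + 0.5f * (delta / maxAbsDelta);   // normalize to [0,1]
    if (t < 0.0f) t = 0.0f;
    if (t > 1.0f) t = 1.0f;
    RGB c;
    c.r = (unsigned char)(255.0f * t);          // red  = charge increased
    c.b = (unsigned char)(255.0f * (1.0f - t)); // blue = charge decreased
    c.g = 0;
    return c;
}

// Build the image for slice z; chargeNew/chargeOld hold GRID*GRID*GRID floats.
std::vector<RGB> buildSlice(const float* chargeNew, const float* chargeOld,
                            int z, float maxAbsDelta)
{
    std::vector<RGB> image(GRID * GRID);
    for (int y = 0; y < GRID; ++y)
        for (int x = 0; x < GRID; ++x)
        {
            int i = (z * GRID + y) * GRID + x;
            image[y * GRID + x] = deltaToColor(chargeNew[i] - chargeOld[i], maxAbsDelta);
        }
    return image;
}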

Thanks
I suggest keeping the data as a set of points and using a tessellated grid to draw it. This way you are loading a small amount of data which is infinitely scalable. If you assign an alpha to each point (perhaps higher alpha for higher delta) you could also display the set in 3D; see the sketch below.
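A minimal sketch of that idea, assuming plain immediate-mode OpenGL and a flat array of per-point charge deltas (the names are illustrative): draw each grid point as a GL_POINT whose alpha grows with the delta, and skip the nearly unchanged points so only interesting data is sent to the card.

#include <GL/gl.h>
#include <cmath>

void drawPointSet(const float* delta, int grid, float maxAbsDelta)
{
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    glBegin(GL_POINTS);
    for (int z = 0; z < grid; ++z)
        for (int y = 0; y < grid; ++y)
            for (int x = 0; x < grid; ++x)
            {
                float d = delta[(z * grid + y) * grid + x];
                float a = fabsf(d) / maxAbsDelta;        // higher delta -> more opaque
                if (a < 0.05f) continue;                 // cull nearly unchanged points
                glColor4f(d > 0.0f ? 1.0f : 0.0f, 0.0f,  // red for gain, blue for loss
                          d > 0.0f ? 0.0f : 1.0f, a);
                glVertex3f((float)x, (float)y, (float)z);
            }
    glEnd();
}

Culling the low-delta points is also what keeps the amount of geometry manageable in practice.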

I am currently working on a visualization tool that might help. It's an interactive tool designed to display large data sets in 3D/real time. So far the company I work with has been dealing with geological data, but the viewer is very flexible. If you're interested, send me an email at thereisnocowlevel@hotmail.com.

Cheers,
- llvllatrix
Quote: Original post by gaurav khanduja
We are carrying out a simulation of atoms and charge density. The result of the simulation is a 500 x 500 x 500 grid of points. From these I am generating 500 images (for each simulation), assigning colors based on the difference in charge at a point (between the new set and the original set).


Here is your problem: 500 RGB images at a resolution of what must be 512x512 (500x500 doesn't give much different values) comes to 512*512*3*500 = 393,216,000 bytes, or 375 megabytes, so there is just no way even one data set will fit in texture memory. And you wanted ten of them; that's over 3.6 gigabytes, which means you would have to render directly from the hard drive.
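(For reference, the same arithmetic as a quick sketch; it ignores anything the driver adds, such as padding or mipmaps, so real usage would only be higher.)

#include <cstdio>

int main()
{
    long long oneImage = 512LL * 512 * 3;   // bytes for one RGB slice
    long long oneSet   = oneImage * 500;    // 393,216,000 bytes
    long long tenSets  = oneSet * 10;
    printf("one set : %lld bytes (%.1f MB)\n", oneSet,  oneSet  / (1024.0 * 1024.0));
    printf("ten sets: %lld bytes (%.2f GB)\n", tenSets, tenSets / (1024.0 * 1024.0 * 1024.0));
    return 0;
}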

Even using the method that llvllatrix suggests takes up a lot of space, so you have to find a way to limit the displayed information somehow, perhaps by prerendering some of the data sets (creating 1 image from the 500 from one viewpoint).
Quote:
Here is your problem: 500 RGB images at a resolution of what must be 512x512 (500x500 doesn't give much different values) comes to 512*512*3*500 = 393,216,000 bytes, or 375 megabytes, so there is just no way even one data set will fit in texture memory. And you wanted ten of them; that's over 3.6 gigabytes, which means you would have to render directly from the hard drive.


Very true, you're at a loss for memory. Using my method, to store an entire dataset it would take 500 * 500 * 500 * 7 (x,y,z,r,g,b,a) * 4 (sizeof(float)) = 3,500,000,000 bytes or 350 megs if the data point locations were not predictable. If they were, it would take 500 * 500 * 500 * 4 (r,g,b,a) * 4 (sizeof(float)) = 2,000,000,000 bytes or 200 megs, and you would still max out your memory for 10 data sets. I'm not too sure how well vertex arrays might help, having never used them.

I think you have two options. Either reduce the sampling on your datasets or have multiple computers displaying the different datasets using your visualization app. I think this shouldn't be too difficult provided the visualization app works. So far I think the largest dataset we have loaded was on the order of 75 megs, ran in real time, and had 6 float components. While a visualization of your dataset may not run in real time per se (a lot of lag), it should at least run.

Cheers,
- llvllatrix
A little note on your numbers there:

3,500,000,000 bytes != 350 MB
3,500,000,000 bytes ≈ 3.26 GB

2,000,000,000 bytes != 200 MB
2,000,000,000 bytes ≈ 1.86 GB

Also, with the right texture compression and a reduction in data depth you could reduce those 375 MB datasets to something more manageable like 50 MB, even 25 or less if the colors are not that important. This depends a little on the data, but you might be able to use some kind of paletted texture or grayscales.
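One way to get the grayscale saving, assuming the images can live with a single intensity channel (just a sketch of the idea, not the poster's code): upload each slice as a GL_LUMINANCE texture, one byte per texel instead of three, which cuts a 375 MB set to roughly 125 MB.

#include <GL/gl.h>

GLuint uploadGrayscaleSlice(const unsigned char* pixels, int width, int height)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    // GL_LUMINANCE8 asks for one byte per texel instead of three for RGB.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE8, width, height, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, pixels);
    return tex;
}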
I have tried reducing the image resolution by a factor of 4, but it still seems impossible to fit all the data in memory. What do you guys suggest? And lc_overlord, it would be great if you could tell me what texture compression scheme you are talking about.... I am a little new to this field so don't be offended by my questions.....

One thing more ... I have reduced the resolution of the images ... what I need to do now is increase the resolution of a particular area of the image.

Thanks
Gaurav
You can use unsigned bytes for colors instead of full floats. But I think lowering the resolution would be very wise.
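A sketch of the byte-versus-float saving (the struct names are made up; the sizes are what matter). OpenGL will take unsigned-byte colors directly through glColor4ub or glColorPointer(..., GL_UNSIGNED_BYTE, ...), so nothing forces you to store floats:

#include <cstdio>

struct ColorFloat { float r, g, b, a; };            // 16 bytes per point
struct ColorByte  { unsigned char r, g, b, a; };    //  4 bytes per point

int main()
{
    long long points     = 500LL * 500 * 500;
    long long floatBytes = points * (long long)sizeof(ColorFloat);
    long long byteBytes  = points * (long long)sizeof(ColorByte);
    printf("float colors: %lld MB\n", floatBytes / (1024 * 1024));
    printf("byte  colors: %lld MB\n", byteBytes  / (1024 * 1024));
    return 0;
}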

You said you're planning on displaying 10 sets at once. To get maximum resolution, each gridpoint should fall on a pixel, hence you'd need an area of more than 500x500 pixels on your screen for each data set. At 1600x1200 you might just get 6 sets on the screen at once. If you lower the data from 500x500x500 to 250x250x250 (making 250 images of 256x256), you not only need 1/8 as much memory (bringing a single dataset to 64 MB), but you'll still be able to view maximum detail for all 10 datasets on screen at once. If you then use DDS files to get compression on the images, you might be able to get this done.
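A sketch of that downsampling step, assuming the raw grid of values is still around (the function and array names are hypothetical): average each 2x2x2 block to halve the resolution along every axis, which divides the memory by 8.

#include <vector>

// src holds n*n*n values; returns an (n/2)^3 grid of 2x2x2 block averages.
std::vector<float> downsampleGrid(const std::vector<float>& src, int n)
{
    int half = n / 2;
    std::vector<float> dst(half * half * half);
    for (int z = 0; z < half; ++z)
        for (int y = 0; y < half; ++y)
            for (int x = 0; x < half; ++x)
            {
                float sum = 0.0f;
                for (int dz = 0; dz < 2; ++dz)
                    for (int dy = 0; dy < 2; ++dy)
                        for (int dx = 0; dx < 2; ++dx)
                            sum += src[((2 * z + dz) * n + (2 * y + dy)) * n + (2 * x + dx)];
                dst[(z * half + y) * half + x] = sum / 8.0f;
            }
    return dst;
}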
When it comes to compression I am a little new to the area myself, but depending on your data, DXT1, DXT3 or DXT5 compression could reduce the image size to about 25-50% without much loss of data.
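A minimal sketch of driver-side DXT compression, assuming the GL_EXT_texture_compression_s3tc extension is available: ask for a compressed internal format and let the driver compress the RGB data on upload. DXT1 stores RGB at 4 bits per texel, so a 512x512 slice drops from 768 KB to 128 KB. (You could also compress offline into DDS files, as rick_appleton mentioned, and upload the blocks with glCompressedTexImage2D.)

#include <GL/gl.h>
#include <GL/glext.h>   // for GL_COMPRESSED_RGB_S3TC_DXT1_EXT

GLuint uploadCompressedSlice(const unsigned char* rgbPixels, int width, int height)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    // A compressed internal format makes the driver compress the data for us.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_S3TC_DXT1_EXT,
                 width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, rgbPixels);
    return tex;
}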

This topic is closed to new replies.
