libcurl non-existing page problem
I am using the fopen() URL example http://curl.haxx.se/lxr/source/docs/examples/fopen.c and I added a few functions for more power (such as fopen with POST data).
Right now, if I try to open a page that doesn't exist, I still get a handle back. When I call feof() it isn't at EOF, and when I call fgets() it locks up in what I assume is an endless loop. (I am debugging with MSVC, and I built my fopen functions and the easy interface with MinGW as a DLL.)
How and where should I put in a check to see if the URL exists? Or a timeout? Or anything else to prevent this from locking up?
C only notices end-of-file after you have attempted to read past the end. Are you checking for the possibility of a null pointer being returned from fgets()?
What I said was a bit off: after calling url_fopen() I attempt to read the page, and it locks up when I call url_fread(). If the page exists, it works as expected.
while(!ufile->url_feof(f)) {
    static char page[1024*1024];
    int n = ufile->url_fread(page, 1, sizeof(page), f);
    fwrite(page, 1, n, stdout);  /* write exactly n bytes; the buffer is not null-terminated */
}
If url_fread() returns 0, you have an error and should break out of that loop. Try that and see what happens.
Found the solution. Keep in mind I am using the fopen example as my code base.
The code has a timeout, but it sits inside an infinite loop which makes it retry forever. So if select() returns -1 (error) or 0 (timeout, i.e. "ok, nothing ready"), I now set still_running to 0; otherwise the default case runs, which calls the multi interface again. It works perfectly: I am able to download images and binary files and haven't run into a problem with it yet. But I haven't done much testing.