Fixed a problem when --dump-header - was given with more than one URL,
which caused an error when the second header was dumped due to stdout
being closed.  Added test case 1066 to verify.  Also fixed a potential
problem where a closed file descriptor might be used for an upload
when more than one URL is given.
Author: Dan Fandrich
Date:   2008-08-22 22:57:25 +00:00
Commit: e3ad6d2bd1
Parent: 4b64a8d20d

5 changed files with 97 additions and 12 deletions
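For context, here is a minimal standalone C sketch (not curl's actual source; the dump_headers() helper and the example URLs are made up for illustration) of the two patterns the diffs below apply in src/main.c: treat stdout as a borrowed header stream that is never fclose()d, and keep per-URL input state scoped to the URL loop.

/*
 * Sketch only, assuming a command-line tool that fetches several URLs and
 * can dump response headers to a file or to "-" (stdout), like
 * curl --dump-header.
 */
#include <stdio.h>
#include <string.h>

/* hypothetical stand-in for writing one response's headers */
static void dump_headers(FILE *stream, const char *headers)
{
  fputs(headers, stream);
}

int main(void)
{
  const char *urls[] = { "http://example.com/1", "http://example.com/2" };
  const char *headerfile = "-";      /* what --dump-header - would give us */
  FILE *heads;
  size_t i;

  if(strcmp(headerfile, "-")) {
    heads = fopen(headerfile, "wb"); /* a real file: we own it */
    if(!heads)
      return 1;
  }
  else
    heads = stdout;                  /* "-" means dump to stdout */

  for(i = 0; i < sizeof(urls)/sizeof(urls[0]); i++) {
    /* per-URL resources (curl's upload fd, for example) are opened here,
       inside the loop, so a close at the end of one iteration cannot
       leak into the next URL */
    dump_headers(heads, "HTTP/1.1 200 OK\r\n\r\n");
    /* ...fetch urls[i], then close only this iteration's resources... */
  }

  /* close the header stream only if we opened it ourselves; closing stdout
     here is what broke the second URL's header dump before this fix */
  if(heads != stdout)
    fclose(heads);

  return 0;
}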

CHANGES

@@ -6,6 +6,13 @@
 Changelog
 
+Daniel Fandrich (22 Aug 2008)
+- Fixed a problem when --dump-header - was given with more than one URL,
+  which caused an error when the second header was dumped due to stdout
+  being closed.  Added test case 1066 to verify.  Also fixed a potential
+  problem where a closed file descriptor might be used for an upload
+  when more than one URL is given.
+
 Yang Tse (22 Aug 2008)
 - Improved libcurl's internal curl_m*printf() functions integral data type
   size and signedness handling.

RELEASE-NOTES

@@ -56,6 +56,7 @@ This release includes the following bugfixes:
  o --stderr is now honoured with the -v option
  o memory leak in libcurl on Windows built with OpenSSL
  o improved curl_m*printf() integral data type size and signedness handling
+ o error when --dump-header - used with more than one URL
 
 This release includes the following known bugs:
@@ -77,7 +78,8 @@ advice from friends like these:
 Phil Pellouchoud, Eduard Bloch, John Lightsey, Stephen Collyer, Tor Arntsen,
 Rolland Dudemaine, Phil Blundell, Scott Barrett, Andreas Schuldei,
 Peter Lamberg, David Bau, Pramod Sharma, Yehoshua Hershberg,
-Constantine Sapuntzakis, Lars Nilsson, Andy Tsouladze, Jamie Lokier
+Constantine Sapuntzakis, Lars Nilsson, Andy Tsouladze, Jamie Lokier,
+Vincent Le Normand
 
 Thanks! (and sorry if I forgot to mention someone)

src/main.c

@@ -3890,9 +3890,6 @@ operate(struct Configurable *config, int argc, argv_item_t argv[])
   int infilenum;
   char *uploadfile=NULL; /* a single file, never a glob */
-  int infd = STDIN_FILENO;
-  bool infdopen;
-  FILE *headerfilep = NULL;
   curl_off_t uploadfilesize; /* -1 means unknown */
   bool stillflags=TRUE;
@@ -4126,11 +4123,9 @@ operate(struct Configurable *config, int argc, argv_item_t argv[])
       /* open file for output: */
       if(strcmp(config->headerfile,"-")) {
         heads.filename = config->headerfile;
-        headerfilep=NULL;
       }
       else
-        headerfilep=stdout;
-      heads.stream = headerfilep;
+        heads.stream=stdout;
       heads.config = config;
     }
@@ -4218,6 +4213,8 @@ operate(struct Configurable *config, int argc, argv_item_t argv[])
     for(i = 0;
         (url = urls?glob_next_url(urls):(i?NULL:strdup(url)));
         i++) {
+      int infd = STDIN_FILENO;
+      bool infdopen;
       char *outfile;
       struct timeval retrystart;
       outfile = outfiles?strdup(outfiles):NULL;
@@ -4965,9 +4962,6 @@ show_error:
 #endif
 
 quit_urls:
-      if(headerfilep)
-        fclose(headerfilep);
-
       if(url)
         free(url);
@@ -5026,7 +5020,7 @@ quit_curl:
   if (easycode)
     curl_slist_append(easycode, "curl_easy_cleanup(hnd);");
 
-  if(config->headerfile && !headerfilep && heads.stream)
+  if(heads.stream && (heads.stream != stdout))
     fclose(heads.stream);
 
   if(allocuseragent)

tests/data/Makefile.am

@@ -56,7 +56,7 @@ EXTRA_DIST = test1 test108 test117 test127 test20 test27 test34 test46 \
 	test1040 test1041 test1042 test1043 test1044 test1045 test1046 test1047 \
 	test1048 test1049 test1050 test1051 test1052 test1053 test1054 test1055 \
 	test1056 test1057 test1058 test1059 test1060 test1061 test1062 test1063 \
-	test1064 test1065
+	test1064 test1065 test1066
 
 filecheck:
 	@mkdir test-place; \

tests/data/test1066 (new file, 82 lines)

@@ -0,0 +1,82 @@
<testcase>
<info>
<keywords>
HTTP
HTTP GET
</keywords>
</info>
# Server-side
<reply>
<data nocheck="1">
HTTP/1.1 200 OK
Server: thebest/1.0
Content-Type: text/plain
Content-Length: 6

first
</data>
<data1 nocheck="1">
HTTP/1.1 200 OK
Server: thebest/1.0
Content-Type: text/plain
Content-Length: 7

second
</data1>
</reply>
# Client-side
<client>
<server>
http
</server>
<name>
HTTP --dump-header - with two URLs
</name>
<command>
http://%HOSTIP:%HTTPPORT/want/1066 http://%HOSTIP:%HTTPPORT/want/10660001 --dump-header -
</command>
</client>
# Verify data after the test has been "shot"
<verify>
<strip>
^User-Agent:.*
</strip>
<protocol>
GET /want/1066 HTTP/1.1
Host: %HOSTIP:%HTTPPORT
Accept: */*

GET /want/10660001 HTTP/1.1
Host: %HOSTIP:%HTTPPORT
Accept: */*

</protocol>
<stdout>
HTTP/1.1 200 OK
Server: thebest/1.0
Content-Type: text/plain
Content-Length: 6

first
HTTP/1.1 200 OK
Server: thebest/1.0
Content-Type: text/plain
Content-Length: 7

second
</stdout>
</verify>
</testcase>
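Like curl's other test cases, this one would normally be exercised from the tests directory with the test harness, e.g. ./runtests.pl 1066: the harness starts the test HTTP server, runs the <command> line above against it, and compares the captured protocol and stdout with the <verify> section.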