Compare commits


34 Commits

Author SHA1 Message Date
Daniel Stenberg
6aaee5f23b minor changes 2002-01-03 09:43:17 +00:00
Daniel Stenberg
dd06dcebe1 added required software and Guido Neitzer's Mac OS X build instructions 2002-01-03 09:12:41 +00:00
Daniel Stenberg
b35c26b751 added a little percentage for "ok coverage" 2002-01-03 08:22:05 +00:00
Daniel Stenberg
128f341635 Changed how -I/--head works when --include is also used... Test case 104
stopped working after the dec-20 fixes that now support FTP operations that
skip the transfer phase.
2002-01-03 08:07:29 +00:00
Daniel Stenberg
e48bc1be48 Philip Gladstone's fixes 2002-01-03 07:23:21 +00:00
Daniel Stenberg
0077b9c0a2 pass an 'int' as the third argument to bind() 2002-01-03 00:51:33 +00:00
Daniel Stenberg
fe37fb5921 Philip Gladstone's 64-bit sparc native compiler compatibility issues fixed. 2002-01-02 10:06:47 +00:00
Daniel Stenberg
221ecd0a30 the changes from 1999 is now in CHANGES.1999 2001-12-21 09:55:13 +00:00
Daniel Stenberg
560492707d moved the changes from 1999 into its own file 2001-12-21 09:54:45 +00:00
Daniel Stenberg
dfdf4916fa rewrote 3.9 to be more generic with more languages:
"3.9 How do I use curl in my favourite programming language?"
2001-12-21 09:20:04 +00:00
Daniel Stenberg
97a8c98886 spell 2001-12-21 08:10:34 +00:00
Daniel Stenberg
62fb70e9d1 recent fixes 2001-12-21 08:02:35 +00:00
Daniel Stenberg
8a9098a36c *cool* fix by Björn Stenberg, makes proxy transfers work better...! :-) 2001-12-20 15:58:22 +00:00
Daniel Stenberg
28027c2aa2 If nobody is set we won't download any FTP file. If include_header is set,
we return a set of headers and nothing more. This enables FTP operations that
don't transfer any data and only perform FTP commands.
2001-12-20 11:22:01 +00:00
Daniel Stenberg
d60029d66e Added 4.5.6 "301 Moved Permanently", as a reply to bug report #495215 2001-12-19 23:25:04 +00:00
Daniel Stenberg
226fe8bdf9 Götz Babin-Ebell's contributed "simplessl.c" example source code 2001-12-18 10:13:41 +00:00
Daniel Stenberg
33237b4502 run automake last 2001-12-18 01:00:24 +00:00
Daniel Stenberg
af6c394785 Götz Babin-Ebell's OpenSSL ENGINE patch 2001-12-17 23:01:39 +00:00
Daniel Stenberg
558d12d7f6 strip trailing CRs 2001-12-17 10:32:10 +00:00
Daniel Stenberg
bfa8a6da26 cut off the description to prevent people from using this! 2001-12-17 09:33:54 +00:00
Daniel Stenberg
aa6b3d22a2 Marcus Webster's added CURLFORM_CONTENTHEADER docs 2001-12-16 12:54:42 +00:00
Daniel Stenberg
2eb355733f Marcus Webster's newly added CURLFORM_CONTENTHEADER 2001-12-14 12:59:16 +00:00
Daniel Stenberg
e66cdacb93 minor changes 2001-12-13 07:16:27 +00:00
Daniel Stenberg
c67f2da283 solaris 2.5.1 needs the sys/types.h file before the sys/socket.h 2001-12-11 15:08:27 +00:00
Daniel Stenberg
e192261788 failf() calls should not have newlines in the message string! 2001-12-11 13:13:01 +00:00
Daniel Stenberg
c63ca99c1c when the file name given to -T is used to build an upload path, the local
directory part is now stripped off and only the actual file name part will be
used
2001-12-11 00:48:55 +00:00
Daniel Stenberg
1c99c4ad11 HTTP_PROXY => http_proxy as Björn pointed out 2001-12-10 11:59:05 +00:00
Daniel Stenberg
bbcfc10677 corrected the READFUNCTION docs slightly 2001-12-10 07:46:43 +00:00
Daniel Stenberg
47e67eab26 corrected the comment above gmtime_r 2001-12-07 15:56:57 +00:00
Daniel Stenberg
650b95045d added gmtime_r check 2001-12-07 15:51:59 +00:00
Cris Bailiff
5603134e58 Updated location information for Curl_easy 2001-12-07 09:24:42 +00:00
Daniel Stenberg
d12fd897cb Jason Mancini's -Oalways suggestion 2001-12-06 14:40:16 +00:00
Daniel Stenberg
5e95203a5d let us know if curl compiles on more platforms 2001-12-06 12:48:41 +00:00
Daniel Stenberg
cad4a571ce curl compiles on HURD 2001-12-06 07:11:33 +00:00
38 changed files with 1601 additions and 1074 deletions

CHANGES

@@ -6,6 +6,91 @@
History of Changes
Daniel (3 January 2002)
- As the test case uses --include and then --head, I had to modify src/main.c
to deal with this situation slightly better than previously. When done, we
have 100% good tests again in the main branch.
Daniel (2 January 2002)
- Made test case 25 run again in the multi-dev branch. But it seems that the
changes done on dec-20 made test case 104 cease to work (in both branches).
- Philip Gladstone pointed out a few portability problems in the source code
that didn't compile on 64-bit sparcs using Sun's native compiler...
Daniel (20 December 2001)
- Björn Stenberg caught an unpleasant (but hard-to-find) bug that could cause
libcurl to hang on transfers over proxy, when the proxy was specified with
an environment variable!
- Added code to make ftp operations treat the NO_BODY and HEADERS options
better:
NO_BODY set TRUE and HEADERS set TRUE:
Return a set of headers with file info
NO_BODY set FALSE:
Transfer data as usual, HEADERS is ignored
NO_BODY set TRUE and HEADERS set FALSE:
Don't transfer any data, don't return any headers. Just perform the set
of FTP commands.
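For illustration, a minimal sketch of the "return headers only, skip the
transfer" combination using the libcurl easy interface; CURLOPT_NOBODY and
CURLOPT_HEADER correspond to the NO_BODY and HEADERS settings above, and the
URL is a placeholder, not one from the entry:

#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
  CURL *curl;
  CURLcode res;

  curl_global_init(CURL_GLOBAL_DEFAULT);
  curl = curl_easy_init();
  if(curl) {
    curl_easy_setopt(curl, CURLOPT_URL, "ftp://ftp.example.com/file.txt");
    curl_easy_setopt(curl, CURLOPT_NOBODY, 1L); /* NO_BODY: skip the data transfer */
    curl_easy_setopt(curl, CURLOPT_HEADER, 1L); /* HEADERS: emit the file-info headers */
    res = curl_easy_perform(curl);
    if(res != CURLE_OK)
      fprintf(stderr, "curl_easy_perform() failed: %s\n", curl_easy_strerror(res));
    curl_easy_cleanup(curl);
  }
  curl_global_cleanup();
  return 0;
}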
Daniel (17 December 2001)
- Götz Babin-Ebell dove into the dark dungeons of the OpenSSL ENGINE stuff and
made libcurl support it! This allows libcurl to do SSL connections with the
private key stored in external hardware.
To make this good, he had to add a bunch of new library options that'll be
useful to others as well:
CURLOPT_SSLCERTTYPE: set SSL cert type (PEM/DER)
CURLOPT_SSLKEY: set SSL private key (file)
CURLOPT_SSLKEYTYPE: set SSL key type (PEM/DER/ENG)
CURLOPT_SSLKEYPASSWD: set the passphrase for your private key
(CURLOPT_SSLCERTPASSWD is an alias)
CURLOPT_SSLENGINE: set the name of the crypto engine
(returns CURLE_SSL_ENGINE_NOTFOUND on error)
CURLOPT_SSLENGINE_DEFAULT: set the default engine
There are two new failure codes:
CURLE_SSL_ENGINE_NOTFOUND
CURLE_SSL_ENGINE_SETFAILED
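A rough sketch of how these options fit together with the easy interface; the
engine name, certificate, key identifier and passphrase below are placeholders,
not values taken from the patch:

#include <curl/curl.h>

int main(void)
{
  CURL *curl;

  curl_global_init(CURL_GLOBAL_DEFAULT);
  curl = curl_easy_init();
  if(curl) {
    curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
    /* select the crypto engine; fails with CURLE_SSL_ENGINE_NOTFOUND if unknown */
    if(curl_easy_setopt(curl, CURLOPT_SSLENGINE, "myengine") == CURLE_OK) {
      curl_easy_setopt(curl, CURLOPT_SSLENGINE_DEFAULT, 1L);
      curl_easy_setopt(curl, CURLOPT_SSLCERT, "client-cert.pem");
      curl_easy_setopt(curl, CURLOPT_SSLCERTTYPE, "PEM");
      curl_easy_setopt(curl, CURLOPT_SSLKEY, "rsa_key_id"); /* key kept in the engine */
      curl_easy_setopt(curl, CURLOPT_SSLKEYTYPE, "ENG");
      curl_easy_setopt(curl, CURLOPT_SSLKEYPASSWD, "secret");
      curl_easy_perform(curl);
    }
    curl_easy_cleanup(curl);
  }
  curl_global_cleanup();
  return 0;
}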
Daniel (14 December 2001)
- We have "branched" the source-tree at a few places. Checkout the CVS sources
with the 'multi-dev' label to get the latest multi interface development
tree. The idea is to only branch affected files and to restrict the branch
to the v8 multi interface development only.
*NOTE* that if we get bug reports and patches etc, we might need to apply
them in both branches!
The multi-dev branch is what we are gonna use as main branch in the future
if it turns out successful. Thus, we must maintain both now in case we need
them. The current main branch will be used if we want to release a 7.9.3 or
perhaps a 7.10 release before version 8. Which is very likely.
- Marcus Webster provided code for the new CURLFORM_CONTENTHEADER option for
curl_formadd(), that lets an application add a set of headers for that
particular part in a multipart/form-post. He also provided a section to the
man page that describes the new option.
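As a usage sketch (not taken from the man page section itself), adding a
per-part header with curl_formadd() might look like this; the field name,
content, extra header and URL are made-up examples, and curl_formadd() has
since been deprecated in favour of the MIME API:

#include <curl/curl.h>

int main(void)
{
  CURL *curl;
  struct curl_httppost *post = NULL, *last = NULL;
  struct curl_slist *headers = NULL;

  curl_global_init(CURL_GLOBAL_DEFAULT);

  /* header list that applies to this single form part only */
  headers = curl_slist_append(headers, "Content-Transfer-Encoding: base64");

  curl_formadd(&post, &last,
               CURLFORM_COPYNAME, "data",
               CURLFORM_COPYCONTENTS, "SGVsbG8gd29ybGQ=",
               CURLFORM_CONTENTHEADER, headers,
               CURLFORM_END);

  curl = curl_easy_init();
  if(curl) {
    curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/upload");
    curl_easy_setopt(curl, CURLOPT_HTTPPOST, post);
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
  }
  curl_formfree(post);
  curl_slist_free_all(headers);
  curl_global_cleanup();
  return 0;
}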
Daniel (11 December 2001)
- Ben Greear made me aware of the fact that the Curl_failf() usage internally
was a bit sloppy with adding newlines or not to the error messages. Let's
once and for all say that they do not belong there!
- When uploading files with -T to give a local file name, and you end the URL
with a slash to have the local file name used remote too, we now no longer
use the local directory as well. Only the file part of the -T file name
will be appended to the right of the slash in the URL.
Daniel (7 December 2001)
- Michal Bonino pointed out that Digital Unix doesn't have gmtime_r so the
link failed. Added a configure check and corrected source code.
Version 7.9.2
Daniel (5 December 2001)

CHANGES.0

@@ -1,838 +1,3 @@
Daniel (28 December 1999):
- Tim Verhoeven correctly identified that curl
doesn't support URL formatted file names when getting ftp. Now, there's a
problem with getting very weird file names off FTP servers. RFC 959 defines
that the file name syntax to use should be the same as in the native OS of
the server. Since we don't know the peer server system we currently just
translate the URL syntax into plain letters. It is still better and with
the solaris 2.6-supplied ftp server it works with spaces in the file names.
Daniel (27 December 1999):
- When curl parsed cookies straight off a remote site, it corrupted the input
data, which, if the downloaded headers were stored made very odd characters
in the saved data. Correctly identified and reported by Paul Harrington.
Daniel (13 December 1999):
- General cleanups in the library interface. There had been some bad kludges
added during times of stress and I did my best to clean them off. It was
both regarding the lib API as well as include file confusions.
Daniel (3 December 1999):
- A small --stderr bug was reported by Eetu Ojanen...
- who also brought the suggestion of extending the -X flag to ftp list as
well. So, now it is and the long option is now --request instead. It is
only for ftp list for now (and the former http stuff too of course).
Lars J. Aas (24 November 1999):
- Patched curl to compile and build under BeOS. Doesn't work yet though!
- Corrected the Makefile.am files to allow putting object files in
different directories than the sources.
Version 6.3.1
Daniel (23 November 1999):
- I've had this major disk crash. My good old trust-worthy source disk died
along with the machine that hosted it. Thank goodness most of all the
things I've done are either backed up elsewhere or stored in this CVS
server!
- Michael S. Steuer pointed out a bug in the -F handling
that made curl hang if you posted an empty variable such as '-F name='. It
was one of those old bugs that never have worked properly...
- Jason Baietto pointed out a general flaw in the HTTP
download. Curl didn't complain if it was prematurely aborted before the
entire download was completed. It does now.
Daniel (19 November 1999):
- Chris Maltby very accurately criticized the lack of
return code checks on the fwrite() calls. I did a thorough check for all
occurrences and corrected this.
Daniel (17 November 1999):
- Paul Harrington pointed out that the -m/--max-time option
doesn't work for the slow system calls like gethostbyname()... I don't have
any good fix yet, just a slightly less bad one that makes curl exit hard
when the timeout is reached.
- Bjorn Reese helped me point out a possible problem that might be the reason
why Thomas Hurst experiences problems in his Amiga version.
Daniel (12 November 1999):
- I found a crash in the new cookie file parser. It crashed when you gave
a plain http header file as input...
Version 6.3
Daniel (10 November 1999):
- I kind of found out that the HTTP time-conditional GETs (-z) aren't always
respected by the web server and the document is therefore sent in whole
again, even though it doesn't match the requested condition. After reading
section 13.3.4 of RFC 2616, I think I'm doing the right thing now when I do
my own check as well. If curl thinks the condition isn't met, the transfer
is aborted prematurely (after all the headers have been received).
- After comments from Robert Linden I also rewrote some parts of the man page
to better describe how the -F works.
- Michael Anti put up a new curl download mirror in
China: http://www.pshowing.com/curl/
- I added the list of download mirrors to the README file
- I did add more explanations to the man page
Daniel (8 November 1999):
- I made the -b/--cookie option capable of reading netscape formatted cookie
files as well as normal http-header files. It should be able to
transparently figure out what kind of file it got as input.
Daniel (29 October 1999):
- Another one of Sebastiaan van Erk's ideas (that has been requested before
but I seem to have forgotten who it was), is to add support for ranges in
FTP downloads. As usual, one request is just a request, when they're two
it is a demand. I've added simple support for X-Y style fetches. X has to
be the lower number, though you may omit one of the numbers. Use the -r/
--range switch (previously HTTP-only).
- Sebastiaan van Erk suggested that curl should be
able to show the file size of a specified file. I think this is a splendid
idea and the -I flag is now working for FTP. It displays the file size in
this manner:
Content-Length: XXXX
As it resembles normal headers, and leaves us the opportunity to add more
info in that display if we can come up with more in the future! It also
makes sense since if you access ftp through a HTTP proxy, you'd get the
file size the same way.
I changed the order of the QUOTE command executions. They're now executed
just after the login and before any other command. I made this to enable
quote commands to run before the -I stuff is done too.
- I found out that -D/--dump-header and -V/--version weren't documented in
the man page.
- Many HTTP/1.1 servers do not support ranges. Don't ask me why. I did add
some text about this in the man page for the range option. The thread in
the mailing list that started this was initiated by Michael Anti.
- I get reports about nroff crashes on solaris 2.6+ when displaying the curl
man page. Switch to gnroff instead, it is reported to work(!). Adam Barclay
reported and brought the suggestion.
- In a dialogue with Johannes G. Kristinsson we came
up with the idea to let -H/--header specified headers replace the
internally generated headers, if you happened to select to add a header
that curl normally uses by itself. The advantage with this is not entirely
obvious, but in Johannes' case it means that he can use another Host: than
the one curl would set.
Daniel (27 October 1999):
- Jongki Suwandi brought a nice patch for (yet another) crash when following
a location:. This time you had to follow a https:// server's redirect to
get the core.
Version 6.2
Daniel (21 October 1999):
- I think I managed to remove the suspicious (nil) that has been seen just
before the "Host:" in HTTP requests when -v was used.
- I found out that if you followed a location: when using a proxy, without
having specified http:// in the URL, the protocol part was added once again
when moving to the next URL! (The protocol part has to be added to the
URL when going through a proxy since it has no protocol-guessing system
such as curl has.)
- Benjamin Ritcey reported a core dump under solaris 2.6
with OpenSSL 0.9.4. It turned out this was due to a bad free() in main.c
that occurred after the download was done and completed.
- Benjamin found ftp downloads to show the first line of the download meter
to get written twice, and I removed that problem. It was introduced with
the multiple URL support.
- Dan Zitter correctly pointed out that curl 6.1 and earlier versions didn't
honor RFC 2616 chapter 4 section 2, "Message Headers": "...Field names are
case-insensitive..." HTTP header parsing assumed a certain casing. Dan
also provided me with a patch that corrected this, which I took the liberty
of editing slightly.
- Dan Zitter also provided a nice patch for config.guess to better recognize
the Mac OS X
- Dan also corrected a minor problem in the lib/Makefile that caused linking
to fail on OS X.
Daniel (19 October 1999):
- Len Marinaccio came up with some problems with curl. Since Windows has a
crippled shell, it can't redirect stderr and that causes trouble. I added
--stderr today which allows the user to redirect the stderr stream to a
file or stdout.
Daniel (18 October 1999):
- The configure script now understands the '--without-ssl' flag, which now
totally disable SSL/https support. Previously it wasn't possible to force
the configure script to leave SSL alone. The previous functionality has
been retained. Troy Engel helped test this new one.
Version 6.1
Daniel (17 October 1999):
- I ifdef'ed or commented all the zlib stuff in the sources and configure
script. It turned out we needed to mock more with zlib than I initially
thought, to make it capable of downloading compressed HTTP documents and
uncompress them on the fly. I didn't mean the zlib parts of curl to become
more than minor so this means I halt the zlib expedition for now and wait
until someone either writes the code or zlib gets updated and better
adjusted for this kind of usage. I won't get into details here, but a
short summary is suitable:
- zlib can't automatically detect whether to use zlib or gzip
decompression methods.
- zlib is very neat for reading gzipped files from a file descriptor,
although not as nice for reading buffer-based data such as we would
want it.
- there are still some problems with the win32 version when reading from
a file descriptor if that is a socket
Daniel (14 October 1999):
- Moved the (external) include files for libcurl into a subdirectory named
curl and adjusted all #include lines to use <curl/XXXX> to maintain a
better name space and control of the headers. This has been requested.
Daniel (12 October 1999):
- I modified the 'maketgz' script to perform a 'make' too before a release
archive is put together in an attempt to make the time stamps better and
hopefully avoid the double configure-running that used to occur.
Daniel (11 October 1999):
- Applied Jörn's patches that fix zlib for mingw32 compiles as well as
some other missing zlib #ifdef and more text on the multiple URL docs in
the man page.
Version 6.1beta
Daniel (6 October 1999):
- Douglas E. Wegscheid sent me a patch that made the exact same thing as I
just made: the -d switch is now capable of reading post data from a named
file or stdin. Use it similarly to the -F. To read the post data from a
given file:
curl -d @path/to/filename www.postsite.com
or let curl read it out from stdin:
curl -d @- www.postit.com
Jörn Hartroth (3 October 1999):
- Brought some more patches for multiple URL functionality. The MIME
separation ideas are almost scrapped now, and a custom separator is being
used instead. This is still compile-time "flagged".
Daniel
- Updated curl.1 with multiple URL info.
Daniel (30 September 1999):
- Felix von Leitner brought openssl-check fixes for configure.in to work
out-of-the-box when the openssl files are installed in the system default
dirs.
Daniel (28 September 1999)
- Added libz functionality. This should enable decompressing gzip, compress
or deflate encoding HTTP documents. It also makes curl send an accept that
it accepts that kind of encoding. Compressed contents usually shortens
download time. I *need* someone to tell me a site that uses compressed HTTP
documents so that I can test this out properly.
- As a result of the adding of zlib awareness, I changed the version string
a little. I plan to add openldap version reporting in there too.
Daniel (17 September 1999)
- Made the -F option allow stdin when specifying files. By using '-' instead
of file name, the data will be read from stdin.
Version 6.0
Daniel (13 September 1999)
- Added -X/--http-request <request> to enable any HTTP command to be sent.
Do note that your server has to support the exact string you enter. This
should possibly be a string like DELETE or TRACE.
- Applied Douglas' mingw32-fixes for the makefiles.
Daniel (10 September 1999)
- Douglas E. Wegscheid pointed out a problem. Curl didn't check the FTP
servers return code properly after the --quote commands were issued. It
took anything non 200 as an error, when all 2XX codes should be accepted as
OK.
- Sending cookies to the same site in multiple lines like curl used to do
turned out to be bad and breaking the cookie specs. Curl now sends all
cookies on a single Cookie: line. Curl is not yet RFC 2109 compliant, but I
doubt that many servers do use that syntax (yet).
Daniel (8 September 1999)
- Jörn helped me make sure it still compiles nicely with mingw32 under win32.
Daniel (7 September 1999)
- FTP upload through proxy is now turned into a HTTP PUT. Requested by
Stefan Kanthak.
- Added the ldap files to the .m32 makefile.
Daniel (3 September 1999)
- Made cookie matching work while using HTTP proxy.
Bjorn Reese (31 August 1999)
- Passed his ldap:// patch. Note that this requires the openldap shared
library to be installed and that LD_LIBRARY_PATH points to the
directory where the lib will be found when curl is run with a
ldap:// URL.
Jörn Hartroth (31 August 1999)
- Made the Mingw32 makefiles into single files.
- Made file:// work for Win32. The same code is now used for unix as well for
performance reasons.
Douglas E. Wegscheid (30 August 1999)
- Patched the Mingw32 makefiles for SSL builds.
Matthew Clarke (30 August 1999)
- Made a cool patch for configure.in to allow --with-ssl to specify the
root dir of the openssl installation, as in
./configure --with-ssl=/usr/ssl_here
- Corrected the 'reconf' script to work better with some shells.
Jörn Hartroth (26 August 1999)
- Fixed the Mingw32 makefiles in lib/ and corrected the file.c for win32
compiles.
Version 5.11
Daniel (25 August 1999)
- John Weismiller pointed out a bug in the header-line
realloc() system in download.c.
- I added lib/file.[ch] to offer a first, simple, file:// support. It
probably won't do much good on win32 system at this point, but I see it
as a start.
- Made the release archives get a Makefile in the root dir, which can be
used to start the compiling/building process easier. I haven't really
changed any INSTALL text yet, I wanted to get some feed-back on this
first.
Daniel (17 August 1999)
- Another Location: bug. Curl didn't do proper relative locations if the
original URL had cgi-parameters that contained a slash. Nusu's page
again.
- Corrected the NO_PROXY usage. It is a list of substrings that if one of
them matches the tail of the host name it should connect to, curl should
not use a proxy to connect there. Pointed out to me by Douglas
E. Wegscheid. I also changed the README text a little regarding this.
Daniel (16 August 1999)
- Fixed a memory bug with http-servers that sent Location: to a Location:
page. Nusu's page showed this too.
- Made cookies work a lot better. Setting the same cookie name several times
used to add more cookies instead of replacing the former one which it
should've. Nusu <nus at intergorj.ro> brought me an URL that made this
painfully visible...
Troy (15 August 1999)
- Brought new .spec files as well as a patch for configure.in that lets the
configure script find the openssl files better, even when the include
files are in /usr/include/openssl
Version 5.10
Daniel (13 August 1999)
- SSL_CTX_set_default_passwd_cb() has been modified in the 0.9.4 version of
OpenSSL. Now why couldn't they simply add a *new* function instead of
modifying the parameters of an already existing function? This way, we get
a compiler warning if compiling with 0.9.4 but not with earlier. So, I had
to come up with a #if construction that deals with this...
- Made curl output the SSL version number get displayed properly with 0.9.4.
Troy (12 August 1999)
- Added MingW32 (GCC-2.95) support under Win32. The INSTALL file was also
a bit rearranged.
Daniel (12 August 1999)
- I had to copy a good <arpa/telnet.h> include file into the curl source
tree to enable the silly win32 systems to compile. The distribution rights
allows us to do that as long as the file remains unmodified.
- I corrected a few minor things that made the compiler complain when
-Wall -pedantic was used.
- I'm moving the official curl web page to http://curl.haxx.nu. I think it
will make it easier to remember as it is a lot shorter and less cryptic.
The old one still works and shows the same info.
Daniel (11 August 1999)
- Albert Chin-A-Young mailed me another correction for NROFF in the
configure.in that is supposed to be better for IRIX users.
Daniel (10 August 1999)
- Albert Chin-A-Young helped me with some stupid Makefile things, as well as
some fiddling with the getdate.c stuff that he had problems with under
HP-UX v10. getdate.y will now be compiled into getdate.c if the appropriate
yacc or bison is found by the configure script. Since this is slightly new,
we need to test the output getdate.c with win32 systems to make sure it
still compiles there.
Daniel (5 August 1999)
- I've just setup a new mailing list with the intention to keep discussions
around libcurl development in it. I mainly expect it to be for thoughts and
brainstorming around a "next generation" library, rather than nitpicking
about the current implementation or details in the current libcurl.
To join our happy bunch of future-looking geeks, enter 'subscribe
<address>' in the body of a mail and send it to
libcurl-request@listserv.fts.frontec.se. Curl bug reports, the usual curl
talk and everything else should still be kept in this mailing list. I've
started to archive this mailing list and have put the libcurl web page at
www.fts.frontec.se/~dast/libcurl/.
- Stefan Kanthak contacted me regarding a few problems in the configure
script which he discovered when trying to make curl compile and build under
Siemens SINIX-Z V5.42B2004!
- Marcus Klein very accurately informed me that src/version.h was not present
in the CVS repository. Oh, how silly...
- Linus Nielsen rewrote the telnet:// part and now curl offers limited telnet
support. If you run curl like 'curl telnet://host' you'll get all output on
the screen and curl will read input from stdin. You'll be able to login and
run commands etc, but since the output is buffered, expect to get a little
weird output.
This is still in its infancy and it might get changed. We need your
feed-back and input in how this is best done.
WIN32 NOTE: I bet we'll get problems when trying to compile the current
lib/telnet.c on win32, but I think we can sort them out in time.
- David Sanderson reported that FORCE_ALLOCA_H or HAVE_ALLOCA_H must be
defined for getdate.c to compile properly on HP-UX 11.0. I updated the
configure script to check for alloca.h which should make it.
Daniel (4 August 1999)
- I finally got to understand Marcus Klein's ftp download resume problem,
which turns out to be due to different outputs from different ftp
servers. It makes ftp download resuming a little trickier, but I've made
some modifications I really believe will work for most ftp servers and I do
hope you report if you have problems with this!
- Added text about file transfer resuming to README.curl.
Daniel (2 August 1999)
- Applied a progress-bar patch from Lars J. Aas. It offers
a new styled progress bar enabled with -#/--progress-bar.
T. Yamada <tai at imasy.or.jp> (30 July 1999)
- It breaks with segfault when 1) curl is using .netrc to obtain
username/password (option '-n'), and 2) is automatically redirected to
another location (option '-L').
There is a small bug in lib/url.c (block starting from line 641), which
tries to take out username/password from user-supplied command-line
argument ('-u' option). This block is never executed on first attempt since
CONF_USERPWD bit isn't set at first, but curl later turns it on when it
checks for CONF_NETRC bit. So when curl tries to redo everything due to
redirection, it segfaults trying to access *data->userpwd.
Version 5.9.1
Daniel (30 July 1999)
- Steve Walch pointed out that there is a memory leak in the formdata
functions. I added a FormFree() function that is now used and supposed to
correct this flaw.
- Mark Wotton reported:
'curl -L https://www.cwa.com.au/' core dumps. I managed to cure this by
correcting the cleanup procedure. The bug seems to be gone with my OpenSSL
0.9.2b, although still occurs when I run the ~100 years old SSLeay 0.8.0. I
don't know whether it is curl or SSLeay that is to blame for that.
- Marcus Klein:
Reported an FTP upload resume bug that I really can't repeat nor understand.
I leave it here so that it won't be forgotten.
Daniel (29 July 1999)
- Costya Shulyupin suggested support for longer URLs when following Location:
and I could only agree and fix it!
- Leigh Purdie found a problem in the upload/POST department. It turned out
that http.c accidentally cleared the pointer instead of the byte counter
when supposed to.
- Costya Shulyupin pointed out a problem with port numbers and Location:. If
you had a server at a non-standard port that redirected to an URL using a
standard port number, curl still used that first port number.
- Ralph Beckmann pointed out a problem when using both CONF_FOLLOWLOCATION
and CONF_FAILONERROR simultaneously. Since the CONF_FAILONERROR exits on
the 302-code that the follow location header outputs it will never show any
html on location: pages. I have now made it look for >=400 codes if
CONF_FOLLOWLOCATION is set.
- 'struct slist' is now renamed to 'struct curl_slist' (as suggested by Ralph
Beckmann).
- Joshua Swink and Rick Welykochy were the first to point out to me that the
latest OpenSSL package now have moved the standard include path. It is now
in /usr/local/ssl/include/openssl and I have now modified the --enable-ssl
option for the configure script to use that as the primary path, and I
leave the former path too to work with older packages of OpenSSL too.
Daniel (9 June 1999)
- I finally understood the IRIX problem and now it seems to compile on it!
I am gonna remove those #define strcasecmp() things once and for all now.
Daniel (4 June 1999)
- I adjusted the FTP reply 227 parser to make the PASV command work better
with more ftp servers. Apparently the Roxen Challenger server replied
something curl 5.9 could deal with! :-( Reported by Ashley Reid-Montanaro
and Mark Butler brought a solution for it.
Daniel (26 May 1999)
- Rearranged. README is new, the old one is now README.curl and I added a
README.libcurl with text I got from Ralph Beckmann.
- I also updated the INSTALL text.
Daniel (25 May 1999)
- David Jonathan Lowsky correctly pointed out that curl didn't properly deal
with form posting where the variable shouldn't have any content, as in curl
-F "form=" www.site.com. It was now fixed.
Version 5.9
Daniel (22 May 1999)
- I've got a bug report from Aaron Scarisbrick in which he states he has some
problems with -L under FreeBSD 3.0. I have previously got another bug
report from Stefan Grether which points at an error with similar symptoms
when using win32. I made the allocation of the new url string a bit faster
and different, don't know if it actually improves anything though...
Daniel (20 May 1999)
- Made the cookie parser deal with CRLF newlines too.
Daniel (19 May 1999)
- Download() didn't properly deal with failing return codes from the sread()
function. Adam Coyne found the problem in the win32 version, and Troy Engel
helped me out isolating it.
Daniel (16 May 1999)
- Richard Adams pointed out a bug I introduced in 5.8. --dump-header doesn't
work anymore! :-/ I fixed it now.
- After a suggestion by Joshua Swink I added -S / --show-error to force curl
to display the error message in case of an error, even if -s/--silent was
used.
Daniel (10 May 1999)
- I moved the stuff concerning HTTP, DICT and TELNET into their own source
files now. It is a beginning on my clean-up of the sources to make them
layer all those protocols better to enable more to be added easier in the
future!
- Leon Breedt sent me some files I've not put into the main curl
archive. They're for creating the Debian package thingie. He also sent me a
debian package that I've made available for download at the web page
Daniel (9 May 1999)
- Made it compile on cygwin too.
Troy Engel (7 May 1999)
- Brought a series of patches to allow curl to compile smoothly on MSVC++ 6
again!
Daniel (6 May 1999)
- I changed the #ifdef HAVE_STRFTIME placement for the -z code so that it
will be easier to discover systems that don't have that function and thus
can't use -z successfully. Made the strftime() get used if WIN32 is defined
too.
Version 5.8
Daniel (5 May 1999)
- I've had it with this autoconf/automake mess. It seems to work all right
for most people who don't have automake installed, but for those who have
there are problems all over.
I've got like five different bug reports on this only the last
week... Claudio Neves and Federico Bianchi and root <duggerj001 at
hawaii.rr.com> are some of them reporting this.
Currently, I have no really good fix since I want to use automake myself to
generate the Makefile.in files. I've found out that the @SHELL@-problems
can often be fixed by manually invoking 'automake' in the archive root
before you run ./configure... I've hacked my maketgz script now to fiddle
a bit with this and my tests seem to work better than before at least!
Daniel (4 May 1999)
- mkhelp.pl has been doing badly lately. I corrected a case problem in
the regexes.
- I've now remade the -o option to not touch the file unless it needs to.
I had to do this to make -z option really fine, since now you can make a
curl fetch and use a local copy's time when downloading to that file, as
in:
curl -z dump -o dump remote.site.com/file.html
This will only get the file if the remote one is newer than the local.
I'm aware that this alters previous behaviour a little. Some scripts out
there may depend on that the file is always touched...
- Corrected a bug in the SSLv2/v3 selection.
- Felix von Leitner requested that curl should be able to send
"If-Modified-Since" headers, which indeed is a fair idea. I implemented it
right away! Try -z <expression> where expression is a full GNU date
expression or a file name to get the date from!
Stephan Lagerholm (30 Apr 1999)
- Pointed out a problem with the src/Makefile for FreeBSD. The RM variable
isn't set and causes the make to fail.
Daniel (26 April 1999)
- Am I silly or what? Irving Wolfe pointed out to me that the curl version
number was not set properly. Hasn't been since 5.6. This was due to a bug
in my maketgz script!
David Eriksson (25 Apr 1999)
- Found a bug in cookies.c that made it crash at times.
Version 5.7.1
Doug Kaufman (23 Apr 1999)
- Brought two sunos 4 fixes. One of them being the hostip.c fix mentioned
below and the other one a correction in include/stdcheaders.h
- Added a paragraph about compiling with the US-version of openssl to the
INSTALL file.
Daniel
- New mailing list address. Info updated on the web page as well as in the
README file
Greg Onufer (20 Apr 1999)
- hostip.c didn't compile properly on SunOS 5.5.1.
It needs an #include <sys/types.h>
Version 5.7
Daniel (Apr 20 1999)
- Decided to upload a non-beta version right now!
- Made curl support any-length HTTP headers. The destination buffer is now
simply enlarged every time it turns out to be too small!
- Added the FAQ file to the archive. Still a bit smallish, but it is a
start.
Eric Thelin (15 Apr 1999)
- Made -D accept '-' instead of filename to write to stdout.
Version 5.6.3beta
Daniel (Apr 12 1999)
- Changed two #ifdef WIN32 to better #ifdef <errorcode> when connect()ing
in url.c and ftp.c. Makes cygwin32 deal with them better too. We should
try to get some decent win32-replacement there. Anyone?
- The old -3/--crlf option is now ONLY --crlf!
- I changed the "SSL fix" to a more lame one, but that doesn't remove as
much functionality. Now I've enabled the lib to select what SSL version it
should try first. Apparently some older SSL-servers don't like it when you
talk v3 with them so you need to be able to force curl to talk v2 from the
start. The fix dated April 6 and posted on the mailing list forced curl to
use v2 at all times using a modern OpenSSL version, but we don't really
want such a crippled solution.
- Marc Boucher sent me a patch that corrected a math error for the
"Curr.Speed" progress meter.
- Eric Thelin sent me a patch that enables '-K -' to read a config file from
stdin.
- I found out we didn't close the file properly before so I added it!
Daniel (Apr 9 1999)
- Yu Xin pointed out a problem with ftp download resume. It didn't work at
all! ;-O
Daniel (Apr 6 1999)
- Corrected the version string part generated for the SSL version.
- I found a way to make some other SSL page work with openssl 0.9.1+ that
previously didn't (ssleay 0.8.0 works with it though!). Trying to get
some real info from the OpenSSL guys to see how I should do to behave the
best way. SSLeay 0.8.0 shouldn't be that much in use anyway these days!
Version 5.6.2beta
Daniel (Apr 4 1999)
- Finally have curl more cookie "aware". Now read carefully. This is how
it works.
To make curl read cookies from an already existing file, in plain header-
format (like from the headers of a previous fetch) invoke curl with the
-b flag like:
curl -b file http://site/foo.html
Curl will then use all cookies it finds matching. The old style that sets
a single cookie with -b is still supported and is used if the string
following -b includes a '=' letter, as in "-b name=daniel".
To make curl read the cookies sent in combination with a location: (which
sites often do) point curl to read a non-existing file at first (i.e
to start with no existing cookies), like:
curl -b nowhere http://site/setcookieandrelocate.html
- Added a paragraph in the TODO file about the SSL problems recently
reported. Evidently, some kind of SSL-problem curl may need to address.
- Better "Location:" following.
Douglas E. Wegscheid (Tue, 30 Mar 1999)
- A subsecond display patch.
Daniel (Mar 14 1999)
- I've separated the version number of libcurl and curl now. To make
things a little easier, I decided to start the curl numbering from
5.6 and the former version number known as "curl" is now the one
set for libcurl.
- Removed the 'enable-no-pass' from configure, I doubt anyone wanted
that.
- Made lots of tiny adjustments to compile smoothly with cygwin under
win32. It's a killer for porting this to win32, bye bye VC++! ;-)
Compiles and builds out-of-the-box now. See the new wordings in
INSTALL for details.
- Beginning experiments with downloading multiple documents from a http
server while remaining connected.
Version 5.6beta
Daniel (Mar 13 1999)
- Since I've changed so much, I thought I'd just go ahead and implement the
suggestion from Douglas E. Wegscheid. -D or --dump-header is now storing
HTTP headers separately in the specified file.
- Added new text to INSTALL on what to do to build this on win32 now.
- Aaargh. I had to take a step back and prefix the shared #include files
in the sources with "../include/" to please VC++...
Daniel (Mar 12 1999)
- Split the url.c source into many tiny sources for better readability
and smaller size.
Daniel (Mar 11 1999)
- Started to change stuff for a move to make libcurl and a more separate
curl application that uses the libcurl. Made the libcurl sources into
the new lib directory while the curl application will remain in src as
before. New makefiles, adjusted configure script and so.
libcurl.a built quickly and easily. I better make a better interface to
the lib functions though.
The new root dir include/ is supposed to contain the public information
about the new libcurl. It is a little ugly so far :-)
Daniel (Mar 1 1999)
- Todd Kaufmann sent me a good link to Netscape's cookie spec as well as the
info that RFC 2109 specifies how to use them. The link is now in the
README and the RFC in the RESOURCES.
Daniel (Feb 23 1999)
- Finally made configure accept --with-ssl to look for SSL libs and includes
in the "standard" place /usr/local/ssl...
Daniel (Feb 22 1999)
- Verified that curl linked fine with OpenSSL 0.9.1c which seems to be
the most recent.
Henri Gomez (Fri Feb 5 1999)
- Sent in an updated curl-ssl.spec. I still miss the script that builds an
RPM automatically...
Version 5.5.1
Mark Butler (27 Jan 1999)
- Corrected problems in Download().
Daniel Stenberg (25 Jan 1999)
- Jeremie Petit pointed out a few flaws in the source that prevented it from
compile warning free with the native compiler under Digital Unix v4.0d.
Version 5.5
Daniel Stenberg (15 Jan 1999)
- Added Bjorns small text to the README about the DICT protocol.
Daniel Stenberg (11 Jan 1999)
- <jswink at softcom.net> reported about the win32 version: "Doesn't use
ALL_PROXY environment variable". Turned out to be because of the static-
buffer nature of the win32 environment variable calls!
Bjorn Reese (10 Jan 1999)
- I have attached a simple addition for the DICT protocol (RFC 2229).
It performs dictionary lookups. The output still needs to be better
formatted.
To test it try (the exact format, and more examples are described in
the RFC)
dict://dict.org/m:hello
dict://dict.org/m:hello::soundex
Vicente Garcia (10 Jan 1999)
- Corrected the progress meter for files larger than 20MB.
Daniel Stenberg (7 Jan 1999)
- Corrected the -t and -T help texts. They claimed to be FTP only.
Version 5.4
Daniel Stenberg
(7 Jan 1999)
- Irving Wolfe reported that curl -s didn't always suppress the progress
reporting. It was the form post that automatically always switched it on
again. This is now corrected!
(4 Jan 1999)
- Andreas Kostyrka suggested I'd add PUT and he helped me out to test it. If
you use -t or -T now on a http or https server, PUT will be used for file
upload.
I removed the former use of -T with HTTP. I doubt anyone ever really used
that.
(4 Jan 1999)
- Erik Jacobsen found a width bug in the mprintf() function. I corrected it
now.
(4 Jan 1999)
- As John V. Chow pointed out to me, curl accepted very limited URL sizes. It
should now accept path parts that are up to at least 4096 bytes.
- Somehow I screwed up when applying the AIX fix from Gilbert Ramirez, so
I redid that now.
Version 5.3a (win32 only)
Troy Engel

CHANGES.1999 (new file)

@@ -0,0 +1,835 @@
Daniel (28 December 1999):
- Tim Verhoeven correctly identified that curl
doesn't support URL formatted file names when getting ftp. Now, there's a
problem with getting very weird file names off FTP servers. RFC 959 defines
that the file name syntax to use should be the same as in the native OS of
the server. Since we don't know the peer server system we currently just
translate the URL syntax into plain letters. It is still better and with
the solaris 2.6-supplied ftp server it works with spaces in the file names.
Daniel (27 December 1999):
- When curl parsed cookies straight off a remote site, it corrupted the input
data, which, if the downloaded headers were stored made very odd characters
in the saved data. Correctly identified and reported by Paul Harrington.
Daniel (13 December 1999):
- General cleanups in the library interface. There had been some bad kludges
added during times of stress and I did my best to clean them off. It was
both regarding the lib API as well as include file confusions.
Daniel (3 December 1999):
- A small --stderr bug was reported by Eetu Ojanen...
- who also brought the suggestion of extending the -X flag to ftp list as
well. So, now it is and the long option is now --request instead. It is
only for ftp list for now (and the former http stuff too of course).
Lars J. Aas (24 November 1999):
- Patched curl to compile and build under BeOS. Doesn't work yet though!
- Corrected the Makefile.am files to allow putting object files in
different directories than the sources.
Version 6.3.1
Daniel (23 November 1999):
- I've had this major disk crash. My good old trust-worthy source disk died
along with the machine that hosted it. Thank goodness most of all the
things I've done are either backed up elsewhere or stored in this CVS
server!
- Michael S. Steuer pointed out a bug in the -F handling
that made curl hang if you posted an empty variable such as '-F name='. It
was one of those old bugs that never have worked properly...
- Jason Baietto pointed out a general flaw in the HTTP
download. Curl didn't complain if it was prematurely aborted before the
entire download was completed. It does now.
Daniel (19 November 1999):
- Chris Maltby very accurately criticized the lack of
return code checks on the fwrite() calls. I did a thorough check for all
occurrences and corrected this.
Daniel (17 November 1999):
- Paul Harrington pointed out that the -m/--max-time option
doesn't work for the slow system calls like gethostbyname()... I don't have
any good fix yet, just a slightly less bad one that makes curl exit hard
when the timeout is reached.
- Bjorn Reese helped me point out a possible problem that might be the reason
why Thomas Hurst experiences problems in his Amiga version.
Daniel (12 November 1999):
- I found a crash in the new cookie file parser. It crashed when you gave
a plain http header file as input...
Version 6.3
Daniel (10 November 1999):
- I kind of found out that the HTTP time-conditional GETs (-z) aren't always
respected by the web server and the document is therefore sent in whole
again, even though it doesn't match the requested condition. After reading
section 13.3.4 of RFC 2616, I think I'm doing the right thing now when I do
my own check as well. If curl thinks the condition isn't met, the transfer
is aborted prematurely (after all the headers have been received).
- After comments from Robert Linden I also rewrote some parts of the man page
to better describe how the -F works.
- Michael Anti put up a new curl download mirror in
China: http://www.pshowing.com/curl/
- I added the list of download mirrors to the README file
- I did add more explanations to the man page
Daniel (8 November 1999):
- I made the -b/--cookie option capable of reading netscape formatted cookie
files as well as normal http-header files. It should be able to
transparently figure out what kind of file it got as input.
Daniel (29 October 1999):
- Another one of Sebastiaan van Erk's ideas (that has been requested before
but I seem to have forgotten who it was), is to add support for ranges in
FTP downloads. As usual, one request is just a request, when they're two
it is a demand. I've added simple support for X-Y style fetches. X has to
be the lower number, though you may omit one of the numbers. Use the -r/
--range switch (previously HTTP-only).
- Sebastiaan van Erk suggested that curl should be
able to show the file size of a specified file. I think this is a splendid
idea and the -I flag is now working for FTP. It displays the file size in
this manner:
Content-Length: XXXX
As it resembles normal headers, and leaves us the opportunity to add more
info in that display if we can come up with more in the future! It also
makes sense since if you access ftp through a HTTP proxy, you'd get the
file size the same way.
I changed the order of the QUOTE command executions. They're now executed
just after the login and before any other command. I made this to enable
quote commands to run before the -I stuff is done too.
- I found out that -D/--dump-header and -V/--version weren't documented in
the man page.
- Many HTTP/1.1 servers do not support ranges. Don't ask me why. I did add
some text about this in the man page for the range option. The thread in
the mailing list that started this was initiated by Michael Anti.
- I get reports about nroff crashes on solaris 2.6+ when displaying the curl
man page. Switch to gnroff instead, it is reported to work(!). Adam Barclay
reported and brought the suggestion.
- In a dialogue with Johannes G. Kristinsson we came
up with the idea to let -H/--header specified headers replace the
internally generated headers, if you happened to select to add a header
that curl normally uses by itself. The advantage with this is not entirely
obvious, but in Johannes' case it means that he can use another Host: than
the one curl would set.
Daniel (27 October 1999):
- Jongki Suwandi brought a nice patch for (yet another) crash when following
a location:. This time you had to follow a https:// server's redirect to
get the core.
Version 6.2
Daniel (21 October 1999):
- I think I managed to remove the suspicious (nil) that has been seen just
before the "Host:" in HTTP requests when -v was used.
- I found out that if you followed a location: when using a proxy, without
having specified http:// in the URL, the protocol part was added once again
when moving to the next URL! (The protocol part has to be added to the
URL when going through a proxy since it has no protocol-guessing system
such as curl has.)
- Benjamin Ritcey reported a core dump under solaris 2.6
with OpenSSL 0.9.4. It turned out this was due to a bad free() in main.c
that occurred after the download was done and completed.
- Benjamin found ftp downloads to show the first line of the download meter
to get written twice, and I removed that problem. It was introduced with
the multiple URL support.
- Dan Zitter correctly pointed out that curl 6.1 and earlier versions didn't
honor RFC 2616 chapter 4 section 2, "Message Headers": "...Field names are
case-insensitive..." HTTP header parsing assumed a certain casing. Dan
also provided me with a patch that corrected this, which I took the liberty
of editing slightly.
- Dan Zitter also provided a nice patch for config.guess to better recognize
the Mac OS X
- Dan also corrected a minor problem in the lib/Makefile that caused linking
to fail on OS X.
Daniel (19 October 1999):
- Len Marinaccio came up with some problems with curl. Since Windows has a
crippled shell, it can't redirect stderr and that causes trouble. I added
--stderr today which allows the user to redirect the stderr stream to a
file or stdout.
Daniel (18 October 1999):
- The configure script now understands the '--without-ssl' flag, which now
totally disable SSL/https support. Previously it wasn't possible to force
the configure script to leave SSL alone. The previous functionality has
been retained. Troy Engel helped test this new one.
Version 6.1
Daniel (17 October 1999):
- I ifdef'ed or commented all the zlib stuff in the sources and configure
script. It turned out we needed to mock more with zlib than I initially
thought, to make it capable of downloading compressed HTTP documents and
uncompress them on the fly. I didn't mean the zlib parts of curl to become
more than minor so this means I halt the zlib expedition for now and wait
until someone either writes the code or zlib gets updated and better
adjusted for this kind of usage. I won't get into details here, but a
short summary is suitable:
- zlib can't automatically detect whether to use zlib or gzip
decompression methods.
- zlib is very neat for reading gzipped files from a file descriptor,
although not as nice for reading buffer-based data such as we would
want it.
- there are still some problems with the win32 version when reading from
a file descriptor if that is a socket
Daniel (14 October 1999):
- Moved the (external) include files for libcurl into a subdirectory named
curl and adjusted all #include lines to use <curl/XXXX> to maintain a
better name space and control of the headers. This has been requested.
Daniel (12 October 1999):
- I modified the 'maketgz' script to perform a 'make' too before a release
archive is put together in an attempt to make the time stamps better and
hopefully avoid the double configure-running that used to occur.
Daniel (11 October 1999):
- Applied Jörn's patches that fix zlib for mingw32 compiles as well as
some other missing zlib #ifdef and more text on the multiple URL docs in
the man page.
Version 6.1beta
Daniel (6 October 1999):
- Douglas E. Wegscheid sent me a patch that made the exact same thing as I
just made: the -d switch is now capable of reading post data from a named
file or stdin. Use it similarly to the -F. To read the post data from a
given file:
curl -d @path/to/filename www.postsite.com
or let curl read it out from stdin:
curl -d @- www.postit.com
Jörn Hartroth (3 October 1999):
- Brought some more patches for multiple URL functionality. The MIME
separation ideas are almost scrapped now, and a custom separator is being
used instead. This is still compile-time "flagged".
Daniel
- Updated curl.1 with multiple URL info.
Daniel (30 September 1999):
- Felix von Leitner brought openssl-check fixes for configure.in to work
out-of-the-box when the openssl files are installed in the system default
dirs.
Daniel (28 September 1999)
- Added libz functionality. This should enable decompressing gzip, compress
or deflate encoding HTTP documents. It also makes curl send an accept that
it accepts that kind of encoding. Compressed contents usually shortens
download time. I *need* someone to tell me a site that uses compressed HTTP
documents so that I can test this out properly.
- As a result of the adding of zlib awareness, I changed the version string
a little. I plan to add openldap version reporting in there too.
Daniel (17 September 1999)
- Made the -F option allow stdin when specifying files. By using '-' instead
of file name, the data will be read from stdin.
Version 6.0
Daniel (13 September 1999)
- Added -X/--http-request <request> to enable any HTTP command to be sent.
Do note that your server has to support the exact string you enter. This
should possibly be a string like DELETE or TRACE.
- Applied Douglas' mingw32-fixes for the makefiles.
Daniel (10 September 1999)
- Douglas E. Wegscheid pointed out a problem. Curl didn't check the FTP
servers return code properly after the --quote commands were issued. It
took anything non 200 as an error, when all 2XX codes should be accepted as
OK.
- Sending cookies to the same site in multiple lines like curl used to do
turned out to be bad and breaking the cookie specs. Curl now sends all
cookies on a single Cookie: line. Curl is not yet RFC 2109 compliant, but I
doubt that many servers do use that syntax (yet).
Daniel (8 September 1999)
- Jörn helped me make sure it still compiles nicely with mingw32 under win32.
Daniel (7 September 1999)
- FTP upload through proxy is now turned into a HTTP PUT. Requested by
Stefan Kanthak.
- Added the ldap files to the .m32 makefile.
Daniel (3 September 1999)
- Made cookie matching work while using HTTP proxy.
Bjorn Reese (31 August 1999)
- Passed his ldap:// patch. Note that this requires the openldap shared
library to be installed and that LD_LIBRARY_PATH points to the
directory where the lib will be found when curl is run with a
ldap:// URL.
Jörn Hartroth (31 August 1999)
- Made the Mingw32 makefiles into single files.
- Made file:// work for Win32. The same code is now used for unix as well for
performance reasons.
Douglas E. Wegscheid (30 August 1999)
- Patched the Mingw32 makefiles for SSL builds.
Matthew Clarke (30 August 1999)
- Made a cool patch for configure.in to allow --with-ssl to specify the
root dir of the openssl installation, as in
./configure --with-ssl=/usr/ssl_here
- Corrected the 'reconf' script to work better with some shells.
Jörn Hartroth (26 August 1999)
- Fixed the Mingw32 makefiles in lib/ and corrected the file.c for win32
compiles.
Version 5.11
Daniel (25 August 1999)
- John Weismiller pointed out a bug in the header-line
realloc() system in download.c.
- I added lib/file.[ch] to offer a first, simple, file:// support. It
probably won't do much good on win32 system at this point, but I see it
as a start.
- Made the release archives get a Makefile in the root dir, which can be
used to start the compiling/building process easier. I haven't really
changed any INSTALL text yet, I wanted to get some feed-back on this
first.
Daniel (17 August 1999)
- Another Location: bug. Curl didn't do proper relative locations if the
original URL had cgi-parameters that contained a slash. Nusu's page
again.
- Corrected the NO_PROXY usage. It is a list of substrings that if one of
them matches the tail of the host name it should connect to, curl should
not use a proxy to connect there. Pointed out to me by Douglas
E. Wegscheid. I also changed the README text a little regarding this.
Daniel (16 August 1999)
- Fixed a memory bug with http-servers that sent Location: to a Location:
page. Nusu's page showed this too.
- Made cookies work a lot better. Setting the same cookie name several times
used to add more cookies instead of replacing the former one which it
should've. Nusu <nus at intergorj.ro> brought me an URL that made this
painfully visible...
Troy (15 August 1999)
- Brought new .spec files as well as a patch for configure.in that lets the
configure script find the openssl files better, even when the include
files are in /usr/include/openssl
Version 5.10
Daniel (13 August 1999)
- SSL_CTX_set_default_passwd_cb() has been modified in the 0.9.4 version of
OpenSSL. Now why couldn't they simply add a *new* function instead of
modifying the parameters of an already existing function? This way, we get
a compiler warning if compiling with 0.9.4 but not with earlier. So, I had
to come up with a #if construction that deals with this...
- Made curl output the SSL version number get displayed properly with 0.9.4.
Troy (12 August 1999)
- Added MingW32 (GCC-2.95) support under Win32. The INSTALL file was also
a bit rearranged.
Daniel (12 August 1999)
- I had to copy a good <arpa/telnet.h> include file into the curl source
tree to enable the silly win32 systems to compile. The distribution rights
allows us to do that as long as the file remains unmodified.
- I corrected a few minor things that made the compiler complain when
-Wall -pedantic was used.
- I'm moving the official curl web page to http://curl.haxx.nu. I think it
will make it easier to remember as it is a lot shorter and less cryptic.
The old one still works and shows the same info.
Daniel (11 August 1999)
- Albert Chin-A-Young mailed me another correction for NROFF in the
configure.in that is supposed to be better for IRIX users.
Daniel (10 August 1999)
- Albert Chin-A-Young helped me with some stupid Makefile things, as well as
some fiddling with the getdate.c stuff that he had problems with under
HP-UX v10. getdate.y will now be compiled into getdate.c if the appropriate
yacc or bison is found by the configure script. Since this is slightly new,
we need to test the output getdate.c with win32 systems to make sure it
still compiles there.
Daniel (5 August 1999)
- I've just setup a new mailing list with the intention to keep discussions
around libcurl development in it. I mainly expect it to be for thoughts and
brainstorming around a "next generation" library, rather than nitpicking
about the current implementation or details in the current libcurl.
To join our happy bunch of future-looking geeks, enter 'subscribe
<address>' in the body of a mail and send it to
libcurl-request@listserv.fts.frontec.se. Curl bug reports, the usual curl
talk and everything else should still be kept in this mailing list. I've
started to archive this mailing list and have put the libcurl web page at
www.fts.frontec.se/~dast/libcurl/.
- Stefan Kanthak contacted me regarding a few problems in the configure
script which he discovered when trying to make curl compile and build under
Siemens SINIX-Z V5.42B2004!
- Marcus Klein very accurately informed me that src/version.h was not present
in the CVS repository. Oh, how silly...
- Linus Nielsen rewrote the telnet:// part and now curl offers limited telnet
support. If you run curl like 'curl telnet://host' you'll get all output on
the screen and curl will read input from stdin. You'll be able to login and
run commands etc, but since the output is buffered, expect to get a little
weird output.
This is still in its infancy and it might get changed. We need your
feed-back and input in how this is best done.
WIN32 NOTE: I bet we'll get problems when trying to compile the current
lib/telnet.c on win32, but I think we can sort them out in time.
- David Sanderson reported that FORCE_ALLOCA_H or HAVE_ALLOCA_H must be
defined for getdate.c to compile properly on HP-UX 11.0. I updated the
configure script to check for alloca.h which should make it.
Daniel (4 August 1999)
- I finally got to understand Marcus Klein's ftp download resume problem,
which turns out to be due to different outputs from different ftp
servers. It makes ftp download resuming a little trickier, but I've made
some modifications I really believe will work for most ftp servers and I do
hope you report if you have problems with this!
- Added text about file transfer resuming to README.curl.
Daniel (2 August 1999)
- Applied a progress-bar patch from Lars J. Aas. It offers
a new styled progress bar enabled with -#/--progress-bar.
T. Yamada <tai at imasy.or.jp> (30 July 1999)
- It breaks with segfault when 1) curl is using .netrc to obtain
username/password (option '-n'), and 2) is automatically redirected to
another location (option '-L').
There is a small bug in lib/url.c (block starting from line 641), which
tries to take out username/password from the user-supplied command-line
argument ('-u' option). This block is never executed on first attempt since
CONF_USERPWD bit isn't set at first, but curl later turns it on when it
checks for CONF_NETRC bit. So when curl tries to redo everything due to
redirection, it segfaults trying to access *data->userpwd.
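A minimal illustration of the bug class described above (made-up names, not
curl's url.c): a "we have a user/password" flag gets set by the .netrc code
while the corresponding command-line string stays NULL, so the parse must
check the pointer and not only the flag:

#include <stdio.h>
#include <string.h>

struct config {
  int use_userpwd;      /* flag turned on by -u or by the .netrc code */
  const char *userpwd;  /* "user:password" from -u; NULL when .netrc was used */
};

static void split_userpwd(const struct config *cfg,
                          char *user, size_t usize,
                          char *passwd, size_t psize)
{
  user[0] = passwd[0] = '\0';

  /* the fix: also require the pointer, the flag alone is not enough */
  if(cfg->use_userpwd && cfg->userpwd) {
    const char *colon = strchr(cfg->userpwd, ':');
    size_t ulen = colon ? (size_t)(colon - cfg->userpwd) : strlen(cfg->userpwd);

    if(ulen >= usize)
      ulen = usize - 1;
    memcpy(user, cfg->userpwd, ulen);
    user[ulen] = '\0';

    if(colon) {
      strncpy(passwd, colon + 1, psize - 1);
      passwd[psize - 1] = '\0';
    }
  }
}

int main(void)
{
  struct config netrc_case = { 1, NULL };  /* flag set, string NULL */
  char user[64], passwd[64];

  split_userpwd(&netrc_case, user, sizeof(user), passwd, sizeof(passwd));
  printf("user='%s' passwd='%s'\n", user, passwd); /* no crash, both empty */
  return 0;
}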
Version 5.9.1
Daniel (30 July 1999)
- Steve Walch pointed out that there is a memory leak in the formdata
functions. I added a FormFree() function that is now used and supposed to
correct this flaw.
- Mark Wotton reported:
'curl -L https://www.cwa.com.au/' core dumps. I managed to cure this by
correcting the cleanup procedure. The bug seems to be gone with my OpenSSL
0.9.2b, although it still occurs when I run the ~100 years old SSLeay 0.8.0. I
don't know whether it is curl or SSLeay that is to blame for that.
- Marcus Klein:
Reported an FTP upload resume bug that I really can't repeat nor understand.
I leave it here so that it won't be forgotten.
Daniel (29 July 1999)
- Costya Shulyupin suggested support for longer URLs when following Location:
and I could only agree and fix it!
- Leigh Purdie found a problem in the upload/POST department. It turned out
that http.c accidentally cleared the pointer instead of the byte counter
when it was supposed to.
- Costya Shulyupin pointed out a problem with port numbers and Location:. If
you had a server at a non-standard port that redirected to an URL using a
standard port number, curl still used that first port number.
- Ralph Beckmann pointed out a problem when using both CONF_FOLLOWLOCATION
and CONF_FAILONERROR simultaneously. Since CONF_FAILONERROR exits on the
302 code that comes with the Location: header, it would never show any
html on Location: pages. I have now made it look for >=400 codes if
CONF_FOLLOWLOCATION is set.
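In sketch form (illustrative names, not curl's internals), that check
becomes:

#include <stdio.h>

/* decide whether fail-on-error should abort for a given HTTP code */
static int should_fail(int httpcode, int failonerror, int followlocation)
{
  /* when following Location: headers, 3xx replies are expected and must
     not trigger the failure; only 400 and above count as errors then */
  int threshold = followlocation ? 400 : 300;
  return failonerror && (httpcode >= threshold);
}

int main(void)
{
  printf("%d\n", should_fail(302, 1, 1)); /* 0: the redirect itself is fine */
  printf("%d\n", should_fail(404, 1, 1)); /* 1: a real error */
  printf("%d\n", should_fail(302, 1, 0)); /* 1: not following, so 3xx fails */
  return 0;
}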
- 'struct slist' is now renamed to 'struct curl_slist' (as suggested by Ralph
Beckmann).
- Joshua Swink and Rick Welykochy were the first to point out to me that the
latest OpenSSL package has moved the standard include path. It is now
in /usr/local/ssl/include/openssl and I have now modified the --enable-ssl
option for the configure script to use that as the primary path, while
keeping the former path as well to work with older OpenSSL packages.
Daniel (9 June 1999)
- I finally understood the IRIX problem and now it seems to compile on it!
I am gonna remove those #define strcasecmp() things once and for all now.
Daniel (4 June 1999)
- I adjusted the FTP reply 227 parser to make the PASV command work better
with more ftp servers. Apparently the Roxen Challenger server replied
something curl 5.9 couldn't deal with! :-( Reported by Ashley Reid-Montanaro
and Mark Butler brought a solution for it.
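A hedged sketch of a tolerant 227 parser along those lines: instead of
trusting the exact parenthesised format, scan the reply for six
comma-separated numbers (h1,h2,h3,h4,p1,p2). This shows the idea, not
curl's exact implementation:

#include <stdio.h>

static int parse_227(const char *reply, unsigned char ip[4],
                     unsigned short *port)
{
  const char *p = reply;
  unsigned int n[6];
  int i;

  /* slide over the reply until six comma-separated numbers parse */
  for(; *p; p++) {
    if(sscanf(p, "%u,%u,%u,%u,%u,%u",
              &n[0], &n[1], &n[2], &n[3], &n[4], &n[5]) == 6)
      break;
  }
  if(!*p)
    return 0; /* no address found in the reply */

  for(i = 0; i < 4; i++)
    ip[i] = (unsigned char)n[i];
  *port = (unsigned short)((n[4] << 8) | n[5]);
  return 1;
}

int main(void)
{
  unsigned char ip[4];
  unsigned short port;

  if(parse_227("227 Entering Passive Mode (10,0,0,1,19,137).", ip, &port))
    printf("%u.%u.%u.%u:%u\n", ip[0], ip[1], ip[2], ip[3], port);
  return 0;
}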
Daniel (26 May 1999)
- Rearranged. README is new, the old one is now README.curl and I added a
README.libcurl with text I got from Ralph Beckmann.
- I also updated the INSTALL text.
Daniel (25 May 1999)
- David Jonathan Lowsky correctly pointed out that curl didn't properly deal
with form posting where the variable shouldn't have any content, as in curl
-F "form=" www.site.com. It is now fixed.
Version 5.9
Daniel (22 May 1999)
- I've got a bug report from Aaron Scarisbrick in which he states he has some
problems with -L under FreeBSD 3.0. I have previously got another bug
report from Stefan Grether which points at an error with similar symptoms
when using win32. I made the allocation of the new url string a bit faster
and different, don't know if it actually improves anything though...
Daniel (20 May 1999)
- Made the cookie parser deal with CRLF newlines too.
Daniel (19 May 1999)
- Download() didn't properly deal with failing return codes from the sread()
function. Adam Coyne found the problem in the win32 version, and Troy Engel
helped me out isolating it.
Daniel (16 May 1999)
- Richard Adams pointed out a bug I introduced in 5.8. --dump-header doesn't
work anymore! :-/ I fixed it now.
- After a suggestion by Joshua Swink I added -S / --show-error to force curl
to display the error message in case of an error, even if -s/--silent was
used.
Daniel (10 May 1999)
- I moved the stuff concerning HTTP, DICT and TELNET into their own source
files now. It is a beginning of my clean-up of the sources to make them
layer all those protocols better and make it easier to add more in the
future!
- Leon Breedt sent me some files I've not put into the main curl
archive. They're for creating the Debian package thingie. He also sent me a
debian package that I've made available for download at the web page.
Daniel (9 May 1999)
- Made it compile on cygwin too.
Troy Engel (7 May 1999)
- Brought a series of patches to allow curl to compile smoothly on MSVC++ 6
again!
Daniel (6 May 1999)
- I changed the #ifdef HAVE_STRFTIME placement for the -z code so that it
will be easier to discover systems that don't have that function and thus
can't use -z successfully. Made the strftime() get used if WIN32 is defined
too.
Version 5.8
Daniel (5 May 1999)
- I've had it with this autoconf/automake mess. It seems to work all right
for most people who don't have automake installed, but for those who have
there are problems all over.
I've got like five different bug reports on this only the last
week... Claudio Neves and Federico Bianchi and root <duggerj001 at
hawaii.rr.com> are some of them reporting this.
Currently, I have no really good fix since I want to use automake myself to
generate the Makefile.in files. I've found out that the @SHELL@-problems
can often be fixed by manually invoking 'automake' in the archive root
before you run ./configure... I've hacked my maketgz script now to fiddle
a bit with this and my tests seem to work better than before at least!
Daniel (4 May 1999)
- mkhelp.pl has been doing badly lately. I corrected a case problem in
the regexes.
- I've now remade the -o option to not touch the file unless it needs to.
I had to do this to make -z option really fine, since now you can make a
curl fetch and use a local copy's time when downloading to that file, as
in:
curl -z dump -o dump remote.site.com/file.html
This will only get the file if the remote one is newer than the local.
I'm aware that this alters previous behaviour a little. Some scripts out
there may depend on the file always being touched...
- Corrected a bug in the SSLv2/v3 selection.
- Felix von Leitner requested that curl should be able to send
"If-Modified-Since" headers, which indeed is a fair idea. I implemented it
right away! Try -z <expression> where expression is a full GNU date
expression or a file name to get the date from!
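As a hedged illustration of what -z does with a file name argument (not
curl's own code): read the local file's modification time and print it as
an If-Modified-Since header in RFC 1123 GMT format:

#include <stdio.h>
#include <time.h>
#include <sys/types.h>
#include <sys/stat.h>

int main(int argc, char **argv)
{
  struct stat st;
  struct tm *gmt;
  char datebuf[64];

  if(argc < 2 || stat(argv[1], &st)) {
    fprintf(stderr, "usage: %s <local file>\n", argv[0]);
    return 1;
  }

  /* HTTP dates are expressed in GMT, RFC 1123 style */
  gmt = gmtime(&st.st_mtime);
  strftime(datebuf, sizeof(datebuf), "%a, %d %b %Y %H:%M:%S GMT", gmt);
  printf("If-Modified-Since: %s\r\n", datebuf);
  return 0;
}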
Stephan Lagerholm (30 Apr 1999)
- Pointed out a problem with the src/Makefile for FreeBSD. The RM variable
isn't set and causes the make to fail.
Daniel (26 April 1999)
- Am I silly or what? Irving Wolfe pointed out to me that the curl version
number was not set properly. Hasn't been since 5.6. This was due to a bug
in my maketgz script!
David Eriksson (25 Apr 1999)
- Found a bug in cookies.c that made it crash at times.
Version 5.7.1
Doug Kaufman (23 Apr 1999)
- Brought two sunos 4 fixes. One of them being the hostip.c fix mentioned
below and the other one a correction in include/stdcheaders.h
- Added a paragraph about compiling with the US-version of openssl to the
INSTALL file.
Daniel
- New mailing list address. Info updated on the web page as well as in the
README file
Greg Onufer (20 Apr 1999)
- hostip.c didn't compile properly on SunOS 5.5.1.
It needs an #include <sys/types.h>
Version 5.7
Daniel (Apr 20 1999)
- Decided to upload a non-beta version right now!
- Made curl support any-length HTTP headers. The destination buffer is now
simply enlarged every time it turns out to be too small!
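A minimal sketch of that "enlarge the buffer whenever it turns out to be
too small" approach (illustrative, not curl's code):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* append 'piece' to a growable buffer, doubling its size when needed */
static char *append_header(char *buf, size_t *size, size_t *used,
                           const char *piece)
{
  size_t plen = strlen(piece);

  while(*used + plen + 1 > *size) {
    size_t newsize = *size ? *size * 2 : 256;
    char *newbuf = realloc(buf, newsize);
    if(!newbuf) {
      free(buf);
      return NULL;
    }
    buf = newbuf;
    *size = newsize;
  }
  memcpy(buf + *used, piece, plen + 1); /* include the terminating zero */
  *used += plen;
  return buf;
}

int main(void)
{
  size_t size = 0, used = 0;
  char *header = append_header(NULL, &size, &used, "X-Long-Header: ");
  if(header)
    header = append_header(header, &size, &used, "value that can grow...");
  if(header)
    puts(header);
  free(header);
  return 0;
}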
- Added the FAQ file to the archive. Still a bit smallish, but it is a
start.
Eric Thelin (15 Apr 1999)
- Made -D accept '-' instead of filename to write to stdout.
Version 5.6.3beta
Daniel (Apr 12 1999)
- Changed two #ifdef WIN32 to better #ifdef <errorcode> when connect()ing
in url.c and ftp.c. Makes cygwin32 deal with them better too. We should
try to get some decent win32-replacement there. Anyone?
- The old -3/--crlf option is now ONLY --crlf!
- I changed the "SSL fix" to a more lame one, but that doesn't remove as
much functionality. Now I've enabled the lib to select what SSL version it
should try first. Apparently some older SSL servers don't like it when you
talk v3 with them so you need to be able to force curl to talk v2 from the
start. The fix dated April 6 and posted on the mailing list forced curl to
use v2 at all times using a modern OpenSSL version, but we don't really
want such a crippled solution.
- Marc Boucher sent me a patch that corrected a math error for the
"Curr.Speed" progress meter.
- Eric Thelin sent me a patch that enables '-K -' to read a config file from
stdin.
- I found out we didn't close the file properly before so I added it!
Daniel (Apr 9 1999)
- Yu Xin pointed out a problem with ftp download resume. It didn't work at
all! ;-O
Daniel (Apr 6 1999)
- Corrected the version string part generated for the SSL version.
- I found a way to make some other SSL page work with openssl 0.9.1+ that
previously didn't (ssleay 0.8.0 works with it though!). Trying to get
some real info from the OpenSSL guys to see what I should do to behave the
best way. SSLeay 0.8.0 shouldn't be that much in use anyway these days!
Version 5.6.2beta
Daniel (Apr 4 1999)
- Finally have curl more cookie "aware". Now read carefully. This is how
it works.
To make curl read cookies from an already existing file, in plain header-
format (like from the headers of a previous fetch) invoke curl with the
-b flag like:
curl -b file http://site/foo.html
Curl will then use all cookies it finds matching. The old style that sets
a single cookie with -b is still supported and is used if the string
following -b includes a '=' letter, as in "-b name=daniel".
To make curl read the cookies sent in combination with a location: (which
sites often do) point curl to read a non-existing file at first (i.e.
to start with no existing cookies), like:
curl -b nowhere http://site/setcookieandrelocate.html
- Added a paragraph in the TODO file about the SSL problems recently
reported. Evidently there is some kind of SSL problem that curl may need
to address.
- Better "Location:" following.
Douglas E. Wegscheid (Tue, 30 Mar 1999)
- A subsecond display patch.
Daniel (Mar 14 1999)
- I've separated the version number of libcurl and curl now. To make
things a little easier, I decided to start the curl numbering from
5.6 and the former version number known as "curl" is now the one
set for libcurl.
- Removed the 'enable-no-pass' from configure, I doubt anyone wanted
that.
- Made lots of tiny adjustments to compile smoothly with cygwin under
win32. It's a killer for porting this to win32, bye bye VC++! ;-)
Compiles and builds out-of-the-box now. See the new wordings in
INSTALL for details.
- Beginning experiments with downloading multiple documents from an http
server while remaining connected.
Version 5.6beta
Daniel (Mar 13 1999)
- Since I've changed so much, I thought I'd just go ahead and implement the
suggestion from Douglas E. Wegscheid. -D or --dump-header is now storing
HTTP headers separately in the specified file.
- Added new text to INSTALL on what to do to build this on win32 now.
- Aaargh. I had to take a step back and prefix the shared #include files
in the sources with "../include/" to please VC++...
Daniel (Mar 12 1999)
- Split the url.c source into many tiny sources for better readability
and smaller size.
Daniel (Mar 11 1999)
- Started to change stuff for a move to make libcurl and a more separate
curl application that uses the libcurl. Made the libcurl sources into
the new lib directory while the curl application will remain in src as
before. New makefiles, adjusted configure script and so on.
libcurl.a built quickly and easily. I better make a better interface to
the lib functions though.
The new root dir include/ is supposed to contain the public information
about the new libcurl. It is a little ugly so far :-)
Daniel (Mar 1 1999)
- Todd Kaufmann sent me a good link to Netscape's cookie spec as well as the
info that RFC 2109 specifies how to use them. The link is now in the
README and the RFC in the RESOURCES.
Daniel (Feb 23 1999)
- Finally made configure accept --with-ssl to look for SSL libs and includes
in the "standard" place /usr/local/ssl...
Daniel (Feb 22 1999)
- Verified that curl linked fine with OpenSSL 0.9.1c which seems to be
the most recent.
Henri Gomez (Fri Feb 5 1999)
- Sent in an updated curl-ssl.spec. I still miss the script that builds an
RPM automatically...
Version 5.5.1
Mark Butler (27 Jan 1999)
- Corrected problems in Download().
Daniel Stenberg (25 Jan 1999)
- Jeremie Petit pointed out a few flaws in the source that prevented it from
compiling warning-free with the native compiler under Digital Unix v4.0d.
Version 5.5
Daniel Stenberg (15 Jan 1999)
- Added Bjorn's small text to the README about the DICT protocol.
Daniel Stenberg (11 Jan 1999)
- <jswink at softcom.net> reported about the win32 version: "Doesn't use
ALL_PROXY environment variable". Turned out to be because of the static-
buffer nature of the win32 environment variable calls!
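A sketch of the kind of workaround this implies: copy the value right away,
since a lookup that returns a pointer to a static buffer is overwritten by
the next call (illustrative, not the actual win32 fix):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* fetch an environment variable and return a private copy of its value */
static char *env_copy(const char *name)
{
  char *val = getenv(name);
  return val ? strdup(val) : NULL;
}

int main(void)
{
  char *all_proxy = env_copy("ALL_PROXY");
  char *no_proxy = env_copy("NO_PROXY");

  /* both copies stay valid even if the lookups share a static buffer */
  printf("ALL_PROXY=%s NO_PROXY=%s\n",
         all_proxy ? all_proxy : "(unset)",
         no_proxy ? no_proxy : "(unset)");
  free(all_proxy);
  free(no_proxy);
  return 0;
}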
Bjorn Reese (10 Jan 1999)
- I have attached a simple addition for the DICT protocol (RFC 2229).
It performs dictionary lookups. The output still needs to be better
formatted.
To test it try (the exact format, and more examples are described in
the RFC)
dict://dict.org/m:hello
dict://dict.org/m:hello::soundex
Vicente Garcia (10 Jan 1999)
- Corrected the progress meter for files larger than 20MB.
Daniel Stenberg (7 Jan 1999)
- Corrected the -t and -T help texts. They claimed to be FTP only.
Version 5.4
Daniel Stenberg
(7 Jan 1999)
- Irving Wolfe reported that curl -s didn't always suppress the progress
reporting. It was the form post that automatically always switched it on
again. This is now corrected!
(4 Jan 1999)
- Andreas Kostyrka suggested I'd add PUT and he helped me out to test it. If
you use -t or -T now on a http or https server, PUT will be used for file
upload.
I removed the former use of -T with HTTP. I doubt anyone ever really used
that.
(4 Jan 1999)
- Erik Jacobsen found a width bug in the mprintf() function. I corrected it
now.
(4 Jan 1999)
- As John V. Chow pointed out to me, curl accepted very limited URL sizes. It
should now accept path parts that are up to at least 4096 bytes.
- Somehow I screwed up when applying the AIX fix from Gilbert Ramirez, so
I redid that now.


@@ -10,14 +10,10 @@ This file is only present in the CVS - never in release archives. It contains
information about other files and things that the CVS repository keeps in its
inner sanctum.
Use autoconf 2.50 and no earlier. Also, try having automake 1.5 and libtool
1.4.1 at least.
You will need perl to generate the src/hugehelp.c file. The file
src/hugehelp.c.cvs is a one-shot file that you can rename to src/hugehelp.c if
you really can't generate the true file yourself!
Compile and build instructions follow below.
CHANGES.0 contains ancient changes.
CHANGES.$year contains changes for the particular year.
memanalyze.pl is for analyzing the output generated by curl if -DMALLOCDEBUG
is used when compiling
@@ -26,12 +22,38 @@ you really can't generate the true file yourself!
Makefile.dist is included as the root Makefile in distribution archives
perl/contrib/ is a subdirectory with various perl scripts
java/ is a subdirectory with the Java interface to libcurl
perl/ is a subdirectory with various perl scripts
To build after having extracted everything from CVS, do this:
./buildconf
./configure
make
REQUIREMENTS
You need the following software installed:
o autoconf 2.50 (or later)
o automake 1.5 (or later)
o libtool 1.4 (or later)
o GNU m4 (required by autoconf)
o nroff + perl (if you don't have nroff and perl and you for some reason
don't want to install them, you can rename the source file
src/hugehelp.c.cvs to src/hugehelp.c and avoid having to generate this
file. This will of course give you an older version of the file that isn't
up-to-date. That file was checked in once and won't be updated very
regularly.)
MAC OS X
For Mac OS X users, Guido Neitzer wrote down the following step-by-step guide:
1. Install fink (http://fink.sourceforge.net)
2. Update fink to the newest version (with the installed fink)
3. Install the latest version of autoconf, automake and m4 with fink
4. Install version 1.4.1 of libtool - you find it in the "unstable" section
(read the manual to see how to get unstable versions)
5. Get cURL from the cvs
6. Build cURL with "./buildconf", "./configure", "make", "sudo make install"


@@ -5,7 +5,7 @@ die(){
exit
}
automake || die "The command 'automake $MAKEFILES' failed"
aclocal || die "The command 'aclocal' failed"
autoheader || die "The command 'autoheader' failed"
autoconf || die "The command 'autoconf' failed"
automake || die "The command 'automake $MAKEFILES' failed"


@@ -392,6 +392,10 @@ else
OPENSSL_ENABLED=1)
fi
dnl Check for the OpenSSL engine header, it is kind of "separated"
dnl from the main SSL check
AC_CHECK_HEADERS(openssl/engine.h)
AC_SUBST(OPENSSL_ENABLED)
fi
@@ -469,6 +473,8 @@ else
dnl is there a localtime_r()
CURL_CHECK_LOCALTIME_R()
AC_CHECK_FUNCS( gmtime_r )
fi
dnl **********************************************************************


@@ -1,4 +1,4 @@
Updated: November 27, 2001 (http://curl.haxx.se/docs/faq.shtml)
Updated: December 21, 2001 (http://curl.haxx.se/docs/faq.shtml)
_ _ ____ _
___| | | | _ \| |
/ __| | | | |_) | |
@@ -33,7 +33,7 @@ FAQ
3.6 Does curl support javascript, ASP, XML, XHTML or HTML version Y?
3.7 Can I use curl to delete/rename a file through FTP?
3.8 How do I tell curl to follow HTTP redirects?
3.9 How do I use curl in PHP, Perl, Tcl, Ruby or Java?
3.9 How do I use curl in my favourite programming language?
3.10 What about SOAP, WebDAV, XML-RPC or similar protocols over HTTP?
3.11 How do I POST with a different Content-Type?
3.12 Why do FTP specific features over HTTP proxy fail?
@@ -49,6 +49,7 @@ FAQ
4.5.3 "403 Forbidden"
4.5.4 "404 Not Found"
4.5.5 "405 Method Not Allowed"
4.5.6 "301 Moved Permanently"
4.6 Can you tell me what error code 142 means?
4.7 How do I keep user names and passwords secret in Curl command lines?
4.8 I found a bug!
@@ -334,11 +335,12 @@ FAQ
curl -L http://redirector.com
3.9 How do I use curl in PHP, Perl, Tcl, Ruby or Java?
3.9 How do I use curl in my favourite programming language?
There exist many language-interfaces for curl that integrates it better with
various languages. If you are fluid in a script language, you may very well
opt to use such an interface instead of using the command line tool.
There exist many language interfaces/bindings for curl that integrates it
better with various languages. If you are fluid in a script language, you
may very well opt to use such an interface instead of using the command line
tool.
At the time of writing, there are bindings for the five language mentioned
above, but chances are there are even more by the time you read this. Or you
@@ -349,16 +351,9 @@ FAQ
http://curl.haxx.se/libcurl/
PHP4 has the ability to use libcurl as an internal module if built with that
option enabled. You then get a set of extra functions that can be used
within your PHP programs. You find all details about those functions in the
curl section in the PHP manual, see the online version at:
http://www.php.net/manual/ref.curl.php
PHP also offers the option to run a command line, and then you can of course
invoke the curl tool using a command line. This is the way to use curl if
you're using PHP3 or PHP4 built without curl module support.
In December 2001, there are interfaces available for the following
languages: C/C++, Cocoa, Dylan, Java, Perl, PHP, Python, Rexx, Ruby, Scheme
and Tcl. By the time you read this, additional ones may have appeared!
3.10 What about SOAP, WebDAV, XML-RPC or similar protocols over HTTP?
@@ -386,7 +381,7 @@ FAQ
There is one exception to this rule, and that is if you can "tunnel through"
the given HTTP proxy. Proxy tunneling is enabled with a special option (-p)
and is generally not available as proxy admins usuable disable tunneling to
and is generally not available as proxy admins usually disable tunneling to
other ports than 443 (which is used for HTTPS access through proxies).
4. Running Problems
@@ -478,6 +473,17 @@ FAQ
identified by the Request-URI. The response MUST include an Allow header
containing a list of valid methods for the requested resource.
4.5.6 "301 Moved Permanently"
If you get this return code and an HTML output similar to this:
<H1>Moved Permanently</H1> The document has moved <A
HREF="http://same_url_now_with_a_trailing_slash/">here</A>.
it might be because you request a directory URL but without the trailing
slash. Try the same operation again _with_ the trailing slash, or use the
-L/--location option to follow the redirection.
4.6. Can you tell me what error code 142 means?
All error codes that are larger than the highest documented error code means


@@ -364,7 +364,8 @@ CROSS COMPILE
PORTS
=====
This is a probably incomplete list of known hardware and operating systems
that curl has been compiled for:
that curl has been compiled for. If you know of a system curl compiles and
runs on that isn't listed, please let us know!
- Alpha DEC OSF 4
- Alpha Digital UNIX v3.2
@@ -391,6 +392,7 @@ PORTS
- Ultrix 4.3a
- i386 BeOS
- i386 FreeBSD
- i386 HURD
- i386 Linux 1.3, 2.0, 2.2, 2.3, 2.4
- i386 NetBSD
- i386 OS/2


@@ -76,3 +76,6 @@ that have contributed with non-trivial parts:
- Tomasz Lacki <Tomasz.Lacki@primark.pl>
- Georg Huettenegger <georg@ist.org>
- John Lask <johnlask@hotmail.com>
- Eric Lavigne <erlavigne@wanadoo.fr>
- Marcus Webster <marcus.webster@phocis.com>
- Götz Babin-Ebell <babin<69>ebell@trustcenter.de>


@@ -34,6 +34,8 @@ TODO
* Add asynchronous name resolving. http://curl.haxx.se/dev/async-resolver.txt
* Strip any trailing CR from the error message when Curl_failf() is used.
DOCUMENTATION
* Document all CURLcode error codes, why they happen and what most likely
@@ -117,6 +119,12 @@ TODO
the same syntax to specify several files to get uploaded (using the same
persistant connection), using -T.
* Say you have a list of FTP addresses to download in a file named
ftp-list.txt: "cat ftp-list.txt | xargs curl -O -O -O [...]". curl _needs_
an "-Oalways" flag -- all addresses on the command line use the base
filename to store locally. Else a script must precount the # of URLs,
construct the proper number of "-O"s...
TEST SUITE
* Extend the test suite to include more protocols. The telnet could just do


@@ -4,8 +4,7 @@
.\"
.TH curl 1 "30 Nov 2001" "Curl 7.9.2" "Curl Manual"
.SH NAME
curl \- get a URL with FTP, TELNET, LDAP, GOPHER, DICT, FILE, HTTP or
HTTPS syntax.
curl \- transfer a URL
.SH SYNOPSIS
.B curl [options]
.I [URL...]
@@ -677,7 +676,7 @@ If this option is used several times, the last one will be used.
Default config file.
.SH ENVIRONMENT
.IP "HTTP_PROXY [protocol://]<host>[:port]"
.IP "http_proxy [protocol://]<host>[:port]"
Sets proxy server to use for HTTP.
.IP "HTTPS_PROXY [protocol://]<host>[:port]"
Sets proxy server to use for HTTPS.
@@ -688,11 +687,8 @@ Sets proxy server to use for GOPHER.
.IP "ALL_PROXY [protocol://]<host>[:port]"
Sets proxy server to use if no protocol-specific proxy is set.
.IP "NO_PROXY <comma-separated list of hosts>"
list of host names that shouldn't go through any proxy. If set to a
asterisk '*' only, it matches all hosts.
.IP "COLUMNS <integer>"
The width of the terminal. This variable only affects curl when the
--progress-bar option is used.
list of host names that shouldn't go through any proxy. If set to a asterisk
'*' only, it matches all hosts.
.SH EXIT CODES
There exists a bunch of different error codes and their corresponding error
messages that may appear during bad conditions. At the time of this writing,


@@ -2,7 +2,7 @@
.\" nroff -man [file]
.\" $Id$
.\"
.TH curl_easy_setopt 3 "30 Nov 2001" "libcurl 7.9.2" "libcurl Manual"
.TH curl_easy_setopt 3 "10 Dec 2001" "libcurl 7.9.2" "libcurl Manual"
.SH NAME
curl_easy_setopt - Set curl easy-session options
.SH SYNOPSIS
@@ -77,9 +77,8 @@ function gets called by libcurl as soon as it needs to read data in order to
send it to the peer. The data area pointed at by the pointer \fIptr\fP may be
filled with at most \fIsize\fP multiplied with \fInmemb\fP number of
bytes. Your function must return the actual number of bytes that you stored in
that memory area. Returning -1 will signal an error to the library and cause
it to abort the current transfer immediately (with a \fICURLE_READ_ERROR\fP
return code).
that memory area. Returning 0 will signal end-of-file to the library and cause
it to stop the current transfer.
.TP
.B CURLOPT_INFILESIZE
When uploading a file to a remote site, this option should be used to tell

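To illustrate the read callback semantics described in the
curl_easy_setopt.3 change above, here is a hedged sketch of a callback plus
minimal wiring; the file name and URL are invented, and the old-style
CURLOPT_INFILE option name is assumed to fit this libcurl version:

#include <stdio.h>
#include <curl/curl.h>

/* read callback: fill 'ptr' with at most size*nmemb bytes and return the
   number of bytes actually stored; returning 0 signals end-of-file and
   makes libcurl stop the current transfer */
static size_t read_cb(void *ptr, size_t size, size_t nmemb, void *stream)
{
  FILE *in = (FILE *)stream;
  return fread(ptr, 1, size * nmemb, in);
}

int main(void)
{
  CURL *curl = curl_easy_init();
  FILE *in = fopen("upload.txt", "rb");   /* made-up local file */

  if(curl && in) {
    curl_easy_setopt(curl, CURLOPT_URL, "ftp://ftp.example.com/upload.txt");
    curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);
    curl_easy_setopt(curl, CURLOPT_READFUNCTION, read_cb);
    curl_easy_setopt(curl, CURLOPT_INFILE, in); /* handed to the callback */
    curl_easy_perform(curl);
  }
  if(in)
    fclose(in);
  if(curl)
    curl_easy_cleanup(curl);
  return 0;
}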

@@ -58,6 +58,13 @@ are allowed. The effect of this parameter is the same as giving multiple
\fBCURLFORM_FILE\fP options possibly with \fBCURLFORM_CONTENTTYPE\fP after or
before each \fBCURLFORM_FILE\fP option.
Should you need to specify extra headers for the form POST section, use
\fBCURLFORM_CONTENTHEADER\fP. This takes a curl_slist prepared in the usual way
using \fBcurl_slist_append\fP and appends the list of headers to those Curl
automatically generates for \fBCURLFORM_CONTENTTYPE\fP and the content
disposition. The list must exist while the POST occurs, if you free it before
the post completes you may experience problems.
The last argument in such an array must always be \fBCURLFORM_END\fP.
The pointers \fI*firstitem\fP and \fI*lastitem\fP should both be pointing to

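A hedged usage sketch for the CURLFORM_CONTENTHEADER option documented
above; the header, field name and URL are invented, and the struct HttpPost
type is assumed to match the curl.h of this era:

#include <curl/curl.h>

int main(void)
{
  CURL *curl;
  struct HttpPost *post = NULL;
  struct HttpPost *last = NULL;
  struct curl_slist *headers = NULL;

  /* extra header for this one form section only */
  headers = curl_slist_append(headers, "X-Section-Note: example");

  curl_formadd(&post, &last,
               CURLFORM_COPYNAME, "field",
               CURLFORM_COPYCONTENTS, "value",
               CURLFORM_CONTENTHEADER, headers,
               CURLFORM_END);

  curl = curl_easy_init();
  if(curl) {
    curl_easy_setopt(curl, CURLOPT_URL, "http://www.example.com/upload");
    curl_easy_setopt(curl, CURLOPT_HTTPPOST, post);
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
  }

  /* the slist must stay alive until the POST has completed */
  curl_formfree(post);
  curl_slist_free_all(headers);
  return 0;
}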

@@ -2,7 +2,7 @@
.\" nroff -man [file]
.\" $Id$
.\"
.TH curl_formparse 3 "21 May 2001" "libcurl 7.7.4" "libcurl Manual"
.TH curl_formparse 3 "17 Dec 2001" "libcurl 7.9.2" "libcurl Manual"
.SH NAME
curl_formparse - add a section to a multipart/formdata HTTP POST:
deprecated (use curl_formadd instead)
@@ -13,75 +13,6 @@ deprecated (use curl_formadd instead)
.BI "struct HttpPost ** " lastitem ");"
.ad
.SH DESCRIPTION
curl_formparse() is used to append sections when building a multipart/formdata
HTTP POST (sometimes refered to as rfc1867-style posts). Append one section at
a time until you've added all the sections you want included and then you pass
the \fIfirstitem\fP pointer as parameter to \fBCURLOPT_HTTPPOST\fP.
\fIlastitem\fP is set after each call and on repeated invokes it should be
left as set to allow repeated invokes to find the end of the list in a faster
way. \fIstring\fP must be a zero terminated string abiding to the syntax
described in a section below
The pointers \fI*firstitem\fP and \fI*lastitem\fP should both be pointing to
NULL in the first call to this function. All list-data will be allocated by
the function itself. You must call \fIcurl_formfree\fP after the form post has
been done to free the resources again.
This function will copy all input data and keep its own version of it
allocated until you call \fIcurl_formfree\fP. When you've passed the pointer
to \fIcurl_easy_setopt\fP, you must not free the list until after you've
called \fIcurl_easy_cleanup\fP for the curl handle.
See example below.
.SH "FORM PARSE STRINGS"
The
.I string
parameter must be using one of the following patterns. Note that the []
letters should not be included in the real-life string.
.TP 0.8i
.B [name]=[contents]
Add a form field named 'name' with the contents 'contents'. This is the
typcial contents of the HTML tag <input type=text>.
.TP
.B [name]=@[filename]
Add a form field named 'name' with the contents as read from the local file
named 'filename'. This is the typcial contents of the HTML tag <input
type=file>.
.TP
.B [name]=@[filename1,filename2,...]
Add a form field named 'name' with the contents as read from the local files
named 'filename1' and 'filename2'. This is identical to the upper, except that
you get the contents of several files in one section.
.TP
.B [name]=@[filename];[type=<content-type>]
Whenever you specify a file to read from, you can optionally specify the
content-type as well. The content-type is passed to the server together with
the contents of the file. curl_formparse() will guess content-type for a
number of well-known extensions and otherwise it will set it to binary. You
can override the internal decision by using this option.
.TP
.B [name]=@[filename1,filename2,...];[type=<content-type>]
When you specify several files to read the contents from, you can set the
content-type for all of them in the same way as with a single file.
.PP
.SH RETURN VALUE
Returns non-zero if an error occurs.
.SH EXAMPLE
HttpPost* post = NULL;
HttpPost* last = NULL;
/* Add an image section */
curl_formparse("picture=@my-face.jpg", &post, &last);
/* Add a normal text section */
curl_formparse("name=FooBar", &post, &last);
/* Set the form info */
curl_easy_setopt(curl, CURLOPT_HTTPPOST, post);
.SH "SEE ALSO"
.BR curl_easy_setopt "(3), "
.BR curl_formadd "(3), "
.BR curl_formfree "(3)
.SH BUGS
Surely there are some, you tell me!
This has been removed deliberately. The \fBcurl_formadd\fP has been introduced
to replace this function. Do not use this. Convert to the new function
now. curl_formparse() will be removed from a future version of libcurl.


@@ -6,7 +6,8 @@ AUTOMAKE_OPTIONS = foreign no-dependencies
EXTRA_DIST = README curlgtk.c sepheaders.c simple.c postit.c postit2.c \
win32sockets.c persistant.c ftpget.c Makefile.example \
multithread.c getinmemory.c ftpupload.c httpput.c
multithread.c getinmemory.c ftpupload.c httpput.c \
simplessl.c
all:
@echo "done"

docs/examples/simplessl.c (new file, 110 lines added)

@@ -0,0 +1,110 @@
/*****************************************************************************
* _ _ ____ _
* Project ___| | | | _ \| |
* / __| | | | |_) | |
* | (__| |_| | _ <| |___
* \___|\___/|_| \_\_____|
*
* $Id$
*/
#include <stdio.h>
#include <curl/curl.h>
#include <curl/types.h>
#include <curl/easy.h>
/* some requirements for this to work:
1. set pCertFile to the file with the client certificate
2. if the key is passphrase protected, set pPassphrase to the
passphrase you use
3. if you are using a crypto engine:
3.1. set a #define USE_ENGINE
3.2. set pEngine to the name of the crypto engine you use
3.3. set pKeyName to the key identifier you want to use
4. if you don't use a crypto engine:
4.1. set pKeyName to the file name of your client key
4.2. if the format of the key file is DER, set pKeyType to "DER"
!! verify of the server certificate is not implemented here !!
*/
int main(int argc, char **argv)
{
CURL *curl;
CURLcode res;
FILE *headerfile;
const char *pCertFile = "testcert.pem";
const char *pKeyName;
const char *pKeyType;
const char *pEngine;
#if USE_ENGINE
pKeyName = "rsa_test";
pKeyType = "ENG";
pEngine = "chil"; /* for nChiper HSM... */
#else
pKeyName = "testkey.pem";
pKeyType = "PEM";
pEngine = NULL;
#endif
const char *pPassphrase = NULL;
headerfile = fopen("dumpit", "w");
curl_global_init(CURL_GLOBAL_DEFAULT);
curl = curl_easy_init();
if(curl) {
/* what call to write: */
curl_easy_setopt(curl, CURLOPT_URL, "HTTPS://curl.haxx.se");
curl_easy_setopt(curl, CURLOPT_WRITEHEADER, headerfile);
while(1) /* do some ugly short cut... */
{
if (pEngine) /* use crypto engine */
{
if (curl_easy_setopt(curl, CURLOPT_SSLENGINE,pEngine) != CURLE_OK)
{ /* load the crypto engine */
fprintf(stderr,"can't set crypto engine\n");
break;
}
if (curl_easy_setopt(curl, CURLOPT_SSLENGINE_DEFAULT,1) != CURLE_OK)
{ /* set the crypto engine as default */
/* only needed for the first time you load
a engine in a curl object... */
fprintf(stderr,"can't set crypto engine as default\n");
break;
}
}
/* cert is stored PEM coded in file... */
/* since PEM is default, we needn't set it for PEM */
curl_easy_setopt(curl,CURLOPT_SSLCERTTYPE,"PEM");
/* set the cert for client authentication */
curl_easy_setopt(curl,CURLOPT_SSLCERT,pCertFile);
/* sorry, for engine we must set the passphrase
(if the key has one...) */
if (pPassphrase)
curl_easy_setopt(curl,CURLOPT_SSLKEYPASSWD,pPassphrase);
/* if we use a key stored in a crypto engine,
we must set the key type to "ENG" */
curl_easy_setopt(curl,CURLOPT_SSLKEYTYPE,pKeyType);
/* set the private key (file or ID in engine) */
curl_easy_setopt(curl,CURLOPT_SSLKEY,pKeyName);
res = curl_easy_perform(curl);
break; /* we are done... */
}
/* always cleanup */
curl_easy_cleanup(curl);
}
curl_global_cleanup();
return 0;
}


@@ -62,6 +62,7 @@ struct HttpPost {
char *contents; /* pointer to allocated data contents */
long contentslength; /* length of contents field */
char *contenttype; /* Content-Type */
struct curl_slist* contentheader; /* list of extra headers for this form */
struct HttpPost *more; /* if one field name has more than one file, this
link should link to following files */
long flags; /* as defined below */
@@ -155,6 +156,8 @@ typedef enum {
CURLE_OBSOLETE, /* 50 - removed after 7.7.3 */
CURLE_SSL_PEER_CERTIFICATE, /* 51 - peer's certificate wasn't ok */
CURLE_GOT_NOTHING, /* 52 - when this is a specific error */
CURLE_SSL_ENGINE_NOTFOUND, /* 53 - SSL crypto engine not found */
CURLE_SSL_ENGINE_SETFAILED, /* 54 - can not set SSL crypto engine as default */
CURL_LAST /* never use! */
} CURLcode;
@@ -278,8 +281,10 @@ typedef enum {
/* name of the file keeping your private SSL-certificate */
CINIT(SSLCERT, OBJECTPOINT, 25),
/* password for the SSL-certificate */
/* password for the SSL-private key, keep this for compatibility */
CINIT(SSLCERTPASSWD, OBJECTPOINT, 26),
/* password for the SSL private key */
CINIT(SSLKEYPASSWD, OBJECTPOINT, 26),
/* send TYPE parameter? */
CINIT(CRLF, LONG, 27),
@@ -466,6 +471,23 @@ typedef enum {
PASV command. */
CINIT(FTP_USE_EPSV, LONG, 85),
/* type of the file keeping your SSL-certificate ("DER", "PEM", "ENG") */
CINIT(SSLCERTTYPE, OBJECTPOINT, 86),
/* name of the file keeping your private SSL-key */
CINIT(SSLKEY, OBJECTPOINT, 87),
/* type of the file keeping your private SSL-key ("DER", "PEM", "ENG") */
CINIT(SSLKEYTYPE, OBJECTPOINT, 88),
/* crypto engine for the SSL-sub system */
CINIT(SSLENGINE, OBJECTPOINT, 89),
/* set the crypto engine for the SSL-sub system as default
the param has no meaning...
*/
CINIT(SSLENGINE_DEFAULT, LONG, 90),
CURLOPT_LASTENTRY /* the last unusued */
} CURLoption;
@@ -543,6 +565,7 @@ typedef enum {
CFINIT(ARRAY_START), /* below are the options allowed within a array */
CFINIT(FILE),
CFINIT(CONTENTTYPE),
CFINIT(CONTENTHEADER),
CFINIT(END),
CFINIT(ARRAY_END), /* up are the options allowed within a array */


@@ -193,7 +193,7 @@ static CURLcode bindlocal(struct connectdata *conn,
#ifdef HAVE_INET_NTOA
#ifndef INADDR_NONE
#define INADDR_NONE (unsigned long) ~0
#define INADDR_NONE (in_addr_t) ~0
#endif
struct SessionHandle *data = conn->data;
@@ -207,7 +207,7 @@ static CURLcode bindlocal(struct connectdata *conn,
char *hostdataptr=NULL;
size_t size;
char myhost[256] = "";
unsigned long in;
in_addr_t in;
if(Curl_if2ip(data->set.device, myhost, sizeof(myhost))) {
h = Curl_getaddrinfo(data, myhost, 0, &hostdataptr);
@@ -236,7 +236,8 @@ static CURLcode bindlocal(struct connectdata *conn,
infof(data, "We bind local end to %s\n", myhost);
if ( (in=inet_addr(myhost)) != INADDR_NONE ) {
in=inet_addr(myhost);
if (INADDR_NONE != in) {
if ( h ) {
memset((char *)&sa, 0, sizeof(sa));
@@ -284,7 +285,7 @@ static CURLcode bindlocal(struct connectdata *conn,
failf(data, "Insufficient kernel memory was available: %d", errno);
break;
default:
failf(data, "errno %d\n", errno);
failf(data, "errno %d", errno);
break;
} /* end of switch(errno) */


@@ -121,7 +121,7 @@ CURLcode Curl_dict(struct connectdata *conn)
}
if ((word == NULL) || (*word == (char)0)) {
failf(data, "lookup word is missing\n");
failf(data, "lookup word is missing");
}
if ((database == NULL) || (*database == (char)0)) {
database = (char *)"!";
@@ -174,7 +174,7 @@ CURLcode Curl_dict(struct connectdata *conn)
}
if ((word == NULL) || (*word == (char)0)) {
failf(data, "lookup word is missing\n");
failf(data, "lookup word is missing");
}
if ((database == NULL) || (*database == (char)0)) {
database = (char *)"!";


@@ -396,15 +396,16 @@ int curl_formparse(char *input,
* Returns newly allocated HttpPost on success and NULL if malloc failed.
*
***************************************************************************/
static struct HttpPost * AddHttpPost (char * name,
long namelength,
char * value,
long contentslength,
char *contenttype,
long flags,
struct HttpPost *parent_post,
struct HttpPost **httppost,
struct HttpPost **last_post)
static struct HttpPost * AddHttpPost(char * name,
long namelength,
char * value,
long contentslength,
char *contenttype,
long flags,
struct curl_slist* contentHeader,
struct HttpPost *parent_post,
struct HttpPost **httppost,
struct HttpPost **last_post)
{
struct HttpPost *post;
post = (struct HttpPost *)malloc(sizeof(struct HttpPost));
@@ -415,6 +416,7 @@ static struct HttpPost * AddHttpPost (char * name,
post->contents = value;
post->contentslength = contentslength;
post->contenttype = contenttype;
post->contentheader = contentHeader;
post->flags = flags;
}
else
@@ -823,6 +825,21 @@ FORMcode FormAdd(struct HttpPost **httppost,
}
break;
}
case CURLFORM_CONTENTHEADER:
{
struct curl_slist* list = NULL;
if( array_state )
list = (struct curl_slist*)array_value;
else
list = va_arg(params,struct curl_slist*);
if( current_form->contentheader )
return_value = FORMADD_OPTION_TWICE;
else
current_form->contentheader = list;
break;
}
default:
fprintf (stderr, "got unknown CURLFORM_OPTION: %d\n", option);
return_value = FORMADD_UNKNOWN_OPTION;
@@ -872,13 +889,16 @@ FORMcode FormAdd(struct HttpPost **httppost,
break;
}
}
if ( (post = AddHttpPost(form->name, form->namelength,
form->value, form->contentslength,
form->contenttype, form->flags,
post, httppost,
last_post)) == NULL) {
post = AddHttpPost(form->name, form->namelength,
form->value, form->contentslength,
form->contenttype, form->flags,
form->contentheader,
post, httppost,
last_post);
if(!post)
return_value = FORMADD_MEMORY;
}
if (form->contenttype)
prevtype = form->contenttype;
}
@@ -1029,6 +1049,8 @@ struct FormData *Curl_getFormData(struct HttpPost *post,
int size =0;
char *boundary;
char *fileboundary=NULL;
struct curl_slist* curList;
if(!post)
return NULL; /* no input => no output! */
@@ -1090,6 +1112,13 @@ struct FormData *Curl_getFormData(struct HttpPost *post,
file->contenttype);
}
curList = file->contentheader;
while( curList ) {
/* Process the additional headers specified for this form */
size += AddFormDataf( &form, "\r\n%s", curList->data );
curList = curList->next;
}
#if 0
/* The header Content-Transfer-Encoding: seems to confuse some receivers
* (like the built-in PHP engine). While I can't see any reason why it


@@ -44,6 +44,7 @@ typedef struct FormInfo {
long contentslength;
char *contenttype;
long flags;
struct curl_slist* contentheader;
struct FormInfo *more;
} FormInfo;


@@ -1169,7 +1169,6 @@ CURLcode ftp_use_port(struct connectdata *conn)
struct sockaddr_in sa;
struct hostent *h=NULL;
char *hostdataptr=NULL;
size_t size;
unsigned short porttouse;
char myhost[256] = "";
@@ -1193,6 +1192,7 @@ CURLcode ftp_use_port(struct connectdata *conn)
if ( h ) {
if( (portsock = socket(AF_INET, SOCK_STREAM, 0)) >= 0 ) {
int size;
/* we set the secondary socket variable to this for now, it
is only so that the cleanup function will close it in case
@@ -1211,10 +1211,10 @@ CURLcode ftp_use_port(struct connectdata *conn)
if(bind(portsock, (struct sockaddr *)&sa, size) >= 0) {
/* we succeeded to bind */
struct sockaddr_in add;
size = sizeof(add);
socklen_t socksize = sizeof(add);
if(getsockname(portsock, (struct sockaddr *) &add,
(socklen_t *)&size)<0) {
&socksize)<0) {
failf(data, "getsockname() failed");
return CURLE_FTP_PORT_FAILED;
}
@@ -1517,9 +1517,10 @@ CURLcode ftp_perform(struct connectdata *conn)
return result;
}
/* If we have selected NOBODY, it means that we only want file information.
Which in FTP can't be much more than the file size! */
if(data->set.no_body) {
/* If we have selected NOBODY and HEADER, it means that we only want file
information. Which in FTP can't be much more than the file size and
date. */
if(data->set.no_body && data->set.include_header) {
/* The SIZE command is _not_ RFC 959 specified, and therefor many servers
may not support it! It is however the only way we have to get a file's
size! */
@@ -1565,20 +1566,27 @@ CURLcode ftp_perform(struct connectdata *conn)
return CURLE_OK;
}
if(data->set.no_body)
/* don't transfer the data */
;
/* Get us a second connection up and connected */
if(data->set.ftp_use_port)
else if(data->set.ftp_use_port) {
/* We have chosen to use the PORT command */
result = ftp_use_port(conn);
else
if(CURLE_OK == result)
/* we have the data connection ready */
infof(data, "Connected the data stream with PORT!\n");
}
else {
/* We have chosen (this is default) to use the PASV command */
result = ftp_use_pasv(conn);
if(CURLE_OK == result)
infof(data, "Connected the data stream with PASV!\n");
}
if(result)
return result;
/* we have the data connection ready */
infof(data, "Connected the data stream!\n");
if(data->set.upload) {
/* Set type to binary (unless specified ASCII) */
@@ -1634,7 +1642,7 @@ CURLcode ftp_perform(struct connectdata *conn)
passed += actuallyread;
if(actuallyread != readthisamountnow) {
failf(data, "Could only read %d bytes from the input\n", passed);
failf(data, "Could only read %d bytes from the input", passed);
return CURLE_FTP_COULDNT_USE_REST;
}
}
@@ -1701,7 +1709,7 @@ CURLcode ftp_perform(struct connectdata *conn)
return result;
}
else {
else if(!data->set.no_body) {
/* Retrieve file or directory */
bool dirlist=FALSE;
long downloadsize=-1;


@@ -265,7 +265,7 @@ Curl_addrinfo *Curl_getaddrinfo(struct SessionHandle *data,
char **bufp)
{
struct hostent *h = NULL;
unsigned long in;
in_addr_t in;
int ret; /* this variable is unused on several platforms but used on some */
#define CURL_NAMELOOKUP_SIZE 9000


@@ -449,7 +449,7 @@ CURLcode Curl_http_done(struct connectdata *conn)
if(0 == (http->readbytecount + conn->headerbytecount)) {
/* nothing was read from the HTTP server, this can't be right
so we return an error here */
failf(data, "Empty reply from server\n");
failf(data, "Empty reply from server");
return CURLE_GOT_NOTHING;
}
@@ -610,7 +610,7 @@ CURLcode Curl_http(struct connectdata *conn)
passed += actuallyread;
if(actuallyread != readthisamountnow) {
failf(data, "Could only read %d bytes from the input\n",
failf(data, "Could only read %d bytes from the input",
passed);
return CURLE_READ_ERROR;
}
@@ -621,7 +621,7 @@ CURLcode Curl_http(struct connectdata *conn)
data->set.infilesize -= conn->resume_from;
if(data->set.infilesize <= 0) {
failf(data, "File already completely uploaded\n");
failf(data, "File already completely uploaded");
return CURLE_PARTIAL_FILE;
}
}
@@ -735,10 +735,8 @@ CURLcode Curl_http(struct connectdata *conn)
* equal to UTC (Coordinated Universal Time)." (see page 20 of RFC2616).
*/
#ifdef HAVE_LOCALTIME_R
#ifdef HAVE_GMTIME_R
/* thread-safe version */
/* We assume that the presense of localtime_r() proves the presense
of gmtime_r() which is a bit ugly but might work */
struct tm keeptime;
thistime = (struct tm *)gmtime_r(&data->set.timevalue, &keeptime);
#else
@@ -795,7 +793,7 @@ CURLcode Curl_http(struct connectdata *conn)
char contentType[256];
int linelength=0;
if(Curl_FormInit(&http->form, http->sendit)) {
failf(data, "Internal HTTP POST error!\n");
failf(data, "Internal HTTP POST error!");
return CURLE_HTTP_POST_ERROR;
}
@@ -826,7 +824,7 @@ CURLcode Curl_http(struct connectdata *conn)
1,
(FILE *)&http->form);
if(linelength == -1) {
failf(data, "Could not get Content-Type header line!\n");
failf(data, "Could not get Content-Type header line!");
return CURLE_HTTP_POST_ERROR;
}
add_buffer(req_buffer, contentType, linelength);


@@ -374,7 +374,7 @@ void Curl_krb_kauth(struct connectdata *conn)
memset(schedule, 0, sizeof(schedule));
memset(passwd, 0, sizeof(passwd));
if(Curl_base64_encode(tktcopy.dat, tktcopy.length, &p) < 0) {
failf(conn->data, "Out of memory base64-encoding.\n");
failf(conn->data, "Out of memory base64-encoding.");
Curl_set_command_prot(conn, save);
return;
}


@@ -1,6 +1,14 @@
#ifdef MALLOCDEBUG
#include "setup.h"
#ifdef HAVE_SYS_TYPES_H
#include <sys/types.h>
#endif
#ifdef HAVE_SYS_SOCKET_H
#include <sys/socket.h>
#endif
#include <stdio.h>
#ifdef HAVE_MEMORY_H
#include <memory.h>


@@ -220,6 +220,7 @@ CURLMcode curl_multi_perform(CURLM *multi_handle, int *running_handles)
struct Curl_multi *multi=(struct Curl_multi *)multi_handle;
struct Curl_one_easy *easy;
bool done;
CURLMcode result=CURLM_OK;
if(!GOOD_MULTI_HANDLE(multi))
return CURLM_BAD_HANDLE;
@@ -229,28 +230,30 @@ CURLMcode curl_multi_perform(CURLM *multi_handle, int *running_handles)
switch(easy->state) {
case CURLM_STATE_INIT:
/* init this transfer. Hm, uh, I can't think of anything to init
right now, let's skip over to CONNECT at once!
easy->result = Curl_init(easy->easy_handle);
if(CURLE_OK == easy->result)
*/
/* after init, go CONNECT */
easy->state = CURLM_STATE_CONNECT;
/* init this transfer. */
easy->result=Curl_pretransfer(easy->easy_handle);
if(CURLE_OK == easy->result) {
/* after init, go CONNECT */
easy->state = CURLM_STATE_CONNECT;
result = CURLM_CALL_MULTI_PERFORM;
}
break;
case CURLM_STATE_CONNECT:
/* connect */
easy->result = Curl_connect(easy->easy_handle);
/* after connect, go DO */
if(CURLE_OK == easy->result)
if(CURLE_OK == easy->result) {
easy->state = CURLM_STATE_DO;
result = CURLM_CALL_MULTI_PERFORM;
}
break;
case CURLM_STATE_DO:
/* Do the fetch or put request */
easy->result = Curl_do(easy->easy_handle);
/* after do, go PERFORM */
if(CURLE_OK == easy->result)
if(CURLE_OK == easy->result) {
easy->state = CURLM_STATE_PERFORM;
}
break;
case CURLM_STATE_PERFORM:
/* read/write data if it is ready to do so */
@@ -258,8 +261,11 @@ CURLMcode curl_multi_perform(CURLM *multi_handle, int *running_handles)
/* hm, when we follow redirects, we may need to go back to the CONNECT
state */
/* after the transfer is done, go DONE */
if(TRUE == done)
if(TRUE == done) {
/* call this even if the readwrite function returned error */
easy->result = Curl_posttransfer(easy->easy_handle);
easy->state = CURLM_STATE_DONE;
}
break;
case CURLM_STATE_DONE:
/* post-transfer command */


@@ -55,6 +55,7 @@
typedef void CURLM;
typedef enum {
CURLM_CALL_MULTI_PERFORM=-1, /* please call curl_multi_perform() soon */
CURLM_OK,
CURLM_BAD_HANDLE, /* the passed-in handle is not a valid CURLM handle */
CURLM_BAD_EASY_HANDLE, /* an easy handle was not good/valid */


@@ -362,11 +362,11 @@ Curl_sec_vfprintf(struct connectdata *conn, FILE *f, const char *fmt, va_list ap
conn);
free(buf);
if(len < 0) {
failf(conn->data, "Failed to encode command.\n");
failf(conn->data, "Failed to encode command.");
return -1;
}
if(Curl_base64_encode(enc, len, &buf) < 0){
failf(conn->data, "Out of memory base64-encoding.\n");
failf(conn->data, "Out of memory base64-encoding.");
return -1;
}
if(conn->command_prot == prot_safe)
@@ -421,7 +421,7 @@ sec_prot_internal(struct connectdata *conn, int level)
return -1;
if(conn->data->state.buffer[0] != '2'){
failf(conn->data, "Failed to set protection buffer size.\n");
failf(conn->data, "Failed to set protection buffer size.");
return -1;
}
conn->buffer_size = s;
@@ -441,7 +441,7 @@ sec_prot_internal(struct connectdata *conn, int level)
return -1;
if(conn->data->state.buffer[0] != '2'){
failf(conn->data, "Failed to set protection level.\n");
failf(conn->data, "Failed to set protection level.");
return -1;
}


@@ -26,6 +26,9 @@
#include <stdio.h>
#include <stdarg.h>
#include <stdlib.h>
#ifdef HAVE_SYS_TYPES_H
#include <sys/types.h>
#endif
#ifdef HAVE_SYS_SOCKET_H
#include <sys/socket.h> /* required for send() & recv() prototypes */
@@ -137,8 +140,9 @@ void Curl_infof(struct SessionHandle *data, const char *fmt, ...)
}
}
/* Curl_failf() is for messages stating why we failed, the LAST one will be
returned for the user (if requested) */
/* Curl_failf() is for messages stating why we failed.
* The message SHALL NOT include any LF or CR.
*/
void Curl_failf(struct SessionHandle *data, const char *fmt, ...)
{


@@ -34,9 +34,9 @@
#ifdef HAVE_CONFIG_H
#ifdef VMS
#include "config-vms.h"
#include "../config-vms.h"
#else
#include "config.h" /* the configure script results */
#include "../config.h" /* the configure script results */
#endif
#else
@@ -46,13 +46,14 @@
#endif
#ifdef macintosh
/* hand-modified MacOS config.h! */
#include "config-mac.h"
#include "../config-mac.h"
#endif
#endif
#ifndef __cplusplus /* (rabe) */
typedef char bool;
#define typedef_bool
#endif /* (rabe) */
#ifdef NEED_REENTRANT


@@ -5,7 +5,7 @@
* | (__| |_| | _ <| |___
* \___|\___/|_| \_\_____|
*
* Copyright (C) 2000, Daniel Stenberg, <daniel@haxx.se>, et al.
* Copyright (C) 2001, Daniel Stenberg, <daniel@haxx.se>, et al.
*
* In order to be useful for every potential user, curl and libcurl are
* dual-licensed under the MPL and the MIT/X-derivate licenses.
@@ -22,11 +22,12 @@
*****************************************************************************/
/*
* The original SSL code was written by
* The original SSL code for curl was written by
* Linas Vepstas <linas@linas.org> and Sampo Kellomaki <sampo@iki.fi>
*/
#include "setup.h"
#include <string.h>
#include <stdlib.h>
@@ -171,37 +172,59 @@ int random_the_seed(struct connectdata *conn)
return nread;
}
#ifndef SSL_FILETYPE_ENGINE
#define SSL_FILETYPE_ENGINE 42
#endif
static int do_file_type(const char *type)
{
if (!type || !type[0])
return SSL_FILETYPE_PEM;
if (curl_strequal(type, "PEM"))
return SSL_FILETYPE_PEM;
if (curl_strequal(type, "DER"))
return SSL_FILETYPE_ASN1;
if (curl_strequal(type, "ENG"))
return SSL_FILETYPE_ENGINE;
return -1;
}
static
int cert_stuff(struct connectdata *conn,
char *cert_file,
char *key_file)
const char *cert_type,
char *key_file,
const char *key_type)
{
struct SessionHandle *data = conn->data;
int file_type;
if (cert_file != NULL) {
SSL *ssl;
X509 *x509;
if(data->set.cert_passwd) {
if(data->set.key_passwd) {
#ifndef HAVE_USERDATA_IN_PWD_CALLBACK
/*
* If password has been given, we store that in the global
* area (*shudder*) for a while:
*/
strcpy(global_passwd, data->set.cert_passwd);
strcpy(global_passwd, data->set.key_passwd);
#else
/*
* We set the password in the callback userdata
*/
SSL_CTX_set_default_passwd_cb_userdata(conn->ssl.ctx, data->set.cert_passwd);
SSL_CTX_set_default_passwd_cb_userdata(conn->ssl.ctx,
data->set.key_passwd);
#endif
/* Set passwd callback: */
SSL_CTX_set_default_passwd_cb(conn->ssl.ctx, passwd_callback);
}
#if 0
if (SSL_CTX_use_certificate_file(conn->ssl.ctx,
cert_file,
SSL_FILETYPE_PEM) != 1) {
failf(data, "unable to set certificate file (wrong password?)\n");
failf(data, "unable to set certificate file (wrong password?)");
return(0);
}
if (key_file == NULL)
@@ -210,9 +233,86 @@ int cert_stuff(struct connectdata *conn,
if (SSL_CTX_use_PrivateKey_file(conn->ssl.ctx,
key_file,
SSL_FILETYPE_PEM) != 1) {
failf(data, "unable to set public key file\n");
failf(data, "unable to set public key file");
return(0);
}
#else
/* The '#ifdef 0' section above was removed on 17-dec-2001 */
file_type = do_file_type(cert_type);
switch(file_type) {
case SSL_FILETYPE_PEM:
case SSL_FILETYPE_ASN1:
if (SSL_CTX_use_certificate_file(conn->ssl.ctx,
cert_file,
file_type) != 1) {
failf(data, "unable to set certificate file (wrong password?)");
return 0;
}
break;
case SSL_FILETYPE_ENGINE:
failf(data, "file type ENG for certificate not implemented");
return 0;
default:
failf(data, "not supported file type '%s' for certificate", cert_type);
return 0;
}
file_type = do_file_type(key_type);
switch(file_type) {
case SSL_FILETYPE_PEM:
if (key_file == NULL)
/* cert & key can only be in PEM case in the same file */
key_file=cert_file;
case SSL_FILETYPE_ASN1:
if (SSL_CTX_use_PrivateKey_file(conn->ssl.ctx,
key_file,
file_type) != 1) {
failf(data, "unable to set private key file\n");
return 0;
}
break;
case SSL_FILETYPE_ENGINE:
#ifdef HAVE_OPENSSL_ENGINE_H
{ /* XXXX still needs some work */
EVP_PKEY *priv_key = NULL;
if (conn && conn->data && conn->data->engine) {
if (!key_file || !key_file[0]) {
failf(data, "no key set to load from crypto engine\n");
return 0;
}
priv_key = ENGINE_load_private_key(conn->data->engine,key_file,
data->set.key_passwd);
if (!priv_key) {
failf(data, "failed to load private key from crypto engine\n");
return 0;
}
if (SSL_CTX_use_PrivateKey(conn->ssl.ctx, priv_key) != 1) {
failf(data, "unable to set private key\n");
EVP_PKEY_free(priv_key);
return 0;
}
EVP_PKEY_free(priv_key); /* we don't need the handle any more... */
}
else {
failf(data, "crypto engine not set, can't load private key\n");
return 0;
}
}
#else
failf(data, "file type ENG for private key not supported\n");
return 0;
#endif
break;
default:
failf(data, "not supported file type for private key\n");
return 0;
}
#endif
ssl=SSL_new(conn->ssl.ctx);
x509=SSL_get_certificate(ssl);
@@ -229,7 +329,7 @@ int cert_stuff(struct connectdata *conn,
/* Now we know that a key and cert have been set against
* the SSL context */
if (!SSL_CTX_check_private_key(conn->ssl.ctx)) {
failf(data, "Private key does not match the certificate public key\n");
failf(data, "Private key does not match the certificate public key");
return(0);
}
#ifndef HAVE_USERDATA_IN_PWD_CALLBACK
@@ -269,6 +369,10 @@ void Curl_SSL_init(void)
init_ssl++; /* never again */
#ifdef HAVE_ENGINE_LOAD_BUILTIN_ENGINES
ENGINE_load_builtin_engines();
#endif
/* Lets get nice error messages */
SSL_load_error_strings();
@@ -293,6 +397,10 @@ void Curl_SSL_cleanup(void)
table. */
EVP_cleanup();
#ifdef HAVE_ENGINE_cleanup
ENGINE_cleanup();
#endif
init_ssl=0; /* not inited any more */
}
#else
@@ -428,6 +536,13 @@ int Curl_SSL_Close_All(struct SessionHandle *data)
/* free the cache data */
free(data->state.session);
}
#ifdef HAVE_OPENSSL_ENGINE_H
if (data->engine)
{
ENGINE_free(data->engine);
data->engine = NULL;
}
#endif
return 0;
}
@@ -569,7 +684,11 @@ Curl_SSLConnect(struct connectdata *conn)
}
if(data->set.cert) {
if (!cert_stuff(conn, data->set.cert, data->set.cert)) {
if (!cert_stuff(conn,
data->set.cert,
data->set.cert_type,
data->set.key,
data->set.key_type)) {
/* failf() is already done in cert_stuff() */
return CURLE_SSL_CONNECT_ERROR;
}
@@ -578,7 +697,7 @@ Curl_SSLConnect(struct connectdata *conn)
if(data->set.ssl.cipher_list) {
if (!SSL_CTX_set_cipher_list(conn->ssl.ctx,
data->set.ssl.cipher_list)) {
failf(data, "failed setting cipher list\n");
failf(data, "failed setting cipher list");
return CURLE_SSL_CONNECT_ERROR;
}
}
@@ -591,7 +710,7 @@ Curl_SSLConnect(struct connectdata *conn)
if (!SSL_CTX_load_verify_locations(conn->ssl.ctx,
data->set.ssl.CAfile,
data->set.ssl.CApath)) {
failf(data,"error setting cerficate verify locations\n");
failf(data,"error setting cerficate verify locations");
return CURLE_SSL_CONNECT_ERROR;
}
}
@@ -713,7 +832,7 @@ Curl_SSLConnect(struct connectdata *conn)
if(data->set.ssl.verifypeer) {
data->set.ssl.certverifyresult=SSL_get_verify_result(conn->ssl.handle);
if (data->set.ssl.certverifyresult != X509_V_OK) {
failf(data, "SSL certificate verify result: %d\n",
failf(data, "SSL certificate verify result: %d",
data->set.ssl.certverifyresult);
retcode = CURLE_SSL_PEER_CERTIFICATE;
}
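As an aside, the numeric certverifyresult reported above can be made more readable with OpenSSL's X509_verify_cert_error_string(); a small illustration (not what the code above does) follows:

#include <stdio.h>
#include <openssl/ssl.h>
#include <openssl/x509.h>

/* illustration: print a textual explanation next to the numeric result */
static void report_verify_result(SSL *handle)
{
  long res = SSL_get_verify_result(handle);
  if(res != X509_V_OK)
    fprintf(stderr, "SSL certificate verify result: %s (%ld)\n",
            X509_verify_cert_error_string(res), res);
}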


@@ -790,11 +790,75 @@ CURLcode Curl_setopt(struct SessionHandle *data, CURLoption option, ...)
*/
data->set.cert = va_arg(param, char *);
break;
case CURLOPT_SSLCERTPASSWD:
case CURLOPT_SSLCERTTYPE:
/*
* String that holds the SSL certificate password.
* String that holds file type of the SSL certificate to use
*/
data->set.cert_passwd = va_arg(param, char *);
data->set.cert_type = va_arg(param, char *);
break;
case CURLOPT_SSLKEY:
/*
* String that holds file name of the SSL private key to use
*/
data->set.key = va_arg(param, char *);
break;
case CURLOPT_SSLKEYTYPE:
/*
* String that holds file type of the SSL private key to use
*/
data->set.key_type = va_arg(param, char *);
break;
case CURLOPT_SSLKEYPASSWD:
/*
* String that holds the SSL private key password.
*/
data->set.key_passwd = va_arg(param, char *);
break;
case CURLOPT_SSLENGINE:
/*
* String that holds the SSL crypto engine.
*/
#ifdef HAVE_OPENSSL_ENGINE_H
{
const char *cpTemp = va_arg(param, char *);
ENGINE *e;
if (cpTemp && cpTemp[0]) {
e = ENGINE_by_id(cpTemp);
if (e) {
if (data->engine) {
ENGINE_free(data->engine);
}
data->engine = e;
}
else {
failf(data, "SSL Engine '%s' not found", cpTemp);
return CURLE_SSL_ENGINE_NOTFOUND;
}
}
}
#else
return CURLE_SSL_ENGINE_NOTFOUND;
#endif
break;
case CURLOPT_SSLENGINE_DEFAULT:
/*
* Flag to set the given engine as the default one.
*/
#ifdef HAVE_OPENSSL_ENGINE_H
if (data->engine) {
if (ENGINE_set_default(data->engine, ENGINE_METHOD_ALL) > 0) {
#ifdef DEBUG
fprintf(stderr,"set default crypto engine\n");
#endif
}
else {
failf(data, "set default crypto engine failed");
return CURLE_SSL_ENGINE_SETFAILED;
}
}
#endif
break;
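From an application's perspective, the two new engine options above could be used roughly like this; the engine id "example-eng" and the key identifier are invented for the illustration:

#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
  CURL *curl = curl_easy_init();
  if(curl) {
    if(curl_easy_setopt(curl, CURLOPT_SSLENGINE, "example-eng") ==
       CURLE_SSL_ENGINE_NOTFOUND)
      fprintf(stderr, "crypto engine not found\n");
    curl_easy_setopt(curl, CURLOPT_SSLENGINE_DEFAULT, 1L);
    curl_easy_setopt(curl, CURLOPT_SSLCERT, "client-cert.pem");
    curl_easy_setopt(curl, CURLOPT_SSLKEYTYPE, "ENG");
    curl_easy_setopt(curl, CURLOPT_SSLKEY, "key-id-inside-engine");
    curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
  }
  return 0;
}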
case CURLOPT_CRLF:
/*
@@ -1502,6 +1566,7 @@ static CURLcode CreateConnection(struct SessionHandle *data,
/* we have a proxy here to set */
data->change.proxy = proxy;
data->change.proxy_alloc=TRUE; /* this needs to be freed later */
conn->bits.httpproxy = TRUE;
}
} /* if (!nope) - it wasn't specified non-proxy */
} /* NO_PROXY wasn't specified or '*' */


@@ -58,6 +58,9 @@
#include "openssl/pem.h"
#include "openssl/ssl.h"
#include "openssl/err.h"
#ifdef HAVE_OPENSSL_ENGINE_H
#include <openssl/engine.h>
#endif
#else
#include "rsa.h"
#include "crypto.h"
@@ -111,6 +114,9 @@ enum protection_level {
};
#endif
#ifndef HAVE_OPENSSL_ENGINE_H
typedef void ENGINE;
#endif
/* struct for data related to SSL and SSL connections */
struct ssl_connect_data {
bool use; /* use ssl encrypted communications TRUE/FALSE */
@@ -525,8 +531,12 @@ struct UserDefined {
char *cookie; /* HTTP cookie string to send */
struct curl_slist *headers; /* linked list of extra headers */
struct HttpPost *httppost; /* linked list of POST data */
char *cert; /* PEM-formatted certificate */
char *cert_passwd; /* plain text certificate password */
char *cert; /* certificate */
char *cert_type; /* format for certificate (default: PEM) */
char *key; /* private key */
char *key_type; /* format for private key (default: PEM) */
char *key_passwd; /* plain text private key password */
char *crypto_engine; /* name of the crypto engine to use */
char *cookiejar; /* dump all cookies to this file */
bool crlf; /* convert crlf on ftp upload(?) */
struct curl_slist *quote; /* before the transfer */
@@ -559,7 +569,10 @@ struct UserDefined {
bool hide_progress;
bool http_fail_on_error;
bool http_follow_location;
bool http_include_header;
bool include_header;
#define http_include_header include_header /* former name */
bool http_set_referer;
bool http_auto_referer; /* set "correct" referer when following location: */
bool no_body;
@@ -594,6 +607,9 @@ struct SessionHandle {
struct UrlState state; /* struct for fields used for state info and
other dynamic purposes */
struct PureInfo info; /* stats, reports and info data */
#ifdef USE_SSLEAY
ENGINE* engine;
#endif /* USE_SSLEAY */
};
#define LIBCURL_NAME "libcurl"


@@ -10,8 +10,14 @@ Perl
elegantly used from within it. You can either invoke external curl command
line or use the curl interface.
Georg Horn's Perl interface to curl is available in the Curl_easy/
subdirectory. Using the Curl::easy module is just straightforward and
The latest release of Curl_easy, a Perl interface to curl is available from
http://curl.haxx.se/libcurl/perl/
(Georg Horn's original version of Curl_easy, supporting curl versions
before 7.7 is still available from: http://www.koblenz-net.de/~horn/export/ )
Using the Curl::easy module is just straightforward and
works much like using libcurl in a C program, so please refer to the
documentation of libcurl. Have a look at test.pl to get an idea of how
to start.


@@ -77,7 +77,9 @@
#define DEFAULT_MAXREDIRS 50L
#ifndef __cplusplus /* (rabe) */
#ifndef typedef_bool
typedef char bool;
#endif
#endif /* (rabe) */
#define CURL_PROGRESS_STATS 0 /* default progress display */
@@ -318,6 +320,11 @@ static void help(void)
" --egd-file <file> EGD socket path for random data (SSL)\n"
" -e/--referer Referer page (H)");
puts(" -E/--cert <cert[:passwd]> Specifies your certificate file and password (HTTPS)\n"
" --cert-type <type> Specifies your certificate file type (DER/PEM/ENG) (HTTPS)\n"
" --key <key> Specifies your private key file (HTTPS)\n"
" --key-type <type> Specifies your private key file type (DER/PEM/ENG) (HTTPS)\n"
" --pass <pass> Specifies your passphrase for the private key (HTTPS)");
puts(" --engine <eng> Specifies the crypto engine to use (HTTPS)\n"
" --cacert <file> CA certifciate to verify peer against (SSL)\n"
" --ciphers <list> What SSL ciphers to use (SSL)\n"
" --connect-timeout <seconds> Maximum time allowed for connection\n"
@@ -420,8 +427,12 @@ struct Configurable {
char *cipher_list;
char *cert;
char *cert_type;
char *cacert;
char *cert_passwd;
char *key;
char *key_type;
char *key_passwd;
char *engine;
bool crlf;
char *customrequest;
char *krb4level;
@@ -884,6 +895,11 @@ static ParameterError getparameter(char *flag, /* f or -long-flag */
{"e", "referer", TRUE},
{"E", "cert", TRUE},
{"Ea", "cacert", TRUE},
{"Eb","cert-type", TRUE},
{"Ec","key", TRUE},
{"Ed","key-type", TRUE},
{"Ee","pass", TRUE},
{"Ef","engine", TRUE},
{"f", "fail", FALSE},
{"F", "form", TRUE},
{"g", "globoff", FALSE},
@@ -1180,35 +1196,53 @@ static ParameterError getparameter(char *flag, /* f or -long-flag */
}
break;
case 'E':
if(subletter == 'a') {
switch(subletter) {
case 'a': /* CA info PEM file */
/* CA info PEM file */
GetStr(&config->cacert, nextarg);
}
else {
char *ptr = strchr(nextarg, ':');
/* Since we live in a world of weirdness and confusion, the win32
dudes can use : when using drive letters and thus
c:\file:password needs to work. In order not to break
compatibility, we still use : as separator, but we try to detect
when it is used for a file name! On windows. */
break;
case 'b': /* cert file type */
GetStr(&config->cert_type, nextarg);
break;
case 'c': /* private key file */
GetStr(&config->key, nextarg);
break;
case 'd': /* private key file type */
GetStr(&config->key_type, nextarg);
break;
case 'e': /* private key passphrase */
GetStr(&config->key_passwd, nextarg);
break;
case 'f': /* crypto engine */
GetStr(&config->engine, nextarg);
break;
default: /* certificate file */
{
char *ptr = strchr(nextarg, ':');
/* Since we live in a world of weirdness and confusion, the win32
dudes can use : when using drive letters and thus
c:\file:password needs to work. In order not to break
compatibility, we still use : as separator, but we try to detect
when it is used for a file name! On windows. */
#ifdef WIN32
if(ptr &&
(ptr == &nextarg[1]) &&
(nextarg[2] == '\\') &&
(isalpha((int)nextarg[0])) )
/* colon in the second column, followed by a backslash, and the
first character is an alphabetic letter:
if(ptr &&
(ptr == &nextarg[1]) &&
(nextarg[2] == '\\') &&
(isalpha((int)nextarg[0])) )
/* colon in the second column, followed by a backslash, and the
first character is an alphabetic letter:
this is a drive letter colon */
ptr = strchr(&nextarg[3], ':'); /* find the next one instead */
this is a drive letter colon */
ptr = strchr(&nextarg[3], ':'); /* find the next one instead */
#endif
if(ptr) {
/* we have a password too */
*ptr=0;
ptr++;
GetStr(&config->cert_passwd, ptr);
}
GetStr(&config->cert, nextarg);
if(ptr) {
/* we have a password too */
*ptr=0;
ptr++;
GetStr(&config->key_passwd, ptr);
}
GetStr(&config->cert, nextarg);
}
}
break;
case 'f':
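The default branch of the -E parsing above has to tell a password-separating colon from a Windows drive-letter colon. Restated in isolation (the helper name is made up for this sketch), the detection amounts to:

#include <ctype.h>
#include <string.h>

/* sketch: return a pointer to the colon separating "certfile:password",
   or NULL if no password part was given */
static char *find_passwd_separator(char *arg)
{
  char *ptr = strchr(arg, ':');
#ifdef WIN32
  if(ptr && (ptr == &arg[1]) && (arg[2] == '\\') && isalpha((int)arg[0]))
    /* a colon in the second column followed by a backslash, with a letter
       first: that is a drive-letter colon, so look for the next one instead */
    ptr = strchr(&arg[3], ':');
#endif
  return ptr;
}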
@@ -1245,10 +1279,23 @@ static ParameterError getparameter(char *flag, /* f or -long-flag */
config->conf ^= CONF_HEADER; /* include the HTTP header as well */
break;
case 'I':
config->conf ^= CONF_HEADER; /* include the HTTP header in the output */
config->conf ^= CONF_NOBODY; /* don't fetch the body at all */
if(SetHTTPrequest(HTTPREQ_HEAD, &config->httpreq))
return PARAM_BAD_USE;
/*
* This is a bit tricky. We either SET both bits, or we clear both
* bits. Let's not make any other outcomes from this.
*/
if((CONF_HEADER|CONF_NOBODY) !=
(config->conf&(CONF_HEADER|CONF_NOBODY)) ) {
/* at least one of them wasn't set, set both */
config->conf |= (CONF_HEADER|CONF_NOBODY);
if(SetHTTPrequest(HTTPREQ_HEAD, &config->httpreq))
return PARAM_BAD_USE;
}
else {
/* both were set, clear both */
config->conf &= ~(CONF_HEADER|CONF_NOBODY);
if(SetHTTPrequest(HTTPREQ_GET, &config->httpreq))
return PARAM_BAD_USE;
}
break;
case 'K':
res = parseconfig(nextarg, config);
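The reworked -I handling above enforces one invariant: CONF_HEADER and CONF_NOBODY are either both set or both cleared, so repeating -I flips between a HEAD-style request and a normal GET, and a preceding -i no longer leaves the two bits out of step. A standalone demonstration (the CONF_* values here are made-up stand-ins, not curl's real flag bits):

#include <stdio.h>

#define CONF_HEADER 0x01   /* made-up stand-in values */
#define CONF_NOBODY 0x02

static void toggle_head(int *conf)
{
  if((*conf & (CONF_HEADER|CONF_NOBODY)) != (CONF_HEADER|CONF_NOBODY))
    *conf |= CONF_HEADER|CONF_NOBODY;     /* not both set: turn HEAD mode on */
  else
    *conf &= ~(CONF_HEADER|CONF_NOBODY);  /* both set: turn HEAD mode off */
}

int main(void)
{
  int conf = CONF_HEADER;   /* as if -i/--include was parsed first */
  toggle_head(&conf);       /* -I: sets both bits */
  printf("after -i -I:    header=%d nobody=%d\n",
         !!(conf & CONF_HEADER), !!(conf & CONF_NOBODY));
  toggle_head(&conf);       /* a second -I clears both bits again */
  printf("after -i -I -I: header=%d nobody=%d\n",
         !!(conf & CONF_HEADER), !!(conf & CONF_NOBODY));
  return 0;
}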
@@ -2091,17 +2138,29 @@ operate(struct Configurable *config, int argc, char *argv[])
to be able to do so, we have to create a new URL in another
buffer.*/
urlbuffer=(char *)malloc(strlen(url) + strlen(config->infile) + 3);
/* We only want the part of the local path that follows the
rightmost slash or backslash. */
char *filep = strrchr(config->infile, '/');
char *file2 = strrchr(filep?filep:config->infile, '\\');
if(file2)
filep = file2+1;
else if(filep)
filep++;
else
filep = config->infile;
urlbuffer=(char *)malloc(strlen(url) + strlen(filep) + 3);
if(!urlbuffer) {
helpf("out of memory\n");
return CURLE_OUT_OF_MEMORY;
}
if(ptr)
/* there is a trailing slash on the URL */
sprintf(urlbuffer, "%s%s", url, config->infile);
sprintf(urlbuffer, "%s%s", url, filep);
else
/* there is no trailing slash on the URL */
sprintf(urlbuffer, "%s/%s", url, config->infile);
sprintf(urlbuffer, "%s/%s", url, filep);
url = urlbuffer; /* use our new URL instead! */
}
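The upload-path change above boils down to a small "file name only" extraction that accepts both forward and backward slashes. As a standalone sketch (the helper name is invented here):

#include <stdio.h>
#include <string.h>

/* keep only what follows the rightmost '/' or '\\' in a local path */
static const char *file_part(const char *path)
{
  const char *slash = strrchr(path, '/');
  const char *bslash = strrchr(slash ? slash : path, '\\');
  if(bslash)
    return bslash + 1;
  if(slash)
    return slash + 1;
  return path;
}

int main(void)
{
  printf("%s\n", file_part("subdir/more\\upload.txt")); /* prints upload.txt */
  printf("%s\n", file_part("plainfile"));               /* prints plainfile */
  return 0;
}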
@@ -2202,6 +2261,8 @@ operate(struct Configurable *config, int argc, char *argv[])
}
#endif
curl_easy_setopt(curl, CURLOPT_SSLENGINE, config->engine);
curl_easy_setopt(curl, CURLOPT_SSLENGINE_DEFAULT, 1);
curl_easy_setopt(curl, CURLOPT_FILE, (FILE *)&outs); /* where to store */
/* what call to write: */
@@ -2249,7 +2310,10 @@ operate(struct Configurable *config, int argc, char *argv[])
curl_easy_setopt(curl, CURLOPT_HTTPHEADER, config->headers);
curl_easy_setopt(curl, CURLOPT_HTTPPOST, config->httppost);
curl_easy_setopt(curl, CURLOPT_SSLCERT, config->cert);
curl_easy_setopt(curl, CURLOPT_SSLCERTPASSWD, config->cert_passwd);
curl_easy_setopt(curl, CURLOPT_SSLCERTTYPE, config->cert_type);
curl_easy_setopt(curl, CURLOPT_SSLKEY, config->key);
curl_easy_setopt(curl, CURLOPT_SSLKEYTYPE, config->key_type);
curl_easy_setopt(curl, CURLOPT_SSLKEYPASSWD, config->key_passwd);
if(config->cacert) {
curl_easy_setopt(curl, CURLOPT_CAINFO, config->cacert);
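Taken together, the options wired up above let an application pass, for instance, a DER certificate with a separately stored PEM key; on the command line the equivalent would be the new --cert-type, --key, --key-type and --pass flags. A hypothetical example (file names and passphrase are invented):

#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
  CURL *curl = curl_easy_init();
  if(curl) {
    curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
    curl_easy_setopt(curl, CURLOPT_SSLCERT, "client-cert.der");
    curl_easy_setopt(curl, CURLOPT_SSLCERTTYPE, "DER");
    curl_easy_setopt(curl, CURLOPT_SSLKEY, "client-key.pem");
    curl_easy_setopt(curl, CURLOPT_SSLKEYTYPE, "PEM");
    curl_easy_setopt(curl, CURLOPT_SSLKEYPASSWD, "passphrase");
    if(curl_easy_perform(curl) != CURLE_OK)
      fprintf(stderr, "transfer failed\n");
    curl_easy_cleanup(curl);
  }
  return 0;
}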


@@ -1,5 +1,5 @@
#ifndef __SETUP_H
#define __SETUP_H
#ifndef __CLIENT_SETUP_H
#define __CLIENT_SETUP_H
/*****************************************************************************
* _ _ ____ _
* Project ___| | | | _ \| |


@@ -912,7 +912,8 @@ for(keys %run) {
}
if($total) {
print "$ok tests out of $total reported OK\n";
printf("$ok tests out of $total reported OK: %d%%\n",
$ok/$total*100);
if($ok != $total) {
print "These test cases failed: $failed\n";