                                  _   _ ____  _
                              ___| | | |  _ \| |
                             / __| | | | |_) | |
                            | (__| |_| |  _ <| |___
                              \___|\___/|_| \_\_____|

TODO
Ok, this is what I wanna do with Curl. Please tell me what you think, and
please don't hesitate to contribute and send me patches that improve this
product! (Yes, you may add things not mentioned here, these are just a
few teasers...)
To be done for the 7.7 release:
* Fix the random seeding. Add --egd-socket and --random-file options to the
curl client and libcurl curl_easy_setopt() interface.
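Just to sketch what the libcurl side could look like (the option names below
are only guesses that mirror the command line flags, nothing is decided yet):

    #include <curl/curl.h>

    CURL *curl = curl_easy_init();
    curl_easy_setopt(curl, CURLOPT_URL, "https://curl.haxx.se/");
    /* seed the SSL random engine from a file with entropy... */
    curl_easy_setopt(curl, CURLOPT_RANDOM_FILE, "/dev/urandom");
    /* ...or from an Entropy Gathering Daemon socket */
    curl_easy_setopt(curl, CURLOPT_EGDSOCKET, "/var/run/egd-pool");
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);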
* Support persistent connections (fully detailed elsewhere)
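From the application's point of view, one way it could end up looking is
simply re-using the same easy handle for several transfers and letting
libcurl keep the connection to the host open in between. A rough sketch,
nothing of this is implemented yet:

    CURL *curl = curl_easy_init();
    curl_easy_setopt(curl, CURLOPT_URL, "http://curl.haxx.se/docs/");
    curl_easy_perform(curl);    /* first transfer opens the connection */
    curl_easy_setopt(curl, CURLOPT_URL, "http://curl.haxx.se/download.html");
    curl_easy_perform(curl);    /* second transfer may re-use it */
    curl_easy_cleanup(curl);    /* the connection gets closed here */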
To be done after the 7.7 release:
* Make SSL session IDs get re-used when multiple HTTPS documents are
requested from the same host.
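On the OpenSSL level this basically means caching the session from the first
connection and offering it again on the next connection to the same host,
roughly like this (error handling and locking left out):

    #include <openssl/ssl.h>

    SSL_SESSION *cached = NULL;

    /* after the first completed handshake on 'ssl': */
    cached = SSL_get1_session(ssl);

    /* on a later connection 'ssl2' to the same host, before connecting: */
    if(cached)
      SSL_set_session(ssl2, cached);
    SSL_connect(ssl2);  /* may resume the old session and skip the full
                           handshake */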
* Add a command line option that allows the output file to get the same time
stamp as the remote file. libcurl is already capable of fetching the remote
file's date.
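Since the libcurl part is already there, the client work is mostly about
asking for the date and then stamping the file, something along these lines
(the output file name is of course just an example):

    #include <curl/curl.h>
    #include <utime.h>

    long filetime = -1;
    curl_easy_setopt(curl, CURLOPT_FILETIME, 1);  /* ask for the remote date */
    curl_easy_perform(curl);
    curl_easy_getinfo(curl, CURLINFO_FILETIME, &filetime);
    if(filetime >= 0) {
      struct utimbuf stamp;
      stamp.actime = stamp.modtime = (time_t)filetime;
      utime("output.html", &stamp);  /* give the local file the remote time */
    }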
* Make the SSL layer option capable of using the Mozilla Security Services as
an alternative to OpenSSL:
http://www.mozilla.org/projects/security/pki/nss/
* Add asynchronous name resolving, as this enables full timeout support for
fork() systems.
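One way to get there on systems with fork() could be to do the blocking
resolve in a child process and wait for the answer with a timeout, very
roughly like this (error handling and the actual address passing glossed
over):

    #include <sys/select.h>
    #include <unistd.h>
    #include <netdb.h>

    int pipefd[2];
    pipe(pipefd);
    if(fork() == 0) {
      /* child: do the blocking resolve and pass the address back */
      struct hostent *he = gethostbyname("curl.haxx.se");
      if(he)
        write(pipefd[1], he->h_addr_list[0], he->h_length);
      _exit(0);
    }
    else {
      /* parent: wait for the answer, but no longer than 30 seconds */
      fd_set rset;
      struct timeval timeout = { 30, 0 };
      FD_ZERO(&rset);
      FD_SET(pipefd[0], &rset);
      if(select(pipefd[0] + 1, &rset, NULL, NULL, &timeout) > 0) {
        char addr[16];
        read(pipefd[0], addr, sizeof(addr));  /* resolved in time */
      }
      /* else: the resolve timed out, give up */
    }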
* Move non-URL related functions that are used by both the lib and the curl
application to a separate "portability lib".
* Add support for languages other than C. C++ (rumours have been heard about
something being worked on in this area) and perl (we have seen the first
versions of this!) come to mind. Python anyone?
* "Content-Encoding: compress/gzip/zlib"
HTTP 1.1 clearly defines how to get and decode compressed documents. There
is the zlib that is pretty good at decompressing stuff. This work was
started in October 1999 but halted again since it proved more work than we
thought. It is still a good idea to implement though.
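The decoding end of it is not much more than a zlib inflate loop; a
bare-bones reminder of what that looks like (this handles the plain
deflate/zlib format, the gzip wrapper needs its header dealt with on top of
this, and inbuf/insize/outbuf just stand for a chunk of received body data
and an output buffer):

    #include <zlib.h>
    #include <string.h>

    z_stream z;
    int rc;
    memset(&z, 0, sizeof(z));
    inflateInit(&z);

    z.next_in = (Bytef *)inbuf;
    z.avail_in = insize;
    do {
      z.next_out = (Bytef *)outbuf;
      z.avail_out = sizeof(outbuf);
      rc = inflate(&z, Z_NO_FLUSH);
      if(rc != Z_OK && rc != Z_STREAM_END)
        break;  /* broken stream */
      /* pass on (sizeof(outbuf) - z.avail_out) decompressed bytes here */
    } while(rc == Z_OK && z.avail_out == 0);

    inflateEnd(&z);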
* Authentication: NTLM. It would be nice to support that MS crap called NTLM
authentication. MS proxies and servers sometimes require that. Since that
protocol is a proprietary one, it involves reverse engineering and network
sniffing. This should however be a library-based functionality. There are a
few different efforts "out there" to make open source HTTP clients support
this and it should be possible to take advantage of other people's hard
work. http://modntlm.sourceforge.net/ is one. There's a web page at
http://www.innovation.ch/java/ntlm.html that contains detailed reverse-
engineered info.
* RFC2617 compliance, "Digest Access Authentication"
A valid test page seems to exist at:
http://hopf.math.nwu.edu/testpage/digest/
And some friendly person's server source code is available at
http://hopf.math.nwu.edu/digestauth/index.html
Then there's the Apache mod_digest source code too of course. It seems as
if Netscape doesn't support this, and not many servers do, even though it is
a much better authentication method than the more common "Basic". Basic
sends the password in cleartext over the network, while the "Digest" method
uses a challenge-response protocol which increases security quite a lot.
* Other proxies
FTP-style proxies, SOCKS5, whatever other kinds of proxies are out there?
* IPv6 awareness and support. (This is partly done.) RFC 2428 "FTP
Extensions for IPv6 and NATs" is interesting. PORT should be replaced with
EPRT for IPv6 (done), and EPSV used instead of PASV. Support for IPv6
through HTTP proxies is still left to add.
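For reference, the RFC 2428 command shapes look like this (the address and
port numbers below are just the RFC's own example values):

    #include <stdio.h>

    char cmd[128];
    /* EPRT<space><d><net-prt><d><net-addr><d><tcp-port><d>, where net-prt is
       1 for IPv4 and 2 for IPv6, and <d> is a delimiter such as '|' */
    snprintf(cmd, sizeof(cmd), "EPRT |2|%s|%d|", "1080::8:800:200C:417A", 5282);
    /* => "EPRT |2|1080::8:800:200C:417A|5282|" */

    /* the passive counterpart is a plain "EPSV" command, which the server
       answers with something like:
       229 Entering Extended Passive Mode (|||6446|) */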
* SSL for more protocols, like SSL-FTP...
(http://search.ietf.org/internet-drafts/draft-murray-auth-ftp-ssl-05.txt)