Compare commits

...

41 Commits

Author SHA1 Message Date
Daniel Stenberg
887e728b7d cleanup-commit 2001-01-03 09:13:07 +00:00
Daniel Stenberg
c03e0074c6 ftp commands are now sent single-write() 2001-01-03 09:07:59 +00:00
Daniel Stenberg
0d12c56738 Added -i to allow ignore-patterns to get added 2001-01-03 08:35:16 +00:00
Daniel Stenberg
880208c5b2 only add good links as root links
don't break the loop on root link errors
2001-01-03 08:18:59 +00:00
Daniel Stenberg
f4acbed214 ftpsendf() is remade to send the entire command in one write(), as some
firewalls (like FW-1) seem to dislike split-up writes at times...
2000-12-30 13:12:30 +00:00
Daniel Stenberg
910fc8522a Added '5.4 Does libcurl do Winsock initing on win32 systems?' 2000-12-30 11:48:51 +00:00
Daniel Stenberg
6d90be0757 multi doc lib support
SSL session id support
2000-12-19 14:39:16 +00:00
Daniel Stenberg
3d8bb1c27a include unistd.h if present to prevent compiler warnings on close() 2000-12-19 13:35:23 +00:00
Daniel Stenberg
1c8121a89e removed debug output 2000-12-19 13:34:55 +00:00
Daniel Stenberg
0db48a8109 analyzes fopen() leaks as well 2000-12-19 13:32:30 +00:00
Daniel Stenberg
5594741acb Added fopen() and fclose() leak tracking 2000-12-19 13:23:54 +00:00
Daniel Stenberg
cbaeed7232 updated email and web site 2000-12-19 13:09:23 +00:00
Daniel Stenberg
6d7587d327 configure fix, two -O fixes 2000-12-19 13:08:04 +00:00
Daniel Stenberg
9ee94b3d84 fixed a leaked file descriptor when PORT failed 2000-12-19 09:06:36 +00:00
Daniel Stenberg
2c100371d2 NTLM details added 2000-12-19 07:30:51 +00:00
Daniel Stenberg
184ad46a27 fixed accept() for memory debugging 2000-12-18 16:13:37 +00:00
Daniel Stenberg
74d35416a2 changed the return code checker in the quote command send to only fail
on >= 400 errors
2000-12-16 10:36:08 +00:00
Daniel Stenberg
2fff6a4b0e Added Kermit and link 2000-12-16 10:25:10 +00:00
Daniel Stenberg
bf43b49a20 added socket() / sclose() checks to the memdebug system 2000-12-14 15:56:59 +00:00
Daniel Stenberg
6ad9bd8022 crawls through a whole site and verifies links 2000-12-14 12:19:57 +00:00
Daniel Stenberg
ec5ac82cfe How do I fetch multiple files with libcurl? 2000-12-14 08:37:09 +00:00
Daniel Stenberg
76ac228e44 added include stdio.h for the FILE 2000-12-14 08:34:47 +00:00
Daniel Stenberg
b0828267bc Added a few related RFCs 2000-12-12 13:10:45 +00:00
Daniel Stenberg
9c10cb4684 removed the config file entry as that has been much improved lately 2000-12-12 10:14:31 +00:00
Daniel Stenberg
3d8c4ce526 points to the curl local copy of the netscape cookie spec
points to the development site for wget
reworded some RFC references so that they turn up better as links on the
converted web page
2000-12-12 10:05:49 +00:00
Daniel Stenberg
ec420c62d9 fixed a strdup(NULL) error 2000-12-12 09:30:52 +00:00
Daniel Stenberg
5d44f00201 Francois Petitjean's solaris core dump fix 2000-12-12 08:48:39 +00:00
Daniel Stenberg
cddeb939ed updated the latest added features 2000-12-11 15:35:57 +00:00
Daniel Stenberg
45cf0cf3ec unix style newlines only 2000-12-11 08:16:25 +00:00
Daniel Stenberg
ff7729e2bc unix-style newlines 2000-12-11 08:15:22 +00:00
Daniel Stenberg
7dcda6a370 unix style newlines 2000-12-11 08:14:34 +00:00
Daniel Stenberg
dde277d86a Albert Chin-A-Young fixed the SSL option to use LDFLAGS properly 2000-12-11 07:38:47 +00:00
Daniel Stenberg
a5146c7297 fixed CURLOPT_COOKIE and added CURLOPT_CRLF 2000-12-08 17:25:24 +00:00
Daniel Stenberg
69abefc936 Added an #ifdef for SA_RESTART since (some) HPUX doesn't have that define and
doesn't need it
2000-12-07 09:09:26 +00:00
Daniel Stenberg
dad2317e6e post 7.5 fixes 2000-12-07 09:08:20 +00:00
Daniel Stenberg
22d8aa37e0 urlglob fix to prevent crashing when -o path is longer than url 2000-12-06 10:10:31 +00:00
Daniel Stenberg
160d2a30db Added the borland makefiles 2000-12-05 13:47:30 +00:00
Daniel Stenberg
cb1842cb52 uses the PERL variable configure digs up 2000-12-05 09:15:44 +00:00
Daniel Stenberg
6ced1ba615 changed third argument to size_t to match SCO prototype 2000-12-05 08:04:04 +00:00
Daniel Stenberg
54e46e199c Paul Marquis fixed a 7.4.2-dependency 2000-12-04 14:59:58 +00:00
Daniel Stenberg
ca8196a4dc Jörn fixed a multiple URL output bug 2000-12-04 12:21:18 +00:00
31 changed files with 1134 additions and 240 deletions

56
CHANGES

@@ -6,6 +6,62 @@
History of Changes
Daniel (30 December 2000)
- Made all FTP commands get sent with the trailing CRLF in one single write()
as splitting them up seems to confuse at least some firewalls (FW-1 being
one major).
Daniel (19 December 2000)
- Added file descriptor and FILE handle leak detection to the memdebug system
and thus I found and removed a file handle leak in the ftp parts.
- Added an include <stdio.h> in <curl/curl.h> since it uses FILE *.
Daniel (12 December 2000)
- Multiple URL downloads with -O were still buggy. Not anymore I think or
hope, or at least I've tried... :-O
- Francois Petitjean fixed another -O problem
Version 7.5.1
Daniel (11 December 2000)
- Cleaned up a few of the makefiles to use unix-style newlines only. As Kevin
P Roth found out, at least one CVS client behaved wrongly when it found
different newline conventions within the same file.
- Albert Chin-A-Young corrected the LDFLAGS use in the configure script for
the SSL stuff.
Daniel (6 December 2000)
- Massimo Squillace correctly described how libcurl could use session ids when
doing SSL connections.
- James Griffiths found out that curl would crash if the file you specify with
-o is longer than the URL! This took some hours to fully hunt down, but it
is fixed now.
Daniel (5 December 2000)
- Jaepil Kim sent us makefiles that build curl using the free windows borland
compiler. The root makefile now accepts 'make borland' to build curl with
that compiler.
- Stefan Radman pointed out that the test makefiles didn't use the PERL
variable that the configure scripts figure out. Actually, you still need
perl in the path for the test suite to run ok.
- Rich Gray found numerous portability problems:
* The SCO compiler got an error on the getpass_r() prototype in getpass.h
since the curl one differed from the SCO one
* The HPUX compiler got an error because of how curl did the sigaction
stuff and used a define HPUX doesn't have (or need).
* A few more problems remain to be researched.
- Paul Harrington experienced a core dump using https. Not many details yet.
Daniel (4 December 2000)
- Jörn Hartroth fixed a problem with multiple URLs and -o/-O.
Version 7.5
Daniel (1 December 2000)

3
FILES

@@ -53,6 +53,7 @@ src/*.in
src/*.am
src/mkhelp.pl
src/Makefile.vc6
src/Makefile.b32
src/*m32
lib/getdate.y
lib/*.[ch]
@@ -60,6 +61,8 @@ lib/*in
lib/*am
lib/Makefile.vc6
lib/*m32
lib/Makefile.b32
lib/Makefile.b32.resp
lib/libcurl.def
include/README
include/Makefile.in


@@ -42,13 +42,17 @@
############################################################################
all:
./configure
./configure
make
ssl:
./configure --with-ssl
make
borland:
cd lib; make -f Makefile.b32
cd src; make -f Makefile.b32
mingw32:
cd lib; make -f Makefile.m32
cd src; make -f Makefile.m32
@@ -58,17 +62,17 @@ mingw32-ssl:
cd src; make -f Makefile.m32 SSL=1
vc:
cd lib
cd lib
nmake -f Makefile.vc6
cd ..\src
cd ..\src
nmake -f Makefile.vc6
vc-ssl:
cd lib
nmake -f Makefile.vc6 release-ssl
cd ..\src
nmake -f Makefile.vc6
vc-ssl:
cd lib
nmake -f Makefile.vc6 release-ssl
cd ..\src
nmake -f Makefile.vc6
cygwin:
./configure
make


@@ -397,7 +397,8 @@ else
AC_MSG_RESULT([defaults (or given in environment)])
else
test X"$OPT_SSL" = Xyes && OPT_SSL=/usr/local/ssl
LIBS="$LIBS -L$OPT_SSL/lib"
dnl LIBS="$LIBS -L$OPT_SSL/lib"
LDFLAGS="$LDFLAGS -L$OPT_SSL/lib"
CPPFLAGS="$CPPFLAGS -I$OPT_SSL/include/openssl -I$OPT_SSL/include"
AC_MSG_RESULT([$OPT_SSL])
fi


@@ -1,4 +1,4 @@
Updated: November 22, 2000 (http://curl.haxx.se/docs/faq.shtml)
Updated: January 2, 2001 (http://curl.haxx.se/docs/faq.shtml)
_ _ ____ _
___| | | | _ \| |
/ __| | | | |_) | |
@@ -45,10 +45,13 @@ FAQ
4.6 Can you tell me what error code 142 means?
4.7 How do I keep usernames and passwords secret in Curl command lines?
4.8 I found a bug!
4.9 Curl can't authenticate to a server that requires NTLM?
5. libcurl Issues
5.1 Is libcurl thread safe?
5.2 How can I receive all data into a large memory chunk?
5.3 How do I fetch multiple files with libcurl?
5.4 Does libcurl do Winsock initing on win32 systems?
6. License Issues
6.1 I have a GPL program, can I use the libcurl library?
@@ -72,6 +75,8 @@ FAQ
Curl supports a range of common internet protocols, currently including
HTTP, HTTPS, FTP, GOPHER, LDAP, DICT and FILE.
Please spell it cURL or just curl.
1.2 What is libcurl?
libcurl is the engine inside curl that does all the work. curl is more or
@@ -219,7 +224,7 @@ FAQ
Curl supports resume in both directions on FTP, but only for downloads on HTTP.
Try the -c and -C options.
Try the -C option.
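For example, to continue a download at byte offset 1000 and write the rest to
stdout (the offset here is just for illustration):

  curl -C 1000 http://curl.haxx.se/docs/faq.shtml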
3.3. Why doesn't my posting using -F work?
@@ -356,9 +361,9 @@ FAQ
4.6. Can you tell me what error code 142 means?
All error codes that are larger than the highest documented error code mean
that curl has exited due to a timeout. There is currently no nice way for
curl to abort from such a condition and that's why it gets this undocumented
error. This should be changed in releases after 7.4.1.
that curl has exited due to a timeout. There was no nice way for curl to
abort from such a condition and that's why it got this undocumented
error. This should not occur in releases after 7.4.1.
4.7. How do I keep usernames and passwords secret in Curl command lines?
@@ -374,6 +379,10 @@ FAQ
at least hide them from being read by human eyes, but that is not what
anyone would call security.
Also note that regular HTTP and FTP passwords are sent in the clear across the
network. All it takes for anyone to fetch them is to listen on the network.
Eavesdropping is very easy.
4.8 I found a bug!
It is not a bug if the behaviour is documented. Read the docs first.
@@ -390,6 +399,10 @@ FAQ
operating system name and version and complete instructions how to repeat
the bug.
4.9. Curl can't authenticate to a server that requires NTLM?
NTLM is a Microsoft proprietary protocol. Unfortunately, curl does not
currently support that.
5. libcurl Issues
@@ -436,6 +449,21 @@ FAQ
return realsize;
}
5.3 How do I fetch multiple files with libcurl?
The easy interface of libcurl does not support multiple requests using the
same connection. The only available way to do multiple requests is to
init/perform/cleanup for each request.
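A minimal sketch of that pattern, with invented URLs (output goes to stdout):

  #include <curl/curl.h>

  int main(void)
  {
    char *urls[] = { "http://curl.haxx.se/", "ftp://ftp.sunet.se/" };
    int i;

    for(i = 0; i < 2; i++) {
      CURL *curl = curl_easy_init(); /* fresh handle for each request */
      if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, urls[i]);
        curl_easy_perform(curl);     /* one complete transfer */
        curl_easy_cleanup(curl);     /* no connection re-use */
      }
    }
    return 0;
  }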
5.4 Does libcurl do Winsock initing on win32 systems?
No.
On win32 systems, you need to init the winsock stuff manually; libcurl will
not do that for you. WSAStartup() and WSACleanup() should be used
accordingly. The reason for this is of course that a single application may
use several different libraries and parts, and there's no reason for every
single library to do this.
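A minimal sketch of that manual setup (error handling kept short):

  #include <winsock.h>
  #include <curl/curl.h>

  int main(void)
  {
    WSADATA wsaData;
    CURL *curl;

    /* init winsock before libcurl touches the network */
    if(WSAStartup(MAKEWORD(1, 1), &wsaData) != 0)
      return 1;

    curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "http://curl.haxx.se/");
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }

    WSACleanup(); /* match the WSAStartup() call */
    return 0;
  }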
6. License Issues


@@ -15,9 +15,11 @@ Misc
- guesses protocol from host name unless specified
- uses .netrc
- progress bar/time specs while downloading
- PROXY environment variables support
- "standard" proxy environment variables support
- config file support
- compiles on win32
- redirectable stderr
- use selected network interface for outgoing traffic
HTTP
- GET
@@ -28,8 +30,9 @@ HTTP
- authentication
- resume
- follow redirects
- maximum number of redirects to follow
- custom HTTP request
- cookie get/send
- cookie get/send fully parsed
- understands the netscape cookie file format
- custom headers (that can replace/remove internally generated headers)
- custom user-agent string
@@ -38,11 +41,12 @@ HTTP
- proxy authentication
- time conditions
- via http-proxy
- specify interface device/port
- retrieve file modification date
HTTPS (*1)
- (all the HTTP features)
- using certificates
- verify server certificate
- via http-proxy
FTP
@@ -63,6 +67,7 @@ FTP
- simple "range" support
- via http-proxy
- all operations can be tunneled through a http-proxy
- customizable to retrieve file modification date
TELNET
- connection negotiation


@@ -15,6 +15,8 @@ Standards
RFC 959 - Defines how FTP works
RFC 1635 - How to Use Anonymous FTP
RFC 1738 - Uniform Resource Locators
RFC 1777 - defines the LDAP protocol
@@ -38,51 +40,59 @@ Standards
RFC 2109 - HTTP State Management Mechanism (cookie stuff)
- Also, read Netscape's specification at
http://www.netscape.com/newsref/std/cookie_spec.html
http://curl.haxx.se/rfc/cookie_spec.html
RFC 2183 - "The Content-Disposition Header Field"
RFC 2183 - The Content-Disposition Header Field
RFC 2229 - "A Dictionary Server Protocol"
RFC 2229 - A Dictionary Server Protocol
RFC 2255 - Newer LDAP URL syntax document.
RFC 2231 - "MIME Parameter Value and Encoded Word Extensions:
Character Sets, Languages, and Continuations"
RFC 2231 - MIME Parameter Value and Encoded Word Extensions:
Character Sets, Languages, and Continuations
RFC 2388 - "Returning Values from Forms: multipart/form-data"
Use this as an addition to the 1867
Use this as an addition to the RFC1867
RFC 2396 - "Uniform Resource Identifiers: Generic Syntax and Semantics" This
one obsoletes 1738, but since 1738 is often mentioned I've left
it in this list.
one obsoletes RFC 1738, but since RFC 1738 is often mentioned
I've left it in this list.
RFC 2428 - "FTP Extensions for IPv6 and NATs"
RFC 2428 - FTP Extensions for IPv6 and NATs
RFC 2577 - FTP Security Considerations
RFC 2616 - HTTP 1.1, the latest
RFC 2617 - HTTP Authentication
RFC 2718 - "Guidelines for new URL Schemes"
RFC 2732 - "Format for Literal IPv6 Addresses in URL's"
RFC 2718 - Guidelines for new URL Schemes
RFC 2732 - Format for Literal IPv6 Addresses in URL's
RFC 2818 - HTTP Over TLS (TLS is the successor to SSL)
RFC 2964 - Use of HTTP State Management
RFC 2965 - HTTP State Management Mechanism. Cookies. Obsoletes RFC2109
Compilers
---------
MingW32 - http://www.mingw.org
MingW32 - http://www.mingw.org/
gcc - http://www.gnu.org/software/gcc/gcc.html
Software
--------
OpenSSL - http://www.openssl.org
OpenSSL - http://www.openssl.org/
OpenLDAP - http://www.openldap.org
OpenLDAP - http://www.openldap.org/
zlib - http://www.cdrom.com/pub/infozip/zlib/
Similar Tools
-------------
wget - http://www.gnu.org/software/wget/wget.html
wget - http://sunsite.dk/wget/
snarf - http://www.xach.com/snarf/
@@ -90,6 +100,8 @@ Similar Tools
swebget - http://www.uni-hildesheim.de/~smol0075/swebget/
Kermit - http://www.columbia.edu/kermit/ftpclient/
Related Software
----------------
ftpparse - http://cr.yp.to/ftpparse.html parses FTP LIST responses


@@ -13,6 +13,9 @@ For the future
product! (Yes, you may add things not mentioned here, these are just a
few teasers...)
* Make SSL session ids get used if multiple HTTPS documents from the same
host are requested.
* Improve the command line option parser to accept '-m300' as well as the '-m
300' convention. It should treat '-m300' the same as the space-separated
'-m 300'.
@@ -42,6 +45,9 @@ For the future
* Make sure the low-level interface works. highlevel.c should basically be
possible to write using that interface. Document the low-level interface
* Make the easy-interface support multiple file transfers. If they're done
to the same host, they should use persistent connections or similar.
* Add asynchronous name resolving, as this enables full timeout support for
fork() systems.
@@ -52,39 +58,6 @@ For the future
something being worked on in this area) and perl (we have seen the first
versions of this!) comes to mind. Python anyone?
* Improve the -K config file parser (the parameter following the flag should
be possible to get specified *exactly* as it is done on a shell command
line).
Alternatively, and preferably, we rewrite the entire config file to become
a true config file that uses its own format instead of the currently
crippled and stupid format:
[option] = [value]
Where [option] would be the same as the --long-option and [value] would
either be 'on/off/true/false' for booleans or a plain value for [option]s
that accept variable input (such as -d, -o, -H, -d, -F etc).
[value] could be written as plain text, and then the initial and trailing
white spaces would be stripped off, or it can be specified within quotes
and then all white spaces within the quotes will count.
[value] could then be made to accept some format to specify an environment
variable. I could even think of supporting
[option] += [value]
for appending stuff to an option.
As has been suggested, ${name} could be used to read environment variables
and possibly other options. That could then be used instead of += operators
like:
bar = "foo ${bar}"
* rtsp:// support -- "Real Time Streaming Protocol" (RFC 2326)
* "Content-Encoding: compress/gzip/zlib"
HTTP 1.1 clearly defines how to get and decode compressed documents. There
@@ -98,7 +71,7 @@ For the future
sniffing. This should however be a library-based functionality. There are a
few different efforts "out there" to make open source HTTP clients support
this and it should be possible to take advantage of other people's hard
work.
work. http://modntlm.sourceforge.net/ is one.
* RFC2617 compliance, "Digest Access Authentication"
A valid test page seems to exist at:


@@ -237,7 +237,7 @@ want the transfer to start from.
.B CURLOPT_COOKIE
Pass a pointer to a zero terminated string as parameter. It will be used to
set a cookie in the http request. The format of the string should be
'[NAME]=[CONTENTS];' Where NAME is the cookie name.
[NAME]=[CONTENTS]; Where NAME is the cookie name.
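For example, given an initialized easy handle (cookie names and contents
invented):

  curl_easy_setopt(curl, CURLOPT_COOKIE, "name=daniel; tool=curl;");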
.TP
.B CURLOPT_HTTPHEADER
Pass a pointer to a linked list of HTTP headers to pass to the server in your
@@ -267,7 +267,7 @@ the password required to use the CURLOPT_SSLCERT certificate. If the password
is not supplied, you will be prompted for it.
.TP
.B CURLOPT_CRLF
TBD.
Convert unix newlines to CRLF newlines on FTP uploads.
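A sketch, assuming a non-zero long value enables the conversion:

  curl_easy_setopt(curl, CURLOPT_CRLF, 1);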
.TP
.B CURLOPT_QUOTE
Pass a pointer to a linked list of FTP commands to pass to the server prior to


@@ -26,9 +26,9 @@
*
* ------------------------------------------------------------
* Main author:
* - Daniel Stenberg <Daniel.Stenberg@haxx.nu>
* - Daniel Stenberg <daniel@haxx.se>
*
* http://curl.haxx.nu
* http://curl.haxx.se
*
* $Source$
* $Revision$
@@ -40,6 +40,7 @@
* ------------------------------------------------------------
****************************************************************************/
#include <stdio.h>
/* The include stuff here is mainly for time_t! */
#ifdef vms
# include <types.h>
@@ -470,8 +471,8 @@ char *curl_getenv(char *variable);
char *curl_version(void);
/* This is the version number */
#define LIBCURL_VERSION "7.5"
#define LIBCURL_VERSION_NUM 0x070500
#define LIBCURL_VERSION "7.5.2-pre1"
#define LIBCURL_VERSION_NUM 0x070502
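Since the number packs one byte per version part (0x070502 is 7.5.2),
applications can compare it numerically at compile time; a minimal sketch:

  #include <curl/curl.h>

  #if LIBCURL_VERSION_NUM >= 0x070502
    /* safe to rely on 7.5.2 behaviour here */
  #endif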
/* linked-list structure for the CURLOPT_QUOTE option (and other) */
struct curl_slist {


@@ -26,9 +26,9 @@
*
* ------------------------------------------------------------
* Main author:
* - Daniel Stenberg <Daniel.Stenberg@haxx.nu>
* - Daniel Stenberg <daniel@haxx.se>
*
* http://curl.haxx.nu
* http://curl.haxx.se
*
* $Source$
* $Revision$

75
lib/Makefile.b32 Normal file

@@ -0,0 +1,75 @@
############################################################
# Makefile.b32 - Borland's C++ Compiler 5.X
#
# 'lib' directory
#
# Requires 'Makefile.b32.resp'
#
# Written by Jaepil Kim, pit@paradise.net.nz
############################################################
# Setup environment
CXX = bcc32
RM = del
LIB = tlib
TOPDIR = ..
CURNTDIR = .
CXXFLAGS = -5 -O2 -w-aus -w-ccc -w-csu -w-par -w-pia -w-rch -w-inl -w-ngu -w-pro
DEFINES = -DLIBCURL_BIGENDIAN=0 -DNDEBUG -DWIN32 -DCONSOLE -DMBCS
INCDIRS = -I$(CURNTDIR);$(TOPDIR)/include/
# 'BCCDIR' has to be set up in your c:\autoexec.bat
# i.e. SET BCCDIR = c:\Borland\BCC55
# where c:\Borland\BCC55 is where the compiler is installed
LINKLIB = $(BCCDIR)/lib/psdk/wsock32.lib
LIBCURLLIB = libcurl.lib
.SUFFIXES: .c
SOURCES = \
base64.c \
cookie.c \
download.c \
escape.c \
formdata.c \
ftp.c \
http.c \
ldap.c \
dict.c \
telnet.c \
getdate.c \
getenv.c \
getpass.c \
hostip.c \
if2ip.c \
mprintf.c \
netrc.c \
progress.c \
sendf.c \
speedcheck.c \
ssluse.c \
timeval.c \
url.c \
file.c \
getinfo.c \
version.c \
easy.c \
highlevel.c \
strequal.c
OBJECTS = $(SOURCES:.c=.obj)
.c.obj:
$(CXX) -c $(INCDIRS) $(CXXFLAGS) $(DEFINES) $<
all: $(LIBCURLLIB)
clean:
$(RM) $(LIBCURLLIB)
$(RM) *.obj
$(LIBCURLLIB): $(LINKLIB) $(OBJECTS) Makefile.b32.resp
$(RM) $(LIBCURLLIB)
$(LIB) $(LIBCURLLIB) @Makefile.b32.resp

29
lib/Makefile.b32.resp Normal file

@@ -0,0 +1,29 @@
+base64.obj &
+cookie.obj &
+download.obj &
+escape.obj &
+formdata.obj &
+ftp.obj &
+http.obj &
+ldap.obj &
+dict.obj &
+telnet.obj &
+getdate.obj &
+getenv.obj &
+getpass.obj &
+hostip.obj &
+if2ip.obj &
+mprintf.obj &
+netrc.obj &
+progress.obj &
+sendf.obj &
+speedcheck.obj &
+ssluse.obj &
+timeval.obj &
+url.obj &
+file.obj &
+getinfo.obj &
+version.obj &
+easy.obj &
+highlevel.obj &
+strequal.obj


@@ -4,9 +4,9 @@
## (default is release)
##
## Comments to: Troy Engel <tengel@sonic.net>
## Updated by: Craig Davison <cd@securityfocus.com>
## Updated by: Craig Davison <cd@securityfocus.com>
PROGRAM_NAME = libcurl.lib
PROGRAM_NAME = libcurl.lib
PROGRAM_NAME_DEBUG = libcurld.lib
OPENSSL_PATH = ../../openssl-0.9.6
@@ -122,7 +122,7 @@ RELEASE_SSL_OBJS= \
easyrs.obj \
highlevelrs.obj \
strequalrs.obj
LINK_OBJS= \
base64.obj \
cookie.obj \
@@ -345,8 +345,8 @@ highlevelrs.obj: highlevel.c
$(CCRS) $(CFLAGS) highlevel.c
strequalrs.obj: strequal.c
$(CCRS) $(CFLAGS) strequal.c
clean:
-@erase *.obj
-@erase vc60.idb


@@ -197,6 +197,8 @@ static CURLcode AllowServerConnect(struct UrlData *data,
getsockname(sock, (struct sockaddr *) &add, (int *)&size);
s=accept(sock, (struct sockaddr *) &add, (int *)&size);
sclose(sock); /* close the first socket */
if( -1 == s) {
/* DIE! */
failf(data, "Error accept()ing server connect");
@@ -535,7 +537,7 @@ CURLcode ftp_done(struct connectdata *conn)
if(nread < 0)
return CURLE_OPERATION_TIMEOUTED;
if (buf[0] != '2') {
if (ftpcode >= 400) {
failf(data, "QUOT string not accepted: %s",
qitem->data);
return CURLE_FTP_QUOTE_ERROR;
@@ -589,7 +591,7 @@ CURLcode _ftp(struct connectdata *conn)
if(nread < 0)
return CURLE_OPERATION_TIMEOUTED;
if (buf[0] != '2') {
if (ftpcode >= 400) {
failf(data, "QUOT string not accepted: %s",
qitem->data);
return CURLE_FTP_QUOTE_ERROR;
@@ -731,6 +733,12 @@ CURLcode _ftp(struct connectdata *conn)
if ( h ) {
if( (portsock = socket(AF_INET, SOCK_STREAM, 0)) >= 0 ) {
/* we set the secondary socket variable to this for now, it
is only so that the cleanup function will close it in case
we fail before the true secondary stuff is made */
data->secondarysocket = portsock;
memset((char *)&sa, 0, sizeof(sa));
memcpy((char *)&sa.sin_addr,
h->h_addr,


@@ -71,7 +71,7 @@
# define perror(x) fprintf(stderr, "Error in: %s\n", x)
#endif
char *getpass_r(const char *prompt, char *buffer, int buflen)
char *getpass_r(const char *prompt, char *buffer, size_t buflen)
{
FILE *infp;
FILE *outfp;


@@ -3,6 +3,6 @@
/*
* Returning NULL will abort the continued operation!
*/
char* getpass_r(char *prompt, char* buffer, int buflen );
char* getpass_r(char *prompt, char* buffer, size_t buflen );
#endif


@@ -43,6 +43,14 @@
#include <curl/curl.h>
#if defined(WIN32) && !defined(__GNUC__) || defined(__MINGW32__)
#include <winsock.h>
#else /* some kind of unix */
#ifdef HAVE_SYS_SOCKET_H
#include <sys/socket.h>
#endif
#endif
#define _MPRINTF_REPLACE
#include <curl/mprintf.h>
#include "urldata.h"
@@ -50,6 +58,12 @@
#include <string.h>
#include <stdlib.h>
#ifdef HAVE_UNISTD_H
#include <unistd.h>
#endif
/* DONT include memdebug.h here! */
/*
* Note that these debug functions are very simple and they are meant to
* remain so. For advanced analysis, record a log file and write perl scripts
@@ -115,4 +129,46 @@ void curl_dofree(void *ptr, int line, char *source)
source, line, ptr);
}
int curl_socket(int domain, int type, int protocol, int line, char *source)
{
int sockfd=(socket)(domain, type, protocol);
fprintf(logfile?logfile:stderr, "FD %s:%d socket() = %d\n",
source, line, sockfd);
return sockfd;
}
int curl_accept(int s, struct sockaddr *addr, int *addrlen,
int line, char *source)
{
int sockfd=(accept)(s, addr, addrlen);
fprintf(logfile?logfile:stderr, "FD %s:%d accept() = %d\n",
source, line, sockfd);
return sockfd;
}
/* this is our own defined way to close sockets on *ALL* platforms */
int curl_sclose(int sockfd, int line, char *source)
{
int res=sclose(sockfd);
fprintf(logfile?logfile:stderr, "FD %s:%d sclose(%d)\n",
source, line, sockfd);
return res; /* return the sclose() result, not the descriptor */
}
FILE *curl_fopen(char *file, char *mode, int line, char *source)
{
FILE *res=(fopen)(file, mode);
fprintf(logfile?logfile:stderr, "FILE %s:%d fopen(\"%s\") = %p\n",
source, line, file, res);
return res;
}
int curl_fclose(FILE *file, int line, char *source)
{
int res=(fclose)(file);
fprintf(logfile?logfile:stderr, "FILE %s:%d fclose(%p)\n",
source, line, file);
return res;
}
#endif /* MALLOCDEBUG */


@@ -1,13 +1,42 @@
#ifdef MALLOCDEBUG
#include <sys/socket.h>
#include <stdio.h>
/* memory functions */
void *curl_domalloc(size_t size, int line, char *source);
void *curl_dorealloc(void *ptr, size_t size, int line, char *source);
void curl_dofree(void *ptr, int line, char *source);
char *curl_dostrdup(char *str, int line, char *source);
void curl_memdebug(char *logname);
/* file descriptor manipulators */
int curl_socket(int domain, int type, int protocol, int, char *);
int curl_sclose(int sockfd, int, char *);
int curl_accept(int s, struct sockaddr *addr, int *addrlen,
int line, char *source);
/* FILE functions */
FILE *curl_fopen(char *file, char *mode, int line, char *source);
int curl_fclose(FILE *file, int line, char *source);
/* Set this symbol on the command-line, recompile all lib-sources */
#define strdup(ptr) curl_dostrdup(ptr, __LINE__, __FILE__)
#define malloc(size) curl_domalloc(size, __LINE__, __FILE__)
#define realloc(ptr,size) curl_dorealloc(ptr, size, __LINE__, __FILE__)
#define free(ptr) curl_dofree(ptr, __LINE__, __FILE__)
#define socket(domain,type,protocol)\
curl_socket(domain,type,protocol,__LINE__,__FILE__)
#define accept(sock,addr,len)\
curl_accept(sock,addr,len,__LINE__,__FILE__)
/* sclose is probably already defined, redefine it! */
#undef sclose
#define sclose(sockfd) curl_sclose(sockfd,__LINE__,__FILE__)
#undef fopen
#define fopen(file,mode) curl_fopen(file,mode,__LINE__,__FILE__)
#define fclose(file) curl_fclose(file,__LINE__,__FILE__)
#endif


@@ -60,8 +60,8 @@
#ifdef KRB4
#include "security.h"
#include <string.h>
#endif
#include <string.h>
/* The last #include file should be: */
#ifdef MALLOCDEBUG
#include "memdebug.h"
@@ -123,37 +123,39 @@ size_t sendf(int fd, struct UrlData *data, char *fmt, ...)
/*
* ftpsendf() sends the formated string as a ftp command to a ftp server
*
* NOTE: we build the command in a fixed-length buffer, which sets length
* restrictions on the command!
*
*/
size_t ftpsendf(int fd, struct connectdata *conn, char *fmt, ...)
{
size_t bytes_written;
char *s;
char s[256];
va_list ap;
va_start(ap, fmt);
s = mvaprintf(fmt, ap);
vsnprintf(s, 250, fmt, ap);
va_end(ap);
if(!s)
return 0; /* failure */
if(conn->data->bits.verbose)
fprintf(conn->data->err, "> %s\n", s);
strcat(s, "\r\n"); /* append a trailing CRLF */
#ifdef KRB4
if(conn->sec_complete && conn->data->cmdchannel) {
bytes_written = sec_fprintf(conn, conn->data->cmdchannel, s);
bytes_written += fprintf(conn->data->cmdchannel, "\r\n");
fflush(conn->data->cmdchannel);
}
else
#endif /* KRB4 */
{
bytes_written = swrite(fd, s, strlen(s));
bytes_written += swrite(fd, "\r\n", 2);
}
free(s); /* free the output string */
return(bytes_written);
}
/* ssend() sends plain (binary) data to the server */
size_t ssend(int fd, struct connectdata *conn, void *mem, size_t len)
{


@@ -723,7 +723,10 @@ static CURLcode _connect(CURL *curl, CURLconnect **in_connect)
#ifdef HAVE_SIGACTION
sigaction(SIGALRM, NULL, &sigact);
sigact.sa_handler = alarmfunc;
#ifdef SA_RESTART
/* HPUX doesn't have SA_RESTART but defaults to that behaviour! */
sigact.sa_flags &= ~SA_RESTART;
#endif
sigaction(SIGALRM, &sigact, NULL);
#else
/* no sigaction(), revert to the much lamer signal() */


@@ -72,6 +72,61 @@ while(<STDIN>) {
print "Not recognized input line: $function\n";
}
}
# FD url.c:1282 socket() = 5
elsif($_ =~ /^FD ([^:]*):(\d*) (.*)/) {
# generic match for the filename+linenumber
$source = $1;
$linenum = $2;
$function = $3;
if($function =~ /socket\(\) = (\d*)/) {
$filedes{$1}=1;
$getfile{$1}="$source:$linenum";
$openfile++;
}
elsif($function =~ /accept\(\) = (\d*)/) {
$filedes{$1}=1;
$getfile{$1}="$source:$linenum";
$openfile++;
}
elsif($function =~ /sclose\((\d*)\)/) {
if($filedes{$1} != 1) {
print "Close without open: $line\n";
}
else {
$filedes{$1}=0; # closed now
$openfile--;
}
}
}
# FILE url.c:1282 fopen("blabla") = 0x5ddd
elsif($_ =~ /^FILE ([^:]*):(\d*) (.*)/) {
# generic match for the filename+linenumber
$source = $1;
$linenum = $2;
$function = $3;
if($function =~ /fopen\(\"([^\"]*)\"\) = (\(nil\)|0x([0-9a-f]*))/) {
if($2 eq "(nil)") {
;
}
else {
$fopen{$3}=1;
$fopenfile{$3}="$source:$linenum";
$fopens++;
}
}
# fclose(0x1026c8)
elsif($function =~ /fclose\(0x([0-9a-f]*)\)/) {
if(!$fopen{$1}) {
print "fclose() without fopen(): $line\n";
}
else {
$fopen{$1}=0;
$fopens--;
}
}
}
else {
print "Not recognized prefix line: $line\n";
}
@@ -93,3 +148,19 @@ if($totalmem) {
}
}
if($openfile) {
for(keys %filedes) {
if($filedes{$_} == 1) {
print "Open file descriptor created at ".$getfile{$_}."\n";
}
}
}
if($fopens) {
print "Open FILE handles left at:\n";
for(keys %fopen) {
if($fopen{$_} == 1) {
print "fopen() called at ".$fopenfile{$_}."\n";
}
}
}


@@ -31,7 +31,7 @@ Authors:
%prep
%setup -n curl-7.4.2
%setup -n %{name}-%{version}
%build

443
perl/crawlink.pl Executable file

@@ -0,0 +1,443 @@
#!/usr/bin/perl
#
# crawlink.pl
#
# This script crawls across all found links below the given "root" URL.
# It reports all good and bad links to stdout. This code was based on the
# checklink.pl script I wrote ages ago.
#
# Written to use 'curl' for URL checking.
#
# Author: Daniel Stenberg <daniel@haxx.se>
# Version: 0.3 Jan 3, 2001
#
# HISTORY
#
# 0.3 - The -i now adds regexes that if a full URL link matches one of those,
# it is not followed. This can then be used to prevent this script from
# following '.*\.cgi', specific pages or whatever.
#
# 0.2 - Made it only HEAD non-HTML files (i.e. skip the GET). Makes it a lot
# faster to skip large non-HTML files such as pdfs or big RFCs! ;-)
# Added a -c option that allows me to pass options to curl.
#
# 0.1 - The given url works as the root. This script will only continue
# and check other URLs if the leftmost part of the new URL is identical
# to the root URL.
#
use strict;
my $in="";
my $verbose=0;
my $usestdin;
my $linenumber;
my $help;
my $external;
my $curlopts;
my @ignorelist;
argv:
if($ARGV[0] eq "-v" ) {
$verbose++;
shift @ARGV;
goto argv;
}
elsif($ARGV[0] eq "-c" ) {
$curlopts=$ARGV[1];
shift @ARGV;
shift @ARGV;
goto argv;
}
elsif($ARGV[0] eq "-i" ) {
push @ignorelist, $ARGV[1];
shift @ARGV;
shift @ARGV;
goto argv;
}
elsif($ARGV[0] eq "-l" ) {
$linenumber = 1;
shift @ARGV;
goto argv;
}
elsif($ARGV[0] eq "-h" ) {
$help = 1;
shift @ARGV;
goto argv;
}
elsif($ARGV[0] eq "-x" ) {
$external = 1;
shift @ARGV;
goto argv;
}
my $geturl = $ARGV[0];
my $firsturl= $geturl;
#
# Define a hash array to hold all root URLs to visit/we have visited
#
my %rooturls;
$rooturls{$ARGV[0]}=1;
if(($geturl eq "") || $help) {
print "Usage: $0 [-hilvx] <full URL>\n",
" Use a traling slash for directory URLs!\n",
" -c [data] Pass [data] as argument to every curl invoke\n",
" -h This help text\n",
" -i [regex] Ignore root links that match this pattern\n",
" -l Line number report for BAD links\n",
" -v Verbose mode\n",
" -x Check non-local (external?) links only\n";
exit;
}
my $proxy;
if($curlopts ne "") {
$proxy=" $curlopts";
#$proxy =" -x 194.237.142.41:80";
}
# linkchecker, URL will be appended to the right of this command line
# this is the one using HEAD:
my $linkcheck = "curl -s -m 20 -I$proxy";
# as a second attempt, this will be used. This is not using HEAD but will
# get the whole frigging document!
my $linkcheckfull = "curl -s -m 20 -i$proxy";
# htmlget, URL will be appended to the right of this command line
my $htmlget = "curl -s$proxy";
# Parse the input URL and split it into the relevant parts:
my $getprotocol;
my $getserver;
my $getpath;
my $getdocument;
my %done;
my %tagtype;
my $allcount=0;
my $badlinks=0;
sub SplitURL {
my $inurl = $_[0];
if($inurl=~ /^([^:]+):\/\/([^\/]*)\/(.*)\/(.*)/ ) {
$getprotocol = $1;
$getserver = $2;
$getpath = $3;
$getdocument = $4;
}
elsif ($inurl=~ /^([^:]+):\/\/([^\/]*)\/(.*)/ ) {
$getprotocol = $1;
$getserver = $2;
$getpath = $3;
$getdocument = "";
if($getpath !~ /\//) {
$getpath ="";
$getdocument = $3;
}
}
elsif ($inurl=~ /^([^:]+):\/\/(.*)/ ) {
$getprotocol = $1;
$getserver = $2;
$getpath = "";
$getdocument = "";
}
else {
print "Couldn't parse the specified URL, retry please!\n";
exit;
}
}
my @indoc;
sub GetRootPage {
my $geturl = $_[0];
my $in="";
my $code=200;
my $type="text/plain";
my $pagemoved=0;
open(HEADGET, "$linkcheck $geturl|") ||
die "Couldn't get web page for some reason";
while(<HEADGET>) {
#print STDERR $_;
if($_ =~ /HTTP\/1\.[01] (\d\d\d) /) {
$code=$1;
if($code =~ /^3/) {
$pagemoved=1;
}
}
elsif($_ =~ /^Content-Type: ([\/a-zA-Z]+)/) {
$type=$1;
}
elsif($pagemoved &&
($_ =~ /^Location: (.*)/)) {
$geturl = $1;
&SplitURL($geturl);
$pagemoved++;
last;
}
}
close(HEADGET);
if($pagemoved == 1) {
print "Page is moved but we don't know where. Did you forget the ",
"traling slash?\n";
exit;
}
if($type ne "text/html") {
# there's no point in getting anything but HTML
$in="";
}
else {
open(WEBGET, "$htmlget $geturl|") ||
die "Couldn't get web page for some reason";
while(<WEBGET>) {
my $line = $_;
push @indoc, $line;
$line=~ s/\n/ /g;
$line=~ s/\r//g;
$in=$in.$line;
}
close(WEBGET);
}
return ($in, $code, $type);
}
sub LinkWorks {
my $check = $_[0];
# URL encode:
# $check =~s/([^a-zA-Z0-9_:\/.-])/uc sprintf("%%%02x",ord($1))/eg;
my @doc = `$linkcheck \"$check\"`;
my $head = 1;
# print "COMMAND: $linkcheck \"$check\"\n";
# print $doc[0]."\n";
boo:
if( $doc[0] =~ /^HTTP[^ ]+ (\d+)/ ) {
my $error = $1;
if($error < 400 ) {
return "GOOD";
}
else {
if($head && ($error >= 500)) {
# This server doesn't like HEAD!
@doc = `$linkcheckfull \"$check\"`;
$head = 0;
goto boo;
}
return "BAD";
}
}
return "BAD";
}
sub GetLinks {
my $in = $_[0];
my @result;
while($in =~ /[^<]*(<[^>]+>)/g ) {
# we have a tag in $1
my $tag = $1;
if($tag =~ /^<!--/) {
# this is a comment tag, ignore it
}
else {
if($tag =~ /(src|href|background|archive) *= *(\"[^\"]\"|[^ \)>]*)/i) {
my $url=$2;
if($url =~ /^\"(.*)\"$/) {
# this was a "string" now $1 has removed the quotes:
$url=$1;
}
$url =~ s/([^\#]*)\#.*/$1/g;
if($url eq "") {
# if the link was nothing but a #-link it may now have
# been emptied completely so then we skip the rest
next;
}
if($done{$url}) {
# if this url already is done, do next
$done{$url}++;
if($verbose) {
print " FOUND $url but that is already checked\n";
}
next;
}
$done{$url} = 1; # this is "done"
push @result, $url;
if($tag =~ /< *([^ ]+)/) {
$tagtype{$url}=$1;
}
}
}
}
return @result;
}
while(1) {
$geturl=-1;
for(keys %rooturls) {
if($rooturls{$_} == 1) {
if($_ !~ /^$firsturl/) {
$rooturls{$_} += 1000; # don't do this, outside our scope
if($verbose) {
print "SKIP: $_\n";
}
next;
}
$geturl=$_;
last;
}
}
if($geturl == -1) {
last;
}
#
# Splits the URL in its different parts
#
&SplitURL($geturl);
#
# Returns the full HTML of the root page
#
my ($in, $error, $ctype) = &GetRootPage($geturl);
$rooturls{$geturl}++; # increase to prove we have already got it
if($ctype ne "text/html") {
# this is not HTML, we skip this
if($verbose == 2) {
print "Non-HTML link, skipping\n";
next;
}
}
if($error >= 400) {
print "ROOT page $geturl returned $error\n";
next;
}
print " ==== $geturl ====\n";
if($verbose == 2) {
printf("Error code $error, Content-Type: $ctype, got %d bytes\n",
length($in));
}
#print "protocol = $getprotocol\n";
#print "server = $getserver\n";
#print "path = $getpath\n";
#print "document = $getdocument\n";
#exit;
#
# Extracts all links from the given HTML buffer
#
my @links = &GetLinks($in);
for(@links) {
my $url = $_;
my $link;
if($url =~ /^([^:]+):/) {
my $prot = $1;
if($prot !~ /http/i) {
# this is an unsupported protocol, we ignore this
next;
}
$link = $url;
}
else {
if($external) {
next;
}
# this is a link on the same server:
if($url =~ /^\//) {
# from root
$link = "$getprotocol://$getserver$url";
}
else {
# from the scanned page's dir
my $nyurl=$url;
if(length($getpath) &&
($getpath !~ /\/$/) &&
($nyurl !~ /^\//)) {
# lacks ending slash, add one to the document part:
$nyurl = "/".$nyurl;
}
$link = "$getprotocol://$getserver/$getpath$nyurl";
}
}
my $success = &LinkWorks($link);
my $count = $done{$url};
$allcount += $count;
print "$success $count <".$tagtype{$url}."> $link $url\n";
if("BAD" eq $success) {
$badlinks++;
if($linenumber) {
my $line =1;
for(@indoc) {
if($_ =~ /$url/) {
print " line $line\n";
}
$line++;
}
}
}
else {
# the link works, add it if it isn't in the ignore list
my $ignore=0;
for(@ignorelist) {
if($link =~ /$_/) {
$ignore=1;
}
}
if(!$ignore) {
# not ignored, add
$rooturls{$link}++; # check this if not checked already
}
}
}
}
if($verbose) {
print "$allcount links were checked";
if($badlinks > 0) {
print ", $badlinks were found bad";
}
print "\n";
}

43
src/Makefile.b32 Normal file

@@ -0,0 +1,43 @@
############################################################
# Makefile.b32 - Borland's C++ Compiler 5.X
#
# 'src' directory
#
# Written by Jaepil Kim, pit@paradise.net.nz
############################################################
# Set program's name
PROGNAME = curl.exe
# Setup environment
CXX = bcc32
CXXFLAGS = -5 -O2 -WC -w-par -w-csu -w-aus
RM = del
TOPDIR = ..
DEFINES = -DNDEBUG -DLIBCURL_BIGENDIAN=0 -DWIN32 -D_CONSOLE -D_MBCS
LD = bcc32
LDFLAGS = -lap -e$(PROGNAME)
INCDIRS = -I$(TOPDIR)/include
LIBCURLLIB= $(TOPDIR)/lib/libcurl.lib
# 'BCCDIR' has to be set up in your c:\autoexec.bat
# i.e. SET BCCDIR = c:\Borland\BCC55
# where c:\Borland\BCC55 is where the compiler is installed
LINKLIB = $(BCCDIR)/lib/psdk/wsock32.lib
PROGRAMS = \
curl.exe
.c.obj:
$(CXX) -c $(INCDIRS) $(CXXFLAGS) $(DEFINES) $*.c
all: $(PROGRAMS)
curl.exe: $(LIBCURLLIB) $(LINKLIB) hugehelp.obj writeout.obj urlglob.obj main.obj
$(LD) $(LDFLAGS) hugehelp.obj writeout.obj urlglob.obj main.obj $(LIBCURLLIB) $(LINKLIB)
clean:
$(RM) *.obj
$(RM) *.exe
$(RM) *.tds


@@ -1,84 +1,84 @@
########################################################
## Makefile for building curl.exe with MSVC6
## Use: nmake -f makefile.vc6 [release | debug]
## (default is release)
##
## Comments to: Troy Engel <tengel@sonic.net>
## Updated by: Craig Davison <cd@securityfocus.com>
PROGRAM_NAME = curl.exe
########################################################
## Nothing more to do below this line!
## Release
CCR = cl.exe /MD /O2 /D "NDEBUG"
LINKR = link.exe /incremental:no /libpath:"../lib"
## Debug
CCD = cl.exe /MDd /Gm /ZI /Od /D "_DEBUG" /GZ
LINKD = link.exe /incremental:yes /debug
CFLAGS = /I "../include" /nologo /W3 /GX /D "WIN32" /D "_CONSOLE" /D "_MBCS" /YX /FD /c
LFLAGS = /nologo /out:$(PROGRAM_NAME) /subsystem:console /machine:I386
LINKLIBS = wsock32.lib libcurl.lib
LINKLIBS_DEBUG = wsock32.lib libcurld.lib
RELEASE_OBJS= \
hugehelpr.obj \
writeoutr.obj \
urlglobr.obj \
mainr.obj
DEBUG_OBJS= \
hugehelpd.obj \
writeoutd.obj \
urlglobd.obj \
maind.obj
LINK_OBJS= \
hugehelp.obj \
writeout.obj \
urlglob.obj \
main.obj
all : release
release: $(RELEASE_OBJS)
$(LINKR) $(LFLAGS) $(LINKLIBS) $(LINK_OBJS)
debug: $(DEBUG_OBJS)
$(LINKD) $(LFLAGS) $(LINKLIBS_DEBUG) $(LINK_OBJS)
## Release
hugehelpr.obj: hugehelp.c
$(CCR) $(CFLAGS) /Zm200 hugehelp.c
writeoutr.obj: writeout.c
$(CCR) $(CFLAGS) writeout.c
urlglobr.obj: urlglob.c
$(CCR) $(CFLAGS) urlglob.c
mainr.obj: main.c
$(CCR) $(CFLAGS) main.c
## Debug
hugehelpd.obj: hugehelp.c
$(CCD) $(CFLAGS) /Zm200 hugehelp.c
writeoutd.obj: writeout.c
$(CCD) $(CFLAGS) writeout.c
urlglobd.obj: urlglob.c
$(CCD) $(CFLAGS) urlglob.c
maind.obj: main.c
$(CCD) $(CFLAGS) main.c
clean:
-@erase hugehelp.obj
-@erase main.obj
-@erase vc60.idb
-@erase vc60.pdb
-@erase vc60.pch
-@erase curl.ilk
-@erase curl.pdb
distrib: clean
-@erase $(PROGRAM_NAME)


@@ -1473,9 +1473,9 @@ operate(struct Configurable *config, int argc, char *argv[])
#endif
}
for (i = 0; (url = next_url(urls)); ++i) {
if (outfiles) {
if (config->outfile) {
free(config->outfile);
config->outfile = outfiles;
config->outfile = outfiles?strdup(outfiles):NULL;
}
if (config->outfile || config->remotefile) {
@@ -1486,13 +1486,14 @@ operate(struct Configurable *config, int argc, char *argv[])
if(!config->outfile && config->remotefile) {
/* Find and get the remote file name */
config->outfile=strstr(url, "://");
if(config->outfile)
config->outfile+=3;
char * pc =strstr(url, "://");
if(pc)
pc+=3;
else
config->outfile=url;
config->outfile = strdup(strrchr(config->outfile, '/'));
if(!config->outfile || !strlen(++config->outfile)) {
pc=url;
pc = strrchr(pc, '/');
config->outfile = (char *) NULL == pc ? NULL : strdup(pc+1) ;
if(!config->outfile || !strlen(config->outfile)) {
helpf("Remote file name has no length!\n");
return CURLE_WRITE_ERROR;
}
@@ -1500,7 +1501,7 @@ operate(struct Configurable *config, int argc, char *argv[])
else {
/* fill '#1' ... '#9' terms from URL pattern */
char *outfile = config->outfile;
config->outfile = match_url(config->outfile, *urls);
config->outfile = match_url(config->outfile, urls);
free(outfile);
}
@@ -1757,6 +1758,9 @@ operate(struct Configurable *config, int argc, char *argv[])
free(url);
}
if(outfiles)
free(outfiles);
#ifdef MIME_SEPARATORS
if (separator)
printf("--%s--\n", MIMEseparator);


@@ -49,25 +49,23 @@
#include "../lib/memdebug.h"
#endif
char *glob_buffer;
URLGlob *glob_expand;
int glob_word(URLGlob *, char*, int);
int glob_word(char*, int);
int glob_set(char *pattern, int pos) {
int glob_set(URLGlob *glob, char *pattern, int pos)
{
/* processes a set expression with the point behind the opening '{'
','-separated elements are collected until the next closing '}'
*/
char* buf = glob_buffer;
char* buf = glob->glob_buffer;
URLPattern *pat;
pat = (URLPattern*)&glob_expand->pattern[glob_expand->size / 2];
pat = (URLPattern*)&glob->pattern[glob->size / 2];
/* patterns 0,1,2,... correspond to size=1,3,5,... */
pat->type = UPTSet;
pat->content.Set.size = 0;
pat->content.Set.ptr_s = 0;
pat->content.Set.elements = (char**)malloc(0);
++glob_expand->size;
++glob->size;
while (1) {
switch (*pattern) {
@@ -81,19 +79,22 @@ int glob_set(char *pattern, int pos) {
case ',':
case '}': /* set element completed */
*buf = '\0';
pat->content.Set.elements = realloc(pat->content.Set.elements, (pat->content.Set.size + 1) * sizeof(char*));
pat->content.Set.elements =
realloc(pat->content.Set.elements,
(pat->content.Set.size + 1) * sizeof(char*));
if (!pat->content.Set.elements) {
printf("out of memory in set pattern\n");
exit(CURLE_OUT_OF_MEMORY);
}
pat->content.Set.elements[pat->content.Set.size] = strdup(glob_buffer);
pat->content.Set.elements[pat->content.Set.size] =
strdup(glob->glob_buffer);
++pat->content.Set.size;
if (*pattern == '}') /* entire set pattern completed */
/* always check for a literal (may be "") between patterns */
return pat->content.Set.size * glob_word(++pattern, ++pos);
return pat->content.Set.size * glob_word(glob, ++pattern, ++pos);
buf = glob_buffer;
buf = glob->glob_buffer;
++pattern;
++pos;
break;
@@ -115,7 +116,8 @@ int glob_set(char *pattern, int pos) {
exit (CURLE_FAILED_INIT);
}
int glob_range(char *pattern, int pos) {
int glob_range(URLGlob *glob, char *pattern, int pos)
{
/* processes a range expression with the point behind the opening '['
- char range: e.g. "a-z]", "B-Q]"
- num range: e.g. "0-9]", "17-2000]"
@@ -125,9 +127,9 @@ int glob_range(char *pattern, int pos) {
URLPattern *pat;
char *c;
pat = (URLPattern*)&glob_expand->pattern[glob_expand->size / 2];
pat = (URLPattern*)&glob->pattern[glob->size / 2];
/* patterns 0,1,2,... correspond to size=1,3,5,... */
++glob_expand->size;
++glob->size;
if (isalpha((int)*pattern)) { /* character range detected */
pat->type = UPTCharRange;
@@ -141,7 +143,7 @@ int glob_range(char *pattern, int pos) {
pat->content.CharRange.ptr_c = pat->content.CharRange.min_c;
/* always check for a literal (may be "") between patterns */
return (pat->content.CharRange.max_c - pat->content.CharRange.min_c + 1) *
glob_word(pattern + 4, pos + 4);
glob_word(glob, pattern + 4, pos + 4);
}
if (isdigit((int)*pattern)) { /* numeric range detected */
pat->type = UPTNumRange;
@@ -162,17 +164,18 @@ int glob_range(char *pattern, int pos) {
c = (char*)(strchr(pattern, ']') + 1); /* continue after next ']' */
/* always check for a literal (may be "") between patterns */
return (pat->content.NumRange.max_n - pat->content.NumRange.min_n + 1) *
glob_word(c, pos + (c - pattern));
glob_word(glob, c, pos + (c - pattern));
}
printf("error: illegal character in range specification at pos %d\n", pos);
exit (CURLE_URL_MALFORMAT);
}
int glob_word(char *pattern, int pos) {
int glob_word(URLGlob *glob, char *pattern, int pos)
{
/* processes a literal string component of a URL
special characters '{' and '[' branch to set/range processing functions
*/
char* buf = glob_buffer;
char* buf = glob->glob_buffer;
int litindex;
while (*pattern != '\0' && *pattern != '{' && *pattern != '[') {
@@ -192,17 +195,17 @@ int glob_word(char *pattern, int pos) {
++pos;
}
*buf = '\0';
litindex = glob_expand->size / 2;
litindex = glob->size / 2;
/* literals 0,1,2,... correspond to size=0,2,4,... */
glob_expand->literal[litindex] = strdup(glob_buffer);
++glob_expand->size;
glob->literal[litindex] = strdup(glob->glob_buffer);
++glob->size;
if (*pattern == '\0')
return 1; /* singular URL processed */
if (*pattern == '{') {
return glob_set(++pattern, ++pos); /* process set pattern */
return glob_set(glob, ++pattern, ++pos); /* process set pattern */
}
if (*pattern == '[') {
return glob_range(++pattern, ++pos);/* process range pattern */
return glob_range(glob, ++pattern, ++pos);/* process range pattern */
}
printf("internal error\n");
exit (CURLE_FAILED_INIT);
@@ -214,18 +217,26 @@ int glob_url(URLGlob** glob, char* url, int *urlnum)
* We can deal with any-size, just make a buffer with the same length
* as the specified URL!
*/
glob_buffer=(char *)malloc(strlen(url)+1);
URLGlob *glob_expand;
char *glob_buffer=(char *)malloc(strlen(url)+1);
if(NULL == glob_buffer)
return CURLE_OUT_OF_MEMORY;
glob_expand = (URLGlob*)malloc(sizeof(URLGlob));
if(NULL == glob_expand) {
free(glob_buffer);
return CURLE_OUT_OF_MEMORY;
}
glob_expand->size = 0;
*urlnum = glob_word(url, 1);
glob_expand->urllen = strlen(url);
glob_expand->glob_buffer = glob_buffer;
*urlnum = glob_word(glob_expand, url, 1);
*glob = glob_expand;
return CURLE_OK;
}
void glob_cleanup(URLGlob* glob) {
void glob_cleanup(URLGlob* glob)
{
int i, elem;
for (i = glob->size - 1; i >= 0; --i) {
@@ -240,14 +251,14 @@ void glob_cleanup(URLGlob* glob) {
}
}
}
free(glob->glob_buffer);
free(glob);
free(glob_buffer);
}
char *next_url(URLGlob *glob)
{
static int beenhere = 0;
char *buf = glob_buffer;
char *buf = glob->glob_buffer;
URLPattern *pat;
char *lit;
signed int i;
@@ -318,48 +329,83 @@ char *next_url(URLGlob *glob)
}
}
*buf = '\0';
return strdup(glob_buffer);
return strdup(glob->glob_buffer);
}
char *match_url(char *filename, URLGlob glob) {
char *buf = glob_buffer;
char *match_url(char *filename, URLGlob *glob)
{
char *target;
URLPattern pat;
int i;
int allocsize;
int stringlen=0;
char numbuf[18];
char *appendthis;
size_t appendlen;
/* We cannot use the glob_buffer for storage here since the filename may
* be longer than the URL we use. We allocate a good start size, then
* we need to realloc in case of need.
*/
allocsize=strlen(filename);
target = malloc(allocsize);
if(NULL == target)
return NULL; /* major failure */
while (*filename != '\0') {
if (*filename == '#') {
if (!isdigit((int)*++filename) ||
*filename == '0') { /* only '#1' ... '#9' allowed */
printf("illegal matching expression\n");
exit(CURLE_URL_MALFORMAT);
/* printf("illegal matching expression\n");
exit(CURLE_URL_MALFORMAT);*/
continue;
}
i = *filename - '1';
if (i + 1 > glob.size / 2) {
printf("match against nonexisting pattern\n");
exit(CURLE_URL_MALFORMAT);
if (i + 1 > glob->size / 2) {
/*printf("match against nonexisting pattern\n");
exit(CURLE_URL_MALFORMAT);*/
continue;
}
pat = glob.pattern[i];
pat = glob->pattern[i];
switch (pat.type) {
case UPTSet:
strcpy(buf, pat.content.Set.elements[pat.content.Set.ptr_s]);
buf += strlen(pat.content.Set.elements[pat.content.Set.ptr_s]);
appendthis = pat.content.Set.elements[pat.content.Set.ptr_s];
appendlen = strlen(pat.content.Set.elements[pat.content.Set.ptr_s]);
break;
case UPTCharRange:
*buf++ = pat.content.CharRange.ptr_c;
numbuf[0]=pat.content.CharRange.ptr_c;
numbuf[1]=0;
appendthis=numbuf;
appendlen=1;
break;
case UPTNumRange:
sprintf(buf, "%0*d", pat.content.NumRange.padlength, pat.content.NumRange.ptr_n);
buf += strlen(buf);
sprintf(numbuf, "%0*d", pat.content.NumRange.padlength, pat.content.NumRange.ptr_n);
appendthis = numbuf;
appendlen = strlen(numbuf);
break;
default:
printf("internal error: invalid pattern type (%d)\n", pat.type);
exit (CURLE_FAILED_INIT);
return NULL;
}
++filename;
}
else
*buf++ = *filename++;
else {
appendthis=filename++;
appendlen=1;
}
if(appendlen + stringlen >= allocsize) {
char *newstr;
allocsize = (appendlen + stringlen)*2;
newstr=realloc(target, allocsize);
if(NULL ==newstr) {
free(target);
return NULL;
}
target=newstr;
}
memcpy(&target[stringlen], appendthis, appendlen);
stringlen += appendlen;
}
*buf = '\0';
return strdup(glob_buffer);
target[stringlen]= '\0';
return target;
}


@@ -65,11 +65,13 @@ typedef struct {
char* literal[10];
URLPattern pattern[9];
int size;
int urllen;
char *glob_buffer;
} URLGlob;
int glob_url(URLGlob**, char*, int *);
char* next_url(URLGlob*);
char* match_url(char*, URLGlob);
char* match_url(char*, URLGlob *);
void glob_cleanup(URLGlob* glob);
#endif


@@ -1,3 +1,3 @@
#define CURL_NAME "curl"
#define CURL_VERSION "7.5"
#define CURL_VERSION "7.5.2-pre1"
#define CURL_ID CURL_NAME " " CURL_VERSION " (" OS ") "


@@ -6,10 +6,10 @@ curl:
@(cd ..; make)
test:
perl runtests.pl
$(PERL) runtests.pl
quiet-test:
perl runtests.pl -s -a
$(PERL) runtests.pl -s -a
clean:
rm -rf log