Mirror of https://github.com/open-source-parsers/jsoncpp.git
Synced 2025-04-19 15:47:13 +02:00

Comparing branches: master ... svn-releas (15 commits)
Commits (SHA1):
710c73a950
66464f2c62
d0e9ec5d54
d90fc720ef
97af24d2f8
3a75efb375
da6ccbbcfb
50c383ba01
df75e4aa84
2279ce36c4
64cc86bf5f
6283d0c7c1
c45da10999
9597adfcd0
d07ebe671a
LICENSE (new file, 1 line)

The json-cpp library and this documentation are in Public Domain.
README.txt (modified, 74 lines)

@@ -1,10 +1,11 @@
 * Introduction:
+=============

 JSON (JavaScript Object Notation) is a lightweight data-interchange format.
 It can represent integer, real number, string, an ordered sequence of
 values, and a collection of name/value pairs.

-JsonCpp is a simple API to manipulate JSON value, and handle serialization
+JsonCpp is a simple API to manipulate JSON values, handle serialization
 and unserialization to string.

 It can also preserve existing comments in unserialization/serialization steps,
@@ -12,7 +13,9 @@ making it a convenient format to store user input files.

 Unserialization parsing is user friendly and provides precise error reports.


 * Building/Testing:
+=================
+
 JsonCpp uses Scons (http://www.scons.org) as a build system. Scons requires
 python to be installed (http://www.python.org).
@@ -39,15 +42,76 @@ to do so.

 and TARGET may be:
 check: build library and run unit tests.
-doc: build documentation
-doc-dist: build documentation tarball
-
-To run the test manually:

+* Running the test manually:
+==========================
+
 cd test
 # This will run the Reader/Writer tests
 python runjsontests.py "path to jsontest.exe"
+
+# This will run the Reader/Writer tests, using JSONChecker test suite
+# (http://www.json.org/JSON_checker/).
+# Notes: not all tests pass: JsonCpp is too lenient (for example,
+# it allows an integer to start with '0'). The goal is to improve
+# strict mode parsing to get all tests to pass.
+python runjsontests.py --with-json-checker "path to jsontest.exe"

 # This will run the unit tests (mostly Value)
 python rununittests.py "path to test_lib_json.exe"

-You can run the tests using valgrind using:
+You can run the tests using valgrind:
 python rununittests.py --valgrind "path to test_lib_json.exe"


+* Building the documentation:
+===========================
+
+Run the python script doxybuild.py from the top directory:
+
+python doxybuild.py --open --with-dot
+
+See doxybuild.py --help for options.
+
+
+* Adding a reader/writer test:
+============================
+
+To add a test, you need to create two files in test/data:
+- a TESTNAME.json file, that contains the input document in JSON format.
+- a TESTNAME.expected file, that contains a flattened representation of
+  the input document.
+
+TESTNAME.expected file format:
+- each line represents a JSON element of the element tree represented
+  by the input document.
+- each line has two parts: the path to access the element, separated from
+  the element value by '='. Array and object values are always empty
+  (e.g. represented by either [] or {}).
+- element path: '.' represents the root element, and is used to separate
+  object members. [N] is used to specify the value of an array element
+  at index N.
+See test_complex_01.json and test_complex_01.expected to better understand
+element path.
+
+
+* Understanding reader/writer test output:
+========================================
+
+When a test is run, output files are generated alongside the input test files.
+Below is a short description of the content of each file:
+
+- test_complex_01.json: input JSON document
+- test_complex_01.expected: flattened JSON element tree used to check if
+  parsing was correct.
+
+- test_complex_01.actual: flattened JSON element tree produced by
+  jsontest.exe from reading test_complex_01.json
+- test_complex_01.rewrite: JSON document written by jsontest.exe using the
+  Json::Value parsed from test_complex_01.json and serialized using
+  Json::StyledWriter.
+- test_complex_01.actual-rewrite: flattened JSON element tree produced by
+  jsontest.exe from reading test_complex_01.rewrite.
+- test_complex_01.process-output: jsontest.exe output, typically useful to
+  understand parsing errors.
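To make the .expected format described above concrete, here is a small
hypothetical test pair (illustrative only; these are not files from the
repository, and exact value quoting/member ordering may differ):

test_hello.json:
    { "hello": "world", "counts": [1, 2] }

test_hello.expected:
    .={}
    .counts=[]
    .counts[0]=1
    .counts[1]=2
    .hello="world"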
SConstruct (modified, 77 lines)

@@ -1,79 +1,17 @@
 """
-Build system can be clean-up by sticking to a few core production factory, with automatic dependencies resolution.
-4 basic project productions:
-- library
-- binary
-- documentation
-- tests
-
-* Library:
-Input:
-- dependencies (other libraries)
-- headers: include path & files
-- sources
-- generated sources
-- resources
-- generated resources
-Production:
-- Static library
-- Dynamic library
-- Naming rule
-Life-cycle:
-- Library compilation
-- Compilation as a dependencies
-- Run-time
-- Packaging
-Identity:
-- Name
-- Version
-* Binary:
-Input:
-- dependencies (other libraries)
-- headers: include path & files (usually empty)
-- sources
-- generated sources
-- resources
-- generated resources
-- supported variant (optimized/debug, dll/static...)
-Production:
-- Binary executable
-- Manifest [on some platforms]
-- Debug symbol [on some platforms]
-Life-cycle:
-- Compilation
-- Run-time
-- Packaging
-Identity:
-- Name
-- Version
-* Documentation:
-Input:
-- dependencies (libraries, binaries)
-- additional sources
-- generated sources
-- resources
-- generated resources
-- supported variant (public/internal)
-Production:
-- HTML documentation
-- PDF documentation
-- CHM documentation
-Life-cycle:
-- Documentation
-- Packaging
-- Test
-Identity:
-- Name
-- Version
+Notes:
+- shared library support is buggy: it assumes that a static and dynamic library can be built from the same object files. This is not true on many platforms. For this reason it is only enabled on linux-gcc at the current time.
+
+To add a platform:
+- add its name in options allowed_values below
+- add tool initialization for this platform. Search for "if platform == 'suncc'" as an example.
 """


 import os
 import os.path
 import sys

-JSONCPP_VERSION = '0.2'
+JSONCPP_VERSION = open(File('#version').abspath,'rt').read().strip()
 DIST_DIR = '#dist'

 options = Variables()

@@ -174,8 +112,6 @@ else:
     print "UNSUPPORTED PLATFORM."
     env.Exit(1)

-env.Tool('doxygen')
-env.Tool('substinfile')
 env.Tool('targz')
 env.Tool('srcdist')
 env.Tool('globtool')

@@ -295,6 +231,5 @@ env.Alias( 'src-dist', srcdist_cmd )
 buildProjectInDirectory( 'src/jsontestrunner' )
 buildProjectInDirectory( 'src/lib_json' )
 buildProjectInDirectory( 'src/test_lib_json' )
-buildProjectInDirectory( 'doc' )
 #print env.Dump()
devtools/__init__.py (new file, 1 line)

# module
devtools/antglob.py (new file, 201 lines)

#!/usr/bin/env python
# encoding: utf-8
# Baptiste Lepilleur, 2009

from dircache import listdir
import re
import fnmatch
import os.path


# These fnmatch expressions are used by default to prune the directory tree
# while doing the recursive traversal in the glob_impl method of glob function.
prune_dirs = '.git .bzr .hg .svn _MTN _darcs CVS SCCS '

# These fnmatch expressions are used by default to exclude files and dirs
# while doing the recursive traversal in the glob_impl method of glob function.
##exclude_pats = prune_pats + '*~ #*# .#* %*% ._* .gitignore .cvsignore vssver.scc .DS_Store'.split()

# These ant_glob expressions are used by default to exclude files and dirs and also prune the directory tree
# while doing the recursive traversal in the glob_impl method of glob function.
default_excludes = '''
**/*~
**/#*#
**/.#*
**/%*%
**/._*
**/CVS
**/CVS/**
**/.cvsignore
**/SCCS
**/SCCS/**
**/vssver.scc
**/.svn
**/.svn/**
**/.git
**/.git/**
**/.gitignore
**/.bzr
**/.bzr/**
**/.hg
**/.hg/**
**/_MTN
**/_MTN/**
**/_darcs
**/_darcs/**
**/.DS_Store '''

DIR = 1
FILE = 2
DIR_LINK = 4
FILE_LINK = 8
LINKS = DIR_LINK | FILE_LINK
ALL_NO_LINK = DIR | FILE
ALL = DIR | FILE | LINKS

_ANT_RE = re.compile( r'(/\*\*/)|(\*\*/)|(/\*\*)|(\*)|(/)|([^\*/]*)' )

def ant_pattern_to_re( ant_pattern ):
    """Generates a regular expression from the ant pattern.
    Matching convention:
    **/a: match 'a', 'dir/a', 'dir1/dir2/a'
    a/**/b: match 'a/b', 'a/c/b', 'a/d/c/b'
    *.py: match 'script.py' but not 'a/script.py'
    """
    rex = ['^']
    next_pos = 0
    sep_rex = r'(?:/|%s)' % re.escape( os.path.sep )
##    print 'Converting', ant_pattern
    for match in _ANT_RE.finditer( ant_pattern ):
##        print 'Matched', match.group()
##        print match.start(0), next_pos
        if match.start(0) != next_pos:
            raise ValueError( "Invalid ant pattern" )
        if match.group(1): # /**/
            rex.append( sep_rex + '(?:.*%s)?' % sep_rex )
        elif match.group(2): # **/
            rex.append( '(?:.*%s)?' % sep_rex )
        elif match.group(3): # /**
            rex.append( sep_rex + '.*' )
        elif match.group(4): # *
            rex.append( '[^/%s]*' % re.escape(os.path.sep) )
        elif match.group(5): # /
            rex.append( sep_rex )
        else: # somepath
            rex.append( re.escape(match.group(6)) )
        next_pos = match.end()
    rex.append('$')
    return re.compile( ''.join( rex ) )

def _as_list( l ):
    if isinstance(l, basestring):
        return l.split()
    return l

def glob(dir_path,
         includes = '**/*',
         excludes = default_excludes,
         entry_type = FILE,
         prune_dirs = prune_dirs,
         max_depth = 25):
    include_filter = [ant_pattern_to_re(p) for p in _as_list(includes)]
    exclude_filter = [ant_pattern_to_re(p) for p in _as_list(excludes)]
    prune_dirs = [p.replace('/',os.path.sep) for p in _as_list(prune_dirs)]
    dir_path = dir_path.replace('/',os.path.sep)
    entry_type_filter = entry_type

    def is_pruned_dir( dir_name ):
        for pattern in prune_dirs:
            if fnmatch.fnmatch( dir_name, pattern ):
                return True
        return False

    def apply_filter( full_path, filter_rexs ):
        """Return True if at least one of the filter regular expression match full_path."""
        for rex in filter_rexs:
            if rex.match( full_path ):
                return True
        return False

    def glob_impl( root_dir_path ):
        child_dirs = [root_dir_path]
        while child_dirs:
            dir_path = child_dirs.pop()
            for entry in listdir( dir_path ):
                full_path = os.path.join( dir_path, entry )
##                print 'Testing:', full_path,
                is_dir = os.path.isdir( full_path )
                if is_dir and not is_pruned_dir( entry ): # explore child directory ?
##                    print '===> marked for recursion',
                    child_dirs.append( full_path )
                included = apply_filter( full_path, include_filter )
                rejected = apply_filter( full_path, exclude_filter )
                if not included or rejected: # do not include entry ?
##                    print '=> not included or rejected'
                    continue
                link = os.path.islink( full_path )
                is_file = os.path.isfile( full_path )
                if not is_file and not is_dir:
##                    print '=> unknown entry type'
                    continue
                if link:
                    entry_type = is_file and FILE_LINK or DIR_LINK
                else:
                    entry_type = is_file and FILE or DIR
##                print '=> type: %d' % entry_type,
                if (entry_type & entry_type_filter) != 0:
##                    print ' => KEEP'
                    yield os.path.join( dir_path, entry )
##                else:
##                    print ' => TYPE REJECTED'
    return list( glob_impl( dir_path ) )


if __name__ == "__main__":
    import unittest

    class AntPatternToRETest(unittest.TestCase):
##        def test_conversion( self ):
##            self.assertEqual( '^somepath$', ant_pattern_to_re( 'somepath' ).pattern )

        def test_matching( self ):
            test_cases = [ ( 'path',
                             ['path'],
                             ['somepath', 'pathsuffix', '/path', '/path'] ),
                           ( '*.py',
                             ['source.py', 'source.ext.py', '.py'],
                             ['path/source.py', '/.py', 'dir.py/z', 'z.pyc', 'z.c'] ),
                           ( '**/path',
                             ['path', '/path', '/a/path', 'c:/a/path', '/a/b/path', '//a/path', '/a/path/b/path'],
                             ['path/', 'a/path/b', 'dir.py/z', 'somepath', 'pathsuffix', 'a/somepath'] ),
                           ( 'path/**',
                             ['path/a', 'path/path/a', 'path//'],
                             ['path', 'somepath/a', 'a/path', 'a/path/a', 'pathsuffix/a'] ),
                           ( '/**/path',
                             ['/path', '/a/path', '/a/b/path/path', '/path/path'],
                             ['path', 'path/', 'a/path', '/pathsuffix', '/somepath'] ),
                           ( 'a/b',
                             ['a/b'],
                             ['somea/b', 'a/bsuffix', 'a/b/c'] ),
                           ( '**/*.py',
                             ['script.py', 'src/script.py', 'a/b/script.py', '/a/b/script.py'],
                             ['script.pyc', 'script.pyo', 'a.py/b'] ),
                           ( 'src/**/*.py',
                             ['src/a.py', 'src/dir/a.py'],
                             ['a/src/a.py', '/src/a.py'] ),
                         ]
            for ant_pattern, accepted_matches, rejected_matches in list(test_cases):
                def local_path( paths ):
                    return [ p.replace('/',os.path.sep) for p in paths ]
                test_cases.append( (ant_pattern, local_path(accepted_matches), local_path( rejected_matches )) )
            for ant_pattern, accepted_matches, rejected_matches in test_cases:
                rex = ant_pattern_to_re( ant_pattern )
                print 'ant_pattern:', ant_pattern, ' => ', rex.pattern
                for accepted_match in accepted_matches:
                    print 'Accepted?:', accepted_match
                    self.assert_( rex.match( accepted_match ) is not None )
                for rejected_match in rejected_matches:
                    print 'Rejected?:', rejected_match
                    self.assert_( rex.match( rejected_match ) is None )

    unittest.main()
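A minimal usage sketch for the glob helper above (Python 2, like the rest of
the devtools package; the directory and patterns are illustrative assumptions,
not calls from the repository):

    from devtools import antglob

    # Collect all C++ sources below src/, pruning VCS directories by default.
    cpp_sources = antglob.glob( 'src', includes = '**/*.cpp **/*.h **/*.inl' )
    for path in cpp_sources:
        print path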
devtools/fixeol.py (new file, 63 lines)

import os.path
import sys   # needed by the I/O error report below


def fix_source_eol( path, is_dry_run = True, verbose = True, eol = '\n' ):
    """Makes sure that all sources have the specified eol sequence (default: unix)."""
    if not os.path.isfile( path ):
        raise ValueError( 'Path "%s" is not a file' % path )
    try:
        f = open(path, 'rb')
    except IOError, msg:
        print >> sys.stderr, "%s: I/O Error: %s" % (path, str(msg))
        return False
    try:
        raw_lines = f.readlines()
    finally:
        f.close()
    fixed_lines = [line.rstrip('\r\n') + eol for line in raw_lines]
    if raw_lines != fixed_lines:
        print '%s =>' % path,
        if not is_dry_run:
            f = open(path, "wb")
            try:
                f.writelines(fixed_lines)
            finally:
                f.close()
        if verbose:
            print is_dry_run and ' NEED FIX' or ' FIXED'
    return True
##
##
##
##def _do_fix( is_dry_run = True ):
##    from waftools import antglob
##    python_sources = antglob.glob( '.',
##        includes = '**/*.py **/wscript **/wscript_build',
##        excludes = antglob.default_excludes + './waf.py',
##        prune_dirs = antglob.prune_dirs + 'waf-* ./build' )
##    for path in python_sources:
##        _fix_python_source( path, is_dry_run )
##
##    cpp_sources = antglob.glob( '.',
##        includes = '**/*.cpp **/*.h **/*.inl',
##        prune_dirs = antglob.prune_dirs + 'waf-* ./build' )
##    for path in cpp_sources:
##        _fix_source_eol( path, is_dry_run )
##
##
##def dry_fix(context):
##    _do_fix( is_dry_run = True )
##
##def fix(context):
##    _do_fix( is_dry_run = False )
##
##def shutdown():
##    pass
##
##def check(context):
##    # Unit tests are run when "check" target is used
##    ut = UnitTest.unit_test()
##    ut.change_to_testfile_dir = True
##    ut.want_to_see_test_output = True
##    ut.want_to_see_test_error = True
##    ut.run()
##    ut.print_results()
devtools/tarball.py (new file, 53 lines)

import os.path
import gzip
import tarfile

TARGZ_DEFAULT_COMPRESSION_LEVEL = 9

def make_tarball(tarball_path, sources, base_dir, prefix_dir=''):
    """Parameters:
    tarball_path: output path of the .tar.gz file
    sources: list of sources to include in the tarball, relative to the current directory
    base_dir: if a source file is in a sub-directory of base_dir, then base_dir is stripped
        from its path in the tarball.
    prefix_dir: all files stored in the tarball are placed under prefix_dir. Set to ''
        to make them children of the root.
    """
    base_dir = os.path.normpath( os.path.abspath( base_dir ) )
    def archive_name( path ):
        """Makes path relative to base_dir."""
        path = os.path.normpath( os.path.abspath( path ) )
        common_path = os.path.commonprefix( (base_dir, path) )
        archive_name = path[len(common_path):]
        if os.path.isabs( archive_name ):
            archive_name = archive_name[1:]
        return os.path.join( prefix_dir, archive_name )
    def visit(tar, dirname, names):
        for name in names:
            path = os.path.join(dirname, name)
            if os.path.isfile(path):
                path_in_tar = archive_name(path)
                tar.add(path, path_in_tar )
    compression = TARGZ_DEFAULT_COMPRESSION_LEVEL
    tar = tarfile.TarFile.gzopen( tarball_path, 'w', compresslevel=compression )
    try:
        for source in sources:
            source_path = source
            if os.path.isdir( source ):
                os.path.walk(source_path, visit, tar)
            else:
                path_in_tar = archive_name(source_path)
                tar.add(source_path, path_in_tar ) # filename, arcname
    finally:
        tar.close()

def decompress( tarball_path, base_dir ):
    """Decompress the gzipped tarball into directory base_dir.
    """
    # !!! This class method is not documented in the online doc
    # nor is bz2open!
    tar = tarfile.TarFile.gzopen(tarball_path, mode='r')
    try:
        tar.extractall( base_dir )
    finally:
        tar.close()
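A usage sketch for make_tarball, loosely modeled on the call in doxybuild.py
(paths and version number are illustrative assumptions):

    from devtools import tarball

    # Pack dist/doxygen under a jsoncpp-api-html-0.5.0/ prefix in the archive;
    # base_dir strips the leading dist/doxygen/ from the stored paths.
    tarball.make_tarball( 'dist/jsoncpp-api-html-0.5.0.tar.gz',
                          ['dist/doxygen', 'README.txt', 'version'],
                          base_dir = 'dist/doxygen',
                          prefix_dir = 'jsoncpp-api-html-0.5.0' )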
doc/doxyfile.in (1360 lines changed; diff suppressed because it is too large)
Another modified file (doxygen main page; file name not shown):

@@ -4,7 +4,7 @@
 <a HREF="http://www.json.org/">JSON (JavaScript Object Notation)</a>
 is a lightweight data-interchange format.
-It can represents integer, real number, string, an ordered sequence of value, and
+It can represent integer, real number, string, an ordered sequence of value, and
 a collection of name/value pairs.

 Here is an example of JSON data:
@@ -28,8 +28,16 @@ Here is an example of JSON data:

 \section _features Features
 - read and write JSON document
+- attach C and C++ style comments to element during parsing
 - rewrite JSON document preserving original comments
+
+Notes: Comments used to be supported in JSON but were removed for
+portability (C like comments are not supported in Python). Since
+comments are useful in configuration/input files, this feature was
+preserved.
+
+\section _example Code example
+
 \code
 Json::Value root;   // will contain the root value after parsing.
 Json::Reader reader;
@@ -57,7 +65,7 @@ setIndentUseSpace( root["indent"].get("use_space", true).asBool() );
 // ...
 // At application shutdown to make the new configuration document:
 // Since Json::Value has implicit constructor for all value types, it is not
-// necessary to explicitely construct the Json::Value object:
+// necessary to explicitly construct the Json::Value object:
 root["encoding"] = getCurrentEncoding();
 root["indent"]["length"] = getCurrentIndentLength();
 root["indent"]["use_space"] = getCurrentIndentUseSpace();
@@ -75,11 +83,22 @@ std::cout << root;
 \endcode

 \section _plinks Build instructions
-The build instruction are located in the file
+The build instructions are located in the file
 <a HREF="README.txt">README.txt</a> in the top-directory of the project.

-Permanent link to the lastest revision of the file in subversion:
-<a HREF="http://svn.sourceforge.net/viewcvs.cgi/jsoncpp/README.txt?view=markup">lastest README.txt</a>
+Permanent link to the latest revision of the file in subversion:
+<a HREF="http://svn.sourceforge.net/viewcvs.cgi/jsoncpp/README.txt?view=markup">latest README.txt</a>
+
+\section _pdownload Download
+The sources can be downloaded from
+<a HREF="http://sourceforge.net/projects/jsoncpp/files/">SourceForge download page</a>.
+
+The latest version of the source is available in the project's subversion repository:
+<a HREF="http://jsoncpp.svn.sourceforge.net/svnroot/jsoncpp/trunk/">
+http://jsoncpp.svn.sourceforge.net/svnroot/jsoncpp/trunk/</a>
+
+To checkout the source, see the following
+<a HREF="http://sourceforge.net/scm/?type=svn&group_id=144446">instructions</a>.

 \section _plinks Project links
 - <a HREF="http://jsoncpp.sourceforge.net">json-cpp home</a>
A deleted file (doc build sconscript; file name not shown):

@@ -1,61 +0,0 @@
-Import( 'env' )
-import os.path
-
-if 'doxygen' in env['TOOLS']:
-    doc_topdir = str(env['ROOTBUILD_DIR'])
-    html_dir = 'jsoncpp-api-doc'
-
-    doxygen_inputs = env.Glob( includes = '*.dox', dir = '#doc' ) \
-                   + env.Glob( includes = '*.h', dir = '#include/json/' ) \
-                   + env.Glob( includes = ('*.dox','*.h','*.inl','*.cpp'),
-                               dir = '#src/lib_json' )
-##    for p in doxygen_inputs:
-##        print p.abspath
-
-    top_dir = env.Dir('#').abspath
-    include_top_dir = env.Dir('#include').abspath
-    env['DOXYFILE_DICT'] = { 'PROJECT_NAME': 'JsonCpp',
-                             'PROJECT_NUMBER': env['JSONCPP_VERSION'],
-                             'STRIP_FROM_PATH': top_dir,
-                             'STRIP_FROM_INC_PATH': include_top_dir,
-                             'HTML_OUTPUT': html_dir,
-                             'HTML_HEADER': env.File('#doc/header.html').abspath,
-                             'HTML_FOOTER': env.File('#doc/footer.html').abspath,
-                             'INCLUDE_PATH': include_top_dir,
-                             'PREDEFINED': 'JSONCPP_DOC_EXCLUDE_IMPLEMENTATION JSON_VALUE_USE_INTERNAL_MAP'
-                             }
-    env['DOXYFILE_FILE'] = 'doxyfile.in'
-    doxfile_nodes = env.Doxyfile( os.path.join( doc_topdir, 'doxyfile' ), doxygen_inputs )
-    html_doc_path = os.path.join( doc_topdir, html_dir )
-    doc_nodes = env.Doxygen( source = doxfile_nodes,
-                             target = os.path.join( html_doc_path, 'index.html' ) )
-    alias_doc_cmd = env.Alias('doc', doc_nodes )
-    env.Alias('doc', env.Install( html_doc_path, '#README.txt' ) )
-    if 'TarGz' in env['BUILDERS']:
-        targz_path = os.path.join( env['DIST_DIR'], '%s.tar.gz' % html_dir )
-        zip_doc_cmd = env.TarGz( targz_path, [env.Dir(html_doc_path)],
-                                 TARGZ_BASEDIR = env['ROOTBUILD_DIR'] )
-        env.Depends( zip_doc_cmd, alias_doc_cmd )
-        env.Alias( 'doc-dist', zip_doc_cmd )
-##
-## doxyfile = env.SubstInFile( '#doc/doxyfile', 'doxyfile.in',
-##                             SUBST_DICT = {
-##                                 '%JSONCPP_VERSION%' : env['JSONCPP_VERSION'],
-##                                 '%TOPDIR%' : env.Dir('#').abspath,
-##                                 '%DOC_TOPDIR%' : str(doc_topdir) } )
-## doc_cmd = env.Doxygen( doxyfile )
-## alias_doc_cmd = env.Alias('doc', doc_cmd )
-## env.AlwaysBuild(alias_doc_cmd)
-##
-## for dir in doc_cmd:
-##     env.Alias('doc', env.Install( '#' + dir.path, '#README.txt' ) )
-##     filename = os.path.split(dir.path)[1]
-##     targz_path = os.path.join( env['DIST_DIR'], '%s.tar.gz' % filename )
-##     zip_doc_cmd = env.TarGz( targz_path, [env.Dir(dir)],
-##                              TARGZ_BASEDIR = doc_topdir )
-##     env.Depends( zip_doc_cmd, alias_doc_cmd )
-##     env.Alias( 'doc-dist', zip_doc_cmd )
-##
-## # When doxyfile gets updated, I get errors on the first pass.
-## # I have to run scons twice. Something is wrong with the dependencies
-## # here, but I avoid it by running "scons doc/doxyfile" first.
doxybuild.py (new file, 167 lines)

"""Script to generate doxygen documentation.
"""

import re
import os
import os.path
import sys
import shutil
from devtools import tarball

def find_program(*filenames):
    """find a program in folders path_lst, and sets env[var]
    @param filenames: a list of possible names of the program to search for
    @return: the full path of the filename if found, or '' if filename could not be found
    """
    paths = os.environ.get('PATH', '').split(os.pathsep)
    suffixes = ('win32' in sys.platform ) and '.exe .com .bat .cmd' or ''
    for filename in filenames:
        for name in [filename+ext for ext in suffixes.split()]:
            for directory in paths:
                full_path = os.path.join(directory, name)
                if os.path.isfile(full_path):
                    return full_path
    return ''

def do_subst_in_file(targetfile, sourcefile, dict):
    """Replace all instances of the keys of dict with their values.
    For example, if dict is {'%VERSION%': '1.2345', '%BASE%': 'MyProg'},
    then all instances of %VERSION% in the file will be replaced with 1.2345 etc.
    """
    try:
        f = open(sourcefile, 'rb')
        contents = f.read()
        f.close()
    except:
        print "Can't read source file %s"%sourcefile
        raise
    for (k,v) in dict.items():
        v = v.replace('\\','\\\\')
        contents = re.sub(k, v, contents)
    try:
        f = open(targetfile, 'wb')
        f.write(contents)
        f.close()
    except:
        print "Can't write target file %s"%targetfile
        raise

def run_doxygen(doxygen_path, config_file, working_dir, is_silent):
    config_file = os.path.abspath( config_file )
    doxygen_path = doxygen_path
    old_cwd = os.getcwd()
    try:
        os.chdir( working_dir )
        cmd = [doxygen_path, config_file]
        print 'Running:', ' '.join( cmd )
        try:
            import subprocess
        except:
            if os.system( ' '.join( cmd ) ) != 0:
                print 'Documentation generation failed'
                return False
        else:
            if is_silent:
                process = subprocess.Popen( cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT )
            else:
                process = subprocess.Popen( cmd )
            stdout, _ = process.communicate()
            if process.returncode:
                print 'Documentation generation failed:'
                print stdout
                return False
        return True
    finally:
        os.chdir( old_cwd )

def build_doc( options, make_release=False ):
    if make_release:
        options.make_tarball = True
        options.with_dot = True
        options.with_html_help = True
        options.with_uml_look = True
        options.open = False
        options.silent = True

    version = open('version','rt').read().strip()
    output_dir = 'dist/doxygen' # relative to doc/doxyfile location.
    if not os.path.isdir( output_dir ):
        os.makedirs( output_dir )
    top_dir = os.path.abspath( '.' )
    html_output_dirname = 'jsoncpp-api-html-' + version
    tarball_path = os.path.join( 'dist', html_output_dirname + '.tar.gz' )
    warning_log_path = os.path.join( output_dir, '../jsoncpp-doxygen-warning.log' )
    html_output_path = os.path.join( output_dir, html_output_dirname )
    def yesno( bool ):
        return bool and 'YES' or 'NO'
    subst_keys = {
        '%JSONCPP_VERSION%': version,
        '%DOC_TOPDIR%': '',
        '%TOPDIR%': top_dir,
        '%HTML_OUTPUT%': os.path.join( '..', output_dir, html_output_dirname ),
        '%HAVE_DOT%': yesno(options.with_dot),
        '%DOT_PATH%': os.path.split(options.dot_path)[0],
        '%HTML_HELP%': yesno(options.with_html_help),
        '%UML_LOOK%': yesno(options.with_uml_look),
        '%WARNING_LOG_PATH%': os.path.join( '..', warning_log_path )
        }

    if os.path.isdir( output_dir ):
        print 'Deleting directory:', output_dir
        shutil.rmtree( output_dir )
    if not os.path.isdir( output_dir ):
        os.makedirs( output_dir )

    do_subst_in_file( 'doc/doxyfile', 'doc/doxyfile.in', subst_keys )
    ok = run_doxygen( options.doxygen_path, 'doc/doxyfile', 'doc', is_silent=options.silent )
    if not options.silent:
        print open(warning_log_path, 'rb').read()
    index_path = os.path.abspath(os.path.join(subst_keys['%HTML_OUTPUT%'], 'index.html'))
    print 'Generated documentation can be found in:'
    print index_path
    if options.open:
        import webbrowser
        webbrowser.open( 'file://' + index_path )
    if options.make_tarball:
        print 'Generating doc tarball to', tarball_path
        tarball_sources = [
            output_dir,
            'README.txt',
            'version'
            ]
        tarball_basedir = os.path.join( output_dir, html_output_dirname )
        tarball.make_tarball( tarball_path, tarball_sources, tarball_basedir, html_output_dirname )
    return tarball_path, html_output_dirname

def main():
    usage = """%prog
    Generates doxygen documentation in build/doxygen.
    Optionally makes a tarball of the documentation to dist/.

    Must be started in the project top directory.
    """
    from optparse import OptionParser
    parser = OptionParser(usage=usage)
    parser.allow_interspersed_args = False
    parser.add_option('--with-dot', dest="with_dot", action='store_true', default=False,
        help="""Enable usage of DOT to generate collaboration diagram""")
    parser.add_option('--dot', dest="dot_path", action='store', default=find_program('dot'),
        help="""Path to GraphViz dot tool. Must be full qualified path. [Default: %default]""")
    parser.add_option('--doxygen', dest="doxygen_path", action='store', default=find_program('doxygen'),
        help="""Path to Doxygen tool. [Default: %default]""")
    parser.add_option('--with-html-help', dest="with_html_help", action='store_true', default=False,
        help="""Enable generation of Microsoft HTML HELP""")
    parser.add_option('--no-uml-look', dest="with_uml_look", action='store_false', default=True,
        help="""Generates DOT graph without UML look [Default: False]""")
    parser.add_option('--open', dest="open", action='store_true', default=False,
        help="""Open the HTML index in the web browser after generation""")
    parser.add_option('--tarball', dest="make_tarball", action='store_true', default=False,
        help="""Generates a tarball of the documentation in dist/ directory""")
    parser.add_option('-s', '--silent', dest="silent", action='store_true', default=False,
        help="""Hides doxygen output""")
    parser.enable_interspersed_args()
    options, args = parser.parse_args()
    build_doc( options )

if __name__ == '__main__':
    main()
Another modified file (JSON value C++ header; file name not shown):

@@ -513,7 +513,7 @@ namespace Json {
       Args args_;
    };

-   /** \brief Allocator to customize member name and string value memory management done by Value.
+   /** \brief Experimental do not use: Allocator to customize member name and string value memory management done by Value.
    *
    * - makeMemberName() and releaseMemberName() are called to respectively duplicate and
    *   free an Json::objectValue member name.
@@ -785,7 +785,7 @@ namespace Json {
       PageIndex pageCount_;
    };

-   /** \brief Allocator to customize Value internal array.
+   /** \brief Experimental: do not use. Allocator to customize Value internal array.
    * Below is an example of a simple implementation (actual implementation use
    * memory pool).
      \code
@@ -873,7 +873,7 @@ public: // overridden from ValueArrayAllocator
 #endif // #ifdef JSON_VALUE_USE_INTERNAL_MAP


-   /** \brief Experimental and untested: base class for Value iterators.
+   /** \brief base class for Value iterators.
    *
    */
    class ValueIteratorBase
@@ -943,7 +943,7 @@ public: // overridden from ValueArrayAllocator
 #endif
    };

-   /** \brief Experimental and untested: const iterator for object and array value.
+   /** \brief const iterator for object and array value.
    *
    */
    class ValueConstIterator : public ValueIteratorBase
@@ -1002,7 +1002,7 @@ public: // overridden from ValueArrayAllocator
    };


-   /** \brief Experimental and untested: iterator for object and array value.
+   /** \brief Iterator for object and array value.
    */
    class ValueIterator : public ValueIteratorBase
    {
lib_json Visual Studio project file (modified; both sides of all 214 lines are identical in this view, so this is likely an end-of-line-only change):

@@ -1,214 +1,214 @@
<?xml version="1.0" encoding="Windows-1252"?>
<VisualStudioProject
    ProjectType="Visual C++"
    Version="7.10"
    Name="lib_json"
    ProjectGUID="{B84F7231-16CE-41D8-8C08-7B523FF4225B}"
    Keyword="Win32Proj">
    <Platforms>
        <Platform
            Name="Win32"/>
    </Platforms>
    <Configurations>
        <Configuration
            Name="Debug|Win32"
            OutputDirectory="../../build/vs71/debug/lib_json"
            IntermediateDirectory="../../build/vs71/debug/lib_json"
            ConfigurationType="4"
            CharacterSet="2">
            <Tool
                Name="VCCLCompilerTool"
                Optimization="0"
                AdditionalIncludeDirectories="../../include"
                PreprocessorDefinitions="WIN32;_DEBUG;_LIB"
                StringPooling="TRUE"
                MinimalRebuild="TRUE"
                BasicRuntimeChecks="3"
                RuntimeLibrary="1"
                EnableFunctionLevelLinking="TRUE"
                DisableLanguageExtensions="TRUE"
                ForceConformanceInForLoopScope="FALSE"
                RuntimeTypeInfo="TRUE"
                UsePrecompiledHeader="0"
                WarningLevel="3"
                Detect64BitPortabilityProblems="TRUE"
                DebugInformationFormat="4"/>
            <Tool
                Name="VCCustomBuildTool"/>
            <Tool
                Name="VCLibrarianTool"
                OutputFile="$(OutDir)/json_vc71_libmtd.lib"/>
            <Tool
                Name="VCMIDLTool"/>
            <Tool
                Name="VCPostBuildEventTool"/>
            <Tool
                Name="VCPreBuildEventTool"/>
            <Tool
                Name="VCPreLinkEventTool"/>
            <Tool
                Name="VCResourceCompilerTool"/>
            <Tool
                Name="VCWebServiceProxyGeneratorTool"/>
            <Tool
                Name="VCXMLDataGeneratorTool"/>
            <Tool
                Name="VCManagedWrapperGeneratorTool"/>
            <Tool
                Name="VCAuxiliaryManagedWrapperGeneratorTool"/>
        </Configuration>
        <Configuration
            Name="Release|Win32"
            OutputDirectory="../../build/vs71/release/lib_json"
            IntermediateDirectory="../../build/vs71/release/lib_json"
            ConfigurationType="4"
            CharacterSet="2"
            WholeProgramOptimization="TRUE">
            <Tool
                Name="VCCLCompilerTool"
                GlobalOptimizations="TRUE"
                EnableIntrinsicFunctions="TRUE"
                AdditionalIncludeDirectories="../../include"
                PreprocessorDefinitions="WIN32;NDEBUG;_LIB"
                StringPooling="TRUE"
                RuntimeLibrary="0"
                EnableFunctionLevelLinking="TRUE"
                DisableLanguageExtensions="TRUE"
                ForceConformanceInForLoopScope="FALSE"
                RuntimeTypeInfo="TRUE"
                UsePrecompiledHeader="0"
                AssemblerOutput="4"
                WarningLevel="3"
                Detect64BitPortabilityProblems="TRUE"
                DebugInformationFormat="3"/>
            <Tool
                Name="VCCustomBuildTool"/>
            <Tool
                Name="VCLibrarianTool"
                OutputFile="$(OutDir)/json_vc71_libmt.lib"/>
            <Tool
                Name="VCMIDLTool"/>
            <Tool
                Name="VCPostBuildEventTool"/>
            <Tool
                Name="VCPreBuildEventTool"/>
            <Tool
                Name="VCPreLinkEventTool"/>
            <Tool
                Name="VCResourceCompilerTool"/>
            <Tool
                Name="VCWebServiceProxyGeneratorTool"/>
            <Tool
                Name="VCXMLDataGeneratorTool"/>
            <Tool
                Name="VCManagedWrapperGeneratorTool"/>
            <Tool
                Name="VCAuxiliaryManagedWrapperGeneratorTool"/>
        </Configuration>
        <Configuration
            Name="dummy|Win32"
            OutputDirectory="$(ConfigurationName)"
            IntermediateDirectory="$(ConfigurationName)"
            ConfigurationType="2"
            CharacterSet="2"
            WholeProgramOptimization="TRUE">
            <Tool
                Name="VCCLCompilerTool"
                GlobalOptimizations="TRUE"
                EnableIntrinsicFunctions="TRUE"
                AdditionalIncludeDirectories="../../include"
                PreprocessorDefinitions="WIN32;NDEBUG;_LIB"
                StringPooling="TRUE"
                RuntimeLibrary="4"
                EnableFunctionLevelLinking="TRUE"
                DisableLanguageExtensions="TRUE"
                ForceConformanceInForLoopScope="FALSE"
                RuntimeTypeInfo="TRUE"
                UsePrecompiledHeader="0"
                AssemblerOutput="4"
                WarningLevel="3"
                Detect64BitPortabilityProblems="TRUE"
                DebugInformationFormat="3"/>
            <Tool
                Name="VCCustomBuildTool"/>
            <Tool
                Name="VCLinkerTool"
                GenerateDebugInformation="TRUE"
                SubSystem="2"
                OptimizeReferences="2"
                EnableCOMDATFolding="2"
                TargetMachine="1"/>
            <Tool
                Name="VCMIDLTool"/>
            <Tool
                Name="VCPostBuildEventTool"/>
            <Tool
                Name="VCPreBuildEventTool"/>
            <Tool
                Name="VCPreLinkEventTool"/>
            <Tool
                Name="VCResourceCompilerTool"/>
            <Tool
                Name="VCWebServiceProxyGeneratorTool"/>
            <Tool
                Name="VCXMLDataGeneratorTool"/>
            <Tool
                Name="VCWebDeploymentTool"/>
            <Tool
                Name="VCManagedWrapperGeneratorTool"/>
            <Tool
                Name="VCAuxiliaryManagedWrapperGeneratorTool"/>
        </Configuration>
    </Configurations>
    <References>
    </References>
    <Files>
        <File
            RelativePath="..\..\include\json\autolink.h">
        </File>
        <File
            RelativePath="..\..\include\json\config.h">
        </File>
        <File
            RelativePath="..\..\include\json\features.h">
        </File>
        <File
            RelativePath="..\..\include\json\forwards.h">
        </File>
        <File
            RelativePath="..\..\include\json\json.h">
        </File>
        <File
            RelativePath="..\..\src\lib_json\json_batchallocator.h">
        </File>
        <File
            RelativePath="..\..\src\lib_json\json_internalarray.inl">
        </File>
        <File
            RelativePath="..\..\src\lib_json\json_internalmap.inl">
        </File>
        <File
            RelativePath="..\..\src\lib_json\json_reader.cpp">
        </File>
        <File
            RelativePath="..\..\src\lib_json\json_value.cpp">
        </File>
        <File
            RelativePath="..\..\src\lib_json\json_valueiterator.inl">
        </File>
        <File
            RelativePath="..\..\src\lib_json\json_writer.cpp">
        </File>
        <File
            RelativePath="..\..\include\json\reader.h">
        </File>
        <File
            RelativePath="..\..\include\json\value.h">
        </File>
        <File
            RelativePath="..\..\include\json\writer.h">
        </File>
    </Files>
    <Globals>
    </Globals>
</VisualStudioProject>
368
makerelease.py
Normal file
368
makerelease.py
Normal file
@ -0,0 +1,368 @@
|
|||||||
|
"""Tag the sandbox for release, make source and doc tarballs.
|
||||||
|
|
||||||
|
Requires Python 2.6
|
||||||
|
|
||||||
|
Example of invocation (use to test the script):
|
||||||
|
python makerelease.py --force --retag --platform=msvc6,msvc71,msvc80,mingw -ublep 0.5.0 0.6.0-dev
|
||||||
|
|
||||||
|
Example of invocation when doing a release:
|
||||||
|
python makerelease.py 0.5.0 0.6.0-dev
|
||||||
|
"""
|
||||||
|
import os.path
|
||||||
|
import subprocess
|
||||||
|
import sys
|
||||||
|
import doxybuild
|
||||||
|
import subprocess
|
||||||
|
import xml.etree.ElementTree as ElementTree
|
||||||
|
import shutil
|
||||||
|
import urllib2
|
||||||
|
import tempfile
|
||||||
|
import os
|
||||||
|
import time
|
||||||
|
from devtools import antglob, fixeol, tarball
|
||||||
|
|
||||||
|
SVN_ROOT = 'https://jsoncpp.svn.sourceforge.net/svnroot/jsoncpp/'
|
||||||
|
SVN_TAG_ROOT = SVN_ROOT + 'tags/jsoncpp'
|
||||||
|
SCONS_LOCAL_URL = 'http://sourceforge.net/projects/scons/files/scons-local/1.2.0/scons-local-1.2.0.tar.gz/download'
|
||||||
|
SOURCEFORGE_PROJECT = 'jsoncpp'
|
||||||
|
|
||||||
|
def set_version( version ):
|
||||||
|
with open('version','wb') as f:
|
||||||
|
f.write( version.strip() )
|
||||||
|
|
||||||
|
def rmdir_if_exist( dir_path ):
|
||||||
|
if os.path.isdir( dir_path ):
|
||||||
|
shutil.rmtree( dir_path )
|
||||||
|
|
||||||
|
class SVNError(Exception):
|
||||||
|
pass
|
||||||
|
|
||||||
|
def svn_command( command, *args ):
|
||||||
|
cmd = ['svn', '--non-interactive', command] + list(args)
|
||||||
|
print 'Running:', ' '.join( cmd )
|
||||||
|
process = subprocess.Popen( cmd,
|
||||||
|
stdout=subprocess.PIPE,
|
||||||
|
stderr=subprocess.STDOUT )
|
||||||
|
stdout = process.communicate()[0]
|
||||||
|
if process.returncode:
|
||||||
|
error = SVNError( 'SVN command failed:\n' + stdout )
|
||||||
|
error.returncode = process.returncode
|
||||||
|
raise error
|
||||||
|
return stdout
|
||||||
|
|
||||||
|
def check_no_pending_commit():
|
||||||
|
"""Checks that there is no pending commit in the sandbox."""
|
||||||
|
stdout = svn_command( 'status', '--xml' )
|
||||||
|
etree = ElementTree.fromstring( stdout )
|
||||||
|
msg = []
|
||||||
|
for entry in etree.getiterator( 'entry' ):
|
||||||
|
path = entry.get('path')
|
||||||
|
status = entry.find('wc-status').get('item')
|
||||||
|
if status != 'unversioned' and path != 'version':
|
||||||
|
msg.append( 'File "%s" has pending change (status="%s")' % (path, status) )
|
||||||
|
if msg:
|
||||||
|
msg.insert(0, 'Pending change to commit found in sandbox. Commit them first!' )
|
||||||
|
return '\n'.join( msg )
|
||||||
|
|
||||||
|
def svn_join_url( base_url, suffix ):
|
||||||
|
if not base_url.endswith('/'):
|
||||||
|
base_url += '/'
|
||||||
|
if suffix.startswith('/'):
|
||||||
|
suffix = suffix[1:]
|
||||||
|
return base_url + suffix
|
||||||
|
|
||||||
|
def svn_check_if_tag_exist( tag_url ):
|
||||||
|
"""Checks if a tag exist.
|
||||||
|
Returns: True if the tag exist, False otherwise.
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
list_stdout = svn_command( 'list', tag_url )
|
||||||
|
except SVNError, e:
|
||||||
|
if e.returncode != 1 or not str(e).find('tag_url'):
|
||||||
|
raise e
|
||||||
|
# otherwise ignore error, meaning tag does not exist
|
||||||
|
return False
|
||||||
|
return True
|
||||||
|
|
||||||
|
def svn_commit( message ):
|
||||||
|
"""Commit the sandbox, providing the specified comment.
|
||||||
|
"""
|
||||||
|
svn_command( 'ci', '-m', message )
|
||||||
|
|
||||||
|
def svn_tag_sandbox( tag_url, message ):
|
||||||
|
"""Makes a tag based on the sandbox revisions.
|
||||||
|
"""
|
||||||
|
svn_command( 'copy', '-m', message, '.', tag_url )
|
||||||
|
|
||||||
|
def svn_remove_tag( tag_url, message ):
|
||||||
|
"""Removes an existing tag.
|
||||||
|
"""
|
||||||
|
svn_command( 'delete', '-m', message, tag_url )
|
||||||
|
|
||||||
|
def svn_export( tag_url, export_dir ):
|
||||||
|
"""Exports the tag_url revision to export_dir.
|
||||||
|
Target directory, including its parent is created if it does not exist.
|
||||||
|
If the directory export_dir exist, it is deleted before export proceed.
|
||||||
|
"""
|
||||||
|
rmdir_if_exist( export_dir )
|
||||||
|
svn_command( 'export', tag_url, export_dir )
|
||||||
|
|
||||||
|
def fix_sources_eol( dist_dir ):
    """Set file EOL for tarball distribution.
    """
    print 'Preparing exported source file EOL for distribution...'
    prune_dirs = antglob.prune_dirs + 'scons-local* ./build* ./libs ./dist'
    win_sources = antglob.glob( dist_dir,
        includes = '**/*.sln **/*.vcproj',
        prune_dirs = prune_dirs )
    unix_sources = antglob.glob( dist_dir,
        includes = '''**/*.h **/*.cpp **/*.inl **/*.txt **/*.dox **/*.py **/*.html **/*.in
        sconscript *.json *.expected AUTHORS LICENSE''',
        excludes = antglob.default_excludes + 'scons.py sconsign.py scons-*',
        prune_dirs = prune_dirs )
    for path in win_sources:
        fixeol.fix_source_eol( path, is_dry_run = False, verbose = True, eol = '\r\n' )
    for path in unix_sources:
        fixeol.fix_source_eol( path, is_dry_run = False, verbose = True, eol = '\n' )

def download( url, target_path ):
    """Download the file represented by url to target_path.
    """
    f = urllib2.urlopen( url )
    try:
        data = f.read()
    finally:
        f.close()
    fout = open( target_path, 'wb' )
    try:
        fout.write( data )
    finally:
        fout.close()

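# Usage sketch: main() below uses download() exactly this way to fetch the
# scons-local bootstrap into the dist directory:
#
#   download( SCONS_LOCAL_URL, 'dist/scons-local.tar.gz' )
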
def check_compile( distcheck_top_dir, platform ):
    cmd = [sys.executable, 'scons.py', 'platform=%s' % platform, 'check']
    print 'Running:', ' '.join( cmd )
    log_path = os.path.join( distcheck_top_dir, 'build-%s.log' % platform )
    flog = open( log_path, 'wb' )
    try:
        process = subprocess.Popen( cmd,
                                    stdout=flog,
                                    stderr=subprocess.STDOUT,
                                    cwd=distcheck_top_dir )
        stdout = process.communicate()[0]
        status = (process.returncode == 0)
    finally:
        flog.close()
    return (status, log_path)

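# Illustrative usage sketch (the directory and platform strings are
# placeholders; any value accepted by the SConstruct 'platform=' option works):
#
#   status, log_path = check_compile( 'dist/distcheck/jsoncpp-src-0.5.0', 'linux-gcc' )
#   print status and 'ok' or ('build failed, see ' + log_path)
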
def write_tempfile( content, **kwargs ):
    fd, path = tempfile.mkstemp( **kwargs )
    f = os.fdopen( fd, 'wt' )
    try:
        f.write( content )
    finally:
        f.close()
    return path

class SFTPError(Exception):
    pass

def run_sftp_batch( userhost, sftp, batch, retry=0 ):
    path = write_tempfile( batch, suffix='.sftp', text=True )
    # psftp -agent -C blep,jsoncpp@web.sourceforge.net -batch -b batch.sftp -bc
    cmd = [sftp, '-agent', '-C', '-batch', '-b', path, '-bc', userhost]
    error = None
    for retry_index in xrange(0, max(1,retry)):
        heading = retry_index == 0 and 'Running:' or 'Retrying:'
        print heading, ' '.join( cmd )
        process = subprocess.Popen( cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT )
        stdout = process.communicate()[0]
        if process.returncode != 0:
            error = SFTPError( 'SFTP batch failed:\n' + stdout )
        else:
            # a successful retry must clear the failure recorded by an
            # earlier attempt, otherwise the stale error would be raised below
            error = None
            break
    if error:
        raise error
    return stdout

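# Illustrative sketch: run_sftp_batch() writes the batch commands to a
# temporary file and hands it to a psftp-compatible client. An upload batch
# (user name and paths are placeholders; a real batch must not indent its
# command lines) would look like:
#
#   run_sftp_batch( 'user,jsoncpp@web.sourceforge.net', 'psftp', """cd htdocs
#   mput dist/docs/*.html
#   exit""", retry=3 )
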
def sourceforge_web_synchro( sourceforge_project, doc_dir,
                             user=None, sftp='sftp' ):
    """Notes: does not synchronize sub-directories of doc_dir.
    """
    userhost = '%s,%s@web.sourceforge.net' % (user, sourceforge_project)
    stdout = run_sftp_batch( userhost, sftp, """
cd htdocs
dir
exit
""" )
    existing_paths = set()
    collect = 0
    for line in stdout.split('\n'):
        line = line.strip()
        if not collect and line.endswith('> dir'):
            collect = 1
        elif collect and line.endswith('> exit'):
            break
        elif collect == 1:
            collect = 2
        elif collect == 2:
            path = line.strip().split()[-1:]
            if path and path[0] not in ('.', '..'):
                existing_paths.add( path[0] )
    upload_paths = set( [os.path.basename(p) for p in antglob.glob( doc_dir )] )
    paths_to_remove = existing_paths - upload_paths
    if paths_to_remove:
        print 'Removing the following files from the web site:'
        print '\n'.join( paths_to_remove )
        stdout = run_sftp_batch( userhost, sftp, """cd htdocs
rm %s
exit""" % ' '.join(paths_to_remove) )
    print 'Uploading %d files:' % len(upload_paths)
    batch_size = 10
    upload_paths = list(upload_paths)
    start_time = time.time()
    for index in xrange(0,len(upload_paths),batch_size):
        paths = upload_paths[index:index+batch_size]
        sec_per_file = (time.time() - start_time) / (index+1)
        remaining_files = len(upload_paths) - index
        remaining_sec = sec_per_file * remaining_files
        print '%d/%d, ETA=%.1fs' % (index+1, len(upload_paths), remaining_sec)
        run_sftp_batch( userhost, sftp, """cd htdocs
lcd %s
mput %s
exit""" % (doc_dir, ' '.join(paths) ), retry=3 )

def sourceforge_release_tarball( sourceforge_project, paths, user=None, sftp='sftp' ):
    userhost = '%s,%s@frs.sourceforge.net' % (user, sourceforge_project)
    run_sftp_batch( userhost, sftp, """
mput %s
exit
""" % (' '.join(paths),) )

def main():
    usage = """%prog release_version next_dev_version
Update 'version' file to release_version and commit.
Generates the document tarball.
Tags the sandbox revision with release_version.
Update 'version' file to next_dev_version and commit.

Performs an svn export of the release tag, and builds a source tarball.

Must be started in the project top directory.

Warning: --force should only be used when developing/testing the release script.
"""
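    # Example invocation (version numbers are illustrative):
    #   python makerelease.py 0.5.0 0.6.0-dev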
    from optparse import OptionParser
    parser = OptionParser(usage=usage)
    parser.allow_interspersed_args = False
    parser.add_option('--dot', dest="dot_path", action='store', default=doxybuild.find_program('dot'),
        help="""Path to GraphViz dot tool. Must be a fully qualified path. [Default: %default]""")
    parser.add_option('--doxygen', dest="doxygen_path", action='store', default=doxybuild.find_program('doxygen'),
        help="""Path to Doxygen tool. [Default: %default]""")
    parser.add_option('--force', dest="ignore_pending_commit", action='store_true', default=False,
        help="""Ignore pending commit. [Default: %default]""")
    parser.add_option('--retag', dest="retag_release", action='store_true', default=False,
        help="""Overwrite the existing release tag if it exists. [Default: %default]""")
    parser.add_option('-p', '--platforms', dest="platforms", action='store', default='',
        help="""Comma-separated list of platforms passed to scons for the build check.""")
    parser.add_option('--no-test', dest="no_test", action='store_true', default=False,
        help="""Skips the build check.""")
    parser.add_option('--no-web', dest="no_web", action='store_true', default=False,
        help="""Do not update the web site.""")
    parser.add_option('-u', '--upload-user', dest="user", action='store',
        help="""SourceForge user for SFTP documentation upload.""")
    parser.add_option('--sftp', dest='sftp', action='store', default=doxybuild.find_program('psftp', 'sftp'),
        help="""Path of the SFTP-compatible binary used to upload the documentation.""")
    parser.enable_interspersed_args()
    options, args = parser.parse_args()

    if len(args) != 2:
        parser.error( 'release_version and next_dev_version are required on the command-line.' )
    release_version = args[0]
    next_version = args[1]

    if not options.platforms and not options.no_test:
        parser.error( 'You must specify either the --platforms or the --no-test option.' )

    if options.ignore_pending_commit:
        msg = ''
    else:
        msg = check_no_pending_commit()
    if not msg:
        print 'Setting version to', release_version
        set_version( release_version )
        svn_commit( 'Release ' + release_version )
        tag_url = svn_join_url( SVN_TAG_ROOT, release_version )
        if svn_check_if_tag_exist( tag_url ):
            if options.retag_release:
                svn_remove_tag( tag_url, 'Overwriting previous tag' )
            else:
                print 'Aborting, tag %s already exists. Use --retag to overwrite it!' % tag_url
                sys.exit( 1 )
        svn_tag_sandbox( tag_url, 'Release ' + release_version )

        print 'Generating doxygen documentation...'
        ## doc_dirname = r'jsoncpp-api-html-0.5.0'
        ## doc_tarball_path = r'e:\prg\vc\Lib\jsoncpp-trunk\dist\jsoncpp-api-html-0.5.0.tar.gz'
        doc_tarball_path, doc_dirname = doxybuild.build_doc( options, make_release=True )
        doc_distcheck_dir = 'dist/doccheck'
        tarball.decompress( doc_tarball_path, doc_distcheck_dir )
        doc_distcheck_top_dir = os.path.join( doc_distcheck_dir, doc_dirname )

        export_dir = 'dist/export'
        svn_export( tag_url, export_dir )
        fix_sources_eol( export_dir )

        source_dir = 'jsoncpp-src-' + release_version
        source_tarball_path = 'dist/%s.tar.gz' % source_dir
        print 'Generating source tarball to', source_tarball_path
        tarball.make_tarball( source_tarball_path, [export_dir], export_dir, prefix_dir=source_dir )

        # Decompress source tarball, download and install scons-local
        distcheck_dir = 'dist/distcheck'
        distcheck_top_dir = distcheck_dir + '/' + source_dir
        print 'Decompressing source tarball to', distcheck_dir
        rmdir_if_exist( distcheck_dir )
        tarball.decompress( source_tarball_path, distcheck_dir )
        scons_local_path = 'dist/scons-local.tar.gz'
        print 'Downloading scons-local to', scons_local_path
        download( SCONS_LOCAL_URL, scons_local_path )
        print 'Decompressing scons-local to', distcheck_top_dir
        tarball.decompress( scons_local_path, distcheck_top_dir )

        # Run compilation
        print 'Compiling decompressed tarball'
        all_build_status = True
        for platform in options.platforms.split(','):
            print 'Testing platform:', platform
            build_status, log_path = check_compile( distcheck_top_dir, platform )
            print 'see build log:', log_path
            print build_status and '=> ok' or '=> FAILED'
            all_build_status = all_build_status and build_status
        if not all_build_status:
            print 'Testing failed on at least one platform, aborting...'
            svn_remove_tag( tag_url, 'Removing tag due to failed testing' )
            sys.exit(1)
        if options.user:
            if not options.no_web:
                print 'Uploading documentation using user', options.user
                sourceforge_web_synchro( SOURCEFORGE_PROJECT, doc_distcheck_top_dir, user=options.user, sftp=options.sftp )
                print 'Completed documentation upload'
            print 'Uploading source and documentation tarballs for release using user', options.user
            sourceforge_release_tarball( SOURCEFORGE_PROJECT,
                                         [source_tarball_path, doc_tarball_path],
                                         user=options.user, sftp=options.sftp )
            print 'Source and doc release tarballs uploaded'
        else:
            print 'No upload user specified. Web site and download tarballs were not uploaded.'
            print 'Doc tarball can be found at:', doc_tarball_path

        # Set next version number and commit
        set_version( next_version )
        svn_commit( 'Released ' + release_version )
    else:
        sys.stderr.write( msg + '\n' )

if __name__ == '__main__':
    main()
@ -1,116 +0,0 @@
# Big issue:
# emitter depends on doxyfile which is generated from doxyfile.in.
# build fails after cleaning and relaunching the build.

# Todo:
# Add helper function to environment like for glob
# Easier passage of header/footer
# Automatic deduction of index.html path based on custom parameters passed to doxyfile

import os
import os.path
from fnmatch import fnmatch
import SCons

def Doxyfile_emitter(target, source, env):
    """
    Modify the target and source lists to use the defaults if nothing
    else has been specified.

    Dependencies on external HTML documentation references are also
    appended to the source list.
    """
    doxyfile_template = env.File(env['DOXYFILE_FILE'])
    source.insert(0, doxyfile_template)

    return target, source

def Doxyfile_Builder(target, source, env):
    """Input:
    DOXYFILE_FILE
        Path of the template file for the output doxyfile

    DOXYFILE_DICT
        A dictionary of parameters to append to the generated doxyfile
    """
    subdir = os.path.split(source[0].abspath)[0]
    doc_top_dir = os.path.split(target[0].abspath)[0]
    doxyfile_path = source[0].abspath
    doxy_file = file( target[0].abspath, 'wt' )
    try:
        # First, output the template file
        try:
            f = file(doxyfile_path, 'rt')
            doxy_file.write( f.read() )
            f.close()
            doxy_file.write( '\n' )
            doxy_file.write( '# Generated content:\n' )
        except:
            raise SCons.Errors.UserError, "Can't read doxygen template file '%s'" % doxyfile_path
        # Then, the input files
        doxy_file.write( 'INPUT = \\\n' )
        for src in source:
            if src.abspath != doxyfile_path: # skip the doxyfile template, which is the first source
                doxy_file.write( '"%s" \\\n' % src.abspath )
        doxy_file.write( '\n' )
        # Dot...
        values_dict = { 'HAVE_DOT': env.get('DOT') and 'YES' or 'NO',
                        'DOT_PATH': env.get('DOT') and os.path.split(env['DOT'])[0] or '',
                        'OUTPUT_DIRECTORY': doc_top_dir,
                        'WARN_LOGFILE': target[0].abspath + '-warning.log'}
        values_dict.update( env['DOXYFILE_DICT'] )
        # Finally, output user dictionary values which override any of the previously set parameters.
        for key, value in values_dict.iteritems():
            doxy_file.write ('%s = "%s"\n' % (key, str(value)))
    finally:
        doxy_file.close()

def generate(env):
    """
    Add builders and construction variables for the
    Doxygen tool.
    """
    ## Doxyfile builder
    def doxyfile_message (target, source, env):
        return "creating Doxygen config file '%s'" % target[0]

    doxyfile_variables = [
        'DOXYFILE_DICT',
        'DOXYFILE_FILE'
        ]

    #doxyfile_action = SCons.Action.Action( Doxyfile_Builder, doxyfile_message,
    #                                       doxyfile_variables )
    doxyfile_action = SCons.Action.Action( Doxyfile_Builder, doxyfile_message)

    doxyfile_builder = SCons.Builder.Builder( action = doxyfile_action,
                                              emitter = Doxyfile_emitter )

    env['BUILDERS']['Doxyfile'] = doxyfile_builder
    env['DOXYFILE_DICT'] = {}
    env['DOXYFILE_FILE'] = 'doxyfile.in'

    ## Doxygen builder
    def Doxygen_emitter(target, source, env):
        output_dir = str( source[0].dir )
        if str(target[0]) == str(source[0]):
            target = env.File( os.path.join( output_dir, 'html', 'index.html' ) )
        return target, source

    doxygen_action = SCons.Action.Action( [ '$DOXYGEN_COM'] )
    doxygen_builder = SCons.Builder.Builder( action = doxygen_action,
                                             emitter = Doxygen_emitter )
    env['BUILDERS']['Doxygen'] = doxygen_builder
    env['DOXYGEN_COM'] = '$DOXYGEN $DOXYGEN_FLAGS $SOURCE'
    env['DOXYGEN_FLAGS'] = ''
    env['DOXYGEN'] = 'doxygen'

    dot_path = env.WhereIs("dot")
    if dot_path:
        env['DOT'] = dot_path

def exists(env):
    """
    Make sure doxygen exists.
    """
    return env.Detect("doxygen")

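# Usage sketch (illustrative; assumes this file is installed as a SCons tool
# named 'doxygen' on the toolpath, and all names and paths are placeholders):
#
#   env = Environment( tools=['default', 'doxygen'], toolpath=['scons-tools'] )
#   doxyfile = env.Doxyfile( target='dist/doxyfile',
#                            source=['include/json/json.h'] )
#   env.Doxygen( source=doxyfile )
#
# The emitter prepends the DOXYFILE_FILE template to the sources; the
# remaining sources are written as INPUT entries in the generated doxyfile.
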
@ -304,6 +304,11 @@ TestCase::TestCase()
}


TestCase::~TestCase()
{
}


void
TestCase::run( TestResult &result )
{
@ -1,252 +1,254 @@
#ifndef JSONTEST_H_INCLUDED
# define JSONTEST_H_INCLUDED

# include <json/config.h>
# include <stdio.h>
# include <deque>
# include <string>

// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// Mini Unit Testing framework
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////


/** \brief Unit testing framework.
 * \warning: all assertions are non-aborting, test case execution will continue
 *           even if an assertion fails.
 *           This constraint is for portability: the framework needs to compile
 *           on Visual Studio 6 and must not require exception usage.
 */
namespace JsonTest {


   class Failure
   {
   public:
      const char *file_;
      unsigned int line_;
      std::string expr_;
      std::string message_;
      unsigned int nestingLevel_;
   };


   /// Context used to create the assertion callstack on failure.
   /// Must be a POD to allow inline initialisation without stepping
   /// into the debugger.
   struct PredicateContext
   {
      typedef unsigned int Id;
      Id id_;
      const char *file_;
      unsigned int line_;
      const char *expr_;
      PredicateContext *next_;
      /// Related Failure, set when the PredicateContext is converted
      /// into a Failure.
      Failure *failure_;
   };

   class TestResult
   {
   public:
      TestResult();

      /// \internal Implementation detail for assertion macros
      /// Not encapsulated to prevent step into when debugging failed assertions
      /// Incremented by one on assertion predicate entry, decreased by one
      /// by addPredicateContext().
      PredicateContext::Id predicateId_;

      /// \internal Implementation detail for predicate macros
      PredicateContext *predicateStackTail_;

      void setTestName( const std::string &name );

      /// Adds an assertion failure.
      TestResult &addFailure( const char *file, unsigned int line,
                              const char *expr = 0 );

      /// Removes the last PredicateContext added to the predicate stack
      /// chained list.
      /// Next messages will be targeted at the PredicateContext that was removed.
      TestResult &popPredicateContext();

      bool failed() const;

      void printFailure( bool printTestName ) const;

      TestResult &operator << ( bool value );
      TestResult &operator << ( int value );
      TestResult &operator << ( unsigned int value );
      TestResult &operator << ( double value );
      TestResult &operator << ( const char *value );
      TestResult &operator << ( const std::string &value );

   private:
      TestResult &addToLastFailure( const std::string &message );
      unsigned int getAssertionNestingLevel() const;
      /// Adds a failure or a predicate context
      void addFailureInfo( const char *file, unsigned int line,
                           const char *expr, unsigned int nestingLevel );
      static std::string indentText( const std::string &text,
                                     const std::string &indent );

      typedef std::deque<Failure> Failures;
      Failures failures_;
      std::string name_;
      PredicateContext rootPredicateNode_;
      PredicateContext::Id lastUsedPredicateId_;
      /// Failure which is the target of the messages added using operator <<
      Failure *messageTarget_;
   };


   class TestCase
   {
   public:
      TestCase();

      virtual ~TestCase();

      void run( TestResult &result );

      virtual const char *testName() const = 0;

   protected:
      TestResult *result_;

   private:
      virtual void runTestCase() = 0;
   };

   /// Function pointer type for TestCase factory
   typedef TestCase *(*TestCaseFactory)();

   class Runner
   {
   public:
      Runner();

      /// Adds a test to the suite
      Runner &add( TestCaseFactory factory );

      /// Runs tests as specified on the command-line.
      /// If no command-line arguments are provided, run all tests.
      /// If --list-tests is provided, then print the list of all test cases.
      /// If --test <testname> is provided, then run test testname.
      int runCommandLine( int argc, const char *argv[] ) const;

      /// Runs all the test cases
      bool runAllTest( bool printSummary ) const;

      /// Returns the number of test cases in the suite
      unsigned int testCount() const;

      /// Returns the name of the test case at the specified index
      std::string testNameAt( unsigned int index ) const;

      /// Runs the test case at the specified index using the specified TestResult
      void runTestAt( unsigned int index, TestResult &result ) const;

      static void printUsage( const char *appName );

   private: // prevents copy construction and assignment
      Runner( const Runner &other );
      Runner &operator =( const Runner &other );

   private:
      void listTests() const;
      bool testIndex( const std::string &testName, unsigned int &index ) const;
      static void preventDialogOnCrash();

   private:
      typedef std::deque<TestCaseFactory> Factories;
      Factories tests_;
   };

   template<typename T>
   TestResult &
   checkEqual( TestResult &result, const T &expected, const T &actual,
               const char *file, unsigned int line, const char *expr )
   {
      if ( expected != actual )
      {
         result.addFailure( file, line, expr );
         result << "Expected: " << expected << "\n";
         result << "Actual : " << actual;
      }
      return result;
   }

   TestResult &
   checkStringEqual( TestResult &result,
                     const std::string &expected, const std::string &actual,
                     const char *file, unsigned int line, const char *expr );

} // namespace JsonTest


/// \brief Asserts that the given expression is true.
/// JSONTEST_ASSERT( x == y ) << "x=" << x << ", y=" << y;
/// JSONTEST_ASSERT( x == y );
#define JSONTEST_ASSERT( expr )                                     \
   if ( expr )                                                      \
   {                                                                \
   }                                                                \
   else                                                             \
      result_->addFailure( __FILE__, __LINE__, #expr )

/// \brief Asserts that the given predicate is true.
/// The predicate may do other assertions and be a member function of the fixture.
#define JSONTEST_ASSERT_PRED( expr )                                \
   {                                                                \
      JsonTest::PredicateContext _minitest_Context = {              \
         result_->predicateId_, __FILE__, __LINE__, #expr };        \
      result_->predicateStackTail_->next_ = &_minitest_Context;     \
      result_->predicateId_ += 1;                                   \
      result_->predicateStackTail_ = &_minitest_Context;            \
      (expr);                                                       \
      result_->popPredicateContext();                               \
   }                                                                \
   *result_

/// \brief Asserts that two values are equal.
#define JSONTEST_ASSERT_EQUAL( expected, actual )        \
   JsonTest::checkEqual( *result_, expected, actual,     \
                         __FILE__, __LINE__,             \
                         #expected " == " #actual )

/// \brief Asserts that two strings are equal.
#define JSONTEST_ASSERT_STRING_EQUAL( expected, actual )                   \
   JsonTest::checkStringEqual( *result_,                                   \
                               std::string(expected), std::string(actual), \
                               __FILE__, __LINE__,                         \
                               #expected " == " #actual )

/// \brief Begin a fixture test case.
#define JSONTEST_FIXTURE( FixtureType, name )                  \
   class Test##FixtureType##name : public FixtureType          \
   {                                                           \
   public:                                                     \
      static JsonTest::TestCase *factory()                     \
      {                                                        \
         return new Test##FixtureType##name();                 \
      }                                                        \
   public: /* overridden from TestCase */                      \
      virtual const char *testName() const                     \
      {                                                        \
         return #FixtureType "/" #name;                        \
      }                                                        \
      virtual void runTestCase();                              \
   };                                                          \
                                                               \
   void Test##FixtureType##name::runTestCase()

#define JSONTEST_FIXTURE_FACTORY( FixtureType, name ) \
   &Test##FixtureType##name::factory

#define JSONTEST_REGISTER_FIXTURE( runner, FixtureType, name ) \
   (runner).add( JSONTEST_FIXTURE_FACTORY( FixtureType, name ) )
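
/// Usage sketch (illustrative; MyFixture and valueIsOne are placeholder names,
/// mirroring the pattern of the unit tests in main.cpp):
///
///   struct MyFixture : JsonTest::TestCase
///   {
///      int value_;
///      MyFixture() : value_( 1 ) {}
///   };
///
///   JSONTEST_FIXTURE( MyFixture, valueIsOne )
///   {
///      JSONTEST_ASSERT_EQUAL( 1, value_ );
///   }
///
///   // In main(): JSONTEST_REGISTER_FIXTURE( runner, MyFixture, valueIsOne );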

#endif // ifndef JSONTEST_H_INCLUDED
@ -1,244 +1,244 @@
#include <json/json.h>
#include "jsontest.h"


// TODO:
// - Boolean values report that they are integral. They should not.
// - Unsigned integers within the signed integer range are not considered to be valid integers. The range should be checked.


// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// Json Library test cases
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////


struct ValueTest : JsonTest::TestCase
{
   Json::Value null_;
   Json::Value emptyArray_;
   Json::Value emptyObject_;
   Json::Value integer_;
   Json::Value unsignedInteger_;
   Json::Value smallUnsignedInteger_;
   Json::Value real_;
   Json::Value array1_;
   Json::Value object1_;
   Json::Value emptyString_;
   Json::Value string1_;
   Json::Value string_;
   Json::Value true_;
   Json::Value false_;

   ValueTest()
      : emptyArray_( Json::arrayValue )
      , emptyObject_( Json::objectValue )
      , integer_( 123456789 )
      , smallUnsignedInteger_( Json::Value::UInt( Json::Value::maxInt ) )
      , unsignedInteger_( 34567890u )
      , real_( 1234.56789 )
      , emptyString_( "" )
      , string1_( "a" )
      , string_( "sometext with space" )
      , true_( true )
      , false_( false )
   {
      array1_.append( 1234 );
      object1_["id"] = 1234;
   }

   struct IsCheck
   {
      /// Initialize all checks to \c false by default.
      IsCheck();

      bool isObject_;
      bool isArray_;
      bool isBool_;
      bool isDouble_;
      bool isInt_;
      bool isUInt_;
      bool isIntegral_;
      bool isNumeric_;
      bool isString_;
      bool isNull_;
   };

   void checkConstMemberCount( const Json::Value &value, unsigned int expectedCount );

   void checkMemberCount( Json::Value &value, unsigned int expectedCount );

   void checkIs( const Json::Value &value, const IsCheck &check );
};


JSONTEST_FIXTURE( ValueTest, size )
{
   JSONTEST_ASSERT_PRED( checkMemberCount(emptyArray_, 0) );
   JSONTEST_ASSERT_PRED( checkMemberCount(emptyObject_, 0) );
   JSONTEST_ASSERT_PRED( checkMemberCount(array1_, 1) );
   JSONTEST_ASSERT_PRED( checkMemberCount(object1_, 1) );
   JSONTEST_ASSERT_PRED( checkMemberCount(null_, 0) );
   JSONTEST_ASSERT_PRED( checkMemberCount(integer_, 0) );
   JSONTEST_ASSERT_PRED( checkMemberCount(real_, 0) );
   JSONTEST_ASSERT_PRED( checkMemberCount(emptyString_, 0) );
   JSONTEST_ASSERT_PRED( checkMemberCount(string_, 0) );
   JSONTEST_ASSERT_PRED( checkMemberCount(true_, 0) );
}


JSONTEST_FIXTURE( ValueTest, isObject )
{
   IsCheck checks;
   checks.isObject_ = true;
   JSONTEST_ASSERT_PRED( checkIs( emptyObject_, checks ) );
   JSONTEST_ASSERT_PRED( checkIs( object1_, checks ) );
}


JSONTEST_FIXTURE( ValueTest, isArray )
{
   IsCheck checks;
   checks.isArray_ = true;
   JSONTEST_ASSERT_PRED( checkIs( emptyArray_, checks ) );
   JSONTEST_ASSERT_PRED( checkIs( array1_, checks ) );
}


JSONTEST_FIXTURE( ValueTest, isNull )
{
   IsCheck checks;
   checks.isNull_ = true;
   checks.isObject_ = true;
   checks.isArray_ = true;
   JSONTEST_ASSERT_PRED( checkIs( null_, checks ) );
}


JSONTEST_FIXTURE( ValueTest, isString )
{
   IsCheck checks;
   checks.isString_ = true;
   JSONTEST_ASSERT_PRED( checkIs( emptyString_, checks ) );
   JSONTEST_ASSERT_PRED( checkIs( string_, checks ) );
   JSONTEST_ASSERT_PRED( checkIs( string1_, checks ) );
}


JSONTEST_FIXTURE( ValueTest, isBool )
{
   IsCheck checks;
   checks.isBool_ = true;
   checks.isIntegral_ = true;
   checks.isNumeric_ = true;
   JSONTEST_ASSERT_PRED( checkIs( false_, checks ) );
   JSONTEST_ASSERT_PRED( checkIs( true_, checks ) );
}


JSONTEST_FIXTURE( ValueTest, isDouble )
{
   IsCheck checks;
   checks.isDouble_ = true;
   checks.isNumeric_ = true;
   JSONTEST_ASSERT_PRED( checkIs( real_, checks ) );
}


JSONTEST_FIXTURE( ValueTest, isInt )
{
   IsCheck checks;
   checks.isInt_ = true;
   checks.isNumeric_ = true;
   checks.isIntegral_ = true;
   JSONTEST_ASSERT_PRED( checkIs( integer_, checks ) );
}


JSONTEST_FIXTURE( ValueTest, isUInt )
{
   IsCheck checks;
   checks.isUInt_ = true;
   checks.isNumeric_ = true;
   checks.isIntegral_ = true;
   JSONTEST_ASSERT_PRED( checkIs( unsignedInteger_, checks ) );
   JSONTEST_ASSERT_PRED( checkIs( smallUnsignedInteger_, checks ) );
}


void
ValueTest::checkConstMemberCount( const Json::Value &value, unsigned int expectedCount )
{
   unsigned int count = 0;
   Json::Value::const_iterator itEnd = value.end();
   for ( Json::Value::const_iterator it = value.begin(); it != itEnd; ++it )
   {
      ++count;
   }
   JSONTEST_ASSERT_EQUAL( expectedCount, count ) << "Json::Value::const_iterator";
}

void
ValueTest::checkMemberCount( Json::Value &value, unsigned int expectedCount )
{
   JSONTEST_ASSERT_EQUAL( expectedCount, value.size() );

   unsigned int count = 0;
   Json::Value::iterator itEnd = value.end();
   for ( Json::Value::iterator it = value.begin(); it != itEnd; ++it )
   {
      ++count;
   }
   JSONTEST_ASSERT_EQUAL( expectedCount, count ) << "Json::Value::iterator";

   JSONTEST_ASSERT_PRED( checkConstMemberCount(value, expectedCount) );
}


ValueTest::IsCheck::IsCheck()
   : isObject_( false )
   , isArray_( false )
   , isBool_( false )
   , isDouble_( false )
   , isInt_( false )
   , isUInt_( false )
   , isIntegral_( false )
   , isNumeric_( false )
   , isString_( false )
   , isNull_( false )
{
}


void
ValueTest::checkIs( const Json::Value &value, const IsCheck &check )
{
   JSONTEST_ASSERT_EQUAL( check.isObject_, value.isObject() );
   JSONTEST_ASSERT_EQUAL( check.isArray_, value.isArray() );
   JSONTEST_ASSERT_EQUAL( check.isBool_, value.isBool() );
   JSONTEST_ASSERT_EQUAL( check.isDouble_, value.isDouble() );
   JSONTEST_ASSERT_EQUAL( check.isInt_, value.isInt() );
   JSONTEST_ASSERT_EQUAL( check.isUInt_, value.isUInt() );
   JSONTEST_ASSERT_EQUAL( check.isIntegral_, value.isIntegral() );
   JSONTEST_ASSERT_EQUAL( check.isNumeric_, value.isNumeric() );
   JSONTEST_ASSERT_EQUAL( check.isString_, value.isString() );
   JSONTEST_ASSERT_EQUAL( check.isNull_, value.isNull() );
}


int main( int argc, const char *argv[] )
{
   JsonTest::Runner runner;
   JSONTEST_REGISTER_FIXTURE( runner, ValueTest, size );
   JSONTEST_REGISTER_FIXTURE( runner, ValueTest, isObject );
   JSONTEST_REGISTER_FIXTURE( runner, ValueTest, isArray );
   JSONTEST_REGISTER_FIXTURE( runner, ValueTest, isBool );
   JSONTEST_REGISTER_FIXTURE( runner, ValueTest, isInt );
   JSONTEST_REGISTER_FIXTURE( runner, ValueTest, isUInt );
   JSONTEST_REGISTER_FIXTURE( runner, ValueTest, isDouble );
   JSONTEST_REGISTER_FIXTURE( runner, ValueTest, isString );
   JSONTEST_REGISTER_FIXTURE( runner, ValueTest, isNull );
   return runner.runCommandLine( argc, argv );
}
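
// Example invocations (illustrative; the executable name comes from the
// 'test_lib_json' target built by the SConscript below):
//
//   test_lib_json                        runs all the tests
//   test_lib_json --list-tests           prints the list of test cases
//   test_lib_json --test ValueTest/size  runs a single test case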
@ -1,10 +1,10 @@
Import( 'env_testing buildUnitTests' )

buildUnitTests( env_testing, Split( """
    main.cpp
    jsontest.cpp
    """ ),
    'test_lib_json' )

# For 'check' to work, 'libs' must be built first.
env_testing.Depends('test_lib_json', '#libs')