Compare commits

..

581 Commits

Author SHA1 Message Date
Christopher Dunn
1021ff66d3 partially revert 'Added features that allow the reader to accept common non-standard JSON.'
revert '642befc836ac5093b528e7d8b4fd66b66735a98c',
but keep the *added* methods for `decodedNumber()` and `decodedDouble()`.
2015-03-15 14:25:04 -05:00
Christopher Dunn
f3aa9c1201 partially revert 'fix bug for static init'
re: 28836b8acc

A global instance of a Value (viz. 'null') was a mistake,
but dropping it breaks binary-compatibility. So we will keep it
everywhere except the one platform where it was crashing, ARM.
2015-03-15 14:25:04 -05:00
Christopher Dunn
b0879b2ab8 revert 'Made it possible to drop null placeholders from array output.'
revert ae3c7a7aab
2015-03-15 14:25:04 -05:00
Christopher Dunn
3feaf29075 Revert "added option to FastWriter which omits the trailing new line character"
This reverts commit 5bf16105b5.
2015-03-15 14:25:04 -05:00
Christopher Dunn
360153687d revert 'Added structured error reporting to Reader.'
revert 68db655347
issue #147
2015-03-15 14:25:04 -05:00
Christopher Dunn
a67d378236 revert 'Add public semantic error reporting'
for binary-compatibility with 0.6.0
issue #147
was #57
2015-03-15 14:25:04 -05:00
Christopher Dunn
e7f6e91381 partially revert "Switch to copy-and-swap idiom for operator=."
This partially reverts commit 45cd9490cd.

Ignored ValueInternal* changes, since those did not produce symbols for
Debian build. (They must not have used the INTERNAL stuff.)

Ignored CZString changes since those are private (and sizeof struct did
not change).

  https://github.com/open-source-parsers/jsoncpp/issues/78

Conflicts:
	include/json/value.h
	src/lib_json/json_internalarray.inl
	src/lib_json/json_internalmap.inl
	src/lib_json/json_value.cpp
2015-03-15 14:25:04 -05:00
Christopher Dunn
f025d8d225 NOT C++11 2015-03-15 14:25:04 -05:00
Christopher Dunn
a3007ee301 0.10.z (based on 1.6.z, but binary-compat w/ 0.6.0-rc2) 2015-03-15 14:24:50 -05:00
Christopher Dunn
c2040ec371 prefer std::string for setComment()
in case of embedded nulls
2015-03-15 13:50:00 -05:00
Christopher Dunn
cbe7e7c9cb Merge pull request #221 from btolfa/forgotten-virtual-dtor
Added forgotten virtual dtor for `Json::CharReader::Factory`.

(Without this, the destructor of the derived `CharReaderBuilder` would not be called, which is a small memory leak.)
2015-03-15 13:49:24 -05:00
Tengiz Sharafiev
be183def8f Update reader.h 2015-03-14 21:30:00 +03:00
Christopher Dunn
951bd3d05d Merge pull request #219 from cdunn2001/c-std-headers
Close #218. Fix #214.
2015-03-11 21:36:51 -05:00
Connor Manning
1c58876185 Use std namespace for snprintf. 2015-03-11 21:33:08 -05:00
Connor Manning
2f2034629e Constrain MSVC _isfinite to before 2013, remove duplicate includes. 2015-03-11 21:33:08 -05:00
Dani-Hub
7020451b44 Fix isfinite for MSVC. 2015-03-11 21:32:59 -05:00
Connor Manning
80497f102e Use C++ standard headers. 2015-03-10 18:48:45 -05:00
Dani-Hub
f9feb66be2 Change exception data member
from "reference to string" to "string" (Resolves the most serious part of issue #216)
2015-03-09 18:42:16 -05:00
Christopher Dunn
ed495edcc1 prefer ValueIterator::name() to ::memberName()
in case of embedded nulls
2015-03-08 14:35:00 -05:00
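A minimal sketch of the preference stated above, assuming the public jsoncpp iterator API of this era: `name()` returns a `std::string`, so keys with embedded `'\0'` bytes survive, whereas `memberName()` returns a C string that stops at the first null.

```cpp
#include <json/json.h>
#include <iostream>
#include <string>

// Iterate an object and print its keys via name(), which preserves
// embedded '\0' bytes, unlike the C string returned by memberName().
void printKeys(const Json::Value& obj) {
  for (Json::Value::const_iterator it = obj.begin(); it != obj.end(); ++it)
    std::cout << it.name() << '\n';
}
```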
Christopher Dunn
3c0a383877 Merge pull request #212 from cdunn2001/macro-deprec
close #210
2015-03-08 13:10:37 -05:00
Dani-Hub
5003983029 Make preprocessor query robust against older gcc versions 2015-03-08 13:07:27 -05:00
Dani-Hub
871b311e7e Provide JSONCPP_DEPRECATED definitions for clang and gcc 2015-03-08 13:07:27 -05:00
Christopher Dunn
cdbc35f6ac 1.6.0 2015-03-08 12:57:13 -05:00
Christopher Dunn
4e30c4fcdb comments 2015-03-08 12:56:32 -05:00
Christopher Dunn
0d33cb3639 Merge pull request #211 from cdunn2001/except
* Add Json::Exception and derivatives.
* Clarify when exceptions are thrown, to avoid crashes caused by malicious input.
* Use our own type (derived from std::exception) so they are trappable.
2015-03-08 12:50:34 -05:00
Christopher Dunn
2250b3c29d use Json::RuntimeError 2015-03-08 12:44:55 -05:00
Christopher Dunn
9376368d86 use Json::LogicError in macros 2015-03-08 12:42:53 -05:00
Christopher Dunn
5383794cc9 Runtime/LogicError and throwers 2015-03-08 12:31:57 -05:00
Christopher Dunn
75279ccec2 base Json::Exception 2015-03-08 12:20:06 -05:00
Christopher Dunn
717b08695e clarify errors
* use macros for logic errors, not input errors
* throw on parsing failure in `operator>>()`, not assert
* throw on malloc, not assert
2015-03-08 12:06:22 -05:00
Christopher Dunn
ee4ea0ec3f delete debug code from test 2015-03-07 15:47:39 -06:00
Christopher Dunn
ce19001238 require length
Ugh! I meant to do this long ago. It would have caught my blunder.
2015-03-07 15:12:52 -06:00
Christopher Dunn
078f991c57 1.5.4 <- 1.5.3
important bug-fix (thx to datadiode@)
2015-03-07 14:52:01 -06:00
Christopher Dunn
72b5293695 Merge pull request #207 from cdunn2001/fix_CZString_copy_constructor
Fix czstring copy constructor
2015-03-07 14:49:54 -06:00
Christopher Dunn
a63d82d78a drop unused CString ctor case
`Value::CZString::CZString(char const* str, unsigned length, DuplicationPolicy allocate)` with `allocate == duplicate` does not happen.
2015-03-07 14:43:37 -06:00
datadiode
ee83f8891c Trivial fixes in CZString constructors. 2015-03-07 14:43:07 -06:00
Christopher Dunn
5c448687e1 fix ValueTest/zeroes* 2015-03-07 14:41:15 -06:00
Christopher Dunn
401e98269e old-style enum namespacing 2015-03-06 16:11:55 -06:00
Christopher Dunn
b2a7438d08 Merge pull request #205 from open-source-parsers/reject-dup-keys
[Shekhar (shakers007) wrote](https://sourceforge.net/p/jsoncpp/bugs/22/):

> As per RFC4627 (section 2.2), names within an object should be unique. When using JSONCPP's strict mode, parsing such an object should fail.
2015-03-06 12:58:55 -06:00
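A minimal sketch of the requested behavior, assuming the `CharReaderBuilder` settings of this series (strict mode enables `rejectDupKeys`): parsing an object with a repeated key should fail.

```cpp
#include <json/json.h>
#include <iostream>
#include <sstream>
#include <string>

int main() {
  Json::CharReaderBuilder builder;
  Json::CharReaderBuilder::strictMode(&builder.settings_);  // sets rejectDupKeys
  Json::Value root;
  std::string errs;
  std::istringstream doc("{\"a\": 1, \"a\": 2}");
  bool ok = Json::parseFromStream(builder, doc, &root, &errs);
  std::cout << (ok ? "parsed" : "rejected: " + errs) << '\n';
}
```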
Christopher Dunn
62ad140d18 rejectDupKeys 2015-03-06 12:39:05 -06:00
Christopher Dunn
527332d5d5 add rejectDupKeys feature - not yet impld 2015-03-06 12:38:58 -06:00
Christopher Dunn
cada3b951f test for repeated key in strictMode
https://sourceforge.net/p/jsoncpp/bugs/22/
2015-03-06 12:38:00 -06:00
Christopher Dunn
ff61752444 change str_ for cross-compilation
https://sourceforge.net/p/jsoncpp/bugs/59/
2015-03-06 10:31:46 -06:00
Christopher Dunn
7f439f4276 clarify operator= 2015-03-06 09:22:57 -06:00
Christopher Dunn
3976f17ffd test assignment over-writes comments, but swapPayload() does not 2015-03-06 09:16:19 -06:00
Christopher Dunn
80ca11bb41 test commentBefore
for issue #203
2015-03-06 05:55:19 -06:00
Christopher Dunn
2fc08b4ebd clarify which versions work with old compilers 2015-03-05 21:45:42 -06:00
Christopher Dunn
239c733ab5 1.5.3 <- 1.5.2 2015-03-05 18:27:52 -06:00
Christopher Dunn
295e73ff3c generate both version.h and version from CMakelists.txt
This forces consistency, since they will be re-generated whenever
a git operation alters CMakelists.txt. They are still in the repo
because users might not actually run cmake.
2015-03-05 18:27:39 -06:00
Christopher Dunn
2a840c105c had trouble finding Python on Windows
With this change, `make jsoncpp_check` will still fail if Python
is missing, so our CI tests are unaffected.
2015-03-05 17:42:12 -06:00
Christopher Dunn
7ec98dc9fe Merge pull request #202 from open-source-parsers/get-with-zero
`Value::get(key, default)` with zero
2015-03-05 16:56:56 -06:00
Christopher Dunn
0fd2875a44 fix get() for embedded zeroes in key
This method had been overlooked.
2015-03-05 16:47:29 -06:00
Christopher Dunn
d31151d150 test get(key, default) 2015-03-05 16:44:50 -06:00
Christopher Dunn
f50145fbda Merge pull request #201 from open-source-parsers/vs
* Copy .dll for running unit-tests in VisualStudio.
* Stop using `do{}while(0)` idiom b/c VisualStudio warns.
* Fix a warning in a test.
2015-03-05 16:37:42 -06:00
Christopher Dunn
b3e6f3d70f drop do{}while(0) idiom
Rationale:
  * http://stackoverflow.com/questions/154136/do-while-and-if-else-statements-in-c-c-macros/154138#154138

But Visual Studio issues a warning: `warning C4127: conditional expression is constant`
  * http://stackoverflow.com/questions/1946445/c-c-how-to-use-the-do-while0-construct-without-compiler-warnings-like-c412
2015-03-05 15:26:29 -06:00
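For context, a hedged sketch of the trade-off (`LOG_TWICE` is a made-up macro, not one of jsoncpp's): the `do{}while(0)` wrapper makes a multi-statement macro behave like a single statement, but MSVC flags its constant condition as C4127.

```cpp
#include <cstdio>

// Classic form: safe in `if (x) MACRO(); else ...`, but MSVC emits
// "warning C4127: conditional expression is constant" for the while (0).
#define LOG_TWICE(msg)                                                         \
  do {                                                                         \
    std::puts(msg);                                                            \
    std::puts(msg);                                                            \
  } while (0)

// One warning-free replacement: a plain block, which needs a little more
// care at call sites (no trailing semicolon before an else).
#define LOG_TWICE_NOWARN(msg)                                                  \
  {                                                                            \
    std::puts(msg);                                                            \
    std::puts(msg);                                                            \
  }

int main() {
  LOG_TWICE("hello");
  LOG_TWICE_NOWARN("world")
}
```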
Christopher Dunn
13e063c336 copy .dll for unit-test
Fix 2nd problem in issue #200.
* http://stackoverflow.com/questions/10671916/how-to-copy-dll-files-into-the-same-folder-as-the-executable-using-cmake

Q: What about the Python tests?
A: They are not normally run in Visual Studio. If desired, one can set PATH.
2015-03-05 15:23:20 -06:00
Christopher Dunn
f57da48f48 maybe address warning
cmake -DCMAKE_BUILD_TYPE=debug -DJSONCPP_LIB_BUILD_STATIC=OFF
-DJSONCPP_LIB_BUILD_SHARED=ON -G "Visual Studio 10" ../..

`potentially uninitialized local variable 'dist' (line 2212 of
test_lib_json/main.cpp)`
2015-03-05 14:41:42 -06:00
Christopher Dunn
eaa3fd8eca maybe fix DLL symbols for VS 2015-03-05 10:11:43 -06:00
Christopher Dunn
ff63d034e5 1.5.2 <- 1.5.1
* Fixed compile error for VS2013.
* Added `operator[]` to Builders. Recalling previous minor versions for
  bug-fixes.
2015-03-05 09:18:44 -06:00
Christopher Dunn
2aa4557e79 Merge pull request #199 from open-source-parsers/set-builders
Setting builders
2015-03-05 09:18:06 -06:00
Christopher Dunn
37dde9d29d fix example in docs 2015-03-05 09:18:06 -06:00
Christopher Dunn
c312dd5ef7 Builder::operator[] plus tests 2015-03-05 09:18:01 -06:00
Christopher Dunn
42d7e59fe0 fix compiler-error and warnings for VS2013
fix issue #200
2015-03-05 09:15:10 -06:00
Christopher Dunn
a8104a8035 1.5.1 <- 1.5.0
Fixed Builders ::validate(). Added tests.
2015-03-04 21:18:05 -06:00
Christopher Dunn
7b22768c33 test Builders::validate() 2015-03-04 21:17:03 -06:00
Christopher Dunn
19c49a459d fix Builders::validate()
(cherry picked from commit 626cfcdbb8)
2015-03-04 21:14:53 -06:00
Christopher Dunn
99822b27cd clarify python version 2015-03-03 16:17:11 -06:00
Christopher Dunn
8a70297869 fix inline doxygen comments 2015-03-03 16:17:08 -06:00
Christopher Dunn
24f544996f no struct init
The C struct initializer is not standard C++.
GCC and Clang handle this (at least in some versions) but some
compilers might not.
2015-03-03 10:15:09 -06:00
Christopher Dunn
0c91927da2 assertions should be logic_error 2015-03-03 09:45:33 -06:00
Christopher Dunn
493f6dcebe keep StaticString (!allocated_) for copy ctor 2015-03-03 09:36:22 -06:00
Christopher Dunn
eaa06355e1 test Json::Value::null 2015-03-03 08:45:52 -06:00
Christopher Dunn
effd732aa1 null -> nullRef 2015-03-03 01:25:33 -06:00
Christopher Dunn
70cd04d49a 1.5.0 <- 1.4.4 2015-03-03 00:45:57 -06:00
Christopher Dunn
9e49e3d84a Merge pull request #197 from cdunn2001/allow-zero
allow zeroes in strings
2015-03-03 00:44:19 -06:00
Christopher Dunn
2d653bd15d fix security hole for string-key-lengths > 2^30 2015-03-03 00:14:54 -06:00
Christopher Dunn
585b267595 tests for zeroes
* ValueTest/zeroes
* ValueTest/zeroesInKeys
2015-03-03 00:14:54 -06:00
Christopher Dunn
c28610fb5d fix StaticString test
* support zeroes in string_
* support zeroes in writer; provide getString(char**, unsigned*)
* valueToQuotedStringN(), isCC0(), etc
* allow zeroes for cpptl ConstString
* allocated => non-static
2015-03-03 00:14:54 -06:00
Christopher Dunn
a53283568f cp duplicateStringValue() 2015-03-03 00:14:53 -06:00
Christopher Dunn
ef21fbc785 doc new behavior 2015-03-03 00:14:53 -06:00
Christopher Dunn
25342bac13 support UTF-8 for const methods 2015-03-03 00:14:53 -06:00
Christopher Dunn
2b9abc3ebf Merge pull request #195 from cdunn2001/len-in-czstring
Store length in CZString
2015-03-03 00:07:03 -06:00
Christopher Dunn
e6b46e4503 stop computing strlen() in CZString 2015-03-02 23:50:59 -06:00
Christopher Dunn
8a77037320 actually store length in CZString 2015-03-02 23:50:59 -06:00
Christopher Dunn
57ad051f67 allow length in CZString 2015-03-02 23:50:59 -06:00
Christopher Dunn
b383fdc61e use memcmp in CZString
This is a loss of efficiency, but it prepares for an increase when we
have stored lengths.
2015-03-02 23:50:59 -06:00
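An illustrative sketch of the direction described above (`Key` is a stand-in, not jsoncpp's `CZString`): once the length is stored, ordering can use `memcmp` over the shorter length, which both speeds up comparison and tolerates embedded `'\0'` bytes.

```cpp
#include <cstring>

// Stand-in for a key that carries its own length.
struct Key {
  const char* data;
  unsigned length;
};

// Compare like strcmp, but bounded by the stored lengths.
inline bool lessThan(const Key& a, const Key& b) {
  unsigned minLen = a.length < b.length ? a.length : b.length;
  int cmp = std::memcmp(a.data, b.data, minLen);
  if (cmp != 0)
    return cmp < 0;
  return a.length < b.length;  // a prefix sorts before the longer key
}
```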
Mattes D
5d79275a5b Added MSVC+CMake generated files to .gitignore.
pull #193
2015-03-02 23:50:14 -06:00
Christopher Dunn
c1e834a110 Merge pull request #194 from cdunn2001/master
test StaticString
2015-03-02 17:14:36 -06:00
Christopher Dunn
0570f9eefb test StaticString 2015-03-02 17:06:09 -06:00
Christopher Dunn
b947b0b3df Merge pull request #192 from marklakata/size_t
changed length argument of duplicateStringValue from unsigned int to siz...
2015-03-02 17:03:24 -06:00
Mark Lakata
8051cf6ba7 changed length argument of duplicateStringValue from unsigned int to size_t, to avoid warnings in Visual Studio. 2015-03-02 11:57:17 -08:00
Christopher Dunn
c8bf184ea9 Merge pull request #189 from open-source-parsers/skip-py-tests
skip python jsontestrunner by default
2015-02-26 09:55:21 -06:00
Christopher Dunn
9998094eee skip python jsontestrunner by default
To run these tests, in cmake build-dir:
    make jsoncpp_check

TravisCI is now set to run these always.

For now, the test_json_lib unit-tests will still run.

issue #187
2015-02-26 09:41:45 -06:00
Christopher Dunn
4ac4cac2be Merge pull request #185 from open-source-parsers/drop-internal-map
`JSON_VALUE_USE_INTERNAL_MAP` and `JSON_USE_SIMPLE_INTERNAL_ALLOCATOR` did not actually compile with current compilers. In fact, as far as I can tell, they haven't worked since just after the 0.6.0-rc2 release. And according to blep, there are bugs, e.g. [this one](https://sourceforge.net/p/jsoncpp/bugs/27). It's probably better to hide the implementation of Value (maybe in the next major release) and to put experiments like this into completely separate classes.
2015-02-25 10:12:20 -06:00
Christopher Dunn
4788764844 drop JSON_VALUE_USE_INTERNAL_MAP, JSON_USE_SIMPLE_INTERNAL_ALLOCATOR
And remove some old headers.

These were not actually compiling anymore, and there were outstanding,
known bugs, e.g. https://sourceforge.net/p/jsoncpp/bugs/27
2015-02-25 10:04:13 -06:00
Christopher Dunn
6c898441e3 test-amalgamate
Not tested via Travis (since this is outside cmake), but at least
we can more easily validate changes to `amalgamate.py`.
2015-02-25 10:04:12 -06:00
Christopher Dunn
7d1f656859 Merge pull request #183 from open-source-parsers/allow-single-quote
Allow single quote
2015-02-24 18:24:42 -06:00
Christopher Dunn
0c66e698fb allowSingleQuotes
issue #182
2015-02-24 15:49:45 -06:00
Christopher Dunn
b9229b7400 failing test for allowSingleQuotes 2015-02-23 16:40:06 -06:00
Christopher Dunn
f9db82af17 1.4.4 < 1.4.3
* fixed Readers when `allowDropNullPlaceholders`
2015-02-19 15:37:57 -06:00
Christopher Dunn
ae71879549 Merge pull request #179 from cdunn2001/drop-null
Fix failures for `allowDroppedNullPlaceholders`.
2015-02-19 12:38:16 -06:00
Christopher Dunn
7b3683ccd1 apply fix to old Reader 2015-02-19 11:37:17 -06:00
Christopher Dunn
58499031a4 fix all cases from issue -- all pass! 2015-02-19 11:35:28 -06:00
Christopher Dunn
2c8c1ac0ec all 8 failing cases from issue 2015-02-19 11:35:20 -06:00
Christopher Dunn
c58e93b014 fix failing object case 2015-02-19 11:34:35 -06:00
Christopher Dunn
eed193e151 object cases; 1st passes, 2nd fails 2015-02-19 11:34:35 -06:00
Christopher Dunn
4382a7b585 cases 0,1,5,9,12 from issue -- passing 2015-02-19 11:32:42 -06:00
Christopher Dunn
30d923f155 1.4.3 <- 1.4.2 2015-02-18 09:20:28 -06:00
Christopher Dunn
2f4e40bc95 fix typo: issue #177 2015-02-18 09:17:06 -06:00
Christopher Dunn
505e086ebc Merge pull request #173 from cdunn2001/fix-include-amal
help use of amalgamated header

Reviewed by "Lightness Races in Orbit" at StackOverflow.
2015-02-15 13:11:38 -06:00
Christopher Dunn
c6582415d8 fix typos 2015-02-15 13:06:48 -06:00
Christopher Dunn
0ee7e2426f help use of amalgamated header
* Include from same-directory first, so `-I DIR` is rarely needed.
* Double-check that proper header has been included.

http://stackoverflow.com/questions/28510109/linking-problems-with-jsoncpp
2015-02-15 13:06:48 -06:00
Christopher Dunn
1522e4dfb1 Merge pull request #174 from kobigurk/kobigurk-patch-asserts
only throw exceptions when JSON_USE_EXCEPTION is defined
2015-02-15 12:05:28 -06:00
Kobi Gurkan
09b8670536 only throw exceptions when JSON_USE_EXCEPTION is defined
JSON_ASSERT now throws a runtime_error
2015-02-15 18:00:31 +02:00
Christopher Dunn
f164288646 help rebasing 2015-02-15 03:01:26 -06:00
Christopher Dunn
3bfd215938 1.4.2 < 1.4.1
* Bug-fix for ValueIterator::operator-() (issue #169)
2015-02-15 02:49:34 -06:00
Christopher Dunn
400b744195 Merge pull request #172 from cdunn2001/master
Fix bug in ValueIteratorBase::operator- 

Fixes #169.
2015-02-15 02:44:17 -06:00
Christopher Dunn
bd55164089 reverse sense for CPPTL too 2015-02-15 02:38:31 -06:00
Kevin Grant
4c5832a0be Fix bug in ValueIteratorBase::operator- 2015-02-15 02:38:31 -06:00
Christopher Dunn
8ba9875962 IteratorTest 2015-02-15 02:38:31 -06:00
Christopher Dunn
9c91b995dd rules for signing and doc-generation 2015-02-14 10:20:45 -06:00
Christopher Dunn
e7233bf056 1.4.1 <- 1.4.0 2015-02-13 10:00:38 -06:00
Christopher Dunn
9c90456890 Merge pull request #167 from cdunn2001/fail-if-extra
Add `failIfExtra` feature to `CharReaderBuilder`.
2015-02-13 09:55:11 -06:00
Christopher Dunn
f4be815c86 failIfExtra
1. failing regression tests, from #164 and #107
2. implemented; tests pass
3. allow trailing comments
2015-02-13 09:39:08 -06:00
Christopher Dunn
aa13a8ba40 comments/minor typos 2015-02-13 09:38:49 -06:00
Christopher Dunn
da0fcfbaa2 link web docs 2015-02-12 11:45:21 -06:00
Christopher Dunn
3ebba5cea8 stop calling validate() in newReader/Writer()
By not calling validate(), we can add
non-invasive features which will be simply ignored when user-code
is compiled against an old version. That way, we can often
avoid a minor version-bump.

The user can call validate() himself if he prefers that behavior.
2015-02-11 11:15:32 -06:00
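A minimal sketch of the opt-in check, assuming `CharReaderBuilder`'s public `settings_` and `validate()` from this series; the `"typo"` key is deliberately bogus, just to show what `validate()` reports.

```cpp
#include <json/json.h>
#include <iostream>

int main() {
  Json::CharReaderBuilder builder;
  builder.settings_["collectComments"] = true;  // a recognized key
  builder.settings_["typo"] = true;             // bogus key, for illustration
  Json::Value bad;
  if (!builder.validate(&bad))                  // no longer called implicitly
    std::cerr << "unrecognized settings: " << bad << "\n";
}
```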
Christopher Dunn
acbf4eb2ef Merge pull request #166 from cdunn2001/stackLimit
Fixes #88 and #56.
2015-02-11 10:35:16 -06:00
Christopher Dunn
56df206847 limit stackDepth for old (deprecated) Json::Reader too
This is an improper solution. If multiple Readers exist,
then the effective stackLimit is reduced because of side-effects.
But our options are limited. We need to address the security
hole without breaking binary-compatibility.

However, this is not likely to cause any practical problems because:

* Anyone using `operator>>(istream, Json::Value)` will be using the
new code already
* Multiple Readers are uncommon.
* The stackLimit is quite high.
* Deeply nested JSON probably would have hit the system limits anyway.
2015-02-11 10:20:53 -06:00
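A minimal sketch of capping nesting through the new interface, assuming the `"stackLimit"` setting introduced in this series; the old `Json::Reader` only gets a fixed internal limit, as described above.

```cpp
#include <json/json.h>
#include <exception>
#include <iostream>
#include <sstream>
#include <string>

int main() {
  Json::CharReaderBuilder builder;
  builder.settings_["stackLimit"] = 10;   // cap the nesting depth
  Json::Value root;
  std::string errs;
  std::istringstream deep("[[[[[[[[[[[[1]]]]]]]]]]]]");  // 12 levels deep
  try {
    if (!Json::parseFromStream(builder, deep, &root, &errs))
      std::cout << "rejected: " << errs << "\n";
  } catch (const std::exception& e) {
    // Depending on the version, exceeding the limit may surface as a throw.
    std::cout << "rejected: " << e.what() << "\n";
  }
}
```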
Christopher Dunn
4dca80da49 limit stackDepth 2015-02-11 10:20:47 -06:00
Christopher Dunn
249ad9f47f stackLimit 2015-02-11 10:01:58 -06:00
Christopher Dunn
99b8e856f6 stackLimit_ 2015-02-11 10:01:58 -06:00
Christopher Dunn
89b72e1653 test stackLimit 2015-02-11 10:01:58 -06:00
Christopher Dunn
2474989f24 Old -> Our 2015-02-11 09:48:24 -06:00
Christopher Dunn
315b8c9f2c 1st StreamWriterTest 2015-02-10 23:29:14 -06:00
Christopher Dunn
29501c4d9f clarify comments
And throw instead of return null for invalid settings.
2015-02-10 23:03:27 -06:00
Christopher Dunn
7796f20eab Merge pull request #165 from cdunn2001/master
Remove some experimental classes that are not needed for 1.4.0. This also helps 0.8.0 binary compatibility with 0.6.0-rc2.
2015-02-10 22:45:32 -06:00
Christopher Dunn
20d09676c2 drop experimental OldCompressingStreamWriterBuilder 2015-02-10 21:29:35 -06:00
Christopher Dunn
5a744708fc enableYAMLCompatibility and dropNullPlaceholders for StreamWriterBuilder 2015-02-10 21:28:13 -06:00
Christopher Dunn
07f0e9308d nullRef, since we had to add that kludge to 0.8.0 2015-02-10 21:28:13 -06:00
Christopher Dunn
052050df07 copy Features to OldFeatures 2015-02-10 17:01:08 -06:00
Christopher Dunn
435d2a2f8d passes 2015-02-10 17:01:08 -06:00
Christopher Dunn
6123bd1505 copy Reader impl to OldReader 2015-02-10 17:01:08 -06:00
Christopher Dunn
7477bcfa3a renames for OldReader 2015-02-10 17:01:08 -06:00
Christopher Dunn
5e3e68af2e OldReader copied from Reader 2015-02-10 17:01:08 -06:00
Christopher Dunn
04a607d95b Merge pull request #163 from cdunn2001/master
Reimplement the new Builders.

Issue #131.
2015-02-09 18:55:55 -06:00
Christopher Dunn
db75cdf21e mv CommentStyle to .cpp 2015-02-09 18:54:58 -06:00
Christopher Dunn
c41609b9f9 set output stream in write(), not in builder 2015-02-09 18:44:53 -06:00
Christopher Dunn
b56381a636 <stdexcept> 2015-02-09 18:29:11 -06:00
Christopher Dunn
f757c18ca0 add all features 2015-02-09 18:24:56 -06:00
Christopher Dunn
3cf9175bde remark defaults via doxygen snippet 2015-02-09 18:16:24 -06:00
Christopher Dunn
a9e1ab302d Builder::settings_
We use Json::Value to configure the builders so we can maintain
binary-compatibility easily.
2015-02-09 17:30:11 -06:00
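A minimal sketch of that configuration style, assuming the `StreamWriterBuilder` settings keys of this series (e.g. `"indentation"`, `"commentStyle"`) and the `writeString()` helper that appears in the surrounding commits:

```cpp
#include <json/json.h>
#include <iostream>
#include <string>

int main() {
  Json::StreamWriterBuilder wbuilder;
  wbuilder["indentation"] = "\t";    // operator[] writes into settings_
  wbuilder["commentStyle"] = "All";
  Json::Value root;
  root["name"] = "jsoncpp";
  std::cout << Json::writeString(wbuilder, root) << "\n";
}
```

Because the settings live in a `Json::Value`, a newer library can recognize extra keys while older user code simply never sets them, which is the binary-compatibility point made above.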
Christopher Dunn
694dbcb328 update docs, writeString() 2015-02-09 15:25:57 -06:00
Christopher Dunn
732abb80ef Merge pull request #162 from cdunn2001/master
Deprecate the new Builders.
2015-02-09 11:55:54 -06:00
Christopher Dunn
f3b3358a0e deprecate current Builders 2015-02-09 11:51:06 -06:00
Christopher Dunn
1357cddf1e deprecate Builders
see issue #131
2015-02-09 11:46:27 -06:00
Christopher Dunn
8df98f6112 deprecate old Reader; separate Advanced Usage section 2015-02-09 11:15:39 -06:00
Christopher Dunn
16bdfd8af3 --in=doc/web_doxyfile.in 2015-02-09 11:15:11 -06:00
Christopher Dunn
ce799b3aa3 copy doxyfile.in 2015-02-09 10:36:55 -06:00
Christopher Dunn
3a65581b20 drop an old impl 2015-02-09 09:54:26 -06:00
Christopher Dunn
6451412c99 simplify basic docs 2015-02-09 09:44:26 -06:00
Christopher Dunn
66a8ba255f clarify Builders 2015-02-09 01:29:43 -06:00
Christopher Dunn
249fd18114 put version into docs 2015-02-09 00:50:27 -06:00
Christopher Dunn
a587d04f77 Merge pull request #161 from cdunn2001/master
CharReader/Builder

I guess we should bump the patch-level version. We will set the version properly soon...
2015-02-08 13:25:08 -06:00
Christopher Dunn
2c1197c2c8 CharReader/Builder
* CharReaderBuilder is similar to StreamWriterBuilder.
* use rdbuf(), since getline(string) is not required to handle EOF as delimiter
2015-02-08 13:22:09 -06:00
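A minimal sketch of the parallel interface, assuming the `CharReaderBuilder` API of this series: a builder produces a `CharReader`, which parses a character range into a `Value`.

```cpp
#include <json/json.h>
#include <iostream>
#include <memory>
#include <string>

int main() {
  Json::CharReaderBuilder builder;
  std::unique_ptr<Json::CharReader> reader(builder.newCharReader());
  const std::string doc = "{\"pi\": 3.14}";
  Json::Value root;
  std::string errs;
  if (reader->parse(doc.data(), doc.data() + doc.size(), &root, &errs))
    std::cout << root["pi"].asDouble() << "\n";
}
```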
Christopher Dunn
2a94618589 Merge pull request #160 from cdunn2001/master
rm unique_ptr<>/shared_ptr<>, for pre-C++11
2015-02-08 13:10:18 -06:00
Christopher Dunn
dee4602b8f rm unique_ptr<>/shared_ptr<>, for pre-C++11 2015-02-08 11:54:49 -06:00
Christopher Dunn
ea2d167a38 Merge pull request #158 from cdunn2001/travis-with-cmake-package
JSONCPP_WITH_CMAKE_PACKAGE in Travis

I guess we don't really need to build shared and static separately either. Saves a little time, maybe?
2015-02-07 12:24:58 -06:00
Christopher Dunn
41edda5ebe JSONCPP_WITH_CMAKE_PACKAGE in Travis 2015-02-07 12:18:20 -06:00
Christopher Dunn
2941cb3fe2 Merge pull request #156 from cdunn2001/with-cmake-package
fix JSONCPP_WITH_CMAKE_PACKAGE #155
2015-02-07 11:44:24 -06:00
Christopher Dunn
636121485c fix JSONCPP_WITH_CMAKE_PACKAGE #155
mv JSONCPP_WITH_CMAKE_PACKAGE ahead of INSTALL def.
2015-02-07 11:39:16 -06:00
Christopher Dunn
fe855fb4dd drop nullptr
See issue #153.
2015-02-02 15:33:47 -06:00
Christopher Dunn
198cc350c5 drop scoped enum, for pre-C++11 compatibility 2015-01-29 13:49:21 -06:00
Peter Spiess-Knafl
5e8595c0e2 added cmake option to build static and shared libraries at once
See #147 and #149.
2015-01-27 18:22:43 -06:00
Christopher Dunn
38042b3892 docs 2015-01-26 11:38:38 -06:00
Christopher Dunn
3b5f2b85ca Merge pull request #145 from cdunn2001/simplify-builder
Simplify builder
2015-01-26 11:33:16 -06:00
Christopher Dunn
7eca3b4e88 gcc-4.6 (Travis CI) does not support 2015-01-26 11:17:42 -06:00
Christopher Dunn
999f5912f0 docs 2015-01-26 11:12:53 -06:00
Christopher Dunn
472d29f57b fix doc 2015-01-26 11:04:03 -06:00
Christopher Dunn
6065a1c142 make StreamWriterBuilder concrete 2015-01-26 11:01:15 -06:00
Christopher Dunn
28a20917b0 Move old FastWriter stuff out of new Builder 2015-01-26 10:47:42 -06:00
Christopher Dunn
177b7b8f22 OldCompressingStreamWriterBuilder 2015-01-26 10:44:20 -06:00
Christopher Dunn
9da9f84903 improve docs
including `writeString()`
2015-01-26 10:43:53 -06:00
Christopher Dunn
54b8e6939a Merge pull request #132 from cdunn2001/builder
StreamWriter::Builder

Deprecate old Writers, but include them in tests.

This should still be binary-compatible with 1.3.0.
2015-01-25 18:52:09 -06:00
Christopher Dunn
c7b39c2e25 deprecate old Writers
also, use withers instead of setters, and update docs
2015-01-25 18:45:59 -06:00
Christopher Dunn
d78caa3851 implement strange setting from FastWriter 2015-01-25 18:15:54 -06:00
Christopher Dunn
c6e0688e5a implement CommentStyle::None/indentation_=="" 2015-01-25 17:32:36 -06:00
Christopher Dunn
1e21e63853 default \t indentation, All comments 2015-01-25 16:01:59 -06:00
Christopher Dunn
dea6f8d9a6 incorporate 'proper newlines for comments' into new StreamWriter 2015-01-25 15:55:18 -06:00
Christopher Dunn
648843d148 clarify CommentStyle 2015-01-25 15:54:40 -06:00
Christopher Dunn
fe3979cd8a drop StreamWriterBuilderFactory, for now 2015-01-25 15:54:40 -06:00
Christopher Dunn
94665eab72 copy fixes from StyledStreamWriter 2015-01-25 15:54:40 -06:00
Christopher Dunn
9e4bcf354f test BuiltStyledStreamWriter too 2015-01-25 15:54:40 -06:00
Christopher Dunn
9243d602fe const stuff 2015-01-25 15:54:40 -06:00
Christopher Dunn
beb6f35c63 non-const write 2015-01-25 15:54:40 -06:00
Christopher Dunn
ceef7f5219 copied impl of StyledStreamWriter 2015-01-25 15:54:40 -06:00
Christopher Dunn
77ce057f14 fix comment 2015-01-25 15:54:40 -06:00
Christopher Dunn
d49ab5aee1 use new BuiltStyledStreamWriter in operator<<() 2015-01-25 15:54:40 -06:00
Christopher Dunn
4d649402b0 setIndentation() 2015-01-25 15:54:40 -06:00
Christopher Dunn
489707ff60 StreamWriter::Builder 2015-01-25 15:54:39 -06:00
Christopher Dunn
5fbfe3cdb9 StreamWriter 2015-01-25 15:54:39 -06:00
Christopher Dunn
948f29032e update docs 2015-01-25 15:54:07 -06:00
Christopher Dunn
964affd333 add back space before trailing comment 2015-01-25 15:49:02 -06:00
Christopher Dunn
c038e08efc Merge pull request #144 from cdunn2001/proper-comment-lfs
proper newlines for comments

This alters `StyledStreamWriter`, but not `StyledWriter`.
2015-01-25 15:10:38 -06:00
Christopher Dunn
74c2d82e19 proper newlines for comments
The logic is still messy, but it seems to work.
2015-01-25 15:05:09 -06:00
Christopher Dunn
30726082f3 Merge pull request #143 from cdunn2001/rm-trailing-newlines
rm trailing newlines for *all* comments
2015-01-25 14:35:24 -06:00
Christopher Dunn
1e3149ab75 rm trailing newlines for *all* comments
This will make it easier to fix newlines consistently.
2015-01-25 14:32:13 -06:00
Christopher Dunn
7312b1022d Merge pull request #141 from cdunn2001/set-comment
Fix a border case which causes Value::CommentInfo::setComment() to crash
2015-01-25 11:37:02 -06:00
datadiode
2f046b584d Fix a border case which causes Value::CommentInfo::setComment() to crash
re: pull #140
2015-01-25 11:19:51 -06:00
Christopher Dunn
dd91914b1b TravisCI gcc-4.6 does not yet support -Wpedantic 2015-01-25 10:34:49 -06:00
Christopher Dunn
2a46e295ec Merge pull request #139 from cdunn2001/some-python-changes
Some python changes.

* Better messaging.
* Make `doxybuild.py` work with python3.4
2015-01-24 16:24:12 -06:00
Christopher Dunn
f4bc0bf4ec README.md 2015-01-24 16:21:12 -06:00
Christopher Dunn
f357688893 make doxybuild.py work with python3.4 2015-01-24 16:21:12 -06:00
Florian Meier
bb0c80b3e5 Doxybuild: Error message if doxygen not found
This patch introduces a better error message.

See discussion at pull #129.
2015-01-24 16:21:12 -06:00
Christopher Dunn
ff5abe76a5 update doxbuild.py 2015-01-24 16:21:12 -06:00
Christopher Dunn
9cc0bb80b2 update TarFile usage 2015-01-24 16:21:12 -06:00
Christopher Dunn
494950a63d rm extra whitespace in python, per PEP8 2015-01-24 16:21:12 -06:00
Christopher Dunn
7d82b14726 fix issue #90
We are static-casting to U, so we really have no reason to use
references.

However, if this comes up again, try applying -ffloat-store to
the target executable, per
    https://github.com/open-source-parsers/jsoncpp/issues/90
2015-01-24 14:34:54 -06:00
Christopher Dunn
2bc6137ada fix gcc warnings 2015-01-24 13:42:37 -06:00
Christopher Dunn
201904bfbb Merge pull request #138 from cdunn2001/fix-103
Fix #103.
2015-01-23 14:51:31 -06:00
Christopher Dunn
216ecd3085 fix test_comment_00 for #103 2015-01-23 14:28:44 -06:00
Christopher Dunn
8d15e51228 add test_comment_00
one-element array with comment, for issue #103
2015-01-23 14:28:21 -06:00
Christopher Dunn
9fbd12b27c Merge pull request #137 from cdunn2001/avoid-extra-newline
Avoid extra newline
2015-01-23 14:24:52 -06:00
Christopher Dunn
f8ca6cbb25 1.4.0 <- 1.3.0
Minor version bump, but we will wait for a few more commits this time
before tagging the release.
2015-01-23 14:23:31 -06:00
Christopher Dunn
d383056fbb avoid extra newlines in StyledStreamWriter
Add indented_ as a bitfield. (Verified that sizeof(StyledStreamWriter)
remains 96 for binary compatibility. But the new symbol requires a minor
version-bump.)
2015-01-23 14:23:31 -06:00
Christopher Dunn
ddb4ff7dec Merge pull request #136 from cdunn2001/test-both-styled-writers
Test both styled writers

Not only does this now test StyledStreamWriter the same way as StyledWriter, but it also makes the former work more like the latter, indenting separate lines of a comment before a value. Might break some user tests (as `operator<<()` uses `StyledStreamWriter`) but basically a harmless improvement.

All tests pass.
2015-01-23 13:55:45 -06:00
Christopher Dunn
3efc587fba make StyledStreamWriter work more like StyledWriter
tests pass
2015-01-23 13:36:10 -06:00
Christopher Dunn
70704b9a70 test both StyledWriter and StyledStreamWriter 2015-01-23 13:36:10 -06:00
Christopher Dunn
ac6bbbc739 show cmd in runjsontests.py 2015-01-23 13:36:10 -06:00
Christopher Dunn
26c52861b9 pass --json-writer StyledWriter 2015-01-23 13:36:10 -06:00
Christopher Dunn
3682f60927 --json-writer arg 2015-01-23 13:36:10 -06:00
Christopher Dunn
58c31ac550 mv try-block 2015-01-23 12:35:12 -06:00
Christopher Dunn
08cfd02d8c fix minor bugs in test-runner 2015-01-23 12:35:12 -06:00
Christopher Dunn
79211e1aeb Options class for test 2015-01-23 12:35:12 -06:00
Christopher Dunn
632c9b5032 cleaner 2015-01-23 12:35:12 -06:00
Christopher Dunn
05810a7607 cleaner 2015-01-23 12:35:12 -06:00
Christopher Dunn
942e2c999a unindent test-code 2015-01-23 12:35:12 -06:00
Christopher Dunn
2160c9a042 switch from StyledWriter to StyledStream writer in tests 2015-01-23 09:02:44 -06:00
Christopher Dunn
ee8b58f82f Merge pull request #135 from cdunn2001/removeMember
Deprecate old `removeMember()`. Add new.

[Deprecated methods will be removed at the next major version bump](http://apr.apache.org/versioning.html#binary).
2015-01-22 19:26:46 -06:00
Christopher Dunn
9132aa94b1 1.3.0
http://apr.apache.org/versioning.html#binary
2015-01-22 19:25:44 -06:00
Christopher Dunn
76746b09fc deprecate old removeMember() 2015-01-22 19:25:44 -06:00
Christopher Dunn
70b795bd45 Merge pull request #133 from cdunn2001/travis-11
upgrade -std=c++ version
2015-01-22 19:21:24 -06:00
Christopher Dunn
26842530f2 upgrade -std=c++ version
Travis CI does not yet support gcc-4.8, needed for c++11, so we
will try c++0x for now.
2015-01-22 19:12:23 -06:00
Christopher Dunn
e3f24286c1 Merge pull request #130 from connormanning/master
Build without warnings with -pedantic enabled.
2015-01-22 11:48:58 -06:00
Connor Manning
00b8ce81db Build without warnings with -pedantic enabled. 2015-01-22 10:48:45 -06:00
Christopher Dunn
40810fe326 Merge pull request #127 from cdunn2001/removeIndex
`Value::removeIndex()`

See issue #28.
2015-01-21 16:08:25 -06:00
Christopher Dunn
59167d8627 more changes per cr 2015-01-21 16:05:08 -06:00
Christopher Dunn
05c1b8344d drop this-> (team preference) 2015-01-21 15:43:48 -06:00
Christopher Dunn
e893625e88 test removeIndex/Member() 2015-01-20 17:04:03 -06:00
Christopher Dunn
e87e41cdb0 from Itzik S; see issue #28
with minor corrections
2015-01-20 17:03:58 -06:00
Christopher Dunn
9de2c2d84d partial 2015-01-20 17:02:48 -06:00
Christopher Dunn
7956ccd61e allow stream ops for JSON_FAIL_MESSAGE
http://www.iar.com/Global/Resources/Developers_Toolbox/C_Cplusplus_Programming/Tips%20and%20tricks%20using%20the%20preprocessor%20%28part%20two%29.pdf
2015-01-20 16:25:26 -06:00
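A hedged sketch of the technique referenced above (`FAIL_MESSAGE` and `fail()` are stand-ins, not jsoncpp's exact macro): routing the macro argument through an `ostringstream` lets call sites compose messages with `operator<<`.

```cpp
#include <sstream>
#include <stdexcept>
#include <string>

inline void fail(const std::string& msg) { throw std::runtime_error(msg); }

// The macro body builds the message with a stream, so callers may write
//   FAIL_MESSAGE("bad index " << i << " of " << n);
#define FAIL_MESSAGE(message)                                                  \
  {                                                                            \
    std::ostringstream oss;                                                    \
    oss << message;                                                            \
    fail(oss.str());                                                           \
  }
```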
datadiode
9454e687a3 Specialize std::swap() for Json::Value in a C++ standard compliant way
originally from pull #119
2015-01-20 15:25:41 -06:00
Christopher Dunn
46a925ba4a fix compiler warning for a test 2015-01-20 15:19:22 -06:00
Christopher Dunn
c407f1407f test-data for #103
passes
2015-01-20 15:16:46 -06:00
Christopher Dunn
ec251df6b7 Merge pull request #125 from cdunn2001/issue-116
1.2.1

Fix issue #116, DOS line-endings. Never output \r.
2015-01-20 15:14:50 -06:00
Christopher Dunn
51c0afab22 1.2.1 <- 1.2.0
This can affect existing round-trip tests, but we never made any
guarantees about whitespace constancy.
2015-01-20 15:12:49 -06:00
Mark Zeren
e39fb0083c Normalize comment EOLs while reading instead of while writing
Tests are currently failing when git cloning on Windows with autocrlf = true. In
that setup multiline comments contain \r\n EOLs. The test code assumes that
comments contain \n EOLs and opens the .actual files (etc.) with "wt" which
converts \n to \r\n. Thus we end up with \r\r\n EOLs in the output, which
triggers a test failure.

Instead we should canonicalize comments while reading so that they contain only
\n EOLs. This approach simplifies other parts of the reader and writer logic,
and requires no changes to the test. It is a breaking change, but probably the
Right Thing going forward.

This change also fixes dereferencing past the end of the comment string in
StyledWriter::writeCommentBeforeValue.

Tests should be added with appropriate .gitattributes for the input files to
ensure that we run tests for DOS, Mac, and Unix EOL files on all platforms. For
now this change is enough to unblock Windows builds.

issue #116
2015-01-20 13:45:44 -06:00
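A minimal sketch of the normalization step described above (a free-standing helper, not the reader's exact code): collapse `\r\n` and bare `\r` to `\n` as comment text is captured.

```cpp
#include <cstddef>
#include <string>

std::string normalizeEOL(const std::string& text) {
  std::string out;
  out.reserve(text.size());
  for (std::size_t i = 0; i < text.size(); ++i) {
    if (text[i] == '\r') {
      if (i + 1 < text.size() && text[i + 1] == '\n')
        ++i;            // swallow the '\n' of a CRLF pair
      out += '\n';      // CR and CRLF both become LF
    } else {
      out += text[i];
    }
  }
  return out;
}
```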
Christopher Dunn
ec727e2f6b -Wall for Clang/GCC 2015-01-20 13:45:28 -06:00
Christopher Dunn
4ce4bb8404 Merge pull request #124 from cdunn2001/assign-with-comments
1.2.0

 `operator=()` (which already performed a deep-copy) now includes comments. This change is probably harmless in all practical cases. But just in case, we bump the minor version.

Address #47.
2015-01-20 12:49:51 -06:00
Christopher Dunn
2cd0f4ec21 1.2.0 <- 1.1.1
`operator=()` (which already performed a deep-copy) now includes
comments. The change is probably harmless in all practical cases.
2015-01-20 12:44:51 -06:00
Christopher Dunn
836f0fb863 fix comments before several types
tests pass
2015-01-20 12:23:44 -06:00
Christopher Dunn
37644abd77 test comment before several types
* array
* double
* string
* true
* false
* null
2015-01-20 12:23:18 -06:00
Christopher Dunn
66eb72f121 use SwapPayload() to retain comments
All tests pass, but we might be missing coverage.

issue #47
2015-01-20 12:07:01 -06:00
Christopher Dunn
94b0297dc5 Revert "consider these as binary, so git will not alter line-endings"
This reverts commit 8f3aa220db.

We will find a better fix for #116. In the meantime, we want to see
diffs for changes to test-data.
2015-01-20 12:06:12 -06:00
Christopher Dunn
55db3c3cb2 Merge pull request #118 from datadiode/47_fix_value_swap
swap comments too

* Changed `operator=` to exclude start/limit, which should never have been added.
* Changed `swap` to include comments. Hmm. That affects efficiency (but *not* for `operator=`) and probably nothing else in practice.

- issue #47
2015-01-18 11:31:47 -06:00
datadiode
c07ef37904 https://github.com/open-source-parsers/jsoncpp/issues/47 2015-01-18 10:05:25 +01:00
Christopher Dunn
62ab94ddd3 Merge pull request #117 from datadiode/integration
Simplify Reader::decodeNumber() / Remove unused functions
2015-01-17 08:10:59 -06:00
datadiode
09d352ac13 Remove unused functions 2015-01-17 13:26:23 +01:00
datadiode
50753bb808 Simplify Reader::decodeNumber() 2015-01-17 13:21:42 +01:00
Christopher Dunn
8f3aa220db consider these as binary, so git will not alter line-endings
issue #116
2015-01-16 16:29:07 -06:00
Christopher Dunn
73e127892e Merge branch 'fix-fail31' 2015-01-16 15:10:56 -06:00
Christopher Dunn
4997dfb8af 1.1.1 <- 1.1.0
slight change to fail on a bad float
2015-01-16 15:09:54 -06:00
datadiode
c1441ef5e0 stricter float parsing
fixes `test/jsonchecker/fail31.json`
(issue #113)
2015-01-16 15:05:12 -06:00
Christopher Dunn
e0bfb45000 Merge branch 'py3/2' 2015-01-16 14:53:22 -06:00
Christopher Dunn
4bc311503c just in case 2015-01-16 14:53:04 -06:00
Christopher Dunn
cd140b5141 py2 and py3 2015-01-16 14:52:56 -06:00
datadiode
01aee4a0dc Fix Python test scripts for Python 3 and Windows 2015-01-16 09:57:42 -06:00
Christopher Dunn
59a01652ab Merge pull request #114 from Gachapen/fix_cmake_output_dir
CMake: Remove set(CMAKE_*_OUTPUT_DIRECTORY)
2015-01-15 20:17:34 -06:00
Magnus Bjerke Vik
8371a4337c CMake: Remove set(CMAKE_*_OUTPUT_DIRECTORY)
With set(CMAKE_*_OUTPUT_DIRECTORY), when using jsoncpp as a subproject,
the parent project's executables and libraries will also be output to
jsoncpp's directory. By removing this, it is up to the parent projects
to decide where to put their own and jsoncpp's executables and libraries.
2015-01-15 20:16:54 -06:00
Christopher Dunn
dc2e1c98b9 Merge pull request #111 from open-source-parsers/quotes-spaces-fixed
Quotes spaces fixed
2015-01-09 22:36:40 -06:00
Christopher Dunn
d98b5f4230 quote spaces in commands for Windows
See comments at:
    1a4dc3a888
2015-01-09 22:32:10 -06:00
Christopher Dunn
4ca9d25ccc Revert "Merge pull request #108 from open-source-parsers/quote-spaces"
This reverts commit dfc5f879c1, reversing
changes made to 0f6884f771.
2015-01-09 22:28:20 -06:00
Christopher Dunn
6eaf150dc7 Merge pull request #109 from open-source-parsers/double-string-double
Double string double
2015-01-06 12:54:39 -06:00
Christopher Dunn
8b489f891a 1.1.0 <- 1.0.0 2015-01-06 12:46:17 -06:00
Christopher Dunn
65cee6ea16 fix double->string->double round-trip (bump minor ver.)
See #98.
  http://stackoverflow.com/questions/747470/what-is-the-meaning-of-numeric-limitsdoubledigits10/16941784#16941784
2015-01-06 12:40:36 -06:00
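A minimal sketch of the round-trip concern (see the linked answer): printing with `digits10` (15 for IEEE double) can lose the original value, while `max_digits10` (17) preserves it.

```cpp
#include <iomanip>
#include <iostream>
#include <limits>
#include <sstream>
#include <string>

int main() {
  const double original = 0.1 + 0.2;  // not exactly 0.3
  std::ostringstream oss;
  oss << std::setprecision(std::numeric_limits<double>::max_digits10)
      << original;                    // 17 significant digits
  const double restored = std::stod(oss.str());
  std::cout << std::boolalpha << (restored == original) << "\n";  // true
}
```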
Christopher Dunn
dfc5f879c1 Merge pull request #108 from open-source-parsers/quote-spaces
quote cmdline arg
2015-01-06 12:14:48 -06:00
Christopher Dunn
1a4dc3a888 quote cmdline arg
See #99.
2015-01-06 12:11:12 -06:00
Christopher Dunn
0f6884f771 Merge pull request #106 from Gachapen/fix_cmake_install
Fix cmake_package install being broken because of wrong include path.
2015-01-06 11:50:13 -06:00
Magnus Bjerke Vik
748328a0d1 Fix cmake_package install being broken because of wrong include path.
The TARGET_INCLUDE_DIRECTORIES from inside the
IF(JSONCPP_WITH_CMAKE_PACKAGE) block was removed, since it only needs to
be set once.

In addition the CMAKE_VERSION check was simplified.
2015-01-06 09:51:44 +01:00
Christopher Dunn
f44278cd4e Merge pull request #101 from dominicpezzuto/master
Fix build issues related to Solaris and older GCC
2015-01-03 14:45:41 -06:00
dominicpezzuto
d2b6992f3e Fix build issues related to Solaris and older GCC
Fixed two build issues:
 - JsonCPP currently doesn’t compile for Solaris due to platform
differences with ‘isfinite’ function.  Fixed by adding proper include
and define for Solaris.
 - JsonCPP currently doesn’t compile for GCC version 4.1.2 and earlier
due to use of ‘-Werror=*’ compile flag, which was introduced in a later
version.  Fixed by adding version check to only add this flag on
supported versions of GCC.
2014-12-27 16:45:40 -05:00
dominicpezzuto
54764dd85b Fix build issues related to Solaris and older GCC
Fixed two build issues:
 - JsonCPP currently doesn’t compile for Solaris due to platform
differences with ‘isfinite’ function.  Fixed by adding proper include
and define for Solaris.
 - JsonCPP currently doesn’t compile for GCC version 4.1.2 and earlier
due to use of ‘-Werror=*’ compile flag, which was introduced in a later
version.  Fixed by adding version check to only add this flag on
supported versions of GCC.
2014-12-27 16:44:26 -05:00
Christopher Dunn
8dd32e1e2e Merge pull request #94 from Gachapen/cmake_target_include
CMake: Add include directory to jsoncpp_lib target

Well-researched. Passes Travis CI.
2014-12-24 01:30:13 -06:00
Magnus Bjerke Vik
3fd7f8b470 CMake: Only add include directory to jsoncpp_lib target if CMake version supports the command. 2014-12-16 08:58:52 +01:00
Magnus Bjerke Vik
e99e6d9cc6 CMake: Add include directory to jsoncpp_lib target so that it can be easier used with other projects. 2014-12-03 15:42:41 +01:00
Christopher Dunn
9ca1aaab14 Merge pull request #93 from akien-mga/master
Small packaging improvements
2014-12-02 00:40:08 -06:00
Rémi Verschelde
27639ce578 Add support for BUILD_SHARED_LIBS argument
BUILD_SHARED_LIBS is a standard CMake argument that serves the purpose
of the custom JSONCPP_LIB_BUILD_SHARED. For now we force JSONCPP_LIB_BUILD_SHARED
to true if BUILD_SHARED_LIBS was defined.
Workaround for #51.
2014-12-01 23:47:21 +01:00
Rémi Verschelde
f8a3a599ac Adapt libdir for 64bit RPM-based distros
RPM-based distros such as Fedora or Mageia put 64bit libraries in /usr/lib64
while 32bit libraries go to /usr/lib. This is usually taken into account
in CMake projects using a LIB_SUFFIX parameter that can be set to "" or "64".
2014-12-01 23:44:08 +01:00
Christopher Dunn
7165f6ac4c 1.0.0 2014-11-20 08:45:58 -06:00
Christopher Dunn
37a9fa9f9d 1.0.0 2014-11-20 00:20:51 -06:00
xiaoyur347
83683da13f fix gcc warning when CXXFLAGS contains '-Wextra'
json_value.cpp:179:26: warning: enumeral and non-enumeral type in conditional expression [enabled by default]

https://github.com/open-source-parsers/jsoncpp/pull/84
2014-11-19 23:59:34 -06:00
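An illustrative reproduction of that -Wextra diagnostic (`Mode` and `pick` are stand-ins, not the jsoncpp code): mixing an enumerator with a plain integer in the arms of `?:` triggers the warning, and a cast silences it.

```cpp
enum Mode { ModeDefault = 1 };

int pick(bool useDefault, int fallback) {
  // return useDefault ? ModeDefault : fallback;  // warns with -Wextra:
  // "enumeral and non-enumeral type in conditional expression"
  return useDefault ? static_cast<int>(ModeDefault) : fallback;
}
```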
Christopher Dunn
e5de78db82 Merge pull request #87 from cdunn2001/master
2to3 (but only the changes which should work with python2 also)
2014-11-19 23:54:56 -06:00
Christopher Dunn
ffd7295ab8 simple 2014-11-19 23:35:56 -06:00
Christopher Dunn
433876866d ws 2014-11-19 23:34:15 -06:00
Christopher Dunn
bd1e895287 simple py3 changes 2014-11-19 23:30:47 -06:00
Christopher Dunn
9aa6144b2a python except as 2014-11-19 23:10:02 -06:00
Christopher Dunn
5fda247dab Merge pull request #79 from ya1gaurav/patch-2
Remove gcc compilation warnings in json_reader.cpp
2014-11-18 00:14:06 -06:00
Gaurav
767713be2b Remove gcc compilation warning in json_reader.cpp
Submitting Patch for Issue : https://github.com/open-source-parsers/jsoncpp/issues/77
It will fix warnings in json_reader.cpp
2014-11-17 14:04:03 +05:30
Aaron Jacobs
3e3a8d5bd2 Merge pull request #74 from ya1gaurav/master
Prefer appending character constants over string literals.
2014-11-14 10:39:03 +11:00
Gaurav
abc1e07543 Prefer appending character constants over string literals - correct patch.
Submitting correct patch for https://github.com/open-source-parsers/jsoncpp/issues/61
2014-11-13 12:47:19 +05:30
Christopher Dunn
00b0a1b992 Merge pull request #70 from jmesmon/pkg-config-include-var
pkg-config: support INCLUDE_INSTALL_DIR
2014-11-12 00:03:52 -06:00
Cody P Schafer
1fe6c59827 pkg-config: support INCLUDE_INSTALL_DIR 2014-11-11 16:09:05 -05:00
Aaron Jacobs
20672ed02c Merge pull request #68 from BillyDonahue/refactor_ctor_boilerplate
Json::Value: Refactor common code in all constructors to an initBasic() function.
2014-11-10 20:23:52 +11:00
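An illustrative sketch of the shape of that refactor (`ValueSketch` and its fields are assumptions, not jsoncpp's exact members): every constructor funnels its shared setup through one private `initBasic()` helper.

```cpp
class ValueSketch {
public:
  explicit ValueSketch(int v)    { initBasic(kInt);    payload_.asInt = v; }
  explicit ValueSketch(double v) { initBasic(kDouble); payload_.asDouble = v; }

private:
  enum Kind { kInt, kDouble };

  // The one place where the common bookkeeping happens.
  void initBasic(Kind kind, bool allocated = false) {
    kind_ = kind;
    allocated_ = allocated;
    comments_ = 0;
  }

  union Payload { int asInt; double asDouble; } payload_;
  Kind kind_;
  bool allocated_;
  const char* comments_;
};
```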
Billy Donahue
8eb5d89db6 Remove initInt and initUInt until they are needed. 2014-11-10 01:35:42 -05:00
Christopher Dunn
9c80798038 Merge pull request #63/#67 from dreifachstein/master
Allow customization of component install dirs
Passed Travis CI.
2014-11-08 15:56:36 -06:00
Yu Xiaolei
72e5223658 Fix default runtime install dir 2014-11-05 13:18:16 +08:00
Yu Xiaolei
dc84d96a49 Add CMake package file generation support 2014-11-05 12:31:44 +08:00
Yu Xiaolei
1c3a20de50 Allow customization of component install dirs 2014-11-05 11:25:53 +08:00
Christopher Dunn
533dbe0898 Update README.md
Note on C++11
2014-11-03 12:39:01 -06:00
Christopher Dunn
6cb2f7bd65 Merge pull request #58 from autochthe/master
Add public semantic error reporting
2014-10-29 22:41:50 -05:00
Mara Kim
b84a39cae5 Add public semantic error reporting
Closes open-source-parsers/jsoncpp#57
2014-10-23 02:18:14 -05:00
Christopher Dunn
7bd75b0fb4 Merge pull request #55 from glehmann/build-env
use the CXXFLAGS and LINKFLAGS environment variable in scons
2014-10-21 12:38:05 -05:00
Gaëtan Lehmann
f74a4ff17a use the CXXFLAGS and LINKFLAGS environment variable in scons
this allows homebrew to use its own flags during the build
2014-10-21 11:43:53 +02:00
Aaron Jacobs
b7eccbb110 Merge pull request #54 from I3ck/master
Updated documentation links to point to GitHub
2014-10-20 22:06:26 +11:00
Martin Buck
ebe2d6e6ee Updated documentation links to point to GitHub 2014-10-20 07:59:44 +02:00
Christopher Dunn
4cd31f01bb Merge pull request #53 from I3ck/master
Removed typo in README
2014-10-13 07:24:07 -05:00
I3ck
ab19fa1d6f Removed typo in README 2014-10-13 12:17:53 +02:00
Christopher Dunn
bc8b5d871f Merge pull request #52 from cquammen/master
Removed unneeded newlines from parsed comments
2014-10-11 16:40:51 -05:00
Cory Quammen
fd06bfca79 Removed unneeded newlines from parsed comments
Newlines between comments on separate lines are already retained when comments
are appended, so adding an extra newline between separate comments for a
node is not needed.
2014-10-09 16:33:29 -04:00
Cory Quammen
4d23492d11 Added printing of comments to *.actual test files
This enables testing of comment-handling code. Updated *.expected test
result files to account for printing of comments.
2014-10-09 16:33:29 -04:00
Christopher Dunn
aa650c5b9d Merge pull request #50 from sergzub/master
CMake 2.8.5 or higher is required
2014-10-05 12:04:46 -05:00
sergzub
ae5a56f9ff CMake 2.8.5 or higher is required
make fails with an error on lower CMake versions:

Linking CXX executable ../../bin/jsoncpp_test
/bin/sh: $<TARGET_FILE:jsoncpp_test>: command not found
make[2]: *** [bin/jsoncpp_test] Error 127
make[1]: *** [src/test_lib_json/CMakeFiles/jsoncpp_test.dir/all] Error 2
make: *** [all] Error 2

due to 
http://stackoverflow.com/questions/5410164/how-do-i-use-a-targets-path-in-add-custom-command-in-cmake#comment6139682_5410794
2014-10-03 16:40:58 +04:00
Christopher Dunn
8aec8d88f2 Merge pull request #46 from chuckatkins/fix-for-old-msvc
Workaround for missing C99 functions in older versions of Visual Studio
2014-09-20 14:05:18 -07:00
Chuck Atkins
9dc9026e0b Workaround for missing C99 functions in older versions of Visual Studio 2014-09-19 13:16:09 -04:00
Christopher Dunn
4002f8a4be Revert "Revert "Removed vim mode lines.""
This reverts commit af77b5b594.

See discussion at
  32009b17e4 (commitcomment-7827708)
2014-09-18 16:46:40 -07:00
Christopher Dunn
0375af2eb5 drop version qualifier
This should help keep version.h stable.

    x.y.z-dev
    => major, minor, patch, qual
    == x, y, z, -dev

But we do not need -dev anymore.
2014-09-18 16:43:07 -07:00
Aaron Jacobs
ba330893d7 Ran clang-format again, this time hitting .inl files too.
clang-format -i $(find . -name '*.h' -or -name '*.cpp' -or -name '*.inl')
2014-09-18 16:33:49 -07:00
Christopher Dunn
57dde78308 Merge pull request #45 from jonessen96/master
Added Version definition to the pkg-config file
2014-09-18 16:33:29 -07:00
Jonas Platte
69c324ead5 Added Version definition to the pkg-config file 2014-09-17 20:37:59 +02:00
Christopher Dunn
263a4706fa Merge pull request #44 from cdunn2001/version
0.7.0
2014-09-16 19:11:59 -07:00
Christopher Dunn
4bceabf2f9 ws autogen 2014-09-16 19:11:20 -07:00
Christopher Dunn
877dd17206 bump version; proper SOVERSION 2014-09-16 12:42:33 -07:00
Christopher Dunn
16709c6ee8 JSONCPP_VERSION, not JSON_CPP_VERSION 2014-09-16 12:42:33 -07:00
Christopher Dunn
b2a1ca5b54 in dev.makefile, build shared too 2014-09-16 12:42:33 -07:00
Christopher Dunn
9aa4681052 Revert "Merge branch 'no-version'"
This reverts commit d9ced92d40, reversing
changes made to d2fa664a12.

Conflicts:
	include/json/version.h (keep)
2014-09-16 12:42:32 -07:00
Christopher Dunn
af77b5b594 Revert "Removed vim mode lines."
This reverts commit 32009b17e4.
2014-09-16 12:42:32 -07:00
Aaron Jacobs
11086dd6a7 Enabled PointerBindsToType in clang-format options. 2014-09-15 10:15:29 +10:00
Aaron Jacobs
30b07c0275 Ran clang-format over all .h and .cpp files.
clang-format -i $(find . -name '*.h' -or -name '*.cpp')
2014-09-15 10:14:48 +10:00
Aaron Jacobs
32009b17e4 Removed vim mode lines.
Users can set their own preferences in their personal vimrc.
2014-09-15 08:23:41 +10:00
Christopher Dunn
b4357fa224 Merge pull request #41 from bmcdorman/feature-arrow_operator
Added arrow operator to ValueIterator and ValueConstIterator
2014-09-14 08:17:03 -07:00
Braden McDorman
540db3b052 Added arrow operator to ValueIterator and ValueConstIterator 2014-09-14 08:15:47 -07:00
Christopher Dunn
f4b06cd607 rm trailing ws 2014-09-14 08:15:32 -07:00
Christopher Dunn
dd5b57a3d9 Merge pull request #42 from jonessen96/master
Add pkg-config support
2014-09-14 07:54:13 -07:00
Jonas Platte
6270858c43 Added pkg-config file 2014-09-14 15:45:07 +02:00
Christopher Dunn
d9ced92d40 Merge branch 'no-version'
We can modify version.h directly, as desired. It is retained for
backward-compatibility, in case anyone is using those macros.

Note: I have not modified SConstruct since that is deprecated, so
I have retained the `version` file, which should be ignored.

Addresses issue #38
2014-09-11 10:10:40 -07:00
Christopher Dunn
8f730b8a60 stop using version.h.in for cmake 2014-09-11 10:09:48 -07:00
Christopher Dunn
b061ff4a1e generated for the last time, maybe 2014-09-11 10:04:49 -07:00
Christopher Dunn
8ececfa538 stop ignoring version.h 2014-09-11 10:04:23 -07:00
Christopher Dunn
d2fa664a12 makefile for simple testing
This is hard to use within Travis-ci.com because that uses
build variants.
2014-09-10 18:03:34 -07:00
Christopher Dunn
b7894977e7 deprecate makerelease.py (someday drop version.h too?) 2014-09-10 18:01:10 -07:00
Christopher Dunn
53262c66d9 Merge branch 'SuperManitu:python3' from issue #36 2014-09-10 17:29:07 -07:00
Christopher Dunn
09228968ea fix for python2 2014-09-10 17:26:46 -07:00
SuperManitu
83b43caf8e allow python3 2014-09-10 11:09:35 -07:00
Aaron Jacobs
0dc03d0848 Merge pull request #37 from BillyDonahue/value-efficiency
Switch to copy-and-swap idiom for operator=.
2014-09-10 10:52:26 -07:00
Billy Donahue
45cd9490cd Switch to copy-and-swap idiom for operator=.
This allows the compiler to elide a copy when rhs is a temporary.
2014-09-10 10:37:34 -07:00
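An illustrative copy-and-swap sketch (`Widget` is a stand-in, not `Json::Value`): taking the right-hand side by value lets the compiler elide the copy when rhs is a temporary, and assignment reduces to a swap.

```cpp
#include <string>
#include <utility>

class Widget {
public:
  Widget& operator=(Widget other) {  // by value: the copy (or move) happens here
    swap(other);                     // take other's state
    return *this;                    // the old state dies with 'other'
  }
  void swap(Widget& other) { std::swap(data_, other.data_); }

private:
  std::string data_;
};
```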
Christopher Dunn
236db83742 ws 2014-09-10 10:35:01 -07:00
findblar
a70b00750d pull request #35 from finblarr:patch-1
fix build directory, within repo tree
2014-09-10 10:32:51 -07:00
Christopher Dunn
033677cc1a Merge pull request #30 from mloy/redundant-strlen 2014-09-03 14:07:40 -07:00
Christopher Dunn
9d694516a0 clarify return value 2014-09-03 13:54:49 -07:00
Christopher Dunn
d94caac1ea ws 2014-09-03 13:46:37 -07:00
mloy
8eb6f88a87 snprintf does return a signed integer
assert if returned value is neagtive
2014-09-03 13:37:17 -07:00
Matthias Loy
64d591b720 snprintf already calculated the length 2014-09-03 13:37:17 -07:00
Matthias Loy
fe2cd01e80 free does nothing if parameter equals NULL 2014-09-03 13:37:17 -07:00
Christopher Dunn
b02ff20bd3 Merge pull request #33 from donmilham/master
added option to FastWriter which omits the trailing new line character
2014-09-03 13:32:53 -07:00
Don Milham
5bf16105b5 added option to FastWriter which omits the trailing new line character 2014-09-02 17:09:07 -06:00
Christopher Dunn
3515db184a Merge pull request #29 from mloy/type-punned-pointer
Type punned pointer

I'll revert this if anyone reports a problem. *strict-aliasing* is not my favorite compiler warning.
2014-08-13 23:41:05 -07:00
Matthias Loy
48d9a92a1b do intermediate step in order to omit "dereferencing type-punned pointer" error 2014-08-13 13:20:29 +02:00
Matthias Loy
f97723dbb7 provoke compile error:
"dereferencing type-punned pointer will break strict-aliasing rules [-Werror=strict-aliasing]"
2014-08-13 13:19:02 +02:00
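An illustrative sketch of one way to do the intermediate step (`bitsOf` is a stand-in helper, not the jsoncpp code): copying through `memcpy` instead of dereferencing a `reinterpret_cast` keeps `-Werror=strict-aliasing` quiet and is well-defined.

```cpp
#include <cstdint>
#include <cstring>

inline std::uint64_t bitsOf(double d) {
  // return *reinterpret_cast<std::uint64_t*>(&d);  // "dereferencing type-punned
  //                                                //  pointer will break strict-aliasing rules"
  std::uint64_t bits;
  std::memcpy(&bits, &d, sizeof bits);              // the intermediate step
  return bits;
}
```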
Christopher Dunn
1a6426ac19 Merge pull request #21 from hiharin/master
Hmmm. Not ideal. A round-trip should reproduce the original, but null -> NaN -> ? But I guess it's no worse than it was.

The different behavior for Win CE is troubling, but it only affects people who are using these extreme values.

I've worked with Inf/NaN before, so I understand your pain.
2014-08-13 02:03:55 -07:00
David West
bc5dbc6d41 Patch for bug #53 on version 0.5.0
This is a patch that we have utilized at IDEXX Labs for the the bug described above.
We have tested and verified this on x86 32 and 64 bit linux and 32 bit arm.
2014-08-13 02:03:33 -07:00
Christopher Dunn
1ac2295c21 Merge pull request #27 from egor-tensin/master
Fixed deprecated target file path location

+1 for fixing indentation!
2014-08-13 02:03:18 -07:00
Egor Tensin
81d16dfda1 Fixed deprecated target file path location 2014-08-13 02:02:53 -07:00
Christopher Dunn
c138933784 Merge pull request #26 from alex-ac/master
Fix CMake subproject behaviour.

Sweet. But doesn't this assume that people call the subproject `jsoncpp`? It used to be `json-cpp`.
2014-08-13 02:02:33 -07:00
Aleksandr Derbenev
b3deb61f87 Fix CMake subproject behaviour. 2014-08-13 02:01:38 -07:00
Christopher Dunn
740e0207b1 Merge pull request #25 from cgilling/master
add tests to check that exceptions are thrown for wrong types

Nice!

For the record, I would have put the add-failure into the `try` block, for simplicity.
2014-08-13 02:01:11 -07:00
Chris Gilling
97c77b4a86 add tests to check that exceptions are thrown for wrong types
* Add JSONTEST_ASSERT_THROWS macro to test if an expression
  throws an exceptions.
* add JSONTEST_FIXTURE(ValueTest, typeChecksThrowExceptions)
2014-08-13 02:00:41 -07:00
Christopher Dunn
7ebdabc059 Merge pull request #23 from mloy/msvc2010
Solution and project files for MSVC 2010

We'll just trust you on this. Thanks for the contribution.
2014-08-13 01:59:52 -07:00
mloy
c6d9424f71 project files for msvc2010 2014-08-13 01:57:45 -07:00
mloy
19d0ece5f9 add solution for msvc 2010 2014-08-13 01:57:45 -07:00
Christopher Dunn
35e4f2abd6 Merge pull request #22 from AlexeyKruchinin/patch-1
Update README.md
2014-08-13 01:01:37 -07:00
Alexey Kruchinin
b548cdf49c Update README.md 2014-08-06 23:24:34 -04:00
Christopher Dunn
3b9b7402fd Merge pull request #14 from eightnoteight/master
header.add_file (version.h) temporarily commented | header include path modified
2014-07-13 23:52:14 -07:00
eightnoteight
3585477f33 add file version.h temporarily commented | header include path modified 2014-07-14 11:11:14 +05:30
pffang
27e3263894 WinCE Compatibility Fix
Note: str.imbue and std::locale::classic() are not supported on WINCE
2014-07-10 20:27:52 -07:00
Christopher Dunn
8582876c5c vim modelines 2014-07-10 20:24:23 -07:00
Christopher Dunn
496c655523 fix numeric locale
In some locales (e.g. de_DE) floats have commas instead of
dots, but JSON requires dots.
See:
  https://github.com/open-source-parsers/jsoncpp/pull/9
  https://github.com/open-source-parsers/jsoncpp/pull/3
2014-07-10 20:24:23 -07:00
Christopher Dunn
49c732607b Revert "Merge pull request #7 from steffen-kiess/fix-locale"
This reverts commit 0db9d6ea01, reversing
changes made to 06dcb1fc89.

For discussion, see
  https://github.com/open-source-parsers/jsoncpp/pull/9
  https://github.com/open-source-parsers/jsoncpp/pull/3
2014-07-10 19:59:26 -07:00
Christopher Dunn
655a9db0cc Merge pull request #11 from cdunn2001/inc
improve some includes
2014-07-09 21:53:40 -07:00
Christopher Dunn
f3989977c0 rm generated version.h 2014-07-09 21:48:49 -07:00
Christopher Dunn
60f778b9fc relative include 2014-07-09 21:40:23 -07:00
Christopher Dunn
50f6779578 Merge pull request #10 from cdunn2001/doxy
Doxy
2014-07-09 21:29:18 -07:00
Christopher Dunn
5a65132e72 README/NEWS links 2014-07-09 11:48:27 -07:00
Christopher Dunn
bef834edca update docs for open-source-parsers org 2014-07-09 11:48:27 -07:00
Christopher Dunn
0973f2e6bc after doxygen -u 2014-07-09 11:48:27 -07:00
Christopher Dunn
5031a59518 doxygen changed 2014-07-09 11:48:27 -07:00
Christopher Dunn
5850b83a5b moved roadmap to wiki 2014-07-09 11:48:26 -07:00
Christopher Dunn
9dd7eea945 fix doxybuild.py paths 2014-07-09 11:48:26 -07:00
Christopher Dunn
35bea41bc9 update docs for github 2014-07-09 11:48:26 -07:00
Christopher Dunn
542cd1d3f5 remove some sourceforge links 2014-07-09 11:48:26 -07:00
Christopher Dunn
ba50403414 ignore doxygen stuff 2014-07-09 11:48:26 -07:00
Christopher Dunn
0db9d6ea01 Merge pull request #7 from cdunn2001/fix-locale
Use std::stringstream instead of snprintf() for double->string conversion
2014-07-09 11:47:48 -07:00
Steffen Kieß
b8aaa03367 Use std::stringstream instead of snprintf() for double->string conversion
`snprintf()` will use the current `LC_NUMERIC` locale
for converting a double to a string,
which will use a `,` instead of a `.` in some locales (e.g. de_DE).
`std::stringstream` allows setting the locale to `"C"` to always get a `.`.
This occurs only for that `stringstream` instance; no global is
altered.
2014-07-09 11:46:00 -07:00
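A minimal sketch of the imbue-the-stream approach described in this commit (illustrative only, not the exact code merged into jsoncpp):

    #include <locale>
    #include <sstream>
    #include <string>

    // Convert a double to a JSON-compatible string regardless of LC_NUMERIC.
    std::string doubleToJsonString(double value) {
      std::ostringstream oss;
      oss.imbue(std::locale::classic()); // affects only this stream; no global state is touched
      oss.precision(16);
      oss << value;
      return oss.str();                  // always uses '.' as the decimal separator
    }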
Christopher Dunn
06dcb1fc89 cmake updates this 2014-07-08 21:57:12 -07:00
Christopher Dunn
973de3988b Merge pull request #1 from cdunn2001/patch-renu555
dead-code patch by renu555
2014-07-05 19:10:34 -07:00
renu555
41b79398a3 Always true condition.
for (int index = 0; index < size && !isMultiLine; ++index) 
In addition to the dead code, the `!isMultiLine` check in the loop condition above is pointless: it is always true at that point, so the loop effectively depends only on `index < size`.
The mentioned test case still passes with this change.
2014-07-05 19:05:41 -07:00
renu555
66b77384d8 Fix dead code scenario.
Changes explained
2014-07-05 19:05:41 -07:00
renu555
17c244e644 Fixing unreachable condition.
The `if (!isMultiLine)` at line 563 means `isMultiLine` is false whenever the true branch is taken, so the `&&` condition at line 571 can never be true.
Likewise, the `!isMultiLine` in the loop condition at line 568 is always true, so that loop depends only on `index < size`.
Hence the use of `&&` instead of `||` at line 571 looks like a logic mistake.
2014-07-05 19:05:41 -07:00
Christopher Dunn
8050d8b677 Merge pull request #6 from cdunn2001/fix-static-init
I will try to pull the other changes from Chromium as well.

This passed Travis.
2014-07-05 17:39:19 -07:00
Christopher Dunn
28836b8acc fix bug for static init
Ugh! Static initialization of instance variables is a very bad idea.

This fix is taken from the Chromium code-base. It includes their
double-fix for ARM.

* https://codereview.chromium.org/24984004
* https://src.chromium.org/viewvc/chrome?revision=226099&view=revision
* https://code.google.com/p/webrtc/issues/detail?id=1777
2014-07-05 17:36:20 -07:00
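For context, a common way to avoid the static-initialization-order problem described here is to replace a namespace-scope object with a function-local static that is constructed on first use. This is only an illustration of the general pattern, not the Chromium patch itself:

    #include <json/json.h>

    // Hypothetical accessor: the local static is initialized the first time the
    // function runs, so it can never be observed before construction, unlike a
    // namespace-scope `const Json::Value null;` whose initialization order
    // relative to other translation units is unspecified.
    const Json::Value& nullSingleton() {
      static const Json::Value null;
      return null;
    }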
Aaron Jacobs
47f1577fd3 Gave the license section a makeover. 2014-07-01 13:13:46 +10:00
Aaron Jacobs
9cc184146f Gave the test output section a makeover. 2014-07-01 13:13:03 +10:00
Aaron Jacobs
c151549937 Gave the reader/writer test section a makeover. 2014-07-01 13:11:05 +10:00
Aaron Jacobs
d61fa29da8 Gave the amalgamated source section a makeover. 2014-07-01 13:09:15 +10:00
Aaron Jacobs
b2a086adeb Gave the documentation section a makeover. 2014-07-01 13:07:21 +10:00
Aaron Jacobs
8d0f8d0dcd Gave the testing section a makeover. 2014-07-01 13:06:40 +10:00
Aaron Jacobs
49ed6c9774 Gave the scons section a makeover. 2014-07-01 12:06:16 +10:00
Aaron Jacobs
38c9826423 Gave the cmake section a makeover. 2014-07-01 12:02:57 +10:00
Aaron Jacobs
969921748b Gave the using section a makeover. 2014-07-01 11:59:25 +10:00
Aaron Jacobs
ff22ca7973 Gave the introduction section a makeover. 2014-07-01 11:56:56 +10:00
Aaron Jacobs
4b687640cb Began converting the README to Markdown. 2014-07-01 11:54:14 +10:00
Aaron Jacobs
3a0c4fcc82 Ran clang-format again. 2014-07-01 09:20:48 +10:00
Aaron Jacobs
445328ace6 Fixed some clang-format weirdness. 2014-07-01 09:15:11 +10:00
Aaron Jacobs
9fa4e849a1 Ran clang-format over all .h and .cpp files.
clang-format -i $(find . -name '*.h' -or -name '*.cpp')
2014-07-01 08:48:54 +10:00
Aaron Jacobs
1b137a3802 Set BinPackParameters to false.
This option personally drives me crazy. I think it's much more readable
to be able to see one parameter per line when there are many.
2014-07-01 08:47:46 +10:00
Aaron Jacobs
fd6ada015e Added a clang-format config file, in preparation for formatting jsoncpp.
clang-format -style=llvm -dump-config > .clang-format
2014-07-01 08:45:32 +10:00
Aaron Jacobs
8540868347 Fixed some cruft in the Travis CI config file. 2014-06-30 19:57:59 +10:00
Aaron Jacobs
2de98d9bd1 Updated notification settings for Travis CI. 2014-06-30 19:51:29 +10:00
Christopher Dunn
6764059395 fix stdexcept
https://sourceforge.net/p/jsoncpp/bugs/68/
2014-05-13 09:49:25 +00:00
Aaron Jacobs
5d32295a6e Fixed a test that causes a crash when exceptions are disabled.
While I was at it, corrected whitespace too.
2014-04-23 23:57:59 +00:00
Aaron Jacobs
68db655347 Added structured error reporting to Reader.
This allows applications for interactively viewing or editing JSON to do
a better job of highlighting errors. Also added offset accessors to
Value, offering the same sort of functionality even for non-errors.

Thanks to Zach Clifford (zacharyc@google.com) for the patch.
2014-04-23 23:41:12 +00:00
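A hedged usage sketch of the interface this commit describes; the names below (getStructuredErrors, offset_start, offset_limit, message) are taken from the jsoncpp reader header and may differ between releases:

    #include <json/json.h>
    #include <iostream>
    #include <vector>

    int main() {
      Json::Reader reader;
      Json::Value root;
      if (!reader.parse("{ \"key\" : }", root)) {
        // Each error carries character offsets, which an interactive viewer or
        // editor can use to highlight the offending span.
        std::vector<Json::Reader::StructuredError> errs = reader.getStructuredErrors();
        for (size_t i = 0; i < errs.size(); ++i)
          std::cout << "[" << errs[i].offset_start << ", " << errs[i].offset_limit
                    << "): " << errs[i].message << "\n";
      }
      return 0;
    }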
Aaron Jacobs
642befc836 Added features that allow the reader to accept common non-standard JSON.
This is a version of patch #17, from Clay Wood:

    http://sourceforge.net/p/jsoncpp/patches/17/
2014-04-23 23:28:23 +00:00
Christopher Dunn
77cd83890d vim modeline
http://vim.wikia.com/wiki/Modeline_magic
2014-04-19 21:41:03 +00:00
Christopher Dunn
09439b7bc7 Comment reading/writing improvements
This patch fixes some aspects of reading and writing comments:
- Multiple C++-style comments before a Json value had extra newlines appended to them. This patch removes the addition of those newlines.
- Comments written before Json values in the StyledWriter were not indented to match the indentation level of the value. This patch adds indentation to comments.
- Fixed inconsistency in newlines following C- and C++-style comments being saved as part of the comment. All newlines at the end of a comment are now removed.
- Added an additional test of comments.

https://sourceforge.net/p/jsoncpp/patches/25/
2014-04-19 21:19:24 +00:00
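A small usage sketch of the comment API touched by this patch: comments are attached with setComment() and a CommentPlacement, and StyledWriter emits them with the value. Comment strings must begin with "//" or "/*".

    #include <json/json.h>
    #include <iostream>

    int main() {
      Json::Value root;
      root["name"] = "jsoncpp";
      // commentBefore places the comment on the line(s) preceding the value.
      root["name"].setComment("// project name", Json::commentBefore);
      Json::StyledWriter writer;
      std::cout << writer.write(root);
      return 0;
    }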
Christopher Dunn
ea0797351f JSON_ASSERT -> JSON_ASSERT_MESSAGE
This way, assertions can produce exceptions.
https://sourceforge.net/p/jsoncpp/bugs/67/
2014-04-19 06:37:23 +00:00
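A hedged sketch of the idea (the names below are placeholders, not jsoncpp's exact macro definitions in json/assertions.h): an assertion macro that carries a message can throw when exceptions are enabled and fall back to assert() otherwise.

    #include <cassert>
    #include <stdexcept>

    #if JSON_USE_EXCEPTION
    #define SKETCH_ASSERT_MESSAGE(condition, message)                              \
      do { if (!(condition)) throw std::runtime_error(message); } while (false)
    #else
    #define SKETCH_ASSERT_MESSAGE(condition, message) assert((condition) && (message))
    #endif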
Aaron Jacobs
94d17e9fdf Added missing includes for std::istream.
Thanks to Quentin Fiard for the report.
2014-01-29 00:13:38 +00:00
Baptiste Lepilleur
a3f19c23a0 Fixed broken build on VS 2012 2013-09-23 14:10:39 +00:00
Aaron Jacobs
d2618806ba Fixed some snprintf-related build breakages in Visual Studio. 2013-08-08 23:08:28 +00:00
Aaron Jacobs
36400ac0c1 Updated two calls to sprintf that I missed in r269. 2013-08-08 00:39:32 +00:00
Aaron Jacobs
32ffb931e7 Replaced the complex implementation of valueToString(double).
The previous one was confusing and prone to buffer overflows, and didn't
work correctly with 16-decimal-digit numbers. The new one simply uses
snprintf with a standard format string.

The major change is that we don't always print a decimal point now.
Fortunately, JSON doesn't distinguish between integers and reals.
2013-08-08 00:39:12 +00:00
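A sketch of the snprintf-based approach described above (the exact format string and buffer size used by jsoncpp may differ):

    #include <stdio.h>
    #include <string>

    std::string valueToStringSketch(double value) {
      char buffer[32];
      // "%.16g" prints up to 16 significant digits and drops the trailing
      // decimal point for integral values, e.g. 17.0 -> "17".
      snprintf(buffer, sizeof(buffer), "%.16g", value);
      return buffer;
    }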
Aaron Jacobs
bb53cd0899 Added more floating point tests.
The first demonstrates a bug that I will soon fix.
2013-08-08 00:37:39 +00:00
Aaron Jacobs
4c531bb584 Added further floating point tests. 2013-08-08 00:13:10 +00:00
Aaron Jacobs
42d918b7aa Switched away from sprintf, which is prone to buffer overflows.
Most reasonable platforms have this function. If you're here because
this broke the build for you, consider adding an ifdef for your platform
and using sprintf there (but not on other platforms).
2013-08-06 23:12:56 +00:00
Baptiste Lepilleur
700b38020e - CMake: added an option to fail compilation if a warning occurs, and enabled warning level 4 with MSVC.
- Fixed some warnings
2013-05-09 18:42:33 +00:00
Baptiste Lepilleur
7b62ceacee - disabled warning 4786 for VS6 caused by STL (identifier was truncated to '255' characters in the debug information)
- added batchbuild config for XP VM
2013-05-09 16:24:13 +00:00
Baptiste Lepilleur
cb5ae30f6e Added simple batch build script for CMake. 2013-05-09 15:22:14 +00:00
Baptiste Lepilleur
58b6541478 Added missing source file to CMakeLists.txt. 2013-05-09 15:21:06 +00:00
Baptiste Lepilleur
1ccfdfcb9b 2013-05-09 15:20:32 +00:00
Baptiste Lepilleur
71860de813 Fixed continuous integration matrix for debug/release build. Made static debug build verbose. 2013-05-08 22:23:07 +00:00
Baptiste Lepilleur
c515b8ec30 Added continuous integration matrix for debug/release build. Made static debug build verbose. 2013-05-08 22:15:15 +00:00
Baptiste Lepilleur
5fff185aa4 Added continuous integration matrix for shared/static library (specified through environment variables). 2013-05-08 22:04:57 +00:00
Baptiste Lepilleur
10712e85d6 Added continuous integration failure e-mail notification. 2013-05-08 21:23:52 +00:00
Baptiste Lepilleur
53c08ad916 Added clang compiler for continuous integration. 2013-05-08 21:04:42 +00:00
Baptiste Lepilleur
79e90fba0b Added basic Travis CI integration contributed by Igor Okulist. 2013-05-08 20:46:56 +00:00
Baptiste Lepilleur
ce277aa6e4 Fixed CMake / Unix build instructions. 2013-05-08 20:37:54 +00:00
Baptiste Lepilleur
eafd702a17 - New CMake based build system. Based in part on contribution from
Igor Okulist and Damien Buhl (Patch #14). Added support for running
tests and building with DLL on Windows.
- added missing JSON_API
- Visual Studio DLL: suppressed warning "C4251: <data member>: <type> 
needs to have dll-interface to be used by..." via pragma push/pop
in json-cpp headers.
- New header json/version.h now contains version number macros
(JSONCPP_VERSION_MAJOR, JSONCPP_VERSION_MINOR, JSONCPP_VERSION_PATCH
and JSONCPP_VERSION_HEXA). While this header is generated by CMake,
it is committed to ease building with alternate build systems
(CMake only updates the file when it changes, avoiding issues with the VCS).
2013-05-08 20:21:11 +00:00
Baptiste Lepilleur
a8afdd40af - Patch #3393345: BOOST_FOREACH compatibility. Made Json::iterator more standard-compliant, added missing iterator_category and value_type typedefs (contributed by Robert A. Iannucci).
- Patch #3474563: added missing JSON_API on some classes causing link issues when building as a dynamic library on Windows (contributed by Francis Bolduc).
2013-04-12 14:10:13 +00:00
Baptiste Lepilleur
f92ace5e82 Patch #3600941: Missing field copy in Json::Value::iterator causing infinite loop when using experimental internal map (#define JSON_VALUE_USE_INTERNAL_MAP) (contributed by Ming-Lin Kao). 2013-04-12 13:26:23 +00:00
Baptiste Lepilleur
3f124172ce Patch #3539678: Copy constructor does not initialize allocated_ for stringValue (contributed by rmongia). 2013-04-12 13:11:14 +00:00
Baptiste Lepilleur
f8715856f3 Fix gcc -Wall warnings (patch from Matt McCormick) 2013-02-18 15:53:47 +00:00
Baptiste Lepilleur
42321f24a6 Fixed warning(error?) on #if testing value of _MSC_VER without checking that it was defined. 2012-12-20 10:08:50 +00:00
Baptiste Lepilleur
aff1171153 Added missing "include/json/assertions.h" header in amalgamate.py. 2012-07-27 09:06:40 +00:00
Aaron Jacobs
ae3c7a7aab Made it possible to drop null placeholders from array output.
This can be used when it's clear that the consumer is able to deal with
this, as web browsers are. Thanks to Yatin Chawathe for the patch.
2012-03-12 04:53:57 +00:00
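A usage sketch of the FastWriter setter this change introduces (dropNullPlaceholders); only enable it when the consumer, such as a web browser, accepts the shortened array syntax:

    #include <json/json.h>
    #include <iostream>

    int main() {
      Json::Value arr(Json::arrayValue);
      arr[2] = 7;                        // indexes 0 and 1 are filled with nulls
      Json::FastWriter writer;
      std::cout << writer.write(arr);    // [null,null,7]
      writer.dropNullPlaceholders();
      std::cout << writer.write(arr);    // [,,7]
      return 0;
    }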
Aaron Jacobs
f572e8e42e Added an exit() to JSON_FAIL_MESSAGE to fix "no return" errors. 2012-01-08 23:49:55 +00:00
Aaron Jacobs
2b853c4067 Got rid of several unnecessary includes of <iostream>.
Including <iostream> causes the file to be polluted with a static
initializer for the __ioinit symbol. This can harm binary startup time.
For more info, see here:

    http://neugierig.org/software/chromium/notes/2011/08/static-initializers.html
2011-12-22 03:18:24 +00:00
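The pattern behind this cleanup, sketched under the assumption that a header only needs to declare stream operators: forward-declare the stream types via <iosfwd> in headers and include the heavier headers only in the implementation file.

    // In a header: no <iostream>, so no __ioinit static initializer is pulled
    // into every translation unit that includes it.
    #include <iosfwd>

    namespace Json {
    class Value;
    std::ostream& operator<<(std::ostream& out, const Value& value);
    }

    // The .cpp file that defines the operator then includes <ostream>.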
Aaron Jacobs
7c507d7eba Made JSON_USE_EXCEPTION's value in config.h a default that can be overridden.
This allows users to override it with their compiler invocation. For example:

    g++ -D JSON_USE_EXCEPTION=0 ...
2011-09-14 08:41:37 +00:00
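The pattern the commit describes for config.h, sketched (the surrounding jsoncpp config header contains more than this):

    // Default is on; builds can override it, e.g. g++ -D JSON_USE_EXCEPTION=0
    #ifndef JSON_USE_EXCEPTION
    #define JSON_USE_EXCEPTION 1
    #endif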
Christopher Dunn
c3763f55da Updated bug-fix list. 2011-06-24 21:15:30 +00:00
Christopher Dunn
3b3540d9ef bug#2407932: strpbrk() could fail for NULL pointer. 2011-06-22 21:04:41 +00:00
Christopher Dunn
9d317c3794 bug#3306345: minor typo in Path::resolve() -- missing bang. 2011-06-22 08:30:21 +00:00
Christopher Dunn
03288e8eb6 (bug#3314841) Fixed JSON_IS_AMALGAMATION. Using os.path for OSX filename compatibility. 2011-06-22 00:43:31 +00:00
Christopher Dunn
c9f91dd929 More missing constructor initializers found by Coverity. 2011-06-21 23:02:06 +00:00
Christopher Dunn
2ba3bc3252 Another simple addition for constructor initialization, PathArgument. 2011-06-21 22:08:49 +00:00
Christopher Dunn
ac5df77bbc Simple changes to Reader initialization, from Chromium folks. (I do not think this was submitted as a bug.) 2011-06-21 21:56:54 +00:00
Christopher Dunn
468564b3fe More eol changes. 2011-06-21 21:53:02 +00:00
Christopher Dunn
dc0f736f59 Switched CRLF to LF in repo, and added svn:eol-style native. I might have missed a few files though. Just committing what I have so far. 2011-06-21 21:18:49 +00:00
Christopher Dunn
139da63aef Just testing whether I can still commit changes. I cannot tell my access-level from the sf project page. 2011-06-21 20:34:40 +00:00
Baptiste Lepilleur
d496e044b1 Fixed unit tests execution on MSVC 6 by removing usage of std::numeric_limits. It was returning 0 value in some max cases. Fixed Value::asFloat() to use integerToDouble(). 2011-05-27 08:12:41 +00:00
Baptiste Lepilleur
f587e6a420 Fixed compilation issues with MSVC 6: replaced usage of ostringstream with valueToString to support 64-bit integer and high-precision floating-point conversion to string. Replaced usage of ULL and LL literals with UInt64(expr) and Int64(expr). Introduced helper function uint64ToDouble() to work around the missing conversion. Unit tests do not pass yet. 2011-05-26 22:55:24 +00:00
Baptiste Lepilleur
f0b24e705f Fixed MSVS 2003, 2005 and 2008 tests execution by normalizing floating-point string representation using helper normalizeFloatingPointStr(). 2011-05-26 20:14:32 +00:00
Baptiste Lepilleur
e807a7640e Fixed unit test failure on IBM AIX xlC by hard-coding the maxUInt64AsDouble as double constant instead of relying on double(Value::maxUInt64) which produces an incorrect value. 2011-05-26 17:14:26 +00:00
Baptiste Lepilleur
d3cd9a7fc5 - Fixed unit test compilation on MSVS 2003, 2005 and 2008.
- Worked around unit test failures with MSVS* by "forcing" all floating-point numbers to be loaded from memory instead of FPU registers.
2011-05-26 07:32:36 +00:00
Aaron Jacobs
a2fb7fb918 Fixed some test bugs that show up when 64-bit mode is disabled. 2011-05-26 06:58:52 +00:00
Aaron Jacobs
4b819c2309 Added a few test cases that Google is using internally for patches made
in the past.
2011-05-26 06:58:14 +00:00
Aaron Jacobs
c649badb95 Another round of attempting to fix VC++ errors... 2011-05-26 03:44:02 +00:00
Aaron Jacobs
a9eb1eccc0 Fixed more default cases. 2011-05-26 03:32:11 +00:00
Aaron Jacobs
6ffff91c54 Got rid of some unreachable code. 2011-05-26 03:27:44 +00:00
Aaron Jacobs
acdefb0869 Fixed a double -> float compilation warning/error. 2011-05-26 03:04:01 +00:00
Aaron Jacobs
c025697ea5 Reworked the type conversion system again, so that:
* isFoo methods determine exact representability.
* asFoo methods cause casting when safe.
* isConvertibleTo indicates whether casting is safe.

See NEWS.txt for details, and the usage sketch below.
2011-05-26 02:46:28 +00:00
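A usage sketch of the rules listed above (the expected results in the comments follow the description in NEWS.txt; booleans print as 1/0):

    #include <json/json.h>
    #include <iostream>

    int main() {
      Json::Value v(17.0);                     // realValue holding an integral number
      std::cout << v.isInt() << "\n";          // 1: exactly representable as int
      std::cout << v.isDouble() << "\n";       // 1: every numeric value is
      std::cout << v.asInt() << "\n";          // 17: safe cast

      Json::Value big(4.5e15);
      std::cout << big.isInt() << "\n";                          // 0: does not fit in int
      std::cout << big.isConvertibleTo(Json::intValue) << "\n";  // 0: asInt() would fail
      std::cout << big.asDouble() << "\n";                       // fine: 4.5e+15
      return 0;
    }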
Aaron Jacobs
b0ec41c3e3 Made the unit test's output more readable, adding to jsontest's
capabilities (and simplifying its implementation) in the process.
2011-05-26 00:30:39 +00:00
Aaron Jacobs
2a2b5cf3ad Made jsontest work with 64-bit integers, and fixed an error. 2011-05-26 00:12:48 +00:00
Aaron Jacobs
b6620e2801 Removed some out of date TODOs. 2011-05-25 23:26:58 +00:00
Aaron Jacobs
ccde848fd1 Fixed test failures with 64-bit support disabled. 2011-05-25 05:53:59 +00:00
Aaron Jacobs
e082248001 Fixed a 'comparison between signed and unsigned' error. 2011-05-25 05:50:13 +00:00
Aaron Jacobs
7b5edd9859 Added line breaks to make error messages easier to read. 2011-05-25 04:59:57 +00:00
Aaron Jacobs
e91a68cb9e Fixed a compilation warning/error. 2011-05-25 04:34:57 +00:00
Aaron Jacobs
1b138e8544 Gave a more consistent behavior to the Value::isFoo methods. See
NEWS.txt for more details.
2011-05-25 04:19:17 +00:00
Aaron Jacobs
4f081b50e6 Fixed bugs in asInt64 and asUInt64. 2011-05-25 03:16:49 +00:00
Aaron Jacobs
3c9fdeb859 Added tests for default numeric values. 2011-05-25 02:54:11 +00:00
Aaron Jacobs
4b79fd1a00 Fixed a test bug. 2011-05-25 01:51:30 +00:00
Aaron Jacobs
e12d84ebaa Made tests more comprehensive. 2011-05-25 01:46:50 +00:00
Aaron Jacobs
078e0d7c37 Gave tests more general names in preparation for making them much more
comprehensive.
2011-05-25 01:24:23 +00:00
Aaron Jacobs
fee49b1a37 Fixed some whitespace. 2011-05-25 01:23:47 +00:00
Aaron Jacobs
22eede44c1 Added tests for 64-bit integers. 2011-05-25 01:23:08 +00:00
Aaron Jacobs
d9ec234fc2 Greatly fleshed out numeric type tests. 2011-05-25 01:04:07 +00:00
Aaron Jacobs
3e5b347f75 Added some missing checks. 2011-05-25 01:03:29 +00:00
Aaron Jacobs
96408a30e1 Renamed test cases to make more sense with the upcoming new behavior of
isFoo methods.
2011-05-25 00:39:55 +00:00
Aaron Jacobs
1d648f089a Fixed a whitespace problem. 2011-05-25 00:39:17 +00:00
Aaron Jacobs
f40c880585 Fixed a "comparison between signed and unsigned" warning/error. 2011-05-24 23:08:59 +00:00
Aaron Jacobs
39ba2dbea9 Added a .gitignore file, for ease of use with git-svn. 2011-05-24 23:05:56 +00:00
Aaron Jacobs
a761530f14 Fixed a missing include error. 2011-05-24 06:27:36 +00:00
Aaron Jacobs
ae9ffb5443 Fixed a parsing bug in decodeNumber, updating the failing test cases to be
correct in the process. (The test cases incorrectly used exact integers instead
of scientific notation.)
2011-05-24 03:59:24 +00:00
Aaron Jacobs
e656c5fa2d Added some test cases that catch a parsing bug. 2011-05-24 03:19:50 +00:00
Aaron Jacobs
f1053e7acb Fixed a bunch of compilation errors when JSON_HAS_INT64 is set. 2011-05-24 03:18:02 +00:00
Aaron Jacobs
e3d0eca9f4 Centralized assertion macros and made them obey JSON_USE_EXCEPTION. 2011-05-24 01:03:22 +00:00
Aaron Jacobs
a77a803c85 Made two security fixes. 2011-05-24 00:43:59 +00:00
Aaron Jacobs
785ba2675d Updated a cast to use a more appropriate type. 2011-05-24 00:43:30 +00:00
Aaron Jacobs
3b556ec633 Fixed constructor initializer list order warnings/errors. 2011-05-24 00:42:58 +00:00
Aaron Jacobs
5fb0f09cbb Removed an unused typedef. 2011-05-24 00:42:15 +00:00
Aaron Jacobs
73911f2e33 Fixed a hard to debug crash on OS X related to sscanf format strings.
See here for more info:
    http://developer.apple.com/library/mac/#DOCUMENTATION/DeveloperTools/gcc-4.0.1/gcc/Incompatibilities.html
2011-05-24 00:41:12 +00:00
Baptiste Lepilleur
d21c256fae Released 0.6.0-rc2 2011-05-02 22:07:18 +00:00
Baptiste Lepilleur
72c406b550 Release 0.6.0-rc2 2011-05-02 21:30:42 +00:00
Baptiste Lepilleur
eadc478e50 Fixed typo: amalga*ma*te. Replaced macro JSON_IS_AMALGATED with JSON_IS_AMALGAMATION 2011-05-02 21:09:30 +00:00
Baptiste Lepilleur
1837a1c508 Value::compare() is now const and has an actual implementation with unit tests. 2011-05-02 20:11:48 +00:00
Baptiste Lepilleur
e3cc0f004b Untabified some sources 2011-05-02 18:41:01 +00:00
Baptiste Lepilleur
fb17080142 - Added unit tests for comparison operators (except compare())
- Fixed Value::operator <= implementation (it had the semantics of operator >=). Found when adding unit tests for comparison operators.
2011-05-02 16:53:10 +00:00
Baptiste Lepilleur
e0e1fd37cd - Bug #3200841: removed "warning C4127: conditional expression is constant" concerning infinite loop by replacing while (true) with for (;;). Added new JSON_FAIL macro. Commented unused parameters. 2011-05-02 16:51:48 +00:00
Baptiste Lepilleur
d0a9f3d98d Bug #3200841: removed "warning C4127: conditional expression is constant" concerning infinite loop by replacing while (true) with for (;;). 2011-05-02 09:54:49 +00:00
Baptiste Lepilleur
7953a801c1 Released 0.6.0-rc1 2011-05-02 07:30:45 +00:00
Baptiste Lepilleur
df4de558c3 Need more tests on unicode 2011-05-02 07:06:33 +00:00
Baptiste Lepilleur
62d7bc75db Added support for amalgated source and header generation (a la sqlite). Refer to README.txt section "Generating amalgated source and header" for detail.
The amalgated sources are generated by concatenating JsonCpp source in the correct order and defining the macro JSON_IS_AMALGATED to prevent inclusion of other headers. Sources and headers have been modified to prevent any inclusion when this macro is defined.

The script amalgate.py handles the generation.
2011-05-02 07:06:07 +00:00
Baptiste Lepilleur
224a1aee72 Release 0.6.0-rc1 2011-05-01 22:11:05 +00:00
Baptiste Lepilleur
40388494bd Release test-0.6.0 2011-05-01 20:50:44 +00:00
Baptiste Lepilleur
bafb43c203 Release test-0.6.0 2011-05-01 20:36:55 +00:00
Baptiste Lepilleur
64e40aafe5 Added support for amalgated source and header generation (a la sqlite). Refer to README.txt section "Generating amalgated source and header" for detail.
The amalgated sources are generated by concatenating JsonCpp source in the correct order and defining the macro JSON_IS_AMALGATED to prevent inclusion of other headers. Sources and headers have been modified to prevent any inclusion when this macro is defined.

The script amalgate.py handles the generation.
2011-05-01 20:13:40 +00:00
Baptiste Lepilleur
91923f2cbc Added project URL. 2011-05-01 18:33:46 +00:00
Baptiste Lepilleur
13698b5835 Added recommended include path. 2011-05-01 17:24:16 +00:00
Baptiste Lepilleur
5349225f43 Added known bug reference for experimental internal map. 2011-05-01 16:42:18 +00:00
Baptiste Lepilleur
b2e8cccbc6 Renamed Reader::getFormatedErrorMessages() to getFormattedErrorMessages(). Bug #3023708 ("Formatted" has two 't's). The old member function is deprecated but still present for backward compatibility. 2011-05-01 16:27:55 +00:00
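A hedged sketch of how the deprecated misspelling can be kept alongside the corrected name (the JSONCPP_DEPRECATED macro is assumed here and stubbed out so the snippet stands alone):

    #include <string>

    #ifndef JSONCPP_DEPRECATED
    #define JSONCPP_DEPRECATED(message)  // expands to a compiler attribute in the real headers
    #endif

    class ReaderSketch {
    public:
      std::string getFormattedErrorMessages() const;   // corrected spelling
      JSONCPP_DEPRECATED("Use getFormattedErrorMessages() instead.")
      std::string getFormatedErrorMessages() const;    // kept for backward compatibility
    };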
Baptiste Lepilleur
99043b32b5 Fixed bug #3139678: stack buffer overflow when parsing a double with a length of 32 characters. 2011-05-01 15:47:38 +00:00
Baptiste Lepilleur
9c98f2277b Fixed bug #3139677: JSON [1 2 3] was incorrectly parsed as [1, 3]. Error is now correctly detected.
Modified runjsontests.py to allow test that expect failure in jsoncpp test suite.
2011-05-01 15:40:47 +00:00
Baptiste Lepilleur
565a1f3d39 Fixed latest readme.txt url. 2011-05-01 15:09:16 +00:00
Baptiste Lepilleur
61324b5f77 Fixed url for scons 1.2 download. Clarify manual test run executable path. 2011-05-01 15:06:40 +00:00
Baptiste Lepilleur
842e9ac54b Major rework of 64-bit integer support: a 64-bit integer is now only returned when explicitly requested via Json::Value::asInt64(), unlike the previous implementation where Json::Value::asInt() returned a 64-bit integer.
This eases writing portable code and does not break compatibility with the previous release.

Json::Value::asLargestInt() has also been added to ease writing portable code independent of 64-bit integer support. It is typically used to implement writers. A usage sketch follows this entry.
2010-12-27 17:45:23 +00:00
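A usage sketch of the accessors named above (assumes a build with 64-bit support, i.e. JSON_NO_INT64 not defined):

    #include <json/json.h>
    #include <iostream>

    int main() {
      Json::Value big(Json::Int64(9123456789012345LL));  // does not fit in 32 bits
      // asInt() would fail here; the 64-bit value has to be requested explicitly:
      std::cout << big.asInt64() << std::endl;
      // asLargestInt() returns the widest integer type the build supports, which
      // keeps writer code portable whether or not JSON_NO_INT64 is defined:
      std::cout << big.asLargestInt() << std::endl;
      return 0;
    }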
Baptiste Lepilleur
5c5628aec2 Fixed some documentation issues pointed out by Daniel. 2010-12-24 19:58:23 +00:00
Baptiste Lepilleur
b96aed0f3e Added float Json::Value::asFloat() to obtain a floating-point value as a float (avoids the loss-of-precision warning caused by using asDouble() to initialize a float). 2010-12-24 19:30:06 +00:00
Baptiste Lepilleur
fa130ef871 - Array index can be passed as int to operator[], allowing use of literal:
Json::Value array;
  array.append( 1234 );
  int value = array[0].asInt();  // did not compile previously
2010-12-24 12:47:14 +00:00
Baptiste Lepilleur
e6046e589e updated license with clearer information 2010-04-27 16:38:30 +00:00
Baptiste Lepilleur
402c13eb3d - added unit test and roadmap for handling of escape sequence "\/" 2010-04-27 16:37:50 +00:00
Baptiste Lepilleur
7469f1d014 JsonCpp is now licensed under MIT license, or public domain if desired and recognized in your jurisdiction. 2010-04-20 21:35:19 +00:00
Baptiste Lepilleur
201fb2cf0d - Moved definition of Json::Int and Json::UInt to config.h, with compiler detection logic to define them as 64-bit integers if JSON_NO_INT64 is not defined (a simplified sketch of this kind of detection follows this entry).
- Added Json::ArrayIndex as an unsigned int to forwards.h
- Modified Json::Value to consistently use Json::ArrayIndex.
- Added int/unsigned int constructor overloads to Json::Value to avoid ambiguous constructor calls.
- Modified jsontestrunner/main.cpp to use Json::valueToString for Value::asInt() conversion to string.
- Modified Json::Reader to only overflow to double when the number is too large (previous code relied on the fact that an int fits in a double without precision loss).
- Generalized uintToString() helpers and buffer size to automatically adapt to the precision of Json::UInt.
- Added specific conversion logic for UInt-to-double conversion on Microsoft Visual Studio 6, which only supports __int64-to-double conversion (unsigned __int64 conversion is not supported).
- Added tests for 64-bit parsing/writing. Note: these will fail when compiled with JSON_NO_INT64 (more work is required to adapt them).
2010-04-19 07:37:41 +00:00
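A simplified sketch of this kind of compile-time selection, loosely modelled on the shape json/config.h eventually took rather than on this exact commit:

    // Standalone illustration; do not combine with the real json headers.
    namespace Json {
    typedef int Int;
    typedef unsigned int UInt;
    #if defined(JSON_NO_INT64)
    typedef Int LargestInt;
    typedef UInt LargestUInt;
    #else
    #if defined(_MSC_VER)                 // e.g. MSVC 6 has no 'long long'
    typedef __int64 Int64;
    typedef unsigned __int64 UInt64;
    #else
    typedef long long int Int64;
    typedef unsigned long long int UInt64;
    #endif
    typedef Int64 LargestInt;
    typedef UInt64 LargestUInt;
    #endif
    }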
Baptiste Lepilleur
377d21e145 - added need for 64 bits integer to roadmap 2010-04-14 13:17:26 +00:00
Baptiste Lepilleur
afd9cef928 Removed experimental ValueAllocator, it caused static initialization/destruction order issues (bug #2934500). The DefaultValueAllocator has been inlined in code. 2010-03-13 13:10:27 +00:00
Baptiste Lepilleur
d38ba2a2cb - extracted some utility functions out-of reader and parser. 2010-03-13 12:24:38 +00:00
Baptiste Lepilleur
130730ffd7 Added NEWS.txt that provides a synopsis of the change since the last version. Integrated NEWS.txt in documentation. 2010-03-13 11:14:49 +00:00
Baptiste Lepilleur
e1b26455e7 - added support for compilation using Microsoft Visual Studio 2008 2010-03-13 10:59:50 +00:00
Baptiste Lepilleur
cd6cb5d0e6 - better execution examples 2010-03-13 07:59:07 +00:00
Baptiste Lepilleur
0a899589c2 - add LD_LIBRARY_PATH to propagated environment variables as it is required for some compiler installations. 2010-03-13 07:55:46 +00:00
Baptiste Lepilleur
a11e47d9ad - fixed project links section name 2010-03-12 10:17:46 +00:00
Baptiste Lepilleur
59ff11281a Released 0.5.0 2010-03-12 07:46:20 +00:00
Baptiste Lepilleur
e6a77410f4 - fixed typos and added "download" section to documentation
- commit version numbers after release
2010-03-11 21:02:26 +00:00
Baptiste Lepilleur
0c5fff142d Removed experimental notification on iterators, and added experimental status for allocator (to be removed) 2010-03-11 20:23:07 +00:00
Baptiste Lepilleur
d89d7961d6 - added --no-web to skip upload to web site
- added automatic upload of source and documentation tarball on frs.sourceforge.net
2010-02-25 08:30:09 +00:00
Baptiste Lepilleur
64ba062076 - doc is now generated in dist/doxygen
- makerelease now decompresses the tarball, downloads and installs scons, runs scons check on the provided platforms, decompresses the doc tarball and uploads the doc to the project web site
2010-02-24 23:08:47 +00:00
Baptiste Lepilleur
35bdc07ebd - added source tarball decompression 2010-02-24 08:05:41 +00:00
Baptiste Lepilleur
e94d2f483b - added the following step to make_release: fix EOL in distribution source, generate source tarball.
- devtools/ was made into a python module and common utilities are being moved in this module
2010-02-23 21:00:30 +00:00
Baptiste Lepilleur
7c171ee726 - added svn export
- prepared tool for eol conversion
2010-02-23 08:44:52 +00:00
Baptiste Lepilleur
fcf145ecd4 - changed SVN EOL properties so that HTML file are in Unix format, Visual Studio solution are always in Windows format, and sources are in native format. 2010-02-23 08:23:41 +00:00
Baptiste Lepilleur
1f4847cbd9 - added (incomplete) script makerelease.py to handle svn tagging and tar balls generation 2010-02-23 07:57:38 +00:00
Baptiste Lepilleur
35503e5917 - fixed project name and version 2010-02-22 04:37:31 +00:00
Baptiste Lepilleur
57ee0e3b37 - Documentation generation is no longer handled by SCons. The script doxybuild.py is used to generate the documentation on demand.
- Added file 'version' that contains jsoncpp version number. It is used by both SConstruct and doxybuild.py.
- Updated README.txt with documentation build instruction, and instructions to add a test case.
2010-02-22 04:16:10 +00:00
Baptiste Lepilleur
8d3790d217 - added missing virtual destructor to TestCase. 2010-02-21 14:28:54 +00:00
127 changed files with 18073 additions and 8757 deletions

47
.clang-format Normal file

@@ -0,0 +1,47 @@
---
# BasedOnStyle: LLVM
AccessModifierOffset: -2
ConstructorInitializerIndentWidth: 4
AlignEscapedNewlinesLeft: false
AlignTrailingComments: true
AllowAllParametersOfDeclarationOnNextLine: true
AllowShortIfStatementsOnASingleLine: false
AllowShortLoopsOnASingleLine: false
AlwaysBreakTemplateDeclarations: false
AlwaysBreakBeforeMultilineStrings: false
BreakBeforeBinaryOperators: false
BreakBeforeTernaryOperators: true
BreakConstructorInitializersBeforeComma: false
BinPackParameters: false
ColumnLimit: 80
ConstructorInitializerAllOnOneLineOrOnePerLine: false
DerivePointerBinding: false
ExperimentalAutoDetectBinPacking: false
IndentCaseLabels: false
MaxEmptyLinesToKeep: 1
NamespaceIndentation: None
ObjCSpaceBeforeProtocolList: true
PenaltyBreakBeforeFirstCallParameter: 19
PenaltyBreakComment: 60
PenaltyBreakString: 1000
PenaltyBreakFirstLessLess: 120
PenaltyExcessCharacter: 1000000
PenaltyReturnTypeOnItsOwnLine: 60
PointerBindsToType: true
SpacesBeforeTrailingComments: 1
Cpp11BracedListStyle: false
Standard: Cpp03
IndentWidth: 2
TabWidth: 8
UseTab: Never
BreakBeforeBraces: Attach
IndentFunctionDeclarationAfterType: false
SpacesInParentheses: false
SpacesInAngles: false
SpaceInEmptyParentheses: false
SpacesInCStyleCastParentheses: false
SpaceAfterControlStatementKeyword: true
SpaceBeforeAssignmentOperators: true
ContinuationIndentWidth: 4
...

36
.gitignore vendored Normal file

@@ -0,0 +1,36 @@
/build/
*.pyc
*.swp
*.actual
*.actual-rewrite
*.process-output
*.rewrite
/bin/
/buildscons/
/libs/
/doc/doxyfile
/dist/
#/version
#/include/json/version.h
# MSVC project files:
*.sln
*.vcxproj
*.filters
*.user
*.sdf
*.opensdf
*.suo
# MSVC build files:
*.lib
*.obj
*.tlog/
*.pdb
# CMake-generated files:
CMakeFiles/
CTestTestFile.cmake
cmake_install.cmake
pkg-config/jsoncpp.pc
jsoncpp_lib_static.dir/

17
.travis.yml Normal file

@@ -0,0 +1,17 @@
# Build matrix / environment variable are explained on:
# http://about.travis-ci.org/docs/user/build-configuration/
# This file can be validated on:
# http://lint.travis-ci.org/
before_install: sudo apt-get install cmake
language: cpp
compiler:
- gcc
- clang
script: cmake -DJSONCPP_WITH_CMAKE_PACKAGE=$CMAKE_PKG -DJSONCPP_LIB_BUILD_SHARED=$SHARED_LIB -DCMAKE_BUILD_TYPE=$BUILD_TYPE -DCMAKE_VERBOSE_MAKEFILE=$VERBOSE_MAKE . && make && make jsoncpp_check
env:
matrix:
- SHARED_LIB=ON STATIC_LIB=ON CMAKE_PKG=ON BUILD_TYPE=release VERBOSE_MAKE=false
- SHARED_LIB=OFF STATIC_LIB=ON CMAKE_PKG=OFF BUILD_TYPE=debug VERBOSE_MAKE=true VERBOSE
notifications:
email:
- aaronjjacobs@gmail.com

126
CMakeLists.txt Normal file

@@ -0,0 +1,126 @@
# vim: et ts=4 sts=4 sw=4 tw=0
CMAKE_MINIMUM_REQUIRED(VERSION 2.8.5)
PROJECT(jsoncpp)
ENABLE_TESTING()
OPTION(JSONCPP_WITH_TESTS "Compile and (for jsoncpp_check) run JsonCpp test executables" ON)
OPTION(JSONCPP_WITH_POST_BUILD_UNITTEST "Automatically run unit-tests as a post build step" ON)
OPTION(JSONCPP_WITH_WARNING_AS_ERROR "Force compilation to fail if a warning occurs" OFF)
OPTION(JSONCPP_WITH_PKGCONFIG_SUPPORT "Generate and install .pc files" ON)
OPTION(JSONCPP_WITH_CMAKE_PACKAGE "Generate and install cmake package files" OFF)
# Ensures that CMAKE_BUILD_TYPE is visible in cmake-gui on Unix
IF(NOT WIN32)
IF(NOT CMAKE_BUILD_TYPE)
SET(CMAKE_BUILD_TYPE Release CACHE STRING
"Choose the type of build, options are: None Debug Release RelWithDebInfo MinSizeRel Coverage."
FORCE)
ENDIF(NOT CMAKE_BUILD_TYPE)
ENDIF(NOT WIN32)
SET(LIB_SUFFIX "" CACHE STRING "Optional arch-dependent suffix for the library installation directory")
SET(RUNTIME_INSTALL_DIR bin
CACHE PATH "Install dir for executables and dlls")
SET(ARCHIVE_INSTALL_DIR lib${LIB_SUFFIX}
CACHE PATH "Install dir for static libraries")
SET(LIBRARY_INSTALL_DIR lib${LIB_SUFFIX}
CACHE PATH "Install dir for shared libraries")
SET(INCLUDE_INSTALL_DIR include
CACHE PATH "Install dir for headers")
SET(PACKAGE_INSTALL_DIR lib${LIB_SUFFIX}/cmake
CACHE PATH "Install dir for cmake package config files")
MARK_AS_ADVANCED( RUNTIME_INSTALL_DIR ARCHIVE_INSTALL_DIR INCLUDE_INSTALL_DIR PACKAGE_INSTALL_DIR )
# Set variable named ${VAR_NAME} to value ${VALUE}
FUNCTION(set_using_dynamic_name VAR_NAME VALUE)
SET( "${VAR_NAME}" "${VALUE}" PARENT_SCOPE)
ENDFUNCTION(set_using_dynamic_name)
# Extract major, minor, patch from version text
# Parse a version string "X.Y.Z" and outputs
# version parts in ${OUPUT_PREFIX}_MAJOR, _MINOR, _PATCH.
# If parse succeeds then ${OUPUT_PREFIX}_FOUND is TRUE.
MACRO(jsoncpp_parse_version VERSION_TEXT OUPUT_PREFIX)
SET(VERSION_REGEX "[0-9]+\\.[0-9]+\\.[0-9]+(-[a-zA-Z0-9_]+)?")
IF( ${VERSION_TEXT} MATCHES ${VERSION_REGEX} )
STRING(REGEX MATCHALL "[0-9]+|-([A-Za-z0-9_]+)" VERSION_PARTS ${VERSION_TEXT})
LIST(GET VERSION_PARTS 0 ${OUPUT_PREFIX}_MAJOR)
LIST(GET VERSION_PARTS 1 ${OUPUT_PREFIX}_MINOR)
LIST(GET VERSION_PARTS 2 ${OUPUT_PREFIX}_PATCH)
set_using_dynamic_name( "${OUPUT_PREFIX}_FOUND" TRUE )
ELSE( ${VERSION_TEXT} MATCHES ${VERSION_REGEX} )
set_using_dynamic_name( "${OUPUT_PREFIX}_FOUND" FALSE )
ENDIF( ${VERSION_TEXT} MATCHES ${VERSION_REGEX} )
ENDMACRO(jsoncpp_parse_version)
# Read out version from "version" file
#FILE(STRINGS "version" JSONCPP_VERSION)
#SET( JSONCPP_VERSION_MAJOR X )
#SET( JSONCPP_VERSION_MINOR Y )
#SET( JSONCPP_VERSION_PATCH Z )
SET( JSONCPP_VERSION 1.10.0 )
jsoncpp_parse_version( ${JSONCPP_VERSION} JSONCPP_VERSION )
#IF(NOT JSONCPP_VERSION_FOUND)
# MESSAGE(FATAL_ERROR "Failed to parse version string properly. Expect X.Y.Z")
#ENDIF(NOT JSONCPP_VERSION_FOUND)
MESSAGE(STATUS "JsonCpp Version: ${JSONCPP_VERSION_MAJOR}.${JSONCPP_VERSION_MINOR}.${JSONCPP_VERSION_PATCH}")
# File version.h is only regenerated on CMake configure step
CONFIGURE_FILE( "${PROJECT_SOURCE_DIR}/src/lib_json/version.h.in"
"${PROJECT_SOURCE_DIR}/include/json/version.h"
NEWLINE_STYLE UNIX )
CONFIGURE_FILE( "${PROJECT_SOURCE_DIR}/version.in"
"${PROJECT_SOURCE_DIR}/version"
NEWLINE_STYLE UNIX )
macro(UseCompilationWarningAsError)
if ( MSVC )
# Only enabled in debug because some old versions of VS STL generate
# warnings when compiled in release configuration.
set(CMAKE_CXX_FLAGS_DEBUG "${CMAKE_CXX_FLAGS_DEBUG} /WX ")
endif( MSVC )
endmacro()
# Include our configuration header
INCLUDE_DIRECTORIES( ${jsoncpp_SOURCE_DIR}/include )
if ( MSVC )
# Only enabled in debug because some old versions of VS STL generate
# unreachable code warning when compiled in release configuration.
set(CMAKE_CXX_FLAGS_DEBUG "${CMAKE_CXX_FLAGS_DEBUG} /W4 ")
endif( MSVC )
if (CMAKE_CXX_COMPILER_ID MATCHES "Clang")
# using regular Clang or AppleClang
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wall")
elseif ("${CMAKE_CXX_COMPILER_ID}" STREQUAL "GNU")
# using GCC
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wall -Wextra -pedantic")
endif()
IF(JSONCPP_WITH_WARNING_AS_ERROR)
UseCompilationWarningAsError()
ENDIF(JSONCPP_WITH_WARNING_AS_ERROR)
IF(JSONCPP_WITH_PKGCONFIG_SUPPORT)
CONFIGURE_FILE(
"pkg-config/jsoncpp.pc.in"
"pkg-config/jsoncpp.pc"
@ONLY)
INSTALL(FILES "${CMAKE_BINARY_DIR}/pkg-config/jsoncpp.pc"
DESTINATION "${CMAKE_INSTALL_PREFIX}/lib${LIB_SUFFIX}/pkgconfig")
ENDIF(JSONCPP_WITH_PKGCONFIG_SUPPORT)
IF(JSONCPP_WITH_CMAKE_PACKAGE)
INSTALL(EXPORT jsoncpp
DESTINATION ${PACKAGE_INSTALL_DIR}/jsoncpp
FILE jsoncppConfig.cmake)
ENDIF(JSONCPP_WITH_CMAKE_PACKAGE)
# Build the different applications
ADD_SUBDIRECTORY( src )
#install the includes
ADD_SUBDIRECTORY( include )

56
LICENSE

@@ -1 +1,55 @@
The json-cpp library and this documentation are in Public Domain.
The JsonCpp library's source code, including accompanying documentation,
tests and demonstration applications, are licensed under the following
conditions...
The author (Baptiste Lepilleur) explicitly disclaims copyright in all
jurisdictions which recognize such a disclaimer. In such jurisdictions,
this software is released into the Public Domain.
In jurisdictions which do not recognize Public Domain property (e.g. Germany as of
2010), this software is Copyright (c) 2007-2010 by Baptiste Lepilleur, and is
released under the terms of the MIT License (see below).
In jurisdictions which recognize Public Domain property, the user of this
software may choose to accept it either as 1) Public Domain, 2) under the
conditions of the MIT License (see below), or 3) under the terms of dual
Public Domain/MIT License conditions described here, as they choose.
The MIT License is about as close to Public Domain as a license can get, and is
described in clear, concise terms at:
http://en.wikipedia.org/wiki/MIT_License
The full text of the MIT License follows:
========================================================================
Copyright (c) 2007-2010 Baptiste Lepilleur
Permission is hereby granted, free of charge, to any person
obtaining a copy of this software and associated documentation
files (the "Software"), to deal in the Software without
restriction, including without limitation the rights to use, copy,
modify, merge, publish, distribute, sublicense, and/or sell copies
of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS
BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN
ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
========================================================================
(END LICENSE TEXT)
The MIT license is compatible with both the GPL and commercial
software, affording one all of the rights of Public Domain with the
minor nuisance of being required to keep the above copyright notice
and license text in the source code. Note also that by accepting the
Public Domain "license" you can re-license your copy using whatever
license you like.

175
NEWS.txt Normal file

@@ -0,0 +1,175 @@
New in SVN
----------
* Updated the type system's behavior, in order to better support backwards
compatibility with code that was written before 64-bit integer support was
introduced. Here's how it works now:
* isInt, isInt64, isUInt, and isUInt64 return true if and only if the
value can be exactly represented as that type. In particular, a value
constructed with a double like 17.0 will now return true for all of
these methods.
* isDouble and isFloat now return true for all numeric values, since all
numeric values can be converted to a double or float without
truncation. Note however that the conversion may not be exact -- for
example, doubles cannot exactly represent all integers above 2^53 + 1.
* isBool, isNull, isString, isArray, and isObject now return true if and
only if the value is of that type.
* isConvertibleTo(fooValue) indicates that it is safe to call asFoo.
(For each type foo, isFoo always implies isConvertibleTo(fooValue).)
asFoo returns an approximate or exact representation as appropriate.
For example, a double value may be truncated when asInt is called.
* For backwards compatibility with old code, isConvertibleTo(intValue)
may return false even if type() == intValue. This is because the value
may have been constructed with a 64-bit integer larger than maxInt,
and calling asInt() would cause an exception. If you're writing new
code, use isInt64 to find out whether the value is exactly
representable using an Int64, or asDouble() combined with minInt64 and
maxInt64 to figure out whether it is approximately representable.
* Value
- Patch #10: BOOST_FOREACH compatibility. Made Json::iterator more
standard compliant, added missing iterator_category and value_type
typedefs (contributed by Robert A. Iannucci).
* Compilation
- New CMake based build system. Based in part on contribution from
Igor Okulist and Damien Buhl (Patch #14).
- New header json/version.h now contains version number macros
(JSONCPP_VERSION_MAJOR, JSONCPP_VERSION_MINOR, JSONCPP_VERSION_PATCH
and JSONCPP_VERSION_HEXA).
- Patch #11: added missing JSON_API on some classes causing link issues
when building as a dynamic library on Windows
(contributed by Francis Bolduc).
- Visual Studio DLL: suppressed warning "C4251: <data member>: <type>
needs to have dll-interface to be used by..." via pragma push/pop
in json-cpp headers.
- Added Travis CI integration: https://travis-ci.org/blep/jsoncpp-mirror
* Bug fixes
- Patch #15: Copy constructor does not initialize allocated_ for stringValue
(contributed by rmongia).
- Patch #16: Missing field copy in Json::Value::iterator causing infinite
loop when using experimental internal map (#define JSON_VALUE_USE_INTERNAL_MAP)
(contributed by Ming-Lin Kao).
New in JsonCpp 0.6.0:
---------------------
* Compilation
- LD_LIBRARY_PATH and LIBRARY_PATH environment variables are now
propagated to the build environment as this is required for some
compiler installation.
- Added support for Microsoft Visual Studio 2008 (bug #2930462):
The platform "msvc90" has been added.
Notes: you need to setup the environment by running vcvars32.bat
(e.g. MSVC 2008 command prompt in start menu) before running scons.
- Added support for amalgamated source and header generation (a la sqlite).
Refer to README.md section "Generating amalgamated source and header"
for detail.
* Value
- Removed experimental ValueAllocator, it caused static
initialization/destruction order issues (bug #2934500).
The DefaultValueAllocator has been inlined in code.
- Added support for 64 bits integer:
Types Json::Int64 and Json::UInt64 have been added. They are aliased
to 64 bits integers on systems that support them (based on __int64 on
Microsoft Visual Studio platform, and long long on other platforms).
Types Json::LargestInt and Json::LargestUInt have been added. They are
aliased to the largest integer type supported:
either Json::Int/Json::UInt or Json::Int64/Json::UInt64 respectively.
Json::Value::asInt() and Json::Value::asUInt() still returns plain
"int" based types, but asserts if an attempt is made to retrieve
a 64 bits value that can not be represented as the return type.
Json::Value::asInt64() and Json::Value::asUInt64() have been added
to obtain the 64 bits integer value.
Json::Value::asLargestInt() and Json::Value::asLargestUInt() returns
the integer as a LargestInt/LargestUInt respectively. Those functions
are typically used when implementing writers.
The reader attempts to read numbers as 64 bits integers, and falls back
to reading a double if the number is not in the range of 64 bits
integer.
Warning: Json::Value::asInt() and Json::Value::asUInt() now returns
long long. This change breaks code that was passing the return value
to *printf() functions.
Support for 64 bits integer can be disabled by defining the macro
JSON_NO_INT64 (uncomment it in json/config.h for example), though
it should have no impact on existing usage.
- The type Json::ArrayIndex is used for indexes of a JSON value array. It
is an unsigned int (typically 32 bits).
- Array index can be passed as int to operator[], allowing use of literal:
Json::Value array;
array.append( 1234 );
int value = array[0].asInt(); // did not compile previously
- Added float Json::Value::asFloat() to obtain a floating point value as a
float (avoids the loss-of-precision warning caused by use of asDouble()
to initialize a float).
* Reader
- Renamed Reader::getFormatedErrorMessages() to getFormattedErrorMessages.
Bug #3023708 (Formatted has 2 't'). The old member function is deprecated
but still present for backward compatibility.
* Tests
- Added test to ensure that the escape sequence "\/" is correctly handled
by the parser.
* Bug fixes
- Bug #3139677: JSON [1 2 3] was incorrectly parsed as [1, 3]. Error is now
correctly detected.
- Bug #3139678: stack buffer overflow when parsing a double with a
length of 32 characters.
- Fixed Value::operator <= implementation (had the semantic of operator >=).
Found when adding unit tests for comparison operators.
- Value::compare() is now const and has an actual implementation with
unit tests.
- Bug #2407932: strpbrk() can fail for NULL pointer.
- Bug #3306345: Fixed minor typo in Path::resolve().
- Bug #3314841/#3306896: errors in amalgamate.py
- Fixed some Coverity warnings and line-endings.
* License
- See file LICENSE for details. Basically JsonCpp is now licensed under
MIT license, or public domain if desired and recognized in your jurisdiction.
Thanks to Stephan G. Beal [http://wanderinghorse.net/home/stephan/] who
helped figure out the solution to the public domain issue.

210
README.md Normal file

@@ -0,0 +1,210 @@
Introduction
------------
[JSON][json-org] is a lightweight data-interchange format. It can represent
numbers, strings, ordered sequences of values, and collections of name/value
pairs.
[json-org]: http://json.org/
[JsonCpp][] is a C++ library that allows manipulating JSON values, including
serialization and deserialization to and from strings. It can also preserve
existing comments in unserialization/serialization steps, making it a convenient
format to store user input files.
[JsonCpp]: http://open-source-parsers.github.io/jsoncpp-docs/doxygen/index.html
## A note on backward-compatibility
* `1.y.z` is built with C++11.
* `0.y.z` can be used with older compilers.
* Major versions maintain binary-compatibility.
Using JsonCpp in your project
-----------------------------
The recommended approach to integrating JsonCpp in your project is to build
the amalgamated source (a single `.cpp` file) with your own build system. This
ensures consistency of compilation flags and ABI compatibility. See the section
"Generating amalgamated source and header" for instructions.
The `include/` directory should be added to your compiler include path. JsonCpp headers
should be included as follows:
#include <json/json.h>
If JsonCpp was built as a dynamic library on Windows, then your project needs to
define the macro `JSON_DLL`.
Building and testing with CMake
-------------------------------
[CMake][] is a C++ Makefiles/Solution generator. It is usually available on most
Linux systems as a package. On Ubuntu:
sudo apt-get install cmake
[CMake]: http://www.cmake.org
Note that Python is also required to run the JSON reader/writer tests. If
missing, the build will skip running those tests.
When running CMake, a few parameters are required:
* a build directory where the makefiles/solution are generated. It is also used
to store objects, libraries and executable files.
* the generator to use: makefiles or Visual Studio solution? Which version of
Visual Studio, 32- or 64-bit solution?
Steps for generating solution/makefiles using `cmake-gui`:
* Make "source code" point to the source directory.
* Make "where to build the binary" point to the directory to use for the build.
* Click on the "Grouped" check box.
* Review JsonCpp build options (tick `JSONCPP_LIB_BUILD_SHARED` to build as a
dynamic library).
* Click the configure button at the bottom, then the generate button.
* The generated solution/makefiles can be found in the binary directory.
Alternatively, from the command-line on Unix in the source directory:
mkdir -p build/debug
cd build/debug
cmake -DCMAKE_BUILD_TYPE=debug -DJSONCPP_LIB_BUILD_STATIC=ON -DJSONCPP_LIB_BUILD_SHARED=OFF -G "Unix Makefiles" ../..
make
Running `cmake -h` will display the list of available generators (passed using
the `-G` option).
By default CMake hides compilation commands. This can be modified by specifying
`-DCMAKE_VERBOSE_MAKEFILE=true` when generating makefiles.
Building and testing with SCons
-------------------------------
**Note:** The SCons-based build system is deprecated. Please use CMake; see the
section above.
JsonCpp can use [Scons][] as a build system. Note that SCons requires Python to
be installed.
[SCons]: http://www.scons.org/
Invoke SCons as follows:
scons platform=$PLATFORM [TARGET]
where `$PLATFORM` may be one of:
* `suncc`: Sun C++ (Solaris)
* `vacpp`: Visual Age C++ (AIX)
* `mingw`
* `msvc6`: Microsoft Visual Studio 6 service pack 5-6
* `msvc70`: Microsoft Visual Studio 2002
* `msvc71`: Microsoft Visual Studio 2003
* `msvc80`: Microsoft Visual Studio 2005
* `msvc90`: Microsoft Visual Studio 2008
* `linux-gcc`: Gnu C++ (linux, also reported to work for Mac OS X)
If you are building with Microsoft Visual Studio 2008, you need to set up the
environment by running `vcvars32.bat` (e.g. MSVC 2008 command prompt) before
running SCons.
# Running the tests manually
You need to run tests manually only if you are troubleshooting an issue.
In the instructions below, replace `path/to/jsontest` with the path of the
`jsontest` executable that was compiled on your platform.
cd test
# This will run the Reader/Writer tests
python runjsontests.py path/to/jsontest
# This will run the Reader/Writer tests, using JSONChecker test suite
# (http://www.json.org/JSON_checker/).
# Notes: not all tests pass: JsonCpp is too lenient (for example,
# it allows an integer to start with '0'). The goal is to improve
# strict mode parsing to get all tests to pass.
python runjsontests.py --with-json-checker path/to/jsontest
# This will run the unit tests (mostly Value)
python rununittests.py path/to/test_lib_json
# You can run the tests using valgrind:
python rununittests.py --valgrind path/to/test_lib_json
## Running the tests using scons
Note that tests can be run using SCons using the `check` target:
scons platform=$PLATFORM check
Building the documentation
--------------------------
Run the Python script `doxybuild.py` from the top directory:
python doxybuild.py --doxygen=$(which doxygen) --open --with-dot
See `doxybuild.py --help` for options.
Generating amalgamated source and header
----------------------------------------
JsonCpp is provided with a script to generate a single header and a single
source file to ease inclusion into an existing project. The amalgamated source
can be generated at any time by running the following command from the
top-directory (this requires Python 2.6):
python amalgamate.py
It is possible to specify the header name. See the `-h` option for details.
By default, the following files are generated:
* `dist/jsoncpp.cpp`: source file that needs to be added to your project.
* `dist/json/json.h`: corresponding header file for use in your project. It is
equivalent to including `json/json.h` in non-amalgamated source. This header
only depends on standard headers.
* `dist/json/json-forwards.h`: header that provides forward declaration of all
JsonCpp types.
The amalgamated sources are generated by concatenating JsonCpp source in the
correct order and defining the macro `JSON_IS_AMALGAMATION` to prevent inclusion
of other headers.
Adding a reader/writer test
---------------------------
To add a test, you need to create two files in test/data:
* a `TESTNAME.json` file, that contains the input document in JSON format.
* a `TESTNAME.expected` file, that contains a flattened representation of the
input document.
The `TESTNAME.expected` file format is as follows:
* each line represents a JSON element of the element tree represented by the
input document.
* each line has two parts: the path to access the element separated from the
element value by `=`. Array and object values are always empty (i.e.
represented by either `[]` or `{}`).
* element path: `.` represents the root element, and is used to separate object
members. `[N]` is used to specify the value of an array element at index `N`.
See the examples `test_complex_01.json` and `test_complex_01.expected` to better
understand element paths.
Understanding reader/writer test output
---------------------------------------
When a test is run, output files are generated beside the input test files.
Below is a short description of the content of each file:
* `test_complex_01.json`: input JSON document.
* `test_complex_01.expected`: flattened JSON element tree used to check if
parsing was correct.
* `test_complex_01.actual`: flattened JSON element tree produced by `jsontest`
from reading `test_complex_01.json`.
* `test_complex_01.rewrite`: JSON document written by `jsontest` using the
`Json::Value` parsed from `test_complex_01.json` and serialized using
`Json::StyledWriter`.
* `test_complex_01.actual-rewrite`: flattened JSON element tree produced by
`jsontest` from reading `test_complex_01.rewrite`.
* `test_complex_01.process-output`: `jsontest` output, typically useful for
understanding parsing errors.
License
-------
See the `LICENSE` file for details. In summary, JsonCpp is licensed under the
MIT license, or public domain if desired and recognized in your jurisdiction.

README.txt (deleted file)

@@ -1,117 +0,0 @@
* Introduction:
=============
JSON (JavaScript Object Notation) is a lightweight data-interchange format.
It can represent integer, real number, string, an ordered sequence of
value, and a collection of name/value pairs.
JsonCpp is a simple API to manipulate JSON value, handle serialization
and unserialization to string.
It can also preserve existing comment in unserialization/serialization steps,
making it a convenient format to store user input files.
Unserialization parsing is user friendly and provides precise error reports.
* Building/Testing:
=================
JsonCpp uses Scons (http://www.scons.org) as a build system. Scons requires
python to be installed (http://www.python.org).
You download scons-local distribution from the following url:
http://sourceforge.net/project/showfiles.php?group_id=30337&package_id=67375
Unzip it in the directory where you found this README file. scons.py Should be
at the same level as README.
python scons.py platform=PLTFRM [TARGET]
where PLTFRM may be one of:
suncc Sun C++ (Solaris)
vacpp Visual Age C++ (AIX)
mingw
msvc6 Microsoft Visual Studio 6 service pack 5-6
msvc70 Microsoft Visual Studio 2002
msvc71 Microsoft Visual Studio 2003
msvc80 Microsoft Visual Studio 2005
linux-gcc Gnu C++ (linux, also reported to work for Mac OS X)
adding platform is fairly simple. You need to change the Sconstruct file
to do so.
and TARGET may be:
check: build library and run unit tests.
* Running the test manually:
==========================
cd test
# This will run the Reader/Writer tests
python runjsontests.py "path to jsontest.exe"
# This will run the Reader/Writer tests, using JSONChecker test suite
# (http://www.json.org/JSON_checker/).
# Notes: not all tests pass: JsonCpp is too lenient (for example,
# it allows an integer to start with '0'). The goal is to improve
# strict mode parsing to get all tests to pass.
python runjsontests.py --with-json-checker "path to jsontest.exe"
# This will run the unit tests (mostly Value)
python rununittests.py "path to test_lib_json.exe"
You can run the tests using valgrind:
python rununittests.py --valgrind "path to test_lib_json.exe"
* Building the documentation:
===========================
Run the python script doxybuild.py from the top directory:
python doxybuild.py --open --with-dot
See doxybuild.py --help for options.
* Adding a reader/writer test:
============================
To add a test, you need to create two files in test/data:
- a TESTNAME.json file, that contains the input document in JSON format.
- a TESTNAME.expected file, that contains a flatened representation of
the input document.
TESTNAME.expected file format:
- each line represents a JSON element of the element tree represented
by the input document.
- each line has two parts: the path to access the element separated from
the element value by '='. Array and object values are always empty
(e.g. represented by either [] or {}).
- element path: '.' represented the root element, and is used to separate
object members. [N] is used to specify the value of an array element
at index N.
See test_complex_01.json and test_complex_01.expected to better understand
element path.
* Understanding reader/writer test output:
========================================
When a test is run, output files are generated alongside the input test files.
Below is a short description of the content of each file:
- test_complex_01.json: input JSON document.
- test_complex_01.expected: flattened JSON element tree used to check whether
parsing was correct.
- test_complex_01.actual: flattened JSON element tree produced by
jsontest.exe from reading test_complex_01.json.
- test_complex_01.rewrite: JSON document written by jsontest.exe using the
Json::Value parsed from test_complex_01.json and serialized using
Json::StyledWriter.
- test_complex_01.actual-rewrite: flattened JSON element tree produced by
jsontest.exe from reading test_complex_01.rewrite.
- test_complex_01.process-output: jsontest.exe output, typically useful for
understanding parsing errors.


@@ -18,7 +18,7 @@ options = Variables()
options.Add( EnumVariable('platform',
'Platform (compiler/stl) used to build the project',
'msvc71',
allowed_values='suncc vacpp mingw msvc6 msvc7 msvc71 msvc80 linux-gcc'.split(),
allowed_values='suncc vacpp mingw msvc6 msvc7 msvc71 msvc80 msvc90 linux-gcc'.split(),
ignorecase=2) )
try:
@@ -57,8 +57,9 @@ def make_environ_vars():
"""Returns a dictionnary with environment variable to use when compiling."""
# PATH is required to find the compiler
# TEMP is required for at least mingw
# LD_LIBRARY_PATH & co is required on some system for the compiler
vars = {}
for name in ('PATH', 'TEMP', 'TMP'):
for name in ('PATH', 'TEMP', 'TMP', 'LD_LIBRARY_PATH', 'LIBRARY_PATH'):
if name in os.environ:
vars[name] = os.environ[name]
return vars
@@ -101,12 +102,24 @@ elif platform == 'msvc80':
for tool in ['msvc', 'msvs', 'mslink', 'masm', 'mslib']:
env.Tool( tool )
env['CXXFLAGS']='-GR -EHsc /nologo /MT'
elif platform == 'msvc90':
env['MSVS_VERSION']='9.0'
# Scons 1.2 fails to detect the correct location of the platform SDK.
# So we propagate those from the environment. This requires that the
# user run vcvars32.bat before compiling.
if 'INCLUDE' in os.environ:
env['ENV']['INCLUDE'] = os.environ['INCLUDE']
if 'LIB' in os.environ:
env['ENV']['LIB'] = os.environ['LIB']
for tool in ['msvc', 'msvs', 'mslink', 'masm', 'mslib']:
env.Tool( tool )
env['CXXFLAGS']='-GR -EHsc /nologo /MT'
elif platform == 'mingw':
env.Tool( 'mingw' )
env.Append( CPPDEFINES=[ "WIN32", "NDEBUG", "_MT" ] )
elif platform.startswith('linux-gcc'):
env.Tool( 'default' )
env.Append( LIBS = ['pthread'], CCFLAGS = "-Wall" )
env.Append( LIBS = ['pthread'], CCFLAGS = os.environ.get("CXXFLAGS", "-Wall"), LINKFLAGS=os.environ.get("LDFLAGS", "") )
env['SHARED_LIB_ENABLED'] = True
else:
print "UNSUPPORTED PLATFORM."
@@ -224,7 +237,7 @@ RunUnitTests = ActionFactory(runUnitTests_action, runUnitTests_string )
env.Alias( 'check' )
srcdist_cmd = env['SRCDIST_ADD']( source = """
AUTHORS README.txt SConstruct
AUTHORS README.md SConstruct
""".split() )
env.Alias( 'src-dist', srcdist_cmd )

154
amalgamate.py Normal file

@@ -0,0 +1,154 @@
"""Amalgate json-cpp library sources into a single source and header file.
Works with python2.6+ and python3.4+.
Example of invocation (must be invoked from json-cpp top directory):
python amalgate.py
"""
import os
import os.path
import sys
class AmalgamationFile:
def __init__(self, top_dir):
self.top_dir = top_dir
self.blocks = []
def add_text(self, text):
if not text.endswith("\n"):
text += "\n"
self.blocks.append(text)
def add_file(self, relative_input_path, wrap_in_comment=False):
def add_marker(prefix):
self.add_text("")
self.add_text("// " + "/"*70)
self.add_text("// %s of content of file: %s" % (prefix, relative_input_path.replace("\\","/")))
self.add_text("// " + "/"*70)
self.add_text("")
add_marker("Beginning")
f = open(os.path.join(self.top_dir, relative_input_path), "rt")
content = f.read()
if wrap_in_comment:
content = "/*\n" + content + "\n*/"
self.add_text(content)
f.close()
add_marker("End")
self.add_text("\n\n\n\n")
def get_value(self):
return "".join(self.blocks).replace("\r\n","\n")
def write_to(self, output_path):
output_dir = os.path.dirname(output_path)
if output_dir and not os.path.isdir(output_dir):
os.makedirs(output_dir)
f = open(output_path, "wb")
f.write(str.encode(self.get_value(), 'UTF-8'))
f.close()
def amalgamate_source(source_top_dir=None,
target_source_path=None,
header_include_path=None):
"""Produces amalgated source.
Parameters:
source_top_dir: top-directory
target_source_path: output .cpp path
header_include_path: generated header path relative to target_source_path.
"""
print("Amalgating header...")
header = AmalgamationFile(source_top_dir)
header.add_text("/// Json-cpp amalgated header (http://jsoncpp.sourceforge.net/).")
header.add_text('/// It is intended to be used with #include "%s"' % header_include_path)
header.add_file("LICENSE", wrap_in_comment=True)
header.add_text("#ifndef JSON_AMALGATED_H_INCLUDED")
header.add_text("# define JSON_AMALGATED_H_INCLUDED")
header.add_text("/// If defined, indicates that the source file is amalgated")
header.add_text("/// to prevent private header inclusion.")
header.add_text("#define JSON_IS_AMALGAMATION")
header.add_file("include/json/version.h")
header.add_file("include/json/config.h")
header.add_file("include/json/forwards.h")
header.add_file("include/json/features.h")
header.add_file("include/json/value.h")
header.add_file("include/json/reader.h")
header.add_file("include/json/writer.h")
header.add_file("include/json/assertions.h")
header.add_text("#endif //ifndef JSON_AMALGATED_H_INCLUDED")
target_header_path = os.path.join(os.path.dirname(target_source_path), header_include_path)
print("Writing amalgated header to %r" % target_header_path)
header.write_to(target_header_path)
base, ext = os.path.splitext(header_include_path)
forward_header_include_path = base + "-forwards" + ext
print("Amalgating forward header...")
header = AmalgamationFile(source_top_dir)
header.add_text("/// Json-cpp amalgated forward header (http://jsoncpp.sourceforge.net/).")
header.add_text('/// It is intended to be used with #include "%s"' % forward_header_include_path)
header.add_text("/// This header provides forward declaration for all JsonCpp types.")
header.add_file("LICENSE", wrap_in_comment=True)
header.add_text("#ifndef JSON_FORWARD_AMALGATED_H_INCLUDED")
header.add_text("# define JSON_FORWARD_AMALGATED_H_INCLUDED")
header.add_text("/// If defined, indicates that the source file is amalgated")
header.add_text("/// to prevent private header inclusion.")
header.add_text("#define JSON_IS_AMALGAMATION")
header.add_file("include/json/config.h")
header.add_file("include/json/forwards.h")
header.add_text("#endif //ifndef JSON_FORWARD_AMALGATED_H_INCLUDED")
target_forward_header_path = os.path.join(os.path.dirname(target_source_path),
forward_header_include_path)
print("Writing amalgated forward header to %r" % target_forward_header_path)
header.write_to(target_forward_header_path)
print("Amalgating source...")
source = AmalgamationFile(source_top_dir)
source.add_text("/// Json-cpp amalgated source (http://jsoncpp.sourceforge.net/).")
source.add_text('/// It is intended to be used with #include "%s"' % header_include_path)
source.add_file("LICENSE", wrap_in_comment=True)
source.add_text("")
source.add_text('#include "%s"' % header_include_path)
source.add_text("""
#ifndef JSON_IS_AMALGAMATION
#error "Compile with -I PATH_TO_JSON_DIRECTORY"
#endif
""")
source.add_text("")
lib_json = "src/lib_json"
source.add_file(os.path.join(lib_json, "json_tool.h"))
source.add_file(os.path.join(lib_json, "json_reader.cpp"))
source.add_file(os.path.join(lib_json, "json_valueiterator.inl"))
source.add_file(os.path.join(lib_json, "json_value.cpp"))
source.add_file(os.path.join(lib_json, "json_writer.cpp"))
print("Writing amalgated source to %r" % target_source_path)
source.write_to(target_source_path)
def main():
usage = """%prog [options]
Generate a single amalgated source and header file from the sources.
"""
from optparse import OptionParser
parser = OptionParser(usage=usage)
parser.allow_interspersed_args = False
parser.add_option("-s", "--source", dest="target_source_path", action="store", default="dist/jsoncpp.cpp",
help="""Output .cpp source path. [Default: %default]""")
parser.add_option("-i", "--include", dest="header_include_path", action="store", default="json/json.h",
help="""Header include path. Used to include the header from the amalgated source file. [Default: %default]""")
parser.add_option("-t", "--top-dir", dest="top_dir", action="store", default=os.getcwd(),
help="""Source top-directory. [Default: %default]""")
parser.enable_interspersed_args()
options, args = parser.parse_args()
msg = amalgamate_source(source_top_dir=options.top_dir,
target_source_path=options.target_source_path,
header_include_path=options.header_include_path)
if msg:
sys.stderr.write(msg + "\n")
sys.exit(1)
else:
print("Source succesfully amalagated")
if __name__ == "__main__":
main()
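Besides the command-line entry point above, the amalgamation can also be driven
from another Python script. A minimal sketch, assuming it is invoked from the
json-cpp top directory (the output paths below are illustrative and simply
mirror the defaults of main()):
import os
from amalgamate import amalgamate_source

# Writes dist/jsoncpp.cpp, dist/json/json.h and dist/json/json-forwards.h.
msg = amalgamate_source(source_top_dir=os.getcwd(),
                        target_source_path="dist/jsoncpp.cpp",
                        header_include_path="json/json.h")
if msg:
    raise RuntimeError(msg)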

32
dev.makefile Normal file

@@ -0,0 +1,32 @@
# This is only for jsoncpp developers/contributors.
# We use this to sign releases, generate documentation, etc.
VER?=$(shell cat version)
default:
@echo "VER=${VER}"
sign: jsoncpp-${VER}.tar.gz
gpg --armor --detach-sign $<
gpg --verify $<.asc
# Then upload .asc to the release.
jsoncpp-%.tar.gz:
curl https://github.com/open-source-parsers/jsoncpp/archive/$*.tar.gz -o $@
dox:
python doxybuild.py --doxygen=$$(which doxygen) --in doc/web_doxyfile.in
rsync -va --delete dist/doxygen/jsoncpp-api-html-${VER}/ ../jsoncpp-docs/doxygen/
# Then 'git add -A' and 'git push' in jsoncpp-docs.
build:
mkdir -p build/debug
cd build/debug; cmake -DCMAKE_BUILD_TYPE=debug -DJSONCPP_LIB_BUILD_SHARED=ON -G "Unix Makefiles" ../..
make -C build/debug
# Currently, this depends on include/json/version.h generated
# by cmake.
test-amalgamate:
python2.7 amalgamate.py
python3.4 amalgamate.py
cd dist; gcc -I. -c jsoncpp.cpp
clean:
\rm -rf *.gz *.asc dist/
.PHONY: build

33
devtools/agent_vmw7.json Normal file

@@ -0,0 +1,33 @@
{
"cmake_variants" : [
{"name": "generator",
"generators": [
{"generator": [
"Visual Studio 7 .NET 2003",
"Visual Studio 9 2008",
"Visual Studio 9 2008 Win64",
"Visual Studio 10",
"Visual Studio 10 Win64",
"Visual Studio 11",
"Visual Studio 11 Win64"
]
},
{"generator": ["MinGW Makefiles"],
"env_prepend": [{"path": "c:/wut/prg/MinGW/bin"}]
}
]
},
{"name": "shared_dll",
"variables": [
["JSONCPP_LIB_BUILD_SHARED=true"],
["JSONCPP_LIB_BUILD_SHARED=false"]
]
},
{"name": "build_type",
"build_types": [
"debug",
"release"
]
}
]
}

26
devtools/agent_vmxp.json Normal file

@@ -0,0 +1,26 @@
{
"cmake_variants" : [
{"name": "generator",
"generators": [
{"generator": [
"Visual Studio 6",
"Visual Studio 7",
"Visual Studio 8 2005"
]
}
]
},
{"name": "shared_dll",
"variables": [
["JSONCPP_LIB_BUILD_SHARED=true"],
["JSONCPP_LIB_BUILD_SHARED=false"]
]
},
{"name": "build_type",
"build_types": [
"debug",
"release"
]
}
]
}


@@ -2,6 +2,7 @@
# encoding: utf-8
# Baptiste Lepilleur, 2009
from __future__ import print_function
from dircache import listdir
import re
import fnmatch
@@ -53,44 +54,44 @@ LINKS = DIR_LINK | FILE_LINK
ALL_NO_LINK = DIR | FILE
ALL = DIR | FILE | LINKS
_ANT_RE = re.compile( r'(/\*\*/)|(\*\*/)|(/\*\*)|(\*)|(/)|([^\*/]*)' )
_ANT_RE = re.compile(r'(/\*\*/)|(\*\*/)|(/\*\*)|(\*)|(/)|([^\*/]*)')
def ant_pattern_to_re( ant_pattern ):
"""Generates a regular expression from the ant pattern.
Matching convention:
**/a: match 'a', 'dir/a', 'dir1/dir2/a'
a/**/b: match 'a/b', 'a/c/b', 'a/d/c/b'
*.py: match 'script.py' but not 'a/script.py'
def ant_pattern_to_re(ant_pattern):
"""Generates a regular expression from the ant pattern.
Matching convention:
**/a: match 'a', 'dir/a', 'dir1/dir2/a'
a/**/b: match 'a/b', 'a/c/b', 'a/d/c/b'
*.py: match 'script.py' but not 'a/script.py'
"""
rex = ['^']
next_pos = 0
sep_rex = r'(?:/|%s)' % re.escape( os.path.sep )
## print 'Converting', ant_pattern
for match in _ANT_RE.finditer( ant_pattern ):
## print 'Matched', match.group()
## print match.start(0), next_pos
sep_rex = r'(?:/|%s)' % re.escape(os.path.sep)
## print 'Converting', ant_pattern
for match in _ANT_RE.finditer(ant_pattern):
## print 'Matched', match.group()
## print match.start(0), next_pos
if match.start(0) != next_pos:
raise ValueError( "Invalid ant pattern" )
raise ValueError("Invalid ant pattern")
if match.group(1): # /**/
rex.append( sep_rex + '(?:.*%s)?' % sep_rex )
rex.append(sep_rex + '(?:.*%s)?' % sep_rex)
elif match.group(2): # **/
rex.append( '(?:.*%s)?' % sep_rex )
rex.append('(?:.*%s)?' % sep_rex)
elif match.group(3): # /**
rex.append( sep_rex + '.*' )
rex.append(sep_rex + '.*')
elif match.group(4): # *
rex.append( '[^/%s]*' % re.escape(os.path.sep) )
rex.append('[^/%s]*' % re.escape(os.path.sep))
elif match.group(5): # /
rex.append( sep_rex )
rex.append(sep_rex)
else: # somepath
rex.append( re.escape(match.group(6)) )
next_pos = match.end()
rex.append(re.escape(match.group(6)))
next_pos = match.end()
rex.append('$')
return re.compile( ''.join( rex ) )
def _as_list( l ):
if isinstance(l, basestring):
return l.split()
return l
return re.compile(''.join(rex))
def _as_list(l):
if isinstance(l, basestring):
return l.split()
return l
def glob(dir_path,
includes = '**/*',
@@ -99,103 +100,103 @@ def glob(dir_path,
prune_dirs = prune_dirs,
max_depth = 25):
include_filter = [ant_pattern_to_re(p) for p in _as_list(includes)]
exclude_filter = [ant_pattern_to_re(p) for p in _as_list(excludes)]
prune_dirs = [p.replace('/',os.path.sep) for p in _as_list(prune_dirs)]
exclude_filter = [ant_pattern_to_re(p) for p in _as_list(excludes)]
prune_dirs = [p.replace('/',os.path.sep) for p in _as_list(prune_dirs)]
dir_path = dir_path.replace('/',os.path.sep)
entry_type_filter = entry_type
def is_pruned_dir( dir_name ):
def is_pruned_dir(dir_name):
for pattern in prune_dirs:
if fnmatch.fnmatch( dir_name, pattern ):
if fnmatch.fnmatch(dir_name, pattern):
return True
return False
def apply_filter( full_path, filter_rexs ):
def apply_filter(full_path, filter_rexs):
"""Return True if at least one of the filter regular expression match full_path."""
for rex in filter_rexs:
if rex.match( full_path ):
if rex.match(full_path):
return True
return False
def glob_impl( root_dir_path ):
child_dirs = [root_dir_path]
while child_dirs:
def glob_impl(root_dir_path):
child_dirs = [root_dir_path]
while child_dirs:
dir_path = child_dirs.pop()
for entry in listdir( dir_path ):
full_path = os.path.join( dir_path, entry )
## print 'Testing:', full_path,
is_dir = os.path.isdir( full_path )
if is_dir and not is_pruned_dir( entry ): # explore child directory ?
## print '===> marked for recursion',
child_dirs.append( full_path )
included = apply_filter( full_path, include_filter )
rejected = apply_filter( full_path, exclude_filter )
if not included or rejected: # do not include entry ?
## print '=> not included or rejected'
continue
link = os.path.islink( full_path )
is_file = os.path.isfile( full_path )
if not is_file and not is_dir:
## print '=> unknown entry type'
continue
if link:
entry_type = is_file and FILE_LINK or DIR_LINK
else:
entry_type = is_file and FILE or DIR
## print '=> type: %d' % entry_type,
if (entry_type & entry_type_filter) != 0:
## print ' => KEEP'
yield os.path.join( dir_path, entry )
## else:
## print ' => TYPE REJECTED'
return list( glob_impl( dir_path ) )
for entry in listdir(dir_path):
full_path = os.path.join(dir_path, entry)
## print 'Testing:', full_path,
is_dir = os.path.isdir(full_path)
if is_dir and not is_pruned_dir(entry): # explore child directory ?
## print '===> marked for recursion',
child_dirs.append(full_path)
included = apply_filter(full_path, include_filter)
rejected = apply_filter(full_path, exclude_filter)
if not included or rejected: # do not include entry ?
## print '=> not included or rejected'
continue
link = os.path.islink(full_path)
is_file = os.path.isfile(full_path)
if not is_file and not is_dir:
## print '=> unknown entry type'
continue
if link:
entry_type = is_file and FILE_LINK or DIR_LINK
else:
entry_type = is_file and FILE or DIR
## print '=> type: %d' % entry_type,
if (entry_type & entry_type_filter) != 0:
## print ' => KEEP'
yield os.path.join(dir_path, entry)
## else:
## print ' => TYPE REJECTED'
return list(glob_impl(dir_path))
if __name__ == "__main__":
import unittest
class AntPatternToRETest(unittest.TestCase):
## def test_conversion( self ):
## self.assertEqual( '^somepath$', ant_pattern_to_re( 'somepath' ).pattern )
def test_matching( self ):
test_cases = [ ( 'path',
['path'],
['somepath', 'pathsuffix', '/path', '/path'] ),
( '*.py',
['source.py', 'source.ext.py', '.py'],
['path/source.py', '/.py', 'dir.py/z', 'z.pyc', 'z.c'] ),
( '**/path',
['path', '/path', '/a/path', 'c:/a/path', '/a/b/path', '//a/path', '/a/path/b/path'],
['path/', 'a/path/b', 'dir.py/z', 'somepath', 'pathsuffix', 'a/somepath'] ),
( 'path/**',
['path/a', 'path/path/a', 'path//'],
['path', 'somepath/a', 'a/path', 'a/path/a', 'pathsuffix/a'] ),
( '/**/path',
['/path', '/a/path', '/a/b/path/path', '/path/path'],
['path', 'path/', 'a/path', '/pathsuffix', '/somepath'] ),
( 'a/b',
['a/b'],
['somea/b', 'a/bsuffix', 'a/b/c'] ),
( '**/*.py',
['script.py', 'src/script.py', 'a/b/script.py', '/a/b/script.py'],
['script.pyc', 'script.pyo', 'a.py/b'] ),
( 'src/**/*.py',
['src/a.py', 'src/dir/a.py'],
['a/src/a.py', '/src/a.py'] ),
]
for ant_pattern, accepted_matches, rejected_matches in list(test_cases):
def local_path( paths ):
return [ p.replace('/',os.path.sep) for p in paths ]
test_cases.append( (ant_pattern, local_path(accepted_matches), local_path( rejected_matches )) )
for ant_pattern, accepted_matches, rejected_matches in test_cases:
rex = ant_pattern_to_re( ant_pattern )
print 'ant_pattern:', ant_pattern, ' => ', rex.pattern
for accepted_match in accepted_matches:
print 'Accepted?:', accepted_match
self.assert_( rex.match( accepted_match ) is not None )
for rejected_match in rejected_matches:
print 'Rejected?:', rejected_match
self.assert_( rex.match( rejected_match ) is None )
## def test_conversion(self):
## self.assertEqual('^somepath$', ant_pattern_to_re('somepath').pattern)
def test_matching(self):
test_cases = [ ('path',
['path'],
['somepath', 'pathsuffix', '/path', '/path']),
('*.py',
['source.py', 'source.ext.py', '.py'],
['path/source.py', '/.py', 'dir.py/z', 'z.pyc', 'z.c']),
('**/path',
['path', '/path', '/a/path', 'c:/a/path', '/a/b/path', '//a/path', '/a/path/b/path'],
['path/', 'a/path/b', 'dir.py/z', 'somepath', 'pathsuffix', 'a/somepath']),
('path/**',
['path/a', 'path/path/a', 'path//'],
['path', 'somepath/a', 'a/path', 'a/path/a', 'pathsuffix/a']),
('/**/path',
['/path', '/a/path', '/a/b/path/path', '/path/path'],
['path', 'path/', 'a/path', '/pathsuffix', '/somepath']),
('a/b',
['a/b'],
['somea/b', 'a/bsuffix', 'a/b/c']),
('**/*.py',
['script.py', 'src/script.py', 'a/b/script.py', '/a/b/script.py'],
['script.pyc', 'script.pyo', 'a.py/b']),
('src/**/*.py',
['src/a.py', 'src/dir/a.py'],
['a/src/a.py', '/src/a.py']),
]
for ant_pattern, accepted_matches, rejected_matches in list(test_cases):
def local_path(paths):
return [ p.replace('/',os.path.sep) for p in paths ]
test_cases.append((ant_pattern, local_path(accepted_matches), local_path(rejected_matches)))
for ant_pattern, accepted_matches, rejected_matches in test_cases:
rex = ant_pattern_to_re(ant_pattern)
print('ant_pattern:', ant_pattern, ' => ', rex.pattern)
for accepted_match in accepted_matches:
print('Accepted?:', accepted_match)
self.assertTrue(rex.match(accepted_match) is not None)
for rejected_match in rejected_matches:
print('Rejected?:', rejected_match)
self.assertTrue(rex.match(rejected_match) is None)
unittest.main()
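As a quick illustration of the matching convention documented above, here is a
small sketch (under Python 2, since antglob relies on the dircache module; the
patterns and paths are arbitrary examples, and the project top directory is
assumed to be on sys.path):
from devtools import antglob

# 'src/**/*.cpp' matches any .cpp file at any depth below src/.
rex = antglob.ant_pattern_to_re('src/**/*.cpp')
print(bool(rex.match('src/lib_json/json_value.cpp')))  # True
print(bool(rex.match('include/json/value.h')))         # False

# Collect headers and sources below the current directory.
sources = antglob.glob('.', includes='**/*.h **/*.cpp', entry_type=antglob.FILE)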

278
devtools/batchbuild.py Normal file

@@ -0,0 +1,278 @@
from __future__ import print_function
import collections
import itertools
import json
import os
import os.path
import re
import shutil
import string
import subprocess
import sys
import cgi
class BuildDesc:
def __init__(self, prepend_envs=None, variables=None, build_type=None, generator=None):
self.prepend_envs = prepend_envs or [] # [ { "var": "value" } ]
self.variables = variables or []
self.build_type = build_type
self.generator = generator
def merged_with(self, build_desc):
"""Returns a new BuildDesc by merging field content.
Prefer build_desc fields to self fields for single valued field.
"""
return BuildDesc(self.prepend_envs + build_desc.prepend_envs,
self.variables + build_desc.variables,
build_desc.build_type or self.build_type,
build_desc.generator or self.generator)
def env(self):
environ = os.environ.copy()
for values_by_name in self.prepend_envs:
for var, value in list(values_by_name.items()):
var = var.upper()
if type(value) is unicode:
value = value.encode(sys.getdefaultencoding())
if var in environ:
environ[var] = value + os.pathsep + environ[var]
else:
environ[var] = value
return environ
def cmake_args(self):
args = ["-D%s" % var for var in self.variables]
# skip build type for Visual Studio solutions as it causes warnings
if self.build_type and 'Visual' not in self.generator:
args.append("-DCMAKE_BUILD_TYPE=%s" % self.build_type)
if self.generator:
args.extend(['-G', self.generator])
return args
def __repr__(self):
return "BuildDesc(%s, build_type=%s)" % (" ".join(self.cmake_args()), self.build_type)
class BuildData:
def __init__(self, desc, work_dir, source_dir):
self.desc = desc
self.work_dir = work_dir
self.source_dir = source_dir
self.cmake_log_path = os.path.join(work_dir, 'batchbuild_cmake.log')
self.build_log_path = os.path.join(work_dir, 'batchbuild_build.log')
self.cmake_succeeded = False
self.build_succeeded = False
def execute_build(self):
print('Build %s' % self.desc)
self._make_new_work_dir()
self.cmake_succeeded = self._generate_makefiles()
if self.cmake_succeeded:
self.build_succeeded = self._build_using_makefiles()
return self.build_succeeded
def _generate_makefiles(self):
print(' Generating makefiles: ', end=' ')
cmd = ['cmake'] + self.desc.cmake_args() + [os.path.abspath(self.source_dir)]
succeeded = self._execute_build_subprocess(cmd, self.desc.env(), self.cmake_log_path)
print('done' if succeeded else 'FAILED')
return succeeded
def _build_using_makefiles(self):
print(' Building:', end=' ')
cmd = ['cmake', '--build', self.work_dir]
if self.desc.build_type:
cmd += ['--config', self.desc.build_type]
succeeded = self._execute_build_subprocess(cmd, self.desc.env(), self.build_log_path)
print('done' if succeeded else 'FAILED')
return succeeded
def _execute_build_subprocess(self, cmd, env, log_path):
process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, cwd=self.work_dir,
env=env)
stdout, _ = process.communicate()
succeeded = (process.returncode == 0)
with open(log_path, 'wb') as flog:
log = ' '.join(cmd) + '\n' + stdout + '\nExit code: %r\n' % process.returncode
flog.write(fix_eol(log))
return succeeded
def _make_new_work_dir(self):
if os.path.isdir(self.work_dir):
print(' Removing work directory', self.work_dir)
shutil.rmtree(self.work_dir, ignore_errors=True)
if not os.path.isdir(self.work_dir):
os.makedirs(self.work_dir)
def fix_eol(stdout):
"""Fixes wrong EOL produced by cmake --build on Windows (\r\r\n instead of \r\n).
"""
return re.sub('\r*\n', os.linesep, stdout)
def load_build_variants_from_config(config_path):
with open(config_path, 'rb') as fconfig:
data = json.load(fconfig)
variants = data[ 'cmake_variants' ]
build_descs_by_axis = collections.defaultdict(list)
for axis in variants:
axis_name = axis["name"]
build_descs = []
if "generators" in axis:
for generator_data in axis["generators"]:
for generator in generator_data["generator"]:
build_desc = BuildDesc(generator=generator,
prepend_envs=generator_data.get("env_prepend"))
build_descs.append(build_desc)
elif "variables" in axis:
for variables in axis["variables"]:
build_desc = BuildDesc(variables=variables)
build_descs.append(build_desc)
elif "build_types" in axis:
for build_type in axis["build_types"]:
build_desc = BuildDesc(build_type=build_type)
build_descs.append(build_desc)
build_descs_by_axis[axis_name].extend(build_descs)
return build_descs_by_axis
def generate_build_variants(build_descs_by_axis):
"""Returns a list of BuildDesc generated for the partial BuildDesc for each axis."""
axis_names = list(build_descs_by_axis.keys())
build_descs = []
for axis_name, axis_build_descs in list(build_descs_by_axis.items()):
if len(build_descs):
# for each existing build_desc and each axis build desc, create a new build_desc
new_build_descs = []
for prototype_build_desc, axis_build_desc in itertools.product(build_descs, axis_build_descs):
new_build_descs.append(prototype_build_desc.merged_with(axis_build_desc))
build_descs = new_build_descs
else:
build_descs = axis_build_descs
return build_descs
HTML_TEMPLATE = string.Template('''<html>
<head>
<title>$title</title>
<style type="text/css">
td.failed {background-color:#f08080;}
td.ok {background-color:#c0eec0;}
</style>
</head>
<body>
<table border="1">
<thead>
<tr>
<th>Variables</th>
$th_vars
</tr>
<tr>
<th>Build type</th>
$th_build_types
</tr>
</thead>
<tbody>
$tr_builds
</tbody>
</table>
</body></html>''')
def generate_html_report(html_report_path, builds):
report_dir = os.path.dirname(html_report_path)
# Vertical axis: generator
# Horizontal: variables, then build_type
builds_by_generator = collections.defaultdict(list)
variables = set()
build_types_by_variable = collections.defaultdict(set)
build_by_pos_key = {} # { (generator, var_key, build_type): build }
for build in builds:
builds_by_generator[build.desc.generator].append(build)
var_key = tuple(sorted(build.desc.variables))
variables.add(var_key)
build_types_by_variable[var_key].add(build.desc.build_type)
pos_key = (build.desc.generator, var_key, build.desc.build_type)
build_by_pos_key[pos_key] = build
variables = sorted(variables)
th_vars = []
th_build_types = []
for variable in variables:
build_types = sorted(build_types_by_variable[variable])
nb_build_type = len(build_types_by_variable[variable])
th_vars.append('<th colspan="%d">%s</th>' % (nb_build_type, cgi.escape(' '.join(variable))))
for build_type in build_types:
th_build_types.append('<th>%s</th>' % cgi.escape(build_type))
tr_builds = []
for generator in sorted(builds_by_generator):
tds = [ '<td>%s</td>\n' % cgi.escape(generator) ]
for variable in variables:
build_types = sorted(build_types_by_variable[variable])
for build_type in build_types:
pos_key = (generator, variable, build_type)
build = build_by_pos_key.get(pos_key)
if build:
cmake_status = 'ok' if build.cmake_succeeded else 'FAILED'
build_status = 'ok' if build.build_succeeded else 'FAILED'
cmake_log_url = os.path.relpath(build.cmake_log_path, report_dir)
build_log_url = os.path.relpath(build.build_log_path, report_dir)
td = '<td class="%s"><a href="%s" class="%s">CMake: %s</a>' % ( build_status.lower(), cmake_log_url, cmake_status.lower(), cmake_status)
if build.cmake_succeeded:
td += '<br><a href="%s" class="%s">Build: %s</a>' % ( build_log_url, build_status.lower(), build_status)
td += '</td>'
else:
td = '<td></td>'
tds.append(td)
tr_builds.append('<tr>%s</tr>' % '\n'.join(tds))
html = HTML_TEMPLATE.substitute( title='Batch build report',
th_vars=' '.join(th_vars),
th_build_types=' '.join(th_build_types),
tr_builds='\n'.join(tr_builds))
with open(html_report_path, 'wt') as fhtml:
fhtml.write(html)
print('HTML report generated in:', html_report_path)
def main():
usage = r"""%prog WORK_DIR SOURCE_DIR CONFIG_JSON_PATH [CONFIG2_JSON_PATH...]
Build a given CMake based project located in SOURCE_DIR with multiple generators/options.dry_run
as described in CONFIG_JSON_PATH building in WORK_DIR.
Example of call:
python devtools\batchbuild.py e:\buildbots\jsoncpp\build . devtools\agent_vmw7.json
"""
from optparse import OptionParser
parser = OptionParser(usage=usage)
parser.allow_interspersed_args = True
# parser.add_option('-v', '--verbose', dest="verbose", action='store_true',
# help="""Be verbose.""")
parser.enable_interspersed_args()
options, args = parser.parse_args()
if len(args) < 3:
parser.error("Missing one of WORK_DIR SOURCE_DIR CONFIG_JSON_PATH.")
work_dir = args[0]
source_dir = args[1].rstrip('/\\')
config_paths = args[2:]
for config_path in config_paths:
if not os.path.isfile(config_path):
parser.error("Can not read: %r" % config_path)
# generate build variants
build_descs = []
for config_path in config_paths:
build_descs_by_axis = load_build_variants_from_config(config_path)
build_descs.extend(generate_build_variants(build_descs_by_axis))
print('Build variants (%d):' % len(build_descs))
# assign build directory for each variant
if not os.path.isdir(work_dir):
os.makedirs(work_dir)
builds = []
with open(os.path.join(work_dir, 'matrix-dir-map.txt'), 'wt') as fmatrixmap:
for index, build_desc in enumerate(build_descs):
build_desc_work_dir = os.path.join(work_dir, '%03d' % (index+1))
builds.append(BuildData(build_desc, build_desc_work_dir, source_dir))
fmatrixmap.write('%s: %s\n' % (build_desc_work_dir, build_desc))
for build in builds:
build.execute_build()
html_report_path = os.path.join(work_dir, 'batchbuild-report.html')
generate_html_report(html_report_path, builds)
print('Done')
if __name__ == '__main__':
main()
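To make the variant generation above concrete, here is a small sketch with
invented axes (not taken from the agent_*.json configs):
from devtools.batchbuild import BuildDesc, generate_build_variants

axes = {
    'generator': [BuildDesc(generator='Unix Makefiles')],
    'build_type': [BuildDesc(build_type='debug'), BuildDesc(build_type='release')],
}
# Cross-product of the axes: one BuildDesc per (generator, build_type) pair.
for desc in generate_build_variants(axes):
    print(desc)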


@@ -1,63 +1,64 @@
import os.path
def fix_source_eol( path, is_dry_run = True, verbose = True, eol = '\n' ):
"""Makes sure that all sources have the specified eol sequence (default: unix)."""
if not os.path.isfile( path ):
raise ValueError( 'Path "%s" is not a file' % path )
try:
f = open(path, 'rb')
except IOError, msg:
print >> sys.stderr, "%s: I/O Error: %s" % (file, str(msg))
return False
try:
raw_lines = f.readlines()
finally:
f.close()
fixed_lines = [line.rstrip('\r\n') + eol for line in raw_lines]
if raw_lines != fixed_lines:
print '%s =>' % path,
if not is_dry_run:
f = open(path, "wb")
try:
f.writelines(fixed_lines)
finally:
f.close()
if verbose:
print is_dry_run and ' NEED FIX' or ' FIXED'
return True
##
##
##
##def _do_fix( is_dry_run = True ):
## from waftools import antglob
## python_sources = antglob.glob( '.',
## includes = '**/*.py **/wscript **/wscript_build',
## excludes = antglob.default_excludes + './waf.py',
## prune_dirs = antglob.prune_dirs + 'waf-* ./build' )
## for path in python_sources:
## _fix_python_source( path, is_dry_run )
##
## cpp_sources = antglob.glob( '.',
## includes = '**/*.cpp **/*.h **/*.inl',
## prune_dirs = antglob.prune_dirs + 'waf-* ./build' )
## for path in cpp_sources:
## _fix_source_eol( path, is_dry_run )
##
##
##def dry_fix(context):
## _do_fix( is_dry_run = True )
##
##def fix(context):
## _do_fix( is_dry_run = False )
##
##def shutdown():
## pass
##
##def check(context):
## # Unit tests are run when "check" target is used
## ut = UnitTest.unit_test()
## ut.change_to_testfile_dir = True
## ut.want_to_see_test_output = True
## ut.want_to_see_test_error = True
## ut.run()
## ut.print_results()
from __future__ import print_function
import os.path
def fix_source_eol(path, is_dry_run = True, verbose = True, eol = '\n'):
"""Makes sure that all sources have the specified eol sequence (default: unix)."""
if not os.path.isfile(path):
raise ValueError('Path "%s" is not a file' % path)
try:
f = open(path, 'rb')
except IOError as msg:
print("%s: I/O Error: %s" % (file, str(msg)), file=sys.stderr)
return False
try:
raw_lines = f.readlines()
finally:
f.close()
fixed_lines = [line.rstrip('\r\n') + eol for line in raw_lines]
if raw_lines != fixed_lines:
print('%s =>' % path, end=' ')
if not is_dry_run:
f = open(path, "wb")
try:
f.writelines(fixed_lines)
finally:
f.close()
if verbose:
print(is_dry_run and ' NEED FIX' or ' FIXED')
return True
##
##
##
##def _do_fix(is_dry_run = True):
## from waftools import antglob
## python_sources = antglob.glob('.',
## includes = '**/*.py **/wscript **/wscript_build',
## excludes = antglob.default_excludes + './waf.py',
## prune_dirs = antglob.prune_dirs + 'waf-* ./build')
## for path in python_sources:
## _fix_python_source(path, is_dry_run)
##
## cpp_sources = antglob.glob('.',
## includes = '**/*.cpp **/*.h **/*.inl',
## prune_dirs = antglob.prune_dirs + 'waf-* ./build')
## for path in cpp_sources:
## _fix_source_eol(path, is_dry_run)
##
##
##def dry_fix(context):
## _do_fix(is_dry_run = True)
##
##def fix(context):
## _do_fix(is_dry_run = False)
##
##def shutdown():
## pass
##
##def check(context):
## # Unit tests are run when "check" target is used
## ut = UnitTest.unit_test()
## ut.change_to_testfile_dir = True
## ut.want_to_see_test_output = True
## ut.want_to_see_test_error = True
## ut.run()
## ut.print_results()
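A minimal sketch of the intended usage, assuming Python 2 (the path is an
arbitrary example):
from devtools import fixeol

# First report what would change (dry run), then actually rewrite the EOLs.
fixeol.fix_source_eol('src/lib_json/json_value.cpp', is_dry_run=True)
fixeol.fix_source_eol('src/lib_json/json_value.cpp', is_dry_run=False)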


@@ -0,0 +1,94 @@
"""Updates the license text in source file.
"""
from __future__ import print_function
# An existing license is found if the file starts with the string below,
# and ends with the first blank line.
LICENSE_BEGIN = "// Copyright "
BRIEF_LICENSE = LICENSE_BEGIN + """2007-2010 Baptiste Lepilleur
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
""".replace('\r\n','\n')
def update_license(path, dry_run, show_diff):
"""Update the license statement in the specified file.
Parameters:
path: path of the C++ source file to update.
dry_run: if True, just print the path of the file that would be updated,
but don't change it.
show_diff: if True, print the path of the file that would be modified,
as well as the change made to the file.
"""
with open(path, 'rt') as fin:
original_text = fin.read().replace('\r\n','\n')
newline = fin.newlines and fin.newlines[0] or '\n'
if not original_text.startswith(LICENSE_BEGIN):
# No existing license found => prepend it
new_text = BRIEF_LICENSE + original_text
else:
license_end_index = original_text.index('\n\n') # search first blank line
new_text = BRIEF_LICENSE + original_text[license_end_index+2:]
if original_text != new_text:
if not dry_run:
with open(path, 'wb') as fout:
fout.write(new_text.replace('\n', newline))
print('Updated', path)
if show_diff:
import difflib
print('\n'.join(difflib.unified_diff(original_text.split('\n'),
new_text.split('\n'))))
return True
return False
def update_license_in_source_directories(source_dirs, dry_run, show_diff):
"""Updates license text in C++ source files found in directory source_dirs.
Parameters:
source_dirs: list of directory to scan for C++ sources. Directories are
scanned recursively.
dry_run: if True, just print the path of the file that would be updated,
but don't change it.
show_diff: if True, print the path of the file that would be modified,
as well as the change made to the file.
"""
from devtools import antglob
prune_dirs = antglob.prune_dirs + 'scons-local* ./build* ./libs ./dist'
for source_dir in source_dirs:
cpp_sources = antglob.glob(source_dir,
includes = '''**/*.h **/*.cpp **/*.inl''',
prune_dirs = prune_dirs)
for source in cpp_sources:
update_license(source, dry_run, show_diff)
def main():
usage = """%prog DIR [DIR2...]
Updates license text in sources of the project in source files found
in the directory specified on the command-line.
Example of call:
python devtools\licenseupdater.py include src -n --diff
=> Show change that would be made to the sources.
python devtools\licenseupdater.py include src
=> Update license statement on all sources in directories include/ and src/.
"""
from optparse import OptionParser
parser = OptionParser(usage=usage)
parser.allow_interspersed_args = False
parser.add_option('-n', '--dry-run', dest="dry_run", action='store_true', default=False,
help="""Only show what files are updated, do not update the files""")
parser.add_option('--diff', dest="show_diff", action='store_true', default=False,
help="""On update, show change made to the file.""")
parser.enable_interspersed_args()
options, args = parser.parse_args()
update_license_in_source_directories(args, options.dry_run, options.show_diff)
print('Done')
if __name__ == '__main__':
import sys
import os.path
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
main()
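The same functionality is available programmatically; a short sketch (a dry run
over the usual include/ and src/ trees, under Python 2 with the project top
directory on sys.path):
from devtools import licenseupdater

# Only report which C++ sources under include/ and src/ would be touched.
licenseupdater.update_license_in_source_directories(['include', 'src'],
                                                    dry_run=True, show_diff=False)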


@@ -1,53 +1,47 @@
import os.path
import gzip
import tarfile
TARGZ_DEFAULT_COMPRESSION_LEVEL = 9
def make_tarball(tarball_path, sources, base_dir, prefix_dir=''):
"""Parameters:
tarball_path: output path of the .tar.gz file
sources: list of sources to include in the tarball, relative to the current directory
base_dir: if a source file is in a sub-directory of base_dir, then base_dir is stripped
from path in the tarball.
prefix_dir: all files stored in the tarball be sub-directory of prefix_dir. Set to ''
to make them child of root.
"""
base_dir = os.path.normpath( os.path.abspath( base_dir ) )
def archive_name( path ):
"""Makes path relative to base_dir."""
path = os.path.normpath( os.path.abspath( path ) )
common_path = os.path.commonprefix( (base_dir, path) )
archive_name = path[len(common_path):]
if os.path.isabs( archive_name ):
archive_name = archive_name[1:]
return os.path.join( prefix_dir, archive_name )
def visit(tar, dirname, names):
for name in names:
path = os.path.join(dirname, name)
if os.path.isfile(path):
path_in_tar = archive_name(path)
tar.add(path, path_in_tar )
compression = TARGZ_DEFAULT_COMPRESSION_LEVEL
tar = tarfile.TarFile.gzopen( tarball_path, 'w', compresslevel=compression )
try:
for source in sources:
source_path = source
if os.path.isdir( source ):
os.path.walk(source_path, visit, tar)
else:
path_in_tar = archive_name(source_path)
tar.add(source_path, path_in_tar ) # filename, arcname
finally:
tar.close()
def decompress( tarball_path, base_dir ):
"""Decompress the gzipped tarball into directory base_dir.
"""
# !!! This class method is not documented in the online doc
# nor is bz2open!
tar = tarfile.TarFile.gzopen(tarball_path, mode='r')
try:
tar.extractall( base_dir )
finally:
tar.close()
from contextlib import closing
import os
import tarfile
TARGZ_DEFAULT_COMPRESSION_LEVEL = 9
def make_tarball(tarball_path, sources, base_dir, prefix_dir=''):
"""Parameters:
tarball_path: output path of the .tar.gz file
sources: list of sources to include in the tarball, relative to the current directory
base_dir: if a source file is in a sub-directory of base_dir, then base_dir is stripped
from path in the tarball.
prefix_dir: all files stored in the tarball be sub-directory of prefix_dir. Set to ''
to make them child of root.
"""
base_dir = os.path.normpath(os.path.abspath(base_dir))
def archive_name(path):
"""Makes path relative to base_dir."""
path = os.path.normpath(os.path.abspath(path))
common_path = os.path.commonprefix((base_dir, path))
archive_name = path[len(common_path):]
if os.path.isabs(archive_name):
archive_name = archive_name[1:]
return os.path.join(prefix_dir, archive_name)
def visit(tar, dirname, names):
for name in names:
path = os.path.join(dirname, name)
if os.path.isfile(path):
path_in_tar = archive_name(path)
tar.add(path, path_in_tar)
compression = TARGZ_DEFAULT_COMPRESSION_LEVEL
with closing(tarfile.TarFile.open(tarball_path, 'w:gz',
compresslevel=compression)) as tar:
for source in sources:
source_path = source
if os.path.isdir(source):
for dirpath, dirnames, filenames in os.walk(source_path):
visit(tar, dirpath, filenames)
else:
path_in_tar = archive_name(source_path)
tar.add(source_path, path_in_tar) # filename, arcname
def decompress(tarball_path, base_dir):
"""Decompress the gzipped tarball into directory base_dir.
"""
with closing(tarfile.TarFile.open(tarball_path)) as tar:
tar.extractall(base_dir)
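A short sketch of how the two helpers fit together (the paths and prefix are
illustrative only):
from devtools import tarball

# Pack the doc/ tree under a top-level 'jsoncpp-docs' directory in the archive...
tarball.make_tarball('docs.tar.gz', ['doc'], base_dir='.', prefix_dir='jsoncpp-docs')
# ...and unpack it again somewhere else.
tarball.decompress('docs.tar.gz', 'unpacked-docs')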

File diff suppressed because it is too large.


@@ -1,23 +1,3 @@
<hr>
<table width="100%">
<tr>
<td width="10%" align="left" valign="center">
<a href="http://sourceforge.net">
<img
src="http://sourceforge.net/sflogo.php?group_id=144446"
width="88" height="31" border="0" alt="SourceForge Logo"></a>
</td>
<td width="20%" align="left" valign="center">
hosts this site.
</td>
<td>
</td>
<td align="right" valign="center">
Send comments to:<br>
<a href="mailto:jsoncpp-devel@lists.sourceforge.net">Json-cpp Developers</a>
</td>
</tr>
</table>
</body>
</html>


@@ -11,12 +11,12 @@ JsonCpp - JSON data format manipulation library
<table width="100%">
<tr>
<td width="40%" align="left" valign="center">
<a href="http://sourceforge.net/projects/jsoncpp">
<a href="https://github.com/open-source-parsers/jsoncpp">
JsonCpp project page
</a>
</td>
<td width="40%" align="right" valign="center">
<a href="http://jsoncpp.sourceforge.net">JsonCpp home page</a>
<a href="http://open-source-parsers.github.io/jsoncpp-docs/doxygen/">JsonCpp home page</a>
</td>
</tr>
</table>


@@ -4,11 +4,21 @@
<a HREF="http://www.json.org/">JSON (JavaScript Object Notation)</a>
is a lightweight data-interchange format.
It can represent integer, real number, string, an ordered sequence of value, and
a collection of name/value pairs.
Here is an example of JSON data:
\verbatim
{
"encoding" : "UTF-8",
"plug-ins" : [
"python",
"c++",
"ruby"
],
"indent" : { "length" : 3, "use_space": true }
}
\endverbatim
<b>JsonCpp</b> supports comments as <i>meta-data</i>:
\code
// Configuration options
{
// Default encoding for text
@@ -17,21 +27,22 @@ Here is an example of JSON data:
// Plug-ins loaded at start-up
"plug-ins" : [
"python",
"c++",
"c++", // trailing comment
"ruby"
],
// Tab indent size
"indent" : { "length" : 3, "use_space" = true }
// (multi-line comment)
"indent" : { /*embedded comment*/ "length" : 3, "use_space": true }
}
\endverbatim
\endcode
\section _features Features
- read and write JSON document
- attach C and C++ style comments to element during parsing
- attach C++ style comments to element during parsing
- rewrite JSON document preserving original comments
Notes: Comments used to be supported in JSON but where removed for
Notes: Comments used to be supported in JSON but were removed for
portability (C like comments are not supported in Python). Since
comments are useful in configuration/input file, this feature was
preserved.
@@ -39,78 +50,115 @@ preserved.
\section _example Code example
\code
Json::Value root; // will contains the root value after parsing.
Json::Reader reader;
bool parsingSuccessful = reader.parse( config_doc, root );
if ( !parsingSuccessful )
{
// report to the user the failure and their locations in the document.
std::cout << "Failed to parse configuration\n"
<< reader.getFormatedErrorMessages();
return;
}
Json::Value root; // 'root' will contain the root value after parsing.
std::cin >> root;
// Get the value of the member of root named 'encoding', return 'UTF-8' if there is no
// such member.
std::string encoding = root.get("encoding", "UTF-8" ).asString();
// Get the value of the member of root named 'encoding', return a 'null' value if
// there is no such member.
const Json::Value plugins = root["plug-ins"];
for ( int index = 0; index < plugins.size(); ++index ) // Iterates over the sequence elements.
loadPlugIn( plugins[index].asString() );
setIndentLength( root["indent"].get("length", 3).asInt() );
setIndentUseSpace( root["indent"].get("use_space", true).asBool() );
// ...
// At application shutdown to make the new configuration document:
// Since Json::Value has implicit constructor for all value types, it is not
// necessary to explicitly construct the Json::Value object:
root["encoding"] = getCurrentEncoding();
root["indent"]["length"] = getCurrentIndentLength();
root["indent"]["use_space"] = getCurrentIndentUseSpace();
Json::StyledWriter writer;
// Make a new JSON document for the configuration. Preserve original comments.
std::string outputConfig = writer.write( root );
// You can also use streams. This will put the contents of any JSON
// stream at a particular sub-value, if you'd like.
// You can also read into a particular sub-value.
std::cin >> root["subtree"];
// And you can write to a stream, using the StyledWriter automatically.
// Get the value of the member of root named 'encoding',
// and return 'UTF-8' if there is no such member.
std::string encoding = root.get("encoding", "UTF-8" ).asString();
// Get the value of the member of root named 'plug-ins'; return a 'null' value if
// there is no such member.
const Json::Value plugins = root["plug-ins"];
// Iterate over the sequence elements.
for ( int index = 0; index < plugins.size(); ++index )
loadPlugIn( plugins[index].asString() );
// Try other datatypes. Some are auto-convertible to others.
foo::setIndentLength( root["indent"].get("length", 3).asInt() );
foo::setIndentUseSpace( root["indent"].get("use_space", true).asBool() );
// Since Json::Value has an implicit constructor for all value types, it is not
// necessary to explicitly construct the Json::Value object.
root["encoding"] = foo::getCurrentEncoding();
root["indent"]["length"] = foo::getCurrentIndentLength();
root["indent"]["use_space"] = foo::getCurrentIndentUseSpace();
// If you like the defaults, you can insert directly into a stream.
std::cout << root;
// Of course, you can write to `std::ostringstream` if you prefer.
// If desired, remember to add a linefeed and flush.
std::cout << std::endl;
\endcode
\section _plinks Build instructions
\section _advanced Advanced usage
Configure *builders* to create *readers* and *writers*. For
configuration, we use our own `Json::Value` (rather than
standard setters/getters) so that we can add
features without losing binary-compatibility.
\code
// For convenience, use `writeString()` with a specialized builder.
Json::StreamWriterBuilder wbuilder;
wbuilder["indentation"] = "\t";
std::string document = Json::writeString(wbuilder, root);
// Here, using a specialized Builder, we discard comments and
// record errors as we parse.
Json::CharReaderBuilder rbuilder;
rbuilder["collectComments"] = false;
std::string errs;
bool ok = Json::parseFromStream(rbuilder, std::cin, &root, &errs);
\endcode
Yes, compile-time configuration-checking would be helpful,
but `Json::Value` lets you
write and read the builder configuration, which is better! In other words,
you can configure your JSON parser using JSON.
CharReaders and StreamWriters are not thread-safe, but they are re-usable.
\code
Json::CharReaderBuilder rbuilder;
cfg >> rbuilder.settings_;
std::unique_ptr<Json::CharReader> const reader(rbuilder.newCharReader());
reader->parse(start, stop, &value1, &errs);
// ...
reader->parse(start, stop, &value2, &errs);
// etc.
\endcode
\section _pbuild Build instructions
The build instructions are located in the file
<a HREF="README.txt">README.txt</a> in the top-directory of the project.
<a HREF="https://github.com/open-source-parsers/jsoncpp/blob/master/README.md">README.md</a> in the top-directory of the project.
Permanent link to the latest revision of the file in subversion:
<a HREF="http://svn.sourceforge.net/viewcvs.cgi/jsoncpp/README.txt?view=markup">latest README.txt</a>
The latest version of the source is available in the project's GitHub repository:
<a HREF="https://github.com/open-source-parsers/jsoncpp/">
jsoncpp</a>
\section _pdownload Download
The sources can be downloaded from
<a HREF="http://sourceforge.net/projects/jsoncpp/files/">SourceForge download page</a>.
The latest version of the source is available in the project's subversion repository:
<a HREF="http://jsoncpp.svn.sourceforge.net/svnroot/jsoncpp/trunk/">
http://jsoncpp.svn.sourceforge.net/svnroot/jsoncpp/trunk/</a>
To checkout the source, see the following
<a HREF="http://sourceforge.net/scm/?type=svn&group_id=144446">instructions</a>.
\section _plinks Project links
- <a HREF="http://jsoncpp.sourceforge.net">json-cpp home</a>
- <a HREF="http://www.sourceforge.net/projects/jsoncpp">json-cpp sourceforge project</a>
\section _news What's New?
The description of latest changes can be found in
<a HREF="https://github.com/open-source-parsers/jsoncpp/wiki/NEWS">
the NEWS wiki
</a>.
\section _rlinks Related links
- <a HREF="http://www.json.org/">JSON</a> Specification and alternate language implementations.
- <a HREF="http://www.yaml.org/">YAML</a> A data format designed for human readability.
- <a HREF="http://www.cl.cam.ac.uk/~mgk25/unicode.html">UTF-8 and Unicode FAQ</a>.
\section _license License
The json-cpp library and this documentation are in Public Domain.
\section _plinks Old project links
- <a href="https://sourceforge.net/projects/jsoncpp/">https://sourceforge.net/projects/jsoncpp/</a>
- <a href="http://jsoncpp.sourceforge.net">http://jsoncpp.sourceforge.net</a>
- <a href="http://sourceforge.net/projects/jsoncpp/files/">http://sourceforge.net/projects/jsoncpp/files/</a>
- <a href="http://jsoncpp.svn.sourceforge.net/svnroot/jsoncpp/trunk/">http://jsoncpp.svn.sourceforge.net/svnroot/jsoncpp/trunk/</a>
- <a href="http://jsoncpp.sourceforge.net/old.html">http://jsoncpp.sourceforge.net/old.html</a>
\author Baptiste Lepilleur <blep@users.sourceforge.net>
\section _license License
See file <a href="https://github.com/open-source-parsers/jsoncpp/blob/master/LICENSE"><code>LICENSE</code></a> in the top-directory of the project.
Basically JsonCpp is licensed under MIT license, or public domain if desired
and recognized in your jurisdiction.
\author Baptiste Lepilleur <blep@users.sourceforge.net> (originator)
\author Christopher Dunn <cdunn2001@gmail.com> (primary maintainer)
\version \include version
We make strong guarantees about binary-compatibility, consistent with
<a href="http://apr.apache.org/versioning.html">the Apache versioning scheme</a>.
\sa version.h
*/


@@ -1,32 +1,3 @@
/*! \page roadmap JsonCpp roadmap
\section ms_release Makes JsonCpp ready for release
- Build system clean-up:
- Fix build on Windows (shared-library build is broken)
- Add enable/disable flag for static and shared library build
- Enhance help
- Platform portability check: (Notes: was ok on last check)
- linux/gcc,
- solaris/cc,
- windows/msvc678,
- aix/vacpp
- Add JsonCpp version to header as numeric for use in preprocessor test
- Remove buggy experimental hash stuff
- Release on sourceforge download
\section ms_strict Adds a strict mode to reader/parser
Strict JSON support as specific in RFC 4627 (http://www.ietf.org/rfc/rfc4627.txt?number=4627).
- Enforce only object or array as root element
- Disable comment support
- Get jsonchecker failing tests to pass in strict mode
\section ms_separation Expose json reader/writer API that do not impose using Json::Value.
Some typical use-case involve an application specific structure to/from a JSON document.
- Event base parser to allow unserializing a Json document directly in datastructure instead of
using the intermediate Json::Value.
- "Stream" based parser to serialized a Json document without using Json::Value as input.
- Performance oriented parser/writer:
- Provides an event based parser. Should allow pulling & skipping events for ease of use.
- Provides a JSON document builder: fast only.
\section ms_perfo Performance tuning
- Provides support for static property name definition avoiding allocation
- Static property dictionnary can be provided to JSON reader
- Performance scenario & benchmarking
Moved to: https://github.com/open-source-parsers/jsoncpp/wiki/Roadmap
*/

2301
doc/web_doxyfile.in Normal file

File diff suppressed because it is too large.


@@ -1,12 +1,27 @@
"""Script to generate doxygen documentation.
"""
from __future__ import print_function
from __future__ import unicode_literals
from devtools import tarball
from contextlib import contextmanager
import subprocess
import traceback
import re
import os
import os.path
import sys
import shutil
from devtools import tarball
@contextmanager
def cd(newdir):
"""
http://stackoverflow.com/questions/431684/how-do-i-cd-in-python
"""
prevdir = os.getcwd()
os.chdir(newdir)
try:
yield
finally:
os.chdir(prevdir)
def find_program(*filenames):
"""find a program in folders path_lst, and sets env[var]
@@ -14,7 +29,7 @@ def find_program(*filenames):
@return: the full path of the filename if found, or '' if filename could not be found
"""
paths = os.environ.get('PATH', '').split(os.pathsep)
suffixes = ('win32' in sys.platform ) and '.exe .com .bat .cmd' or ''
suffixes = ('win32' in sys.platform) and '.exe .com .bat .cmd' or ''
for filename in filenames:
for name in [filename+ext for ext in suffixes.split()]:
for directory in paths:
@@ -28,53 +43,56 @@ def do_subst_in_file(targetfile, sourcefile, dict):
For example, if dict is {'%VERSION%': '1.2345', '%BASE%': 'MyProg'},
then all instances of %VERSION% in the file will be replaced with 1.2345 etc.
"""
try:
f = open(sourcefile, 'rb')
with open(sourcefile, 'r') as f:
contents = f.read()
f.close()
except:
print "Can't read source file %s"%sourcefile
raise
for (k,v) in dict.items():
for (k,v) in list(dict.items()):
v = v.replace('\\','\\\\')
contents = re.sub(k, v, contents)
try:
f = open(targetfile, 'wb')
with open(targetfile, 'w') as f:
f.write(contents)
f.close()
def getstatusoutput(cmd):
"""cmd is a list.
"""
try:
process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
output, _ = process.communicate()
status = process.returncode
except:
print "Can't write target file %s"%targetfile
raise
status = -1
output = traceback.format_exc()
return status, output
def run_cmd(cmd, silent=False):
"""Raise exception on failure.
"""
info = 'Running: %r in %r' %(' '.join(cmd), os.getcwd())
print(info)
sys.stdout.flush()
if silent:
status, output = getstatusoutput(cmd)
else:
status, output = os.system(' '.join(cmd)), ''
if status:
msg = 'Error while %s ...\n\terror=%d, output="""%s"""' %(info, status, output)
raise Exception(msg)
def assert_is_exe(path):
if not path:
raise Exception('path is empty.')
if not os.path.isfile(path):
raise Exception('%r is not a file.' %path)
if not os.access(path, os.X_OK):
raise Exception('%r is not executable by this user.' %path)
def run_doxygen(doxygen_path, config_file, working_dir, is_silent):
config_file = os.path.abspath( config_file )
doxygen_path = doxygen_path
old_cwd = os.getcwd()
try:
os.chdir( working_dir )
assert_is_exe(doxygen_path)
config_file = os.path.abspath(config_file)
with cd(working_dir):
cmd = [doxygen_path, config_file]
print 'Running:', ' '.join( cmd )
try:
import subprocess
except:
if os.system( ' '.join( cmd ) ) != 0:
print 'Documentation generation failed'
return False
else:
if is_silent:
process = subprocess.Popen( cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT )
else:
process = subprocess.Popen( cmd )
stdout, _ = process.communicate()
if process.returncode:
print 'Documentation generation failed:'
print stdout
return False
return True
finally:
os.chdir( old_cwd )
run_cmd(cmd, is_silent)
def build_doc( options, make_release=False ):
def build_doc(options, make_release=False):
if make_release:
options.make_tarball = True
options.with_dot = True
@@ -83,54 +101,56 @@ def build_doc( options, make_release=False ):
options.open = False
options.silent = True
version = open('version','rt').read().strip()
version = open('version', 'rt').read().strip()
output_dir = 'dist/doxygen' # relative to doc/doxyfile location.
if not os.path.isdir( output_dir ):
os.makedirs( output_dir )
top_dir = os.path.abspath( '.' )
if not os.path.isdir(output_dir):
os.makedirs(output_dir)
top_dir = os.path.abspath('.')
html_output_dirname = 'jsoncpp-api-html-' + version
tarball_path = os.path.join( 'dist', html_output_dirname + '.tar.gz' )
warning_log_path = os.path.join( output_dir, '../jsoncpp-doxygen-warning.log' )
html_output_path = os.path.join( output_dir, html_output_dirname )
def yesno( bool ):
tarball_path = os.path.join('dist', html_output_dirname + '.tar.gz')
warning_log_path = os.path.join(output_dir, '../jsoncpp-doxygen-warning.log')
html_output_path = os.path.join(output_dir, html_output_dirname)
def yesno(bool):
return bool and 'YES' or 'NO'
subst_keys = {
'%JSONCPP_VERSION%': version,
'%DOC_TOPDIR%': '',
'%TOPDIR%': top_dir,
'%HTML_OUTPUT%': os.path.join( '..', output_dir, html_output_dirname ),
'%HTML_OUTPUT%': os.path.join('..', output_dir, html_output_dirname),
'%HAVE_DOT%': yesno(options.with_dot),
'%DOT_PATH%': os.path.split(options.dot_path)[0],
'%HTML_HELP%': yesno(options.with_html_help),
'%UML_LOOK%': yesno(options.with_uml_look),
'%WARNING_LOG_PATH%': os.path.join( '..', warning_log_path )
'%WARNING_LOG_PATH%': os.path.join('..', warning_log_path)
}
if os.path.isdir( output_dir ):
print 'Deleting directory:', output_dir
shutil.rmtree( output_dir )
if not os.path.isdir( output_dir ):
os.makedirs( output_dir )
if os.path.isdir(output_dir):
print('Deleting directory:', output_dir)
shutil.rmtree(output_dir)
if not os.path.isdir(output_dir):
os.makedirs(output_dir)
do_subst_in_file( 'doc/doxyfile', 'doc/doxyfile.in', subst_keys )
ok = run_doxygen( options.doxygen_path, 'doc/doxyfile', 'doc', is_silent=options.silent )
do_subst_in_file('doc/doxyfile', options.doxyfile_input_path, subst_keys)
run_doxygen(options.doxygen_path, 'doc/doxyfile', 'doc', is_silent=options.silent)
if not options.silent:
print open(warning_log_path, 'rb').read()
index_path = os.path.abspath(os.path.join(subst_keys['%HTML_OUTPUT%'], 'index.html'))
print 'Generated documentation can be found in:'
print index_path
print(open(warning_log_path, 'r').read())
index_path = os.path.abspath(os.path.join('doc', subst_keys['%HTML_OUTPUT%'], 'index.html'))
print('Generated documentation can be found in:')
print(index_path)
if options.open:
import webbrowser
webbrowser.open( 'file://' + index_path )
webbrowser.open('file://' + index_path)
if options.make_tarball:
print 'Generating doc tarball to', tarball_path
print('Generating doc tarball to', tarball_path)
tarball_sources = [
output_dir,
'README.txt',
'README.md',
'LICENSE',
'NEWS.txt',
'version'
]
tarball_basedir = os.path.join( output_dir, html_output_dirname )
tarball.make_tarball( tarball_path, tarball_sources, tarball_basedir, html_output_dirname )
tarball_basedir = os.path.join(output_dir, html_output_dirname)
tarball.make_tarball(tarball_path, tarball_sources, tarball_basedir, html_output_dirname)
return tarball_path, html_output_dirname
def main():
@@ -149,6 +169,8 @@ def main():
help="""Path to GraphViz dot tool. Must be full qualified path. [Default: %default]""")
parser.add_option('--doxygen', dest="doxygen_path", action='store', default=find_program('doxygen'),
help="""Path to Doxygen tool. [Default: %default]""")
parser.add_option('--in', dest="doxyfile_input_path", action='store', default='doc/doxyfile.in',
help="""Path to doxygen inputs. [Default: %default]""")
parser.add_option('--with-html-help', dest="with_html_help", action='store_true', default=False,
help="""Enable generation of Microsoft HTML HELP""")
parser.add_option('--no-uml-look', dest="with_uml_look", action='store_false', default=True,
@@ -161,7 +183,7 @@ def main():
help="""Hides doxygen output""")
parser.enable_interspersed_args()
options, args = parser.parse_args()
build_doc( options )
build_doc(options)
if __name__ == '__main__':
main()

include/CMakeLists.txt (new file, 2 lines)

@@ -0,0 +1,2 @@
FILE(GLOB INCLUDE_FILES "json/*.h")
INSTALL(FILES ${INCLUDE_FILES} DESTINATION ${INCLUDE_INSTALL_DIR}/json)

include/json/assertions.h (new file, 54 lines)

@@ -0,0 +1,54 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
#ifndef CPPTL_JSON_ASSERTIONS_H_INCLUDED
#define CPPTL_JSON_ASSERTIONS_H_INCLUDED
#include <stdlib.h>
#include <sstream>
#if !defined(JSON_IS_AMALGAMATION)
#include "config.h"
#endif // if !defined(JSON_IS_AMALGAMATION)
/** It should not be possible for a maliciously designed file to
* cause an abort() or seg-fault, so these macros are used only
* for pre-condition violations and internal logic errors.
*/
#if JSON_USE_EXCEPTION
// @todo <= add detail about condition in exception
# define JSON_ASSERT(condition) \
{if (!(condition)) {Json::throwLogicError( "assert json failed" );}}
# define JSON_FAIL_MESSAGE(message) \
{ \
std::ostringstream oss; oss << message; \
Json::throwLogicError(oss.str()); \
abort(); \
}
#else // JSON_USE_EXCEPTION
# define JSON_ASSERT(condition) assert(condition)
// The call to assert() will show the failure message in debug builds. In
// release builds we abort, for a core-dump or debugger.
# define JSON_FAIL_MESSAGE(message) \
{ \
std::ostringstream oss; oss << message; \
assert(false && oss.str().c_str()); \
abort(); \
}
#endif
#define JSON_ASSERT_MESSAGE(condition, message) \
if (!(condition)) { \
JSON_FAIL_MESSAGE(message); \
}
#endif // CPPTL_JSON_ASSERTIONS_H_INCLUDED
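A minimal sketch of how the assertion macros above are meant to be used inside the library (the checkIndex() helper is hypothetical, not part of jsoncpp): with JSON_USE_EXCEPTION enabled, a failed JSON_ASSERT_MESSAGE throws via Json::throwLogicError(); otherwise it falls back to assert()/abort().

#include <json/assertions.h>
#include <json/value.h>

// Hypothetical library-internal helper: reject out-of-range access with a
// descriptive message. The message expression is streamed into an
// std::ostringstream by JSON_FAIL_MESSAGE, so operator<< chains are allowed.
static const Json::Value& checkIndex(const Json::Value& arr, unsigned int index) {
  JSON_ASSERT_MESSAGE(arr.isArray(), "checkIndex: value is not an array");
  JSON_ASSERT_MESSAGE(index < arr.size(), "checkIndex: index " << index << " is out of range");
  return arr[index];
}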

include/json/autolink.h

@@ -1,19 +1,25 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
#ifndef JSON_AUTOLINK_H_INCLUDED
# define JSON_AUTOLINK_H_INCLUDED
#define JSON_AUTOLINK_H_INCLUDED
# include "config.h"
#include "config.h"
# ifdef JSON_IN_CPPTL
# include <cpptl/cpptl_autolink.h>
# endif
#ifdef JSON_IN_CPPTL
#include <cpptl/cpptl_autolink.h>
#endif
# if !defined(JSON_NO_AUTOLINK) && !defined(JSON_DLL_BUILD) && !defined(JSON_IN_CPPTL)
# define CPPTL_AUTOLINK_NAME "json"
# undef CPPTL_AUTOLINK_DLL
# ifdef JSON_DLL
# define CPPTL_AUTOLINK_DLL
# endif
# include "autolink.h"
# endif
#if !defined(JSON_NO_AUTOLINK) && !defined(JSON_DLL_BUILD) && \
!defined(JSON_IN_CPPTL)
#define CPPTL_AUTOLINK_NAME "json"
#undef CPPTL_AUTOLINK_DLL
#ifdef JSON_DLL
#define CPPTL_AUTOLINK_DLL
#endif
#include "autolink.h"
#endif
#endif // JSON_AUTOLINK_H_INCLUDED

include/json/config.h

@@ -1,43 +1,109 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
#ifndef JSON_CONFIG_H_INCLUDED
# define JSON_CONFIG_H_INCLUDED
#define JSON_CONFIG_H_INCLUDED
/// If defined, indicates that json library is embedded in CppTL library.
//# define JSON_IN_CPPTL 1
/// If defined, indicates that json may leverage CppTL library
//# define JSON_USE_CPPTL 1
/// If defined, indicates that cpptl vector based map should be used instead of std::map
/// If defined, indicates that cpptl vector based map should be used instead of
/// std::map
/// as Value container.
//# define JSON_USE_CPPTL_SMALLMAP 1
/// If defined, indicates that Json specific container should be used
/// (hash table & simple deque container with customizable allocator).
/// THIS FEATURE IS STILL EXPERIMENTAL!
//# define JSON_VALUE_USE_INTERNAL_MAP 1
/// Force usage of standard new/malloc based allocator instead of memory pool based allocator.
/// The memory pools allocator used optimization (initializing Value and ValueInternalLink
/// as if it was a POD) that may cause some validation tool to report errors.
/// Only has effects if JSON_VALUE_USE_INTERNAL_MAP is defined.
//# define JSON_USE_SIMPLE_INTERNAL_ALLOCATOR 1
/// If defined, indicates that Json use exception to report invalid type manipulation
/// instead of C assert macro.
# define JSON_USE_EXCEPTION 1
// If non-zero, the library uses exceptions to report bad input instead of C
// assertion macros. The default is to use exceptions.
#ifndef JSON_USE_EXCEPTION
#define JSON_USE_EXCEPTION 1
#endif
# ifdef JSON_IN_CPPTL
# include <cpptl/config.h>
# ifndef JSON_USE_CPPTL
# define JSON_USE_CPPTL 1
# endif
# endif
/// If defined, indicates that the source file is amalgated
/// to prevent private header inclusion.
/// Remarks: it is automatically defined in the generated amalgated header.
// #define JSON_IS_AMALGAMATION
# ifdef JSON_IN_CPPTL
# define JSON_API CPPTL_API
# elif defined(JSON_DLL_BUILD)
# define JSON_API __declspec(dllexport)
# elif defined(JSON_DLL)
# define JSON_API __declspec(dllimport)
# else
# define JSON_API
# endif
#ifdef JSON_IN_CPPTL
#include <cpptl/config.h>
#ifndef JSON_USE_CPPTL
#define JSON_USE_CPPTL 1
#endif
#endif
#ifdef JSON_IN_CPPTL
#define JSON_API CPPTL_API
#elif defined(JSON_DLL_BUILD)
#if defined(_MSC_VER)
#define JSON_API __declspec(dllexport)
#define JSONCPP_DISABLE_DLL_INTERFACE_WARNING
#endif // if defined(_MSC_VER)
#elif defined(JSON_DLL)
#if defined(_MSC_VER)
#define JSON_API __declspec(dllimport)
#define JSONCPP_DISABLE_DLL_INTERFACE_WARNING
#endif // if defined(_MSC_VER)
#endif // ifdef JSON_IN_CPPTL
#if !defined(JSON_API)
#define JSON_API
#endif
// If JSON_NO_INT64 is defined, then Json only support C++ "int" type for
// integer
// Storages, and 64 bits integer support is disabled.
// #define JSON_NO_INT64 1
#if defined(_MSC_VER) && _MSC_VER <= 1200 // MSVC 6
// Microsoft Visual Studio 6 only support conversion from __int64 to double
// (no conversion from unsigned __int64).
#define JSON_USE_INT64_DOUBLE_CONVERSION 1
// Disable warning 4786 for VS6 caused by STL (identifier was truncated to '255'
// characters in the debug information)
// All projects I've ever seen with VS6 were using this globally (not bothering
// with pragma push/pop).
#pragma warning(disable : 4786)
#endif // if defined(_MSC_VER) && _MSC_VER < 1200 // MSVC 6
#if defined(_MSC_VER) && _MSC_VER >= 1500 // MSVC 2008
/// Indicates that the following function is deprecated.
#define JSONCPP_DEPRECATED(message) __declspec(deprecated(message))
#elif defined(__clang__) && defined(__has_feature)
#if __has_feature(attribute_deprecated_with_message)
#define JSONCPP_DEPRECATED(message) __attribute__ ((deprecated(message)))
#endif
#elif defined(__GNUC__) && (__GNUC__ > 4 || (__GNUC__ == 4 && __GNUC_MINOR__ >= 5))
#define JSONCPP_DEPRECATED(message) __attribute__ ((deprecated(message)))
#elif defined(__GNUC__) && (__GNUC__ > 3 || (__GNUC__ == 3 && __GNUC_MINOR__ >= 1))
#define JSONCPP_DEPRECATED(message) __attribute__((__deprecated__))
#endif
#if !defined(JSONCPP_DEPRECATED)
#define JSONCPP_DEPRECATED(message)
#endif // if !defined(JSONCPP_DEPRECATED)
namespace Json {
typedef int Int;
typedef unsigned int UInt;
#if defined(JSON_NO_INT64)
typedef int LargestInt;
typedef unsigned int LargestUInt;
#undef JSON_HAS_INT64
#else // if defined(JSON_NO_INT64)
// For Microsoft Visual use specific types as long long is not supported
#if defined(_MSC_VER) // Microsoft Visual Studio
typedef __int64 Int64;
typedef unsigned __int64 UInt64;
#else // if defined(_MSC_VER) // Other platforms, use long long
typedef long long int Int64;
typedef unsigned long long int UInt64;
#endif // if defined(_MSC_VER)
typedef Int64 LargestInt;
typedef UInt64 LargestUInt;
#define JSON_HAS_INT64
#endif // if defined(JSON_NO_INT64)
} // end namespace Json
#endif // JSON_CONFIG_H_INCLUDED
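A minimal sketch of applying the JSONCPP_DEPRECATED macro defined above to user-facing declarations (the function names here are hypothetical): on MSVC 2008+ it expands to __declspec(deprecated(...)), on sufficiently recent gcc/clang to the deprecated attribute, and to nothing on other compilers, so the declaration itself stays portable.

#include <json/config.h>
#include <string>

// Hypothetical API: the old spelling is kept for compatibility, but callers now
// get a compile-time warning pointing at the replacement.
JSONCPP_DEPRECATED("Use formattedMessages() instead.")
std::string formatedMessages();   // old, misspelled name

std::string formattedMessages();  // new name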

include/json/features.h

@@ -1,42 +1,51 @@
#ifndef CPPTL_JSON_FEATURES_H_INCLUDED
# define CPPTL_JSON_FEATURES_H_INCLUDED
# include "forwards.h"
namespace Json {
/** \brief Configuration passed to reader and writer.
* This configuration object can be used to force the Reader or Writer
* to behave in a standard conforming way.
*/
class JSON_API Features
{
public:
/** \brief A configuration that allows all features and assumes all strings are UTF-8.
* - C & C++ comments are allowed
* - Root object can be any JSON value
* - Assumes Value strings are encoded in UTF-8
*/
static Features all();
/** \brief A configuration that is strictly compatible with the JSON specification.
* - Comments are forbidden.
* - Root object must be either an array or an object value.
* - Assumes Value strings are encoded in UTF-8
*/
static Features strictMode();
/** \brief Initialize the configuration like JsonConfig::allFeatures;
*/
Features();
/// \c true if comments are allowed. Default: \c true.
bool allowComments_;
/// \c true if root must be either an array or an object value. Default: \c false.
bool strictRoot_;
};
} // namespace Json
#endif // CPPTL_JSON_FEATURES_H_INCLUDED
// Copyright 2007-2010 Baptiste Lepilleur
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
#ifndef CPPTL_JSON_FEATURES_H_INCLUDED
#define CPPTL_JSON_FEATURES_H_INCLUDED
#if !defined(JSON_IS_AMALGAMATION)
#include "forwards.h"
#endif // if !defined(JSON_IS_AMALGAMATION)
namespace Json {
/** \brief Configuration passed to reader and writer.
* This configuration object can be used to force the Reader or Writer
* to behave in a standard conforming way.
*/
class JSON_API Features {
public:
/** \brief A configuration that allows all features and assumes all strings
* are UTF-8.
* - C & C++ comments are allowed
* - Root object can be any JSON value
* - Assumes Value strings are encoded in UTF-8
*/
static Features all();
/** \brief A configuration that is strictly compatible with the JSON
* specification.
* - Comments are forbidden.
* - Root object must be either an array or an object value.
* - Assumes Value strings are encoded in UTF-8
*/
static Features strictMode();
/** \brief Initialize the configuration like JsonConfig::allFeatures;
*/
Features();
/// \c true if comments are allowed. Default: \c true.
bool allowComments_;
/// \c true if root must be either an array or an object value. Default: \c
/// false.
bool strictRoot_;
};
} // namespace Json
#endif // CPPTL_JSON_FEATURES_H_INCLUDED
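A minimal sketch (hypothetical parseStrict() wrapper) of passing a Features object to the older, now-deprecated Reader API: strictMode() disables comments and requires an array or object root, as documented above.

#include <json/reader.h>
#include <string>

// Hypothetical helper: parse a document with the strict configuration.
bool parseStrict(const std::string& document, Json::Value& root) {
  Json::Reader reader(Json::Features::strictMode());
  return reader.parse(document, root, /* collectComments = */ false);
}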

include/json/forwards.h

@@ -1,39 +1,37 @@
#ifndef JSON_FORWARDS_H_INCLUDED
# define JSON_FORWARDS_H_INCLUDED
// Copyright 2007-2010 Baptiste Lepilleur
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
# include "config.h"
#ifndef JSON_FORWARDS_H_INCLUDED
#define JSON_FORWARDS_H_INCLUDED
#if !defined(JSON_IS_AMALGAMATION)
#include "config.h"
#endif // if !defined(JSON_IS_AMALGAMATION)
namespace Json {
// writer.h
class FastWriter;
class StyledWriter;
// writer.h
class FastWriter;
class StyledWriter;
// reader.h
class Reader;
// reader.h
class Reader;
// features.h
class Features;
// features.h
class Features;
// value.h
typedef int Int;
typedef unsigned int UInt;
class StaticString;
class Path;
class PathArgument;
class Value;
class ValueIteratorBase;
class ValueIterator;
class ValueConstIterator;
#ifdef JSON_VALUE_USE_INTERNAL_MAP
class ValueAllocator;
class ValueMapAllocator;
class ValueInternalLink;
class ValueInternalArray;
class ValueInternalMap;
#endif // #ifdef JSON_VALUE_USE_INTERNAL_MAP
// value.h
typedef unsigned int ArrayIndex;
class StaticString;
class Path;
class PathArgument;
class Value;
class ValueIteratorBase;
class ValueIterator;
class ValueConstIterator;
} // namespace Json
#endif // JSON_FORWARDS_H_INCLUDED

include/json/json.h

@@ -1,10 +1,15 @@
#ifndef JSON_JSON_H_INCLUDED
# define JSON_JSON_H_INCLUDED
// Copyright 2007-2010 Baptiste Lepilleur
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
# include "autolink.h"
# include "value.h"
# include "reader.h"
# include "writer.h"
# include "features.h"
#ifndef JSON_JSON_H_INCLUDED
#define JSON_JSON_H_INCLUDED
#include "autolink.h"
#include "value.h"
#include "reader.h"
#include "writer.h"
#include "features.h"
#endif // JSON_JSON_H_INCLUDED
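A minimal sketch showing the umbrella header in use: json.h pulls in value.h, reader.h, writer.h and features.h, so one include is enough to build and serialize a document. FastWriter is used here only because it is the simplest writer; it is deprecated in favor of StreamWriterBuilder.

#include <json/json.h>
#include <iostream>

int main() {
  Json::Value root;
  root["name"] = "jsoncpp";
  root["list"].append(1);
  root["list"].append(2);
  Json::FastWriter writer;
  std::cout << writer.write(root);  // prints {"list":[1,2],"name":"jsoncpp"}
  return 0;
}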

include/json/reader.h

@@ -1,196 +1,357 @@
#ifndef CPPTL_JSON_READER_H_INCLUDED
# define CPPTL_JSON_READER_H_INCLUDED
// Copyright 2007-2010 Baptiste Lepilleur
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
# include "features.h"
# include "value.h"
# include <deque>
# include <stack>
# include <string>
# include <iostream>
#ifndef CPPTL_JSON_READER_H_INCLUDED
#define CPPTL_JSON_READER_H_INCLUDED
#if !defined(JSON_IS_AMALGAMATION)
#include "features.h"
#include "value.h"
#endif // if !defined(JSON_IS_AMALGAMATION)
#include <deque>
#include <iosfwd>
#include <stack>
#include <string>
#include <istream>
// Disable warning C4251: <data member>: <type> needs to have dll-interface to
// be used by...
#if defined(JSONCPP_DISABLE_DLL_INTERFACE_WARNING)
#pragma warning(push)
#pragma warning(disable : 4251)
#endif // if defined(JSONCPP_DISABLE_DLL_INTERFACE_WARNING)
namespace Json {
/** \brief Unserialize a <a HREF="http://www.json.org">JSON</a> document into a Value.
*
*/
class JSON_API Reader
{
public:
typedef char Char;
typedef const Char *Location;
/** \brief Unserialize a <a HREF="http://www.json.org">JSON</a> document into a
*Value.
*
* \deprecated Use CharReader and CharReaderBuilder.
*/
class JSON_API Reader {
public:
typedef char Char;
typedef const Char* Location;
/** \brief Constructs a Reader allowing all features
* for parsing.
*/
Reader();
/** \brief Constructs a Reader allowing the specified feature set
* for parsing.
*/
Reader( const Features &features );
/** \brief Read a Value from a <a HREF="http://www.json.org">JSON</a> document.
* \param document UTF-8 encoded string containing the document to read.
* \param root [out] Contains the root value of the document if it was
* successfully parsed.
* \param collectComments \c true to collect comment and allow writing them back during
* serialization, \c false to discard comments.
* This parameter is ignored if Features::allowComments_
* is \c false.
* \return \c true if the document was successfully parsed, \c false if an error occurred.
*/
bool parse( const std::string &document,
Value &root,
bool collectComments = true );
/** \brief Read a Value from a <a HREF="http://www.json.org">JSON</a> document.
* \param document UTF-8 encoded string containing the document to read.
* \param root [out] Contains the root value of the document if it was
* successfully parsed.
* \param collectComments \c true to collect comment and allow writing them back during
* serialization, \c false to discard comments.
* This parameter is ignored if Features::allowComments_
* is \c false.
* \return \c true if the document was successfully parsed, \c false if an error occurred.
*/
bool parse( const char *beginDoc, const char *endDoc,
Value &root,
bool collectComments = true );
/// \brief Parse from input stream.
/// \see Json::operator>>(std::istream&, Json::Value&).
bool parse( std::istream &is,
Value &root,
bool collectComments = true );
/** \brief Returns a user friendly string that list errors in the parsed document.
* \return Formatted error message with the list of errors with their location in
* the parsed document. An empty string is returned if no error occurred
* during parsing.
*/
std::string getFormatedErrorMessages() const;
private:
enum TokenType
{
tokenEndOfStream = 0,
tokenObjectBegin,
tokenObjectEnd,
tokenArrayBegin,
tokenArrayEnd,
tokenString,
tokenNumber,
tokenTrue,
tokenFalse,
tokenNull,
tokenArraySeparator,
tokenMemberSeparator,
tokenComment,
tokenError
};
class Token
{
public:
TokenType type_;
Location start_;
Location end_;
};
class ErrorInfo
{
public:
Token token_;
std::string message_;
Location extra_;
};
typedef std::deque<ErrorInfo> Errors;
bool expectToken( TokenType type, Token &token, const char *message );
bool readToken( Token &token );
void skipSpaces();
bool match( Location pattern,
int patternLength );
bool readComment();
bool readCStyleComment();
bool readCppStyleComment();
bool readString();
void readNumber();
bool readValue();
bool readObject( Token &token );
bool readArray( Token &token );
bool decodeNumber( Token &token );
bool decodeString( Token &token );
bool decodeString( Token &token, std::string &decoded );
bool decodeDouble( Token &token );
bool decodeUnicodeCodePoint( Token &token,
Location &current,
Location end,
unsigned int &unicode );
bool decodeUnicodeEscapeSequence( Token &token,
Location &current,
Location end,
unsigned int &unicode );
bool addError( const std::string &message,
Token &token,
Location extra = 0 );
bool recoverFromError( TokenType skipUntilToken );
bool addErrorAndRecover( const std::string &message,
Token &token,
TokenType skipUntilToken );
void skipUntilSpace();
Value &currentValue();
Char getNextChar();
void getLocationLineAndColumn( Location location,
int &line,
int &column ) const;
std::string getLocationLineAndColumn( Location location ) const;
void addComment( Location begin,
Location end,
CommentPlacement placement );
void skipCommentTokens( Token &token );
typedef std::stack<Value *> Nodes;
Nodes nodes_;
Errors errors_;
std::string document_;
Location begin_;
Location end_;
Location current_;
Location lastValueEnd_;
Value *lastValue_;
std::string commentsBefore_;
Features features_;
bool collectComments_;
};
/** \brief Read from 'sin' into 'root'.
Always keep comments from the input JSON.
This can be used to read a file into a particular sub-object.
For example:
\code
Json::Value root;
cin >> root["dir"]["file"];
cout << root;
\endcode
Result:
\verbatim
{
"dir": {
"file": {
// The input stream JSON would be nested here.
}
}
}
\endverbatim
\throw std::exception on parse error.
\see Json::operator<<()
/** \brief Constructs a Reader allowing all features
* for parsing.
*/
std::istream& operator>>( std::istream&, Value& );
Reader();
/** \brief Constructs a Reader allowing the specified feature set
* for parsing.
*/
Reader(const Features& features);
/** \brief Read a Value from a <a HREF="http://www.json.org">JSON</a>
* document.
* \param document UTF-8 encoded string containing the document to read.
* \param root [out] Contains the root value of the document if it was
* successfully parsed.
* \param collectComments \c true to collect comment and allow writing them
* back during
* serialization, \c false to discard comments.
* This parameter is ignored if
* Features::allowComments_
* is \c false.
* \return \c true if the document was successfully parsed, \c false if an
* error occurred.
*/
bool
parse(const std::string& document, Value& root, bool collectComments = true);
/** \brief Read a Value from a <a HREF="http://www.json.org">JSON</a>
document.
* \param beginDoc Pointer on the beginning of the UTF-8 encoded string of the
document to read.
* \param endDoc Pointer on the end of the UTF-8 encoded string of the
document to read.
* Must be >= beginDoc.
* \param root [out] Contains the root value of the document if it was
* successfully parsed.
* \param collectComments \c true to collect comment and allow writing them
back during
* serialization, \c false to discard comments.
* This parameter is ignored if
Features::allowComments_
* is \c false.
* \return \c true if the document was successfully parsed, \c false if an
error occurred.
*/
bool parse(const char* beginDoc,
const char* endDoc,
Value& root,
bool collectComments = true);
/// \brief Parse from input stream.
/// \see Json::operator>>(std::istream&, Json::Value&).
bool parse(std::istream& is, Value& root, bool collectComments = true);
/** \brief Returns a user friendly string that list errors in the parsed
* document.
* \return Formatted error message with the list of errors with their location
* in
* the parsed document. An empty string is returned if no error
* occurred
* during parsing.
* \deprecated Use getFormattedErrorMessages() instead (typo fix).
*/
JSONCPP_DEPRECATED("Use getFormattedErrorMessages() instead.")
std::string getFormatedErrorMessages() const;
/** \brief Returns a user friendly string that list errors in the parsed
* document.
* \return Formatted error message with the list of errors with their location
* in
* the parsed document. An empty string is returned if no error
* occurred
* during parsing.
*/
std::string getFormattedErrorMessages() const;
private:
enum TokenType {
tokenEndOfStream = 0,
tokenObjectBegin,
tokenObjectEnd,
tokenArrayBegin,
tokenArrayEnd,
tokenString,
tokenNumber,
tokenTrue,
tokenFalse,
tokenNull,
tokenArraySeparator,
tokenMemberSeparator,
tokenComment,
tokenError
};
class Token {
public:
TokenType type_;
Location start_;
Location end_;
};
class ErrorInfo {
public:
Token token_;
std::string message_;
Location extra_;
};
typedef std::deque<ErrorInfo> Errors;
bool readToken(Token& token);
void skipSpaces();
bool match(Location pattern, int patternLength);
bool readComment();
bool readCStyleComment();
bool readCppStyleComment();
bool readString();
void readNumber();
bool readValue();
bool readObject(Token& token);
bool readArray(Token& token);
bool decodeNumber(Token& token);
bool decodeNumber(Token& token, Value& decoded);
bool decodeString(Token& token);
bool decodeString(Token& token, std::string& decoded);
bool decodeDouble(Token& token);
bool decodeDouble(Token& token, Value& decoded);
bool decodeUnicodeCodePoint(Token& token,
Location& current,
Location end,
unsigned int& unicode);
bool decodeUnicodeEscapeSequence(Token& token,
Location& current,
Location end,
unsigned int& unicode);
bool addError(const std::string& message, Token& token, Location extra = 0);
bool recoverFromError(TokenType skipUntilToken);
bool addErrorAndRecover(const std::string& message,
Token& token,
TokenType skipUntilToken);
void skipUntilSpace();
Value& currentValue();
Char getNextChar();
void
getLocationLineAndColumn(Location location, int& line, int& column) const;
std::string getLocationLineAndColumn(Location location) const;
void addComment(Location begin, Location end, CommentPlacement placement);
void skipCommentTokens(Token& token);
typedef std::stack<Value*> Nodes;
Nodes nodes_;
Errors errors_;
std::string document_;
Location begin_;
Location end_;
Location current_;
Location lastValueEnd_;
Value* lastValue_;
std::string commentsBefore_;
Features features_;
bool collectComments_;
}; // Reader
/** Interface for reading JSON from a char array.
*/
class JSON_API CharReader {
public:
virtual ~CharReader() {}
/** \brief Read a Value from a <a HREF="http://www.json.org">JSON</a>
document.
* The document must be a UTF-8 encoded string containing the document to read.
*
* \param beginDoc Pointer on the beginning of the UTF-8 encoded string of the
document to read.
* \param endDoc Pointer on the end of the UTF-8 encoded string of the
document to read.
* Must be >= beginDoc.
* \param root [out] Contains the root value of the document if it was
* successfully parsed.
* \param errs [out] Formatted error messages (if not NULL)
* a user friendly string that lists errors in the parsed
* document.
* \return \c true if the document was successfully parsed, \c false if an
error occurred.
*/
virtual bool parse(
char const* beginDoc, char const* endDoc,
Value* root, std::string* errs) = 0;
class Factory {
public:
virtual ~Factory() {}
/** \brief Allocate a CharReader via operator new().
* \throw std::exception if something goes wrong (e.g. invalid settings)
*/
virtual CharReader* newCharReader() const = 0;
}; // Factory
}; // CharReader
/** \brief Build a CharReader implementation.
Usage:
\code
using namespace Json;
CharReaderBuilder builder;
builder["collectComments"] = false;
Value value;
std::string errs;
bool ok = parseFromStream(builder, std::cin, &value, &errs);
\endcode
*/
class JSON_API CharReaderBuilder : public CharReader::Factory {
public:
// Note: We use a Json::Value so that we can add data-members to this class
// without a major version bump.
/** Configuration of this builder.
These are case-sensitive.
Available settings (case-sensitive):
- `"collectComments": false or true`
- true to collect comment and allow writing them
back during serialization, false to discard comments.
This parameter is ignored if allowComments is false.
- `"allowComments": false or true`
- true if comments are allowed.
- `"strictRoot": false or true`
- true if root must be either an array or an object value
- `"allowDroppedNullPlaceholders": false or true`
- true if dropped null placeholders are allowed. (See StreamWriterBuilder.)
- `"allowNumericKeys": false or true`
- true if numeric object keys are allowed.
- `"allowSingleQuotes": false or true`
- true if '' are allowed for strings (both keys and values)
- `"stackLimit": integer`
- Exceeding stackLimit (recursive depth of `readValue()`) will
cause an exception.
- This is a security issue (seg-faults caused by deeply nested JSON),
so the default is low.
- `"failIfExtra": false or true`
- If true, `parse()` returns false when extra non-whitespace trails
the JSON value in the input string.
- `"rejectDupKeys": false or true`
- If true, `parse()` returns false when a key is duplicated within an object.
You can examine 'settings_` yourself
to see the defaults. You can also write and read them just like any
JSON Value.
\sa setDefaults()
*/
Json::Value settings_;
CharReaderBuilder();
virtual ~CharReaderBuilder();
virtual CharReader* newCharReader() const;
/** \return true if 'settings' are legal and consistent;
* otherwise, indicate bad settings via 'invalid'.
*/
bool validate(Json::Value* invalid) const;
/** A simple way to update a specific setting.
*/
Value& operator[](std::string key);
/** Called by ctor, but you can use this to reset settings_.
* \pre 'settings' != NULL (but Json::null is fine)
* \remark Defaults:
* \snippet src/lib_json/json_reader.cpp CharReaderBuilderStrictMode
*/
static void setDefaults(Json::Value* settings);
/** Same as old Features::strictMode().
* \pre 'settings' != NULL (but Json::null is fine)
* \remark Defaults:
* \snippet src/lib_json/json_reader.cpp CharReaderBuilderDefaults
*/
static void strictMode(Json::Value* settings);
};
/** Consume entire stream and use its begin/end.
* Someday we might have a real StreamReader, but for now this
* is convenient.
*/
bool JSON_API parseFromStream(
CharReader::Factory const&,
std::istream&,
Value* root, std::string* errs);
/** \brief Read from 'sin' into 'root'.
Always keep comments from the input JSON.
This can be used to read a file into a particular sub-object.
For example:
\code
Json::Value root;
cin >> root["dir"]["file"];
cout << root;
\endcode
Result:
\verbatim
{
"dir": {
"file": {
// The input stream JSON would be nested here.
}
}
}
\endverbatim
\throw std::exception on parse error.
\see Json::operator<<()
*/
JSON_API std::istream& operator>>(std::istream&, Value&);
} // namespace Json
#if defined(JSONCPP_DISABLE_DLL_INTERFACE_WARNING)
#pragma warning(pop)
#endif // if defined(JSONCPP_DISABLE_DLL_INTERFACE_WARNING)
#endif // CPPTL_JSON_READER_H_INCLUDED
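A minimal sketch (hypothetical parseBuffer() wrapper) of the new CharReader/CharReaderBuilder path declared above, complementing the parseFromStream() example in the class comment: settings are plain Json::Value entries on the builder, and the reader parses a raw [begin, end) character range.

#include <json/reader.h>
#include <memory>
#include <string>

// Hypothetical helper: parse an in-memory document with comments discarded and
// trailing non-whitespace rejected ("failIfExtra").
bool parseBuffer(const std::string& doc, Json::Value* root, std::string* errs) {
  Json::CharReaderBuilder builder;
  builder["collectComments"] = false;
  builder["failIfExtra"] = true;
  std::unique_ptr<Json::CharReader> reader(builder.newCharReader());
  return reader->parse(doc.data(), doc.data() + doc.size(), root, errs);
}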

File diff suppressed because it is too large.

include/json/version.h (new file, 14 lines)

@@ -0,0 +1,14 @@
// DO NOT EDIT. This file is generated by CMake from "version"
// and "version.h.in" files.
// Run CMake configure step to update it.
#ifndef JSON_VERSION_H_INCLUDED
# define JSON_VERSION_H_INCLUDED
# define JSONCPP_VERSION_STRING "1.10.0"
# define JSONCPP_VERSION_MAJOR 1
# define JSONCPP_VERSION_MINOR 10
# define JSONCPP_VERSION_PATCH 0
# define JSONCPP_VERSION_QUALIFIER
# define JSONCPP_VERSION_HEXA ((JSONCPP_VERSION_MAJOR << 24) | (JSONCPP_VERSION_MINOR << 16) | (JSONCPP_VERSION_PATCH << 8))
#endif // JSON_VERSION_H_INCLUDED
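JSONCPP_VERSION_HEXA packs major, minor and patch into a single integer; with the values shown here that is (1 << 24) | (10 << 16) | (0 << 8) == 0x010A0000. A minimal sketch of using it for compile-time version gating:

#include <json/version.h>

#if JSONCPP_VERSION_HEXA >= 0x010A0000
// Code that relies on APIs from this release or later.
#endif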

include/json/writer.h

@@ -1,174 +1,316 @@
#ifndef JSON_WRITER_H_INCLUDED
# define JSON_WRITER_H_INCLUDED
// Copyright 2007-2010 Baptiste Lepilleur
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
# include "value.h"
# include <vector>
# include <string>
# include <iostream>
#ifndef JSON_WRITER_H_INCLUDED
#define JSON_WRITER_H_INCLUDED
#if !defined(JSON_IS_AMALGAMATION)
#include "value.h"
#endif // if !defined(JSON_IS_AMALGAMATION)
#include <vector>
#include <string>
#include <ostream>
// Disable warning C4251: <data member>: <type> needs to have dll-interface to
// be used by...
#if defined(JSONCPP_DISABLE_DLL_INTERFACE_WARNING)
#pragma warning(push)
#pragma warning(disable : 4251)
#endif // if defined(JSONCPP_DISABLE_DLL_INTERFACE_WARNING)
namespace Json {
class Value;
class Value;
/** \brief Abstract class for writers.
/**
Usage:
\code
using namespace Json;
void writeToStdout(StreamWriter::Factory const& factory, Value const& value) {
std::unique_ptr<StreamWriter> const writer(
factory.newStreamWriter());
writer->write(value, &std::cout);
std::cout << std::endl; // add lf and flush
}
\endcode
*/
class JSON_API StreamWriter {
protected:
std::ostream* sout_; // not owned; will not delete
public:
StreamWriter();
virtual ~StreamWriter();
/** Write Value into document as configured in sub-class.
Do not take ownership of sout, but maintain a reference during function.
\pre sout != NULL
\return zero on success (For now, we always return zero, so check the stream instead.)
\throw std::exception possibly, depending on configuration
*/
virtual int write(Value const& root, std::ostream* sout) = 0;
/** \brief A simple abstract factory.
*/
class JSON_API Factory {
public:
virtual ~Factory();
/** \brief Allocate a CharReader via operator new().
* \throw std::exception if something goes wrong (e.g. invalid settings)
*/
virtual StreamWriter* newStreamWriter() const = 0;
}; // Factory
}; // StreamWriter
/** \brief Write into stringstream, then return string, for convenience.
* A StreamWriter will be created from the factory, used, and then deleted.
*/
std::string JSON_API writeString(StreamWriter::Factory const& factory, Value const& root);
/** \brief Build a StreamWriter implementation.
Usage:
\code
using namespace Json;
Value value = ...;
StreamWriterBuilder builder;
builder["commentStyle"] = "None";
builder["indentation"] = " "; // or whatever you like
std::unique_ptr<Json::StreamWriter> writer(
builder.newStreamWriter());
writer->write(value, &std::cout);
std::cout << std::endl; // add lf and flush
\endcode
*/
class JSON_API StreamWriterBuilder : public StreamWriter::Factory {
public:
// Note: We use a Json::Value so that we can add data-members to this class
// without a major version bump.
/** Configuration of this builder.
Available settings (case-sensitive):
- "commentStyle": "None" or "All"
- "indentation": "<anything>"
- "enableYAMLCompatibility": false or true
- slightly change the whitespace around colons
- "dropNullPlaceholders": false or true
- Drop the "null" string from the writer's output for nullValues.
Strictly speaking, this is not valid JSON. But when the output is being
fed to a browser's Javascript, it makes for smaller output and the
browser can handle the output just fine.
You can examine 'settings_` yourself
to see the defaults. You can also write and read them just like any
JSON Value.
\sa setDefaults()
*/
class JSON_API Writer
{
public:
virtual ~Writer();
Json::Value settings_;
virtual std::string write( const Value &root ) = 0;
};
StreamWriterBuilder();
virtual ~StreamWriterBuilder();
/** \brief Outputs a Value in <a HREF="http://www.json.org">JSON</a> format without formatting (not human friendly).
*
* The JSON document is written in a single line. It is not intended for 'human' consumption,
* but may be usefull to support feature such as RPC where bandwith is limited.
* \sa Reader, Value
*/
class JSON_API FastWriter : public Writer
{
public:
FastWriter();
virtual ~FastWriter(){}
/**
* \throw std::exception if something goes wrong (e.g. invalid settings)
*/
virtual StreamWriter* newStreamWriter() const;
void enableYAMLCompatibility();
/** \return true if 'settings' are legal and consistent;
* otherwise, indicate bad settings via 'invalid'.
*/
bool validate(Json::Value* invalid) const;
/** A simple way to update a specific setting.
*/
Value& operator[](std::string key);
public: // overridden from Writer
virtual std::string write( const Value &root );
/** Called by ctor, but you can use this to reset settings_.
* \pre 'settings' != NULL (but Json::null is fine)
* \remark Defaults:
* \snippet src/lib_json/json_writer.cpp StreamWriterBuilderDefaults
*/
static void setDefaults(Json::Value* settings);
};
private:
void writeValue( const Value &value );
/** \brief Abstract class for writers.
* \deprecated Use StreamWriter. (And really, this is an implementation detail.)
*/
class JSON_API Writer {
public:
virtual ~Writer();
std::string document_;
bool yamlCompatiblityEnabled_;
};
virtual std::string write(const Value& root) = 0;
};
/** \brief Writes a Value in <a HREF="http://www.json.org">JSON</a> format in a human friendly way.
*
* The rules for line break and indent are as follow:
* - Object value:
* - if empty then print {} without indent and line break
* - if not empty the print '{', line break & indent, print one value per line
* and then unindent and line break and print '}'.
* - Array value:
* - if empty then print [] without indent and line break
* - if the array contains no object value, empty array or some other value types,
* and all the values fit on one lines, then print the array on a single line.
* - otherwise, it the values do not fit on one line, or the array contains
* object or non empty array, then print one value per line.
*
* If the Value have comments then they are outputed according to their #CommentPlacement.
*
* \sa Reader, Value, Value::setComment()
*/
class JSON_API StyledWriter: public Writer
{
public:
StyledWriter();
virtual ~StyledWriter(){}
/** \brief Outputs a Value in <a HREF="http://www.json.org">JSON</a> format
*without formatting (not human friendly).
*
* The JSON document is written in a single line. It is not intended for 'human'
*consumption,
* but may be usefull to support feature such as RPC where bandwith is limited.
* \sa Reader, Value
* \deprecated Use StreamWriterBuilder.
*/
class JSON_API FastWriter : public Writer {
public: // overridden from Writer
/** \brief Serialize a Value in <a HREF="http://www.json.org">JSON</a> format.
* \param root Value to serialize.
* \return String containing the JSON document that represents the root value.
*/
virtual std::string write( const Value &root );
public:
FastWriter();
virtual ~FastWriter() {}
private:
void writeValue( const Value &value );
void writeArrayValue( const Value &value );
bool isMultineArray( const Value &value );
void pushValue( const std::string &value );
void writeIndent();
void writeWithIndent( const std::string &value );
void indent();
void unindent();
void writeCommentBeforeValue( const Value &root );
void writeCommentAfterValueOnSameLine( const Value &root );
bool hasCommentForValue( const Value &value );
static std::string normalizeEOL( const std::string &text );
void enableYAMLCompatibility();
typedef std::vector<std::string> ChildValues;
public: // overridden from Writer
virtual std::string write(const Value& root);
ChildValues childValues_;
std::string document_;
std::string indentString_;
int rightMargin_;
int indentSize_;
bool addChildValues_;
};
private:
void writeValue(const Value& value);
/** \brief Writes a Value in <a HREF="http://www.json.org">JSON</a> format in a human friendly way,
to a stream rather than to a string.
*
* The rules for line break and indent are as follow:
* - Object value:
* - if empty then print {} without indent and line break
* - if not empty the print '{', line break & indent, print one value per line
* and then unindent and line break and print '}'.
* - Array value:
* - if empty then print [] without indent and line break
* - if the array contains no object value, empty array or some other value types,
* and all the values fit on one lines, then print the array on a single line.
* - otherwise, it the values do not fit on one line, or the array contains
* object or non empty array, then print one value per line.
*
* If the Value have comments then they are outputed according to their #CommentPlacement.
*
* \param indentation Each level will be indented by this amount extra.
* \sa Reader, Value, Value::setComment()
*/
class JSON_API StyledStreamWriter
{
public:
StyledStreamWriter( std::string indentation="\t" );
~StyledStreamWriter(){}
std::string document_;
bool yamlCompatiblityEnabled_;
};
public:
/** \brief Serialize a Value in <a HREF="http://www.json.org">JSON</a> format.
* \param out Stream to write to. (Can be ostringstream, e.g.)
* \param root Value to serialize.
* \note There is no point in deriving from Writer, since write() should not return a value.
*/
void write( std::ostream &out, const Value &root );
/** \brief Writes a Value in <a HREF="http://www.json.org">JSON</a> format in a
*human friendly way.
*
* The rules for line break and indent are as follow:
* - Object value:
* - if empty then print {} without indent and line break
* - if not empty the print '{', line break & indent, print one value per
*line
* and then unindent and line break and print '}'.
* - Array value:
* - if empty then print [] without indent and line break
* - if the array contains no object value, empty array or some other value
*types,
* and all the values fit on one lines, then print the array on a single
*line.
* - otherwise, it the values do not fit on one line, or the array contains
* object or non empty array, then print one value per line.
*
* If the Value have comments then they are outputed according to their
*#CommentPlacement.
*
* \sa Reader, Value, Value::setComment()
* \deprecated Use StreamWriterBuilder.
*/
class JSON_API StyledWriter : public Writer {
public:
StyledWriter();
virtual ~StyledWriter() {}
private:
void writeValue( const Value &value );
void writeArrayValue( const Value &value );
bool isMultineArray( const Value &value );
void pushValue( const std::string &value );
void writeIndent();
void writeWithIndent( const std::string &value );
void indent();
void unindent();
void writeCommentBeforeValue( const Value &root );
void writeCommentAfterValueOnSameLine( const Value &root );
bool hasCommentForValue( const Value &value );
static std::string normalizeEOL( const std::string &text );
public: // overridden from Writer
/** \brief Serialize a Value in <a HREF="http://www.json.org">JSON</a> format.
* \param root Value to serialize.
* \return String containing the JSON document that represents the root value.
*/
virtual std::string write(const Value& root);
typedef std::vector<std::string> ChildValues;
private:
void writeValue(const Value& value);
void writeArrayValue(const Value& value);
bool isMultineArray(const Value& value);
void pushValue(const std::string& value);
void writeIndent();
void writeWithIndent(const std::string& value);
void indent();
void unindent();
void writeCommentBeforeValue(const Value& root);
void writeCommentAfterValueOnSameLine(const Value& root);
bool hasCommentForValue(const Value& value);
static std::string normalizeEOL(const std::string& text);
ChildValues childValues_;
std::ostream* document_;
std::string indentString_;
int rightMargin_;
std::string indentation_;
bool addChildValues_;
};
typedef std::vector<std::string> ChildValues;
std::string JSON_API valueToString( Int value );
std::string JSON_API valueToString( UInt value );
std::string JSON_API valueToString( double value );
std::string JSON_API valueToString( bool value );
std::string JSON_API valueToQuotedString( const char *value );
ChildValues childValues_;
std::string document_;
std::string indentString_;
int rightMargin_;
int indentSize_;
bool addChildValues_;
};
/// \brief Output using the StyledStreamWriter.
/// \see Json::operator>>()
std::ostream& operator<<( std::ostream&, const Value &root );
/** \brief Writes a Value in <a HREF="http://www.json.org">JSON</a> format in a
human friendly way,
to a stream rather than to a string.
*
* The rules for line break and indent are as follow:
* - Object value:
* - if empty then print {} without indent and line break
* - if not empty the print '{', line break & indent, print one value per
line
* and then unindent and line break and print '}'.
* - Array value:
* - if empty then print [] without indent and line break
* - if the array contains no object value, empty array or some other value
types,
* and all the values fit on one lines, then print the array on a single
line.
* - otherwise, it the values do not fit on one line, or the array contains
* object or non empty array, then print one value per line.
*
* If the Value have comments then they are outputed according to their
#CommentPlacement.
*
* \param indentation Each level will be indented by this amount extra.
* \sa Reader, Value, Value::setComment()
* \deprecated Use StreamWriterBuilder.
*/
class JSON_API StyledStreamWriter {
public:
StyledStreamWriter(std::string indentation = "\t");
~StyledStreamWriter() {}
public:
/** \brief Serialize a Value in <a HREF="http://www.json.org">JSON</a> format.
* \param out Stream to write to. (Can be ostringstream, e.g.)
* \param root Value to serialize.
* \note There is no point in deriving from Writer, since write() should not
* return a value.
*/
void write(std::ostream& out, const Value& root);
private:
void writeValue(const Value& value);
void writeArrayValue(const Value& value);
bool isMultineArray(const Value& value);
void pushValue(const std::string& value);
void writeIndent();
void writeWithIndent(const std::string& value);
void indent();
void unindent();
void writeCommentBeforeValue(const Value& root);
void writeCommentAfterValueOnSameLine(const Value& root);
bool hasCommentForValue(const Value& value);
static std::string normalizeEOL(const std::string& text);
typedef std::vector<std::string> ChildValues;
ChildValues childValues_;
std::ostream* document_;
std::string indentString_;
int rightMargin_;
std::string indentation_;
bool addChildValues_ : 1;
bool indented_ : 1;
};
#if defined(JSON_HAS_INT64)
std::string JSON_API valueToString(Int value);
std::string JSON_API valueToString(UInt value);
#endif // if defined(JSON_HAS_INT64)
std::string JSON_API valueToString(LargestInt value);
std::string JSON_API valueToString(LargestUInt value);
std::string JSON_API valueToString(double value);
std::string JSON_API valueToString(bool value);
std::string JSON_API valueToQuotedString(const char* value);
/// \brief Output using the StyledStreamWriter.
/// \see Json::operator>>()
JSON_API std::ostream& operator<<(std::ostream&, const Value& root);
} // namespace Json
#if defined(JSONCPP_DISABLE_DLL_INTERFACE_WARNING)
#pragma warning(pop)
#endif // if defined(JSONCPP_DISABLE_DLL_INTERFACE_WARNING)
#endif // JSON_WRITER_H_INCLUDED
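A minimal sketch (hypothetical toCompactJson() helper) of the writeString() convenience declared above, as an alternative to streaming through StreamWriter::write() as in the class comment's example; an empty "indentation" setting is assumed here to yield single-line output.

#include <json/writer.h>
#include <string>

// Hypothetical helper: serialize a Value to a compact std::string.
std::string toCompactJson(const Json::Value& root) {
  Json::StreamWriterBuilder builder;
  builder["commentStyle"] = "None";
  builder["indentation"] = "";  // assumed: empty indentation gives compact output
  return Json::writeString(builder, root);
}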

Visual Studio 2010 solution file (new file)

@@ -0,0 +1,42 @@

Microsoft Visual Studio Solution File, Format Version 11.00
# Visual Studio 2010
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "lib_json", "lib_json.vcxproj", "{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}"
EndProject
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "jsontest", "jsontest.vcxproj", "{25AF2DD2-D396-4668-B188-488C33B8E620}"
EndProject
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "test_lib_json", "test_lib_json.vcxproj", "{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}"
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
Debug|Win32 = Debug|Win32
Debug|x64 = Debug|x64
Release|Win32 = Release|Win32
Release|x64 = Release|x64
EndGlobalSection
GlobalSection(ProjectConfigurationPlatforms) = postSolution
{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}.Debug|Win32.ActiveCfg = Debug|Win32
{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}.Debug|Win32.Build.0 = Debug|Win32
{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}.Debug|x64.ActiveCfg = Debug|x64
{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}.Debug|x64.Build.0 = Debug|x64
{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}.Release|Win32.ActiveCfg = Release|Win32
{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}.Release|Win32.Build.0 = Release|Win32
{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}.Release|x64.ActiveCfg = Release|x64
{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}.Release|x64.Build.0 = Release|x64
{25AF2DD2-D396-4668-B188-488C33B8E620}.Debug|Win32.ActiveCfg = Debug|Win32
{25AF2DD2-D396-4668-B188-488C33B8E620}.Debug|Win32.Build.0 = Debug|Win32
{25AF2DD2-D396-4668-B188-488C33B8E620}.Debug|x64.ActiveCfg = Debug|Win32
{25AF2DD2-D396-4668-B188-488C33B8E620}.Release|Win32.ActiveCfg = Release|Win32
{25AF2DD2-D396-4668-B188-488C33B8E620}.Release|Win32.Build.0 = Release|Win32
{25AF2DD2-D396-4668-B188-488C33B8E620}.Release|x64.ActiveCfg = Release|Win32
{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Debug|Win32.ActiveCfg = Debug|Win32
{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Debug|Win32.Build.0 = Debug|Win32
{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Debug|x64.ActiveCfg = Debug|Win32
{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Release|Win32.ActiveCfg = Release|Win32
{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Release|Win32.Build.0 = Release|Win32
{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Release|x64.ActiveCfg = Release|Win32
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
EndGlobalSection
EndGlobal

jsontest.vcxproj (new file)

@@ -0,0 +1,96 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup Label="ProjectConfigurations">
<ProjectConfiguration Include="Debug|Win32">
<Configuration>Debug</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|Win32">
<Configuration>Release</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
</ItemGroup>
<PropertyGroup Label="Globals">
<ProjectGuid>{25AF2DD2-D396-4668-B188-488C33B8E620}</ProjectGuid>
<Keyword>Win32Proj</Keyword>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.Default.props" />
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.props" />
<ImportGroup Label="ExtensionSettings">
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<PropertyGroup Label="UserMacros" />
<PropertyGroup>
<_ProjectFileVersion>10.0.40219.1</_ProjectFileVersion>
<OutDir Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">../../build/vs71/debug/jsontest\</OutDir>
<IntDir Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">../../build/vs71/debug/jsontest\</IntDir>
<LinkIncremental Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">true</LinkIncremental>
<OutDir Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">../../build/vs71/release/jsontest\</OutDir>
<IntDir Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">../../build/vs71/release/jsontest\</IntDir>
<LinkIncremental Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">false</LinkIncremental>
</PropertyGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
<ClCompile>
<Optimization>Disabled</Optimization>
<AdditionalIncludeDirectories>../../include;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
<PreprocessorDefinitions>WIN32;_DEBUG;_CONSOLE;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<MinimalRebuild>true</MinimalRebuild>
<BasicRuntimeChecks>EnableFastChecks</BasicRuntimeChecks>
<RuntimeLibrary>MultiThreadedDebug</RuntimeLibrary>
<PrecompiledHeader>
</PrecompiledHeader>
<WarningLevel>Level3</WarningLevel>
<DebugInformationFormat>EditAndContinue</DebugInformationFormat>
</ClCompile>
<Link>
<OutputFile>$(OutDir)jsontest.exe</OutputFile>
<GenerateDebugInformation>true</GenerateDebugInformation>
<ProgramDatabaseFile>$(OutDir)jsontest.pdb</ProgramDatabaseFile>
<SubSystem>Console</SubSystem>
<TargetMachine>MachineX86</TargetMachine>
</Link>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
<ClCompile>
<AdditionalIncludeDirectories>../../include;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
<PreprocessorDefinitions>WIN32;NDEBUG;_CONSOLE;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<RuntimeLibrary>MultiThreaded</RuntimeLibrary>
<PrecompiledHeader>
</PrecompiledHeader>
<WarningLevel>Level3</WarningLevel>
<DebugInformationFormat>ProgramDatabase</DebugInformationFormat>
</ClCompile>
<Link>
<OutputFile>$(OutDir)jsontest.exe</OutputFile>
<GenerateDebugInformation>true</GenerateDebugInformation>
<SubSystem>Console</SubSystem>
<OptimizeReferences>true</OptimizeReferences>
<EnableCOMDATFolding>true</EnableCOMDATFolding>
<TargetMachine>MachineX86</TargetMachine>
</Link>
</ItemDefinitionGroup>
<ItemGroup>
<ClCompile Include="..\..\src\jsontestrunner\main.cpp" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="lib_json.vcxproj">
<Project>{1e6c2c1c-6453-4129-ae3f-0ee8e6599c89}</Project>
</ProjectReference>
</ItemGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.targets" />
<ImportGroup Label="ExtensionTargets">
</ImportGroup>
</Project>

jsontest.vcxproj.filters (new file)

@@ -0,0 +1,13 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup>
<Filter Include="Source Files">
<UniqueIdentifier>{903591b3-ade3-4ce4-b1f9-1e175e62b014}</UniqueIdentifier>
</Filter>
</ItemGroup>
<ItemGroup>
<ClCompile Include="..\..\src\jsontestrunner\main.cpp">
<Filter>Source Files</Filter>
</ClCompile>
</ItemGroup>
</Project>

lib_json.vcxproj (new file)

@@ -0,0 +1,143 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup Label="ProjectConfigurations">
<ProjectConfiguration Include="Debug|Win32">
<Configuration>Debug</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Debug|x64">
<Configuration>Debug</Configuration>
<Platform>x64</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|Win32">
<Configuration>Release</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|x64">
<Configuration>Release</Configuration>
<Platform>x64</Platform>
</ProjectConfiguration>
</ItemGroup>
<ItemGroup>
<ClCompile Include="..\..\src\lib_json\json_reader.cpp" />
<ClCompile Include="..\..\src\lib_json\json_value.cpp" />
<ClCompile Include="..\..\src\lib_json\json_writer.cpp" />
</ItemGroup>
<ItemGroup>
<ClInclude Include="..\..\include\json\reader.h" />
<ClInclude Include="..\..\include\json\value.h" />
<ClInclude Include="..\..\include\json\writer.h" />
</ItemGroup>
<PropertyGroup Label="Globals">
<ProjectGuid>{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}</ProjectGuid>
<Keyword>Win32Proj</Keyword>
<RootNamespace>jsoncpp</RootNamespace>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.Default.props" />
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" Label="Configuration">
<ConfigurationType>StaticLibrary</ConfigurationType>
<UseDebugLibraries>true</UseDebugLibraries>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'" Label="Configuration">
<ConfigurationType>StaticLibrary</ConfigurationType>
<UseDebugLibraries>true</UseDebugLibraries>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="Configuration">
<ConfigurationType>StaticLibrary</ConfigurationType>
<UseDebugLibraries>false</UseDebugLibraries>
<WholeProgramOptimization>true</WholeProgramOptimization>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'" Label="Configuration">
<ConfigurationType>StaticLibrary</ConfigurationType>
<UseDebugLibraries>false</UseDebugLibraries>
<WholeProgramOptimization>true</WholeProgramOptimization>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.props" />
<ImportGroup Label="ExtensionSettings">
</ImportGroup>
<ImportGroup Label="PropertySheets" Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Label="PropertySheets" Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<PropertyGroup Label="UserMacros" />
<PropertyGroup />
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
<ClCompile>
<PrecompiledHeader>NotUsing</PrecompiledHeader>
<WarningLevel>Level3</WarningLevel>
<Optimization>Disabled</Optimization>
<PreprocessorDefinitions>WIN32;_DEBUG;_LIB;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<AdditionalIncludeDirectories>../../include</AdditionalIncludeDirectories>
<RuntimeLibrary>MultiThreadedDebug</RuntimeLibrary>
</ClCompile>
<Link>
<SubSystem>Windows</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
</Link>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">
<ClCompile>
<PrecompiledHeader>NotUsing</PrecompiledHeader>
<WarningLevel>Level3</WarningLevel>
<Optimization>Disabled</Optimization>
<PreprocessorDefinitions>WIN32;_DEBUG;_LIB;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<AdditionalIncludeDirectories>../../include</AdditionalIncludeDirectories>
<RuntimeLibrary>MultiThreadedDebug</RuntimeLibrary>
</ClCompile>
<Link>
<SubSystem>Windows</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
</Link>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
<ClCompile>
<WarningLevel>Level3</WarningLevel>
<PrecompiledHeader>NotUsing</PrecompiledHeader>
<Optimization>MaxSpeed</Optimization>
<FunctionLevelLinking>true</FunctionLevelLinking>
<IntrinsicFunctions>true</IntrinsicFunctions>
<PreprocessorDefinitions>WIN32;NDEBUG;_LIB;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<AdditionalIncludeDirectories>../../include</AdditionalIncludeDirectories>
<RuntimeLibrary>MultiThreaded</RuntimeLibrary>
</ClCompile>
<Link>
<SubSystem>Windows</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
<EnableCOMDATFolding>true</EnableCOMDATFolding>
<OptimizeReferences>true</OptimizeReferences>
</Link>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'">
<ClCompile>
<WarningLevel>Level3</WarningLevel>
<PrecompiledHeader>NotUsing</PrecompiledHeader>
<Optimization>MaxSpeed</Optimization>
<FunctionLevelLinking>true</FunctionLevelLinking>
<IntrinsicFunctions>true</IntrinsicFunctions>
<PreprocessorDefinitions>WIN32;NDEBUG;_LIB;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<AdditionalIncludeDirectories>../../include</AdditionalIncludeDirectories>
<RuntimeLibrary>MultiThreaded</RuntimeLibrary>
</ClCompile>
<Link>
<SubSystem>Windows</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
<EnableCOMDATFolding>true</EnableCOMDATFolding>
<OptimizeReferences>true</OptimizeReferences>
</Link>
</ItemDefinitionGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.targets" />
<ImportGroup Label="ExtensionTargets">
</ImportGroup>
</Project>

View File

@@ -0,0 +1,33 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup>
<Filter Include="Header Files">
<UniqueIdentifier>{c110bc57-c46e-476c-97ea-84d8014f431c}</UniqueIdentifier>
</Filter>
<Filter Include="Source Files">
<UniqueIdentifier>{ed718592-5acf-47b5-8f2b-b8224590da6a}</UniqueIdentifier>
</Filter>
</ItemGroup>
<ItemGroup>
<ClCompile Include="..\..\src\lib_json\json_reader.cpp">
<Filter>Source Files</Filter>
</ClCompile>
<ClCompile Include="..\..\src\lib_json\json_value.cpp">
<Filter>Source Files</Filter>
</ClCompile>
<ClCompile Include="..\..\src\lib_json\json_writer.cpp">
<Filter>Source Files</Filter>
</ClCompile>
</ItemGroup>
<ItemGroup>
<ClInclude Include="..\..\include\json\reader.h">
<Filter>Header Files</Filter>
</ClInclude>
<ClInclude Include="..\..\include\json\value.h">
<Filter>Header Files</Filter>
</ClInclude>
<ClInclude Include="..\..\include\json\writer.h">
<Filter>Header Files</Filter>
</ClInclude>
</ItemGroup>
</Project>

View File

@@ -0,0 +1,109 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup Label="ProjectConfigurations">
<ProjectConfiguration Include="Debug|Win32">
<Configuration>Debug</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|Win32">
<Configuration>Release</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
</ItemGroup>
<PropertyGroup Label="Globals">
<ProjectGuid>{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}</ProjectGuid>
<RootNamespace>test_lib_json</RootNamespace>
<Keyword>Win32Proj</Keyword>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.Default.props" />
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.props" />
<ImportGroup Label="ExtensionSettings">
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<PropertyGroup Label="UserMacros" />
<PropertyGroup>
<_ProjectFileVersion>10.0.40219.1</_ProjectFileVersion>
<OutDir Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">../../build/vs71/debug/test_lib_json\</OutDir>
<IntDir Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">../../build/vs71/debug/test_lib_json\</IntDir>
<LinkIncremental Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">true</LinkIncremental>
<OutDir Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">../../build/vs71/release/test_lib_json\</OutDir>
<IntDir Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">../../build/vs71/release/test_lib_json\</IntDir>
<LinkIncremental Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">false</LinkIncremental>
</PropertyGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
<ClCompile>
<Optimization>Disabled</Optimization>
<AdditionalIncludeDirectories>../../include;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
<PreprocessorDefinitions>WIN32;_DEBUG;_CONSOLE;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<MinimalRebuild>true</MinimalRebuild>
<BasicRuntimeChecks>EnableFastChecks</BasicRuntimeChecks>
<RuntimeLibrary>MultiThreadedDebug</RuntimeLibrary>
<PrecompiledHeader>
</PrecompiledHeader>
<WarningLevel>Level3</WarningLevel>
<DebugInformationFormat>EditAndContinue</DebugInformationFormat>
</ClCompile>
<Link>
<OutputFile>$(OutDir)test_lib_json.exe</OutputFile>
<GenerateDebugInformation>true</GenerateDebugInformation>
<ProgramDatabaseFile>$(OutDir)test_lib_json.pdb</ProgramDatabaseFile>
<SubSystem>Console</SubSystem>
<TargetMachine>MachineX86</TargetMachine>
</Link>
<PostBuildEvent>
<Message>Running all unit tests</Message>
<Command>$(TargetPath)</Command>
</PostBuildEvent>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
<ClCompile>
<AdditionalIncludeDirectories>../../include;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
<PreprocessorDefinitions>WIN32;NDEBUG;_CONSOLE;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<RuntimeLibrary>MultiThreaded</RuntimeLibrary>
<PrecompiledHeader>
</PrecompiledHeader>
<WarningLevel>Level3</WarningLevel>
<DebugInformationFormat>ProgramDatabase</DebugInformationFormat>
</ClCompile>
<Link>
<OutputFile>$(OutDir)test_lib_json.exe</OutputFile>
<GenerateDebugInformation>true</GenerateDebugInformation>
<SubSystem>Console</SubSystem>
<OptimizeReferences>true</OptimizeReferences>
<EnableCOMDATFolding>true</EnableCOMDATFolding>
<TargetMachine>MachineX86</TargetMachine>
</Link>
<PostBuildEvent>
<Message>Running all unit tests</Message>
<Command>$(TargetPath)</Command>
</PostBuildEvent>
</ItemDefinitionGroup>
<ItemGroup>
<ClCompile Include="..\..\src\test_lib_json\jsontest.cpp" />
<ClCompile Include="..\..\src\test_lib_json\main.cpp" />
</ItemGroup>
<ItemGroup>
<ClInclude Include="..\..\src\test_lib_json\jsontest.h" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="lib_json.vcxproj">
<Project>{1e6c2c1c-6453-4129-ae3f-0ee8e6599c89}</Project>
</ProjectReference>
</ItemGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.targets" />
<ImportGroup Label="ExtensionTargets">
</ImportGroup>
</Project>

View File

@@ -0,0 +1,24 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup>
<ClCompile Include="..\..\src\test_lib_json\jsontest.cpp">
<Filter>Source Filter</Filter>
</ClCompile>
<ClCompile Include="..\..\src\test_lib_json\main.cpp">
<Filter>Source Filter</Filter>
</ClCompile>
</ItemGroup>
<ItemGroup>
<Filter Include="Source Filter">
<UniqueIdentifier>{bf40cbfc-8e98-40b4-b9f3-7e8d579cbae2}</UniqueIdentifier>
</Filter>
<Filter Include="Header Files">
<UniqueIdentifier>{5fd39074-89e6-4939-aa3f-694fefd296b1}</UniqueIdentifier>
</Filter>
</ItemGroup>
<ItemGroup>
<ClInclude Include="..\..\src\test_lib_json\jsontest.h">
<Filter>Header Files</Filter>
</ClInclude>
</ItemGroup>
</Project>

View File

@@ -178,15 +178,6 @@
<File
RelativePath="..\..\include\json\json.h">
</File>
<File
RelativePath="..\..\src\lib_json\json_batchallocator.h">
</File>
<File
RelativePath="..\..\src\lib_json\json_internalarray.inl">
</File>
<File
RelativePath="..\..\src\lib_json\json_internalmap.inl">
</File>
<File
RelativePath="..\..\src\lib_json\json_reader.cpp">
</File>

View File

@@ -3,11 +3,18 @@
Requires Python 2.6
Example of invocation (use to test the script):
python makerelease.py --force --retag --platform=msvc6,msvc71,msvc80,mingw -ublep 0.5.0 0.6.0-dev
python makerelease.py --platform=msvc6,msvc71,msvc80,msvc90,mingw -ublep 0.6.0 0.7.0-dev
When testing this script:
python makerelease.py --force --retag --platform=msvc6,msvc71,msvc80,mingw -ublep test-0.6.0 test-0.6.1-dev
Example of invocation when doing a release:
python makerelease.py 0.5.0 0.6.0-dev
Note: This was for Subversion. Now that we are in GitHub, we do not
need to build versioned tarballs anymore, so makerelease.py is defunct.
"""
from __future__ import print_function
import os.path
import subprocess
import sys
@@ -20,146 +27,147 @@ import tempfile
import os
import time
from devtools import antglob, fixeol, tarball
import amalgamate
SVN_ROOT = 'https://jsoncpp.svn.sourceforge.net/svnroot/jsoncpp/'
SVN_TAG_ROOT = SVN_ROOT + 'tags/jsoncpp'
SCONS_LOCAL_URL = 'http://sourceforge.net/projects/scons/files/scons-local/1.2.0/scons-local-1.2.0.tar.gz/download'
SOURCEFORGE_PROJECT = 'jsoncpp'
def set_version( version ):
def set_version(version):
with open('version','wb') as f:
f.write( version.strip() )
f.write(version.strip())
def rmdir_if_exist( dir_path ):
if os.path.isdir( dir_path ):
shutil.rmtree( dir_path )
def rmdir_if_exist(dir_path):
if os.path.isdir(dir_path):
shutil.rmtree(dir_path)
class SVNError(Exception):
pass
def svn_command( command, *args ):
def svn_command(command, *args):
cmd = ['svn', '--non-interactive', command] + list(args)
print 'Running:', ' '.join( cmd )
process = subprocess.Popen( cmd,
print('Running:', ' '.join(cmd))
process = subprocess.Popen(cmd,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT )
stderr=subprocess.STDOUT)
stdout = process.communicate()[0]
if process.returncode:
error = SVNError( 'SVN command failed:\n' + stdout )
error = SVNError('SVN command failed:\n' + stdout)
error.returncode = process.returncode
raise error
return stdout
def check_no_pending_commit():
"""Checks that there is no pending commit in the sandbox."""
stdout = svn_command( 'status', '--xml' )
etree = ElementTree.fromstring( stdout )
stdout = svn_command('status', '--xml')
etree = ElementTree.fromstring(stdout)
msg = []
for entry in etree.getiterator( 'entry' ):
for entry in etree.getiterator('entry'):
path = entry.get('path')
status = entry.find('wc-status').get('item')
if status != 'unversioned' and path != 'version':
msg.append( 'File "%s" has pending change (status="%s")' % (path, status) )
msg.append('File "%s" has pending change (status="%s")' % (path, status))
if msg:
msg.insert(0, 'Pending change to commit found in sandbox. Commit them first!' )
return '\n'.join( msg )
msg.insert(0, 'Pending change to commit found in sandbox. Commit them first!')
return '\n'.join(msg)
def svn_join_url( base_url, suffix ):
def svn_join_url(base_url, suffix):
if not base_url.endswith('/'):
base_url += '/'
if suffix.startswith('/'):
suffix = suffix[1:]
return base_url + suffix
def svn_check_if_tag_exist( tag_url ):
def svn_check_if_tag_exist(tag_url):
"""Checks if a tag exist.
Returns: True if the tag exist, False otherwise.
"""
try:
list_stdout = svn_command( 'list', tag_url )
except SVNError, e:
list_stdout = svn_command('list', tag_url)
except SVNError as e:
if e.returncode != 1 or not str(e).find('tag_url'):
raise e
# otherwise ignore error, meaning tag does not exist
return False
return True
def svn_commit( message ):
def svn_commit(message):
"""Commit the sandbox, providing the specified comment.
"""
svn_command( 'ci', '-m', message )
svn_command('ci', '-m', message)
def svn_tag_sandbox( tag_url, message ):
def svn_tag_sandbox(tag_url, message):
"""Makes a tag based on the sandbox revisions.
"""
svn_command( 'copy', '-m', message, '.', tag_url )
svn_command('copy', '-m', message, '.', tag_url)
def svn_remove_tag( tag_url, message ):
def svn_remove_tag(tag_url, message):
"""Removes an existing tag.
"""
svn_command( 'delete', '-m', message, tag_url )
svn_command('delete', '-m', message, tag_url)
def svn_export( tag_url, export_dir ):
def svn_export(tag_url, export_dir):
"""Exports the tag_url revision to export_dir.
Target directory, including its parents, is created if it does not exist.
If the directory export_dir exists, it is deleted before the export proceeds.
"""
rmdir_if_exist( export_dir )
svn_command( 'export', tag_url, export_dir )
rmdir_if_exist(export_dir)
svn_command('export', tag_url, export_dir)
def fix_sources_eol( dist_dir ):
def fix_sources_eol(dist_dir):
"""Set file EOL for tarball distribution.
"""
print 'Preparing exported source file EOL for distribution...'
print('Preparing exported source file EOL for distribution...')
prune_dirs = antglob.prune_dirs + 'scons-local* ./build* ./libs ./dist'
win_sources = antglob.glob( dist_dir,
win_sources = antglob.glob(dist_dir,
includes = '**/*.sln **/*.vcproj',
prune_dirs = prune_dirs )
unix_sources = antglob.glob( dist_dir,
prune_dirs = prune_dirs)
unix_sources = antglob.glob(dist_dir,
includes = '''**/*.h **/*.cpp **/*.inl **/*.txt **/*.dox **/*.py **/*.html **/*.in
sconscript *.json *.expected AUTHORS LICENSE''',
excludes = antglob.default_excludes + 'scons.py sconsign.py scons-*',
prune_dirs = prune_dirs )
prune_dirs = prune_dirs)
for path in win_sources:
fixeol.fix_source_eol( path, is_dry_run = False, verbose = True, eol = '\r\n' )
fixeol.fix_source_eol(path, is_dry_run = False, verbose = True, eol = '\r\n')
for path in unix_sources:
fixeol.fix_source_eol( path, is_dry_run = False, verbose = True, eol = '\n' )
fixeol.fix_source_eol(path, is_dry_run = False, verbose = True, eol = '\n')
def download( url, target_path ):
def download(url, target_path):
"""Download file represented by url to target_path.
"""
f = urllib2.urlopen( url )
f = urllib2.urlopen(url)
try:
data = f.read()
finally:
f.close()
fout = open( target_path, 'wb' )
fout = open(target_path, 'wb')
try:
fout.write( data )
fout.write(data)
finally:
fout.close()
def check_compile( distcheck_top_dir, platform ):
def check_compile(distcheck_top_dir, platform):
cmd = [sys.executable, 'scons.py', 'platform=%s' % platform, 'check']
print 'Running:', ' '.join( cmd )
log_path = os.path.join( distcheck_top_dir, 'build-%s.log' % platform )
flog = open( log_path, 'wb' )
print('Running:', ' '.join(cmd))
log_path = os.path.join(distcheck_top_dir, 'build-%s.log' % platform)
flog = open(log_path, 'wb')
try:
process = subprocess.Popen( cmd,
process = subprocess.Popen(cmd,
stdout=flog,
stderr=subprocess.STDOUT,
cwd=distcheck_top_dir )
cwd=distcheck_top_dir)
stdout = process.communicate()[0]
status = (process.returncode == 0)
finally:
flog.close()
return (status, log_path)
def write_tempfile( content, **kwargs ):
fd, path = tempfile.mkstemp( **kwargs )
f = os.fdopen( fd, 'wt' )
def write_tempfile(content, **kwargs):
fd, path = tempfile.mkstemp(**kwargs)
f = os.fdopen(fd, 'wt')
try:
f.write( content )
f.write(content)
finally:
f.close()
return path
@@ -167,34 +175,34 @@ def write_tempfile( content, **kwargs ):
class SFTPError(Exception):
pass
def run_sftp_batch( userhost, sftp, batch, retry=0 ):
path = write_tempfile( batch, suffix='.sftp', text=True )
def run_sftp_batch(userhost, sftp, batch, retry=0):
path = write_tempfile(batch, suffix='.sftp', text=True)
# psftp -agent -C blep,jsoncpp@web.sourceforge.net -batch -b batch.sftp -bc
cmd = [sftp, '-agent', '-C', '-batch', '-b', path, '-bc', userhost]
error = None
for retry_index in xrange(0, max(1,retry)):
for retry_index in range(0, max(1,retry)):
heading = retry_index == 0 and 'Running:' or 'Retrying:'
print heading, ' '.join( cmd )
process = subprocess.Popen( cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT )
print(heading, ' '.join(cmd))
process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
stdout = process.communicate()[0]
if process.returncode != 0:
error = SFTPError( 'SFTP batch failed:\n' + stdout )
error = SFTPError('SFTP batch failed:\n' + stdout)
else:
break
if error:
raise error
return stdout
def sourceforge_web_synchro( sourceforge_project, doc_dir,
user=None, sftp='sftp' ):
def sourceforge_web_synchro(sourceforge_project, doc_dir,
user=None, sftp='sftp'):
"""Notes: does not synchronize sub-directory of doc-dir.
"""
userhost = '%s,%s@web.sourceforge.net' % (user, sourceforge_project)
stdout = run_sftp_batch( userhost, sftp, """
stdout = run_sftp_batch(userhost, sftp, """
cd htdocs
dir
exit
""" )
""")
existing_paths = set()
collect = 0
for line in stdout.split('\n'):
@@ -208,36 +216,36 @@ exit
elif collect == 2:
path = line.strip().split()[-1:]
if path and path[0] not in ('.', '..'):
existing_paths.add( path[0] )
upload_paths = set( [os.path.basename(p) for p in antglob.glob( doc_dir )] )
existing_paths.add(path[0])
upload_paths = set([os.path.basename(p) for p in antglob.glob(doc_dir)])
paths_to_remove = existing_paths - upload_paths
if paths_to_remove:
print 'Removing the following file from web:'
print '\n'.join( paths_to_remove )
stdout = run_sftp_batch( userhost, sftp, """cd htdocs
print('Removing the following file from web:')
print('\n'.join(paths_to_remove))
stdout = run_sftp_batch(userhost, sftp, """cd htdocs
rm %s
exit""" % ' '.join(paths_to_remove) )
print 'Uploading %d files:' % len(upload_paths)
exit""" % ' '.join(paths_to_remove))
print('Uploading %d files:' % len(upload_paths))
batch_size = 10
upload_paths = list(upload_paths)
start_time = time.time()
for index in xrange(0,len(upload_paths),batch_size):
for index in range(0,len(upload_paths),batch_size):
paths = upload_paths[index:index+batch_size]
file_per_sec = (time.time() - start_time) / (index+1)
remaining_files = len(upload_paths) - index
remaining_sec = file_per_sec * remaining_files
print '%d/%d, ETA=%.1fs' % (index+1, len(upload_paths), remaining_sec)
run_sftp_batch( userhost, sftp, """cd htdocs
print('%d/%d, ETA=%.1fs' % (index+1, len(upload_paths), remaining_sec))
run_sftp_batch(userhost, sftp, """cd htdocs
lcd %s
mput %s
exit""" % (doc_dir, ' '.join(paths) ), retry=3 )
exit""" % (doc_dir, ' '.join(paths)), retry=3)
def sourceforge_release_tarball( sourceforge_project, paths, user=None, sftp='sftp' ):
def sourceforge_release_tarball(sourceforge_project, paths, user=None, sftp='sftp'):
userhost = '%s,%s@frs.sourceforge.net' % (user, sourceforge_project)
run_sftp_batch( userhost, sftp, """
run_sftp_batch(userhost, sftp, """
mput %s
exit
""" % (' '.join(paths),) )
""" % (' '.join(paths),))
def main():
@@ -278,91 +286,99 @@ Warning: --force should only be used when developping/testing the release script
options, args = parser.parse_args()
if len(args) != 2:
parser.error( 'release_version missing on command-line.' )
parser.error('release_version missing on command-line.')
release_version = args[0]
next_version = args[1]
if not options.platforms and not options.no_test:
parser.error( 'You must specify either --platform or --no-test option.' )
parser.error('You must specify either --platform or --no-test option.')
if options.ignore_pending_commit:
msg = ''
else:
msg = check_no_pending_commit()
if not msg:
print 'Setting version to', release_version
set_version( release_version )
svn_commit( 'Release ' + release_version )
tag_url = svn_join_url( SVN_TAG_ROOT, release_version )
if svn_check_if_tag_exist( tag_url ):
print('Setting version to', release_version)
set_version(release_version)
svn_commit('Release ' + release_version)
tag_url = svn_join_url(SVN_TAG_ROOT, release_version)
if svn_check_if_tag_exist(tag_url):
if options.retag_release:
svn_remove_tag( tag_url, 'Overwriting previous tag' )
svn_remove_tag(tag_url, 'Overwriting previous tag')
else:
print 'Aborting, tag %s already exist. Use --retag to overwrite it!' % tag_url
sys.exit( 1 )
svn_tag_sandbox( tag_url, 'Release ' + release_version )
print('Aborting, tag %s already exist. Use --retag to overwrite it!' % tag_url)
sys.exit(1)
svn_tag_sandbox(tag_url, 'Release ' + release_version)
print 'Generated doxygen document...'
print('Generated doxygen document...')
## doc_dirname = r'jsoncpp-api-html-0.5.0'
## doc_tarball_path = r'e:\prg\vc\Lib\jsoncpp-trunk\dist\jsoncpp-api-html-0.5.0.tar.gz'
doc_tarball_path, doc_dirname = doxybuild.build_doc( options, make_release=True )
doc_tarball_path, doc_dirname = doxybuild.build_doc(options, make_release=True)
doc_distcheck_dir = 'dist/doccheck'
tarball.decompress( doc_tarball_path, doc_distcheck_dir )
doc_distcheck_top_dir = os.path.join( doc_distcheck_dir, doc_dirname )
tarball.decompress(doc_tarball_path, doc_distcheck_dir)
doc_distcheck_top_dir = os.path.join(doc_distcheck_dir, doc_dirname)
export_dir = 'dist/export'
svn_export( tag_url, export_dir )
fix_sources_eol( export_dir )
svn_export(tag_url, export_dir)
fix_sources_eol(export_dir)
source_dir = 'jsoncpp-src-' + release_version
source_tarball_path = 'dist/%s.tar.gz' % source_dir
print 'Generating source tarball to', source_tarball_path
tarball.make_tarball( source_tarball_path, [export_dir], export_dir, prefix_dir=source_dir )
print('Generating source tarball to', source_tarball_path)
tarball.make_tarball(source_tarball_path, [export_dir], export_dir, prefix_dir=source_dir)
amalgamation_tarball_path = 'dist/%s-amalgamation.tar.gz' % source_dir
print('Generating amalgamation source tarball to', amalgamation_tarball_path)
amalgamation_dir = 'dist/amalgamation'
amalgamate.amalgamate_source(export_dir, '%s/jsoncpp.cpp' % amalgamation_dir, 'json/json.h')
amalgamation_source_dir = 'jsoncpp-src-amalgamation' + release_version
tarball.make_tarball(amalgamation_tarball_path, [amalgamation_dir],
amalgamation_dir, prefix_dir=amalgamation_source_dir)
# Decompress source tarball, download and install scons-local
distcheck_dir = 'dist/distcheck'
distcheck_top_dir = distcheck_dir + '/' + source_dir
print 'Decompressing source tarball to', distcheck_dir
rmdir_if_exist( distcheck_dir )
tarball.decompress( source_tarball_path, distcheck_dir )
print('Decompressing source tarball to', distcheck_dir)
rmdir_if_exist(distcheck_dir)
tarball.decompress(source_tarball_path, distcheck_dir)
scons_local_path = 'dist/scons-local.tar.gz'
print 'Downloading scons-local to', scons_local_path
download( SCONS_LOCAL_URL, scons_local_path )
print 'Decompressing scons-local to', distcheck_top_dir
tarball.decompress( scons_local_path, distcheck_top_dir )
print('Downloading scons-local to', scons_local_path)
download(SCONS_LOCAL_URL, scons_local_path)
print('Decompressing scons-local to', distcheck_top_dir)
tarball.decompress(scons_local_path, distcheck_top_dir)
# Run compilation
print 'Compiling decompressed tarball'
print('Compiling decompressed tarball')
all_build_status = True
for platform in options.platforms.split(','):
print 'Testing platform:', platform
build_status, log_path = check_compile( distcheck_top_dir, platform )
print 'see build log:', log_path
print build_status and '=> ok' or '=> FAILED'
print('Testing platform:', platform)
build_status, log_path = check_compile(distcheck_top_dir, platform)
print('see build log:', log_path)
print(build_status and '=> ok' or '=> FAILED')
all_build_status = all_build_status and build_status
if not build_status:
print 'Testing failed on at least one platform, aborting...'
svn_remove_tag( tag_url, 'Removing tag due to failed testing' )
print('Testing failed on at least one platform, aborting...')
svn_remove_tag(tag_url, 'Removing tag due to failed testing')
sys.exit(1)
if options.user:
if not options.no_web:
print 'Uploading documentation using user', options.user
sourceforge_web_synchro( SOURCEFORGE_PROJECT, doc_distcheck_top_dir, user=options.user, sftp=options.sftp )
print 'Completed documentation upload'
print 'Uploading source and documentation tarballs for release using user', options.user
sourceforge_release_tarball( SOURCEFORGE_PROJECT,
print('Uploading documentation using user', options.user)
sourceforge_web_synchro(SOURCEFORGE_PROJECT, doc_distcheck_top_dir, user=options.user, sftp=options.sftp)
print('Completed documentation upload')
print('Uploading source and documentation tarballs for release using user', options.user)
sourceforge_release_tarball(SOURCEFORGE_PROJECT,
[source_tarball_path, doc_tarball_path],
user=options.user, sftp=options.sftp )
print 'Source and doc release tarballs uploaded'
user=options.user, sftp=options.sftp)
print('Source and doc release tarballs uploaded')
else:
print 'No upload user specified. Web site and download tarball were not uploaded.'
print 'Tarball can be found at:', doc_tarball_path
print('No upload user specified. Web site and download tarball were not uploaded.')
print('Tarball can be found at:', doc_tarball_path)
# Set next version number and commit
set_version( next_version )
svn_commit( 'Released ' + release_version )
set_version(next_version)
svn_commit('Released ' + release_version)
else:
sys.stderr.write( msg + '\n' )
sys.stderr.write(msg + '\n')
if __name__ == '__main__':
main()

pkg-config/jsoncpp.pc.in Normal file
View File

@@ -0,0 +1,11 @@
prefix=@CMAKE_INSTALL_PREFIX@
exec_prefix=${prefix}
libdir=${exec_prefix}/@LIBRARY_INSTALL_DIR@
includedir=${prefix}/@INCLUDE_INSTALL_DIR@
Name: jsoncpp
Description: A C++ library for interacting with JSON
Version: @JSONCPP_VERSION@
URL: https://github.com/open-source-parsers/jsoncpp
Libs: -L${libdir} -ljsoncpp
Cflags: -I${includedir}
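
For orientation, a minimal consumer of the library described by this pkg-config template. This is only a sketch; it assumes the headers and library are installed where the Cflags/Libs entries above point, and it uses only APIs that appear elsewhere in this changeset (Json::Reader, Json::StyledWriter).

#include <json/json.h>
#include <iostream>
#include <string>

int main() {
  const std::string document = "{ \"name\": \"jsoncpp\", \"version\": 1 }";
  Json::Value root;
  Json::Reader reader;
  if (!reader.parse(document, root)) {            // parse the JSON text
    std::cerr << reader.getFormattedErrorMessages();
    return 1;
  }
  Json::StyledWriter writer;
  std::cout << writer.write(root);                // pretty-print it back out
  return 0;
}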

View File

@@ -1,53 +1,53 @@
import fnmatch
import os
def generate( env ):
def Glob( env, includes = None, excludes = None, dir = '.' ):
"""Adds Glob( includes = Split( '*' ), excludes = None, dir = '.')
helper function to environment.
Globs the file-system files.
includes: list of file name patterns included in the return list when matched.
excludes: list of file name patterns excluded from the return list.
Example:
sources = env.Glob( ("*.cpp", '*.h'), "~*.cpp", "#src" )
"""
def filterFilename(path):
abs_path = os.path.join( dir, path )
if not os.path.isfile(abs_path):
return 0
fn = os.path.basename(path)
match = 0
for include in includes:
if fnmatch.fnmatchcase( fn, include ):
match = 1
break
if match == 1 and not excludes is None:
for exclude in excludes:
if fnmatch.fnmatchcase( fn, exclude ):
match = 0
break
return match
if includes is None:
includes = ('*',)
elif type(includes) in ( type(''), type(u'') ):
includes = (includes,)
if type(excludes) in ( type(''), type(u'') ):
excludes = (excludes,)
dir = env.Dir(dir).abspath
paths = os.listdir( dir )
def makeAbsFileNode( path ):
return env.File( os.path.join( dir, path ) )
nodes = filter( filterFilename, paths )
return map( makeAbsFileNode, nodes )
from SCons.Script import Environment
import fnmatch
import os
def generate(env):
def Glob(env, includes = None, excludes = None, dir = '.'):
"""Adds Glob(includes = Split('*'), excludes = None, dir = '.')
helper function to environment.
Globs the file-system files.
includes: list of file name patterns included in the return list when matched.
excludes: list of file name patterns excluded from the return list.
Example:
sources = env.Glob(("*.cpp", '*.h'), "~*.cpp", "#src")
"""
def filterFilename(path):
abs_path = os.path.join(dir, path)
if not os.path.isfile(abs_path):
return 0
fn = os.path.basename(path)
match = 0
for include in includes:
if fnmatch.fnmatchcase(fn, include):
match = 1
break
if match == 1 and not excludes is None:
for exclude in excludes:
if fnmatch.fnmatchcase(fn, exclude):
match = 0
break
return match
if includes is None:
includes = ('*',)
elif type(includes) in (type(''), type(u'')):
includes = (includes,)
if type(excludes) in (type(''), type(u'')):
excludes = (excludes,)
dir = env.Dir(dir).abspath
paths = os.listdir(dir)
def makeAbsFileNode(path):
return env.File(os.path.join(dir, path))
nodes = filter(filterFilename, paths)
return map(makeAbsFileNode, nodes)
from SCons.Script import Environment
Environment.Glob = Glob
def exists(env):
"""
Tool always exists.
"""
return True
def exists(env):
"""
Tool always exists.
"""
return True

View File

@@ -47,7 +47,7 @@ import targz
## elif token == "=":
## data[key] = list()
## else:
## append_data( data, key, new_data, token )
## append_data(data, key, new_data, token)
## new_data = True
##
## last_token = token
@@ -55,7 +55,7 @@ import targz
##
## if last_token == '\\' and token != '\n':
## new_data = False
## append_data( data, key, new_data, '\\' )
## append_data(data, key, new_data, '\\')
##
## # compress lists of len 1 into single strings
## for (k, v) in data.items():
@@ -116,7 +116,7 @@ import targz
## else:
## for pattern in file_patterns:
## sources.extend(glob.glob("/".join([node, pattern])))
## sources = map( lambda path: env.File(path), sources )
## sources = map(lambda path: env.File(path), sources)
## return sources
##
##
@@ -143,7 +143,7 @@ def srcDistEmitter(source, target, env):
## # add our output locations
## for (k, v) in output_formats.items():
## if data.get("GENERATE_" + k, v[0]) == "YES":
## targets.append(env.Dir( os.path.join(out_dir, data.get(k + "_OUTPUT", v[1]))) )
## targets.append(env.Dir(os.path.join(out_dir, data.get(k + "_OUTPUT", v[1]))))
##
## # don't clobber targets
## for node in targets:
@@ -161,14 +161,13 @@ def generate(env):
Add builders and construction variables for the
SrcDist tool.
"""
## doxyfile_scanner = env.Scanner(
## DoxySourceScan,
## doxyfile_scanner = env.Scanner(## DoxySourceScan,
## "DoxySourceScan",
## scan_check = DoxySourceScanCheck,
## )
##)
if targz.exists(env):
srcdist_builder = targz.makeBuilder( srcDistEmitter )
srcdist_builder = targz.makeBuilder(srcDistEmitter)
env['BUILDERS']['SrcDist'] = srcdist_builder

View File

@@ -1,5 +1,6 @@
import re
from SCons.Script import * # the usual scons stuff you get in a SConscript
import collections
def generate(env):
"""
@@ -25,28 +26,28 @@ def generate(env):
contents = f.read()
f.close()
except:
raise SCons.Errors.UserError, "Can't read source file %s"%sourcefile
for (k,v) in dict.items():
raise SCons.Errors.UserError("Can't read source file %s"%sourcefile)
for (k,v) in list(dict.items()):
contents = re.sub(k, v, contents)
try:
f = open(targetfile, 'wb')
f.write(contents)
f.close()
except:
raise SCons.Errors.UserError, "Can't write target file %s"%targetfile
raise SCons.Errors.UserError("Can't write target file %s"%targetfile)
return 0 # success
def subst_in_file(target, source, env):
if not env.has_key('SUBST_DICT'):
raise SCons.Errors.UserError, "SubstInFile requires SUBST_DICT to be set."
if 'SUBST_DICT' not in env:
raise SCons.Errors.UserError("SubstInFile requires SUBST_DICT to be set.")
d = dict(env['SUBST_DICT']) # copy it
for (k,v) in d.items():
if callable(v):
for (k,v) in list(d.items()):
if isinstance(v, collections.Callable):
d[k] = env.subst(v()).replace('\\','\\\\')
elif SCons.Util.is_String(v):
d[k] = env.subst(v).replace('\\','\\\\')
else:
raise SCons.Errors.UserError, "SubstInFile: key %s: %s must be a string or callable"%(k, repr(v))
raise SCons.Errors.UserError("SubstInFile: key %s: %s must be a string or callable"%(k, repr(v)))
for (t,s) in zip(target, source):
return do_subst_in_file(str(t), str(s), d)
@@ -60,8 +61,8 @@ def generate(env):
Returns original target, source tuple unchanged.
"""
d = env['SUBST_DICT'].copy() # copy it
for (k,v) in d.items():
if callable(v):
for (k,v) in list(d.items()):
if isinstance(v, collections.Callable):
d[k] = env.subst(v())
elif SCons.Util.is_String(v):
d[k]=env.subst(v)
@@ -69,7 +70,7 @@ def generate(env):
return target, source
## env.Append(TOOLS = 'substinfile') # this should be automatically done by Scons ?!?
subst_action = SCons.Action.Action( subst_in_file, subst_in_file_string )
subst_action = SCons.Action.Action(subst_in_file, subst_in_file_string)
env['BUILDERS']['SubstInFile'] = Builder(action=subst_action, emitter=subst_emitter)
def exists(env):

View File

@@ -27,9 +27,9 @@ TARGZ_DEFAULT_COMPRESSION_LEVEL = 9
if internal_targz:
def targz(target, source, env):
def archive_name( path ):
path = os.path.normpath( os.path.abspath( path ) )
common_path = os.path.commonprefix( (base_dir, path) )
def archive_name(path):
path = os.path.normpath(os.path.abspath(path))
common_path = os.path.commonprefix((base_dir, path))
archive_name = path[len(common_path):]
return archive_name
@@ -37,23 +37,23 @@ if internal_targz:
for name in names:
path = os.path.join(dirname, name)
if os.path.isfile(path):
tar.add(path, archive_name(path) )
tar.add(path, archive_name(path))
compression = env.get('TARGZ_COMPRESSION_LEVEL',TARGZ_DEFAULT_COMPRESSION_LEVEL)
base_dir = os.path.normpath( env.get('TARGZ_BASEDIR', env.Dir('.')).abspath )
base_dir = os.path.normpath(env.get('TARGZ_BASEDIR', env.Dir('.')).abspath)
target_path = str(target[0])
fileobj = gzip.GzipFile( target_path, 'wb', compression )
fileobj = gzip.GzipFile(target_path, 'wb', compression)
tar = tarfile.TarFile(os.path.splitext(target_path)[0], 'w', fileobj)
for source in source:
source_path = str(source)
if source.isdir():
os.path.walk(source_path, visit, tar)
else:
tar.add(source_path, archive_name(source_path) ) # filename, arcname
tar.add(source_path, archive_name(source_path)) # filename, arcname
tar.close()
targzAction = SCons.Action.Action(targz, varlist=['TARGZ_COMPRESSION_LEVEL','TARGZ_BASEDIR'])
def makeBuilder( emitter = None ):
def makeBuilder(emitter = None):
return SCons.Builder.Builder(action = SCons.Action.Action('$TARGZ_COM', '$TARGZ_COMSTR'),
source_factory = SCons.Node.FS.Entry,
source_scanner = SCons.Defaults.DirScanner,

src/CMakeLists.txt Normal file
View File

@@ -0,0 +1,5 @@
ADD_SUBDIRECTORY(lib_json)
IF(JSONCPP_WITH_TESTS)
ADD_SUBDIRECTORY(jsontestrunner)
ADD_SUBDIRECTORY(test_lib_json)
ENDIF(JSONCPP_WITH_TESTS)

View File

@@ -0,0 +1,28 @@
FIND_PACKAGE(PythonInterp 2.6)
IF(JSONCPP_LIB_BUILD_SHARED)
ADD_DEFINITIONS( -DJSON_DLL )
ENDIF(JSONCPP_LIB_BUILD_SHARED)
ADD_EXECUTABLE(jsontestrunner_exe
main.cpp
)
IF(JSONCPP_LIB_BUILD_SHARED)
TARGET_LINK_LIBRARIES(jsontestrunner_exe jsoncpp_lib)
ELSE(JSONCPP_LIB_BUILD_SHARED)
TARGET_LINK_LIBRARIES(jsontestrunner_exe jsoncpp_lib_static)
ENDIF(JSONCPP_LIB_BUILD_SHARED)
SET_TARGET_PROPERTIES(jsontestrunner_exe PROPERTIES OUTPUT_NAME jsontestrunner_exe)
IF(PYTHONINTERP_FOUND)
# Run end to end parser/writer tests
SET(TEST_DIR ${CMAKE_CURRENT_SOURCE_DIR}/../../test)
SET(RUNJSONTESTS_PATH ${TEST_DIR}/runjsontests.py)
ADD_CUSTOM_TARGET(jsoncpp_readerwriter_tests
"${PYTHON_EXECUTABLE}" -B "${RUNJSONTESTS_PATH}" $<TARGET_FILE:jsontestrunner_exe> "${TEST_DIR}/data"
DEPENDS jsontestrunner_exe jsoncpp_test
)
ADD_CUSTOM_TARGET(jsoncpp_check DEPENDS jsoncpp_readerwriter_tests)
ENDIF(PYTHONINTERP_FOUND)

View File

@@ -1,233 +1,325 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
/* This executable is used for testing parser/writer using real JSON files.
*/
#include <json/json.h>
#include <algorithm> // sort
#include <sstream>
#include <stdio.h>
#if defined(_MSC_VER) && _MSC_VER >= 1310
# pragma warning( disable: 4996 ) // disable fopen deprecation warning
#if defined(_MSC_VER) && _MSC_VER >= 1310
#pragma warning(disable : 4996) // disable fopen deprecation warning
#endif
static std::string
readInputTestFile( const char *path )
struct Options
{
FILE *file = fopen( path, "rb" );
if ( !file )
return std::string("");
fseek( file, 0, SEEK_END );
long size = ftell( file );
fseek( file, 0, SEEK_SET );
std::string text;
char *buffer = new char[size+1];
buffer[size] = 0;
if ( fread( buffer, 1, size, file ) == (unsigned long)size )
text = buffer;
fclose( file );
delete[] buffer;
return text;
std::string path;
Json::Features features;
bool parseOnly;
typedef std::string (*writeFuncType)(Json::Value const&);
writeFuncType write;
};
static std::string normalizeFloatingPointStr(double value) {
char buffer[32];
#if defined(_MSC_VER) && defined(__STDC_SECURE_LIB__)
sprintf_s(buffer, sizeof(buffer), "%.16g", value);
#else
snprintf(buffer, sizeof(buffer), "%.16g", value);
#endif
buffer[sizeof(buffer) - 1] = 0;
std::string s(buffer);
std::string::size_type index = s.find_last_of("eE");
if (index != std::string::npos) {
std::string::size_type hasSign =
(s[index + 1] == '+' || s[index + 1] == '-') ? 1 : 0;
std::string::size_type exponentStartIndex = index + 1 + hasSign;
std::string normalized = s.substr(0, exponentStartIndex);
std::string::size_type indexDigit =
s.find_first_not_of('0', exponentStartIndex);
std::string exponent = "0";
if (indexDigit !=
std::string::npos) // There is an exponent different from 0
{
exponent = s.substr(indexDigit);
}
return normalized + exponent;
}
return s;
}
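// For example, "1.5e+009" (as printed by older MSVC runtimes, which pad the
// exponent to three digits) and "1.5e+09" (glibc) both normalize to "1.5e+9",
// so the generated .actual output compares equal across platforms.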
static std::string readInputTestFile(const char* path) {
FILE* file = fopen(path, "rb");
if (!file)
return std::string("");
fseek(file, 0, SEEK_END);
long size = ftell(file);
fseek(file, 0, SEEK_SET);
std::string text;
char* buffer = new char[size + 1];
buffer[size] = 0;
if (fread(buffer, 1, size, file) == (unsigned long)size)
text = buffer;
fclose(file);
delete[] buffer;
return text;
}
static void
printValueTree( FILE *fout, Json::Value &value, const std::string &path = "." )
{
switch ( value.type() )
{
case Json::nullValue:
fprintf( fout, "%s=null\n", path.c_str() );
break;
case Json::intValue:
fprintf( fout, "%s=%d\n", path.c_str(), value.asInt() );
break;
case Json::uintValue:
fprintf( fout, "%s=%u\n", path.c_str(), value.asUInt() );
break;
case Json::realValue:
fprintf( fout, "%s=%.16g\n", path.c_str(), value.asDouble() );
break;
case Json::stringValue:
fprintf( fout, "%s=\"%s\"\n", path.c_str(), value.asString().c_str() );
break;
case Json::booleanValue:
fprintf( fout, "%s=%s\n", path.c_str(), value.asBool() ? "true" : "false" );
break;
case Json::arrayValue:
{
fprintf( fout, "%s=[]\n", path.c_str() );
int size = value.size();
for ( int index =0; index < size; ++index )
{
static char buffer[16];
sprintf( buffer, "[%d]", index );
printValueTree( fout, value[index], path + buffer );
}
}
break;
case Json::objectValue:
{
fprintf( fout, "%s={}\n", path.c_str() );
Json::Value::Members members( value.getMemberNames() );
std::sort( members.begin(), members.end() );
std::string suffix = *(path.end()-1) == '.' ? "" : ".";
for ( Json::Value::Members::iterator it = members.begin();
it != members.end();
++it )
{
const std::string &name = *it;
printValueTree( fout, value[name], path + suffix + name );
}
}
break;
default:
break;
}
printValueTree(FILE* fout, Json::Value& value, const std::string& path = ".") {
if (value.hasComment(Json::commentBefore)) {
fprintf(fout, "%s\n", value.getComment(Json::commentBefore).c_str());
}
switch (value.type()) {
case Json::nullValue:
fprintf(fout, "%s=null\n", path.c_str());
break;
case Json::intValue:
fprintf(fout,
"%s=%s\n",
path.c_str(),
Json::valueToString(value.asLargestInt()).c_str());
break;
case Json::uintValue:
fprintf(fout,
"%s=%s\n",
path.c_str(),
Json::valueToString(value.asLargestUInt()).c_str());
break;
case Json::realValue:
fprintf(fout,
"%s=%s\n",
path.c_str(),
normalizeFloatingPointStr(value.asDouble()).c_str());
break;
case Json::stringValue:
fprintf(fout, "%s=\"%s\"\n", path.c_str(), value.asString().c_str());
break;
case Json::booleanValue:
fprintf(fout, "%s=%s\n", path.c_str(), value.asBool() ? "true" : "false");
break;
case Json::arrayValue: {
fprintf(fout, "%s=[]\n", path.c_str());
int size = value.size();
for (int index = 0; index < size; ++index) {
static char buffer[16];
#if defined(_MSC_VER) && defined(__STDC_SECURE_LIB__)
sprintf_s(buffer, sizeof(buffer), "[%d]", index);
#else
snprintf(buffer, sizeof(buffer), "[%d]", index);
#endif
printValueTree(fout, value[index], path + buffer);
}
} break;
case Json::objectValue: {
fprintf(fout, "%s={}\n", path.c_str());
Json::Value::Members members(value.getMemberNames());
std::sort(members.begin(), members.end());
std::string suffix = *(path.end() - 1) == '.' ? "" : ".";
for (Json::Value::Members::iterator it = members.begin();
it != members.end();
++it) {
const std::string& name = *it;
printValueTree(fout, value[name], path + suffix + name);
}
} break;
default:
break;
}
if (value.hasComment(Json::commentAfter)) {
fprintf(fout, "%s\n", value.getComment(Json::commentAfter).c_str());
}
}
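// For example, the document {"a": [1, 2]} is dumped as:
//   .={}
//   .a=[]
//   .a[0]=1
//   .a[1]=2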
static int
parseAndSaveValueTree( const std::string &input,
const std::string &actual,
const std::string &kind,
Json::Value &root,
const Json::Features &features,
bool parseOnly )
static int parseAndSaveValueTree(const std::string& input,
const std::string& actual,
const std::string& kind,
const Json::Features& features,
bool parseOnly,
Json::Value* root)
{
Json::Reader reader( features );
bool parsingSuccessful = reader.parse( input, root );
if ( !parsingSuccessful )
{
printf( "Failed to parse %s file: \n%s\n",
kind.c_str(),
reader.getFormatedErrorMessages().c_str() );
return 1;
}
if ( !parseOnly )
{
FILE *factual = fopen( actual.c_str(), "wt" );
if ( !factual )
{
printf( "Failed to create %s actual file.\n", kind.c_str() );
return 2;
}
printValueTree( factual, root );
fclose( factual );
}
return 0;
}
static int
rewriteValueTree( const std::string &rewritePath,
const Json::Value &root,
std::string &rewrite )
{
//Json::FastWriter writer;
//writer.enableYAMLCompatibility();
Json::StyledWriter writer;
rewrite = writer.write( root );
FILE *fout = fopen( rewritePath.c_str(), "wt" );
if ( !fout )
{
printf( "Failed to create rewrite file: %s\n", rewritePath.c_str() );
Json::Reader reader(features);
bool parsingSuccessful = reader.parse(input, *root);
if (!parsingSuccessful) {
printf("Failed to parse %s file: \n%s\n",
kind.c_str(),
reader.getFormattedErrorMessages().c_str());
return 1;
}
if (!parseOnly) {
FILE* factual = fopen(actual.c_str(), "wt");
if (!factual) {
printf("Failed to create %s actual file.\n", kind.c_str());
return 2;
}
fprintf( fout, "%s\n", rewrite.c_str() );
fclose( fout );
return 0;
}
printValueTree(factual, *root);
fclose(factual);
}
return 0;
}
static std::string
removeSuffix( const std::string &path,
const std::string &extension )
// static std::string useFastWriter(Json::Value const& root) {
// Json::FastWriter writer;
// writer.enableYAMLCompatibility();
// return writer.write(root);
// }
static std::string useStyledWriter(
Json::Value const& root)
{
if ( extension.length() >= path.length() )
return std::string("");
std::string suffix = path.substr( path.length() - extension.length() );
if ( suffix != extension )
return std::string("");
return path.substr( 0, path.length() - extension.length() );
Json::StyledWriter writer;
return writer.write(root);
}
static int
printUsage( const char *argv[] )
static std::string useStyledStreamWriter(
Json::Value const& root)
{
printf( "Usage: %s [--strict] input-json-file", argv[0] );
return 3;
Json::StyledStreamWriter writer;
std::ostringstream sout;
writer.write(sout, root);
return sout.str();
}
int
parseCommandLine( int argc, const char *argv[],
Json::Features &features, std::string &path,
bool &parseOnly )
static std::string useBuiltStyledStreamWriter(
Json::Value const& root)
{
parseOnly = false;
if ( argc < 2 )
{
return printUsage( argv );
}
int index = 1;
if ( std::string(argv[1]) == "--json-checker" )
{
features = Json::Features::strictMode();
parseOnly = true;
++index;
}
if ( index == argc || index + 1 < argc )
{
return printUsage( argv );
}
path = argv[index];
return 0;
Json::StreamWriterBuilder builder;
return Json::writeString(builder, root);
}
int main( int argc, const char *argv[] )
static int rewriteValueTree(
const std::string& rewritePath,
const Json::Value& root,
Options::writeFuncType write,
std::string* rewrite)
{
std::string path;
Json::Features features;
bool parseOnly;
int exitCode = parseCommandLine( argc, argv, features, path, parseOnly );
if ( exitCode != 0 )
{
return exitCode;
}
std::string input = readInputTestFile( path.c_str() );
if ( input.empty() )
{
printf( "Failed to read input or empty input: %s\n", path.c_str() );
return 3;
}
std::string basePath = removeSuffix( argv[1], ".json" );
if ( !parseOnly && basePath.empty() )
{
printf( "Bad input path. Path does not end with '.expected':\n%s\n", path.c_str() );
return 3;
}
std::string actualPath = basePath + ".actual";
std::string rewritePath = basePath + ".rewrite";
std::string rewriteActualPath = basePath + ".actual-rewrite";
Json::Value root;
exitCode = parseAndSaveValueTree( input, actualPath, "input", root, features, parseOnly );
if ( exitCode == 0 && !parseOnly )
{
std::string rewrite;
exitCode = rewriteValueTree( rewritePath, root, rewrite );
if ( exitCode == 0 )
{
Json::Value rewriteRoot;
exitCode = parseAndSaveValueTree( rewrite, rewriteActualPath,
"rewrite", rewriteRoot, features, parseOnly );
}
}
return exitCode;
*rewrite = write(root);
FILE* fout = fopen(rewritePath.c_str(), "wt");
if (!fout) {
printf("Failed to create rewrite file: %s\n", rewritePath.c_str());
return 2;
}
fprintf(fout, "%s\n", rewrite->c_str());
fclose(fout);
return 0;
}
static std::string removeSuffix(const std::string& path,
const std::string& extension) {
if (extension.length() >= path.length())
return std::string("");
std::string suffix = path.substr(path.length() - extension.length());
if (suffix != extension)
return std::string("");
return path.substr(0, path.length() - extension.length());
}
static void printConfig() {
// Print the configuration used to compile JsonCpp
#if defined(JSON_NO_INT64)
printf("JSON_NO_INT64=1\n");
#else
printf("JSON_NO_INT64=0\n");
#endif
}
static int printUsage(const char* argv[]) {
printf("Usage: %s [--strict] input-json-file", argv[0]);
return 3;
}
static int parseCommandLine(
int argc, const char* argv[], Options* opts)
{
opts->parseOnly = false;
opts->write = &useStyledWriter;
if (argc < 2) {
return printUsage(argv);
}
int index = 1;
if (std::string(argv[index]) == "--json-checker") {
opts->features = Json::Features::strictMode();
opts->parseOnly = true;
++index;
}
if (std::string(argv[index]) == "--json-config") {
printConfig();
return 3;
}
if (std::string(argv[index]) == "--json-writer") {
++index;
std::string const writerName(argv[index++]);
if (writerName == "StyledWriter") {
opts->write = &useStyledWriter;
} else if (writerName == "StyledStreamWriter") {
opts->write = &useStyledStreamWriter;
} else if (writerName == "BuiltStyledStreamWriter") {
opts->write = &useBuiltStyledStreamWriter;
} else {
printf("Unknown '--json-writer %s'\n", writerName.c_str());
return 4;
}
}
if (index == argc || index + 1 < argc) {
return printUsage(argv);
}
opts->path = argv[index];
return 0;
}
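// Examples accepted by the parsing above (file names are placeholders):
//   jsontestrunner_exe test_array_01.json
//   jsontestrunner_exe --json-checker fail1.json
//   jsontestrunner_exe --json-writer StyledStreamWriter test_array_01.json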
static int runTest(Options const& opts)
{
int exitCode = 0;
std::string input = readInputTestFile(opts.path.c_str());
if (input.empty()) {
printf("Failed to read input or empty input: %s\n", opts.path.c_str());
return 3;
}
std::string basePath = removeSuffix(opts.path, ".json");
if (!opts.parseOnly && basePath.empty()) {
printf("Bad input path. Path does not end with '.expected':\n%s\n",
opts.path.c_str());
return 3;
}
std::string const actualPath = basePath + ".actual";
std::string const rewritePath = basePath + ".rewrite";
std::string const rewriteActualPath = basePath + ".actual-rewrite";
Json::Value root;
exitCode = parseAndSaveValueTree(
input, actualPath, "input",
opts.features, opts.parseOnly, &root);
if (exitCode || opts.parseOnly) {
return exitCode;
}
std::string rewrite;
exitCode = rewriteValueTree(rewritePath, root, opts.write, &rewrite);
if (exitCode) {
return exitCode;
}
Json::Value rewriteRoot;
exitCode = parseAndSaveValueTree(
rewrite, rewriteActualPath, "rewrite",
opts.features, opts.parseOnly, &rewriteRoot);
if (exitCode) {
return exitCode;
}
return 0;
}
int main(int argc, const char* argv[]) {
Options opts;
int exitCode = parseCommandLine(argc, argv, &opts);
if (exitCode != 0) {
printf("Failed to parse command-line.");
return exitCode;
}
try {
return runTest(opts);
}
catch (const std::exception& e) {
printf("Unhandled exception:\n%s\n", e.what());
return 1;
}
}

View File

@@ -0,0 +1,85 @@
OPTION(JSONCPP_LIB_BUILD_SHARED "Build jsoncpp_lib as a shared library." OFF)
OPTION(JSONCPP_LIB_BUILD_STATIC "Build jsoncpp_lib static library." ON)
IF(BUILD_SHARED_LIBS)
SET(JSONCPP_LIB_BUILD_SHARED ON)
ENDIF(BUILD_SHARED_LIBS)
if( CMAKE_COMPILER_IS_GNUCXX )
#Get compiler version.
execute_process( COMMAND ${CMAKE_CXX_COMPILER} -dumpversion
OUTPUT_VARIABLE GNUCXX_VERSION )
#-Werror=* was introduced -after- GCC 4.1.2
if( GNUCXX_VERSION VERSION_GREATER 4.1.2 )
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Werror=strict-aliasing")
endif()
endif( CMAKE_COMPILER_IS_GNUCXX )
SET( JSONCPP_INCLUDE_DIR ../../include )
SET( PUBLIC_HEADERS
${JSONCPP_INCLUDE_DIR}/json/config.h
${JSONCPP_INCLUDE_DIR}/json/forwards.h
${JSONCPP_INCLUDE_DIR}/json/features.h
${JSONCPP_INCLUDE_DIR}/json/value.h
${JSONCPP_INCLUDE_DIR}/json/reader.h
${JSONCPP_INCLUDE_DIR}/json/writer.h
${JSONCPP_INCLUDE_DIR}/json/assertions.h
${JSONCPP_INCLUDE_DIR}/json/version.h
)
SOURCE_GROUP( "Public API" FILES ${PUBLIC_HEADERS} )
SET(jsoncpp_sources
json_tool.h
json_reader.cpp
json_valueiterator.inl
json_value.cpp
json_writer.cpp
version.h.in)
# Install instructions for this target
IF(JSONCPP_WITH_CMAKE_PACKAGE)
SET(INSTALL_EXPORT EXPORT jsoncpp)
ELSE(JSONCPP_WITH_CMAKE_PACKAGE)
SET(INSTALL_EXPORT)
ENDIF(JSONCPP_WITH_CMAKE_PACKAGE)
IF(JSONCPP_LIB_BUILD_SHARED)
ADD_DEFINITIONS( -DJSON_DLL_BUILD )
ADD_LIBRARY(jsoncpp_lib SHARED ${PUBLIC_HEADERS} ${jsoncpp_sources})
SET_TARGET_PROPERTIES( jsoncpp_lib PROPERTIES VERSION ${JSONCPP_VERSION} SOVERSION ${JSONCPP_VERSION_MAJOR})
SET_TARGET_PROPERTIES( jsoncpp_lib PROPERTIES OUTPUT_NAME jsoncpp )
INSTALL( TARGETS jsoncpp_lib ${INSTALL_EXPORT}
RUNTIME DESTINATION ${RUNTIME_INSTALL_DIR}
LIBRARY DESTINATION ${LIBRARY_INSTALL_DIR}
ARCHIVE DESTINATION ${ARCHIVE_INSTALL_DIR})
IF(NOT CMAKE_VERSION VERSION_LESS 2.8.11)
TARGET_INCLUDE_DIRECTORIES( jsoncpp_lib PUBLIC
$<INSTALL_INTERFACE:${INCLUDE_INSTALL_DIR}>
$<BUILD_INTERFACE:${CMAKE_CURRENT_LIST_DIR}/${JSONCPP_INCLUDE_DIR}>)
ENDIF(NOT CMAKE_VERSION VERSION_LESS 2.8.11)
ENDIF()
IF(JSONCPP_LIB_BUILD_STATIC)
ADD_LIBRARY(jsoncpp_lib_static STATIC ${PUBLIC_HEADERS} ${jsoncpp_sources})
SET_TARGET_PROPERTIES( jsoncpp_lib_static PROPERTIES VERSION ${JSONCPP_VERSION} SOVERSION ${JSONCPP_VERSION_MAJOR})
SET_TARGET_PROPERTIES( jsoncpp_lib_static PROPERTIES OUTPUT_NAME jsoncpp )
INSTALL( TARGETS jsoncpp_lib_static ${INSTALL_EXPORT}
RUNTIME DESTINATION ${RUNTIME_INSTALL_DIR}
LIBRARY DESTINATION ${LIBRARY_INSTALL_DIR}
ARCHIVE DESTINATION ${ARCHIVE_INSTALL_DIR})
IF(NOT CMAKE_VERSION VERSION_LESS 2.8.11)
TARGET_INCLUDE_DIRECTORIES( jsoncpp_lib_static PUBLIC
$<INSTALL_INTERFACE:${INCLUDE_INSTALL_DIR}>
$<BUILD_INTERFACE:${CMAKE_CURRENT_LIST_DIR}/${JSONCPP_INCLUDE_DIR}>
)
ENDIF(NOT CMAKE_VERSION VERSION_LESS 2.8.11)
ENDIF()

View File

@@ -1,125 +0,0 @@
#ifndef JSONCPP_BATCHALLOCATOR_H_INCLUDED
# define JSONCPP_BATCHALLOCATOR_H_INCLUDED
# include <stdlib.h>
# include <assert.h>
# ifndef JSONCPP_DOC_EXCLUDE_IMPLEMENTATION
namespace Json {
/* Fast memory allocator.
*
* This memory allocator allocates memory for a batch of objects (specified by
* the page size, i.e. the number of objects in each page).
*
* It does not allow the destruction of a single object. All the allocated objects
* can be destroyed at once. The memory can be either released or reused for future
* allocation.
*
* The in-place new operator must be used to construct the object using the pointer
* returned by allocate.
*/
template<typename AllocatedType
,const unsigned int objectPerAllocation>
class BatchAllocator
{
public:
typedef AllocatedType Type;
BatchAllocator( unsigned int objectsPerPage = 255 )
: freeHead_( 0 )
, objectsPerPage_( objectsPerPage )
{
// printf( "Size: %d => %s\n", sizeof(AllocatedType), typeid(AllocatedType).name() );
assert( sizeof(AllocatedType) * objectPerAllocation >= sizeof(AllocatedType *) ); // We must be able to store a slist in the object free space.
assert( objectsPerPage >= 16 );
batches_ = allocateBatch( 0 ); // allocated a dummy page
currentBatch_ = batches_;
}
~BatchAllocator()
{
for ( BatchInfo *batch = batches_; batch; )
{
BatchInfo *nextBatch = batch->next_;
free( batch );
batch = nextBatch;
}
}
/// Allocates space for an array of objectPerAllocation objects.
/// @warning it is the responsibility of the caller to call the objects' constructors.
AllocatedType *allocate()
{
if ( freeHead_ ) // returns node from free list.
{
AllocatedType *object = freeHead_;
freeHead_ = *(AllocatedType **)object;
return object;
}
if ( currentBatch_->used_ == currentBatch_->end_ )
{
currentBatch_ = currentBatch_->next_;
while ( currentBatch_ && currentBatch_->used_ == currentBatch_->end_ )
currentBatch_ = currentBatch_->next_;
if ( !currentBatch_ ) // no free batch found, allocate a new one
{
currentBatch_ = allocateBatch( objectsPerPage_ );
currentBatch_->next_ = batches_; // insert at the head of the list
batches_ = currentBatch_;
}
}
AllocatedType *allocated = currentBatch_->used_;
currentBatch_->used_ += objectPerAllocation;
return allocated;
}
/// Release the object.
/// @warning it is the responsibility of the caller to actually destruct the object.
void release( AllocatedType *object )
{
assert( object != 0 );
*(AllocatedType **)object = freeHead_;
freeHead_ = object;
}
private:
struct BatchInfo
{
BatchInfo *next_;
AllocatedType *used_;
AllocatedType *end_;
AllocatedType buffer_[objectPerAllocation];
};
// Disabled copy constructor and assignment operator.
BatchAllocator( const BatchAllocator & );
void operator =( const BatchAllocator &);
static BatchInfo *allocateBatch( unsigned int objectsPerPage )
{
const unsigned int mallocSize = sizeof(BatchInfo) - sizeof(AllocatedType)* objectPerAllocation
+ sizeof(AllocatedType) * objectPerAllocation * objectsPerPage;
BatchInfo *batch = static_cast<BatchInfo*>( malloc( mallocSize ) );
batch->next_ = 0;
batch->used_ = batch->buffer_;
batch->end_ = batch->buffer_ + objectsPerPage;
return batch;
}
BatchInfo *batches_;
BatchInfo *currentBatch_;
/// Head of a singly linked list within the allocated space of freed objects
AllocatedType *freeHead_;
unsigned int objectsPerPage_;
};
} // namespace Json
# endif // ifndef JSONCPP_DOC_INCLUDE_IMPLEMENTATION
#endif // JSONCPP_BATCHALLOCATOR_H_INCLUDED
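
For context on the header removed above: BatchAllocator hands out raw storage only, so, as its comments require, the caller constructs with placement new and destructs explicitly before releasing. A minimal sketch, assuming the removed header is still on the include path; Node is a hypothetical payload type:

#include <new>                    // placement new
#include "json_batchallocator.h"  // the (removed) header shown above

struct Node { double value; };    // at least pointer-sized, as the asserts require

void example() {
  Json::BatchAllocator<Node, 1> allocator(255);  // 255 objects per page
  Node* n = allocator.allocate();                // raw, uninitialized storage
  new (n) Node();                                // caller constructs in place
  n->value = 42.0;
  n->~Node();                                    // caller destructs explicitly
  allocator.release(n);                          // storage returns to the free list
}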

View File

@@ -1,448 +0,0 @@
// included by json_value.cpp
// everything is within Json namespace
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// class ValueInternalArray
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
ValueArrayAllocator::~ValueArrayAllocator()
{
}
// //////////////////////////////////////////////////////////////////
// class DefaultValueArrayAllocator
// //////////////////////////////////////////////////////////////////
#ifdef JSON_USE_SIMPLE_INTERNAL_ALLOCATOR
class DefaultValueArrayAllocator : public ValueArrayAllocator
{
public: // overridden from ValueArrayAllocator
virtual ~DefaultValueArrayAllocator()
{
}
virtual ValueInternalArray *newArray()
{
return new ValueInternalArray();
}
virtual ValueInternalArray *newArrayCopy( const ValueInternalArray &other )
{
return new ValueInternalArray( other );
}
virtual void destructArray( ValueInternalArray *array )
{
delete array;
}
virtual void reallocateArrayPageIndex( Value **&indexes,
ValueInternalArray::PageIndex &indexCount,
ValueInternalArray::PageIndex minNewIndexCount )
{
ValueInternalArray::PageIndex newIndexCount = (indexCount*3)/2 + 1;
if ( minNewIndexCount > newIndexCount )
newIndexCount = minNewIndexCount;
void *newIndexes = realloc( indexes, sizeof(Value*) * newIndexCount );
if ( !newIndexes )
throw std::bad_alloc();
indexCount = newIndexCount;
indexes = static_cast<Value **>( newIndexes );
}
virtual void releaseArrayPageIndex( Value **indexes,
ValueInternalArray::PageIndex indexCount )
{
if ( indexes )
free( indexes );
}
virtual Value *allocateArrayPage()
{
return static_cast<Value *>( malloc( sizeof(Value) * ValueInternalArray::itemsPerPage ) );
}
virtual void releaseArrayPage( Value *value )
{
if ( value )
free( value );
}
};
#else // #ifdef JSON_USE_SIMPLE_INTERNAL_ALLOCATOR
/// @todo make this thread-safe (lock when accessing batch allocator)
class DefaultValueArrayAllocator : public ValueArrayAllocator
{
public: // overridden from ValueArrayAllocator
virtual ~DefaultValueArrayAllocator()
{
}
virtual ValueInternalArray *newArray()
{
ValueInternalArray *array = arraysAllocator_.allocate();
new (array) ValueInternalArray(); // placement new
return array;
}
virtual ValueInternalArray *newArrayCopy( const ValueInternalArray &other )
{
ValueInternalArray *array = arraysAllocator_.allocate();
new (array) ValueInternalArray( other ); // placement new
return array;
}
virtual void destructArray( ValueInternalArray *array )
{
if ( array )
{
array->~ValueInternalArray();
arraysAllocator_.release( array );
}
}
virtual void reallocateArrayPageIndex( Value **&indexes,
ValueInternalArray::PageIndex &indexCount,
ValueInternalArray::PageIndex minNewIndexCount )
{
ValueInternalArray::PageIndex newIndexCount = (indexCount*3)/2 + 1;
if ( minNewIndexCount > newIndexCount )
newIndexCount = minNewIndexCount;
void *newIndexes = realloc( indexes, sizeof(Value*) * newIndexCount );
if ( !newIndexes )
throw std::bad_alloc();
indexCount = newIndexCount;
indexes = static_cast<Value **>( newIndexes );
}
virtual void releaseArrayPageIndex( Value **indexes,
ValueInternalArray::PageIndex indexCount )
{
if ( indexes )
free( indexes );
}
virtual Value *allocateArrayPage()
{
return static_cast<Value *>( pagesAllocator_.allocate() );
}
virtual void releaseArrayPage( Value *value )
{
if ( value )
pagesAllocator_.release( value );
}
private:
BatchAllocator<ValueInternalArray,1> arraysAllocator_;
BatchAllocator<Value,ValueInternalArray::itemsPerPage> pagesAllocator_;
};
#endif // #ifdef JSON_USE_SIMPLE_INTERNAL_ALLOCATOR
static ValueArrayAllocator *&arrayAllocator()
{
static DefaultValueArrayAllocator defaultAllocator;
static ValueArrayAllocator *arrayAllocator = &defaultAllocator;
return arrayAllocator;
}
static struct DummyArrayAllocatorInitializer {
DummyArrayAllocatorInitializer()
{
arrayAllocator(); // ensure arrayAllocator() statics are initialized before main().
}
} dummyArrayAllocatorInitializer;
// //////////////////////////////////////////////////////////////////
// class ValueInternalArray
// //////////////////////////////////////////////////////////////////
bool
ValueInternalArray::equals( const IteratorState &x,
const IteratorState &other )
{
return x.array_ == other.array_
&& x.currentItemIndex_ == other.currentItemIndex_
&& x.currentPageIndex_ == other.currentPageIndex_;
}
void
ValueInternalArray::increment( IteratorState &it )
{
JSON_ASSERT_MESSAGE( it.array_ &&
(it.currentPageIndex_ - it.array_->pages_)*itemsPerPage + it.currentItemIndex_
!= it.array_->size_,
"ValueInternalArray::increment(): moving iterator beyond end" );
++(it.currentItemIndex_);
if ( it.currentItemIndex_ == itemsPerPage )
{
it.currentItemIndex_ = 0;
++(it.currentPageIndex_);
}
}
void
ValueInternalArray::decrement( IteratorState &it )
{
JSON_ASSERT_MESSAGE( it.array_ && it.currentPageIndex_ == it.array_->pages_
&& it.currentItemIndex_ == 0,
"ValueInternalArray::decrement(): moving iterator beyond end" );
if ( it.currentItemIndex_ == 0 )
{
it.currentItemIndex_ = itemsPerPage-1;
--(it.currentPageIndex_);
}
else
{
--(it.currentItemIndex_);
}
}
Value &
ValueInternalArray::unsafeDereference( const IteratorState &it )
{
return (*(it.currentPageIndex_))[it.currentItemIndex_];
}
Value &
ValueInternalArray::dereference( const IteratorState &it )
{
JSON_ASSERT_MESSAGE( it.array_ &&
(it.currentPageIndex_ - it.array_->pages_)*itemsPerPage + it.currentItemIndex_
< it.array_->size_,
"ValueInternalArray::dereference(): dereferencing invalid iterator" );
return unsafeDereference( it );
}
void
ValueInternalArray::makeBeginIterator( IteratorState &it ) const
{
it.array_ = const_cast<ValueInternalArray *>( this );
it.currentItemIndex_ = 0;
it.currentPageIndex_ = pages_;
}
void
ValueInternalArray::makeIterator( IteratorState &it, ArrayIndex index ) const
{
it.array_ = const_cast<ValueInternalArray *>( this );
it.currentItemIndex_ = index % itemsPerPage;
it.currentPageIndex_ = pages_ + index / itemsPerPage;
}
void
ValueInternalArray::makeEndIterator( IteratorState &it ) const
{
makeIterator( it, size_ );
}
ValueInternalArray::ValueInternalArray()
: pages_( 0 )
, size_( 0 )
, pageCount_( 0 )
{
}
ValueInternalArray::ValueInternalArray( const ValueInternalArray &other )
: pages_( 0 )
, pageCount_( 0 )
, size_( other.size_ )
{
PageIndex minNewPages = other.size_ / itemsPerPage;
arrayAllocator()->reallocateArrayPageIndex( pages_, pageCount_, minNewPages );
JSON_ASSERT_MESSAGE( pageCount_ >= minNewPages,
"ValueInternalArray::reserve(): bad reallocation" );
IteratorState itOther;
other.makeBeginIterator( itOther );
Value *value;
for ( ArrayIndex index = 0; index < size_; ++index, increment(itOther) )
{
if ( index % itemsPerPage == 0 )
{
PageIndex pageIndex = index / itemsPerPage;
value = arrayAllocator()->allocateArrayPage();
pages_[pageIndex] = value;
}
new (value) Value( dereference( itOther ) );
}
}
ValueInternalArray &
ValueInternalArray::operator =( const ValueInternalArray &other )
{
ValueInternalArray temp( other );
swap( temp );
return *this;
}
ValueInternalArray::~ValueInternalArray()
{
// destroy all constructed items
IteratorState it;
IteratorState itEnd;
makeBeginIterator( it);
makeEndIterator( itEnd );
for ( ; !equals(it,itEnd); increment(it) )
{
Value *value = &dereference(it);
value->~Value();
}
// release all pages
PageIndex lastPageIndex = size_ / itemsPerPage;
for ( PageIndex pageIndex = 0; pageIndex < lastPageIndex; ++pageIndex )
arrayAllocator()->releaseArrayPage( pages_[pageIndex] );
// release pages index
arrayAllocator()->releaseArrayPageIndex( pages_, pageCount_ );
}
void
ValueInternalArray::swap( ValueInternalArray &other )
{
Value **tempPages = pages_;
pages_ = other.pages_;
other.pages_ = tempPages;
ArrayIndex tempSize = size_;
size_ = other.size_;
other.size_ = tempSize;
PageIndex tempPageCount = pageCount_;
pageCount_ = other.pageCount_;
other.pageCount_ = tempPageCount;
}
void
ValueInternalArray::clear()
{
ValueInternalArray dummy;
swap( dummy );
}
void
ValueInternalArray::resize( ArrayIndex newSize )
{
if ( newSize == 0 )
clear();
else if ( newSize < size_ )
{
IteratorState it;
IteratorState itEnd;
makeIterator( it, newSize );
makeIterator( itEnd, size_ );
for ( ; !equals(it,itEnd); increment(it) )
{
Value *value = &dereference(it);
value->~Value();
}
PageIndex pageIndex = (newSize + itemsPerPage - 1) / itemsPerPage;
PageIndex lastPageIndex = size_ / itemsPerPage;
for ( ; pageIndex < lastPageIndex; ++pageIndex )
arrayAllocator()->releaseArrayPage( pages_[pageIndex] );
size_ = newSize;
}
else if ( newSize > size_ )
resolveReference( newSize );
}
void
ValueInternalArray::makeIndexValid( ArrayIndex index )
{
// Need to enlarge the page index?
if ( index >= pageCount_ * itemsPerPage )
{
PageIndex minNewPages = (index + 1) / itemsPerPage;
arrayAllocator()->reallocateArrayPageIndex( pages_, pageCount_, minNewPages );
JSON_ASSERT_MESSAGE( pageCount_ >= minNewPages, "ValueInternalArray::reserve(): bad reallocation" );
}
// Need to allocate new pages?
ArrayIndex nextPageIndex =
(size_ % itemsPerPage) != 0 ? size_ - (size_%itemsPerPage) + itemsPerPage
: size_;
if ( nextPageIndex <= index )
{
PageIndex pageIndex = nextPageIndex / itemsPerPage;
PageIndex pageToAllocate = (index - nextPageIndex) / itemsPerPage + 1;
for ( ; pageToAllocate-- > 0; ++pageIndex )
pages_[pageIndex] = arrayAllocator()->allocateArrayPage();
}
// Initialize all new entries
IteratorState it;
IteratorState itEnd;
makeIterator( it, size_ );
size_ = index + 1;
makeIterator( itEnd, size_ );
for ( ; !equals(it,itEnd); increment(it) )
{
Value *value = &dereference(it);
new (value) Value(); // Construct a default value using placement new
}
}
Value &
ValueInternalArray::resolveReference( ArrayIndex index )
{
if ( index >= size_ )
makeIndexValid( index );
return pages_[index/itemsPerPage][index%itemsPerPage];
}
Value *
ValueInternalArray::find( ArrayIndex index ) const
{
if ( index >= size_ )
return 0;
return &(pages_[index/itemsPerPage][index%itemsPerPage]);
}
ValueInternalArray::ArrayIndex
ValueInternalArray::size() const
{
return size_;
}
int
ValueInternalArray::distance( const IteratorState &x, const IteratorState &y )
{
return indexOf(y) - indexOf(x);
}
ValueInternalArray::ArrayIndex
ValueInternalArray::indexOf( const IteratorState &iterator )
{
if ( !iterator.array_ )
return ArrayIndex(-1);
return ArrayIndex(
(iterator.currentPageIndex_ - iterator.array_->pages_) * itemsPerPage
+ iterator.currentItemIndex_ );
}
int
ValueInternalArray::compare( const ValueInternalArray &other ) const
{
int sizeDiff( size_ - other.size_ );
if ( sizeDiff != 0 )
return sizeDiff;
for ( ArrayIndex index =0; index < size_; ++index )
{
int diff = pages_[index/itemsPerPage][index%itemsPerPage].compare(
other.pages_[index/itemsPerPage][index%itemsPerPage] );
if ( diff != 0 )
return diff;
}
return 0;
}
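The array above is paged rather than contiguous: element i lives at pages_[i / itemsPerPage][i % itemsPerPage], and makeIndexValid() grows the page index and allocates pages on demand. A standalone sketch of that arithmetic, with itemsPerPage = 8 chosen purely for illustration:
// Standalone illustration of the page/slot addressing used by find() and
// resolveReference() above; itemsPerPage = 8 is an assumed value.
#include <cstdio>

int main()
{
    const unsigned itemsPerPage = 8;
    const unsigned indexes[] = { 0, 7, 8, 11, 23 };
    for ( unsigned i = 0; i < sizeof(indexes)/sizeof(indexes[0]); ++i )
    {
        unsigned index = indexes[i];
        std::printf( "index %2u -> page %u, slot %u\n",
                     index, index / itemsPerPage, index % itemsPerPage );
    }
    return 0; // e.g. index 11 lands on page 1, slot 3
}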


@@ -1,607 +0,0 @@
// included by json_value.cpp
// everything is within Json namespace
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// class ValueInternalMap
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
/** \internal MUST be safely initialized using memset( this, 0, sizeof(ValueInternalLink) );
* This optimization is used by the fast allocator.
*/
ValueInternalLink::ValueInternalLink()
: previous_( 0 )
, next_( 0 )
{
}
ValueInternalLink::~ValueInternalLink()
{
for ( int index =0; index < itemPerLink; ++index )
{
if ( !items_[index].isItemAvailable() )
{
if ( !items_[index].isMemberNameStatic() )
free( keys_[index] );
}
else
break;
}
}
ValueMapAllocator::~ValueMapAllocator()
{
}
#ifdef JSON_USE_SIMPLE_INTERNAL_ALLOCATOR
class DefaultValueMapAllocator : public ValueMapAllocator
{
public: // overridden from ValueMapAllocator
virtual ValueInternalMap *newMap()
{
return new ValueInternalMap();
}
virtual ValueInternalMap *newMapCopy( const ValueInternalMap &other )
{
return new ValueInternalMap( other );
}
virtual void destructMap( ValueInternalMap *map )
{
delete map;
}
virtual ValueInternalLink *allocateMapBuckets( unsigned int size )
{
return new ValueInternalLink[size];
}
virtual void releaseMapBuckets( ValueInternalLink *links )
{
delete [] links;
}
virtual ValueInternalLink *allocateMapLink()
{
return new ValueInternalLink();
}
virtual void releaseMapLink( ValueInternalLink *link )
{
delete link;
}
};
#else
/// @todo make this thread-safe (lock when accessing batch allocator)
class DefaultValueMapAllocator : public ValueMapAllocator
{
public: // overridden from ValueMapAllocator
virtual ValueInternalMap *newMap()
{
ValueInternalMap *map = mapsAllocator_.allocate();
new (map) ValueInternalMap(); // placement new
return map;
}
virtual ValueInternalMap *newMapCopy( const ValueInternalMap &other )
{
ValueInternalMap *map = mapsAllocator_.allocate();
new (map) ValueInternalMap( other ); // placement new
return map;
}
virtual void destructMap( ValueInternalMap *map )
{
if ( map )
{
map->~ValueInternalMap();
mapsAllocator_.release( map );
}
}
virtual ValueInternalLink *allocateMapBuckets( unsigned int size )
{
return new ValueInternalLink[size];
}
virtual void releaseMapBuckets( ValueInternalLink *links )
{
delete [] links;
}
virtual ValueInternalLink *allocateMapLink()
{
ValueInternalLink *link = linksAllocator_.allocate();
memset( link, 0, sizeof(ValueInternalLink) );
return link;
}
virtual void releaseMapLink( ValueInternalLink *link )
{
link->~ValueInternalLink();
linksAllocator_.release( link );
}
private:
BatchAllocator<ValueInternalMap,1> mapsAllocator_;
BatchAllocator<ValueInternalLink,1> linksAllocator_;
};
#endif
static ValueMapAllocator *&mapAllocator()
{
static DefaultValueMapAllocator defaultAllocator;
static ValueMapAllocator *mapAllocator = &defaultAllocator;
return mapAllocator;
}
static struct DummyMapAllocatorInitializer {
DummyMapAllocatorInitializer()
{
mapAllocator(); // ensure mapAllocator() statics are initialized before main().
}
} dummyMapAllocatorInitializer;
// h(K) = value * K >> w ; with w = 32 & K prime w.r.t. 2^32.
/*
Uses a linked-list hash map.
The buckets array is a container.
Each linked-list element holds 6 key/value pairs (memory = (16+4) * 6 + 4 = 124).
Values carry extra state: valid, available, deleted.
*/
ValueInternalMap::ValueInternalMap()
: buckets_( 0 )
, tailLink_( 0 )
, bucketsSize_( 0 )
, itemCount_( 0 )
{
}
ValueInternalMap::ValueInternalMap( const ValueInternalMap &other )
: buckets_( 0 )
, tailLink_( 0 )
, bucketsSize_( 0 )
, itemCount_( 0 )
{
reserve( other.itemCount_ );
IteratorState it;
IteratorState itEnd;
other.makeBeginIterator( it );
other.makeEndIterator( itEnd );
for ( ; !equals(it,itEnd); increment(it) )
{
bool isStatic;
const char *memberName = key( it, isStatic );
const Value &aValue = value( it );
resolveReference(memberName, isStatic) = aValue;
}
}
ValueInternalMap &
ValueInternalMap::operator =( const ValueInternalMap &other )
{
ValueInternalMap dummy( other );
swap( dummy );
return *this;
}
ValueInternalMap::~ValueInternalMap()
{
if ( buckets_ )
{
for ( BucketIndex bucketIndex =0; bucketIndex < bucketsSize_; ++bucketIndex )
{
ValueInternalLink *link = buckets_[bucketIndex].next_;
while ( link )
{
ValueInternalLink *linkToRelease = link;
link = link->next_;
mapAllocator()->releaseMapLink( linkToRelease );
}
}
mapAllocator()->releaseMapBuckets( buckets_ );
}
}
void
ValueInternalMap::swap( ValueInternalMap &other )
{
ValueInternalLink *tempBuckets = buckets_;
buckets_ = other.buckets_;
other.buckets_ = tempBuckets;
ValueInternalLink *tempTailLink = tailLink_;
tailLink_ = other.tailLink_;
other.tailLink_ = tempTailLink;
BucketIndex tempBucketsSize = bucketsSize_;
bucketsSize_ = other.bucketsSize_;
other.bucketsSize_ = tempBucketsSize;
BucketIndex tempItemCount = itemCount_;
itemCount_ = other.itemCount_;
other.itemCount_ = tempItemCount;
}
void
ValueInternalMap::clear()
{
ValueInternalMap dummy;
swap( dummy );
}
ValueInternalMap::BucketIndex
ValueInternalMap::size() const
{
return itemCount_;
}
bool
ValueInternalMap::reserveDelta( BucketIndex growth )
{
return reserve( itemCount_ + growth );
}
bool
ValueInternalMap::reserve( BucketIndex newItemCount )
{
if ( !buckets_ && newItemCount > 0 )
{
buckets_ = mapAllocator()->allocateMapBuckets( 1 );
bucketsSize_ = 1;
tailLink_ = &buckets_[0];
}
// BucketIndex idealBucketCount = (newItemCount + ValueInternalLink::itemPerLink) / ValueInternalLink::itemPerLink;
return true;
}
const Value *
ValueInternalMap::find( const char *key ) const
{
if ( !bucketsSize_ )
return 0;
HashKey hashedKey = hash( key );
BucketIndex bucketIndex = hashedKey % bucketsSize_;
for ( const ValueInternalLink *current = &buckets_[bucketIndex];
current != 0;
current = current->next_ )
{
for ( BucketIndex index=0; index < ValueInternalLink::itemPerLink; ++index )
{
if ( current->items_[index].isItemAvailable() )
return 0;
if ( strcmp( key, current->keys_[index] ) == 0 )
return &current->items_[index];
}
}
return 0;
}
Value *
ValueInternalMap::find( const char *key )
{
const ValueInternalMap *constThis = this;
return const_cast<Value *>( constThis->find( key ) );
}
Value &
ValueInternalMap::resolveReference( const char *key,
bool isStatic )
{
HashKey hashedKey = hash( key );
if ( bucketsSize_ )
{
BucketIndex bucketIndex = hashedKey % bucketsSize_;
ValueInternalLink **previous = 0;
BucketIndex index;
for ( ValueInternalLink *current = &buckets_[bucketIndex];
current != 0;
previous = &current->next_, current = current->next_ )
{
for ( index=0; index < ValueInternalLink::itemPerLink; ++index )
{
if ( current->items_[index].isItemAvailable() )
return setNewItem( key, isStatic, current, index );
if ( strcmp( key, current->keys_[index] ) == 0 )
return current->items_[index];
}
}
}
reserveDelta( 1 );
return unsafeAdd( key, isStatic, hashedKey );
}
void
ValueInternalMap::remove( const char *key )
{
HashKey hashedKey = hash( key );
if ( !bucketsSize_ )
return;
BucketIndex bucketIndex = hashedKey % bucketsSize_;
for ( ValueInternalLink *link = &buckets_[bucketIndex];
link != 0;
link = link->next_ )
{
BucketIndex index;
for ( index =0; index < ValueInternalLink::itemPerLink; ++index )
{
if ( link->items_[index].isItemAvailable() )
return;
if ( strcmp( key, link->keys_[index] ) == 0 )
{
doActualRemove( link, index, bucketIndex );
return;
}
}
}
}
void
ValueInternalMap::doActualRemove( ValueInternalLink *link,
BucketIndex index,
BucketIndex bucketIndex )
{
// Find the last item of the bucket and swap it with the 'removed' one.
// Set the removed item's flags to 'available'.
// If the last page only contains 'available' items, deallocate it (it's empty).
ValueInternalLink *&lastLink = getLastLinkInBucket( index );
BucketIndex lastItemIndex = 1; // a link can never be empty, so start at 1
for ( ;
lastItemIndex < ValueInternalLink::itemPerLink;
++lastItemIndex ) // may be optimized with a binary search
{
if ( lastLink->items_[lastItemIndex].isItemAvailable() )
break;
}
BucketIndex lastUsedIndex = lastItemIndex - 1;
Value *valueToDelete = &link->items_[index];
Value *valueToPreserve = &lastLink->items_[lastUsedIndex];
if ( valueToDelete != valueToPreserve )
valueToDelete->swap( *valueToPreserve );
if ( lastUsedIndex == 0 ) // page is now empty
{ // remove it from bucket linked list and delete it.
ValueInternalLink *linkPreviousToLast = lastLink->previous_;
if ( linkPreviousToLast != 0 ) // cannot delete the bucket link itself.
{
mapAllocator()->releaseMapLink( lastLink );
linkPreviousToLast->next_ = 0;
lastLink = linkPreviousToLast;
}
}
else
{
Value dummy;
valueToPreserve->swap( dummy ); // restore deleted to default Value.
valueToPreserve->setItemUsed( false );
}
--itemCount_;
}
ValueInternalLink *&
ValueInternalMap::getLastLinkInBucket( BucketIndex bucketIndex )
{
if ( bucketIndex == bucketsSize_ - 1 )
return tailLink_;
ValueInternalLink *&previous = buckets_[bucketIndex+1].previous_;
if ( !previous )
previous = &buckets_[bucketIndex];
return previous;
}
Value &
ValueInternalMap::setNewItem( const char *key,
bool isStatic,
ValueInternalLink *link,
BucketIndex index )
{
char *duplicatedKey = valueAllocator()->makeMemberName( key );
++itemCount_;
link->keys_[index] = duplicatedKey;
link->items_[index].setItemUsed();
link->items_[index].setMemberNameIsStatic( isStatic );
return link->items_[index]; // items already default constructed.
}
Value &
ValueInternalMap::unsafeAdd( const char *key,
bool isStatic,
HashKey hashedKey )
{
JSON_ASSERT_MESSAGE( bucketsSize_ > 0, "ValueInternalMap::unsafeAdd(): internal logic error." );
BucketIndex bucketIndex = hashedKey % bucketsSize_;
ValueInternalLink *&previousLink = getLastLinkInBucket( bucketIndex );
ValueInternalLink *link = previousLink;
BucketIndex index;
for ( index =0; index < ValueInternalLink::itemPerLink; ++index )
{
if ( link->items_[index].isItemAvailable() )
break;
}
if ( index == ValueInternalLink::itemPerLink ) // need to add a new page
{
ValueInternalLink *newLink = mapAllocator()->allocateMapLink();
index = 0;
link->next_ = newLink;
previousLink = newLink;
link = newLink;
}
return setNewItem( key, isStatic, link, index );
}
ValueInternalMap::HashKey
ValueInternalMap::hash( const char *key ) const
{
HashKey hash = 0;
while ( *key )
hash += *key++ * 37;
return hash;
}
int
ValueInternalMap::compare( const ValueInternalMap &other ) const
{
int sizeDiff( itemCount_ - other.itemCount_ );
if ( sizeDiff != 0 )
return sizeDiff;
// A strict ordering guarantee is required. Compare all keys FIRST, then compare values.
IteratorState it;
IteratorState itEnd;
makeBeginIterator( it );
makeEndIterator( itEnd );
for ( ; !equals(it,itEnd); increment(it) )
{
if ( !other.find( key( it ) ) )
return 1;
}
// All keys are equal; now compare values.
makeBeginIterator( it );
for ( ; !equals(it,itEnd); increment(it) )
{
const Value *otherValue = other.find( key( it ) );
int valueDiff = value(it).compare( *otherValue );
if ( valueDiff != 0 )
return valueDiff;
}
return 0;
}
void
ValueInternalMap::makeBeginIterator( IteratorState &it ) const
{
it.map_ = const_cast<ValueInternalMap *>( this );
it.bucketIndex_ = 0;
it.itemIndex_ = 0;
it.link_ = buckets_;
}
void
ValueInternalMap::makeEndIterator( IteratorState &it ) const
{
it.map_ = const_cast<ValueInternalMap *>( this );
it.bucketIndex_ = bucketsSize_;
it.itemIndex_ = 0;
it.link_ = 0;
}
bool
ValueInternalMap::equals( const IteratorState &x, const IteratorState &other )
{
return x.map_ == other.map_
&& x.bucketIndex_ == other.bucketIndex_
&& x.link_ == other.link_
&& x.itemIndex_ == other.itemIndex_;
}
void
ValueInternalMap::incrementBucket( IteratorState &iterator )
{
++iterator.bucketIndex_;
JSON_ASSERT_MESSAGE( iterator.bucketIndex_ <= iterator.map_->bucketsSize_,
"ValueInternalMap::increment(): attempting to iterate beyond end." );
if ( iterator.bucketIndex_ == iterator.map_->bucketsSize_ )
iterator.link_ = 0;
else
iterator.link_ = &(iterator.map_->buckets_[iterator.bucketIndex_]);
iterator.itemIndex_ = 0;
}
void
ValueInternalMap::increment( IteratorState &iterator )
{
JSON_ASSERT_MESSAGE( iterator.map_, "Attempting to iterate using invalid iterator." );
++iterator.itemIndex_;
if ( iterator.itemIndex_ == ValueInternalLink::itemPerLink )
{
JSON_ASSERT_MESSAGE( iterator.link_ != 0,
"ValueInternalMap::increment(): attempting to iterate beyond end." );
iterator.link_ = iterator.link_->next_;
if ( iterator.link_ == 0 )
incrementBucket( iterator );
}
else if ( iterator.link_->items_[iterator.itemIndex_].isItemAvailable() )
{
incrementBucket( iterator );
}
}
void
ValueInternalMap::decrement( IteratorState &iterator )
{
if ( iterator.itemIndex_ == 0 )
{
JSON_ASSERT_MESSAGE( iterator.map_, "Attempting to iterate using invalid iterator." );
if ( iterator.link_ == &iterator.map_->buckets_[iterator.bucketIndex_] )
{
JSON_ASSERT_MESSAGE( iterator.bucketIndex_ > 0, "Attempting to iterate beyond beginning." );
--(iterator.bucketIndex_);
}
iterator.link_ = iterator.link_->previous_;
iterator.itemIndex_ = ValueInternalLink::itemPerLink - 1;
}
}
const char *
ValueInternalMap::key( const IteratorState &iterator )
{
JSON_ASSERT_MESSAGE( iterator.link_, "Attempting to iterate using invalid iterator." );
return iterator.link_->keys_[iterator.itemIndex_];
}
const char *
ValueInternalMap::key( const IteratorState &iterator, bool &isStatic )
{
JSON_ASSERT_MESSAGE( iterator.link_, "Attempting to iterate using invalid iterator." );
isStatic = iterator.link_->items_[iterator.itemIndex_].isMemberNameStatic();
return iterator.link_->keys_[iterator.itemIndex_];
}
Value &
ValueInternalMap::value( const IteratorState &iterator )
{
JSON_ASSERT_MESSAGE( iterator.link_, "Attempting to iterate using invalid iterator." );
return iterator.link_->items_[iterator.itemIndex_];
}
int
ValueInternalMap::distance( const IteratorState &x, const IteratorState &y )
{
int offset = 0;
IteratorState it = x;
while ( !equals( it, y ) )
{
increment( it );
++offset; // count each step; without this the computed distance is always 0
}
return offset;
}
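For reference, lookup in the map above reduces to an additive hash followed by a modulo over the bucket array; since reserve() never grows past a single bucket, collisions are in practice absorbed by the chained links of itemPerLink entries. A standalone sketch of the hashing step, with a bucket count of 16 assumed purely for illustration:
// Standalone sketch of the hashing scheme; bucketsSize = 16 is an assumption.
#include <cstdio>

static unsigned hashKey( const char *key )
{
    unsigned hash = 0;
    while ( *key )
        hash += *key++ * 37; // same additive scheme as ValueInternalMap::hash()
    return hash;
}

int main()
{
    const unsigned bucketsSize = 16;
    const char *keys[] = { "id", "name", "price" };
    for ( unsigned i = 0; i < 3; ++i )
        std::printf( "%-5s -> hash %u -> bucket %u\n",
                     keys[i], hashKey( keys[i] ), hashKey( keys[i] ) % bucketsSize );
    return 0;
}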

File diff suppressed because it is too large.

src/lib_json/json_tool.h (new file, 87 lines)

@@ -0,0 +1,87 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
#ifndef LIB_JSONCPP_JSON_TOOL_H_INCLUDED
#define LIB_JSONCPP_JSON_TOOL_H_INCLUDED
/* This header provides common string manipulation support, such as UTF-8,
* portable conversion from/to string...
*
* It is an internal header that must not be exposed.
*/
namespace Json {
/// Converts a unicode code-point to UTF-8.
static inline std::string codePointToUTF8(unsigned int cp) {
std::string result;
// based on description from http://en.wikipedia.org/wiki/UTF-8
if (cp <= 0x7f) {
result.resize(1);
result[0] = static_cast<char>(cp);
} else if (cp <= 0x7FF) {
result.resize(2);
result[1] = static_cast<char>(0x80 | (0x3f & cp));
result[0] = static_cast<char>(0xC0 | (0x1f & (cp >> 6)));
} else if (cp <= 0xFFFF) {
result.resize(3);
result[2] = static_cast<char>(0x80 | (0x3f & cp));
result[1] = 0x80 | static_cast<char>((0x3f & (cp >> 6)));
result[0] = 0xE0 | static_cast<char>((0xf & (cp >> 12)));
} else if (cp <= 0x10FFFF) {
result.resize(4);
result[3] = static_cast<char>(0x80 | (0x3f & cp));
result[2] = static_cast<char>(0x80 | (0x3f & (cp >> 6)));
result[1] = static_cast<char>(0x80 | (0x3f & (cp >> 12)));
result[0] = static_cast<char>(0xF0 | (0x7 & (cp >> 18)));
}
return result;
}
/// Returns true if ch is a control character (in range [1,31]).
static inline bool isControlCharacter(char ch) { return ch > 0 && ch <= 0x1F; }
enum {
/// Constant that specifies the size of the buffer that must be passed to
/// uintToString.
uintToStringBufferSize = 3 * sizeof(LargestUInt) + 1
};
// Defines a char buffer for use with uintToString().
typedef char UIntToStringBuffer[uintToStringBufferSize];
/** Converts an unsigned integer to string.
* @param value Unsigned integer to convert to string
* @param current Input/Output string buffer.
* Must have at least uintToStringBufferSize chars free.
*/
static inline void uintToString(LargestUInt value, char*& current) {
*--current = 0;
do {
*--current = char(value % 10) + '0';
value /= 10;
} while (value != 0);
}
/** Change ',' to '.' everywhere in buffer.
*
* We had a sophisticated way, but it did not work in WinCE.
* @see https://github.com/open-source-parsers/jsoncpp/pull/9
*/
static inline void fixNumericLocale(char* begin, char* end) {
while (begin < end) {
if (*begin == ',') {
*begin = '.';
}
++begin;
}
}
} // namespace Json {
#endif // LIB_JSONCPP_JSON_TOOL_H_INCLUDED
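To make the two main helpers concrete, a short sketch of how they are driven. json_tool.h is internal, so this is illustrative only; the expected bytes for U+20AC follow from the 3-byte branch above.
// Illustrative only: json_tool.h must not be included by user code.
#include <cassert>
#include <cstring>
#include <string>

void jsonToolSketch()
{
    // U+20AC (euro sign) is <= 0xFFFF, so codePointToUTF8 emits E2 82 AC.
    std::string euro = Json::codePointToUTF8( 0x20AC );
    assert( euro.size() == 3 && std::memcmp( euro.data(), "\xE2\x82\xAC", 3 ) == 0 );

    // uintToString writes the digits (and a terminating '\0') backwards from
    // the end of the buffer, so the caller starts one past the end and reads
    // from wherever `current` lands.
    Json::UIntToStringBuffer buffer;
    char *current = buffer + sizeof( buffer );
    Json::uintToString( 12345, current );
    assert( std::strcmp( current, "12345" ) == 0 );
}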

File diff suppressed because it is too large.


@@ -1,6 +1,11 @@
// included by json_value.cpp
// everything is within Json namespace
// Copyright 2007-2010 Baptiste Lepilleur
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
// included by json_value.cpp
namespace Json {
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
@@ -11,201 +16,105 @@
// //////////////////////////////////////////////////////////////////
ValueIteratorBase::ValueIteratorBase()
#ifndef JSON_VALUE_USE_INTERNAL_MAP
: current_()
, isNull_( true )
{
: current_(), isNull_(true) {
}
ValueIteratorBase::ValueIteratorBase(
const Value::ObjectValues::iterator& current)
: current_(current), isNull_(false) {}
Value& ValueIteratorBase::deref() const {
return current_->second;
}
void ValueIteratorBase::increment() {
++current_;
}
void ValueIteratorBase::decrement() {
--current_;
}
ValueIteratorBase::difference_type
ValueIteratorBase::computeDistance(const SelfType& other) const {
#ifdef JSON_USE_CPPTL_SMALLMAP
return other.current_ - current_;
#else
: isArray_( true )
, isNull_( true )
{
iterator_.array_ = ValueInternalArray::IteratorState();
}
#endif
// Iterators for null values are initialized using the default
// constructor, which initializes current_ to the default
// std::map::iterator. As begin() and end() are two instances
// of the default std::map::iterator, they cannot be compared.
// To allow this, we handle this comparison specifically.
if (isNull_ && other.isNull_) {
return 0;
}
#ifndef JSON_VALUE_USE_INTERNAL_MAP
ValueIteratorBase::ValueIteratorBase( const Value::ObjectValues::iterator &current )
: current_( current )
, isNull_( false )
{
}
#else
ValueIteratorBase::ValueIteratorBase( const ValueInternalArray::IteratorState &state )
: isArray_( true )
{
iterator_.array_ = state;
}
ValueIteratorBase::ValueIteratorBase( const ValueInternalMap::IteratorState &state )
: isArray_( false )
{
iterator_.map_ = state;
}
#endif
Value &
ValueIteratorBase::deref() const
{
#ifndef JSON_VALUE_USE_INTERNAL_MAP
return current_->second;
#else
if ( isArray_ )
return ValueInternalArray::dereference( iterator_.array_ );
return ValueInternalMap::value( iterator_.map_ );
// Usage of std::distance is not portable (does not compile with Sun Studio 12
// RogueWave STL,
// which is the one used by default).
// Using a portable hand-made version for non random iterator instead:
// return difference_type( std::distance( current_, other.current_ ) );
difference_type myDistance = 0;
for (Value::ObjectValues::iterator it = current_; it != other.current_;
++it) {
++myDistance;
}
return myDistance;
#endif
}
void
ValueIteratorBase::increment()
{
#ifndef JSON_VALUE_USE_INTERNAL_MAP
++current_;
#else
if ( isArray_ )
ValueInternalArray::increment( iterator_.array_ );
ValueInternalMap::increment( iterator_.map_ );
#endif
bool ValueIteratorBase::isEqual(const SelfType& other) const {
if (isNull_) {
return other.isNull_;
}
return current_ == other.current_;
}
void
ValueIteratorBase::decrement()
{
#ifndef JSON_VALUE_USE_INTERNAL_MAP
--current_;
#else
if ( isArray_ )
ValueInternalArray::decrement( iterator_.array_ );
ValueInternalMap::decrement( iterator_.map_ );
#endif
void ValueIteratorBase::copy(const SelfType& other) {
current_ = other.current_;
isNull_ = other.isNull_;
}
ValueIteratorBase::difference_type
ValueIteratorBase::computeDistance( const SelfType &other ) const
{
#ifndef JSON_VALUE_USE_INTERNAL_MAP
# ifdef JSON_USE_CPPTL_SMALLMAP
return current_ - other.current_;
# else
// Iterators for null values are initialized using the default
// constructor, which initializes current_ to the default
// std::map::iterator. As begin() and end() are two instances
// of the default std::map::iterator, they cannot be compared.
// To allow this, we handle this comparison specifically.
if ( isNull_ && other.isNull_ )
{
return 0;
}
// Usage of std::distance is not portable (does not compile with Sun Studio 12 RogueWave STL,
// which is the one used by default).
// Using a portable hand-made version for non random iterator instead:
// return difference_type( std::distance( current_, other.current_ ) );
difference_type myDistance = 0;
for ( Value::ObjectValues::iterator it = current_; it != other.current_; ++it )
{
++myDistance;
}
return myDistance;
# endif
#else
if ( isArray_ )
return ValueInternalArray::distance( iterator_.array_, other.iterator_.array_ );
return ValueInternalMap::distance( iterator_.map_, other.iterator_.map_ );
#endif
Value ValueIteratorBase::key() const {
const Value::CZString czstring = (*current_).first;
if (czstring.data()) {
if (czstring.isStaticString())
return Value(StaticString(czstring.data()));
return Value(czstring.data(), czstring.data() + czstring.length());
}
return Value(czstring.index());
}
bool
ValueIteratorBase::isEqual( const SelfType &other ) const
{
#ifndef JSON_VALUE_USE_INTERNAL_MAP
if ( isNull_ )
{
return other.isNull_;
}
return current_ == other.current_;
#else
if ( isArray_ )
return ValueInternalArray::equals( iterator_.array_, other.iterator_.array_ );
return ValueInternalMap::equals( iterator_.map_, other.iterator_.map_ );
#endif
UInt ValueIteratorBase::index() const {
const Value::CZString czstring = (*current_).first;
if (!czstring.data())
return czstring.index();
return Value::UInt(-1);
}
void
ValueIteratorBase::copy( const SelfType &other )
{
#ifndef JSON_VALUE_USE_INTERNAL_MAP
current_ = other.current_;
#else
if ( isArray_ )
iterator_.array_ = other.iterator_.array_;
iterator_.map_ = other.iterator_.map_;
#endif
std::string ValueIteratorBase::name() const {
char const* key;
char const* end;
key = memberName(&end);
if (!key) return std::string();
return std::string(key, end);
}
Value
ValueIteratorBase::key() const
{
#ifndef JSON_VALUE_USE_INTERNAL_MAP
const Value::CZString czstring = (*current_).first;
if ( czstring.c_str() )
{
if ( czstring.isStaticString() )
return Value( StaticString( czstring.c_str() ) );
return Value( czstring.c_str() );
}
return Value( czstring.index() );
#else
if ( isArray_ )
return Value( ValueInternalArray::indexOf( iterator_.array_ ) );
bool isStatic;
const char *memberName = ValueInternalMap::key( iterator_.map_, isStatic );
if ( isStatic )
return Value( StaticString( memberName ) );
return Value( memberName );
#endif
char const* ValueIteratorBase::memberName() const {
const char* name = (*current_).first.data();
return name ? name : "";
}
UInt
ValueIteratorBase::index() const
{
#ifndef JSON_VALUE_USE_INTERNAL_MAP
const Value::CZString czstring = (*current_).first;
if ( !czstring.c_str() )
return czstring.index();
return Value::UInt( -1 );
#else
if ( isArray_ )
return Value::UInt( ValueInternalArray::indexOf( iterator_.array_ ) );
return Value::UInt( -1 );
#endif
char const* ValueIteratorBase::memberName(char const** end) const {
const char* name = (*current_).first.data();
if (!name) {
*end = NULL;
return NULL;
}
*end = name + (*current_).first.length();
return name;
}
const char *
ValueIteratorBase::memberName() const
{
#ifndef JSON_VALUE_USE_INTERNAL_MAP
const char *name = (*current_).first.c_str();
return name ? name : "";
#else
if ( !isArray_ )
return ValueInternalMap::key( iterator_.map_ );
return "";
#endif
}
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
@@ -214,35 +123,17 @@ ValueIteratorBase::memberName() const
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
ValueConstIterator::ValueConstIterator()
{
}
ValueConstIterator::ValueConstIterator() {}
ValueConstIterator::ValueConstIterator(
const Value::ObjectValues::iterator& current)
: ValueIteratorBase(current) {}
#ifndef JSON_VALUE_USE_INTERNAL_MAP
ValueConstIterator::ValueConstIterator( const Value::ObjectValues::iterator &current )
: ValueIteratorBase( current )
{
ValueConstIterator& ValueConstIterator::
operator=(const ValueIteratorBase& other) {
copy(other);
return *this;
}
#else
ValueConstIterator::ValueConstIterator( const ValueInternalArray::IteratorState &state )
: ValueIteratorBase( state )
{
}
ValueConstIterator::ValueConstIterator( const ValueInternalMap::IteratorState &state )
: ValueIteratorBase( state )
{
}
#endif
ValueConstIterator &
ValueConstIterator::operator =( const ValueIteratorBase &other )
{
copy( other );
return *this;
}
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
@@ -252,41 +143,20 @@ ValueConstIterator::operator =( const ValueIteratorBase &other )
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
ValueIterator::ValueIterator()
{
ValueIterator::ValueIterator() {}
ValueIterator::ValueIterator(const Value::ObjectValues::iterator& current)
: ValueIteratorBase(current) {}
ValueIterator::ValueIterator(const ValueConstIterator& other)
: ValueIteratorBase(other) {}
ValueIterator::ValueIterator(const ValueIterator& other)
: ValueIteratorBase(other) {}
ValueIterator& ValueIterator::operator=(const SelfType& other) {
copy(other);
return *this;
}
#ifndef JSON_VALUE_USE_INTERNAL_MAP
ValueIterator::ValueIterator( const Value::ObjectValues::iterator &current )
: ValueIteratorBase( current )
{
}
#else
ValueIterator::ValueIterator( const ValueInternalArray::IteratorState &state )
: ValueIteratorBase( state )
{
}
ValueIterator::ValueIterator( const ValueInternalMap::IteratorState &state )
: ValueIteratorBase( state )
{
}
#endif
ValueIterator::ValueIterator( const ValueConstIterator &other )
: ValueIteratorBase( other )
{
}
ValueIterator::ValueIterator( const ValueIterator &other )
: ValueIteratorBase( other )
{
}
ValueIterator &
ValueIterator::operator =( const SelfType &other )
{
copy( other );
return *this;
}
} // namespace Json
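From the client side, these iterators are reached through Json::Value's begin()/end(); name() copies the key into a std::string, which stays safe even for keys with embedded nulls thanks to memberName(char const** end). A usage sketch against the public API:
// Client-side usage sketch of the iterator API implemented above.
#include <json/value.h>
#include <cstdio>
#include <string>

void printMembers( const Json::Value &root )
{
    for ( Json::Value::const_iterator it = root.begin(); it != root.end(); ++it )
    {
        std::string key = it.name();      // safe for keys with embedded '\0'
        const Json::Value &value = *it;
        std::printf( "%s : %s", key.c_str(), value.toStyledString().c_str() );
    }
}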

File diff suppressed because it is too large.

src/lib_json/version.h.in (new file, 14 lines)

@@ -0,0 +1,14 @@
// DO NOT EDIT. This file is generated by CMake from "version"
// and "version.h.in" files.
// Run CMake configure step to update it.
#ifndef JSON_VERSION_H_INCLUDED
# define JSON_VERSION_H_INCLUDED
# define JSONCPP_VERSION_STRING "@JSONCPP_VERSION@"
# define JSONCPP_VERSION_MAJOR @JSONCPP_VERSION_MAJOR@
# define JSONCPP_VERSION_MINOR @JSONCPP_VERSION_MINOR@
# define JSONCPP_VERSION_PATCH @JSONCPP_VERSION_PATCH@
# define JSONCPP_VERSION_QUALIFIER
# define JSONCPP_VERSION_HEXA ((JSONCPP_VERSION_MAJOR << 24) | (JSONCPP_VERSION_MINOR << 16) | (JSONCPP_VERSION_PATCH << 8))
#endif // JSON_VERSION_H_INCLUDED
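JSONCPP_VERSION_HEXA packs major/minor/patch into a single integer (1.6.0 becomes 0x01060000), which lets downstream code make compile-time version checks against the generated json/version.h. A sketch, with the 1.6.0 threshold chosen only as an example:
// Hypothetical downstream check; 0x01060000 (1.6.0) is just an example value.
#include <json/version.h>
#include <cstdio>

int main()
{
#if JSONCPP_VERSION_HEXA >= 0x01060000
    std::printf( "jsoncpp %s (>= 1.6.0)\n", JSONCPP_VERSION_STRING );
#else
    std::printf( "jsoncpp %s (pre-1.6.0)\n", JSONCPP_VERSION_STRING );
#endif
    return 0;
}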


@@ -0,0 +1,41 @@
# vim: et ts=4 sts=4 sw=4 tw=0
IF(JSONCPP_LIB_BUILD_SHARED)
ADD_DEFINITIONS( -DJSON_DLL )
ENDIF(JSONCPP_LIB_BUILD_SHARED)
ADD_EXECUTABLE( jsoncpp_test
jsontest.cpp
jsontest.h
main.cpp
)
IF(JSONCPP_LIB_BUILD_SHARED)
TARGET_LINK_LIBRARIES(jsoncpp_test jsoncpp_lib)
ELSE(JSONCPP_LIB_BUILD_SHARED)
TARGET_LINK_LIBRARIES(jsoncpp_test jsoncpp_lib_static)
ENDIF(JSONCPP_LIB_BUILD_SHARED)
# another way to solve issue #90
#set_target_properties(jsoncpp_test PROPERTIES COMPILE_FLAGS -ffloat-store)
# Run unit tests in post-build
# (the default CMake workflow hides the test result away in a file, which makes for a poor dev workflow)
IF(JSONCPP_WITH_POST_BUILD_UNITTEST)
IF(JSONCPP_LIB_BUILD_SHARED)
# First, copy the shared lib, for Microsoft.
# Then, run the test executable.
ADD_CUSTOM_COMMAND( TARGET jsoncpp_test
POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy_if_different $<TARGET_FILE:jsoncpp_lib> $<TARGET_FILE_DIR:jsoncpp_test>
COMMAND $<TARGET_FILE:jsoncpp_test>)
ELSE(JSONCPP_LIB_BUILD_SHARED)
# Just run the test executable.
ADD_CUSTOM_COMMAND( TARGET jsoncpp_test
POST_BUILD
COMMAND $<TARGET_FILE:jsoncpp_test>)
ENDIF(JSONCPP_LIB_BUILD_SHARED)
ENDIF(JSONCPP_WITH_POST_BUILD_UNITTEST)
SET_TARGET_PROPERTIES(jsoncpp_test PROPERTIES OUTPUT_NAME jsoncpp_test)


@@ -1,3 +1,8 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
#define _CRT_SECURE_NO_WARNINGS 1 // Prevents deprecation warning with MSVC
#include "jsontest.h"
#include <stdio.h>
@@ -5,599 +10,434 @@
#if defined(_MSC_VER)
// Used to install a report hook that prevent dialog on assertion and error.
# include <crtdbg.h>
#include <crtdbg.h>
#endif // if defined(_MSC_VER)
#if defined(_WIN32)
// Used to prevent dialog on memory fault.
// Limits headers included by Windows.h
# define WIN32_LEAN_AND_MEAN
# define NOSERVICE
# define NOMCX
# define NOIME
# define NOSOUND
# define NOCOMM
# define NORPC
# define NOGDI
# define NOUSER
# define NODRIVERS
# define NOLOGERROR
# define NOPROFILER
# define NOMEMMGR
# define NOLFILEIO
# define NOOPENFILE
# define NORESOURCE
# define NOATOM
# define NOLANGUAGE
# define NOLSTRING
# define NODBCS
# define NOKEYBOARDINFO
# define NOGDICAPMASKS
# define NOCOLOR
# define NOGDIOBJ
# define NODRAWTEXT
# define NOTEXTMETRIC
# define NOSCALABLEFONT
# define NOBITMAP
# define NORASTEROPS
# define NOMETAFILE
# define NOSYSMETRICS
# define NOSYSTEMPARAMSINFO
# define NOMSG
# define NOWINSTYLES
# define NOWINOFFSETS
# define NOSHOWWINDOW
# define NODEFERWINDOWPOS
# define NOVIRTUALKEYCODES
# define NOKEYSTATES
# define NOWH
# define NOMENUS
# define NOSCROLL
# define NOCLIPBOARD
# define NOICONS
# define NOMB
# define NOSYSCOMMANDS
# define NOMDI
# define NOCTLMGR
# define NOWINMESSAGES
# include <windows.h>
#define WIN32_LEAN_AND_MEAN
#define NOSERVICE
#define NOMCX
#define NOIME
#define NOSOUND
#define NOCOMM
#define NORPC
#define NOGDI
#define NOUSER
#define NODRIVERS
#define NOLOGERROR
#define NOPROFILER
#define NOMEMMGR
#define NOLFILEIO
#define NOOPENFILE
#define NORESOURCE
#define NOATOM
#define NOLANGUAGE
#define NOLSTRING
#define NODBCS
#define NOKEYBOARDINFO
#define NOGDICAPMASKS
#define NOCOLOR
#define NOGDIOBJ
#define NODRAWTEXT
#define NOTEXTMETRIC
#define NOSCALABLEFONT
#define NOBITMAP
#define NORASTEROPS
#define NOMETAFILE
#define NOSYSMETRICS
#define NOSYSTEMPARAMSINFO
#define NOMSG
#define NOWINSTYLES
#define NOWINOFFSETS
#define NOSHOWWINDOW
#define NODEFERWINDOWPOS
#define NOVIRTUALKEYCODES
#define NOKEYSTATES
#define NOWH
#define NOMENUS
#define NOSCROLL
#define NOCLIPBOARD
#define NOICONS
#define NOMB
#define NOSYSCOMMANDS
#define NOMDI
#define NOCTLMGR
#define NOWINMESSAGES
#include <windows.h>
#endif // if defined(_WIN32)
namespace JsonTest {
// class TestResult
// //////////////////////////////////////////////////////////////////
TestResult::TestResult()
: predicateId_( 1 )
, lastUsedPredicateId_( 0 )
, messageTarget_( 0 )
{
// The root predicate has id 0
rootPredicateNode_.id_ = 0;
rootPredicateNode_.next_ = 0;
predicateStackTail_ = &rootPredicateNode_;
: predicateId_(1), lastUsedPredicateId_(0), messageTarget_(0) {
// The root predicate has id 0
rootPredicateNode_.id_ = 0;
rootPredicateNode_.next_ = 0;
predicateStackTail_ = &rootPredicateNode_;
}
void TestResult::setTestName(const std::string& name) { name_ = name; }
void
TestResult::setTestName( const std::string &name )
{
name_ = name;
TestResult&
TestResult::addFailure(const char* file, unsigned int line, const char* expr) {
/// Walks the PredicateContext stack adding them to failures_ if not already
/// added.
unsigned int nestingLevel = 0;
PredicateContext* lastNode = rootPredicateNode_.next_;
for (; lastNode != 0; lastNode = lastNode->next_) {
if (lastNode->id_ > lastUsedPredicateId_) // new PredicateContext
{
lastUsedPredicateId_ = lastNode->id_;
addFailureInfo(
lastNode->file_, lastNode->line_, lastNode->expr_, nestingLevel);
// Link the PredicateContext to the failure for message target when
// popping the PredicateContext.
lastNode->failure_ = &(failures_.back());
}
++nestingLevel;
}
// Adds the failed assertion
addFailureInfo(file, line, expr, nestingLevel);
messageTarget_ = &(failures_.back());
return *this;
}
TestResult &
TestResult::addFailure( const char *file, unsigned int line,
const char *expr )
{
/// Walks the PredicateContext stack adding them to failures_ if not already added.
unsigned int nestingLevel = 0;
PredicateContext *lastNode = rootPredicateNode_.next_;
for ( ; lastNode != 0; lastNode = lastNode->next_ )
{
if ( lastNode->id_ > lastUsedPredicateId_ ) // new PredicateContext
{
lastUsedPredicateId_ = lastNode->id_;
addFailureInfo( lastNode->file_, lastNode->line_, lastNode->expr_,
nestingLevel );
// Link the PredicateContext to the failure for message target when
// popping the PredicateContext.
lastNode->failure_ = &( failures_.back() );
}
++nestingLevel;
}
// Adds the failed assertion
addFailureInfo( file, line, expr, nestingLevel );
messageTarget_ = &( failures_.back() );
return *this;
void TestResult::addFailureInfo(const char* file,
unsigned int line,
const char* expr,
unsigned int nestingLevel) {
Failure failure;
failure.file_ = file;
failure.line_ = line;
if (expr) {
failure.expr_ = expr;
}
failure.nestingLevel_ = nestingLevel;
failures_.push_back(failure);
}
void
TestResult::addFailureInfo( const char *file, unsigned int line,
const char *expr, unsigned int nestingLevel )
{
Failure failure;
failure.file_ = file;
failure.line_ = line;
if ( expr )
{
failure.expr_ = expr;
}
failure.nestingLevel_ = nestingLevel;
failures_.push_back( failure );
TestResult& TestResult::popPredicateContext() {
PredicateContext* lastNode = &rootPredicateNode_;
while (lastNode->next_ != 0 && lastNode->next_->next_ != 0) {
lastNode = lastNode->next_;
}
// Set message target to popped failure
PredicateContext* tail = lastNode->next_;
if (tail != 0 && tail->failure_ != 0) {
messageTarget_ = tail->failure_;
}
// Remove tail from list
predicateStackTail_ = lastNode;
lastNode->next_ = 0;
return *this;
}
bool TestResult::failed() const { return !failures_.empty(); }
TestResult &
TestResult::popPredicateContext()
{
PredicateContext *lastNode = &rootPredicateNode_;
while ( lastNode->next_ != 0 && lastNode->next_->next_ != 0 )
{
lastNode = lastNode->next_;
}
// Set message target to popped failure
PredicateContext *tail = lastNode->next_;
if ( tail != 0 && tail->failure_ != 0 )
{
messageTarget_ = tail->failure_;
}
// Remove tail from list
predicateStackTail_ = lastNode;
lastNode->next_ = 0;
return *this;
unsigned int TestResult::getAssertionNestingLevel() const {
unsigned int level = 0;
const PredicateContext* lastNode = &rootPredicateNode_;
while (lastNode->next_ != 0) {
lastNode = lastNode->next_;
++level;
}
return level;
}
void TestResult::printFailure(bool printTestName) const {
if (failures_.empty()) {
return;
}
bool
TestResult::failed() const
{
return !failures_.empty();
if (printTestName) {
printf("* Detail of %s test failure:\n", name_.c_str());
}
// Print in reverse to display the callstack in the right order
Failures::const_iterator itEnd = failures_.end();
for (Failures::const_iterator it = failures_.begin(); it != itEnd; ++it) {
const Failure& failure = *it;
std::string indent(failure.nestingLevel_ * 2, ' ');
if (failure.file_) {
printf("%s%s(%d): ", indent.c_str(), failure.file_, failure.line_);
}
if (!failure.expr_.empty()) {
printf("%s\n", failure.expr_.c_str());
} else if (failure.file_) {
printf("\n");
}
if (!failure.message_.empty()) {
std::string reindented = indentText(failure.message_, indent + " ");
printf("%s\n", reindented.c_str());
}
}
}
unsigned int
TestResult::getAssertionNestingLevel() const
{
unsigned int level = 0;
const PredicateContext *lastNode = &rootPredicateNode_;
while ( lastNode->next_ != 0 )
{
lastNode = lastNode->next_;
++level;
}
return level;
std::string TestResult::indentText(const std::string& text,
const std::string& indent) {
std::string reindented;
std::string::size_type lastIndex = 0;
while (lastIndex < text.size()) {
std::string::size_type nextIndex = text.find('\n', lastIndex);
if (nextIndex == std::string::npos) {
nextIndex = text.size() - 1;
}
reindented += indent;
reindented += text.substr(lastIndex, nextIndex - lastIndex + 1);
lastIndex = nextIndex + 1;
}
return reindented;
}
void
TestResult::printFailure( bool printTestName ) const
{
if ( failures_.empty() )
{
return;
}
if ( printTestName )
{
printf( "* Detail of %s test failure:\n", name_.c_str() );
}
// Print in reverse to display the callstack in the right order
Failures::const_iterator itEnd = failures_.end();
for ( Failures::const_iterator it = failures_.begin(); it != itEnd; ++it )
{
const Failure &failure = *it;
std::string indent( failure.nestingLevel_ * 2, ' ' );
if ( failure.file_ )
{
printf( "%s%s(%d): ", indent.c_str(), failure.file_, failure.line_ );
}
if ( !failure.expr_.empty() )
{
printf( "%s\n", failure.expr_.c_str() );
}
else if ( failure.file_ )
{
printf( "\n" );
}
if ( !failure.message_.empty() )
{
std::string reindented = indentText( failure.message_, indent + " " );
printf( "%s\n", reindented.c_str() );
}
}
TestResult& TestResult::addToLastFailure(const std::string& message) {
if (messageTarget_ != 0) {
messageTarget_->message_ += message;
}
return *this;
}
std::string
TestResult::indentText( const std::string &text,
const std::string &indent )
{
std::string reindented;
std::string::size_type lastIndex = 0;
while ( lastIndex < text.size() )
{
std::string::size_type nextIndex = text.find( '\n', lastIndex );
if ( nextIndex == std::string::npos )
{
nextIndex = text.size() - 1;
}
reindented += indent;
reindented += text.substr( lastIndex, nextIndex - lastIndex + 1 );
lastIndex = nextIndex + 1;
}
return reindented;
TestResult& TestResult::operator<<(Json::Int64 value) {
return addToLastFailure(Json::valueToString(value));
}
TestResult &
TestResult::addToLastFailure( const std::string &message )
{
if ( messageTarget_ != 0 )
{
messageTarget_->message_ += message;
}
return *this;
TestResult& TestResult::operator<<(Json::UInt64 value) {
return addToLastFailure(Json::valueToString(value));
}
TestResult &
TestResult::operator << ( bool value )
{
return addToLastFailure( value ? "true" : "false" );
TestResult& TestResult::operator<<(bool value) {
return addToLastFailure(value ? "true" : "false");
}
TestResult &
TestResult::operator << ( int value )
{
char buffer[32];
sprintf( buffer, "%d", value );
return addToLastFailure( buffer );
}
TestResult &
TestResult::operator << ( unsigned int value )
{
char buffer[32];
sprintf( buffer, "%u", value );
return addToLastFailure( buffer );
}
TestResult &
TestResult::operator << ( double value )
{
char buffer[32];
sprintf( buffer, "%16g", value );
return addToLastFailure( buffer );
}
TestResult &
TestResult::operator << ( const char *value )
{
return addToLastFailure( value ? value
: "<NULL>" );
}
TestResult &
TestResult::operator << ( const std::string &value )
{
return addToLastFailure( value );
}
// class TestCase
// //////////////////////////////////////////////////////////////////
TestCase::TestCase()
: result_( 0 )
{
TestCase::TestCase() : result_(0) {}
TestCase::~TestCase() {}
void TestCase::run(TestResult& result) {
result_ = &result;
runTestCase();
}
TestCase::~TestCase()
{
}
void
TestCase::run( TestResult &result )
{
result_ = &result;
runTestCase();
}
// class Runner
// //////////////////////////////////////////////////////////////////
Runner::Runner()
{
Runner::Runner() {}
Runner& Runner::add(TestCaseFactory factory) {
tests_.push_back(factory);
return *this;
}
Runner &
Runner::add( TestCaseFactory factory )
{
tests_.push_back( factory );
return *this;
unsigned int Runner::testCount() const {
return static_cast<unsigned int>(tests_.size());
}
unsigned int
Runner::testCount() const
{
return static_cast<unsigned int>( tests_.size() );
std::string Runner::testNameAt(unsigned int index) const {
TestCase* test = tests_[index]();
std::string name = test->testName();
delete test;
return name;
}
std::string
Runner::testNameAt( unsigned int index ) const
{
TestCase *test = tests_[index]();
std::string name = test->testName();
delete test;
return name;
}
void
Runner::runTestAt( unsigned int index, TestResult &result ) const
{
TestCase *test = tests_[index]();
result.setTestName( test->testName() );
printf( "Testing %s: ", test->testName() );
fflush( stdout );
void Runner::runTestAt(unsigned int index, TestResult& result) const {
TestCase* test = tests_[index]();
result.setTestName(test->testName());
printf("Testing %s: ", test->testName());
fflush(stdout);
#if JSON_USE_EXCEPTION
try
{
try {
#endif // if JSON_USE_EXCEPTION
test->run( result );
test->run(result);
#if JSON_USE_EXCEPTION
}
catch ( const std::exception &e )
{
result.addFailure( __FILE__, __LINE__,
"Unexpected exception caugth:" ) << e.what();
}
}
catch (const std::exception& e) {
result.addFailure(__FILE__, __LINE__, "Unexpected exception caught:")
<< e.what();
}
#endif // if JSON_USE_EXCEPTION
delete test;
const char *status = result.failed() ? "FAILED"
: "OK";
printf( "%s\n", status );
fflush( stdout );
delete test;
const char* status = result.failed() ? "FAILED" : "OK";
printf("%s\n", status);
fflush(stdout);
}
bool Runner::runAllTest(bool printSummary) const {
unsigned int count = testCount();
std::deque<TestResult> failures;
for (unsigned int index = 0; index < count; ++index) {
TestResult result;
runTestAt(index, result);
if (result.failed()) {
failures.push_back(result);
}
}
bool
Runner::runAllTest( bool printSummary ) const
{
unsigned int count = testCount();
std::deque<TestResult> failures;
for ( unsigned int index = 0; index < count; ++index )
{
TestResult result;
runTestAt( index, result );
if ( result.failed() )
{
failures.push_back( result );
}
}
if (failures.empty()) {
if (printSummary) {
printf("All %d tests passed\n", count);
}
return true;
} else {
for (unsigned int index = 0; index < failures.size(); ++index) {
TestResult& result = failures[index];
result.printFailure(count > 1);
}
if ( failures.empty() )
{
if ( printSummary )
{
printf( "All %d tests passed\n", count );
}
if (printSummary) {
unsigned int failedCount = static_cast<unsigned int>(failures.size());
unsigned int passedCount = count - failedCount;
printf("%d/%d tests passed (%d failure(s))\n",
passedCount,
count,
failedCount);
}
return false;
}
}
bool Runner::testIndex(const std::string& testName,
unsigned int& indexOut) const {
unsigned int count = testCount();
for (unsigned int index = 0; index < count; ++index) {
if (testNameAt(index) == testName) {
indexOut = index;
return true;
}
else
{
for ( unsigned int index = 0; index < failures.size(); ++index )
{
TestResult &result = failures[index];
result.printFailure( count > 1 );
}
if ( printSummary )
{
unsigned int failedCount = static_cast<unsigned int>( failures.size() );
unsigned int passedCount = count - failedCount;
printf( "%d/%d tests passed (%d failure(s))\n", passedCount, count, failedCount );
}
return false;
}
}
}
return false;
}
bool
Runner::testIndex( const std::string &testName,
unsigned int &indexOut ) const
{
unsigned int count = testCount();
for ( unsigned int index = 0; index < count; ++index )
{
if ( testNameAt(index) == testName )
{
indexOut = index;
return true;
}
}
return false;
void Runner::listTests() const {
unsigned int count = testCount();
for (unsigned int index = 0; index < count; ++index) {
printf("%s\n", testNameAt(index).c_str());
}
}
void
Runner::listTests() const
{
unsigned int count = testCount();
for ( unsigned int index = 0; index < count; ++index )
{
printf( "%s\n", testNameAt( index ).c_str() );
}
int Runner::runCommandLine(int argc, const char* argv[]) const {
// typedef std::deque<std::string> TestNames;
Runner subrunner;
for (int index = 1; index < argc; ++index) {
std::string opt = argv[index];
if (opt == "--list-tests") {
listTests();
return 0;
} else if (opt == "--test-auto") {
preventDialogOnCrash();
} else if (opt == "--test") {
++index;
if (index < argc) {
unsigned int testNameIndex;
if (testIndex(argv[index], testNameIndex)) {
subrunner.add(tests_[testNameIndex]);
} else {
fprintf(stderr, "Test '%s' does not exist!\n", argv[index]);
return 2;
}
} else {
printUsage(argv[0]);
return 2;
}
} else {
printUsage(argv[0]);
return 2;
}
}
bool succeeded;
if (subrunner.testCount() > 0) {
succeeded = subrunner.runAllTest(subrunner.testCount() > 1);
} else {
succeeded = runAllTest(true);
}
return succeeded ? 0 : 1;
}
int
Runner::runCommandLine( int argc, const char *argv[] ) const
{
typedef std::deque<std::string> TestNames;
Runner subrunner;
for ( int index = 1; index < argc; ++index )
{
std::string opt = argv[index];
if ( opt == "--list-tests" )
{
listTests();
return 0;
}
else if ( opt == "--test-auto" )
{
preventDialogOnCrash();
}
else if ( opt == "--test" )
{
++index;
if ( index < argc )
{
unsigned int testNameIndex;
if ( testIndex( argv[index], testNameIndex ) )
{
subrunner.add( tests_[testNameIndex] );
}
else
{
fprintf( stderr, "Test '%s' does not exist!\n", argv[index] );
return 2;
}
}
else
{
printUsage( argv[0] );
return 2;
}
}
else
{
printUsage( argv[0] );
return 2;
}
}
bool succeeded;
if ( subrunner.testCount() > 0 )
{
succeeded = subrunner.runAllTest( subrunner.testCount() > 1 );
}
else
{
succeeded = runAllTest( true );
}
return succeeded ? 0
: 1;
}
#if defined(_MSC_VER)
#if defined(_MSC_VER) && defined(_DEBUG)
// Hook MSVCRT assertions to prevent dialog from appearing
static int
msvcrtSilentReportHook( int reportType, char *message, int *returnValue )
{
// The default CRT handling of error and assertion is to display
// an error dialog to the user.
// Instead, when an error or an assertion occurs, we force the
// application to terminate using abort() after displaying
// the message on stderr.
if ( reportType == _CRT_ERROR ||
reportType == _CRT_ASSERT )
{
// Calling abort() causes the ReportHook to be called again.
// The following is used to detect this case and lets the
// error handler fall back on its default behaviour
// (displaying a warning message).
static volatile bool isAborting = false;
if ( isAborting )
{
return TRUE;
}
isAborting = true;
static int
msvcrtSilentReportHook(int reportType, char* message, int* /*returnValue*/) {
// The default CRT handling of error and assertion is to display
// an error dialog to the user.
// Instead, when an error or an assertion occurs, we force the
// application to terminate using abort() after displaying
// the message on stderr.
if (reportType == _CRT_ERROR || reportType == _CRT_ASSERT) {
// Calling abort() causes the ReportHook to be called again.
// The following is used to detect this case and lets the
// error handler fall back on its default behaviour
// (displaying a warning message).
static volatile bool isAborting = false;
if (isAborting) {
return TRUE;
}
isAborting = true;
fprintf( stderr, "CRT Error/Assert:\n%s\n", message );
fflush( stderr );
abort();
}
// Let other report types (_CRT_WARNING) be handled as they would be by default
return FALSE;
fprintf(stderr, "CRT Error/Assert:\n%s\n", message);
fflush(stderr);
abort();
}
// Let other reportTypes (_CRT_WARNING) be handled as they would be by default
return FALSE;
}
#endif // if defined(_MSC_VER)
void
Runner::preventDialogOnCrash()
{
#if defined(_MSC_VER)
// Install a hook to prevent MSVCRT error and assertion from
// popping a dialog.
_CrtSetReportHook( &msvcrtSilentReportHook );
void Runner::preventDialogOnCrash() {
#if defined(_MSC_VER) && defined(_DEBUG)
// Install a hook to prevent MSVCRT error and assertion from
// popping a dialog
// This function is a no-op in the release configuration
// (which would otherwise cause a warning, since msvcrtSilentReportHook is not referenced)
_CrtSetReportHook(&msvcrtSilentReportHook);
#endif // if defined(_MSC_VER)
// @todo investigate this handler (for buffer overflow)
// _set_security_error_handler
#if defined(_WIN32)
// Prevents the system from popping a dialog for debugging if the
// application fails due to invalid memory access.
SetErrorMode( SEM_FAILCRITICALERRORS
| SEM_NOGPFAULTERRORBOX
| SEM_NOOPENFILEERRORBOX );
// Prevents the system from popping a dialog for debugging if the
// application fails due to invalid memory access.
SetErrorMode(SEM_FAILCRITICALERRORS | SEM_NOGPFAULTERRORBOX |
SEM_NOOPENFILEERRORBOX);
#endif // if defined(_WIN32)
}
void
Runner::printUsage( const char *appName )
{
printf(
"Usage: %s [options]\n"
"\n"
"If --test is not specified, then all the test cases be run.\n"
"\n"
"Valid options:\n"
"--list-tests: print the name of all test cases on the standard\n"
" output and exit.\n"
"--test TESTNAME: executes the test case with the specified name.\n"
" May be repeated.\n"
"--test-auto: prevent dialog prompting for debugging on crash.\n"
, appName );
void Runner::printUsage(const char* appName) {
printf("Usage: %s [options]\n"
"\n"
"If --test is not specified, then all the test cases be run.\n"
"\n"
"Valid options:\n"
"--list-tests: print the name of all test cases on the standard\n"
" output and exit.\n"
"--test TESTNAME: executes the test case with the specified name.\n"
" May be repeated.\n"
"--test-auto: prevent dialog prompting for debugging on crash.\n",
appName);
}
// Assertion functions
// //////////////////////////////////////////////////////////////////
TestResult &
checkStringEqual( TestResult &result,
const std::string &expected, const std::string &actual,
const char *file, unsigned int line, const char *expr )
{
if ( expected != actual )
{
result.addFailure( file, line, expr );
result << "Expected: '" << expected << "'\n";
result << "Actual : '" << actual << "'";
}
return result;
TestResult& checkStringEqual(TestResult& result,
const std::string& expected,
const std::string& actual,
const char* file,
unsigned int line,
const char* expr) {
if (expected != actual) {
result.addFailure(file, line, expr);
result << "Expected: '" << expected << "'\n";
result << "Actual : '" << actual << "'";
}
return result;
}
} // namespace JsonTest

View File

@@ -1,10 +1,18 @@
#ifndef JSONTEST_H_INCLUDED
# define JSONTEST_H_INCLUDED
// Copyright 2007-2010 Baptiste Lepilleur
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
# include <json/config.h>
# include <stdio.h>
# include <deque>
# include <string>
#ifndef JSONTEST_H_INCLUDED
#define JSONTEST_H_INCLUDED
#include <json/config.h>
#include <json/value.h>
#include <json/writer.h>
#include <stdio.h>
#include <deque>
#include <sstream>
#include <string>
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
@@ -12,8 +20,6 @@
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
/** \brief Unit testing framework.
* \warning: all assertions are non-aborting, test case execution will continue
* even if an assertion fails.
@@ -22,233 +28,253 @@
*/
namespace JsonTest {
class Failure {
public:
const char* file_;
unsigned int line_;
std::string expr_;
std::string message_;
unsigned int nestingLevel_;
};
class Failure
{
public:
const char *file_;
unsigned int line_;
std::string expr_;
std::string message_;
unsigned int nestingLevel_;
};
/// Context used to create the assertion callstack on failure.
/// Must be a POD to allow inline initialisation without stepping
/// into the debugger.
struct PredicateContext {
typedef unsigned int Id;
Id id_;
const char* file_;
unsigned int line_;
const char* expr_;
PredicateContext* next_;
/// Related Failure, set when the PredicateContext is converted
/// into a Failure.
Failure* failure_;
};
class TestResult {
public:
TestResult();
/// Context used to create the assertion callstack on failure.
/// Must be a POD to allow inline initialisation without stepping
/// into the debugger.
struct PredicateContext
{
typedef unsigned int Id;
Id id_;
const char *file_;
unsigned int line_;
const char *expr_;
PredicateContext *next_;
/// Related Failure, set when the PredicateContext is converted
/// into a Failure.
Failure *failure_;
};
/// \internal Implementation detail for assertion macros
/// Not encapsulated, to avoid stepping into accessors when debugging failed assertions
/// Incremented by one on assertion predicate entry, decreased by one
/// by addPredicateContext().
PredicateContext::Id predicateId_;
class TestResult
{
public:
TestResult();
/// \internal Implementation detail for predicate macros
PredicateContext* predicateStackTail_;
/// \internal Implementation detail for assertion macros
/// Not encapsulated, to avoid stepping into accessors when debugging failed assertions
/// Incremented by one on assertion predicate entry, decreased by one
/// by addPredicateContext().
PredicateContext::Id predicateId_;
void setTestName(const std::string& name);
/// \internal Implementation detail for predicate macros
PredicateContext *predicateStackTail_;
/// Adds an assertion failure.
TestResult&
addFailure(const char* file, unsigned int line, const char* expr = 0);
void setTestName( const std::string &name );
/// Removes the last PredicateContext added to the predicate stack
/// chained list.
/// Next messages will be targeted at the PredicateContext that was removed.
TestResult& popPredicateContext();
/// Adds an assertion failure.
TestResult &addFailure( const char *file, unsigned int line,
const char *expr = 0 );
bool failed() const;
/// Removes the last PredicateContext added to the predicate stack
/// chained list.
/// Next messages will be targeted at the PredicateContext that was removed.
TestResult &popPredicateContext();
void printFailure(bool printTestName) const;
bool failed() const;
// Generic operator that will work with anything ostream can deal with.
template <typename T> TestResult& operator<<(const T& value) {
std::ostringstream oss;
oss.precision(16);
oss.setf(std::ios_base::floatfield);
oss << value;
return addToLastFailure(oss.str());
}
void printFailure( bool printTestName ) const;
// Specialized versions.
TestResult& operator<<(bool value);
// std::ostream does not support 64-bit integers on all STL implementations
TestResult& operator<<(Json::Int64 value);
TestResult& operator<<(Json::UInt64 value);
TestResult &operator << ( bool value );
TestResult &operator << ( int value );
TestResult &operator << ( unsigned int value );
TestResult &operator << ( double value );
TestResult &operator << ( const char *value );
TestResult &operator << ( const std::string &value );
private:
TestResult& addToLastFailure(const std::string& message);
unsigned int getAssertionNestingLevel() const;
/// Adds a failure or a predicate context
void addFailureInfo(const char* file,
unsigned int line,
const char* expr,
unsigned int nestingLevel);
static std::string indentText(const std::string& text,
const std::string& indent);
private:
TestResult &addToLastFailure( const std::string &message );
unsigned int getAssertionNestingLevel() const;
/// Adds a failure or a predicate context
void addFailureInfo( const char *file, unsigned int line,
const char *expr, unsigned int nestingLevel );
static std::string indentText( const std::string &text,
const std::string &indent );
typedef std::deque<Failure> Failures;
Failures failures_;
std::string name_;
PredicateContext rootPredicateNode_;
PredicateContext::Id lastUsedPredicateId_;
/// Failure which is the target of the messages added using operator <<
Failure* messageTarget_;
};
typedef std::deque<Failure> Failures;
Failures failures_;
std::string name_;
PredicateContext rootPredicateNode_;
PredicateContext::Id lastUsedPredicateId_;
/// Failure which is the target of the messages added using operator <<
Failure *messageTarget_;
};
class TestCase {
public:
TestCase();
virtual ~TestCase();
class TestCase
{
public:
TestCase();
void run(TestResult& result);
virtual ~TestCase();
virtual const char* testName() const = 0;
void run( TestResult &result );
protected:
TestResult* result_;
virtual const char *testName() const = 0;
private:
virtual void runTestCase() = 0;
};
protected:
TestResult *result_;
/// Function pointer type for TestCase factory
typedef TestCase* (*TestCaseFactory)();
private:
virtual void runTestCase() = 0;
};
class Runner {
public:
Runner();
/// Function pointer type for TestCase factory
typedef TestCase *(*TestCaseFactory)();
/// Adds a test to the suite
Runner& add(TestCaseFactory factory);
class Runner
{
public:
Runner();
/// Runs the tests specified on the command-line
/// If no command-line arguments are provided, run all tests.
/// If --list-tests is provided, then print the list of all test cases
/// If --test <testname> is provided, then run test testname.
int runCommandLine(int argc, const char* argv[]) const;
/// Adds a test to the suite
Runner &add( TestCaseFactory factory );
/// Runs all the test cases
bool runAllTest(bool printSummary) const;
/// Runs the tests specified on the command-line
/// If no command-line arguments are provided, run all tests.
/// If --list-tests is provided, then print the list of all test cases
/// If --test <testname> is provided, then run test testname.
int runCommandLine( int argc, const char *argv[] ) const;
/// Returns the number of test cases in the suite
unsigned int testCount() const;
/// Runs all the test cases
bool runAllTest( bool printSummary ) const;
/// Returns the name of the test case at the specified index
std::string testNameAt(unsigned int index) const;
/// Returns the number of test cases in the suite
unsigned int testCount() const;
/// Runs the test case at the specified index using the specified TestResult
void runTestAt(unsigned int index, TestResult& result) const;
/// Returns the name of the test case at the specified index
std::string testNameAt( unsigned int index ) const;
static void printUsage(const char* appName);
/// Runs the test case at the specified index using the specified TestResult
void runTestAt( unsigned int index, TestResult &result ) const;
private: // prevents copy construction and assignment
Runner(const Runner& other);
Runner& operator=(const Runner& other);
static void printUsage( const char *appName );
private:
void listTests() const;
bool testIndex(const std::string& testName, unsigned int& index) const;
static void preventDialogOnCrash();
private: // prevents copy construction and assignment
Runner( const Runner &other );
Runner &operator =( const Runner &other );
private:
typedef std::deque<TestCaseFactory> Factories;
Factories tests_;
};
private:
void listTests() const;
bool testIndex( const std::string &testName, unsigned int &index ) const;
static void preventDialogOnCrash();
template <typename T, typename U>
TestResult& checkEqual(TestResult& result,
T expected,
U actual,
const char* file,
unsigned int line,
const char* expr) {
if (static_cast<U>(expected) != actual) {
result.addFailure(file, line, expr);
result << "Expected: " << static_cast<U>(expected) << "\n";
result << "Actual : " << actual;
}
return result;
}
private:
typedef std::deque<TestCaseFactory> Factories;
Factories tests_;
};
template<typename T>
TestResult &
checkEqual( TestResult &result, const T &expected, const T &actual,
const char *file, unsigned int line, const char *expr )
{
if ( expected != actual )
{
result.addFailure( file, line, expr );
result << "Expected: " << expected << "\n";
result << "Actual : " << actual;
}
return result;
}
TestResult &
checkStringEqual( TestResult &result,
const std::string &expected, const std::string &actual,
const char *file, unsigned int line, const char *expr );
TestResult& checkStringEqual(TestResult& result,
const std::string& expected,
const std::string& actual,
const char* file,
unsigned int line,
const char* expr);
} // namespace JsonTest
/// \brief Asserts that the given expression is true.
/// JSONTEST_ASSERT( x == y ) << "x=" << x << ", y=" << y;
/// JSONTEST_ASSERT( x == y );
#define JSONTEST_ASSERT( expr ) \
if ( expr ) \
{ \
} \
else \
result_->addFailure( __FILE__, __LINE__, #expr )
#define JSONTEST_ASSERT(expr) \
if (expr) { \
} else \
result_->addFailure(__FILE__, __LINE__, #expr)
/// \brief Asserts that the given predicate is true.
/// The predicate may do other assertions and be a member function of the fixture.
#define JSONTEST_ASSERT_PRED( expr ) \
{ \
JsonTest::PredicateContext _minitest_Context = { \
result_->predicateId_, __FILE__, __LINE__, #expr }; \
result_->predicateStackTail_->next_ = &_minitest_Context; \
result_->predicateId_ += 1; \
result_->predicateStackTail_ = &_minitest_Context; \
(expr); \
result_->popPredicateContext(); \
} \
*result_
/// The predicate may do other assertions and be a member function of the
/// fixture.
#define JSONTEST_ASSERT_PRED(expr) \
{ \
JsonTest::PredicateContext _minitest_Context = { \
result_->predicateId_, __FILE__, __LINE__, #expr, NULL, NULL \
}; \
result_->predicateStackTail_->next_ = &_minitest_Context; \
result_->predicateId_ += 1; \
result_->predicateStackTail_ = &_minitest_Context; \
(expr); \
result_->popPredicateContext(); \
}
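The predicate form above is easy to misread, so here is a hedged sketch of typical usage: the predicate is an ordinary function (often a member of the fixture) that performs its own assertions, and the macro pushes a PredicateContext so any failure inside it is reported with the call site as an extra stack level. The fixture and helper names below are illustrative only, not part of this diff, and the sketch relies on the JSONTEST_FIXTURE and JSONTEST_ASSERT macros defined elsewhere in this header.

// Hypothetical fixture with a helper predicate (sketch only).
struct PredicateFixture : JsonTest::TestCase {
  void checkIsPositive(int x) {
    JSONTEST_ASSERT(x > 0) << "x=" << x;  // nested, non-aborting assertion
  }
};

JSONTEST_FIXTURE(PredicateFixture, usesPredicate) {
  // On failure, the report also shows this line as the predicate's call site.
  JSONTEST_ASSERT_PRED(checkIsPositive(42));
}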
/// \brief Asserts that two values are equal.
#define JSONTEST_ASSERT_EQUAL( expected, actual ) \
JsonTest::checkEqual( *result_, expected, actual, \
__FILE__, __LINE__, \
#expected " == " #actual )
#define JSONTEST_ASSERT_EQUAL(expected, actual) \
JsonTest::checkEqual(*result_, \
expected, \
actual, \
__FILE__, \
__LINE__, \
#expected " == " #actual)
/// \brief Asserts that two values are equal.
#define JSONTEST_ASSERT_STRING_EQUAL( expected, actual ) \
JsonTest::checkStringEqual( *result_, \
std::string(expected), std::string(actual), \
__FILE__, __LINE__, \
#expected " == " #actual )
#define JSONTEST_ASSERT_STRING_EQUAL(expected, actual) \
JsonTest::checkStringEqual(*result_, \
std::string(expected), \
std::string(actual), \
__FILE__, \
__LINE__, \
#expected " == " #actual)
/// \brief Asserts that a given expression throws an exception
#define JSONTEST_ASSERT_THROWS(expr) \
{ \
bool _threw = false; \
try { \
expr; \
} \
catch (...) { \
_threw = true; \
} \
if (!_threw) \
result_->addFailure( \
__FILE__, __LINE__, "expected exception thrown: " #expr); \
}
/// \brief Begin a fixture test case.
#define JSONTEST_FIXTURE( FixtureType, name ) \
class Test##FixtureType##name : public FixtureType \
{ \
public: \
static JsonTest::TestCase *factory() \
{ \
return new Test##FixtureType##name(); \
} \
public: /* overridden from TestCase */ \
virtual const char *testName() const \
{ \
return #FixtureType "/" #name; \
} \
virtual void runTestCase(); \
}; \
\
void Test##FixtureType##name::runTestCase()
#define JSONTEST_FIXTURE(FixtureType, name) \
class Test##FixtureType##name : public FixtureType { \
public: \
static JsonTest::TestCase* factory() { \
return new Test##FixtureType##name(); \
} \
\
public: /* overridden from TestCase */ \
virtual const char* testName() const { return #FixtureType "/" #name; } \
virtual void runTestCase(); \
}; \
\
void Test##FixtureType##name::runTestCase()
#define JSONTEST_FIXTURE_FACTORY( FixtureType, name ) \
&Test##FixtureType##name::factory
#define JSONTEST_FIXTURE_FACTORY(FixtureType, name) \
&Test##FixtureType##name::factory
#define JSONTEST_REGISTER_FIXTURE( runner, FixtureType, name ) \
(runner).add( JSONTEST_FIXTURE_FACTORY( FixtureType, name ) )
#define JSONTEST_REGISTER_FIXTURE(runner, FixtureType, name) \
(runner).add(JSONTEST_FIXTURE_FACTORY(FixtureType, name))
#endif // ifndef JSONTEST_H_INCLUDED
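Taken together, Runner, the checkEqual/checkStringEqual helpers, and the JSONTEST_* macros form a small self-contained harness. The sketch below shows how a caller might wire them up; the header name, fixture name, and test body are assumptions made for illustration and are not part of this diff.

#include "jsontest.h"

// Hypothetical fixture: any class derived from JsonTest::TestCase will do,
// because JSONTEST_FIXTURE generates a subclass that overrides testName()
// and runTestCase().
struct ExampleFixture : JsonTest::TestCase {};

JSONTEST_FIXTURE(ExampleFixture, equality) {
  // Assertions are non-aborting; execution continues after a failure.
  JSONTEST_ASSERT(2 + 2 == 4) << "arithmetic is broken";
  JSONTEST_ASSERT_EQUAL(4, 2 + 2);
  JSONTEST_ASSERT_STRING_EQUAL("abc", std::string("ab") + "c");
}

int main(int argc, const char* argv[]) {
  JsonTest::Runner runner;
  JSONTEST_REGISTER_FIXTURE(runner, ExampleFixture, equality);
  // Honours --list-tests, --test TESTNAME and --test-auto as documented by
  // Runner::printUsage() above; with no arguments it runs every test.
  return runner.runCommandLine(argc, argv);
}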

File diff suppressed because it is too large.

View File

@@ -4,7 +4,7 @@ import os
paths = []
for pattern in [ '*.actual', '*.actual-rewrite', '*.rewrite', '*.process-output' ]:
paths += glob.glob( 'data/' + pattern )
paths += glob.glob('data/' + pattern)
for path in paths:
os.unlink( path )
os.unlink(path)

View File

@@ -0,0 +1 @@
[ 1 2 3]

File diff suppressed because it is too large.

File diff suppressed because one or more lines are too long

View File

@@ -1,2 +1,3 @@
// C++ style comment
.=null

View File

@@ -1,2 +1,4 @@
/* C style comment
*/
.=null

View File

@@ -0,0 +1,4 @@
// Comment for array
.=[]
// Comment within array
.[0]="one-element"

View File

@@ -0,0 +1,5 @@
// Comment for array
[
// Comment within array
"one-element"
]

View File

@@ -1,8 +1,10 @@
.={}
.test=[]
.test[0]={}
.test[0].a="aaa"
.test[1]={}
.test[1].b="bbb"
.test[2]={}
.test[2].c="ccc"
.={}
// Comment for array
.test=[]
// Comment within array
.test[0]={}
.test[0].a="aaa"
.test[1]={}
.test[1].b="bbb"
.test[2]={}
.test[2].c="ccc"

View File

@@ -1,8 +1,10 @@
{
"test":
[
{ "a" : "aaa" }, // Comment for a
{ "b" : "bbb" }, // Comment for b
{ "c" : "ccc" } // Comment for c
]
}
{
"test":
// Comment for array
[
// Comment within array
{ "a" : "aaa" }, // Comment for a
{ "b" : "bbb" }, // Comment for b
{ "c" : "ccc" } // Comment for c
]
}

View File

@@ -0,0 +1,23 @@
.={}
/* C-style comment
C-style-2 comment */
.c-test={}
.c-test.a=1
/* Internal comment c-style */
.c-test.b=2
// C++-style comment
.cpp-test={}
// Multiline comment cpp-style
// Second line
.cpp-test.c=3
// Comment before double
.cpp-test.d=4.1
// Comment before string
.cpp-test.e="e-string"
// Comment before true
.cpp-test.f=true
// Comment before false
.cpp-test.g=false
// Comment before null
.cpp-test.h=null

View File

@@ -0,0 +1,26 @@
{
/* C-style comment
C-style-2 comment */
"c-test" : {
"a" : 1,
/* Internal comment c-style */
"b" : 2
},
// C++-style comment
"cpp-test" : {
// Multiline comment cpp-style
// Second line
"c" : 3,
// Comment before double
"d" : 4.1,
// Comment before string
"e" : "e-string",
// Comment before true
"f" : true,
// Comment before false
"g" : false,
// Comment before null
"h" : null
}
}
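These paired fixtures exercise comment round-tripping: with comment collection enabled (the Reader's default), each comment is attached to the value that follows it, which is what the flattened .expected listing above records, and the styled writer will generally re-emit them. Below is a minimal sketch of that round trip, assuming the classic Json::Reader / Json::StyledWriter API on this branch; the inline document is illustrative rather than one of the checked-in test files.

#include <json/json.h>
#include <iostream>
#include <string>

int main() {
  const std::string doc =
      "{\n"
      "  // C++-style comment\n"
      "  \"cpp-test\": { \"c\": 3 }\n"
      "}\n";

  Json::Value root;
  Json::Reader reader;  // collectComments defaults to true
  if (!reader.parse(doc, root)) {
    std::cerr << reader.getFormattedErrorMessages();
    return 1;
  }

  // StyledWriter re-emits the comment attached to "cpp-test".
  Json::StyledWriter writer;
  std::cout << writer.write(root);
  return 0;
}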

View File

@@ -1 +1,2 @@
// Max signed integer
.=2147483647

View File

@@ -1 +1,2 @@
// Min signed integer
.=-2147483648

View File

@@ -1 +1,2 @@
// Max unsigned integer
.=4294967295

View File

@@ -1,2 +1,3 @@
// Min unsigned integer
.=0

View File

@@ -0,0 +1 @@
.=9223372036854775808

View File

@@ -0,0 +1,2 @@
9223372036854775808

View File

@@ -0,0 +1 @@
.=-9223372036854775808

View File

@@ -0,0 +1,2 @@
-9223372036854775808

View File

@@ -0,0 +1 @@
.=18446744073709551615

View File

@@ -0,0 +1,2 @@
18446744073709551615

View File

@@ -1,3 +1,11 @@
/* A comment
at the beginning of the file.
*/
.={}
.first=1
/* Comment before 'second'
*/
.second=2
/* A comment at
the end of the file.
*/

View File

@@ -1,2 +1,3 @@
// 2^33 => out of integer range, switch to double
.=8589934592

View File

@@ -1,2 +1,3 @@
// -2^32 => out of signed integer range, switch to double
.=-4294967295

View File

@@ -1,2 +1,3 @@
// -2^32 => out of signed integer range, switch to double
.=-4294967295

View File

@@ -1,2 +1,3 @@
// 1.2345678
.=1.2345678

View File

@@ -1,3 +1,4 @@
// 1234567.8
.=1234567.8

View File

@@ -1,3 +1,4 @@
// -1.2345678
.=-1.2345678

View File

@@ -1,3 +1,4 @@
// -1234567.8
.=-1234567.8

View File

@@ -0,0 +1,4 @@
// Out of 32-bit integer range, switch to double in 32-bit mode. The length is
// the same as UINT_MAX in base 10, and the last digit is less than UINT_MAX's
// last digit, in order to catch a bug in the parsing code.
.=4300000001

View File

@@ -0,0 +1,4 @@
// Out of 32-bit integer range, switch to double in 32-bit mode. The length is
// the same as UINT_MAX in base 10, and the last digit is less than UINT_MAX's
// last digit, in order to catch a bug in the parsing code.
4300000001

View File

@@ -0,0 +1,4 @@
// Out of 64-bit integer range, switch to double in all modes. The length is the
// same as ULONG_MAX in base 10, and the last digit is less than ULONG_MAX's
// last digit, in order to catch a bug in the parsing code.
.=1.9e+19
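The numeric fixtures above pin down where the reader changes representation: a value that no longer fits the configured integer width is decoded as a double (2^33 in a 32-bit-integer build, and anything beyond 64-bit range in every build, as the ".=1.9e+19" expectation shows). A small hedged sketch of observing this through the API follows; the 20-digit literal is an illustrative value rather than one of the checked-in inputs, and the exact type predicates reported for 2^33 depend on whether 64-bit integer support is compiled in.

#include <json/json.h>
#include <cassert>

int main() {
  Json::Value root;
  Json::Reader reader;

  // 2^33: too large for a 32-bit int; stored as a 64-bit integer or as a
  // double depending on how the library was built, but numerically exact.
  if (!reader.parse("8589934592", root, /*collectComments=*/false))
    return 1;
  assert(root.isNumeric());
  assert(root.asDouble() == 8589934592.0);

  // Beyond 64-bit range: always decoded as a double.
  if (!reader.parse("19000000000000000001", root, false))
    return 1;
  assert(root.isDouble());
  return 0;
}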

Some files were not shown because too many files have changed in this diff.