Compare commits

799 Commits
1.3.0 ... 1.9.0

Author SHA1 Message Date
lilinchao
3c32dca892 adjust the position of some code 2019-07-02 13:39:32 -07:00
Jordan Bayles
7924d3ff97 Update version.h.in header comments
Currently, the comments in the version.h.in header file are
incorrect. This tiny patch just updates them.
2019-07-01 13:23:53 -07:00
Jordan Bayles
95b3092ce4 Fix comments on Json Reader
There have been multiple discussions of the inaccurate comments in the
Json Reader class. This patch just updates those comments.
2019-06-28 10:25:13 -07:00
Jordan Bayles
f8db40ff83 Update minimum CMake version requirement 2019-06-28 10:24:50 -07:00
Jordan Bayles
44bc38f0a1 Issue #633: Fix issue with maxInt
This patch is a minor fix to Json::OurReader to properly check against
maxLargestInt, not maxInt. Some cleanup in the decodeNumber method is
included.
2019-06-28 09:43:32 -07:00
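As an aside, a minimal sketch of the kind of bound check described above (illustrative only, not the actual patch; the helper name is hypothetical, and INT64_MAX stands in for jsoncpp's maxLargestInt):
```c++
#include <cstdint>
#include <limits>

// Hypothetical helper: a parsed 64-bit magnitude should be compared against
// the widest signed limit (maxLargestInt, i.e. INT64_MAX), not the 32-bit maxInt.
bool fitsLargestIntSketch(std::uint64_t magnitude) {
  return magnitude <=
         static_cast<std::uint64_t>(std::numeric_limits<std::int64_t>::max());
}
```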
Jordan Bayles
ddc9e0fcd7 Run clang-format on the repository
We currently don't have any checks for clang formatting as part of our
check-in process, this is an incremental patch to get things compliant.
2019-06-27 12:25:42 -07:00
Google AutoFuzz Team
879a5b80ce Add fuzz.cpp to jsoncpp_test 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
dc170e30e2 Update main.cpp 2019-06-27 11:58:42 -07:00
Google-Autofuzz
d148e28b9b added fuzz.cpp to macro in main.cpp 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
bcc0472621 Update jsontest.cpp 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
c4d1cb1cd1 Update jsontest.cpp 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
336c300ca4 Update jsontest.cpp 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
400ec89811 Update CMakeLists.txt 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
181f9eb129 Update CMakeLists.txt 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
13afd0e455 Update main.cpp 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
caa2f3bf42 Update main.cpp 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
fdcd2fc232 Update main.cpp 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
92cc77392e Added include fuzz.cpp 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
7e69f15a64 added llvm 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
8e01024ce3 fix llvm 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
6d236e1948 Update fuzz.cpp 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
d81a3caece Update fuzz.h 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
2939d85b84 Update fuzz.cpp 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
9725530a4f Update fuzz.h 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
576b271a04 Update fuzz.cpp 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
9d6db96f36 Update fuzz.h 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
46d35659ef Update fuzz.h 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
29434414d7 Update fuzz.cpp 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
3247202676 Updated fuzz.h 2019-06-27 11:58:42 -07:00
Google AutoFuzz Team
0e3b22dd3a Updated header and fixed the bug 2019-06-27 11:58:42 -07:00
Autofuzz team
786851819e Add a simple fuzz test for jsoncpp. 2019-06-27 11:58:42 -07:00
Olivier LIESS
629a727b5f version.h : wrong file was deployed, added required include path and 2019-06-26 09:05:34 -07:00
cmlchen
c51d718ead extract variable 2019-06-26 09:03:12 -07:00
cmlchen
7c7ccbf934 fix compile problem 2019-06-26 09:03:12 -07:00
cmlchen
b7feb2d493 use fpclassify to test a float number is zero or nan 2019-06-26 09:03:12 -07:00
chenguoping
5510f14a71 fix a typo 2019-06-25 15:16:16 -07:00
Jordan Bayles
d34479ec34 Issue 920: Fix android build with casting fix
This patch removes an unchecked conversion from a 64-bit-wide type
to a 32-bit-wide type, fixing a compile error on some platforms.

Issue:920
2019-06-25 15:14:53 -07:00
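A minimal sketch of an explicit, checked 64-to-32-bit narrowing of the kind described above (hypothetical helper, not the actual patch):
```c++
#include <cstdint>
#include <limits>
#include <stdexcept>

// Hypothetical helper: narrow a 64-bit value to 32 bits explicitly instead of
// relying on an implicit conversion that some toolchains reject or warn on.
std::int32_t toInt32Sketch(std::int64_t wide) {
  if (wide < std::numeric_limits<std::int32_t>::min() ||
      wide > std::numeric_limits<std::int32_t>::max())
    throw std::range_error("value does not fit in 32 bits");
  return static_cast<std::int32_t>(wide);
}
```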
Billy Donahue
dd6921f479 Add WideString test for Issue #756 2019-06-25 15:14:01 -07:00
Jordan Bayles
101d4797db Merge pull request #955 from baylesj/yaml-cleanups
Modernize Travis and Appveyor configs
2019-06-25 14:56:34 -07:00
Jordan Bayles
2690bc9a9a Update appveyor to use build images 2019-06-25 14:48:40 -07:00
Jordan Bayles
c84f2e19c9 Update travis scripts 2019-06-25 14:40:55 -07:00
Jordan Bayles
408b466b57 Modernize Travis and Appveyor configs
This PR updates the Travis and Appveyor configs to use more recent
toolchain versions, allowing for better C++11 compliance.
2019-06-25 14:27:26 -07:00
Jordan Bayles
2a3ae0e79f Merge pull request #932 from oleurodecision/cmake_clean
cmake cleanup
2019-06-25 13:59:46 -07:00
Jordan Bayles
2d211de06e Update issue templates 2019-06-24 14:40:08 -07:00
Jordan Bayles
18e51e23fd Update issue templates 2019-06-24 14:38:38 -07:00
Jordan Bayles
56c41fbd88 Merge pull request #934 from oleurodecision/cmake_config_version
added cmake config version file for proper cmake delivery
2019-06-24 14:32:55 -07:00
Jordan Bayles
4babd12a25 Merge pull request #953 from baylesj/clang-format
Run clang format
2019-06-24 14:25:16 -07:00
Jordan Bayles
6935317d84 Merge pull request #952 from baylesj/update-meson-req
Update meson build requirement
2019-06-24 14:08:10 -07:00
Jordan Bayles
d5bd1a7716 Run clang format
Clang format hasn't been run on some recent checkins. This patch updates
the repository with clang format properly run on all files.
2019-06-24 14:06:45 -07:00
Jordan Bayles
f7182a0fdc Update CONTRIBUTING.md
Added style information.
2019-06-24 14:05:18 -07:00
Jordan Bayles
be4dc51c1f Update README.md
Separate contributing guidelines into their own separate documentation.
2019-06-24 13:54:28 -07:00
Jordan Bayles
12461e5bf1 Create CONTRIBUTING.md 2019-06-24 13:53:55 -07:00
Jordan Bayles
185dfd592d Update meson build requirement
Currently, we have a build type warning due to listing a requirement for
meson build version that doesn't implement features we use in our build
file. The minimum meson build version required is actually 0.50.0, so
this PR updates our meson.build file to depend on 0.50.0.
2019-06-24 13:38:00 -07:00
Jordan Bayles
518875d2ea Update AUTHORS 2019-06-24 13:32:20 -07:00
Jordan Bayles
3e4c8f8f1d Merge pull request #935 from abigailbunyan/forward-declarations
Add missing classes to forwards.h
2019-06-24 12:50:59 -07:00
Jordan Bayles
83cc92161b Fix JSON_USE_EXCEPTION=0 use case
This patch fixes the JSON_USE_EXCEPTION flag. Currently, due to the
throwRuntimeError and throwLogicError methods implemented in json_value,
even if JSON_USE_EXCEPTION is set to 0 jsoncpp will still throw. This
breaks integration into projects with -fno-exceptions set, such as
Chromium.
2019-06-21 18:16:52 -05:00
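A minimal sketch of the behavior described, assuming a JSON_USE_EXCEPTION-style switch; this is not jsoncpp's actual implementation:
```c++
#include <cstdlib>
#include <stdexcept>
#include <string>

// Sketch only: when JSON_USE_EXCEPTION is 0 (e.g. -fno-exceptions builds),
// fall back to abort() instead of throwing.
[[noreturn]] void throwRuntimeErrorSketch(const std::string& msg) {
#if JSON_USE_EXCEPTION
  throw std::runtime_error(msg);
#else
  (void)msg;
  std::abort();
#endif
}
```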
Olivier LIESS
85f5b1c8f9 fixed typos 2019-06-03 16:31:07 +02:00
Abigail Bunyan
1234f4227b Add missing classes to forwards.h
Fixes #904.
2019-06-03 15:04:01 +01:00
Olivier LIESS
8d2095af3c cmake fixes 2019-06-03 12:45:24 +02:00
Olivier LIESS
0155f38b5b added cmake config version file for proper cmake delivery 2019-06-03 12:39:50 +02:00
David Demelier
5b91551f39 Rename version.md to version.txt 2019-04-24 23:56:30 -05:00
David Demelier
27290cf81d Use version.md in dev.makefile 2019-04-24 23:56:30 -05:00
David Demelier
27b9501683 Fix build with libc++, closes #910 2019-04-24 23:56:30 -05:00
Frank Richter
b16abf8ce1 Explicitly set JSON_API to 'default' visibility on clang & gcc 2019-04-08 18:08:25 -05:00
Christopher Dunn
cd1121290a Merge pull request #901 from res2k/demand
Implement Value::demand()
2019-03-30 09:39:32 -05:00
Frank Richter
69402d1fbb Bump minor version, SOVERSION 2019-03-23 21:03:30 +01:00
Christopher Dunn
3e2f8d3ea8 Merge pull request #902 from res2k/fix-888
Fix #888
2019-03-23 14:43:56 -05:00
Frank Richter
99a99d4032 Cast to unsigned char in Value::setType() to appease gcc (issue #888) 2019-03-23 15:04:30 +01:00
Frank Richter
9a629bc5e1 tests: Add a comment 2019-03-23 14:39:59 +01:00
Frank Richter
d76fe5687d Implement Value::demand() 2019-03-23 14:32:13 +01:00
Frank Richter
eb7bd9546e Value::find(): Fix assert message 2019-03-23 14:32:13 +01:00
Frank Richter
0adb053294 tests: Add small checks for find() 2019-03-23 14:16:13 +01:00
Willem
863aa36165 Update README.md
Update the link to the conan page, as https://conan.io/source/jsoncpp/1.8.0/theirix/ci returns a 404.
2019-03-20 00:12:33 -05:00
Billy Donahue
9a55d22d3d remove JSON_HAS_RVALUE_REFERENCES 2019-03-01 06:32:18 -06:00
Billy Donahue
00558b38db VS2013 doesn't allow move ops to be =default 2019-03-01 06:31:58 -06:00
Billy Donahue
433107f1d9 refactor comments_ into a class 2019-03-01 06:31:58 -06:00
Christopher Dunn
a732207060 Merge pull request #883 from hjmjohnson/remove-msvc2010
COMP: Remove build files for unsupported IDEs
2019-02-28 22:26:50 -06:00
Marcel Raad
36d8cfd768 Fix macro redefinition warning with clang-cl
clang-cl defines _MSC_VER by default, so JSONCPP_DEPRECATED was first
defined for MSVC and then redefined for clang. Integrate the MSVC
definition into the block with clang and GCC's JSONCPP_DEPRECATED
definitions to fix this.
2019-02-28 22:19:13 -06:00
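A sketch of the unified definition the commit describes; the macro name carries a _SKETCH suffix to make clear this is illustrative, not the library's actual header:
```c++
// Check for clang (which, as clang-cl, may also define _MSC_VER) together
// with GCC first, so the macro is defined exactly once.
#if defined(__clang__) || defined(__GNUC__)
#define JSONCPP_DEPRECATED_SKETCH(message) __attribute__((deprecated(message)))
#elif defined(_MSC_VER)
#define JSONCPP_DEPRECATED_SKETCH(message) __declspec(deprecated(message))
#else
#define JSONCPP_DEPRECATED_SKETCH(message)
#endif
```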
Hans Johnson
2ab1d63480 COMP: Remove visual studio specialization in favor of meson or cmake
More robust build environments can be generated from meson
or cmake rather than including those files in every download.
2019-01-24 09:57:56 -06:00
Hans Johnson
b4ca2db5ff COMP: Remove build files for unsupported IDEs
The msvc2010 and vs71 IDEs do not support sufficient
C++11 feature sets for jsoncpp.

Remove these build environments.

resolves: #882
2019-01-24 09:00:55 -06:00
Billy Donahue
0c1cc6e1a3 pack the {type,allocated} bitfield (#876)
* pack the {type,allocated} bitfield (Issue#873)
This allows special functions to be implemented more easily.
2019-01-20 23:59:16 -05:00
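Illustrative sketch of packing a type tag and an allocation flag into one bitfield, as the subject line describes; the field widths here are assumptions, not jsoncpp's actual layout:
```c++
// Packing both members into a single word keeps the containing class compact
// and makes special member functions easier to write.
struct PackedMetaSketch {
  unsigned int type_ : 7;       // widths are illustrative assumptions
  unsigned int allocated_ : 1;
};
static_assert(sizeof(PackedMetaSketch) <= sizeof(unsigned int),
              "bitfield stays within one word");
```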
Billy Donahue
d85d75045c Issue #872: add json/allocator.h in the amalgamated header.
I don't know why we didn't include this before.
It seems to work fine.
2019-01-20 22:13:38 -05:00
Billy Donahue
2b593a9da8 apply the C++11 style change in .clang-format 2019-01-18 07:02:16 -06:00
Billy Donahue
756a08fbbd switch .clang-format to C++11 2019-01-18 07:02:16 -06:00
Hans Johnson
2c257590a1 BUG: VERSION_LESS_EQUAL introduced in cmake 3.7
Older versions of cmake, according to documentation:
https://cmake.org/cmake/help/v3.5/command/if.html , do not know
VERSION_LESS_EQUAL, just VERSION_LESS.

This leads to errors:

CMake Error at somewhere/jsoncpp/CMakeLists.txt:18 (if):
  if given arguments:

    "3.5.1" "VERSION_LESS_EQUAL" "3.13.1"

  Unknown arguments specified

Resolves: #866
2019-01-18 07:00:39 -06:00
Hans Johnson
deb6cca214 STYLE: FATAL_ERROR ignored in cmake_required_minimum since 2.6.0 2019-01-18 07:00:39 -06:00
Billy Donahue
b9ed29a221 Jsoncpp aliases 2 (#868)
convert JSONCPP_STRING etc from macros to typedefs
2019-01-17 23:26:28 -05:00
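The shape of the change, as a simplified sketch (the real alias involves the library's allocator configuration, so this is illustrative only):
```c++
#include <string>

// Before (macro-style alias, simplified):
//   #define JSONCPP_STRING std::string
// After (typedef/using-style alias, simplified):
namespace JsonSketch {
using String = std::string;
}
```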
Billy Donahue
1c2ed7a10f convert JSONCPP_STRING etc from macros to typedefs 2019-01-17 23:14:18 -05:00
Billy Donahue
6e7cbf8f54 Merge pull request #867 from BillyDonahue/apply_clang_format
Reapply clang-format.
2019-01-17 15:30:37 -05:00
Billy Donahue
dc4a7f9b61 Reapply clang-format.
$ clang-format -i -style=file \
        $(find  . | egrep '.*\.(h|cpp|inl)$')
2019-01-17 11:11:55 -05:00
Hans Johnson
21a4185634 STYLE: Avoid unnecessary conversions from size_t to unsigned int
Make the index values consistent with size_t.
2019-01-15 18:30:49 -06:00
Hans Johnson
d11732043a STYLE: Prefer = delete to explicitly trivial implementations
This check replaces undefined special member functions with
= delete;. The explicitly deleted function declarations enable more
opportunities in optimization, because the compiler might treat
explicitly deleted functions as noops.

Additionally, the C++11 use of = delete more clearly expresses the
intent for the special member functions.

SRCDIR=/Users/johnsonhj/src/jsoncpp/ #My local SRC
BLDDIR=/Users/johnsonhj/src/jsoncpp/cmake-build-debug/ #My local BLD

cd /Users/johnsonhj/src/jsoncpp/cmake-build-debug/
run-clang-tidy.py -extra-arg=-D__clang__ -checks=-*,modernize-use-equals-delete  -header-filter=.* -fix
2019-01-15 18:30:49 -06:00
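A minimal example of what the modernize-use-equals-delete transformation produces (generic sketch, not a jsoncpp class):
```c++
// Deleted copy operations document and enforce non-copyability instead of
// leaving the members declared but never defined.
class NonCopyableSketch {
public:
  NonCopyableSketch() = default;
  NonCopyableSketch(const NonCopyableSketch&) = delete;
  NonCopyableSketch& operator=(const NonCopyableSketch&) = delete;
};
```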
Hans Johnson
e3e05c7085 STYLE: Prefer = default to explicitly trivial implementations
This check replaces default bodies of special member functions with
= default;. The explicitly defaulted function declarations enable more
opportunities in optimization, because the compiler might treat
explicitly defaulted functions as trivial.

Additionally, the C++11 use of = default more clearly expresses the
intent for the special member functions.

SRCDIR=/Users/johnsonhj/src/jsoncpp/ #My local SRC
BLDDIR=/Users/johnsonhj/src/jsoncpp/cmake-build-debug/ #My local BLD

cd /Users/johnsonhj/src/jsoncpp/cmake-build-debug/
run-clang-tidy.py -extra-arg=-D__clang__ -checks=-*,modernize-use-equals-default  -header-filter=.* -fix
2019-01-15 18:30:49 -06:00
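A minimal example of the modernize-use-equals-default transformation (generic sketch, not a jsoncpp class):
```c++
// An empty user-written destructor body becomes = default, letting the
// compiler treat the destructor as trivial where possible.
class DefaultedSketch {
public:
  ~DefaultedSketch() = default;  // instead of: ~DefaultedSketch() {}
};
```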
Hans Johnson
e817e4fc25 STYLE: Use default member initialization
Converts a default constructor’s member initializers into the new
default member initializers in C++11. Other member initializers that match the
default member initializer are removed. This can reduce repeated code or allow
use of ‘= default’.

SRCDIR=/Users/johnsonhj/src/jsoncpp/ #My local SRC
BLDDIR=/Users/johnsonhj/src/jsoncpp/cmake-build-debug/ #My local BLD

cd /Users/johnsonhj/src/jsoncpp/cmake-build-debug/
run-clang-tidy.py -extra-arg=-D__clang__ -checks=-*,modernize-use-default-member-init  -header-filter=.* -fix
2019-01-15 18:30:49 -06:00
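A minimal example of the modernize-use-default-member-init transformation (generic sketch):
```c++
// The constructor's member-initializer list moves into default member
// initializers, so the explicitly written constructor can become = default.
class ConfigSketch {
public:
  ConfigSketch() = default;  // was: ConfigSketch() : count_(0), enabled_(true) {}
private:
  int count_ = 0;
  bool enabled_ = true;
};
```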
Hans Johnson
b5093e8122 PERF: Allow compiler to choose best way to construct a copy
With move semantics added to the language and the standard library updated with
move constructors added for many types it is now interesting to take an
argument directly by value, instead of by const-reference, and then copy. This
check allows the compiler to take care of choosing the best way to construct
the copy.

The transformation is usually beneficial when the calling code passes an rvalue
and assumes the move construction is a cheap operation. This short example
illustrates how the construction of the value happens:

SRCDIR=/Users/johnsonhj/src/jsoncpp/ #My local SRC
BLDDIR=/Users/johnsonhj/src/jsoncpp/cmake-build-debug/ #My local BLD

cd /Users/johnsonhj/src/jsoncpp/cmake-build-debug/
run-clang-tidy.py -extra-arg=-D__clang__ -checks=-*,modernize-pass-by-value  -header-filter=.* -fix
2019-01-15 18:30:49 -06:00
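A minimal example of the modernize-pass-by-value pattern (generic sketch):
```c++
#include <string>
#include <utility>

// Taking the argument by value and moving it lets callers that pass an
// rvalue pay for a move instead of a copy.
class NameHolderSketch {
public:
  explicit NameHolderSketch(std::string name) : name_(std::move(name)) {}
private:
  std::string name_;
};
```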
Hans Johnson
1fc3de7ca1 STYLE: Use auto when the variable type matches the type of the initializer expression
This check is responsible for using the auto type specifier for variable
declarations to improve code readability and maintainability.

The auto type specifier will only be introduced in situations where the
variable type matches the type of the initializer expression. In other words,
auto should deduce the same type that was originally spelled in the source.

SRCDIR=/Users/johnsonhj/src/jsoncpp/ #My local SRC
BLDDIR=/Users/johnsonhj/src/jsoncpp/cmake-build-debug/ #My local BLD

cd /Users/johnsonhj/src/jsoncpp/cmake-build-debug/
run-clang-tidy.py -extra-arg=-D__clang__ -checks=-*,modernize-use-auto -header-filter=.* -fix
2019-01-15 18:30:49 -06:00
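A minimal example of the modernize-use-auto transformation (generic sketch):
```c++
#include <map>
#include <string>

// auto deduces exactly the type the initializer already names, removing the
// repetition without changing meaning.
void iterateSketch(const std::map<std::string, int>& m) {
  // was: std::map<std::string, int>::const_iterator it = m.begin();
  auto it = m.begin();
  (void)it;
}
```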
Hans Johnson
cbeed7b076 STYLE: Use range-based loops from C++11
C++11 range-based for loops can be used as a more readable equivalent to the
traditional for loop operating over a range of values, such as all elements in
a container, in the forward direction.

Range-based loops are also more explicit about computing the
end location only once for containers.

SRCDIR=/Users/johnsonhj/src/jsoncpp/ #My local SRC
BLDDIR=/Users/johnsonhj/src/jsoncpp/cmake-build-debug/ #My local BLD

cd /Users/johnsonhj/src/jsoncpp/cmake-build-debug/
run-clang-tidy.py -extra-arg=-D__clang__ -checks=-*,modernize-loop-convert  -header-filter=.* -fix
2019-01-15 18:30:49 -06:00
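A minimal example of the modernize-loop-convert transformation (generic sketch):
```c++
#include <vector>

// The index-based loop becomes a range-based loop; the end of the container
// is computed once and the intent is clearer.
int sumSketch(const std::vector<int>& values) {
  int total = 0;
  for (int v : values)  // was: for (size_t i = 0; i < values.size(); ++i)
    total += v;
  return total;
}
```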
Hans Johnson
3beadff472 PERF: readability container size empty
The emptiness of a container should be checked using the empty() method
instead of the size() method. It is not guaranteed that size() is a
constant-time function, and it is generally more efficient and also
shows clearer intent to use empty(). Furthermore some containers may
implement the empty() method but not implement the size() method. Using
empty() whenever possible makes it easier to switch to another container
in the future.

SRCDIR=/Users/johnsonhj/src/jsoncpp/ #My local SRC
BLDDIR=/Users/johnsonhj/src/jsoncpp/cmake-build-debug/ #My local BLD

cd /Users/johnsonhj/src/jsoncpp/cmake-build-debug/
run-clang-tidy.py -extra-arg=-D__clang__ -checks=-*,readability-container-size-empty  -header-filter=.* -fix
2019-01-15 18:30:49 -06:00
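A minimal example of the readability-container-size-empty transformation (generic sketch):
```c++
#include <string>

// empty() states the intent directly and is guaranteed O(1) for every
// standard container, unlike size().
bool hasTextSketch(const std::string& s) {
  return !s.empty();  // was: s.size() > 0
}
```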
James Clarke
d3d2e17b64 Switch to the VCPP dll 2019-01-15 09:01:19 -06:00
James Clarke
908383abeb Fix bogus asm setting 2019-01-15 09:01:19 -06:00
James Clarke
fdc0824505 Add amd64 support 2019-01-15 09:01:19 -06:00
James Clarke
4e8b4e313f Add support for VS2017 2019-01-15 09:01:19 -06:00
Hans Johnson
31d65711d6 ENH: Remove conditionals for unsupported VS compilers
Visual Studio 12 (2013) with _MSC_VER=1800 is the oldest supported
compiler with sufficient C++11 capabilities

See:
https://blogs.msdn.microsoft.com/vcblog/2013/12/02/c1114-core-language-features-in-vs-2013-and-the-nov-2013-ctp/
for details related to language features supported.
2019-01-14 16:27:52 -06:00
Hans Johnson
2853b1cdac COMP: Use C++11 override directly
Support for the C++11 override keyword is now required, so avoid aliasing
this feature.  Compilers that do not support the override keyword
are no longer supported.
2019-01-14 16:14:12 -06:00
terrycz126
8b31c6f0fd Fix redefined(_CRT_SECURE_CPP_OVERLOAD_STANDARD_NAMES) warning 2019-01-14 16:13:21 -06:00
Hans Johnson
a3c8e86c0b ENH: Refactor and enhance the CI testing infrastructure
1) Improve travis build script for use outside travis.
   Allow the script used for CI builds to also be used
   locally, in a similar manner to the CI use of the scripts

2) Add ctest compatible testing and CDASH support
   Report testing and building results to
   https://my.cdash.org/index.php?project=jsoncpp

   NOTE: The new ctest infrastructure is not yet robust on Windows.
         Do not yet enable the new features for running tests with ctest
         on the Windows platform.  The previous behaviors are maintained,
         but enhanced test reporting from Windows is not yet supported.

3) Add a cmake coverage testing option
   Ensure that cmake builds on linux are tested.
   Ensure that code coverage is reported.

4) Move conditional environment checking into the matrix
   Avoid multiple places where conditional logic is used to
   change compiler behavior.  As more test environments are
    created from the travis.yml matrix, all settings should be
   obvious from that one location.

5) Tests with known regressions from the jsonchecker are suppressed
    Tests that are known to pass with jsoncpp's more lenient
    syntax enforcement are excluded from tests in test/runjsontests.py
2019-01-14 16:12:43 -06:00
Hans Johnson
10a1a38b37 ENH: move travis support scripts to .travis_scripts
Move the build support scripts for travis to a hidden
subdirectory to keep the top level directory more
clean.
2019-01-12 10:35:25 -06:00
Hans Johnson
fa61a49b83 ENH: Use recommended homebrew addon for travis.
Remove unnecessary python3 environment from osx that
made configuring the test environment slower.

Use "addon:" features for installing homebrew
packages more efficiently.

Re-organized the .travis.yml file in a standard
order so that the "before-install" and "install"
steps occur after the configurations of addons and
 matrix, and language features are set.
2019-01-12 10:35:25 -06:00
Hans Johnson
f8ad1ab352 BUG: Fix bug in CI where failure during homebrew update occurred. 2019-01-12 10:35:25 -06:00
Brad King
056850c44b reader: fix signed overflow when parsing negative value
Clang's ubsan (-fsanitize=undefined) reports:

    runtime error: negation of -9223372036854775808 cannot be represented in
    type 'Json::Value::LargestInt' (aka 'long'); cast to an unsigned type to
    negate this value to itself

Follow its advice and update the code to remove the explicit negation.
2019-01-11 14:04:41 -06:00
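A sketch of the advice quoted above, computing a magnitude through an unsigned intermediate so that negating INT64_MIN never overflows (hypothetical helper, not the actual reader code):
```c++
#include <cstdint>

// Unsigned arithmetic wraps, so the magnitude of INT64_MIN is representable
// and no signed negation ever happens.
std::uint64_t magnitudeSketch(std::int64_t value) {
  const std::uint64_t u = static_cast<std::uint64_t>(value);
  return value < 0 ? 0 - u : u;
}
```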
Stefano Fiorentino
009a3ad24c issue_836: Check if `removed' is a valid pointer before copying data to it
functions involved:
	- bool Value::removeMember(const char* key, const char* cend, Value* removed)

Signed-off-by: Stefano Fiorentino <stefano.fiore84@gmail.com>
2019-01-11 11:27:59 -06:00
fangguo
7d16e10113 fixed “Cannot take the address of a bit field.”
```c++
 #include <iostream>

class TestBool
{
public:
    TestBool():addChildValues_(){}
    TestBool(int):addChildValues_(false){}
    bool addChildValues_ : 1;
    bool indented_ : 1;
};


int main()
{
    std::cout << "\n TestBool () addChildValues_ = " << TestBool().addChildValues_;
    std::cout << "\n TestBool false addChildValues_ = " << TestBool(3).addChildValues_;
    return 0;
}
```

```text
root@osssvr-1 # /opt/SUNWspro/prod/bin/CC  testbool.cpp -o testbool   
Error: Cannot take the address of a bit field.
1 Error(s) detected.
```
2018-12-31 07:39:04 +07:00
Orivej Desh
d8723104f3 Update removeMember docs after #693 2018-12-31 07:37:07 +07:00
Mathias L. Baumann
08ddeed927 JsonValue documentation: Rephrase confusing sentence 2018-12-30 15:33:43 -06:00
Hans Johnson
97e05c41f2 COMP: Provide C++11 feature testing during config
Test compiler feature sets early so that required features
are validated before long compilation process is started.
2018-12-30 15:32:57 -06:00
Hans Johnson
b3b92df879 ENH: Provide range for non-warned cmake versions
Allow configuring without cmake policy developer warnings
for a range of cmake versions.

This prevents the need to explicitly enumerate every new
policy for each new cmake version.

===

Moved setting of the CMAKE_CXX_STANDARD to before the project()
directive.
2018-12-30 15:32:57 -06:00
Hans Johnson
892a386018 ENH: Use cmake builtin versioning capabilities
The project directive in cmake 3.1 has a builtin
mechanism for providing consistent versioning
in a package.
2018-12-30 15:32:57 -06:00
Hans Johnson
0417e626c0 STYLE: Convert CMake-language commands to lower case
Ancient CMake versions required upper-case commands.  Later command names
became case-insensitive.  Now the preferred style is lower-case.
2018-12-30 15:32:57 -06:00
Hans Johnson
2cb1ad5d0c STYLE: Replace integer literals which are cast to bool.
Finds and replaces integer literals which are cast to bool.

SRCDIR=/Users/johnsonhj/src/jsoncpp #My local SRC
BLDDIR=/Users/johnsonhj/src/jsoncpp/cmake-build-debug/ #My local BLD

cd /Users/johnsonhj/src/jsoncpp/cmake-build-debug/
run-clang-tidy.py -extra-arg=-D__clang__ -checks=-*,modernize-use-bool-literals  -header-filter=.* -fix
2018-12-30 15:31:12 -06:00
Kostiantyn Ponomarenko
4bfa962967 Add Meson related info to README
Add information about how one can get a Meson wrap file.

Signed-off-by: Kostiantyn Ponomarenko <konstantin.ponomarenko@gmail.com>
2018-12-30 15:30:35 -06:00
Hans Johnson
e50bfefef1 COMP: Prefer the C++ headers over the C99 headers
Using the C++11 headers keeps the library cleaner and more
rigorously scoped use of namespaces.
2018-12-30 15:29:22 -06:00
Hans Johnson
5c8e539af4 ENH: MSVS 2013 snprintf compatible substitute
Simplify the backwards compatible snprintf configuration for pre
1900 version of MSVC.  Otherwise prefer C++11 syntax using std::snprintf.
2018-12-30 15:29:22 -06:00
Radoslav Atanasov
ccd077ffce Fix MSVC 15.9 (2017) warning C4866
by changing operator[] param type from JSONCPP_STRING to const JSONCPP_STRING& for CharReaderBuilder and StreamWriterBuilder (as it is already in Value).

https://docs.microsoft.com/en-us/cpp/error-messages/compiler-warnings/c4866?view=vs-2017
2018-12-30 15:28:09 -06:00
Hans Johnson
4abf4ec208 PERF: Replace explicit return calls of constructor
Replaces explicit calls to the constructor in a return with a braced
initializer list. This way the return type is not needlessly duplicated in the
function definition and the return statement.

SRCDIR=/Users/johnsonhj/src/jsoncpp #My local SRC
BLDDIR=/Users/johnsonhj/src/jsoncpp/cmake-build-debug/ #My local BLD

cd /Users/johnsonhj/src/jsoncpp/cmake-build-debug/
run-clang-tidy.py -extra-arg=-D__clang__ -checks=-*,modernize-return-braced-init-list  -header-filter=.* -fix
2018-12-30 15:26:29 -06:00
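A minimal example of the modernize-return-braced-init-list transformation (generic sketch):
```c++
#include <string>
#include <utility>

// The return type is spelled only once, in the signature.
std::pair<std::string, int> makeEntrySketch() {
  return {"answer", 42};  // was: return std::pair<std::string, int>("answer", 42);
}
```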
Christopher Dunn
9026a16ff5 Merge pull request #849 from hjmjohnson/modernize-use-nullptr
COMP: Use nullptr instead of 0 or NULL
2018-12-30 15:23:22 -06:00
Hans Johnson
f64244ed3f COMP: Use nullptr instead of 0 or NULL
The check converts the usage of null pointer constants (eg. NULL, 0) to
use the new C++11 nullptr keyword.

SRCDIR=/Users/johnsonhj/src/jsoncpp #My local SRC
BLDDIR=/Users/johnsonhj/src/jsoncpp/cmake-build-debug/ #My local BLD

cd /Users/johnsonhj/src/jsoncpp/cmake-build-debug/
run-clang-tidy.py -extra-arg=-D__clang__ -checks=-*,modernize-use-nullptr  -header-filter=.* -fix
2018-12-12 13:41:06 -06:00
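A minimal example of the modernize-use-nullptr transformation (generic sketch):
```c++
// nullptr is a pointer-only constant, so it can never be mistaken for an
// integer the way 0 or NULL can.
const char* findSketch(bool found, const char* value) {
  return found ? value : nullptr;  // was: return found ? value : 0;
}
```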
Christopher Dunn
6219eae304 Merge pull request #843 from manang/master
Update json_writer.cpp
2018-12-03 20:20:55 -08:00
manang
b955e0f699 Update json_writer.cpp 2018-12-03 10:26:27 +01:00
Julien Schueller
d501fbe741 Set CMAKE_BUILD_TYPE default on win32 too 2018-12-02 18:37:11 -06:00
Julien Schueller
ec4251b728 Use CMAKE_CROSSCOMPILING_EMULATOR to run tests
Needed when cross-compiling
2018-12-02 18:37:11 -06:00
Julien Schueller
a72266d00b Remove useless BUILD_STATIC_LIBS option 2018-12-02 18:37:11 -06:00
Julien Schueller
010a2d04d3 Unique lib target name 2018-12-02 18:37:11 -06:00
Christopher Dunn
2baad4923e Merge pull request #804 from yantaozhao/master
allow nullptr when the removed array value is not needed
2018-07-14 22:11:35 -05:00
YantaoZhao
e32ee4717c allow nullptr when the removed array value is not needed 2018-07-03 21:29:18 +08:00
Christopher Dunn
80bc776bae Merge pull request #250 from cdunn2001/travis
in travis, build for osx also
2018-06-24 20:42:46 -05:00
Christopher Dunn
da498591fc In travis-ci, build for osx also
Drop gcc b/c it takes too long to install via addon.

Build only static/release, to save VMs. (No shared to debug.)
2018-06-24 20:35:49 -05:00
pavel.pimenov
745287275c "\n" -> '\n' 2018-06-24 18:51:10 -05:00
Christopher Dunn
c00a3b95c2 Merge pull request #800 from cdunn2001/patch-1
Fixes #798
Closes #799
2018-06-23 18:17:08 -05:00
Christopher Dunn
c59db80002 Try the way I build locally 2018-06-23 18:08:53 -05:00
Christopher Dunn
473afca1e3 Tell meson/ninja versions 2018-06-23 18:08:53 -05:00
Christopher Dunn
59d41de5b1 Try to avoid empty string
- g++ has a problem with ''
- clang++ does not seem to mind it.
2018-06-23 18:08:53 -05:00
Peter Spiess-Knafl
b87f6dbc8a Fix for #798
Add preprocessor definitions for MSVC dllexport/dllimport statements

(cherry picked from commit 2654b6bbbf)
2018-06-23 16:11:47 -05:00
Kamel CHAOUCHE
ee34ac1fbb Add position independent code feature to CMakeList.txt
Enable Position Independent Code for shared lib
2018-06-22 11:37:18 -05:00
Christopher Dunn
d31a5300e1 Merge pull request #788 from pavel-pimenov/fix-782
Fix #782
2018-06-22 10:29:52 -05:00
pavel.pimenov
86789e7c2f Fix #782 2018-06-05 10:17:36 +03:00
Christopher Dunn
c4103ab390 Merge pull request #784 from Nekto89/cppcheck_fix
Multiple fixes for issues found by Cppcheck
2018-06-03 13:28:53 -05:00
Marian Klymov
a5d7c714b1 Fix typo in previous fix. 2018-06-02 21:52:45 +03:00
Marian Klymov
84ca7d6f0b Apply the formatting specified in .clang-format file. 2018-06-02 20:27:31 +03:00
Marian Klymov
fc20134c92 Fix different names for parameters in declaration and definition 2018-06-02 20:15:26 +03:00
Marian Klymov
091e03979d Reduce scope of variable. 2018-06-02 19:48:10 +03:00
Marian Klymov
a7d0ffc717 Remove unused private function in TestResult class 2018-06-02 19:46:16 +03:00
Marian Klymov
48112c8b62 Make several methods static. 2018-06-02 19:43:31 +03:00
Marian Klymov
c8bb600d27 Pass string as a const reference. 2018-06-02 19:41:57 +03:00
Marian Klymov
85a263e89f Fix improper format specifier in printf
%d in format string requires 'int' but the argument type is 'unsigned int'.
2018-06-02 19:38:12 +03:00
Christopher Dunn
cfab607c0d Merge pull request #776 from BillyDonahue/apply_clang_format
Reapply clang format
2018-05-22 13:32:58 -05:00
Billy Donahue
b5e1fe89aa Apply the formatting specified in .clang-format file.
$ clang-format --version
  clang-format version 7.0.0 (tags/google/stable/2018-01-11)
  $ clang-format -i --style=file $(find . -name '*.cpp' -o -name '*.h')
2018-05-20 18:38:42 -04:00
Billy Donahue
abd39e791b json_tool missing include 2018-05-20 18:38:42 -04:00
Christopher Dunn
768e31fc68 Merge pull request #773 from BillyDonahue/precision
Improvements in writing precision and json_tool.h helpers.

resolves #772
2018-05-13 22:57:16 -05:00
Billy Donahue
aa1b383666 fix string construction 2018-05-13 18:28:05 -04:00
Billy Donahue
8bf20bdc35 Merge branch 'precision' of github.com:BillyDonahue/jsoncpp into precision 2018-05-11 14:31:51 -04:00
Billy Donahue
0ba5c435f4 Improvements in writing precision and json_tool.h helpers 2018-05-11 14:31:12 -04:00
Billy Donahue
fdcc2e4428 single-arg string ctor 2018-05-11 14:26:09 -04:00
Billy Donahue
9ebfc8d37b whitespace cleanup 2018-05-11 14:20:51 -04:00
Billy Donahue
4cec95a2e7 formatting refactor 2018-05-11 14:03:34 -04:00
fo40225
cf73619e28 refactoring cross compiler macro 2018-05-09 02:06:19 -05:00
Christopher Dunn
ded953e0a6 Merge pull request #771 from Binyang2014/master
Disable warning "C4702" when compiling json cpp using vs2013 and above

resolves #759
2018-05-09 02:04:08 -05:00
binyangl
0a62267fe4 Disable warning "C4702" when compiling json cpp using vs2013 and above 2018-05-08 20:55:30 +08:00
Christopher Dunn
2cc9b24f0d Merge pull request #768 from fo40225/fix_msvc_fpfast
Fix msvc /fp:fast test failure
2018-05-08 00:30:14 -05:00
fo40225
6e5e9be736 cross compiler isnan 2018-05-08 12:35:08 +08:00
fo40225
4050143288 fix ValueTest/integers, CharReaderAllowSpecialFloatsTest/issue209 test failure when fp:fast on msvc 2018-05-05 15:05:22 +08:00
fo40225
3f0d91f08a fix ValueTest/specialFloats test failure when fp:fast on msvc 2018-05-05 14:38:53 +08:00
Christopher Dunn
02211117f1 Merge pull request #764 from Melown/master
allow out-of-source build
2018-04-20 00:22:22 -05:00
Tomáš Malý
323450eafc allow out-of-source build 2018-04-17 16:16:54 +02:00
Christopher Dunn
af17fecd29 Merge pull request #760 from ldionne/master
[CMake] Generate CMake config files by default
2018-04-05 19:36:03 -05:00
Louis Dionne
ffc62d26f3 [CMake] Generate CMake config files by default 2018-04-05 16:37:58 -07:00
Mike R
a07fc53287 Add setting precision for json writers and also add decimal places precision type. (#752)
* Added setting precision for writers.
* Added special case for precise precision and global precision.
* Added good setting of type of precision and also added this type to BuiltStreamWriter and for its settings.
* Added some tests.
2018-03-13 15:35:31 -05:00
Christopher Dunn
af2598cdd3 Merge pull request #751 from open-source-parsers/properly_swappable
Remove std::swap<Json::Value> in favor of ADL
2018-03-06 17:14:24 -06:00
Billy Donahue
1d95628ba8 Remove std::swap<Json::Value> in favor of ADL
Comply with http://en.cppreference.com/w/cpp/concept/Swappable
Don't open namespace std.
2018-03-06 12:51:58 -05:00
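A sketch of the Swappable pattern referenced above: a free swap() in the type's own namespace, found via ADL, instead of a specialization in namespace std (generic sketch, not jsoncpp's actual code):
```c++
#include <utility>

namespace sketch {
struct Value {
  int data = 0;
  void swap(Value& other) noexcept { std::swap(data, other.data); }
};
// Free function in the same namespace; callers find it via ADL.
inline void swap(Value& a, Value& b) noexcept { a.swap(b); }
} // namespace sketch

void swapValuesSketch(sketch::Value& a, sketch::Value& b) {
  using std::swap;  // fallback for types without their own swap
  swap(a, b);       // ADL selects sketch::swap
}
```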
Christopher Dunn
3e2b8ea9cc Minor changes for static analysis (#749)
re: #747
2018-03-03 12:51:17 -06:00
Christopher Dunn
1ab310e3ed Merge pull request #748 from dueringa/feature/clarifyIndentDocumentation
Clarify documentation regarding indentation / newline
2018-03-03 09:46:36 -06:00
uvok cheetah
c7728e8658 Clarify documentation regarding indentation / newline 2018-03-03 12:45:54 +01:00
Christopher Dunn
313a0e4c34 Merge pull request #743 from tjanc/tjanc/fix-utf8-codepoint
Incorrect byte shift when interpreting 32-bit utf-8 codepoints
2018-02-14 10:33:35 -06:00
Thomas Jandecka
592d942b3b fix: byte shift when interpreting 32-bit utf-8 codepoints 2018-02-14 14:23:58 +01:00
luzpaz
5b45aa55ca Misc-typos (#741)
Found in downstream CMake repo via `codespell -q 3`
2018-02-08 19:05:50 -06:00
Christopher Dunn
07a324fb14 Merge pull request #736 from maxim-ky/master
Move the existing value to "removed" argument; removed is optional (could be nullptr)
2018-01-29 21:13:47 -06:00
Maxim Ky
1ec85c76a4 Value::removeMember arg "removed" is optional now (could be nullptr) 2018-01-29 16:59:24 +03:00
Maxim Ky
c27936e0aa Value::removeMember moves the existing value to "removed" now 2018-01-29 16:58:45 +03:00
drgler
04abe38148 Issue #731: Provide new JSONCPP_OP_EXPLICIT macro to restore VS 2012 support after recent introduction of explicit conversion function in JSON::Value. 2018-01-20 15:38:39 -06:00
Christof Krüger
edb4bdb7ec Do not deprecate whole class but only constructors of Json::Reader.
This should fix warning C4996 issued by Visual Studio in cases where
Json::Reader is not even used by client code.
2018-01-20 15:32:22 -06:00
Christopher Dunn
0ced843c97 Merge pull request #726 from okodron/fix-704
Value::copy() creates a deep copy now
2018-01-20 15:27:46 -06:00
Andrey Okoshkin
9b569c8ce3 Make Value copy constructor simpler
Helper private methods Value::dupPayload() and Value::dupMeta() are added.
Value copy constructor doesn't attempt to delete its data first.
* Value::dupPayload() duplicates a payload.
* Value::dupMeta() duplicates comments and an offset position with a limit.
2018-01-12 15:59:20 +03:00
Andrey Okoshkin
392e3a5b49 Add basic test for Value::copy() (#704) 2018-01-12 14:36:01 +03:00
Andrey Okoshkin
c69148c946 Fix Value::copyPayload() and Value::copy() (#704)
Value copy constructor shares the same code with Value::copy() and Value::copyPayload().
New Value::releasePayload() is used to free payload memory.
Fixes: #704
2018-01-12 14:33:47 +03:00
Christopher Dunn
2f227cb122 Merge pull request #718 from dbeurle/master
CZString as public when using NVCC, see issue #486
2017-12-22 23:19:07 -06:00
Darcy Beurle
798f6ba055 CZString as public when using NVCC, see issue #486 2017-12-22 22:48:20 +01:00
Christopher Dunn
72f6cc7fd0 Merge pull request #716 from cdunn2001/master
Speed up TravisCI build
2017-12-21 02:33:50 -06:00
Christopher Dunn
d3ce75c74e pyenv global 3.6
We need pip3, and TravisCI build error says:

    The `pip3` command exists in these Python versions: 3.6, 3.6.3
2017-12-21 02:05:52 -06:00
Christopher Dunn
de5fb8e022 Try to use default python on Trusty, for speed
Running `pyenv install` wastes about 3 minutes.

* https://docs.travis-ci.com/user/languages/python

    "for Trusty, this means 2.7.6 and 3.4.3"
2017-12-21 01:27:38 -06:00
Christopher Dunn
899894f0f5 -std=c++11 (#715)
We set this in the Meson build to eliminate warnings, but
c++0x should still work, at least for now.

See #695 for discussion.
2017-12-21 01:22:40 -06:00
Christopher Dunn
ddabf50f72 1.8.4; soversion=20 2017-12-20 15:07:10 -06:00
Christopher Dunn
63ab03ca28 replace code points in range(0xD800, 0xDFFF) with the replacement mark (#714)
closes #712
2017-12-20 14:43:55 -06:00
Christopher Dunn
41ff85f443 pyenv install (#713)
```
Unfortunately, since our latest image update, Python 3.5 doesn't come pre-installed anymore. Hence, you will have to install it via `pyenv` as a first step e.g.

before_install
  - pyenv install 3.5.0 && pyenv global 3.5.0

support@travis-ci.com
```
2017-12-20 14:24:51 -06:00
Wolfram Rösler
9079422ac1 Allow Json::Value to be used in a boolean context (#695)
Must bump soversion too.
2017-12-05 11:18:55 -06:00
Christopher Dunn
c39aa295e4 Merge pull request #707 from remyjette/valuetostring-sign-mismatch
Fix sign mismatch in `valueToString`
2017-12-04 20:02:47 -06:00
Remy Jette
42ca02b833 Fix sign mismatch in valueToString
`valueToString` takes an argument `unsigned int precision`, but it is used with `%d` rather than `%u` in the `snprintf` format string. Make the format string look for an unsigned value instead.
2017-12-04 17:49:36 -08:00
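A sketch of the mismatch being fixed: an unsigned argument needs %u in the format string, not %d (hypothetical helper, not the actual valueToString code):
```c++
#include <cstdio>

void formatPrecisionSketch(unsigned int precision) {
  char fmt[16];
  std::snprintf(fmt, sizeof(fmt), "%%.%ug", precision);  // %u matches unsigned int
}
```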
Josh Soref
e6a588a246 Spelling (#703) 2017-12-03 10:54:29 -06:00
Sascha Zelzer
7c979e8661 Suppress implicit-fallthrough warnings from GCC 7 (#697)
GCC 7, when compiling with -Wimplicit-fallthrough=1 or higher, issues a warning which can be suppressed using a comment that matches certain regular expressions. The comment change does just that: signal to GCC that the fall through is intentional.

Fixes #676
2017-11-16 13:13:55 -06:00
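A minimal example of the comment-based suppression described above (generic sketch):
```c++
// GCC 7 recognizes comments such as "// fall through" and suppresses
// -Wimplicit-fallthrough for the marked case label.
int classifySketch(int c) {
  int score = 0;
  switch (c) {
  case '\t':
    score += 1;
    // fall through
  case ' ':
    score += 1;
    break;
  default:
    break;
  }
  return score;
}
```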
Christopher Dunn
c469326b47 Merge pull request #699 from MarcelRaad/msvc_warnings
MSVC warning fixes in tests
2017-11-16 13:08:39 -06:00
Marcel Raad
240c85a10c MSVC warning fixes in tests
- only use "#pragma GCC" on GCC-compatible compilers
- suppress deprecation warnings also on MSVC
2017-11-10 11:00:40 +01:00
Christopher Dunn
d61cddedac rm unused func 2017-10-29 23:45:01 -05:00
Brian W. Mulligan
5a2dc7a2ad Add comment to README giving instructions on how to install to a directory other than /usr/local (#694) 2017-10-18 00:20:45 -05:00
Wolfram Rösler
a06b390187 Un-deprecate removeMember overloads, return void (#693)
* Un-deprecate removeMember overloads, return void

Sometimes we just want to remove something we don't need anymore. Having
to supply a return buffer for the removeMember function to return something
we don't care about is a nuisance. There are removeMember overloads that
don't need a return buffer but they are deprecated. This commit un-deprecates
these overloads and modifies them to return nothing (void) instead of the
object that was removed.

Further discussion: https://github.com/open-source-parsers/jsoncpp/pull/689

WARNING: Changes the return type of the formerly deprecated removeMember
overloads from Value to void. May break existing client code.

* Minor stylistic fixes

Don't explicitly return a void value from a void function. Also, convert
size_t to unsigned in the CZString ctor to avoid a compiler warning.
2017-10-18 00:19:27 -05:00
Paweł Kierski
42a161fc80 Serialize UTF-8 string with Unicode escapes (#687)
Squashed and merged.
2017-10-03 18:19:20 -07:00
Christopher Dunn
a3a4059367 Use non-deprecated removeMember()
closes #683
2017-09-30 00:46:15 -05:00
Christopher Dunn
4d587638af Merge pull request #679 from hughbe/clang-warnings
Fix unknown pragma warnings with clang
2017-09-17 02:56:21 -05:00
Christopher Dunn
75e0c39393 Merge pull request #680 from jasonszang/master
Fix meson.build to allow using jsoncpp as a subproject
2017-09-17 02:54:43 -05:00
Jason S Zang
43fd41d1fc Fix meson.build to allow using jsoncpp as a subproject 2017-09-16 11:19:30 +01:00
Hugh Bellamy
7287065b63 Fix unknown pragma warnings with clang 2017-09-16 10:01:09 +01:00
Christopher Dunn
9249878229 Merge pull request #678 from open-source-parsers/append-move
fixes #677
2017-09-15 19:15:50 -05:00
Christopher Dunn
17c14e73a9 Use move ctor in append() 2017-09-15 18:55:50 -05:00
Christopher Dunn
21e133c6fb Merge pull request #675 from wolframroesler/patch-1
closes #671
2017-09-15 01:11:24 -05:00
Wolfram Rösler
ff6b449a07 Add value_type to improve integration with boost
Without value_type, Boost.Test version 1.65.0 throws a compiler error when a Json::Value object is compared to another with BOOST_TEST. Example and further discussion are in https://github.com/open-source-parsers/jsoncpp/issues/671.
2017-09-14 09:31:36 +02:00
Christopher Dunn
f2f19b03fb Merge pull request #670 from cdunn2001/fix-travis
Fix travis
2017-09-13 23:07:21 -05:00
Christopher Dunn
026c39fa1a Try Travis support suggestion for py3
Hi Christopher,

Thank you for reaching out and sorry to hear about the troubles.

Regarding the pip3 error, it was indeed caused by our image updates. We've cleaned-up the way we set-up the Python environment and now strictly enforce Python version use using pyenv. Which means that if you want to use a different Python version than the system one (which is 2.7.6), you have to explicitly specify it. Adding a "before_install: pyenv global 3.5" step to your travis.yml should switch the system version and make pip3 work without installing any additional packages.
2017-09-13 22:36:39 -05:00
Christopher Dunn
614671d09b Merge pull request #669 from cdunn2001/avoid-redundant-depreciation-warnings
Ignoring the unrelated TravisCI build errors. Those are being addressed separately, in #670.
2017-09-11 14:00:59 -05:00
Christopher Dunn
132840aaa1 More VS warning prevention
See comment by jpo38 in SO:
* https://stackoverflow.com/questions/46151531/how-works-deprecated-warnings-and-how-to-remove-them-when-using-jsoncpp/46156833#46156833
2017-09-11 13:44:07 -05:00
Motti Lanzkron
9bb984a594 Update writer.h
fix typos
2017-09-11 13:17:21 -05:00
Christopher Dunn
66d4573206 Drop TITLE from Doxygen docs
It took up too much room at the top.

Note that we needed to remove it from 2 places, since the main
index.html seems not to use the same top-of-page as the rest uses.
2017-09-10 20:10:18 -05:00
Christopher Dunn
b29fc9834f Link classes and namespace 2017-09-10 20:04:24 -05:00
Christopher Dunn
1a54511aa1 Drop timestamp from HTML doxygen 2017-09-10 20:00:43 -05:00
Christopher Dunn
a7ad98fb82 rsync less 2017-09-10 19:55:08 -05:00
Christopher Dunn
692164d471 Update header.html 2017-09-10 19:55:08 -05:00
Christopher Dunn
c95a841fef Generated new header.html 2017-09-10 19:55:08 -05:00
Christopher Dunn
e895eccd18 Generated new footer.html 2017-09-10 19:55:07 -05:00
Christopher Dunn
3b5f8bef41 Merge pull request #667 from cdunn2001/foo
Drop stderr
2017-09-09 15:15:10 -05:00
Christopher Dunn
c89f0282d1 Do not write to stderr
fixes #665
closes #666
2017-09-09 14:49:55 -05:00
Christopher Dunn
1b68b02ccd Try adding python-3.5 in TravisCI 2017-09-09 14:48:02 -05:00
Christopher Dunn
adb9ab1424 Merge pull request #660 from SloCompTech/master
fixes #659
fixes #661
2017-09-05 02:58:50 -05:00
Martin Dagarin
49da91c786 Fixed compile bug 2017-09-04 21:10:15 +02:00
Martin Dagarin
e8378d1e74 Fixed switched parameters in install 2017-09-04 21:07:49 +02:00
Christopher Dunn
2de18021fc Merge pull request #655 from cdunn2001/fix-649
Fixes #649
Fixes #654
2017-08-28 09:11:00 -05:00
Christopher Dunn
c98e1d85e3 Bump to soversion=19, 1.8.3
Note that cmake is deprecated, but we keep it in-sync manually for now.
2017-08-28 09:04:33 -05:00
Christopher Dunn
d830c0ab94 Fix writeCommentBeforeValue() iter deref
fixes #649
2017-08-28 08:43:05 -05:00
Christopher Dunn
90591c70cd Suppress GCC deprecated-declarations warning for tests 2017-08-28 08:42:43 -05:00
Christopher Dunn
4cfae897c0 Merge pull request #652 from cdunn2001/meson-not-scons
Meson not scons; 1.8.2<-1.8.1
2017-08-27 15:30:15 -05:00
Christopher Dunn
f4ec601fd3 Drop NEWS.txt
Older news can be found at
* https://github.com/open-source-parsers/jsoncpp/wiki/News
2017-08-27 15:23:55 -05:00
Christopher Dunn
d40f26d472 Move amalgamated source details to wiki 2017-08-27 15:16:43 -05:00
Christopher Dunn
c668af9d41 Update README
* Document meson/ninja.
* Deprecate cmake.
* Drop scons.
2017-08-27 15:11:40 -05:00
Christopher Dunn
13b5ed7287 1.8.2 <- 1.8.1
Soon, I hope to drop the cmake stuff and let meson handle
the version numbers.
2017-08-27 15:02:01 -05:00
Christopher Dunn
6d31cec7cf Drop scons support 2017-08-27 15:02:01 -05:00
Christopher Dunn
5331d295aa Merge branch 'fix-578' 2017-08-27 14:17:31 -05:00
Christopher Dunn
004270db37 Avoid memory error
But simply use `.assign()` instead of the extra copy. (See comment from
@BillyDonahue at #580.)

fixes #578
closes #580
2017-08-27 14:16:01 -05:00
Gaurav
9006194139 Fix uninitialized value detected by valgrind
Fix issue reported in https://github.com/open-source-parsers/jsoncpp/issues/578
For std::string variable, length() is more readable than size().
2017-08-27 14:16:01 -05:00
Christopher Dunn
6062f9b848 Merge pull request #641 from maksdamir/master
Fixing warnings. Added JSONCPP_DEPRECATED definition for clang. Also …
2017-08-05 15:45:01 -05:00
damiram
ef16a35328 Fixing warnings. Added JSONCPP_DEPRECATED definition for clang. Also updating .gitignore to ignore .DS_Store files (Mac OS Finder generated) 2017-08-02 22:44:42 -07:00
Christopher Dunn
7354da8077 Merge pull request #640 from cfyzium/master
Fix non-rvalue Json::Value assignment operator (should copy, not move)
2017-08-01 01:22:41 -05:00
Александр Малинин
6a15ca6442 Fix non-rvalue Json::Value assignment operator (should copy, not move) 2017-07-31 15:29:02 +03:00
Christopher Dunn
9a048e5766 Merge pull request #637 from ssbr/fix-owners
Restore BL's authorship attribution, and add "The Jsoncpp Authors" where it was missing
2017-07-30 21:43:25 -05:00
Devin Jeanpierre
59e4d35339 Restore BL's authorship attribution, and add "The Jsoncpp Authors" where it was missing.
Requested/noticed in https://github.com/open-source-parsers/jsoncpp/pull/610, and a
followup to https://github.com/open-source-parsers/jsoncpp/pull/607 .
2017-07-21 03:44:36 -07:00
Christopher Dunn
f26edb05e5 Merge pull request #630 from jschueller/appveyor
Fix shared/static lib build conflict

resolves #631
2017-07-16 17:18:24 -05:00
Billy Donahue
cadb6dd9a6 Merge pull request #636 from pavel-pimenov/fix-strstr
strstr -> strchr
2017-07-13 11:23:04 -04:00
pavel.pimenov
ea9f0cec30 strstr -> strchr
https://www.viva64.com/en/w/V817/print/
2017-07-13 14:21:53 +03:00
Julien Schueller
ffdcc9355d Avoid import/static libs name clash 2017-07-13 09:03:35 +02:00
Julien Schueller
f45c01a46e Enable shared libs on appveyor 2017-07-12 17:36:23 +02:00
Julien Schueller
3c2069fdd1 Cleanup appveyor script 2017-07-12 17:35:22 +02:00
Christopher Dunn
414b179d86 Merge pull request #635 from Dark-Passenger/master
Add move assignment operator for Json::Value class and overload append member function for RValue references

resolves #621
2017-07-11 16:08:36 -05:00
Dhruv Paranjape
0ba8bd73f5 add move assignment operator for CZString and change copy assignment to const reference. 2017-07-08 17:47:13 +05:30
Dhruv Paranjape
23c44d9f9e overload append function for R value references. 2017-07-08 17:30:47 +05:30
Dhruv Paranjape
8996c377aa add move assignment operator for Json::Value class. 2017-07-08 17:27:07 +05:30
Christopher Dunn
a679dde58d 1.8.1 2017-06-25 22:01:22 -07:00
Christopher Dunn
c21b4bbfdb Merge pull request #625 from SoapGentoo/mesonise
Add initial Meson build file
2017-06-25 21:51:15 -07:00
David Seifert
d14d8c35c3 Update Travis configuration 2017-06-26 06:12:05 +02:00
David Seifert
ed258de63d Add initial Meson build file 2017-06-26 06:12:05 +02:00
Christopher Dunn
154652ee7a Merge pull request #623 from bernhardHartleb/master
Fix #567 in writing real values in different locales
2017-06-24 10:34:14 -07:00
Bernhard Hartleb
4a9d77bcf7 Fix issue #567 in writing real values in different locales
The output of snprintf might produce ',' separators for decimal places if
certain locales are set. This commit moves the conversion from ',' to '.'
to the correct place. Otherwise an additional ".0" might be appended.
2017-06-22 22:46:16 +02:00
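A sketch of the normalization step being moved (hypothetical helper, not the actual writer code):
```c++
#include <algorithm>
#include <string>

// If the current locale made snprintf emit ',' as the decimal separator,
// rewrite it to '.' before any ".0" suffix logic runs on the buffer.
void normalizeDecimalPointSketch(std::string& buffer) {
  std::replace(buffer.begin(), buffer.end(), ',', '.');
}
```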
Christopher Dunn
56efb6ba83 Merge pull request #622 from sylvestre/master
Allocate the proper memory for formatString. Fix a warning with gcc 7.1
2017-06-12 19:44:58 -05:00
Sylvestre Ledru
7f9cc2705c Allocate the proper memory for formatString. Fix a warning with gcc 7.1
/root/firefox-gcc-last/toolkit/components/jsoncpp/src/lib_json/json_writer.cpp:139:16: note: using the range [-2147483648, 2147483647] for directive argument
/root/firefox-gcc-last/toolkit/components/jsoncpp/src/lib_json/json_writer.cpp:146:10: note: 'sprintf' output between 5 and 15 bytes into a destination of size 6
   sprintf(formatString, "%%.%dg", precision);
   ~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
2017-06-09 22:41:48 +02:00
Christopher Dunn
d7347a2623 Merge pull request #609 from antonindrawan/QNX_Fix
Fix QNX build: QNX defines sprintf under the std namespace.
2017-05-01 21:52:55 -05:00
Anton Indrawan
2e319850d1 Fix QNX build: QNX defines sprintf under the std namespace. Use snprintf instead 2017-05-01 23:14:23 +02:00
Christopher Dunn
a3d35d7fb8 Merge pull request #607 from ssbr/master
Refactor authorship information for more technical accuracy.
2017-04-25 00:51:37 -05:00
Devin Jeanpierre
19fc55f408 Refactor authorship information for more technical accuracy.
Google advises its employees to add Google Inc. as an author, but that hasn't
been done yet and would be super inconvenient. So instead I've refactored the
file to refer to "The JsonCpp Authors", which are listed in the AUTHORS file.

The AUTHORS file itself is generated via:

    git log --pretty="%an <%ae>%n%cn <%ce>" | sort | uniq

Plus the addition of "Google Inc." as a copyright author. (Google owns the work
of anyone contributing from an @google.com address, for example.)

The list contains some probable duplicates where people have used more than one
email address. I didn't deduplicate because -- well, who's to say they're
duplicates, anyway? :)
2017-04-24 11:01:12 -07:00
Christopher Dunn
acf74290f1 Merge pull request #601 from paulobrizolara/master
Including instructions in how to use jsonCpp with conan
2017-04-09 21:47:09 -05:00
paulo
746ef154f1 Including instructions in how to use jsonCpp with conan
Also added the badge to the conan package.

Related to issue #564
2017-04-09 14:14:38 -03:00
Christopher Dunn
559b4416e6 Merge pull request #599 from pavel-pimenov/fix-v815
Fix V815:Decreased performance
2017-04-08 00:49:25 -05:00
pavel.pimenov
6ca374371e Fix V815:Decreased performance 2017-04-07 15:41:07 +03:00
Christopher Dunn
f7df408a6a Merge pull request #593 from AlB80/master
Optimize Value::isIntegral() method
2017-04-05 20:09:53 -05:00
Christopher Dunn
86ed860c4b Merge pull request #589 from ya1gaurav/patch-42
Fix warning issue with gcc flags.

closes #586
2017-04-05 19:50:21 -05:00
Alexander V. Brezgin
c442fd96e6 Optimize Value::isIntegral() method
Worst case called modf() twice
2017-03-29 06:37:37 +05:00
Gaurav
c68443f3a0 Fix CMake build issue
Fix CMake build.
2017-03-10 10:33:03 +05:30
Gaurav
11c48d0047 Fix warning issue with gcc flags.
PR for - https://github.com/open-source-parsers/jsoncpp/issues/586
Separating the default options for compiler flags.
2017-03-10 10:22:33 +05:30
Christopher Dunn
264c3edca7 Merge pull request #573 from ya1gaurav/patch-39
Fix crash issue due to NULL value.
2017-03-09 16:06:37 -06:00
Christopher Dunn
a47fc398ef Merge pull request #571 from ibc/master
README: Give some love
2017-03-09 15:58:32 -06:00
David Seifert
2f178f390f Use full CMake paths in pkg-config template
Using full paths is more versatile. The current solution
breaks when specifying an absolute path for CMAKE_INSTALL_INCLUDEDIR
which is an otherwise supported option by CMake's GNUInstallDirs.
CMake does not support Autoconf-style ${prefix}-pseudo variables,
hence trying to emulate the behaviour gains us nothing and breaks
providing absolute paths to CMAKE_INSTALL_LIBDIR.
2017-03-09 07:13:45 -06:00
Gaurav
f251f15e6a Fix crash issue due to NULL value.
A null value in the Value constructor will crash strlen(). Avoid the crash with JSON_ASSERT_MESSAGE
2017-01-17 17:28:43 +05:30
Iñaki Baz Castillo
60bfcf1715 README: Give some love. 2017-01-12 11:24:29 +01:00
Christopher Dunn
81065748e3 Merge pull request #566 from open-source-parsers/update
std::min<unsigned>, for VS2015

fixes #565
2016-12-21 12:56:14 -06:00
Christopher Dunn
11836ae9aa std::min<unsigned>, for VS2015
fixes #565
2016-12-21 11:09:57 -06:00
Christopher Dunn
e25fb5384a Path for pkg-config
See #497, bottom comment.
2016-12-19 11:42:51 -06:00
Christopher Dunn
f700fe4559 Require cmake>=3.1
Plus some other build-related changes. I don't think there is anything
functionally different from 1.7.7, or even any binary incompatibilities, but
the cmake change is significant.
2016-12-14 13:39:05 -06:00
Christopher Dunn
d167a09b1c Merge pull request #562 from SoapGentoo/cmake-fixes
Replace current install variables with GNUInstallDirs
2016-12-14 13:30:53 -06:00
David Seifert
ba158fd22d Update Travis requirements for modern CMake 2016-12-14 17:53:10 +01:00
David Seifert
f3a4941590 Replace current install variables with GNUInstallDirs
* The GNUInstallDirs module is more idiomatic and supported by
  Kitware upstream, whereas the current directories are not
  standardised across CMake-using packages. Using CMake native
  mechanisms is better than reinventing the wheel, as it makes
  using the build system more uniform across the ecosystem
* Use CMAKE_CXX_STANDARD to force C++11
* Require CMake 3.1.0 at a minimum
* Fixed lower/UPPERcase format for function/macro calls
* Fixed indents by replacing tabs with 4 spaces
2016-12-14 17:53:10 +01:00
Christopher Dunn
0d25d9aebf Merge pull request #556 from Infotecs/nnkur-rec-fix
Removed a static variable used to contain the current recursion depth in json_reader.cpp
2016-12-09 10:47:17 -06:00
nnkur
5021e799dc Renamed JSONCPP_STACK_LIMIT to JSONCPP_DEPRECATED_STACK_LIMIT
Renamed JSONCPP_STACK_LIMIT to JSONCPP_DEPRECATED_STACK_LIMIT to stress that usage of this macro assumes the old interface.
2016-12-07 15:47:08 +03:00
Christopher Dunn
762ad0fe9d Merge pull request #557 from sergiy80/master
Add pragma pack directive

resolves #458
2016-12-05 00:14:08 -06:00
Sergiy80
d6e666f573 Add pragma pack directive
Related to https://github.com/open-source-parsers/jsoncpp/issues/458
2016-12-03 22:29:14 +02:00
nnkur
2ecd2a59de Add files via upload
Removed a static variable used to contain the current recursion depth of Reader::readValue().  The number of elements in an internal container Reader::nodes_  is used instead.  It is correct because any recursive call of Reader::readValue() is executed with adjacent nodes_.push()  and nodes_.pop() calls.  
Added the option to change the allowed recursion depth at compile time by defining a macro JSONCPP_STACK_LIMIT as the required integer value.
2016-11-30 18:30:12 +03:00
Christopher Dunn
a691cb19de Merge pull request #553 from AlB80/master
Clarify code for value type return
2016-11-20 18:53:23 -06:00
Alexander V. Brezgin
ee7935986e Optimize value check 2016-11-20 03:55:08 +03:00
Alexander V. Brezgin
b4abc8241f Optimize value range check 2016-11-20 03:50:32 +03:00
Alexander V. Brezgin
12e9ef32f9 Remove repeated condition
isDouble() contains isIntegral()
2016-11-20 03:28:15 +03:00
Christopher Dunn
77632b2611 Merge pull request #549 from jia3ep/master
Added stack overflow test
2016-11-09 14:28:13 -06:00
Kirill V. Lyadvinsky
89aa87bd24 Use clang-3.5 since the Travis version has a conflict with gcc (check this issue https://bugs.debian.org/cgi-bin/bugreport.cgi?msg=11;bug=744872) 2016-11-09 12:05:22 +03:00
Christopher Dunn
34fc0020c0 Merge pull request #552 from omki2005/noexcept
change throw() to noexcept to conform to c++11
2016-11-08 07:21:56 -06:00
Christopher Dunn
f880a9432d Merge pull request #551 from suttungdigital/detect_locale_support
Check for locale support in CMake
2016-11-08 07:19:51 -06:00
Magnus Bjerke Vik
5a82131033 Rename NO_LOCALE_SUPPORT to JSONCPP_NO_LOCALE_SUPPORT 2016-11-08 09:47:27 +01:00
Magnus Bjerke Vik
1839f2da34 Check for locale support in CMake 2016-11-08 09:47:27 +01:00
Omkar Wagh
91c1d23461 change throw() to noexcept to conform to c++11 2016-11-07 17:39:38 -05:00
Kirill V. Lyadvinsky
86f085b810 Make it a bit more multithreading friendly 2016-11-03 22:45:36 +03:00
Kirill V. Lyadvinsky
ac372d2b00 Added stack overflow test 2016-11-02 15:33:57 +03:00
Christopher Dunn
0e24e3c64f Merge pull request #547 from BrendanDrewDaqri/master
resolves #546: Ensure floating point values on input render as floats on output
2016-10-28 01:22:38 -05:00
Brendan Drew
89ab7eca7f Ensure that the fact that a float was provided on input is preserved when writing output; update tests to reflect this fact 2016-10-27 04:49:11 -07:00
Christopher Dunn
a1db52b030 Merge pull request #543 from chfast/patch-1
Rename variable empty to emptyString
2016-10-24 11:16:24 -05:00
Paweł Bylica
1572539bec Rename variable empty to emptyString
Rename variable empty to emptyString in Value constructor to avoid shadowing of Value::empty().

GCC 4.8 produces the warning about this:
lib_json/json_value.cpp: In constructor ‘Json::Value::Value(Json::ValueType)’:
lib_json/json_value.cpp:346:27: warning: declaration of ‘empty’ shadows a member of 'this' [-Wshadow]
2016-10-14 11:59:28 +02:00
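A generic sketch of the shadowing problem the rename avoids (not the actual Value constructor):
```c++
#include <string>

class ValueSketch {
public:
  bool empty() const { return text_.empty(); }
  void reset() {
    // A local named `empty` would shadow the member function empty() and
    // trigger -Wshadow on GCC 4.8; naming it emptyString avoids that.
    static const std::string emptyString;
    text_ = emptyString;
  }
private:
  std::string text_;
};
```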
Christopher Dunn
d8cd848ede 1.7.7 2016-10-02 11:32:21 -05:00
Christopher Dunn
92259f7147 Bump SOVERSION, separate from MAJOR.MINOR.MICRO 2016-10-02 11:29:12 -05:00
yiqiju
4a431bcdac Fix typo
resolves #538
2016-10-02 10:52:13 -05:00
Christopher Dunn
7d868636de Merge pull request #536 from vriera/master
resolves #537
closes #538
2016-09-27 20:12:01 -05:00
Vicente Olivert Riera
ab0f1e234a Include stdint.h necessary for int64_t and uint64_t
Otherwise failures like these one can happen during the configure phase
of other applications that use jsoncpp, like upmpdcli for instance:

checking jsoncpp/json/json.h usability... yes
checking jsoncpp/json/json.h presence... yes
checking for jsoncpp/json/json.h... yes
configure: error: libjsoncpp not found.

And this is the actual problem that you can see in config.log:

configure:5233: checking for jsoncpp/json/json.h
configure:5233: result: yes
configure:5259: /usr/bin/mipsel-linux-g++ -o conftest conftest.cpp
-lmicrohttpd -lmpdclient -lpthread  -ljsoncpp >&5
In file included from /usr/include/jsoncpp/json/autolink.h:9:0,
                 from /usr/include/jsoncpp/json/json.h:9,
                 from conftest.cpp:26:
/usr/include/jsoncpp/json/config.h:155:9: error: 'int64_t' does not name
a type
 typedef int64_t Int64;
         ^
/usr/include/jsoncpp/json/config.h:156:9: error: 'uint64_t' does not
name a type
 typedef uint64_t UInt64;
         ^

Signed-off-by: Vicente Olivert Riera <Vincent.Riera@imgtec.com>
2016-09-26 11:24:32 +01:00
Christopher Dunn
45a560a8c0 1.7.6 <- 1.7.5 2016-09-25 19:05:56 -05:00
Christopher Dunn
4893a8f667 Merge pull request #535 from kavika13/master
Add RPATH to dynamic library build on OSX

fixes #534 

But we will revert if there are any complaints.
2016-09-25 18:58:14 -05:00
Gergely Nagy
f6d785fda8 Fix poss SEGV
for non-null terminated input.
2016-09-25 18:45:04 -05:00
Merlyn Morgan-Graham
8d54e333ff Add RPATH to dynamic library build on OSX 2016-09-22 22:06:25 -07:00
Christopher Dunn
b063cf4ada Merge pull request #529 from chrox802/chrox802-patch-1
fix a bug about Json::Path
2016-09-07 21:57:56 -05:00
Christopher Dunn
c4ab6d733f Merge pull request #528 from DrMetallius/master
Used macros to disable localeconv() calls
2016-09-07 21:52:18 -05:00
chason
2f97c0147b fix a bug about Json::Path 2016-09-07 19:56:19 +08:00
Alexander Gazarov
52cfe5ae88 Replaced the template-based solution for avoiding calls to localeconv() with a macro-based one (fixes #527) 2016-09-06 14:41:13 +03:00
Christopher Dunn
a304d61a7b 1.7.5 <- 1.7.4 2016-09-01 02:45:08 -05:00
Christopher Dunn
0a97e38ea5 Merge pull request #523 from prezi/fix-android-lconv
Workaround for missing lconv::decimal_point on android
2016-09-01 02:36:34 -05:00
Gida Pataki
894e78bff1 Workaround for missing lconv::decimal_point on android 2016-08-26 23:30:18 +02:00
Christopher Dunn
126bdc2b05 Reject extra chars if strictRoot
resolves #511
2016-08-21 20:32:16 -05:00
Christopher Dunn
094a7d8564 Fix locale for decimal points
resolves #514
2016-08-21 20:13:58 -05:00
Christopher Dunn
b9afdf190d Use int64_t for 64bit ints
resolves #509
2016-08-21 19:58:43 -05:00
Christopher Dunn
80a82ea269 Optional space after comma
resolves #513
2016-08-21 16:35:19 -05:00
Christopher Dunn
f78f685bab Remove needless if.
resolves #516
2016-08-21 16:31:14 -05:00
Christopher Dunn
7d8eddb98c Merge pull request #519 from open-source-parsers/fix-517
Avoid null for stringValue
2016-08-21 16:30:28 -05:00
Christopher Dunn
7e0571b444 Avoid null for stringValue
fixes #517
2016-08-21 16:25:29 -05:00
Christopher Dunn
b14c8c1423 Merge pull request #502 from open-source-parsers/null-object
Allow dtor for nullSingleton
2016-07-20 22:28:44 -07:00
Christopher Dunn
b299d3581f Allow dtor for nullSingleton
re #488 and #490
2016-07-20 11:31:41 -07:00
Christopher Dunn
48d2a69d47 1.7.4 <- 1.7.3 2016-07-09 13:27:28 -05:00
Christopher Dunn
772e257fc9 Merge pull request #493 from zorun/master
Fix compilation errors for downstream projects caused by incorrect pkgconfig paths
2016-07-08 17:02:30 -05:00
Baptiste Jonglez
101fcf0806 Fix compilation errors for downstream projects caused by incorrect pkgconfig paths
Recent commit 911e2b0fea ("By default use <prefix> relative paths when
installing") introduced relative install paths in CMake.  But this
interacts badly with commit e6f1cffdd3 from a year ago: now, the paths in
`pkgconfig/jsoncpp.pc` are relative, which is incorrect.

Before 911e2b0fea (1.7.2 on Archlinux), this was correct:

    $ head -4 /usr/lib/pkgconfig/jsoncpp.pc
    prefix=/usr
    exec_prefix=${prefix}
    libdir=/usr/lib
    includedir=/usr/include

After 911e2b0fea (1.7.3 on Archlinux), this is now incorrect:

    $ head -4 /usr/lib/pkgconfig/jsoncpp.pc
    prefix=/usr
    exec_prefix=${prefix}
    libdir=lib
    includedir=include

This change causes hard-to-debug compilation errors for projects that
depend on jsoncpp, for instance:

    CXXLD    libring.la
    /tmp/ring-daemon/src/ring-daemon/src/../libtool: line 7486: cd: lib: No such file or directory
    libtool:   error: cannot determine absolute directory name of 'lib'
    make[3]: *** [Makefile:679: libring.la] Error 1

This is because jsoncpp contributes `-Llib -ljsoncpp` to the LDFLAGS, via
the pkg-config machinery.  Notice the relative path in `-Llib`.

To fix this, simply revert commit e6f1cffdd3 ("Fix custom includedir &
libdir substitution in pkg-config").  The change in 911e2b0fea should have
the same effect.

See #279, #470 for references.
2016-07-08 00:46:04 +02:00
Christopher Dunn
7e4df50d17 Merge pull request #490 from open-source-parsers/null-singleton
Use a Myers Singleton for null
2016-06-27 00:12:23 -05:00
Christopher Dunn
318f30357c 1.7.3 2016-06-26 19:40:43 -05:00
Christopher Dunn
0f288aecdd Use a Myers Singleton for null
Avoid some static initialization problems.

From @marklakata
See #488
2016-06-26 19:36:40 -05:00
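A generic sketch of the Meyers-singleton pattern named above (a function-local static constructed on first use), not the library's exact code; the type name is illustrative.
```cpp
// The function-local static is initialized on first call, so its construction
// cannot race ahead of (or fall behind) other translation units' static init.
class NullSentinel {};  // stand-in for the library's null value type

const NullSentinel& nullSingleton() {
  static const NullSentinel instance;  // constructed lazily, exactly once
  return instance;
}
```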
Christopher Dunn
e0f9aab0bf Make internal func anon
fixes #489
2016-06-26 17:54:15 -05:00
Christopher Dunn
43203f1d09 Merge pull request #478 from quantumsteve/msvc_override
Use override keyword with Visual Studio.
2016-05-28 10:01:46 -05:00
Steven Hahn
55176b2bdd Use override keyword with Visual Studio. 2016-05-25 18:29:34 -04:00
Christopher Dunn
4356d9bba1 Merge pull request #474 from cdunn2001/fix-warning
Fix some warnings
2016-05-15 23:35:02 -05:00
Christopher Dunn
ea4af18317 Fix int->char conv warn
resolves #473
2016-05-15 23:13:56 -05:00
Christopher Dunn
b999616df8 fix warning 2016-05-15 23:13:47 -05:00
Christopher Dunn
660307d357 Merge pull request #470 from aboseley/relative_install_paths
By default use <prefix> relative paths when installing
2016-05-05 13:59:20 -05:00
Adam Boseley
911e2b0fea By default use <prefix> relative paths when installing
This allows the config file to keep working
when it is installed to a non-standard location
from a make DESTDIR=<location> install
2016-05-04 23:41:18 -07:00
Christopher Dunn
d4a49cf511 Merge pull request #466 from bkuhls/buildroot
CMakeLists.txt: Treat conversion warning as error only with JSONCPP_W…
2016-05-02 20:40:08 -05:00
Christopher Dunn
8bd4f943da Merge pull request #467 from jcfr/generalize-setting-of-JSONCPP_OVERRIDE
json/config.h: Generalize setting of JSONCPP_OVERRIDE to all compilers
2016-04-27 02:45:32 -05:00
Christopher Dunn
4018422a05 Merge pull request #465 from ds283/intel-compiler
Add CMAKE_CXX_FLAGS for Intel compiler
2016-04-27 02:28:41 -05:00
Jean-Christophe Fillion-Robin
ba6fa48d31 json/config.h: Generalize setting of JSONCPP_OVERRIDE to all compilers
This commit has been adapted from InsightSoftwareConsortium/ITK@1c86090
2016-04-25 17:35:12 -04:00
Bernd Kuhls
17fc9b1a80 CMakeLists.txt: Treat conversion warning as error only with JSONCPP_WITH_WARNING_AS_ERROR=On
Fixes errors when building with buildroot:
http://autobuild.buildroot.net/?reason=jsoncpp-1.7.2

Signed-off-by: Bernd Kuhls <bernd.kuhls@t-online.de>
2016-04-25 19:47:13 +02:00
ds283
6f22b0e076 Add CMAKE_CXX_FLAGS for Intel compiler 2016-04-25 14:47:05 +01:00
Christopher Dunn
980cdf0fb5 Exclude allocator.h from amalgamated header
Nobody using SecureAllocator really needs the amalgamated header.

resolves #461
2016-04-22 00:45:18 -05:00
Christopher Dunn
45da594e71 Merge pull request #454 from cbeiraod/master
Small fix for strict compilers (using the flag -Werror for instance)
2016-03-26 15:07:55 -05:00
Cristóvão B da Cruz e Silva
c8a7b445ea Small fix for strict compilers (using the flag -Werror for instance) 2016-03-26 18:41:46 +00:00
Christopher Dunn
c8054483f8 1.7.2 <- 1.7.1
This only fixes a clang warning, but we have already
seen more than one report for it.
2016-03-25 15:09:15 -05:00
Christopher Dunn
96d32b37ea Merge pull request #452 from cdunn2001/fix-451
Fix a clang warning
2016-03-24 00:27:13 -05:00
Christopher Dunn
ef2ff8754a Fix a clang warning
Resolves #451.
2016-03-23 22:33:18 -05:00
Christopher Dunn
b58c844579 1.7.1 <- 1.7.0
And 1.7.0 is recalled b/c we accidentally
had SecureAlloc by default.
2016-03-21 21:17:56 -05:00
Christopher Dunn
b803b92d11 Merge pull request #449 from cdunn2001/override-keyword
Use macro for `override`
2016-03-21 21:16:00 -05:00
Christopher Dunn
98e981dff9 Use macro for override
b/c MS VS2010 is supposed to be C++11 but does not fulfill
the entire standard.

Resolves #410.
Re: #430.
2016-03-21 21:00:24 -05:00
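A sketch of the macro indirection described here; the real config header's compiler/version checks may differ from the one shown, and the Reader types are illustrative.
```cpp
// Hide the keyword behind a macro so pre-C++11 compilers still build the headers.
#if __cplusplus >= 201103L
#define JSONCPP_OVERRIDE override
#else
#define JSONCPP_OVERRIDE
#endif

struct ReaderBase {
  virtual ~ReaderBase() {}
  virtual bool parse() = 0;
};

struct MyReader : ReaderBase {
  bool parse() JSONCPP_OVERRIDE { return true; }  // expands to nothing on old compilers
};
```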
Christopher Dunn
1c47796479 JSONCPP_USING_SECURE_MEMORY default is 0
Re: #410
2016-03-21 20:44:16 -05:00
Christopher Dunn
12c67e810d Fix amalgamate.py
Fixes #448.
2016-03-21 20:33:34 -05:00
Christopher Dunn
c018d9fc3a Merge pull request #442 from open-source-parsers/JSONCPP_STRING
Secure allocator available for wiping memory.

Resolves #437.
Resolves #152.
2016-03-19 19:30:30 -05:00
dawesc
ae564653c4 -DJSONCPP_USE_SECURE_MEMORY=1 for cmake
Add allocator.h to amalgamated header
Test JSONCPP_USE_SECURE_MEMORY in Travis
2016-03-19 19:21:15 -05:00
dawesc
f8674c63b1 allocator.h 2016-03-19 14:37:30 -05:00
dawesc
b50a124fa6 .gitignore for eclipse 2016-03-19 14:37:30 -05:00
Christopher Dunn
59e327f50b Merge pull request #445 from ya1gaurav/patch-36
Added NORETURN for throw functions.
2016-03-17 20:54:26 -05:00
Gaurav
0b597b4b48 Added NORETURN for throw functions.
Fix in definition also.
2016-03-16 11:17:21 +05:30
Gaurav
d97ea5bf8d Added NORETURN for throw functions.
Resolve Issue - https://github.com/open-source-parsers/jsoncpp/issues/389
2016-03-16 11:15:09 +05:30
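A sketch of what a no-return annotation for the throw helpers looks like; the macro name and the exact compiler checks here are illustrative, not necessarily the library's.
```cpp
#include <stdexcept>
#include <string>

#if defined(__GNUC__) || defined(__clang__)
#define NORETURN_SKETCH __attribute__((noreturn))
#elif defined(_MSC_VER)
#define NORETURN_SKETCH __declspec(noreturn)
#else
#define NORETURN_SKETCH
#endif

// Marking the helper as non-returning lets callers after it omit a return
// statement without "missing return" warnings.
NORETURN_SKETCH void throwRuntimeErrorSketch(const std::string& msg) {
  throw std::runtime_error(msg);
}
```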
Christopher Dunn
e29b671ed5 Merge pull request #444 from ya1gaurav/patch-35
Supporting GCC 6.0

Resolves #411.
2016-03-15 19:23:17 -05:00
Gaurav
fbe1cf3916 Supporting GCC 6.0
Fixes test with GCC-6.0
2016-03-15 18:33:34 +05:30
Gaurav
cf86c473a5 Supporting GCC 6.0
This patch is also needed to build successfully with GCC 6.0.
Refer issue - https://github.com/open-source-parsers/jsoncpp/issues/411
2016-03-15 18:31:44 +05:30
Christopher Dawes
75570d7068 Fixing up for #define instead of typedef in secure allocators 2016-03-14 19:15:17 -05:00
Christopher Dunn
5da29e2707 Another shot at #411 2016-03-14 18:35:53 -05:00
Christopher Dunn
c80faa400c Merge pull request #438 from ya1gaurav/patch-34
MinGW support while building as DLL

Fixed #434.
2016-03-12 14:40:19 -06:00
Gaurav
8aabf93cc1 MinGW support while building as DLL
__MINGW32__ is the appropriate macro to check in order to support MinGW
2016-03-08 17:34:22 +05:30
Gaurav
3e51598176 MinGW support while building as DLL
This is PR for https://github.com/open-source-parsers/jsoncpp/issues/434
It will fix reported build error.
2016-03-08 17:17:42 +05:30
Christopher Dunn
1b5e61d008 Merge pull request #435 from cdunn2001/JSONCPP_STRING
JSONCPP_STRING etc.
2016-03-06 12:43:47 -06:00
Christopher Dunn
b84e0c159d JSONCPP_ISTREAM 2016-03-06 11:56:39 -06:00
Christopher Dunn
1e990640a9 JSONCPP_ISTRINGSTREAM 2016-03-06 11:56:39 -06:00
Christopher Dunn
38bb491400 JSONCPP_OSTRINGSTREAM 2016-03-06 11:56:38 -06:00
Christopher Dunn
724ba29bd3 JSONCPP_OSTREAM 2016-03-06 11:56:38 -06:00
Christopher Dunn
de5b792168 JSONCPP_STRING 2016-03-06 11:56:38 -06:00
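These five commits introduce indirection macros for the string and stream types. A sketch of the idea under the default configuration; the real config header defines them (possibly in terms of a custom allocator), so the exact definitions may differ.
```cpp
#include <istream>
#include <ostream>
#include <sstream>
#include <string>

// By default the macros alias the std types; a secure-allocator build can remap
// them without touching the rest of the library.
#define JSONCPP_STRING        std::string
#define JSONCPP_ISTREAM       std::istream
#define JSONCPP_OSTREAM       std::ostream
#define JSONCPP_ISTRINGSTREAM std::istringstream
#define JSONCPP_OSTRINGSTREAM std::ostringstream
```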
Christopher Dunn
1b8e3b7f4d Merge pull request #432 from ya1gaurav/patch-33
Avoid passing Null to memcmp

hopefully resolves #404
2016-03-02 14:27:03 -06:00
Gaurav
4878913143 Avoid passing Null to memcmp
As per discussion in - https://github.com/open-source-parsers/jsoncpp/issues/404
Null should not be passed to memcmp, since it may show undesired behaviour, so avoid doing that by using an assertion.
Also, changed one direct "assert" to JSON_ASSERT, which decides whether exceptions are used or not.
2016-03-01 14:13:28 +05:30
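A generic sketch of the guard being added (an illustrative helper, not the library's code): passing a null pointer to memcmp is not allowed, so assert before the call.
```cpp
#include <cassert>
#include <cstring>

// Refuse null inputs before handing the pointers to memcmp; the commit uses
// the library's JSON_ASSERT macro for the same purpose.
inline int compareKeys(const char* a, const char* b, std::size_t len) {
  assert(a != nullptr && b != nullptr);
  return std::memcmp(a, b, len);
}
```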
Christopher Dunn
d179e24c14 Merge pull request #429 from tmaciejewski/remove_c_style_casting
remove C-style casting
2016-02-29 23:20:07 -06:00
Tomasz Maciejewski
ccd70540e3 remove C-style casting 2016-02-28 12:56:04 +01:00
Christopher Dunn
b4d2b65841 Merge pull request #418 from Techwolfy/master
std::snprintf fix for Cygwin
2016-02-10 21:17:31 -06:00
Techwolf
7e46bf76e8 std::snprintf fix for Cygwin 2016-02-10 17:09:32 -08:00
Christopher Dunn
da0c50f7b2 Merge pull request #416 from cdunn2001/fix-gcc6-errors
Fix gcc6 errors
2016-02-07 11:40:50 -06:00
Christopher Dunn
02bc3d77de This *might* fix the last gcc-6 error.
See https://github.com/open-source-parsers/jsoncpp/issues/411#issuecomment-180974558

I was unable to produce a warning in Clang, so I am not certain. But based on a [SO answer](http://stackoverflow.com/questions/25480059/gcc-conversion-warning-when-assigning-to-a-bitfield), I think I've fixed the following:
```
/tmp/jsoncpp/src/lib_json/json_value.cpp: In copy constructor 'Json::Value::CZString::CZString(const Json::Value::CZString&)':
/tmp/jsoncpp/src/lib_json/json_value.cpp:235:18: error: conversion to 'unsigned char:2' from 'unsigned int' may alter its value [-Werror=conversion]
   storage_.policy_ = (other.cstr_
                      ~~~~~~~~~~~~
                  ? (static_cast<DuplicationPolicy>(other.storage_.policy_) == noDuplication
                  ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
                      ? noDuplication : duplicate)
                      ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
                  : static_cast<DuplicationPolicy>(other.storage_.policy_));
                  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
```
2016-02-07 11:28:50 -06:00
Christopher Dunn
6b562c850d Drop -Wno-sign-conversion suppression 2016-02-07 11:19:04 -06:00
Christopher Dunn
2713f4f456 Fix a sign-compare 2016-02-07 11:17:28 -06:00
Christopher Dunn
83bc9c7cf6 Errors for sign-compare, since gcc6 is stricter 2016-02-07 11:12:25 -06:00
Christopher Dunn
95f120f68e For gcc>=6 JSON_USE_INT64_DOUBLE_CONVERSION 2016-02-07 11:09:41 -06:00
Christopher Dunn
9a4b1e39bf 1.7.0 < 1.6.5 2016-02-06 10:27:39 -06:00
Christopher Dunn
2c872ec997 Merge pull request #406 from magnific0/master
std::snprintf not part of std for MinGW32 using c++11
2016-02-06 10:21:45 -06:00
Christopher Dunn
b860cc38e8 Merge pull request #414 from cdunn2001/conversion-errors
Fix conversion warnings/errors
2016-02-06 10:19:23 -06:00
Christopher Dunn
fef4b75796 More conversion fixes for gcc 2016-02-06 10:10:49 -06:00
Christopher Dunn
779d8a33fc Try to find ptrdiff_t for GNUC 4.9.2
https://travis-ci.org/open-source-parsers/jsoncpp/jobs/107452667
2016-02-06 09:49:29 -06:00
Christopher Dunn
d4513fcf45 Fix conversion warnings/errors
See #411.
  http://paste.debian.net/378673/
2016-02-06 09:25:20 -06:00
Christopher Dunn
baefec773c In case cmake is run in root-dir 2016-02-06 08:29:12 -06:00
Christopher Dunn
bc72070f43 Merge pull request #393 from ds283/cmake-binary-dir
Change ${CMAKE_BINARY_DIR} to ${CMAKE_CURRENT_BINARY_DIR}
2016-02-04 15:05:02 -06:00
Jacco
bc9b445fee std::snprintf fix for MinGW32 c++11 2016-01-25 11:39:36 +01:00
Jacco
2646ac5fa5 std::snprintf fix for MinGW32 c++11 2016-01-25 11:38:49 +01:00
ds283
2cca1cd239 Change ${CMAKE_BINARY_DIR} to ${CMAKE_CURRENT_BINARY_DIR}
- if building as a submodule of another repository, installation of pkg-config files can fail because they may not be in the top-level binary directory

- changing ${CMAKE_BINARY_DIR} to ${CMAKE_CURRENT_BINARY_DIR} allows CMake to find the files for installation
2015-12-05 12:18:12 +00:00
Christopher Dunn
9234cbbc90 Merge pull request #387 from joerg-krause/cmake-option-strict-iso
Disable -pedantic conditionally; add -pedantic-errors and -Werror conditionally
2015-10-30 01:16:49 -05:00
Jörg Krause
7c93031c9b Enable -Werror with JSONCPP_WITH_WARNING_AS_ERROR
Commit 912d55094d disabled '-Werror'. Enable it
if jsoncpp is built with JSONCPP_WITH_WARNING_AS_ERROR=ON.
2015-10-29 09:21:01 +01:00
Jörg Krause
48bfe91062 Add option JSONCPP_WITH_STRICT_ISO
'-pedantic' issues all warnings demanded by strict ISO C/C++; rejecting
extensions that do not follow ISO C/C++. Without this option, certain GNU
extensions and traditional C/C++ features are supported as well.

With this option enabled building jsoncpp fails with the musl toolchain on
x86 because of an incompatible posix_memalign declaration [1]. Without
'-pedantic' there is no error anymore and jsoncpp builds fine.

Add an option JSONCPP_WITH_STRICT_ISO to disable compilation with '-pedantic'
with GCC. If jsoncpp is built with the JSONCPP_WITH_WARNING_AS_ERROR option,
'-pedantic-errors' is used instead.

[1] https://gcc.gnu.org/ml/gcc-patches/2015-05/msg01425.html
2015-10-29 09:16:21 +01:00
Christopher Dunn
0cce773019 Merge pull request #384 from EvinceMoi/master
json_reader throwRuntimeError return error details
2015-10-27 15:48:05 -05:00
Evince
6b10ce8c0d json_reader throwRuntimeError return error details instead of hard-coded message
Signed-off-by: Evince <baneyue@gmail.com>
2015-10-28 00:22:46 +08:00
Christopher Dunn
34bdbb58b4 Merge pull request #267 from cdunn2001/moveSemantics
move ctors
2015-10-21 01:13:59 -05:00
Christopher Dunn
527965cbde Minor
adjustments, based on comments in PR.
2015-10-19 23:49:07 -05:00
Motti
2b00891a86 move ctors
* Add move constructor to Value::CZString
* Add unit test for Value move constructor
* Allow includer to specify in advance the value for
JSON_HAS_RVALUE_REFERENCES
2015-10-19 23:42:52 -05:00
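A small usage sketch of the move support added here, assuming a C++11 build where JSON_HAS_RVALUE_REFERENCES is enabled.
```cpp
#include <json/json.h>
#include <utility>

int main() {
  Json::Value big(Json::arrayValue);
  for (int i = 0; i < 1000; ++i)
    big.append(i);

  // Move construction adopts big's payload instead of deep-copying it;
  // big is left in a valid but unspecified state.
  Json::Value moved(std::move(big));
  return moved.size() == 1000u ? 0 : 1;
}
```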
Christopher Dunn
a4ce2829dc Some indentation
in anticipation of another change.
2015-10-19 23:40:47 -05:00
Christopher Dunn
69e7f1c858 Merge pull request #382 from cdunn2001/ccache
Support ccache
2015-10-19 22:50:41 -05:00
Christopher Dunn
d1a2b94d5b Support ccache
http://stackoverflow.com/a/24305849/263998
2015-10-19 12:10:58 -05:00
Christopher Dunn
772f634548 Merge pull request #381 from bknecht/precision
Resolves #284
2015-10-15 16:15:15 -05:00
Benjamin Knecht
9fd1ca8d68 Add test code for precision 2015-10-15 18:32:24 +02:00
Benjamin Knecht
38022157b2 making precision unsigned int
adding precision as settings value for StreamBuilder
2015-10-15 18:00:42 +02:00
Benjamin Knecht
039a6e3b61 Create format string with sprintf.
For now, use hardcoded precision '17'.
2015-10-15 17:28:56 +02:00
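A usage sketch of the setting these three commits add ("precision" on the writer builder, default 17); the printed text in the comment is indicative.
```cpp
#include <json/json.h>
#include <iostream>

int main() {
  Json::Value v = 3.141592653589793;
  Json::StreamWriterBuilder wbuilder;
  wbuilder["precision"] = 5;                           // new setting; default is 17
  std::cout << Json::writeString(wbuilder, v) << "\n"; // roughly "3.1416"
  return 0;
}
```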
Christopher Dunn
9c17e61bd0 Merge pull request #379 from cdunn2001/issue-377
Iterator conversions
2015-10-10 17:20:37 -05:00
ycqiu
c8a8cfcd4b fix
In value.h, ValueConstIterator can convert to ValueIterator; I think that is a bug. The correct way is for ValueIterator to convert to ValueConstIterator.
2015-10-10 17:17:20 -05:00
ycqiu
4994c77d09 add test code
does not compile
2015-10-10 16:22:14 -05:00
Christopher Dunn
beae99924f Merge pull request #373 from antonindrawan/QNX_support
Compiles jsoncpp with QNX 6.6
2015-10-04 14:54:09 -05:00
Christopher Dunn
8b9940fd24 Merge pull request #375 from Dani-Hub/issue-374-remove-defaulted-ctor
Remove defaulted default constructor
2015-10-03 16:03:27 -05:00
drgler
b96d90efbd Remove defaulted default constructor 2015-10-03 19:40:23 +02:00
Anton Indrawan
e375b8c89e Compiles jsoncpp with QNX 6.6 2015-10-03 11:48:19 +02:00
Christopher Dunn
49393ead06 Merge pull request #371 from mathstuf/more-platform-support
json_writer: improve isfinite support on *nix
2015-10-03 03:58:20 -05:00
Ben Boeckel
8df11d518b json_writer: improve isfinite support on *nix
Based on a patches to CMake by:

Ådne Hovda <ahovda@openit.com>:

    commit 7b1cdb00279908cacabada92f8a53e4986465423

    jsoncpp: Provide 'isfinite' implementation on older AIX and HP-UX

    Newer AIX and HP-UX platforms provide 'isfinite' as a <math.h> macro.
    Older versions do not, so add the definition if it is not provided.

Michael Scott <michael.scott@gbgplc.com>:

    commit 9217b678b305d7df7471ba476a81bf28961fdfa3

    jsoncpp: Provide 'isfinite' impl on more HP-UX versions (#15576)

    Some versions of HP-UX do not define 'isfinite' or 'finite' in math.h
    for Itanium when preprocessing with C++, so we have to add the
    definition ourselves instead to map to the internal version.

Joerg Sonnenberger <joerg@bec.de>:

    commit 75644dafe54c21902f14cfe58cb8338b553b69d8

    jsoncpp: Fix compilation as C99 on Solaris

    In C99 mode, Solaris variants may already define isfinite, so check for
    the existence first.
2015-10-01 13:27:19 -04:00
Christopher Dunn
8e400e9be7 Merge pull request #368 from mathstuf/export-factory-inner-class
reader: export CharReader::Factory
2015-09-28 17:06:34 -05:00
Christopher Dunn
dc5aa4ad7f Fix VS warnings
These don't really need to be const.

resolves #369
2015-09-28 17:05:57 -05:00
Ben Boeckel
80def66fa5 reader: export CharReader::Factory 2015-09-28 15:45:11 -04:00
Christopher Dunn
6992831c1c Merge pull request #367 from Dani-Hub/issue-350-unique_ptr-2
__cplusplus value should not be used to decide for std::unique_ptr

resolves #350
2015-09-28 11:27:29 -05:00
drgler
7e4875a239 __cplusplus value should not be used to decide for std::unique_ptr #350:
In addition to the C++ language version define __cplusplus, also check _CPPLIB_VER for better Dinkumware support.
2015-09-27 14:03:35 +02:00
Christopher Dunn
979cbec237 Fully init OurReader
See #363, similar to #364.
2015-09-23 09:44:58 -05:00
Christopher Dunn
2e625dd9af Merge pull request #364 from ya1gaurav/patch-28
Have default ctor for OurFeatures
2015-09-23 09:41:17 -05:00
Gaurav
83ea25e5e2 Make the OurFeatures ctor default.
Please review suggested changes.
2015-09-23 09:42:26 +05:30
Christopher Dunn
5721f1ca57 Merge pull request #366 from ya1gaurav/patch-30
parseCommandLine also throws
2015-09-22 05:09:33 -05:00
Christopher Dunn
9942f6a1c7 Merge pull request #365 from ya1gaurav/patch-29
Remove conditional check for isMultiLine
2015-09-22 05:08:56 -05:00
Gaurav
6c14548293 parseCommandLine also throws
Catching exceptions thrown by parseCommandLine (std::bad_alloc & std::length_error) also.
2015-09-22 13:53:19 +05:30
Gaurav
6ee0bff822 Remove conditional check for isMultiLine
At all 3 places, isMultiLine is checked inside the for loop:
for (int index = 0; index < size && !isMultiLine; ++index) {
This means !isMultiLine is always true inside the loop (otherwise the loop is not entered), so the || condition does not depend on isMultiLine; that check has been removed.
2015-09-22 09:48:54 +05:30
Gaurav
e3b35992f8 Add default value of stackLimit couple of places
The stackLimit default value is missing in two places. Adding them.
2015-09-21 18:05:15 +05:30
Christopher Dunn
cc5cdb565c Merge pull request #348 from cdunn2001/override-keyword
C++11: override keyword
2015-09-05 12:12:37 -05:00
Gaurav
aadd0b1b63 C++11: override keyword
Source : http://en.cppreference.com/w/cpp/language/override
2015-09-05 12:03:38 -05:00
Christopher Dunn
3ee15b7bcc Merge pull request #339 from Dani-Hub/master
Floating-point NaN or Infinity values should be allowed as a feature …
2015-09-05 11:59:54 -05:00
drgler
68509e6161 Fix number reading in the presence of Infinity: Only check for infinity if we have a leading sign character. 2015-09-05 14:49:33 +02:00
drgler
4cea1f6f6c Adjust whitespace formatting 2015-09-05 14:48:29 +02:00
Daniel Krügler
6f9ed421d0 Merge pull request #2 from BillyDonahue/billy_danihub_fix
Billy danihub fix
2015-09-05 13:38:04 +02:00
Billy Donahue
7f7bbeff76 don't need out field of TestData 2015-09-05 04:22:18 -04:00
Billy Donahue
bfffe8cec7 prettier test 2015-09-05 04:07:56 -04:00
Billy Donahue
73154fb546 expanded Infinity test 2015-09-05 03:48:38 -04:00
drgler
63c747218b Floating-point NaN or Infinity values should be allowed as a feature #209
Introduce 'allowSpecialFloats' for readers and 'useSpecialFloats' for writers, use consistent macro snprintf definition for writers and readers, provide new unit tests for #209
2015-09-03 22:50:03 +02:00
drgler
2084563efb Floating-point NaN or Infinity values should be allowed as a feature #209
Introduce 'allowSpecialFloats' for readers and 'useSpecialFloats' for writers, use consistent macro snprintf definition for writers and readers, provide new unit tests for #209
2015-09-03 22:19:22 +02:00
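A round-trip usage sketch of the two settings named in these commits ("useSpecialFloats" on the writer, "allowSpecialFloats" on the reader); the document contents are arbitrary.
```cpp
#include <json/json.h>
#include <iostream>
#include <limits>
#include <memory>
#include <string>

int main() {
  // Writer side: emit NaN/Infinity tokens instead of failing on them.
  Json::Value v;
  v["nan"] = std::numeric_limits<double>::quiet_NaN();
  v["inf"] = std::numeric_limits<double>::infinity();
  Json::StreamWriterBuilder w;
  w["useSpecialFloats"] = true;
  std::string text = Json::writeString(w, v);

  // Reader side: accept those tokens back.
  Json::CharReaderBuilder r;
  r["allowSpecialFloats"] = true;
  std::unique_ptr<Json::CharReader> reader(r.newCharReader());
  Json::Value parsed;
  std::string errs;
  bool ok = reader->parse(text.data(), text.data() + text.size(), &parsed, &errs);
  std::cout << (ok ? text : errs) << "\n";
  return ok ? 0 : 1;
}
```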
Christopher Dunn
6329975e6d Merge pull request #337 from AMDmi3/patch-1
Specify float constant as float
2015-08-21 20:45:37 -05:00
Dmitry Marakasov
7acfd599f0 Specify float constant as float
Otherwise, on some 32 bit platforms this may not fit into long and compilation will fail:

    src/test_lib_json/main.cpp:1260: error: integer constant is too large for 'long' type
2015-08-21 21:19:26 +03:00
Robert Dailey
63a961a752 Clean up cmake END* (again)
(I missed a couple. ~cd)
2015-08-14 14:47:46 -07:00
Christopher Dunn
cb2378fa41 Merge pull request #332 from cdunn2001/END
Clean up cmake END*
2015-08-14 14:40:42 -07:00
Robert Dailey
37aaaec70e Clean up cmake END*
* Clean up closing statements for if conditions, functions, macros,
  and other entities. Newer versions of CMake do not require you to
  redundantly respecify the parameters to the opening arguments.
2015-08-14 14:31:08 -07:00
Christopher Dunn
585446e6b3 Merge pull request #327 from cdunn2001/more-gitattributes
More gitattributes
2015-08-09 16:38:21 -07:00
Christopher Dunn
7f4be39e9f add .gitattributes
helps #325
2015-08-09 16:25:36 -07:00
Christopher Dunn
47595e922b normalized some windows VS stuff 2015-08-09 16:23:50 -07:00
Christopher Dunn
9f7dbcb19b Merge pull request #326 from rcdailey/git-attributes
Introduce .gitattributes file and normalize line endings
2015-08-09 16:23:20 -07:00
Robert Dailey
c1996256d6 Normalize line endings
This commit contains nothing but line ending normalization
changes. These changes were performed after the introduction
of .gitattributes into the repository.
2015-08-09 18:02:52 -05:00
Robert Dailey
25e4adc4e1 Add .gitattributes file 2015-08-09 18:02:37 -05:00
Aaron Jacobs
cc2c15c3eb Remove undefined behavior from a left shift of a negative value.
Fixed by shifting a positive value, then negating the result.

(Credit: Richard Trieu)
2015-08-03 10:58:29 +10:00
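A generic illustration of the rewrite described here, not the library's exact expression: left-shifting a negative signed value is undefined behavior, so shift the positive magnitude and negate the result.
```cpp
#include <cstdint>

// Undefined: return static_cast<int64_t>(-1) << (bits - 1);
// Defined:   shift a positive value, then negate the result.
int64_t minSignedValue(int bits) {            // valid for 1 < bits < 64
  return -(static_cast<int64_t>(1) << (bits - 1));
}
```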
Christopher Dunn
912d55094d Merge pull request #323 from joerg-krause/master
Remove -Werror
2015-08-02 15:48:38 -05:00
Jörg Krause
d7b84f69c5 Remove Werror
-Werror shouldn't be used in released code since it can cause random build
failures on moderate warnings. It also depends on the toolchain used, since
different toolchains may or may not print the same warnings.
2015-07-30 23:56:28 +02:00
Christopher Dunn
9dad198af6 Merge pull request #320 from shields/negation-overflow
Fix cases where the most negative signed integer was negated, causing undefined behavior.
2015-07-28 11:35:52 -05:00
Michael Shields
7f06e9dc28 Fix cases where the most negative signed integer was negated, causing
undefined behavior.
2015-07-27 16:35:19 -07:00
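A generic sketch of the hazard being fixed, with one common remedy (not necessarily the library's exact code): negating the most negative value of a signed type overflows, but computing the magnitude in the unsigned type is well defined.
```cpp
#include <cstdint>

// For v == INT64_MIN, -v overflows; the unsigned subtraction below does not,
// because unsigned arithmetic wraps modulo 2^64.
uint64_t magnitude(int64_t v) {
  return v < 0 ? 0u - static_cast<uint64_t>(v) : static_cast<uint64_t>(v);
}
```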
Christopher Dunn
d84702c903 1.6.5 2015-07-23 00:26:13 -05:00
Christopher Dunn
949babd7b0 Exceptions declared in header
resolves #272
2015-07-23 00:26:13 -05:00
Christopher Dunn
6ed877c77c correction for #316 2015-07-23 00:26:13 -05:00
Christopher Dunn
1c69568f8d Merge pull request #316 from filipjs/master
Update json_tool.h

typo in a comment
2015-07-17 06:44:33 -05:00
filipjs
770fdda28b Update json_tool.h
Fix a typo in comment.
2015-07-14 14:34:07 +02:00
Christopher Dunn
81cf237917 Merge pull request #314 from cdunn2001/master
-Werror

plus small bug-fix
2015-07-12 14:38:02 -05:00
Christopher Dunn
cac79543f8 1.6.4
minor bug-fix
2015-07-12 14:29:53 -05:00
Christopher Dunn
d8186f36a6 -Werror 2015-07-12 14:28:55 -05:00
Christopher Dunn
7f240623d3 fixed a bug found by -Wshadow 2015-07-12 14:28:55 -05:00
Christopher Dunn
784433ac72 fix some warnings 2015-07-12 14:28:37 -05:00
Christopher Dunn
7275e3ce3c drop -Wsign-conversion 2015-07-12 12:49:57 -05:00
Christopher Dunn
46aa9d75fa -Wconversion
* https://gcc.gnu.org/onlinedocs/gcc/Warning-Options.html
* http://programmers.stackexchange.com/questions/122608/clang-warning-flags-for-objective-c-development/124574#124574

In clang: `-Wconversion` implies `-Wshorten-64-to-32`
2015-07-12 12:39:04 -05:00
Christopher Dunn
f94a0e8989 auto-generated file with minor update 2015-07-12 12:31:43 -05:00
Christopher Dunn
e22a2f36f7 Merge pull request #313 from cdunn2001/master
`-std=c++11` for gcc builds too

There was an issue with Travis, but we seem to be past that now.

We were using only -std=c++0x for gcc, as you can see in the diff.

resolves #134
2015-07-12 12:18:28 -05:00
Christopher Dunn
fac87108a4 -std=c++11 for gcc builds too 2015-07-12 12:08:34 -05:00
Christopher Dunn
14fc9f124e Merge pull request #312 from cdunn2001/master
gcc-4.9, clang (3.0)
2015-07-12 12:04:41 -05:00
Christopher Dunn
658fa37e63 gcc-4.9, clang (3.0) 2015-07-12 11:53:49 -05:00
Christopher Dunn
056e5f9b64 Merge pull request #309 from cdunn2001/master
dockerize the Travis build, and allow C++11
2015-07-11 14:19:02 -05:00
Christopher Dunn
d8e8c14ffc valgrind in Travis 2015-07-11 14:11:45 -05:00
Christopher Dunn
f4e6fccd46 dockerize the Travis build
Docker builds are *much* faster in Travis.

Also, we prepare to enable C++11.
2015-07-11 14:06:18 -05:00
Christopher Dunn
2428889813 1.6.3 2015-07-11 13:41:13 -05:00
Christopher Dunn
89704039a0 minor doc fix, for #302 2015-07-11 12:11:00 -05:00
Christopher Dunn
6ca8ffcb91 Merge pull request #305 from cdunn2001/fix-fixeol-undefined-name-sys
Fix undefined name "sys"

Same as #299 (bca0eff), but an earlier commit needed to be rebased.
2015-07-11 11:17:09 -05:00
Mike Naberezny
b5e70f950e Fix undefined name "sys"
Same as #299 (bca0eff81a1c5ef160d9858b8e89b1c919b71c1f), but an earlier
commit needed to be rebased.
2015-07-11 11:15:43 -05:00
Christopher Dunn
b26804d1c2 Merge pull request #304 from cdunn2001/297
Same as #297 (1c4f274ab3), but properly rebased
2015-07-11 11:08:47 -05:00
Stuart Eichert
702a539762 Fix #296: Explicitly cast size_t results to unsigned when needed
This is rebased from #297, where AppVeyor had been failing, and which
was not properly based on the master branch.
2015-07-11 11:00:18 -05:00
Stuart Eichert
81cb7e5c5b Warn about implicit 64 to 32 bit conversions when using clang 2015-07-11 10:59:56 -05:00
Christopher Dunn
d259f608fd Merge pull request #303 from cdunn2001/appveyor-14.0
fix appveyor 32-bit windows build

I've backed rebased under #297 because AppVeyor has been failing since there, and because that was not properly based on master anyway.
2015-07-11 10:58:59 -05:00
Christopher Dunn
4652f818fe fix appveyor 32-bit windows build
* http://help.appveyor.com/discussions/problems/2229-v140-not-found-on-vs2105rc
```
Done Building Project "C:\projects\jsoncpp\jsoncpp.sln" (default targets) -- FAILED.

Build FAILED.

"C:\projects\jsoncpp\jsoncpp.sln" (default target) (1) ->
"C:\projects\jsoncpp\ALL_BUILD.vcxproj.metaproj" (default target) (2) ->
"C:\projects\jsoncpp\ZERO_CHECK.vcxproj" (default target) (3) ->
(PlatformPrepareForBuild target) ->
  C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V120\Microsoft.Cpp.Platform.targets(64,5): error MSB8020: The build tools for v140 (Platform Toolset = 'v140') cannot be found. To build using the v140 build tools, please install v140 build tools.  Alternatively, you may upgrade to the current Visual Studio tools by selecting the Project menu or right-click the solution, and then selecting "Upgrade Solution...". [C:\projects\jsoncpp\ZERO_CHECK.vcxproj]
```
2015-07-11 10:53:13 -05:00
Christopher Dunn
ce32274ba5 Merge pull request #295 from martyngigg/master
Allow an optional suffix on the debug library name in CMake
2015-07-01 03:18:33 -05:00
Martyn Gigg
717c791d4e Allow an optional suffix on the debug library name in CMake. 2015-06-29 19:20:08 +01:00
Christopher Dunn
6e52e272da Merge pull request #294 from cdunn2001/master
fix ,/. problem in reader
2015-06-19 00:10:32 -05:00
Christopher Dunn
6416350438 fix ,/. problem in reader
fixes #293
2015-06-18 22:45:36 -05:00
Christopher Dunn
bcb83b921c fix doxybuild.py for Windows
issue #287 (tylerknott@)
2015-06-18 22:26:44 -05:00
Christopher Dunn
3f05b1a897 Merge pull request #276 from bmyerz/master
make the unix cmake example work
2015-05-21 00:47:35 -05:00
Christopher Dunn
de2c85f576 Merge pull request #282 from keithkml/patch-1
Clarify which parts of README for users vs devs
2015-05-20 20:33:31 -05:00
Christopher Dunn
b389d81bf9 Merge pull request #280 from mgorny/pkg-config-fix
Fix custom includedir & libdir substitution in pkg-config
2015-05-20 20:24:37 -05:00
Keith Lea
89c51f3457 Clarify which parts of README for users vs devs
When I arrived at the JsonCpp GitHub page, as an intermediate C++ developer, I could not figure out how to include JsonCpp into my project. The changes I propose to the README make this much clearer, and define a clear distinction between which instructions are for those developing and contributing to JsonCpp, and those who are just using it.
2015-05-20 09:43:47 -07:00
Michał Górny
e6f1cffdd3 Fix custom includedir & libdir substitution in pkg-config
Do not prepend ${prefix} to substituted includedir & libdir
in the pkg-config file -- if the paths are overriden by user, CMake puts
absolute paths there (even if user specifies a relative path). Instead,
use the absolute path provided by CMake and appropriately default
LIBRARY_INSTALL_DIR & INCLUDE_INSTALL_DIR to absolute paths with
${CMAKE_INSTALL_PREFIX} prepended.

Fixes: https://github.com/open-source-parsers/jsoncpp/issues/279
Signed-off-by: Michał Górny <mgorny@gentoo.org>
2015-05-19 17:32:31 +02:00
Christopher Dunn
64441486ac Merge pull request #275 from stefan-it/stefan/cmake-generator-fix
[Documentation][Markdown] Use correct help option for cmake.
2015-05-19 01:53:11 -05:00
Christopher Dunn
d5e54d2609 Merge pull request #277 from gogo40/master
fix compile error on android

But note that we do not have continuous integration testing for Android. This could break again.

`snprintf` drives me crazy. It should have been part of every C library 20 years ago.
2015-05-19 01:52:57 -05:00
Péricles Lopes Machado
97e093a361 fix compile error on android 2015-05-18 14:31:05 -03:00
Brandon Myers
d57ac97db8 make the unix make example work
...by setting the archive directory variable
2015-05-18 10:06:21 -07:00
Stefan Schweter
31e9962754 [Documentation][Markdown] Use correct help option for cmake. 2015-05-17 13:04:40 +02:00
Christopher Dunn
5256551b03 address compilation probs for C++ Builder
BORLANDC compiler strangeness. Thanks to:

* Dan Liu
* Victor Chen

close #269
close #252
2015-04-28 05:08:58 +01:00
Christopher Dunn
6649009ffa another fix for BORLANDC 2015-04-28 04:57:49 +01:00
Christopher Dunn
2a10f4a3b8 move ctors for BORLAND 2015-04-28 04:55:12 +01:00
Christopher Dunn
28d086e1d9 Merge pull request #266 from cdunn2001/issue-252
Use unsigned for DuplicationPolicy, to fix a problem with "C++ Builder"
IDE.

Fixes #252.

Thanks to:

* Dan Liu -- http://blog.csdn.net/gzliudan/article/details/45264201
* Victor Chen -- http://www.cppfans.com/sdk/json/jsoncpp.asp
2015-04-27 18:28:06 -07:00
Christopher Dunn
a0a7c5f6de a little test for issue 252, but does not fail for me 2015-04-27 18:14:09 -07:00
Dan Liu
fcbab02e4a fix crash for "C++ Builder" IDE
http://blog.csdn.net/gzliudan/article/details/45264201
2015-04-27 18:10:12 -07:00
Christopher Dunn
f4ee48bc21 Merge pull request #265 from cdunn2001/valgrind
run valgrind in Travis CI
2015-04-26 20:08:17 -07:00
Christopher Dunn
88184d142b run valgrind in Travis CI
Because this runs apt-get, it will not work as-is for OSX. So when
we have OSX in Travis, we will have to wrap this somehow. See #250.

Closes #222.
2015-04-27 04:03:34 +01:00
Christopher Dunn
ae177fd901 Merge pull request #263 from cdunn2001/static-shared
Use standard **cmake** variables, to support superprojects better.

- `JSONCPP_LIB_BUILD_SHARED` -> `BUILD_SHARED_LIBS`
- `JSONCPP_LIB_BUILD_STATIC` -> `BUILD_STATIC_LIBS`
2015-04-23 08:58:38 -07:00
Gaurav
3f6345234f Use standard CMake variables - static/shared lib.
Replace JSONCPP_LIB_BUILD_SHARED => BUILD_SHARED_LIBS
2015-04-23 07:32:19 -07:00
Gaurav
a53070c42b Use standard CMake variables - static/shared lib.
Replace JSONCPP_LIB_BUILD_SHARED => BUILD_SHARED_LIBS
2015-04-23 07:32:19 -07:00
Gaurav
c09e121aeb Use standard CMake variables - static/shared lib.
Replace JSONCPP_LIB_BUILD_SHARED => BUILD_SHARED_LIBS
2015-04-23 07:32:18 -07:00
Gaurav
4f8ec9d207 Use standard CMake variables - static/shared lib.
Replaced JSONCPP_LIB_BUILD_SHARED => BUILD_SHARED_LIBS
Replaced JSONCPP_LIB_BUILD_STATIC => BUILD_STATIC_LIBS
2015-04-23 07:32:18 -07:00
Gaurav
0fe61a68f8 Use standard CMake variables - static/shared lib.
Replaced JSONCPP_LIB_BUILD_SHARED => BUILD_SHARED_LIBS
2015-04-23 07:32:18 -07:00
Gaurav
43019088f0 Use standard CMake variables - static/shared lib.
Replaced JSONCPP_LIB_BUILD_SHARED => BUILD_SHARED_LIBS
Moved flag JSON_DLL to line no 8.
2015-04-23 07:32:18 -07:00
Gaurav
0c1c076b7c Use standard CMake variables - static/shared lib.
Replaced JSONCPP_LIB_BUILD_SHARED => BUILD_SHARED_LIBS
Moved definition DJSON_DLL to line 11.
2015-04-23 07:32:18 -07:00
Gaurav
11130997c3 Use standard CMake variables - static/shared lib.
Replace JSONCPP_LIB_BUILD_SHARED => BUILD_SHARED_LIBS
Replace JSONCPP_LIB_BUILD_STATIC => BUILD_STATIC_LIBS
Removed workaround  https://github.com/open-source-parsers/jsoncpp/issues/51
Removed OPTION for shared/static in this file.
2015-04-23 07:32:17 -07:00
Gaurav
30bb4ccb67 Use standard CMake variables - static/shared lib.
Currently the JSONCPP_LIB_BUILD_SHARED variable is used as the option to build static/shared libraries.
The current patch uses standard CMake variables for this.
Such a workaround is done in https://github.com/open-source-parsers/jsoncpp/issues/51
The current patch makes this generic.
2015-04-23 18:39:00 +05:30
Christopher Dunn
74143f39e7 fix leak in unit-tests 2015-04-22 19:33:41 -07:00
Christopher Dunn
56650e83c5 swap docs for default vs. strictMode 2015-04-20 13:10:31 -07:00
Christopher Dunn
441f8cdfa1 Merge pull request #244 from cdunn2001/appveyor
New `appveyor.yml`: All tests pass, in both Appveyor and Travis!

Henceforth, GitHub will run both for any pull-request, so this file will be needed in the `0.y.z` branch too.
2015-04-18 17:14:33 -07:00
Christopher Dunn
a658759039 maybe fix an error 2015-04-16 18:33:39 -07:00
Christopher Dunn
0eb0e502c8 add a comment, to force a build 2015-04-16 18:30:24 -07:00
Marek Kotewicz
e983204906 appveyor deploy init 2015-04-16 18:12:18 -07:00
Marek Kotewicz
fe06acb587 fixed version on appveyor build 2015-04-16 18:12:17 -07:00
Marek Kotewicz
1b49a55ea1 appveyor multiple platforms 2015-04-16 18:12:17 -07:00
Marek Kotewicz
13c36e9807 appveyor.yml 2015-04-16 18:12:17 -07:00
Christopher Dunn
50069d72da prefer std::string for setComment()
in case of embedded nulls
2015-04-11 14:49:28 -05:00
Christopher Dunn
24682e37bf 1.6.2 <- 1.6.1
Fix UTF-8 for old (deprecated) Writers.

* Do not truncate at embedded zeroes.
2015-04-11 14:45:33 -05:00
Christopher Dunn
c2b988ee74 Merge pull request #241 from cdunn2001/fix-more-utf8
support UTF-8 (specifically, embedded zeroes) in old Writers
2015-04-11 14:44:41 -05:00
Christopher Dunn
e255ce31a4 support UTF-8 in old Writers
We had already fixed Value to hold UTF-8 properly, but only the newer
StreamWriter was writing UTF-8 properly.

Old FasterWriter etc. were using asCString() instead of asString() in
Value::writeValue().

Hopefully this change does not break any existing code. Seems unlikely.

issue #240
2015-04-11 14:41:30 -05:00
Christopher Dunn
779b5bc5ba Merge pull request #239 from sbc100/copyright
Add copyright information to .py files
2015-04-11 14:41:03 -05:00
Sam Clegg
63860617b3 Add copyright information to .py files
This change adds explicit copyright information to Python
files. The copyright year used in each case is the
date of the first git commit of each file.

The goal is to allow jsoncpp to be integrated into the
chromium source tree which requires license information in
each source file.

fixes #234
2015-04-09 18:05:47 -07:00
Christopher Dunn
9cb88d2ca6 1.6.1 <- 1.6.0 2015-03-31 15:07:14 -05:00
Christopher Dunn
363e51c0a9 Merge pull request #232 from cdunn2001/fix-snprintf
Fix snprintf

Well, it passes Travis. But when we have time, we should clean up how snprintf is used in both reader and writer.
2015-03-31 15:06:11 -05:00
Christopher Dunn
240ddb6a1b use std::snprintf for C++11 2015-03-31 15:04:24 -05:00
Baruch Siach
9dd77dc0ef Revert "Use std namespace for snprintf."
This reverts commit 1c58876185.

std::snprintf() is only available in C++11, which is not provided by
all compilers. Since the C library snprintf() can easily be used as a
replacement on Linux systems, this patch changes jsoncpp to use the C
library snprintf() instead of C++11 std::snprintf(), fixing the build error
below:

    src/lib_json/json_writer.cpp:33:18: error: 'snprintf' is not a member of 'std'

See #231, #224, and #218.
2015-03-31 15:04:24 -05:00
Christopher Dunn
244b1496e1 Merge pull request #225 from selaselah/master
fix find_program() bug: no result in not-win sys
2015-03-31 11:32:06 -05:00
selaselah
c083835261 fix find_program() bug: no result in not-win sys 2015-03-19 19:18:58 +08:00
Christopher Dunn
cbe7e7c9cb Merge pull request #221 from btolfa/forgotten-virtual-dtor
Added forgotten virtual dtor for `Json::CharReader::Factory`.

(Without this, the destructor of the derived `CharReaderBuilder` would not be called, which is a small memory leak.)
2015-03-15 13:49:24 -05:00
Tengiz Sharafiev
be183def8f Update reader.h 2015-03-14 21:30:00 +03:00
Christopher Dunn
951bd3d05d Merge pull request #219 from cdunn2001/c-std-headers
Close #218. Fix #214.
2015-03-11 21:36:51 -05:00
Connor Manning
1c58876185 Use std namespace for snprintf. 2015-03-11 21:33:08 -05:00
Connor Manning
2f2034629e Constrain MSVC _isfinite to before 2013, remove duplicate includes. 2015-03-11 21:33:08 -05:00
Dani-Hub
7020451b44 Fix isfinite for MSVC. 2015-03-11 21:32:59 -05:00
Connor Manning
80497f102e Use C++ standard headers. 2015-03-10 18:48:45 -05:00
Dani-Hub
f9feb66be2 Change exception data member
from "reference to string" to "string" (Resolves the most serious part of issue #216)
2015-03-09 18:42:16 -05:00
Christopher Dunn
ed495edcc1 prefer ValueIterator::name() to ::memberName()
in case of embedded nulls
2015-03-08 14:35:00 -05:00
Christopher Dunn
3c0a383877 Merge pull request #212 from cdunn2001/macro-deprec
close #210
2015-03-08 13:10:37 -05:00
Dani-Hub
5003983029 Make preprocessor query robust against older gcc versions 2015-03-08 13:07:27 -05:00
Dani-Hub
871b311e7e Provide JSONCPP_DEPRECATED definitions for clang and gcc 2015-03-08 13:07:27 -05:00
Christopher Dunn
cdbc35f6ac 1.6.0 2015-03-08 12:57:13 -05:00
Christopher Dunn
4e30c4fcdb comments 2015-03-08 12:56:32 -05:00
Christopher Dunn
0d33cb3639 Merge pull request #211 from cdunn2001/except
* Add Json::Exception and derivatives.
* Clarify when exceptions are thrown, to avoid crashes caused by malicious input.
* Use our own type (derived from std::exception) so they are trappable.
2015-03-08 12:50:34 -05:00
Christopher Dunn
2250b3c29d use Json::RuntimeError 2015-03-08 12:44:55 -05:00
Christopher Dunn
9376368d86 use Json::LogicError in macros 2015-03-08 12:42:53 -05:00
Christopher Dunn
5383794cc9 Runtime/LogicError and throwers 2015-03-08 12:31:57 -05:00
Christopher Dunn
75279ccec2 base Json::Exception 2015-03-08 12:20:06 -05:00
Christopher Dunn
717b08695e clarify errors
* use macros for logic errors, not input errors
* throw on parsing failure in `operator>>()`, not assert
* throw on malloc, not assert
2015-03-08 12:06:22 -05:00
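A usage sketch of the exception types this PR introduces (Json::Exception, with Json::RuntimeError and Json::LogicError derived from it); the malformed document is an arbitrary example.
```cpp
#include <json/json.h>
#include <iostream>
#include <sstream>

int main() {
  std::istringstream bad("{ not valid json");
  Json::Value root;
  try {
    bad >> root;  // per the commit above, operator>>() now throws on parse failure
  } catch (const Json::RuntimeError& e) {  // input errors
    std::cerr << "runtime: " << e.what() << "\n";
  } catch (const Json::Exception& e) {     // common base, also catches LogicError
    std::cerr << "other: " << e.what() << "\n";
  }
  return 0;
}
```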
Christopher Dunn
ee4ea0ec3f delete debug code from test 2015-03-07 15:47:39 -06:00
Christopher Dunn
ce19001238 require length
Ugh! I meant to do this long ago. It would have caught my blunder.
2015-03-07 15:12:52 -06:00
Christopher Dunn
078f991c57 1.5.4 <- 1.5.3
important bug-fix (thx to datadiode@)
2015-03-07 14:52:01 -06:00
Christopher Dunn
72b5293695 Merge pull request #207 from cdunn2001/fix_CZString_copy_constructor
Fix czstring copy constructor
2015-03-07 14:49:54 -06:00
Christopher Dunn
a63d82d78a drop unused CString ctor case
`Value::CZString::CZString(char const* str, unsigned length, DuplicationPolicy allocate)` with `allocate == duplicate` does not happen.
2015-03-07 14:43:37 -06:00
datadiode
ee83f8891c Trivial fixes in CZString constructors. 2015-03-07 14:43:07 -06:00
Christopher Dunn
5c448687e1 fix ValueTest/zeroes* 2015-03-07 14:41:15 -06:00
Christopher Dunn
401e98269e old-style enum namespacing 2015-03-06 16:11:55 -06:00
Christopher Dunn
b2a7438d08 Merge pull request #205 from open-source-parsers/reject-dup-keys
[Shekhar (shakers007) wrote](https://sourceforge.net/p/jsoncpp/bugs/22/):

> As per RFC4627 (section 2.2), names within an object should be unique. When using JSONCPP's strict mode, parsing such an object should fail.
2015-03-06 12:58:55 -06:00
Christopher Dunn
62ad140d18 rejectDupKeys 2015-03-06 12:39:05 -06:00
Christopher Dunn
527332d5d5 add rejectDupKeys feature - not yet impld 2015-03-06 12:38:58 -06:00
Christopher Dunn
cada3b951f test for repeated key in strictMode
https://sourceforge.net/p/jsoncpp/bugs/22/
2015-03-06 12:38:00 -06:00
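A usage sketch of the feature added in these commits; the key name "rejectDupKeys" comes from the commit and the document is arbitrary.
```cpp
#include <json/json.h>
#include <iostream>
#include <memory>
#include <string>

int main() {
  const std::string doc = "{ \"a\": 1, \"a\": 2 }";  // duplicate key
  Json::CharReaderBuilder b;
  b["rejectDupKeys"] = true;                          // setting added above
  std::unique_ptr<Json::CharReader> reader(b.newCharReader());
  Json::Value root;
  std::string errs;
  bool ok = reader->parse(doc.data(), doc.data() + doc.size(), &root, &errs);
  std::cout << (ok ? "accepted" : "rejected: " + errs) << "\n";
  return 0;
}
```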
Christopher Dunn
ff61752444 change str_ for cross-compilation
https://sourceforge.net/p/jsoncpp/bugs/59/
2015-03-06 10:31:46 -06:00
Christopher Dunn
7f439f4276 clarify operator= 2015-03-06 09:22:57 -06:00
Christopher Dunn
3976f17ffd test assignment over-writes comments, but swapPayload() does not 2015-03-06 09:16:19 -06:00
Christopher Dunn
80ca11bb41 test commentBefore
for issue #203
2015-03-06 05:55:19 -06:00
Christopher Dunn
2fc08b4ebd clarify which versions work with old compilers 2015-03-05 21:45:42 -06:00
Christopher Dunn
239c733ab5 1.5.3 <- 1.5.2 2015-03-05 18:27:52 -06:00
Christopher Dunn
295e73ff3c generate both version.h and version from CMakelists.txt
This forces consistency, since they will be re-generated whenever
a git operation alters CMakelists.txt. They are still in the repo
because users might not actually run cmake.
2015-03-05 18:27:39 -06:00
Christopher Dunn
2a840c105c had trouble finding Python on Windows
With this change, `make jsoncpp_check` will still fail if Python
is missing, so our CI tests are unaffected.
2015-03-05 17:42:12 -06:00
Christopher Dunn
7ec98dc9fe Merge pull request #202 from open-source-parsers/get-with-zero
`Value::get(key, default)` with zero
2015-03-05 16:56:56 -06:00
Christopher Dunn
0fd2875a44 fix get() for embedded zeroes in key
This method had been overlooked.
2015-03-05 16:47:29 -06:00
Christopher Dunn
d31151d150 test get(key, default) 2015-03-05 16:44:50 -06:00
Christopher Dunn
f50145fbda Merge pull request #201 from open-source-parsers/vs
* Copy .dll for running unit-tests in VisualStudio.
* Stop using `do{}while(0)` idiom b/c VisualStudio warns.
* Fix a warning in a test.
2015-03-05 16:37:42 -06:00
Christopher Dunn
b3e6f3d70f drop do{}while(0) idiom
Rationale:
  * http://stackoverflow.com/questions/154136/do-while-and-if-else-statements-in-c-c-macros/154138#154138

But Visual Studio issues a warning: `warning C4127: conditional expression is constant`
  * http://stackoverflow.com/questions/1946445/c-c-how-to-use-the-do-while0-construct-without-compiler-warnings-like-c412
2015-03-05 15:26:29 -06:00
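For reference, a generic illustration of the idiom being dropped and one warning-free alternative (the macro names are made up): do{...}while(0) makes a multi-statement macro behave like a single statement, but Visual Studio flags the constant condition with C4127.
```cpp
#include <cstdio>

// Classic form: safe after an un-braced `if`, but triggers VS warning C4127.
#define LOG_TWICE(msg)                                                         \
  do {                                                                         \
    std::puts(msg);                                                            \
    std::puts(msg);                                                            \
  } while (0)

// One warning-free alternative: fold the statements into a single expression.
#define LOG_TWICE_ALT(msg) (std::puts(msg), std::puts(msg))
```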
Christopher Dunn
13e063c336 copy .dll for unit-test
Fix 2nd problem in issue #200.
* http://stackoverflow.com/questions/10671916/how-to-copy-dll-files-into-the-same-folder-as-the-executable-using-cmake

Q: What about the Python tests?
A: They are not normally run in Visual Studio. If desired, one can set PATH.
2015-03-05 15:23:20 -06:00
Christopher Dunn
f57da48f48 maybe address warning
cmake -DCMAKE_BUILD_TYPE=debug -DJSONCPP_LIB_BUILD_STATIC=OFF
-DJSONCPP_LIB_BUILD_SHARED=ON -G "Visual Studio 10" ../..

`potentially uninitialized local variable 'dist' (line 2212 of
test_lib_json/main.cpp)`
2015-03-05 14:41:42 -06:00
Christopher Dunn
eaa3fd8eca maybe fix DLL symbols for VS 2015-03-05 10:11:43 -06:00
Christopher Dunn
ff63d034e5 1.5.2 <- 1.5.1
* Fixed compile error for VS2013.
* Added `operator[]` to Builders. Recalling previous minor versions for
  bug-fixes.
2015-03-05 09:18:44 -06:00
Christopher Dunn
2aa4557e79 Merge pull request #199 from open-source-parsers/set-builders
Setting builders
2015-03-05 09:18:06 -06:00
Christopher Dunn
37dde9d29d fix example in docs 2015-03-05 09:18:06 -06:00
Christopher Dunn
c312dd5ef7 Builder::operator[] plus tests 2015-03-05 09:18:01 -06:00
Christopher Dunn
42d7e59fe0 fix compiler-error and warnings for VS2013
fix issue #200
2015-03-05 09:15:10 -06:00
Christopher Dunn
a8104a8035 1.5.1 <- 1.5.0
Fixed Builders ::validate(). Added tests.
2015-03-04 21:18:05 -06:00
Christopher Dunn
7b22768c33 test Builders::validate() 2015-03-04 21:17:03 -06:00
Christopher Dunn
19c49a459d fix Builders::validate()
(cherry picked from commit 626cfcdbb8)
2015-03-04 21:14:53 -06:00
Christopher Dunn
99822b27cd clarify python version 2015-03-03 16:17:11 -06:00
Christopher Dunn
8a70297869 fix inline doxygen comments 2015-03-03 16:17:08 -06:00
Christopher Dunn
24f544996f no struct init
The C struct initializer is not standard C++.
GCC and Clang handle this (at least in some versions) but some
compilers might not.
2015-03-03 10:15:09 -06:00
Christopher Dunn
0c91927da2 assertions should be logic_error 2015-03-03 09:45:33 -06:00
Christopher Dunn
493f6dcebe keep StaticString (!allocated_) for copy ctor 2015-03-03 09:36:22 -06:00
Christopher Dunn
eaa06355e1 test Json::Value::null 2015-03-03 08:45:52 -06:00
Christopher Dunn
effd732aa1 null -> nullRef 2015-03-03 01:25:33 -06:00
Christopher Dunn
70cd04d49a 1.5.0 <- 1.4.4 2015-03-03 00:45:57 -06:00
Christopher Dunn
9e49e3d84a Merge pull request #197 from cdunn2001/allow-zero
allow zeroes in strings
2015-03-03 00:44:19 -06:00
Christopher Dunn
2d653bd15d fix security hole for string-key-lengths > 2^30 2015-03-03 00:14:54 -06:00
Christopher Dunn
585b267595 tests for zeroes
* ValueTest/zeroes
* ValueTest/zeroesInKeys
2015-03-03 00:14:54 -06:00
Christopher Dunn
c28610fb5d fix StaticString test
* support zeroes in string_
* support zeroes in writer; provide getString(char**, unsigned*)
* valueToQuotedStringN(), isCC0(), etc
* allow zeroes for cpptl ConstString
* allocated => non-static
2015-03-03 00:14:54 -06:00
Christopher Dunn
a53283568f cp duplicateStringValue() 2015-03-03 00:14:53 -06:00
Christopher Dunn
ef21fbc785 doc new behavior 2015-03-03 00:14:53 -06:00
Christopher Dunn
25342bac13 support UTF-8 for const methods 2015-03-03 00:14:53 -06:00
Christopher Dunn
2b9abc3ebf Merge pull request #195 from cdunn2001/len-in-czstring
Store length in CZString
2015-03-03 00:07:03 -06:00
Christopher Dunn
e6b46e4503 stop computing strlen() in CZString 2015-03-02 23:50:59 -06:00
Christopher Dunn
8a77037320 actually store length in CZString 2015-03-02 23:50:59 -06:00
Christopher Dunn
57ad051f67 allow length in CZString 2015-03-02 23:50:59 -06:00
Christopher Dunn
b383fdc61e use memcmp in CZString
This is a loss of efficiency, but it prepares for an increase when we
have stored lengths.
2015-03-02 23:50:59 -06:00
Mattes D
5d79275a5b Added MSVC+CMake generated files to .gitignore.
pull #193
2015-03-02 23:50:14 -06:00
Christopher Dunn
c1e834a110 Merge pull request #194 from cdunn2001/master
test StaticString
2015-03-02 17:14:36 -06:00
Christopher Dunn
0570f9eefb test StaticString 2015-03-02 17:06:09 -06:00
Christopher Dunn
b947b0b3df Merge pull request #192 from marklakata/size_t
changed length argument of duplicateStringValue from unsigned int to siz...
2015-03-02 17:03:24 -06:00
Mark Lakata
8051cf6ba7 changed length argument of duplicateStringValue from unsigned int to size_t, to avoid warnings in Visual Studio. 2015-03-02 11:57:17 -08:00
Christopher Dunn
c8bf184ea9 Merge pull request #189 from open-source-parsers/skip-py-tests
skip python jsontestrunner by default
2015-02-26 09:55:21 -06:00
Christopher Dunn
9998094eee skip python jsontestrunner by default
To run these tests, in cmake build-dir:
    make jsoncpp_check

TravisCI is now set to run these always.

For now, the test_json_lib unit-tests will still run.

issue #187
2015-02-26 09:41:45 -06:00
Christopher Dunn
4ac4cac2be Merge pull request #185 from open-source-parsers/drop-internal-map
`JSON_VALUE_USE_INTERNAL_MAP` and `JSON_USE_SIMPLE_INTERNAL_ALLOCATOR` did not actually compile with current compilers. In fact, as far as I can tell, they haven't worked since just after the 0.6.0-rc2 release. And according to blep, there are bugs, e.g. [this one](https://sourceforge.net/p/jsoncpp/bugs/27). It's probably better to hide the implementation of Value (maybe in the next major release) and to put experiments like this into completely separate classes.
2015-02-25 10:12:20 -06:00
Christopher Dunn
4788764844 drop JSON_VALUE_USE_INTERNAL_MAP, JSON_USE_SIMPLE_INTERNAL_ALLOCATOR
And remove some old headers.

These were not actually compiling anymore, and there were outstanding,
known bugs, e.g. https://sourceforge.net/p/jsoncpp/bugs/27
2015-02-25 10:04:13 -06:00
Christopher Dunn
6c898441e3 test-amalgamate
Not tested via Travis (since this is outside cmake), but at least
we can more easily validate changes to `amalgamate.py`.
2015-02-25 10:04:12 -06:00
Christopher Dunn
7d1f656859 Merge pull request #183 from open-source-parsers/allow-single-quote
Allow single quote
2015-02-24 18:24:42 -06:00
Christopher Dunn
0c66e698fb allowSingleQuotes
issue #182
2015-02-24 15:49:45 -06:00
Christopher Dunn
b9229b7400 failing test for allowSingleQuotes 2015-02-23 16:40:06 -06:00
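A usage sketch of the new reader setting; the key name "allowSingleQuotes" is taken from the commit, while the helper and document are illustrative.
```cpp
#include <json/json.h>
#include <memory>
#include <string>

// Sketch: parse a document that uses single-quoted strings.
bool parseSingleQuoted(const std::string& doc, Json::Value* out, std::string* errs) {
  Json::CharReaderBuilder b;
  b["allowSingleQuotes"] = true;  // setting added in the commit above
  std::unique_ptr<Json::CharReader> reader(b.newCharReader());
  return reader->parse(doc.data(), doc.data() + doc.size(), out, errs);
}
// e.g. parseSingleQuoted("{ 'name': 'value' }", &root, &errs)
```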
Christopher Dunn
f9db82af17 1.4.4 < 1.4.3
* fixed Readers when `allowDropNullPlaceholders`
2015-02-19 15:37:57 -06:00
Christopher Dunn
ae71879549 Merge pull request #179 from cdunn2001/drop-null
Fix failures for `allowDroppedNullPlaceholders`.
2015-02-19 12:38:16 -06:00
Christopher Dunn
7b3683ccd1 apply fix to old Reader 2015-02-19 11:37:17 -06:00
Christopher Dunn
58499031a4 fix all cases from issue -- all pass! 2015-02-19 11:35:28 -06:00
Christopher Dunn
2c8c1ac0ec all 8 failing cases from issue 2015-02-19 11:35:20 -06:00
Christopher Dunn
c58e93b014 fix failing object case 2015-02-19 11:34:35 -06:00
Christopher Dunn
eed193e151 object cases; 1st passes, 2nd fails 2015-02-19 11:34:35 -06:00
Christopher Dunn
4382a7b585 cases 0,1,5,9,12 from issue -- passing 2015-02-19 11:32:42 -06:00
Christopher Dunn
30d923f155 1.4.3 <- 1.4.2 2015-02-18 09:20:28 -06:00
Christopher Dunn
2f4e40bc95 fix typo: issue #177 2015-02-18 09:17:06 -06:00
Christopher Dunn
505e086ebc Merge pull request #173 from cdunn2001/fix-include-amal
help use of amalgamated header

Reviewed by "Lightness Races in Orbit" at StackOverflow.
2015-02-15 13:11:38 -06:00
Christopher Dunn
c6582415d8 fix typos 2015-02-15 13:06:48 -06:00
Christopher Dunn
0ee7e2426f help use of amalgamated header
* Include from same-directory first, so `-I DIR` is rarely needed.
* Double-check that proper header has been included.

http://stackoverflow.com/questions/28510109/linking-problems-with-jsoncpp
2015-02-15 13:06:48 -06:00
Christopher Dunn
1522e4dfb1 Merge pull request #174 from kobigurk/kobigurk-patch-asserts
only throws exceptions JSON_USE_EXCEPTION
2015-02-15 12:05:28 -06:00
Kobi Gurkan
09b8670536 only throws exceptions JSON_USE_EXCEPTION
JSON_ASSERT now throws a runtime_error
2015-02-15 18:00:31 +02:00
Christopher Dunn
f164288646 help rebasing 2015-02-15 03:01:26 -06:00
Christopher Dunn
3bfd215938 1.4.2 < 1.4.1
* Bug-fix for ValueIterator::operator-() (issue #169)
2015-02-15 02:49:34 -06:00
Christopher Dunn
400b744195 Merge pull request #172 from cdunn2001/master
Fix bug in ValueIteratorBase::operator- 

Fixes #169.
2015-02-15 02:44:17 -06:00
Christopher Dunn
bd55164089 reverse sense for CPPTL too 2015-02-15 02:38:31 -06:00
Kevin Grant
4c5832a0be Fix bug in ValueIteratorBase::operator- 2015-02-15 02:38:31 -06:00
Christopher Dunn
8ba9875962 IteratorTest 2015-02-15 02:38:31 -06:00
Christopher Dunn
9c91b995dd rules for signing and doc-generation 2015-02-14 10:20:45 -06:00
Christopher Dunn
e7233bf056 1.4.1 <- 1.4.0 2015-02-13 10:00:38 -06:00
Christopher Dunn
9c90456890 Merge pull request #167 from cdunn2001/fail-if-extra
Add `failIfExtra` feature to `CharReaderBuilder`.
2015-02-13 09:55:11 -06:00
Christopher Dunn
f4be815c86 failIfExtra
1. failing regression tests, from #164 and #107
2. implemented; tests pass
3. allow trailing comments
2015-02-13 09:39:08 -06:00
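A usage sketch of the feature; the key name "failIfExtra" comes from the commit, while the helper and sample document are illustrative.
```cpp
#include <json/json.h>
#include <memory>
#include <string>

// Sketch: with "failIfExtra" enabled, trailing garbage after the root value
// makes parse() fail instead of being silently ignored.
bool parseStrictly(const std::string& doc, Json::Value* out, std::string* errs) {
  Json::CharReaderBuilder b;
  b["failIfExtra"] = true;  // feature added in the commit above
  std::unique_ptr<Json::CharReader> reader(b.newCharReader());
  return reader->parse(doc.data(), doc.data() + doc.size(), out, errs);
}
// parseStrictly("{} extra", &root, &errs) is expected to return false
```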
Christopher Dunn
aa13a8ba40 comments/minor typos 2015-02-13 09:38:49 -06:00
Christopher Dunn
da0fcfbaa2 link web docs 2015-02-12 11:45:21 -06:00
Christopher Dunn
3ebba5cea8 stop calling validate() in newReader/Writer()
By not calling validate(), we can add
non-invasive features which will be simply ignored when user-code
is compiled against an old version. That way, we can often
avoid a minor version-bump.

The user can call validate() himself if he prefers that behavior.
2015-02-11 11:15:32 -06:00
Christopher Dunn
acbf4eb2ef Merge pull request #166 from cdunn2001/stackLimit
Fixes #88 and #56.
2015-02-11 10:35:16 -06:00
Christopher Dunn
56df206847 limit stackDepth for old (deprecated) Json::Reader too
This is an improper solution. If multiple Readers exist,
then the effective stackLimit is reduced because of side-effects.
But our options are limited. We need to address the security
hole without breaking binary-compatibility.

However, this is not likely to cause any practical problems because:

* Anyone using `operator>>(istream, Json::Value)` will be using the
new code already
* Multiple Readers are uncommon.
* The stackLimit is quite high.
* Deeply nested JSON probably would have hit the system limits anyway.
2015-02-11 10:20:53 -06:00
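A minimal sketch of configuring the limit on the new reader path, assuming the setting keeps the `stackLimit` name used in these commits.

```c++
#include <json/json.h>
#include <exception>
#include <iostream>
#include <sstream>

int main() {
  Json::CharReaderBuilder builder;
  builder["stackLimit"] = 10; // cap nesting depth to guard against stack exhaustion

  Json::Value root;
  std::string errs;
  std::istringstream deep("[[[[[[[[[[[[1]]]]]]]]]]]]"); // 12 levels, over the limit
  try {
    if (Json::parseFromStream(builder, deep, &root, &errs))
      std::cout << "parsed" << std::endl;
    else
      std::cout << "rejected: " << errs << std::endl;
  } catch (const std::exception& e) {
    // depending on the jsoncpp version, exceeding the limit may surface as an exception
    std::cout << "rejected: " << e.what() << std::endl;
  }
  return 0;
}
```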
Christopher Dunn
4dca80da49 limit stackDepth 2015-02-11 10:20:47 -06:00
Christopher Dunn
249ad9f47f stackLimit 2015-02-11 10:01:58 -06:00
Christopher Dunn
99b8e856f6 stackLimit_ 2015-02-11 10:01:58 -06:00
Christopher Dunn
89b72e1653 test stackLimit 2015-02-11 10:01:58 -06:00
Christopher Dunn
2474989f24 Old -> Our 2015-02-11 09:48:24 -06:00
Christopher Dunn
315b8c9f2c 1st StreamWriterTest 2015-02-10 23:29:14 -06:00
Christopher Dunn
29501c4d9f clarify comments
And throw instead of return null for invalid settings.
2015-02-10 23:03:27 -06:00
Christopher Dunn
7796f20eab Merge pull request #165 from cdunn2001/master
Remove some experimental classes that are not needed for 1.4.0. This also helps 0.8.0 binary compatibility with 0.6.0-rc2.
2015-02-10 22:45:32 -06:00
Christopher Dunn
20d09676c2 drop experimental OldCompressingStreamWriterBuilder 2015-02-10 21:29:35 -06:00
Christopher Dunn
5a744708fc enableYAMLCompatibility and dropNullPlaceholders for StreamWriterBuilder 2015-02-10 21:28:13 -06:00
Christopher Dunn
07f0e9308d nullRef, since we had to add that kludge to 0.8.0 2015-02-10 21:28:13 -06:00
Christopher Dunn
052050df07 copy Features to OldFeatures 2015-02-10 17:01:08 -06:00
Christopher Dunn
435d2a2f8d passes 2015-02-10 17:01:08 -06:00
Christopher Dunn
6123bd1505 copy Reader impl to OldReader 2015-02-10 17:01:08 -06:00
Christopher Dunn
7477bcfa3a renames for OldReader 2015-02-10 17:01:08 -06:00
Christopher Dunn
5e3e68af2e OldReader copied from Reader 2015-02-10 17:01:08 -06:00
Christopher Dunn
04a607d95b Merge pull request #163 from cdunn2001/master
Reimplement the new Builders.

Issue #131.
2015-02-09 18:55:55 -06:00
Christopher Dunn
db75cdf21e mv CommentStyle to .cpp 2015-02-09 18:54:58 -06:00
Christopher Dunn
c41609b9f9 set output stream in write(), not in builder 2015-02-09 18:44:53 -06:00
Christopher Dunn
b56381a636 <stdexcept> 2015-02-09 18:29:11 -06:00
Christopher Dunn
f757c18ca0 add all features 2015-02-09 18:24:56 -06:00
Christopher Dunn
3cf9175bde remark defaults via doxygen snippet 2015-02-09 18:16:24 -06:00
Christopher Dunn
a9e1ab302d Builder::settings_
We use Json::Value to configure the builders so we can maintain
binary-compatibility easily.
2015-02-09 17:30:11 -06:00
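In practice the settings object behaves like any other `Json::Value`; a short sketch, paired with `writeString()` from the commit just below.

```c++
#include <json/json.h>
#include <iostream>

int main() {
  Json::Value root;
  root["name"] = "jsoncpp";

  Json::StreamWriterBuilder builder;
  builder["commentStyle"] = "None"; // settings_ is a plain Json::Value keyed by strings,
  builder["indentation"] = "  ";    // so new keys can appear without changing the ABI

  std::cout << Json::writeString(builder, root) << std::endl;
  return 0;
}
```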
Christopher Dunn
694dbcb328 update docs, writeString() 2015-02-09 15:25:57 -06:00
Christopher Dunn
732abb80ef Merge pull request #162 from cdunn2001/master
Deprecate the new Builders.
2015-02-09 11:55:54 -06:00
Christopher Dunn
f3b3358a0e deprecate current Builders 2015-02-09 11:51:06 -06:00
Christopher Dunn
1357cddf1e deprecate Builders
see issue #131
2015-02-09 11:46:27 -06:00
Christopher Dunn
8df98f6112 deprecate old Reader; separate Advanced Usage section 2015-02-09 11:15:39 -06:00
Christopher Dunn
16bdfd8af3 --in=doc/web_doxyfile.in 2015-02-09 11:15:11 -06:00
Christopher Dunn
ce799b3aa3 copy doxyfile.in 2015-02-09 10:36:55 -06:00
Christopher Dunn
3a65581b20 drop an old impl 2015-02-09 09:54:26 -06:00
Christopher Dunn
6451412c99 simplify basic docs 2015-02-09 09:44:26 -06:00
Christopher Dunn
66a8ba255f clarify Builders 2015-02-09 01:29:43 -06:00
Christopher Dunn
249fd18114 put version into docs 2015-02-09 00:50:27 -06:00
Christopher Dunn
a587d04f77 Merge pull request #161 from cdunn2001/master
CharReader/Builder

I guess we should bump the patch-level version. We will set the version properly soon...
2015-02-08 13:25:08 -06:00
Christopher Dunn
2c1197c2c8 CharReader/Builder
* CharReaderBuilder is similar to StreamWriterBuilder.
* use rdbuf(), since getline(string) is not required to handle EOF as delimiter
2015-02-08 13:22:09 -06:00
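A hedged sketch of the pattern described: slurp the stream through `rdbuf()` and hand the buffer to a `CharReader`; the file name is illustrative.

```c++
#include <json/json.h>
#include <fstream>
#include <iostream>
#include <sstream>

int main() {
  std::ifstream file("input.json"); // illustrative input path
  std::ostringstream contents;
  contents << file.rdbuf();         // rdbuf() reads the whole stream, unlike getline()
  const std::string doc = contents.str();

  Json::CharReaderBuilder builder;  // configured like StreamWriterBuilder, via operator[]
  Json::CharReader* reader = builder.newCharReader();
  Json::Value root;
  std::string errs;
  const bool ok = reader->parse(doc.data(), doc.data() + doc.size(), &root, &errs);
  delete reader;
  if (!ok)
    std::cerr << errs << std::endl;
  return ok ? 0 : 1;
}
```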
Christopher Dunn
2a94618589 Merge pull request #160 from cdunn2001/master
rm unique_ptr<>/shared_ptr<>, for pre-C++11
2015-02-08 13:10:18 -06:00
Christopher Dunn
dee4602b8f rm unique_ptr<>/shared_ptr<>, for pre-C++11 2015-02-08 11:54:49 -06:00
Christopher Dunn
ea2d167a38 Merge pull request #158 from cdunn2001/travis-with-cmake-package
JSONCPP_WITH_CMAKE_PACKAGE in Travis

I guess we don't really need to build shared and static separately either. Saves a little time, maybe?
2015-02-07 12:24:58 -06:00
Christopher Dunn
41edda5ebe JSONCPP_WITH_CMAKE_PACKAGE in Travis 2015-02-07 12:18:20 -06:00
Christopher Dunn
2941cb3fe2 Merge pull request #156 from cdunn2001/with-cmake-package
fix JSONCPP_WITH_CMAKE_PACKAGE #155
2015-02-07 11:44:24 -06:00
Christopher Dunn
636121485c fix JSONCPP_WITH_CMAKE_PACKAGE #155
mv JSONCPP_WITH_CMAKE_PACKAGE ahead of INSTALL def.
2015-02-07 11:39:16 -06:00
Christopher Dunn
fe855fb4dd drop nullptr
See issue #153.
2015-02-02 15:33:47 -06:00
Christopher Dunn
198cc350c5 drop scoped enum, for pre-C++11 compatibility 2015-01-29 13:49:21 -06:00
Peter Spiess-Knafl
5e8595c0e2 added cmake option to build static and shared libraries at once
See #147 and #149.
2015-01-27 18:22:43 -06:00
Christopher Dunn
38042b3892 docs 2015-01-26 11:38:38 -06:00
Christopher Dunn
3b5f2b85ca Merge pull request #145 from cdunn2001/simplify-builder
Simplify builder
2015-01-26 11:33:16 -06:00
Christopher Dunn
7eca3b4e88 gcc-4.6 (Travis CI) does not support 2015-01-26 11:17:42 -06:00
Christopher Dunn
999f5912f0 docs 2015-01-26 11:12:53 -06:00
Christopher Dunn
472d29f57b fix doc 2015-01-26 11:04:03 -06:00
Christopher Dunn
6065a1c142 make StreamWriterBuilder concrete 2015-01-26 11:01:15 -06:00
Christopher Dunn
28a20917b0 Move old FastWriter stuff out of new Builder 2015-01-26 10:47:42 -06:00
Christopher Dunn
177b7b8f22 OldCompressingStreamWriterBuilder 2015-01-26 10:44:20 -06:00
Christopher Dunn
9da9f84903 improve docs
including `writeString()`
2015-01-26 10:43:53 -06:00
Christopher Dunn
54b8e6939a Merge pull request #132 from cdunn2001/builder
StreamWriter::Builder

Deprecate old Writers, but include them in tests.

This should still be binary-compatible with 1.3.0.
2015-01-25 18:52:09 -06:00
Christopher Dunn
c7b39c2e25 deprecate old Writers
also, use withers instead of setters, and update docs
2015-01-25 18:45:59 -06:00
Christopher Dunn
d78caa3851 implement strange setting from FastWriter 2015-01-25 18:15:54 -06:00
Christopher Dunn
c6e0688e5a implement CommentStyle::None/indentation_=="" 2015-01-25 17:32:36 -06:00
Christopher Dunn
1e21e63853 default \t indentation, All comments 2015-01-25 16:01:59 -06:00
Christopher Dunn
dea6f8d9a6 incorporate 'proper newlines for comments' into new StreamWriter 2015-01-25 15:55:18 -06:00
Christopher Dunn
648843d148 clarify CommentStyle 2015-01-25 15:54:40 -06:00
Christopher Dunn
fe3979cd8a drop StreamWriterBuilderFactory, for now 2015-01-25 15:54:40 -06:00
Christopher Dunn
94665eab72 copy fixes from StyledStreamWriter 2015-01-25 15:54:40 -06:00
Christopher Dunn
9e4bcf354f test BuiltStyledStreamWriter too 2015-01-25 15:54:40 -06:00
Christopher Dunn
9243d602fe const stuff 2015-01-25 15:54:40 -06:00
Christopher Dunn
beb6f35c63 non-const write 2015-01-25 15:54:40 -06:00
Christopher Dunn
ceef7f5219 copied impl of StyledStreamWriter 2015-01-25 15:54:40 -06:00
Christopher Dunn
77ce057f14 fix comment 2015-01-25 15:54:40 -06:00
Christopher Dunn
d49ab5aee1 use new BuiltStyledStreamWriter in operator<<() 2015-01-25 15:54:40 -06:00
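A tiny illustration of the code path this touches; exact formatting depends on the writer defaults, so treat it as a sketch.

```c++
#include <json/json.h>
#include <iostream>

int main() {
  Json::Value root;
  root["answer"] = 42;
  // operator<<() now routes through the Builder-based styled writer internally.
  std::cout << root << std::endl;
  return 0;
}
```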
Christopher Dunn
4d649402b0 setIndentation() 2015-01-25 15:54:40 -06:00
Christopher Dunn
489707ff60 StreamWriter::Builder 2015-01-25 15:54:39 -06:00
Christopher Dunn
5fbfe3cdb9 StreamWriter 2015-01-25 15:54:39 -06:00
Christopher Dunn
948f29032e update docs 2015-01-25 15:54:07 -06:00
Christopher Dunn
964affd333 add back space before trailing comment 2015-01-25 15:49:02 -06:00
Christopher Dunn
c038e08efc Merge pull request #144 from cdunn2001/proper-comment-lfs
proper newlines for comments

This alters `StyledStreamWriter`, but not `StyledWriter`.
2015-01-25 15:10:38 -06:00
Christopher Dunn
74c2d82e19 proper newlines for comments
The logic is still messy, but it seems to work.
2015-01-25 15:05:09 -06:00
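A short sketch of the comment handling these writer changes affect; the comment text is illustrative.

```c++
#include <json/json.h>
#include <iostream>

int main() {
  Json::Value root;
  root["answer"] = 42;
  root["answer"].setComment("// chosen by fair dice roll", Json::commentBefore);

  Json::StyledStreamWriter writer;
  writer.write(std::cout, root); // the comment is emitted on its own line before the value
  return 0;
}
```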
Christopher Dunn
30726082f3 Merge pull request #143 from cdunn2001/rm-trailing-newlines
rm trailing newlines for *all* comments
2015-01-25 14:35:24 -06:00
Christopher Dunn
1e3149ab75 rm trailing newlines for *all* comments
This will make it easier to fix newlines consistently.
2015-01-25 14:32:13 -06:00
Christopher Dunn
7312b1022d Merge pull request #141 from cdunn2001/set-comment
Fix a border case which causes Value::CommentInfo::setComment() to crash
2015-01-25 11:37:02 -06:00
datadiode
2f046b584d Fix a border case which causes Value::CommentInfo::setComment() to crash
re: pull #140
2015-01-25 11:19:51 -06:00
Christopher Dunn
dd91914b1b TravisCI gcc-4.6 does not yet support -Wpedantic 2015-01-25 10:34:49 -06:00
Christopher Dunn
2a46e295ec Merge pull request #139 from cdunn2001/some-python-changes
Some python changes.

* Better messaging.
* Make `doxybuild.py` work with python3.4
2015-01-24 16:24:12 -06:00
Christopher Dunn
f4bc0bf4ec README.md 2015-01-24 16:21:12 -06:00
Christopher Dunn
f357688893 make doxybuild.py work with python3.4 2015-01-24 16:21:12 -06:00
Florian Meier
bb0c80b3e5 Doxybuild: Error message if doxygen not found
This patch introduces a better error message.

See discussion at pull #129.
2015-01-24 16:21:12 -06:00
Christopher Dunn
ff5abe76a5 update doxbuild.py 2015-01-24 16:21:12 -06:00
Christopher Dunn
9cc0bb80b2 update TarFile usage 2015-01-24 16:21:12 -06:00
Christopher Dunn
494950a63d rm extra whitespace in python, per PEP8 2015-01-24 16:21:12 -06:00
Christopher Dunn
7d82b14726 fix issue #90
We are static-casting to U, so we really have no reason to use
references.

However, if this comes up again, try applying -ffloat-store to
the target executable, per
    https://github.com/open-source-parsers/jsoncpp/issues/90
2015-01-24 14:34:54 -06:00
Christopher Dunn
2bc6137ada fix gcc warnings 2015-01-24 13:42:37 -06:00
Christopher Dunn
201904bfbb Merge pull request #138 from cdunn2001/fix-103
Fix #103.
2015-01-23 14:51:31 -06:00
Christopher Dunn
216ecd3085 fix test_comment_00 for #103 2015-01-23 14:28:44 -06:00
Christopher Dunn
8d15e51228 add test_comment_00
one-element array with comment, for issue #103
2015-01-23 14:28:21 -06:00
Christopher Dunn
9fbd12b27c Merge pull request #137 from cdunn2001/avoid-extra-newline
Avoid extra newline
2015-01-23 14:24:52 -06:00
Christopher Dunn
f8ca6cbb25 1.4.0 <- 1.3.0
Minor version bump, but we will wait for a few more commits this time
before tagging the release.
2015-01-23 14:23:31 -06:00
Christopher Dunn
d383056fbb avoid extra newlines in StyledStreamWriter
Add indented_ as a bitfield. (Verified that sizeof(StyledStreamWriter)
remains 96 for binary compatibility. But the new symbol requires a minor
version-bump.)
2015-01-23 14:23:31 -06:00
Christopher Dunn
ddb4ff7dec Merge pull request #136 from cdunn2001/test-both-styled-writers
Test both styled writers

Not only does this now test StyledStreamWriter the same way as StyledWriter, but it also makes the former work more like the latter, indenting separate lines of a comment before a value. Might break some user tests (as `operator<<()` uses `StyledStreamWriter`) but basically a harmless improvement.

All tests pass.
2015-01-23 13:55:45 -06:00
Christopher Dunn
3efc587fba make StyledStreamWriter work more like StyledWriter
tests pass
2015-01-23 13:36:10 -06:00
Christopher Dunn
70704b9a70 test both StyledWriter and StyledStreamWriter 2015-01-23 13:36:10 -06:00
Christopher Dunn
ac6bbbc739 show cmd in runjsontests.py 2015-01-23 13:36:10 -06:00
Christopher Dunn
26c52861b9 pass --json-writer StyledWriter 2015-01-23 13:36:10 -06:00
Christopher Dunn
3682f60927 --json-writer arg 2015-01-23 13:36:10 -06:00
Christopher Dunn
58c31ac550 mv try-block 2015-01-23 12:35:12 -06:00
Christopher Dunn
08cfd02d8c fix minor bugs in test-runner 2015-01-23 12:35:12 -06:00
Christopher Dunn
79211e1aeb Options class for test 2015-01-23 12:35:12 -06:00
Christopher Dunn
632c9b5032 cleaner 2015-01-23 12:35:12 -06:00
Christopher Dunn
05810a7607 cleaner 2015-01-23 12:35:12 -06:00
Christopher Dunn
942e2c999a unindent test-code 2015-01-23 12:35:12 -06:00
Christopher Dunn
2160c9a042 switch from StyledWriter to StyledStream writer in tests 2015-01-23 09:02:44 -06:00
100 changed files with 9642 additions and 5856 deletions

View File

@@ -29,8 +29,8 @@ PenaltyExcessCharacter: 1000000
PenaltyReturnTypeOnItsOwnLine: 60
PointerBindsToType: true
SpacesBeforeTrailingComments: 1
Cpp11BracedListStyle: false
Standard: Cpp03
Cpp11BracedListStyle: true
Standard: Cpp11
IndentWidth: 2
TabWidth: 8
UseTab: Never

11
.gitattributes vendored Normal file

@@ -0,0 +1,11 @@
* text=auto
*.h text
*.cpp text
*.json text
*.in text
*.sh eol=lf
*.bat eol=crlf
*.vcproj eol=crlf
*.vcxproj eol=crlf
*.sln eol=crlf
devtools/agent_vm* eol=crlf

26
.github/ISSUE_TEMPLATE/bug_report.md vendored Normal file

@@ -0,0 +1,26 @@
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: ''
assignees: ''
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1.
**Expected behavior**
A clear and concise description of what you expected to happen.
**Desktop (please complete the following information):**
- OS: [e.g. iOS]
- Meson version
- Ninja version
**Additional context**
Add any other context about the problem here.

View File

@@ -0,0 +1,20 @@
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: ''
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.

44
.gitignore vendored

@@ -1,4 +1,5 @@
/build/
/build-*/
*.pyc
*.swp
*.actual
@@ -6,8 +7,49 @@
*.process-output
*.rewrite
/bin/
/buildscons/
/libs/
/doc/doxyfile
/dist/
#/version
#/include/json/version.h
# MSVC project files:
*.sln
*.vcxproj
*.filters
*.user
*.sdf
*.opensdf
*.suo
# MSVC build files:
*.lib
*.obj
*.tlog/
*.pdb
# CMake-generated files:
CMakeFiles/
*.cmake
/pkg-config/jsoncpp.pc
jsoncpp_lib_static.dir/
# In case someone runs cmake in the root-dir:
/CMakeCache.txt
/Makefile
/include/Makefile
/src/Makefile
/src/jsontestrunner/Makefile
/src/jsontestrunner/jsontestrunner_exe
/src/lib_json/Makefile
/src/test_lib_json/Makefile
/src/test_lib_json/jsoncpp_test
*.a
# eclipse project files
.project
.cproject
/.settings/
# DS_Store
.DS_Store

View File

@@ -1,18 +1,65 @@
# Build matrix / environment variable are explained on:
# Build matrix / environment variables are explained on:
# http://about.travis-ci.org/docs/user/build-configuration/
# This file can be validated on:
# http://lint.travis-ci.org/
before_install: sudo apt-get install cmake
# This file can be validated on: http://www.yamllint.com/
# Or using the Ruby based travis command line tool:
# gem install travis --no-rdoc --no-ri
# travis lint .travis.yml
language: cpp
compiler:
- gcc
- clang
script: cmake -DJSONCPP_LIB_BUILD_SHARED=$SHARED_LIBRARY -DCMAKE_BUILD_TYPE=$BUILD_TYPE -DCMAKE_VERBOSE_MAKEFILE=$VERBOSE_MAKE . && make
env:
matrix:
- SHARED_LIBRARY=ON BUILD_TYPE=release VERBOSE_MAKE=false
- SHARED_LIBRARY=OFF BUILD_TYPE=release VERBOSE_MAKE=false
- SHARED_LIBRARY=OFF BUILD_TYPE=debug VERBOSE VERBOSE_MAKE=true
sudo: false
addons:
homebrew:
packages:
- meson
- ninja
update: false # do not update homebrew by default
apt:
sources:
- ubuntu-toolchain-r-test
- llvm-toolchain-xenial-8
packages:
- clang-8
- valgrind
matrix:
allow_failures:
- os: osx
include:
- name: Mac clang meson static release testing
os: osx
osx_image: xcode10.2
compiler: clang
env:
CXX="clang++"
CC="clang"
LIB_TYPE=static
BUILD_TYPE=release
script: ./.travis_scripts/meson_builder.sh
- name: Linux xenial clang meson static release testing
os: linux
dist: xenial
compiler: clang
env:
CXX="clang++"
CC="clang"
LIB_TYPE=static
BUILD_TYPE=release
# before_install and install steps only needed for linux meson builds
before_install:
- source ./.travis_scripts/travis.before_install.${TRAVIS_OS_NAME}.sh
install:
- source ./.travis_scripts/travis.install.${TRAVIS_OS_NAME}.sh
script: ./.travis_scripts/meson_builder.sh
- name: Linux xenial gcc cmake coverage
os: linux
dist: xenial
compiler: gcc
env:
CXX=g++
CC=gcc
DO_Coverage=ON
BUILD_TOOL="Unix Makefiles"
BUILD_TYPE=Debug
LIB_TYPE=shared
DESTDIR=/tmp/cmake_json_cpp
script: ./.travis_scripts/cmake_builder.sh
notifications:
email:
- aaronjjacobs@gmail.com
email: false

130
.travis_scripts/cmake_builder.sh Executable file

@@ -0,0 +1,130 @@
#!/usr/bin/env sh
# This script can be used on the command line directly to configure several
# different build environments.
# This is called by `.travis.yml` via Travis CI.
# Travis supplies $TRAVIS_OS_NAME.
# http://docs.travis-ci.com/user/multi-os/
# Our .travis.yml also defines:
# - BUILD_TYPE=Release/Debug
# - LIB_TYPE=static/shared
#
# Optional environmental variables
# - DESTDIR <- used for setting the install prefix
# - BUILD_TOOL=["Unix Makefile"|"Ninja"]
# - BUILDNAME <- how to identify this build on the dashboard
# - DO_MemCheck <- if set, try to use valgrind
# - DO_Coverage <- if set, try to do dashboard coverage testing
#
env_set=1
if ${BUILD_TYPE+false}; then
echo "BUILD_TYPE not set in environment."
env_set=0
fi
if ${LIB_TYPE+false}; then
echo "LIB_TYPE not set in environment."
env_set=0
fi
if ${CXX+false}; then
echo "CXX not set in environment."
env_set=0
fi
if [ ${env_set} -eq 0 ]; then
echo "USAGE: CXX=$(which clang++) BUILD_TYPE=[Release|Debug] LIB_TYPE=[static|shared] $0"
echo ""
echo "Examples:"
echo " CXX=$(which clang++) BUILD_TYPE=Release LIB_TYPE=shared DESTDIR=/tmp/cmake_json_cpp $0"
echo " CXX=$(which clang++) BUILD_TYPE=Debug LIB_TYPE=shared DESTDIR=/tmp/cmake_json_cpp $0"
echo " CXX=$(which clang++) BUILD_TYPE=Release LIB_TYPE=static DESTDIR=/tmp/cmake_json_cpp $0"
echo " CXX=$(which clang++) BUILD_TYPE=Debug LIB_TYPE=static DESTDIR=/tmp/cmake_json_cpp $0"
echo " CXX=$(which g++) BUILD_TYPE=Release LIB_TYPE=shared DESTDIR=/tmp/cmake_json_cpp $0"
echo " CXX=$(which g++) BUILD_TYPE=Debug LIB_TYPE=shared DESTDIR=/tmp/cmake_json_cpp $0"
echo " CXX=$(which g++) BUILD_TYPE=Release LIB_TYPE=static DESTDIR=/tmp/cmake_json_cpp $0"
echo " CXX=$(which g++) BUILD_TYPE=Debug LIB_TYPE=static DESTDIR=/tmp/cmake_json_cpp $0"
exit -1
fi
if ${DESTDIR+false}; then
DESTDIR="/usr/local"
fi
# -e: fail on error
# -v: show commands
# -x: show expanded commands
set -vex
env | sort
which cmake
cmake --version
echo ${CXX}
${CXX} --version
_COMPILER_NAME=`basename ${CXX}`
if [ "${BUILD_TYPE}" == "shared" ]; then
_CMAKE_BUILD_SHARED_LIBS=ON
else
_CMAKE_BUILD_SHARED_LIBS=OFF
fi
CTEST_TESTING_OPTION="-D ExperimentalTest"
# - DO_MemCheck <- if set, try to use valgrind
if ! ${DO_MemCheck+false}; then
valgrind --version
CTEST_TESTING_OPTION="-D ExperimentalMemCheck"
else
# - DO_Coverage <- if set, try to do dashboard coverage testing
if ! ${DO_Coverage+false}; then
export CXXFLAGS="-fprofile-arcs -ftest-coverage"
export LDFLAGS="-fprofile-arcs -ftest-coverage"
CTEST_TESTING_OPTION="-D ExperimentalTest -D ExperimentalCoverage"
#gcov --version
fi
fi
# Ninja = Generates build.ninja files.
if ${BUILD_TOOL+false}; then
BUILD_TOOL="Ninja"
export _BUILD_EXE=ninja
which ninja
ninja --version
else
# Unix Makefiles = Generates standard UNIX makefiles.
export _BUILD_EXE=make
fi
_BUILD_DIR_NAME="build-cmake_${BUILD_TYPE}_${LIB_TYPE}_${_COMPILER_NAME}_${_BUILD_EXE}"
mkdir -p ${_BUILD_DIR_NAME}
cd "${_BUILD_DIR_NAME}"
if ${BUILDNAME+false}; then
_HOSTNAME=`hostname -s`
BUILDNAME="${_HOSTNAME}_${BUILD_TYPE}_${LIB_TYPE}_${_COMPILER_NAME}_${_BUILD_EXE}"
fi
cmake \
-G "${BUILD_TOOL}" \
-DBUILDNAME:STRING="${BUILDNAME}" \
-DCMAKE_CXX_COMPILER:PATH=${CXX} \
-DCMAKE_BUILD_TYPE:STRING=${BUILD_TYPE} \
-DBUILD_SHARED_LIBS:BOOL=${_CMAKE_BUILD_SHARED_LIBS} \
-DCMAKE_INSTALL_PREFIX:PATH=${DESTDIR} \
../
ctest -C ${BUILD_TYPE} -D ExperimentalStart -D ExperimentalConfigure -D ExperimentalBuild ${CTEST_TESTING_OPTION} -D ExperimentalSubmit
# Final step is to verify that installation succeeds
cmake --build . --config ${BUILD_TYPE} --target install
if [ "${DESTDIR}" != "/usr/local" ]; then
${_BUILD_EXE} install
fi
cd -
if ${CLEANUP+false}; then
echo "Skipping cleanup: build directory will persist."
else
rm -r "${_BUILD_DIR_NAME}"
fi

View File

@@ -0,0 +1,81 @@
#!/usr/bin/env sh
# This script can be used on the command line directly to configure several
# different build environments.
# This is called by `.travis.yml` via Travis CI.
# Travis supplies $TRAVIS_OS_NAME.
# http://docs.travis-ci.com/user/multi-os/
# Our .travis.yml also defines:
# - BUILD_TYPE=release/debug
# - LIB_TYPE=static/shared
env_set=1
if ${BUILD_TYPE+false}; then
echo "BUILD_TYPE not set in environment."
env_set=0
fi
if ${LIB_TYPE+false}; then
echo "LIB_TYPE not set in environment."
env_set=0
fi
if ${CXX+false}; then
echo "CXX not set in environment."
env_set=0
fi
if [ ${env_set} -eq 0 ]; then
echo "USAGE: CXX=$(which clang++) BUILD_TYPE=[release|debug] LIB_TYPE=[static|shared] $0"
echo ""
echo "Examples:"
echo " CXX=$(which clang++) BUILD_TYPE=release LIB_TYPE=shared DESTDIR=/tmp/meson_json_cpp $0"
echo " CXX=$(which clang++) BUILD_TYPE=debug LIB_TYPE=shared DESTDIR=/tmp/meson_json_cpp $0"
echo " CXX=$(which clang++) BUILD_TYPE=release LIB_TYPE=static DESTDIR=/tmp/meson_json_cpp $0"
echo " CXX=$(which clang++) BUILD_TYPE=debug LIB_TYPE=static DESTDIR=/tmp/meson_json_cpp $0"
echo " CXX=$(which g++) BUILD_TYPE=release LIB_TYPE=shared DESTDIR=/tmp/meson_json_cpp $0"
echo " CXX=$(which g++) BUILD_TYPE=debug LIB_TYPE=shared DESTDIR=/tmp/meson_json_cpp $0"
echo " CXX=$(which g++) BUILD_TYPE=release LIB_TYPE=static DESTDIR=/tmp/meson_json_cpp $0"
echo " CXX=$(which g++) BUILD_TYPE=debug LIB_TYPE=static DESTDIR=/tmp/meson_json_cpp $0"
exit -1
fi
if ${DESTDIR+false}; then
DESTDIR="/usr/local"
fi
# -e: fail on error
# -v: show commands
# -x: show expanded commands
set -vex
env | sort
which python3
which meson
which ninja
echo ${CXX}
${CXX} --version
python3 --version
meson --version
ninja --version
_COMPILER_NAME=`basename ${CXX}`
_BUILD_DIR_NAME="build-${BUILD_TYPE}_${LIB_TYPE}_${_COMPILER_NAME}"
meson --buildtype ${BUILD_TYPE} --default-library ${LIB_TYPE} . "${_BUILD_DIR_NAME}"
ninja -v -j 2 -C "${_BUILD_DIR_NAME}"
#ninja -v -j 2 -C "${_BUILD_DIR_NAME}" test
cd "${_BUILD_DIR_NAME}"
meson test --no-rebuild --print-errorlogs
if [ "${DESTDIR}" != "/usr/local" ]; then
ninja install
fi
cd -
if ${CLEANUP+false}; then
echo "Skipping cleanup: build directory will persist."
else
rm -r "${_BUILD_DIR_NAME}"
fi

View File

@@ -0,0 +1,8 @@
set -vex
# Preinstalled versions of python are dependent on which Ubuntu distribution
# you are running. The below version needs to be updated whenever we roll
# the Ubuntu version used in Travis.
# https://docs.travis-ci.com/user/languages/python/
pyenv global 3.7.1

View File

@@ -0,0 +1 @@
# NOTHING TO DO HERE

View File

@@ -0,0 +1,10 @@
set -vex
wget https://github.com/ninja-build/ninja/releases/download/v1.9.0/ninja-linux.zip
unzip -q ninja-linux.zip -d build
pip3 install meson
echo ${PATH}
ls /usr/local
ls /usr/local/bin
export PATH="${PWD}"/build:/usr/local/bin:/usr/bin:${PATH}

View File

@@ -0,0 +1 @@
# NOTHING TO DO HERE

112
AUTHORS

@@ -1 +1,113 @@
Baptiste Lepilleur <blep@users.sourceforge.net>
Aaron Jacobs <aaronjjacobs@gmail.com>
Aaron Jacobs <jacobsa@google.com>
Adam Boseley <ABoseley@agjunction.com>
Adam Boseley <adam.boseley@gmail.com>
Aleksandr Derbenev <13alexac@gmail.com>
Alexander Gazarov <DrMetallius@users.noreply.github.com>
Alexander V. Brezgin <abrezgin@appliedtech.ru>
Alexandr Brezgin <albrezgin@mail.ru>
Alexey Kruchinin <alexey@mopals.com>
Anton Indrawan <anton.indrawan@gmail.com>
Baptiste Jonglez <git@bitsofnetworks.org>
Baptiste Lepilleur <baptiste.lepilleur@gmail.com>
Baruch Siach <baruch@tkos.co.il>
Ben Boeckel <mathstuf@gmail.com>
Benjamin Knecht <bknecht@logitech.com>
Bernd Kuhls <bernd.kuhls@t-online.de>
Billy Donahue <billydonahue@google.com>
Braden McDorman <bmcdorman@gmail.com>
Brandon Myers <bmyers1788@gmail.com>
Brendan Drew <brendan.drew@daqri.com>
chason <cxchao802@gmail.com>
Chris Gilling <cgilling@iparadigms.com>
Christopher Dawes <christopher.dawes.1981@googlemail.com>
Christopher Dunn <cdunn2001@gmail.com>
Chuck Atkins <chuck.atkins@kitware.com>
Cody P Schafer <dev@codyps.com>
Connor Manning <connor@hobu.co>
Cory Quammen <cory.quammen@kitware.com>
Cristóvão B da Cruz e Silva <CrisXed@gmail.com>
Daniel Krügler <daniel.kruegler@gmail.com>
Dani-Hub <daniel.kruegler@googlemail.com>
Dan Liu <gzliudan>
datadiode <datadiode@users.noreply.github.com>
datadiode <jochen.neubeck@vodafone.de>
David Seifert <soap@gentoo.org>
David West <david-west@idexx.com>
dawesc <chris.dawes@eftlab.co.uk>
Devin Jeanpierre <jeanpierreda@google.com>
Dmitry Marakasov <amdmi3@amdmi3.ru>
dominicpezzuto <dom@dompezzuto.com>
Don Milham <dmilham@gmail.com>
drgler <daniel.kruegler@gmail.com>
ds283 <D.Seery@sussex.ac.uk>
Egor Tensin <Egor.Tensin@gmail.com>
eightnoteight <mr.eightnoteight@gmail.com>
Evince <baneyue@gmail.com>
filipjs <filipjs@users.noreply.github.com>
findblar <ft@finbarr.ca>
Florian Meier <florian.meier@koalo.de>
Gaëtan Lehmann <gaetan.lehmann@gmail.com>
Gaurav <g.gupta@samsung.com>
Gergely Nagy <ngg@ngg.hu>
Gida Pataki <gida.pataki@prezi.com>
I3ck <buckmartin@buckmartin.de>
Iñaki Baz Castillo <ibc@aliax.net>
Jacco <jacco@geul.net>
Jean-Christophe Fillion-Robin <jchris.fillionr@kitware.com>
Jonas Platte <mail@jonasplatte.de>
Jordan Bayles <bayles.jordan@gmail.com>
Jörg Krause <joerg.krause@embedded.rocks>
Keith Lea <keith@whamcitylights.com>
Kevin Grant <kbradleygrant@gmail.com>
Kirill V. Lyadvinsky <jia3ep@gmail.com>
Kirill V. Lyadvinsky <mail@codeatcpp.com>
Kobi Gurkan <kobigurk@gmail.com>
Magnus Bjerke Vik <mbvett@gmail.com>
Malay Shah <malays@users.sourceforge.net>
Mara Kim <hacker.root@gmail.com>
Marek Kotewicz <marek.kotewicz@gmail.com>
Mark Lakata <mark@lakata.org>
Mark Zeren <mzeren@vmware.com>
Martin Buck <buckmartin@buckmartin.de>
Martyn Gigg <martyn.gigg@gmail.com>
Mattes D <github@xoft.cz>
Matthias Loy <matthias.loy@hbm.com>
Merlyn Morgan-Graham <kavika@gmail.com>
Michael Shields <mshields@google.com>
Michał Górny <mgorny@gentoo.org>
Mike Naberezny <mike@naberezny.com>
mloy <matthias.loy@googlemail.com>
Motti <lanzkron@gmail.com>
nnkur <nnkur@mail.ru>
Omkar Wagh <owagh@owaghlinux.ny.tower-research.com>
paulo <paulobrizolara@users.noreply.github.com>
pavel.pimenov <pavel.pimenov@gmail.com>
Paweł Bylica <chfast@gmail.com>
Péricles Lopes Machado <pericles.raskolnikoff@gmail.com>
Peter Spiess-Knafl <psk@autistici.org>
pffang <pffang@vip.qq.com>
Rémi Verschelde <remi@verschelde.fr>
renu555 <renu.tyagi@samsung.com>
Robert Dailey <rcdailey@gmail.com>
Sam Clegg <sbc@chromium.org>
selaselah <selah@outlook.com>
Sergiy80 <sil2004@gmail.com>
sergzub <sergzub@gmail.com>
Stefan Schweter <stefan@schweter.it>
Steffen Kieß <Steffen.Kiess@ipvs.uni-stuttgart.de>
Steven Hahn <hahnse@ornl.gov>
Stuart Eichert <stuart@fivemicro.com>
SuperManitu <supermanitu@gmail.com>
Techwolf <dring@g33kworld.net>
Tengiz Sharafiev <btolfa+github@gmail.com>
Tomasz Maciejewski <tmaciejewsk@gmail.com>
Vicente Olivert Riera <Vincent.Riera@imgtec.com>
xiaoyur347 <xiaoyur347@gmail.com>
ycqiu <429148848@qq.com>
yiqiju <fred_ju@selinc.com>
Yu Xiaolei <dreifachstein@gmail.com>
Google Inc.

View File

@@ -1,117 +1,186 @@
CMAKE_MINIMUM_REQUIRED(VERSION 2.8.5)
PROJECT(jsoncpp)
ENABLE_TESTING()
# vim: et ts=4 sts=4 sw=4 tw=0
OPTION(JSONCPP_WITH_TESTS "Compile and run JsonCpp test executables" ON)
OPTION(JSONCPP_WITH_POST_BUILD_UNITTEST "Automatically run unit-tests as a post build step" ON)
OPTION(JSONCPP_WITH_WARNING_AS_ERROR "Force compilation to fail if a warning occurs" OFF)
OPTION(JSONCPP_WITH_PKGCONFIG_SUPPORT "Generate and install .pc files" ON)
OPTION(JSONCPP_WITH_CMAKE_PACKAGE "Generate and install cmake package files" OFF)
# ==== Define cmake build policies that affect compilation and linkage default behaviors
#
# Set the JSONCPP_NEWEST_VALIDATED_POLICIES_VERSION string to the newest cmake version
# policies that provide successful builds. By setting JSONCPP_NEWEST_VALIDATED_POLICIES_VERSION
# to a value greater than the oldest policies, all policies between
# JSONCPP_OLDEST_VALIDATED_POLICIES_VERSION and CMAKE_VERSION (used for this build)
# are set to their NEW behavior, thereby suppressing policy warnings related to policies
# between the JSONCPP_OLDEST_VALIDATED_POLICIES_VERSION and CMAKE_VERSION.
#
# CMake versions greater than the JSONCPP_NEWEST_VALIDATED_POLICIES_VERSION policies will
# continue to generate policy warnings "CMake Warning (dev)...Policy CMP0XXX is not set:"
#
set(JSONCPP_OLDEST_VALIDATED_POLICIES_VERSION "3.8.0")
set(JSONCPP_NEWEST_VALIDATED_POLICIES_VERSION "3.13.2")
cmake_minimum_required(VERSION ${JSONCPP_OLDEST_VALIDATED_POLICIES_VERSION})
if("${CMAKE_VERSION}" VERSION_LESS "${JSONCPP_NEWEST_VALIDATED_POLICIES_VERSION}")
#Set and use the newest available cmake policies that are validated to work
set(JSONCPP_CMAKE_POLICY_VERSION "${CMAKE_VERSION}")
else()
set(JSONCPP_CMAKE_POLICY_VERSION "${JSONCPP_NEWEST_VALIDATED_POLICIES_VERSION}")
endif()
cmake_policy(VERSION ${JSONCPP_CMAKE_POLICY_VERSION})
#
# Now enumerate specific policies newer than JSONCPP_NEWEST_VALIDATED_POLICIES_VERSION
# that may need to be individually set to NEW/OLD
#
foreach(pnew "") # Currently Empty
if(POLICY ${pnew})
cmake_policy(SET ${pnew} NEW)
endif()
endforeach()
foreach(pold "") # Currently Empty
if(POLICY ${pold})
cmake_policy(SET ${pold} OLD)
endif()
endforeach()
# Ensures that CMAKE_BUILD_TYPE is visible in cmake-gui on Unix
IF(NOT WIN32)
IF(NOT CMAKE_BUILD_TYPE)
SET(CMAKE_BUILD_TYPE Release CACHE STRING
"Choose the type of build, options are: None Debug Release RelWithDebInfo MinSizeRel Coverage."
FORCE)
ENDIF(NOT CMAKE_BUILD_TYPE)
ENDIF(NOT WIN32)
# ==== Define language standard configurations requiring at least c++11 standard
if(CMAKE_CXX_STANDARD EQUAL "98" )
message(FATAL_ERROR "CMAKE_CXX_STANDARD:STRING=98 is not supported.")
endif()
SET(LIB_SUFFIX "" CACHE STRING "Optional arch-dependent suffix for the library installation directory")
#####
## Set the default target properties
if(NOT CMAKE_CXX_STANDARD)
set(CMAKE_CXX_STANDARD 11) # Supported values are ``11``, ``14``, and ``17``.
endif()
if(NOT CMAKE_CXX_STANDARD_REQUIRED)
set(CMAKE_CXX_STANDARD_REQUIRED ON)
endif()
if(NOT CMAKE_CXX_EXTENSIONS)
set(CMAKE_CXX_EXTENSIONS OFF)
endif()
SET(RUNTIME_INSTALL_DIR bin
CACHE PATH "Install dir for executables and dlls")
SET(ARCHIVE_INSTALL_DIR lib${LIB_SUFFIX}
CACHE PATH "Install dir for static libraries")
SET(LIBRARY_INSTALL_DIR lib${LIB_SUFFIX}
CACHE PATH "Install dir for shared libraries")
SET(INCLUDE_INSTALL_DIR include
CACHE PATH "Install dir for headers")
SET(PACKAGE_INSTALL_DIR lib${LIB_SUFFIX}/cmake
CACHE PATH "Install dir for cmake package config files")
MARK_AS_ADVANCED( RUNTIME_INSTALL_DIR ARCHIVE_INSTALL_DIR INCLUDE_INSTALL_DIR PACKAGE_INSTALL_DIR )
# ====
# Set variable named ${VAR_NAME} to value ${VALUE}
FUNCTION(set_using_dynamic_name VAR_NAME VALUE)
SET( "${VAR_NAME}" "${VALUE}" PARENT_SCOPE)
ENDFUNCTION(set_using_dynamic_name)
# Ensures that CMAKE_BUILD_TYPE has a default value
if(NOT DEFINED CMAKE_BUILD_TYPE)
set(CMAKE_BUILD_TYPE Release CACHE STRING
"Choose the type of build, options are: None Debug Release RelWithDebInfo MinSizeRel Coverage.")
endif()
# Extract major, minor, patch from version text
# Parse a version string "X.Y.Z" and outputs
# version parts in ${OUPUT_PREFIX}_MAJOR, _MINOR, _PATCH.
# If parse succeeds then ${OUPUT_PREFIX}_FOUND is TRUE.
MACRO(jsoncpp_parse_version VERSION_TEXT OUPUT_PREFIX)
SET(VERSION_REGEX "[0-9]+\\.[0-9]+\\.[0-9]+(-[a-zA-Z0-9_]+)?")
IF( ${VERSION_TEXT} MATCHES ${VERSION_REGEX} )
STRING(REGEX MATCHALL "[0-9]+|-([A-Za-z0-9_]+)" VERSION_PARTS ${VERSION_TEXT})
LIST(GET VERSION_PARTS 0 ${OUPUT_PREFIX}_MAJOR)
LIST(GET VERSION_PARTS 1 ${OUPUT_PREFIX}_MINOR)
LIST(GET VERSION_PARTS 2 ${OUPUT_PREFIX}_PATCH)
set_using_dynamic_name( "${OUPUT_PREFIX}_FOUND" TRUE )
ELSE( ${VERSION_TEXT} MATCHES ${VERSION_REGEX} )
set_using_dynamic_name( "${OUPUT_PREFIX}_FOUND" FALSE )
ENDIF( ${VERSION_TEXT} MATCHES ${VERSION_REGEX} )
ENDMACRO(jsoncpp_parse_version)
project(JSONCPP
VERSION 1.9.0 # <major>[.<minor>[.<patch>[.<tweak>]]]
LANGUAGES CXX)
# Read out version from "version" file
FILE(STRINGS "version" JSONCPP_VERSION)
message(STATUS "JsonCpp Version: ${JSONCPP_VERSION_MAJOR}.${JSONCPP_VERSION_MINOR}.${JSONCPP_VERSION_PATCH}")
set( JSONCPP_SOVERSION 21 )
jsoncpp_parse_version( ${JSONCPP_VERSION} JSONCPP_VERSION )
IF(NOT JSONCPP_VERSION_FOUND)
MESSAGE(FATAL_ERROR "Failed to parse version string properly. Expect X.Y.Z")
ENDIF(NOT JSONCPP_VERSION_FOUND)
option(JSONCPP_WITH_TESTS "Compile and (for jsoncpp_check) run JsonCpp test executables" ON)
option(JSONCPP_WITH_POST_BUILD_UNITTEST "Automatically run unit-tests as a post build step" ON)
option(JSONCPP_WITH_WARNING_AS_ERROR "Force compilation to fail if a warning occurs" OFF)
option(JSONCPP_WITH_STRICT_ISO "Issue all the warnings demanded by strict ISO C and ISO C++" ON)
option(JSONCPP_WITH_PKGCONFIG_SUPPORT "Generate and install .pc files" ON)
option(JSONCPP_WITH_CMAKE_PACKAGE "Generate and install cmake package files" ON)
option(BUILD_SHARED_LIBS "Build jsoncpp_lib as a shared library." OFF)
# Enable runtime search path support for dynamic libraries on OSX
if(APPLE)
set(CMAKE_MACOSX_RPATH 1)
endif()
# Adhere to GNU filesystem layout conventions
include(GNUInstallDirs)
set(DEBUG_LIBNAME_SUFFIX "" CACHE STRING "Optional suffix to append to the library name for a debug build")
set(JSONCPP_USE_SECURE_MEMORY "0" CACHE STRING "-D...=1 to use memory-wiping allocator for STL" )
MESSAGE(STATUS "JsonCpp Version: ${JSONCPP_VERSION_MAJOR}.${JSONCPP_VERSION_MINOR}.${JSONCPP_VERSION_PATCH}")
# File version.h is only regenerated on CMake configure step
CONFIGURE_FILE( "${PROJECT_SOURCE_DIR}/src/lib_json/version.h.in"
"${PROJECT_SOURCE_DIR}/include/json/version.h" )
configure_file( "${PROJECT_SOURCE_DIR}/src/lib_json/version.h.in"
"${PROJECT_BINARY_DIR}/include/json/version.h"
NEWLINE_STYLE UNIX )
configure_file( "${PROJECT_SOURCE_DIR}/version.in"
"${PROJECT_BINARY_DIR}/version"
NEWLINE_STYLE UNIX )
macro(UseCompilationWarningAsError)
if ( MSVC )
if(MSVC)
# Only enabled in debug because some old versions of VS STL generate
# warnings when compiled in release configuration.
set(CMAKE_CXX_FLAGS_DEBUG "${CMAKE_CXX_FLAGS_DEBUG} /WX ")
endif( MSVC )
add_compile_options($<$<CONFIG:Debug>:/WX>)
elseif(CMAKE_CXX_COMPILER_ID STREQUAL "GNU")
add_compile_options(-Werror)
if(JSONCPP_WITH_STRICT_ISO)
add_compile_options(-pedantic-errors)
endif()
endif()
endmacro()
# Include our configuration header
INCLUDE_DIRECTORIES( ${jsoncpp_SOURCE_DIR}/include )
include_directories( ${jsoncpp_SOURCE_DIR}/include )
if ( MSVC )
if(MSVC)
# Only enabled in debug because some old versions of VS STL generate
# unreachable code warning when compiled in release configuration.
set(CMAKE_CXX_FLAGS_DEBUG "${CMAKE_CXX_FLAGS_DEBUG} /W4 ")
endif( MSVC )
if (CMAKE_CXX_COMPILER_ID MATCHES "Clang")
# using regular Clang or AppleClang
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wall -std=c++11")
elseif ("${CMAKE_CXX_COMPILER_ID}" STREQUAL "GNU")
# using GCC
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wall -std=c++0x")
add_compile_options($<$<CONFIG:Debug>:/W4>)
endif()
IF(JSONCPP_WITH_WARNING_AS_ERROR)
if(CMAKE_CXX_COMPILER_ID MATCHES "Clang")
# using regular Clang or AppleClang
add_compile_options(-Wall -Wconversion -Wshadow -Werror=conversion -Werror=sign-compare)
elseif(CMAKE_CXX_COMPILER_ID STREQUAL "GNU")
# using GCC
add_compile_options(-Wall -Wconversion -Wshadow -Wextra)
# not yet ready for -Wsign-conversion
if(JSONCPP_WITH_STRICT_ISO)
add_compile_options(-pedantic)
endif()
if(JSONCPP_WITH_WARNING_AS_ERROR)
add_compile_options(-Werror=conversion)
endif()
elseif(CMAKE_CXX_COMPILER_ID STREQUAL "Intel")
# using Intel compiler
add_compile_options(-Wall -Wconversion -Wshadow -Wextra -Werror=conversion)
if(JSONCPP_WITH_STRICT_ISO AND NOT JSONCPP_WITH_WARNING_AS_ERROR)
add_compile_options(-pedantic)
endif()
endif()
find_program(CCACHE_FOUND ccache)
if(CCACHE_FOUND)
set_property(GLOBAL PROPERTY RULE_LAUNCH_COMPILE ccache)
set_property(GLOBAL PROPERTY RULE_LAUNCH_LINK ccache)
endif(CCACHE_FOUND)
if(JSONCPP_WITH_WARNING_AS_ERROR)
UseCompilationWarningAsError()
ENDIF(JSONCPP_WITH_WARNING_AS_ERROR)
endif()
IF(JSONCPP_WITH_PKGCONFIG_SUPPORT)
CONFIGURE_FILE(
"pkg-config/jsoncpp.pc.in"
"pkg-config/jsoncpp.pc"
@ONLY)
INSTALL(FILES "${CMAKE_BINARY_DIR}/pkg-config/jsoncpp.pc"
DESTINATION "${CMAKE_INSTALL_PREFIX}/lib${LIB_SUFFIX}/pkgconfig")
ENDIF(JSONCPP_WITH_PKGCONFIG_SUPPORT)
if(JSONCPP_WITH_PKGCONFIG_SUPPORT)
configure_file(
"pkg-config/jsoncpp.pc.in"
"pkg-config/jsoncpp.pc"
@ONLY)
install(FILES "${CMAKE_CURRENT_BINARY_DIR}/pkg-config/jsoncpp.pc"
DESTINATION "${CMAKE_INSTALL_LIBDIR}/pkgconfig")
endif()
IF(JSONCPP_WITH_CMAKE_PACKAGE)
INSTALL(EXPORT jsoncpp
DESTINATION ${PACKAGE_INSTALL_DIR}/jsoncpp
if(JSONCPP_WITH_CMAKE_PACKAGE)
include (CMakePackageConfigHelpers)
install(EXPORT jsoncpp
DESTINATION ${CMAKE_INSTALL_LIBDIR}/cmake/jsoncpp
FILE jsoncppConfig.cmake)
ENDIF(JSONCPP_WITH_CMAKE_PACKAGE)
write_basic_package_version_file ("${CMAKE_CURRENT_BINARY_DIR}/jsoncppConfigVersion.cmake"
VERSION ${PROJECT_VERSION}
COMPATIBILITY SameMajorVersion)
install(FILES ${CMAKE_CURRENT_BINARY_DIR}/jsoncppConfigVersion.cmake
DESTINATION ${CMAKE_INSTALL_LIBDIR}/cmake/jsoncpp)
endif()
if(JSONCPP_WITH_TESTS)
enable_testing()
include(CTest)
endif()
# Build the different applications
ADD_SUBDIRECTORY( src )
add_subdirectory( src )
#install the includes
ADD_SUBDIRECTORY( include )
add_subdirectory( include )

145
CONTRIBUTING.md Normal file

@@ -0,0 +1,145 @@
# Contributing to JsonCpp
## Building
Both CMake and Meson tools are capable of generating a variety of build environments for your preferred development environment.
Using cmake or meson you can generate an XCode, Visual Studio, Unix Makefile, Ninja, or other environment that fits your needs.
An example of a common Meson/Ninja environment is described next.
## Building and testing with Meson/Ninja
Thanks to David Seifert (@SoapGentoo), we (the maintainers) now use
[meson](http://mesonbuild.com/) and [ninja](https://ninja-build.org/) to build
for debugging, as well as for continuous integration (see
[`./travis_scripts/meson_builder.sh`](./travis_scripts/meson_builder.sh) ). Other systems may work, but minor
things like version strings might break.
First, install both meson (which requires Python3) and ninja.
If you wish to install to a directory other than /usr/local, set an environment variable called DESTDIR with the desired path:
DESTDIR=/path/to/install/dir
Then,
cd jsoncpp/
BUILD_TYPE=debug
#BUILD_TYPE=release
LIB_TYPE=shared
#LIB_TYPE=static
meson --buildtype ${BUILD_TYPE} --default-library ${LIB_TYPE} . build-${LIB_TYPE}
#ninja -v -C build-${LIB_TYPE} test # This stopped working on my Mac.
ninja -v -C build-${LIB_TYPE}
cd build-${LIB_TYPE}
meson test --no-rebuild --print-errorlogs
sudo ninja install
## Building and testing with other build systems
See https://github.com/open-source-parsers/jsoncpp/wiki/Building
## Running the tests manually
You need to run tests manually only if you are troubleshooting an issue.
In the instructions below, replace `path/to/jsontest` with the path of the
`jsontest` executable that was compiled on your platform.
cd test
# This will run the Reader/Writer tests
python runjsontests.py path/to/jsontest
# This will run the Reader/Writer tests, using JSONChecker test suite
# (http://www.json.org/JSON_checker/).
# Notes: not all tests pass: JsonCpp is too lenient (for example,
# it allows an integer to start with '0'). The goal is to improve
# strict mode parsing to get all tests to pass.
python runjsontests.py --with-json-checker path/to/jsontest
# This will run the unit tests (mostly Value)
python rununittests.py path/to/test_lib_json
# You can run the tests using valgrind:
python rununittests.py --valgrind path/to/test_lib_json
## Building the documentation
Run the Python script `doxybuild.py` from the top directory:
python doxybuild.py --doxygen=$(which doxygen) --open --with-dot
See `doxybuild.py --help` for options.
## Adding a reader/writer test
To add a test, you need to create two files in test/data:
* a `TESTNAME.json` file, that contains the input document in JSON format.
* a `TESTNAME.expected` file, that contains a flattened representation of the
input document.
The `TESTNAME.expected` file format is as follows:
* Each line represents a JSON element of the element tree represented by the
input document.
* Each line has two parts: the path to access the element separated from the
element value by `=`. Array and object values are always empty (i.e.
represented by either `[]` or `{}`).
* Element path `.` represents the root element, and is used to separate object
members. `[N]` is used to specify the value of an array element at index `N`.
See the examples `test_complex_01.json` and `test_complex_01.expected` to better understand element paths.
## Understanding reader/writer test output
When a test is run, output files are generated beside the input test files. Below is a short description of the content of each file:
* `test_complex_01.json`: input JSON document.
* `test_complex_01.expected`: flattened JSON element tree used to check if
parsing was correct.
* `test_complex_01.actual`: flattened JSON element tree produced by `jsontest`
from reading `test_complex_01.json`.
* `test_complex_01.rewrite`: JSON document written by `jsontest` using the
`Json::Value` parsed from `test_complex_01.json` and serialized using
`Json::StyledWriter`.
* `test_complex_01.actual-rewrite`: flattened JSON element tree produced by
`jsontest` from reading `test_complex_01.rewrite`.
* `test_complex_01.process-output`: `jsontest` output, typically useful for
understanding parsing errors.
## Versioning rules
Consumers of this library require a strict approach to incrementing versioning of the JsonCpp library. Currently, we follow the below set of rules:
* Any new public symbols require a minor version bump.
* Any alteration or removal of public symbols requires a major version bump, including changing the size of a class. This is necessary for
consumers to do dependency injection properly.
## Preparing code for submission
Generally, JsonCpp's style guide has been pretty relaxed, with the following common themes:
* Variables and function names use lower camel case (e.g. parseValue or collectComments).
* Classes use camel case (e.g. OurReader)
* Member variables have a trailing underscore
* Prefer `nullptr` over `NULL`.
* Passing by non-const reference is allowed.
* Single statement if blocks may omit brackets.
* Generally prefer less space over more space.
For an example:
```c++
bool Reader::decodeNumber(Token& token) {
Value decoded;
if (!decodeNumber(token, decoded))
return false;
currentValue().swapPayload(decoded);
currentValue().setOffsetStart(token.start_ - begin_);
currentValue().setOffsetLimit(token.end_ - begin_);
return true;
}
```
Before submitting your code, ensure that you meet the versioning requirements above, follow the style guide of the file you are modifying (or the above rules for new files), and run clang format. Meson exposes clang format with the following command:
```
ninja -v -C build-${LIB_TYPE}/ clang-format
```

15
CTestConfig.cmake Normal file

@@ -0,0 +1,15 @@
## This file should be placed in the root directory of your project.
## Then modify the CMakeLists.txt file in the root directory of your
## project to incorporate the testing dashboard.
##
## # The following are required to submit to the CDash dashboard:
## ENABLE_TESTING()
## INCLUDE(CTest)
set(CTEST_PROJECT_NAME "jsoncpp")
set(CTEST_NIGHTLY_START_TIME "01:23:45 UTC")
set(CTEST_DROP_METHOD "https")
set(CTEST_DROP_SITE "my.cdash.org")
set(CTEST_DROP_LOCATION "/submit.php?project=jsoncpp")
set(CTEST_DROP_SITE_CDASH TRUE)

View File

@@ -2,13 +2,13 @@ The JsonCpp library's source code, including accompanying documentation,
tests and demonstration applications, are licensed under the following
conditions...
The author (Baptiste Lepilleur) explicitly disclaims copyright in all
Baptiste Lepilleur and The JsonCpp Authors explicitly disclaim copyright in all
jurisdictions which recognize such a disclaimer. In such jurisdictions,
this software is released into the Public Domain.
In jurisdictions which do not recognize Public Domain property (e.g. Germany as of
2010), this software is Copyright (c) 2007-2010 by Baptiste Lepilleur, and is
released under the terms of the MIT License (see below).
2010), this software is Copyright (c) 2007-2010 by Baptiste Lepilleur and
The JsonCpp Authors, and is released under the terms of the MIT License (see below).
In jurisdictions which recognize Public Domain property, the user of this
software may choose to accept it either as 1) Public Domain, 2) under the
@@ -23,7 +23,7 @@ described in clear, concise terms at:
The full text of the MIT License follows:
========================================================================
Copyright (c) 2007-2010 Baptiste Lepilleur
Copyright (c) 2007-2010 Baptiste Lepilleur and The JsonCpp Authors
Permission is hereby granted, free of charge, to any person
obtaining a copy of this software and associated documentation

175
NEWS.txt

@@ -1,175 +0,0 @@
New in SVN
----------
* Updated the type system's behavior, in order to better support backwards
compatibility with code that was written before 64-bit integer support was
introduced. Here's how it works now:
* isInt, isInt64, isUInt, and isUInt64 return true if and only if the
value can be exactly represented as that type. In particular, a value
constructed with a double like 17.0 will now return true for all of
these methods.
* isDouble and isFloat now return true for all numeric values, since all
numeric values can be converted to a double or float without
truncation. Note however that the conversion may not be exact -- for
example, doubles cannot exactly represent all integers above 2^53 + 1.
* isBool, isNull, isString, isArray, and isObject now return true if and
only if the value is of that type.
* isConvertibleTo(fooValue) indicates that it is safe to call asFoo.
(For each type foo, isFoo always implies isConvertibleTo(fooValue).)
asFoo returns an approximate or exact representation as appropriate.
For example, a double value may be truncated when asInt is called.
* For backwards compatibility with old code, isConvertibleTo(intValue)
may return false even if type() == intValue. This is because the value
may have been constructed with a 64-bit integer larger than maxInt,
and calling asInt() would cause an exception. If you're writing new
code, use isInt64 to find out whether the value is exactly
representable using an Int64, or asDouble() combined with minInt64 and
maxInt64 to figure out whether it is approximately representable.
* Value
- Patch #10: BOOST_FOREACH compatibility. Made Json::iterator more
standard compliant, added missing iterator_category and value_type
typedefs (contributed by Robert A. Iannucci).
* Compilation
- New CMake based build system. Based in part on contribution from
Igor Okulist and Damien Buhl (Patch #14).
- New header json/version.h now contains version number macros
(JSONCPP_VERSION_MAJOR, JSONCPP_VERSION_MINOR, JSONCPP_VERSION_PATCH
and JSONCPP_VERSION_HEXA).
- Patch #11: added missing JSON_API on some classes causing link issues
when building as a dynamic library on Windows
(contributed by Francis Bolduc).
- Visual Studio DLL: suppressed warning "C4251: <data member>: <type>
needs to have dll-interface to be used by..." via pragma push/pop
in json-cpp headers.
- Added Travis CI integration: https://travis-ci.org/blep/jsoncpp-mirror
* Bug fixes
- Patch #15: Copy constructor does not initialize allocated_ for stringValue
(contributed by rmongia).
- Patch #16: Missing field copy in Json::Value::iterator causing infinite
loop when using experimental internal map (#define JSON_VALUE_USE_INTERNAL_MAP)
(contributed by Ming-Lin Kao).
New in JsonCpp 0.6.0:
---------------------
* Compilation
- LD_LIBRARY_PATH and LIBRARY_PATH environment variables are now
propagated to the build environment as this is required for some
compiler installation.
- Added support for Microsoft Visual Studio 2008 (bug #2930462):
The platform "msvc90" has been added.
Notes: you need to setup the environment by running vcvars32.bat
(e.g. MSVC 2008 command prompt in start menu) before running scons.
- Added support for amalgamated source and header generation (a la sqlite).
Refer to README.txt section "Generating amalgamated source and header"
for detail.
* Value
- Removed experimental ValueAllocator, it caused static
initialization/destruction order issues (bug #2934500).
The DefaultValueAllocator has been inlined in code.
- Added support for 64 bits integer:
Types Json::Int64 and Json::UInt64 have been added. They are aliased
to 64 bits integers on system that support them (based on __int64 on
Microsoft Visual Studio platform, and long long on other platforms).
Types Json::LargestInt and Json::LargestUInt have been added. They are
aliased to the largest integer type supported:
either Json::Int/Json::UInt or Json::Int64/Json::UInt64 respectively.
Json::Value::asInt() and Json::Value::asUInt() still return plain
"int" based types, but assert if an attempt is made to retrieve
a 64 bits value that cannot be represented as the return type.
Json::Value::asInt64() and Json::Value::asUInt64() have been added
to obtain the 64 bits integer value.
Json::Value::asLargestInt() and Json::Value::asLargestUInt() return
the integer as a LargestInt/LargestUInt respectively. Those functions
are typically used when implementing a writer.
The reader attempts to read a number as a 64 bits integer, and falls back
to reading a double if the number is not in the range of a 64 bits
integer.
Warning: Json::Value::asInt() and Json::Value::asUInt() now return
long long. This change breaks code that was passing the return value
to a *printf() function.
Support for 64 bits integer can be disabled by defining the macro
JSON_NO_INT64 (uncomment it in json/config.h for example), though
it should have no impact on existing usage.
- The type Json::ArrayIndex is used for indexes of a JSON value array. It
is an unsigned int (typically 32 bits).
- Array index can be passed as int to operator[], allowing use of literal:
Json::Value array;
array.append( 1234 );
int value = array[0].asInt(); // did not compile previously
- Added float Json::Value::asFloat() to obtain a floating point value as a
float (avoids the loss of precision warning caused by use of asDouble()
to initialize a float).
* Reader
- Renamed Reader::getFormatedErrorMessages() to getFormattedErrorMessages.
Bug #3023708 (Formatted has 2 't'). The old member function is deprecated
but still present for backward compatibility.
* Tests
- Added test to ensure that the escape sequence "\/" is correctly handled
by the parser.
* Bug fixes
- Bug #3139677: JSON [1 2 3] was incorrectly parsed as [1, 3]. Error is now
correctly detected.
- Bug #3139678: stack buffer overflow when parsing a double with a
length of 32 characters.
- Fixed Value::operator <= implementation (had the semantic of operator >=).
Found when adding unit tests for comparison operators.
- Value::compare() is now const and has an actual implementation with
unit tests.
- Bug #2407932: strpbrk() can fail for NULL pointer.
- Bug #3306345: Fixed minor typo in Path::resolve().
- Bug #3314841/#3306896: errors in amalgamate.py
- Fixed some Coverity warnings and line-endings.
* License
- See file LICENSE for details. Basically JsonCpp is now licensed under
MIT license, or public domain if desired and recognized in your jurisdiction.
Thanks to Stephan G. Beal (http://wanderinghorse.net/home/stephan/) who
helped figure out the solution to the public domain issue.

224
README.md

@@ -1,5 +1,6 @@
Introduction
------------
# JsonCpp
[![badge](https://img.shields.io/badge/conan.io-jsoncpp%2F1.8.0-green.svg?logo=data:image/png;base64%2CiVBORw0KGgoAAAANSUhEUgAAAA4AAAAOCAMAAAAolt3jAAAA1VBMVEUAAABhlctjlstkl8tlmMtlmMxlmcxmmcxnmsxpnMxpnM1qnc1sn85voM91oM11oc1xotB2oc56pNF6pNJ2ptJ8ptJ8ptN9ptN8p9N5qNJ9p9N9p9R8qtOBqdSAqtOAqtR%2BrNSCrNJ/rdWDrNWCsNWCsNaJs9eLs9iRvNuVvdyVv9yXwd2Zwt6axN6dxt%2Bfx%2BChyeGiyuGjyuCjyuGly%2BGlzOKmzOGozuKoz%2BKqz%2BOq0OOv1OWw1OWw1eWx1eWy1uay1%2Baz1%2Baz1%2Bez2Oe02Oe12ee22ujUGwH3AAAAAXRSTlMAQObYZgAAAAFiS0dEAIgFHUgAAAAJcEhZcwAACxMAAAsTAQCanBgAAAAHdElNRQfgBQkREyOxFIh/AAAAiklEQVQI12NgAAMbOwY4sLZ2NtQ1coVKWNvoc/Eq8XDr2wB5Ig62ekza9vaOqpK2TpoMzOxaFtwqZua2Bm4makIM7OzMAjoaCqYuxooSUqJALjs7o4yVpbowvzSUy87KqSwmxQfnsrPISyFzWeWAXCkpMaBVIC4bmCsOdgiUKwh3JojLgAQ4ZCE0AMm2D29tZwe6AAAAAElFTkSuQmCC)](https://bintray.com/theirix/conan-repo/jsoncpp%3Atheirix)
[JSON][json-org] is a lightweight data-interchange format. It can represent
numbers, strings, ordered sequences of values, and collections of name/value
@@ -12,213 +13,34 @@ serialization and deserialization to and from strings. It can also preserve
existing comment in unserialization/serialization steps, making it a convenient
format to store user input files.
## Documentation
[JsonCpp documentation][JsonCpp-documentation] is generated using [Doxygen][].
[JsonCpp-documentation]: http://open-source-parsers.github.io/jsoncpp-docs/doxygen/index.html
[Doxygen]: http://www.doxygen.org
## A note on backward-compatibility
Very soon, we are switching to C++11 only. For older compilers, try the `pre-C++11` branch.
Using JsonCpp in your project
-----------------------------
The recommended approach to integrating JsonCpp in your project is to build
the amalgamated source (a single `.cpp` file) with your own build system. This
ensures consistency of compilation flags and ABI compatibility. See the section
"Generating amalgamated source and header" for instructions.
The `include/` should be added to your compiler include path. Jsoncpp headers
should be included as follow:
#include <json/json.h>
If JsonCpp was built as a dynamic library on Windows, then your project needs to
define the macro `JSON_DLL`.
* `1.y.z` is built with C++11.
* `0.y.z` can be used with older compilers.
* Major versions maintain binary-compatibility.
Building and testing with new CMake
-----------------------------------
## Using JsonCpp in your project
[CMake][] is a C++ Makefiles/Solution generator. It is usually available on most
Linux system as package. On Ubuntu:
### Amalgamated source
https://github.com/open-source-parsers/jsoncpp/wiki/Amalgamated
sudo apt-get install cmake
### The Meson Build System
If you are using the [Meson Build System](http://mesonbuild.com), then you can get a wrap file by downloading it from [Meson WrapDB](https://wrapdb.mesonbuild.com/jsoncpp), or simply use `meson wrap install jsoncpp`.
[CMake]: http://www.cmake.org
### Other ways
If you have trouble, see the Wiki, or post a question as an Issue.
Note that Python is also required to run the JSON reader/writer tests. If
missing, the build will skip running those tests.
When running CMake, a few parameters are required:
* a build directory where the makefiles/solution are generated. It is also used
to store objects, libraries and executables files.
* the generator to use: makefiles or Visual Studio solution? What version of
Visual Studio, 32 or 64 bits solution?
Steps for generating solution/makefiles using `cmake-gui`:
* Make "source code" point to the source directory.
* Make "where to build the binary" point to the directory to use for the build.
* Click on the "Grouped" check box.
* Review JsonCpp build options (tick `JSONCPP_LIB_BUILD_SHARED` to build as a
dynamic library).
* Click the configure button at the bottom, then the generate button.
* The generated solution/makefiles can be found in the binary directory.
Alternatively, from the command-line on Unix in the source directory:
mkdir -p build/debug
cd build/debug
cmake -DCMAKE_BUILD_TYPE=debug -DJSONCPP_LIB_BUILD_SHARED=OFF -G "Unix Makefiles" ../..
make
Running `cmake --help` will display the list of available generators (passed using
the `-G` option).
By default CMake hides compilation commands. This can be modified by specifying
`-DCMAKE_VERBOSE_MAKEFILE=true` when generating makefiles.
Building and testing with SCons
-------------------------------
**Note:** The SCons-based build system is deprecated. Please use CMake; see the
section above.
JsonCpp can use [Scons][] as a build system. Note that SCons requires Python to
be installed.
[SCons]: http://www.scons.org/
Invoke SCons as follows:
scons platform=$PLATFORM [TARGET]
where `$PLATFORM` may be one of:
* `suncc`: Sun C++ (Solaris)
* `vacpp`: Visual Age C++ (AIX)
* `mingw`
* `msvc6`: Microsoft Visual Studio 6 service pack 5-6
* `msvc70`: Microsoft Visual Studio 2002
* `msvc71`: Microsoft Visual Studio 2003
* `msvc80`: Microsoft Visual Studio 2005
* `msvc90`: Microsoft Visual Studio 2008
* `linux-gcc`: GNU C++ (Linux; also reported to work for Mac OS X)
If you are building with Microsoft Visual Studio 2008, you need to set up the
environment by running `vcvars32.bat` (e.g. MSVC 2008 command prompt) before
running SCons.
Running the tests manually
--------------------------
Note that tests can be run with SCons using the `check` target:
scons platform=$PLATFORM check
You need to run tests manually only if you are troubleshooting an issue.
In the instructions below, replace `path/to/jsontest` with the path of the
`jsontest` executable that was compiled on your platform.
cd test
# This will run the Reader/Writer tests
python runjsontests.py path/to/jsontest
# This will run the Reader/Writer tests, using JSONChecker test suite
# (http://www.json.org/JSON_checker/).
# Notes: not all tests pass: JsonCpp is too lenient (for example,
# it allows an integer to start with '0'). The goal is to improve
# strict mode parsing to get all tests to pass.
python runjsontests.py --with-json-checker path/to/jsontest
# This will run the unit tests (mostly Value)
python rununittests.py path/to/test_lib_json
# You can run the tests using valgrind:
python rununittests.py --valgrind path/to/test_lib_json
Building the documentation
--------------------------
Run the Python script `doxybuild.py` from the top directory:
python doxybuild.py --doxygen=$(which doxygen) --open --with-dot
See `doxybuild.py --help` for options.
Generating amalgamated source and header
----------------------------------------
JsonCpp is provided with a script to generate a single header and a single
source file to ease inclusion into an existing project. The amalgamated source
can be generated at any time by running the following command from the
top directory (this requires Python 2.6 or later):
python amalgamate.py
It is possible to specify the header name. See the `-h` option for details.
By default, the following files are generated:
* `dist/jsoncpp.cpp`: source file that needs to be added to your project.
* `dist/json/json.h`: corresponding header file for use in your project. It is
equivalent to including `json/json.h` in non-amalgamated source. This header
only depends on standard headers.
* `dist/json/json-forwards.h`: header that provides forward declarations of all
JsonCpp types.
The amalgamated sources are generated by concatenating JsonCpp source in the
correct order and defining the macro `JSON_IS_AMALGAMATION` to prevent inclusion
of other headers.
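As a quick sanity check of the generated files, the amalgamated source can be compiled on its own, mirroring what the repository's `test-amalgamate` make target does (paths assume the default `dist/` output shown above):

cd dist
gcc -I. -c jsoncpp.cpp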
Adding a reader/writer test
---------------------------
To add a test, you need to create two files in test/data:
* a `TESTNAME.json` file that contains the input document in JSON format.
* a `TESTNAME.expected` file that contains a flattened representation of the
input document.
The `TESTNAME.expected` file format is as follows:
* each line represents a JSON element of the element tree represented by the
input document.
* each line has two parts: the path to access the element separated from the
element value by `=`. Array and object values are always empty (i.e.
represented by either `[]` or `{}`).
* element path: `.` represents the root element, and is used to separate object
members. `[N]` is used to specify the value of an array element at index `N`.
See the examples `test_complex_01.json` and `test_complex_01.expected` to better
understand element paths.
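As a purely illustrative example (a hypothetical input, not one of the shipped tests), an input document such as `{"name": "Ada", "tags": ["x", "y"]}` would flatten along these lines; the shipped `test_complex_01` pair remains the authoritative reference for the exact formatting:

.={}
.name="Ada"
.tags=[]
.tags[0]="x"
.tags[1]="y"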
Understanding reader/writer test output
---------------------------------------
When a test is run, output files are generated beside the input test files.
Below is a short description of the content of each file:
* `test_complex_01.json`: input JSON document.
* `test_complex_01.expected`: flattened JSON element tree used to check if
parsing was correct.
* `test_complex_01.actual`: flattened JSON element tree produced by `jsontest`
from reading `test_complex_01.json`.
* `test_complex_01.rewrite`: JSON document written by `jsontest` using the
`Json::Value` parsed from `test_complex_01.json` and serialized using
`Json::StyledWriter`.
* `test_complex_01.actual-rewrite`: flattened JSON element tree produced by
`jsontest` from reading `test_complex_01.rewrite`.
* `test_complex_01.process-output`: `jsontest` output, typically useful for
understanding parsing errors.
License
-------
## License
See the `LICENSE` file for details. In summary, JsonCpp is licensed under the
MIT license, or public domain if desired and recognized in your jurisdiction.


@@ -1,248 +0,0 @@
"""
Notes:
- shared library support is buggy: it assumes that a static and dynamic library can be build from the same object files. This is not true on many platforms. For this reason it is only enabled on linux-gcc at the current time.
To add a platform:
- add its name in options allowed_values below
- add tool initialization for this platform. Search for "if platform == 'suncc'" as an example.
"""
import os
import os.path
import sys
JSONCPP_VERSION = open(File('#version').abspath,'rt').read().strip()
DIST_DIR = '#dist'
options = Variables()
options.Add( EnumVariable('platform',
'Platform (compiler/stl) used to build the project',
'msvc71',
allowed_values='suncc vacpp mingw msvc6 msvc7 msvc71 msvc80 msvc90 linux-gcc'.split(),
ignorecase=2) )
try:
platform = ARGUMENTS['platform']
if platform == 'linux-gcc':
CXX = 'g++' # not quite right, but env is not yet available.
import commands
version = commands.getoutput('%s -dumpversion' %CXX)
platform = 'linux-gcc-%s' %version
print "Using platform '%s'" %platform
LD_LIBRARY_PATH = os.environ.get('LD_LIBRARY_PATH', '')
LD_LIBRARY_PATH = "%s:libs/%s" %(LD_LIBRARY_PATH, platform)
os.environ['LD_LIBRARY_PATH'] = LD_LIBRARY_PATH
print "LD_LIBRARY_PATH =", LD_LIBRARY_PATH
except KeyError:
print 'You must specify a "platform"'
sys.exit(2)
print "Building using PLATFORM =", platform
rootbuild_dir = Dir('#buildscons')
build_dir = os.path.join( '#buildscons', platform )
bin_dir = os.path.join( '#bin', platform )
lib_dir = os.path.join( '#libs', platform )
sconsign_dir_path = Dir(build_dir).abspath
sconsign_path = os.path.join( sconsign_dir_path, '.sconsign.dbm' )
# Ensure build directory exist (SConsignFile fail otherwise!)
if not os.path.exists( sconsign_dir_path ):
os.makedirs( sconsign_dir_path )
# Store all dependencies signature in a database
SConsignFile( sconsign_path )
def make_environ_vars():
"""Returns a dictionnary with environment variable to use when compiling."""
# PATH is required to find the compiler
# TEMP is required for at least mingw
# LD_LIBRARY_PATH & co is required on some system for the compiler
vars = {}
for name in ('PATH', 'TEMP', 'TMP', 'LD_LIBRARY_PATH', 'LIBRARY_PATH'):
if name in os.environ:
vars[name] = os.environ[name]
return vars
env = Environment( ENV = make_environ_vars(),
toolpath = ['scons-tools'],
tools=[] ) #, tools=['default'] )
if platform == 'suncc':
env.Tool( 'sunc++' )
env.Tool( 'sunlink' )
env.Tool( 'sunar' )
env.Append( CCFLAGS = ['-mt'] )
elif platform == 'vacpp':
env.Tool( 'default' )
env.Tool( 'aixcc' )
env['CXX'] = 'xlC_r' #scons does not pick-up the correct one !
# using xlC_r ensure multi-threading is enabled:
# http://publib.boulder.ibm.com/infocenter/pseries/index.jsp?topic=/com.ibm.vacpp7a.doc/compiler/ref/cuselect.htm
env.Append( CCFLAGS = '-qrtti=all',
LINKFLAGS='-bh:5' ) # -bh:5 remove duplicate symbol warning
elif platform == 'msvc6':
env['MSVS_VERSION']='6.0'
for tool in ['msvc', 'msvs', 'mslink', 'masm', 'mslib']:
env.Tool( tool )
env['CXXFLAGS']='-GR -GX /nologo /MT'
elif platform == 'msvc70':
env['MSVS_VERSION']='7.0'
for tool in ['msvc', 'msvs', 'mslink', 'masm', 'mslib']:
env.Tool( tool )
env['CXXFLAGS']='-GR -GX /nologo /MT'
elif platform == 'msvc71':
env['MSVS_VERSION']='7.1'
for tool in ['msvc', 'msvs', 'mslink', 'masm', 'mslib']:
env.Tool( tool )
env['CXXFLAGS']='-GR -GX /nologo /MT'
elif platform == 'msvc80':
env['MSVS_VERSION']='8.0'
for tool in ['msvc', 'msvs', 'mslink', 'masm', 'mslib']:
env.Tool( tool )
env['CXXFLAGS']='-GR -EHsc /nologo /MT'
elif platform == 'msvc90':
env['MSVS_VERSION']='9.0'
# Scons 1.2 fails to detect the correct location of the platform SDK.
# So we propagate those from the environment. This requires that the
# user run vcvars32.bat before compiling.
if 'INCLUDE' in os.environ:
env['ENV']['INCLUDE'] = os.environ['INCLUDE']
if 'LIB' in os.environ:
env['ENV']['LIB'] = os.environ['LIB']
for tool in ['msvc', 'msvs', 'mslink', 'masm', 'mslib']:
env.Tool( tool )
env['CXXFLAGS']='-GR -EHsc /nologo /MT'
elif platform == 'mingw':
env.Tool( 'mingw' )
env.Append( CPPDEFINES=[ "WIN32", "NDEBUG", "_MT" ] )
elif platform.startswith('linux-gcc'):
env.Tool( 'default' )
env.Append( LIBS = ['pthread'], CCFLAGS = os.environ.get("CXXFLAGS", "-Wall"), LINKFLAGS=os.environ.get("LDFLAGS", "") )
env['SHARED_LIB_ENABLED'] = True
else:
print "UNSUPPORTED PLATFORM."
env.Exit(1)
env.Tool('targz')
env.Tool('srcdist')
env.Tool('globtool')
env.Append( CPPPATH = ['#include'],
LIBPATH = lib_dir )
short_platform = platform
if short_platform.startswith('msvc'):
short_platform = short_platform[2:]
# Notes: on Windows you need to rebuild the source for each variant
# Build script does not support that yet so we only build static libraries.
# This also fails on AIX because both dynamic and static library ends with
# extension .a.
env['SHARED_LIB_ENABLED'] = env.get('SHARED_LIB_ENABLED', False)
env['LIB_PLATFORM'] = short_platform
env['LIB_LINK_TYPE'] = 'lib' # static
env['LIB_CRUNTIME'] = 'mt'
env['LIB_NAME_SUFFIX'] = '${LIB_PLATFORM}_${LIB_LINK_TYPE}${LIB_CRUNTIME}' # must match autolink naming convention
env['JSONCPP_VERSION'] = JSONCPP_VERSION
env['BUILD_DIR'] = env.Dir(build_dir)
env['ROOTBUILD_DIR'] = env.Dir(rootbuild_dir)
env['DIST_DIR'] = DIST_DIR
if 'TarGz' in env['BUILDERS']:
class SrcDistAdder:
def __init__( self, env ):
self.env = env
def __call__( self, *args, **kw ):
apply( self.env.SrcDist, (self.env['SRCDIST_TARGET'],) + args, kw )
env['SRCDIST_BUILDER'] = env.TarGz
else: # If tarfile module is missing
class SrcDistAdder:
def __init__( self, env ):
pass
def __call__( self, *args, **kw ):
pass
env['SRCDIST_ADD'] = SrcDistAdder( env )
env['SRCDIST_TARGET'] = os.path.join( DIST_DIR, 'jsoncpp-src-%s.tar.gz' % env['JSONCPP_VERSION'] )
env_testing = env.Clone( )
env_testing.Append( LIBS = ['json_${LIB_NAME_SUFFIX}'] )
def buildJSONExample( env, target_sources, target_name ):
env = env.Clone()
env.Append( CPPPATH = ['#'] )
exe = env.Program( target=target_name,
source=target_sources )
env['SRCDIST_ADD']( source=[target_sources] )
global bin_dir
return env.Install( bin_dir, exe )
def buildJSONTests( env, target_sources, target_name ):
jsontests_node = buildJSONExample( env, target_sources, target_name )
check_alias_target = env.Alias( 'check', jsontests_node, RunJSONTests( jsontests_node, jsontests_node ) )
env.AlwaysBuild( check_alias_target )
def buildUnitTests( env, target_sources, target_name ):
jsontests_node = buildJSONExample( env, target_sources, target_name )
check_alias_target = env.Alias( 'check', jsontests_node,
RunUnitTests( jsontests_node, jsontests_node ) )
env.AlwaysBuild( check_alias_target )
def buildLibrary( env, target_sources, target_name ):
static_lib = env.StaticLibrary( target=target_name + '_${LIB_NAME_SUFFIX}',
source=target_sources )
global lib_dir
env.Install( lib_dir, static_lib )
if env['SHARED_LIB_ENABLED']:
shared_lib = env.SharedLibrary( target=target_name + '_${LIB_NAME_SUFFIX}',
source=target_sources )
env.Install( lib_dir, shared_lib )
env['SRCDIST_ADD']( source=[target_sources] )
Export( 'env env_testing buildJSONExample buildLibrary buildJSONTests buildUnitTests' )
def buildProjectInDirectory( target_directory ):
global build_dir
target_build_dir = os.path.join( build_dir, target_directory )
target = os.path.join( target_directory, 'sconscript' )
SConscript( target, build_dir=target_build_dir, duplicate=0 )
env['SRCDIST_ADD']( source=[target] )
def runJSONTests_action( target, source = None, env = None ):
# Add test scripts to python path
jsontest_path = Dir( '#test' ).abspath
sys.path.insert( 0, jsontest_path )
data_path = os.path.join( jsontest_path, 'data' )
import runjsontests
return runjsontests.runAllTests( os.path.abspath(source[0].path), data_path )
def runJSONTests_string( target, source = None, env = None ):
return 'RunJSONTests("%s")' % source[0]
import SCons.Action
ActionFactory = SCons.Action.ActionFactory
RunJSONTests = ActionFactory(runJSONTests_action, runJSONTests_string )
def runUnitTests_action( target, source = None, env = None ):
# Add test scripts to python path
jsontest_path = Dir( '#test' ).abspath
sys.path.insert( 0, jsontest_path )
import rununittests
return rununittests.runAllTests( os.path.abspath(source[0].path) )
def runUnitTests_string( target, source = None, env = None ):
return 'RunUnitTests("%s")' % source[0]
RunUnitTests = ActionFactory(runUnitTests_action, runUnitTests_string )
env.Alias( 'check' )
srcdist_cmd = env['SRCDIST_ADD']( source = """
AUTHORS README.txt SConstruct
""".split() )
env.Alias( 'src-dist', srcdist_cmd )
buildProjectInDirectory( 'src/jsontestrunner' )
buildProjectInDirectory( 'src/lib_json' )
buildProjectInDirectory( 'src/test_lib_json' )
#print env.Dump()


@@ -1,129 +1,134 @@
"""Amalgate json-cpp library sources into a single source and header file.
"""Amalgamate json-cpp library sources into a single source and header file.
Requires Python 2.6
Works with python2.6+ and python3.4+.
Example of invocation (must be invoked from json-cpp top directory):
python amalgate.py
python amalgamate.py
"""
import os
import os.path
import sys
class AmalgamationFile:
def __init__( self, top_dir ):
def __init__(self, top_dir):
self.top_dir = top_dir
self.blocks = []
def add_text( self, text ):
if not text.endswith( "\n" ):
def add_text(self, text):
if not text.endswith("\n"):
text += "\n"
self.blocks.append( text )
self.blocks.append(text)
def add_file( self, relative_input_path, wrap_in_comment=False ):
def add_marker( prefix ):
self.add_text( "" )
self.add_text( "// " + "/"*70 )
self.add_text( "// %s of content of file: %s" % (prefix, relative_input_path.replace("\\","/")) )
self.add_text( "// " + "/"*70 )
self.add_text( "" )
add_marker( "Beginning" )
f = open( os.path.join( self.top_dir, relative_input_path ), "rt" )
def add_file(self, relative_input_path, wrap_in_comment=False):
def add_marker(prefix):
self.add_text("")
self.add_text("// " + "/"*70)
self.add_text("// %s of content of file: %s" % (prefix, relative_input_path.replace("\\","/")))
self.add_text("// " + "/"*70)
self.add_text("")
add_marker("Beginning")
f = open(os.path.join(self.top_dir, relative_input_path), "rt")
content = f.read()
if wrap_in_comment:
content = "/*\n" + content + "\n*/"
self.add_text( content )
self.add_text(content)
f.close()
add_marker( "End" )
self.add_text( "\n\n\n\n" )
add_marker("End")
self.add_text("\n\n\n\n")
def get_value( self ):
return "".join( self.blocks ).replace("\r\n","\n")
def get_value(self):
return "".join(self.blocks).replace("\r\n","\n")
def write_to( self, output_path ):
output_dir = os.path.dirname( output_path )
if output_dir and not os.path.isdir( output_dir ):
os.makedirs( output_dir )
f = open( output_path, "wb" )
f.write( str.encode(self.get_value(), 'UTF-8') )
def write_to(self, output_path):
output_dir = os.path.dirname(output_path)
if output_dir and not os.path.isdir(output_dir):
os.makedirs(output_dir)
f = open(output_path, "wb")
f.write(str.encode(self.get_value(), 'UTF-8'))
f.close()
def amalgamate_source( source_top_dir=None,
def amalgamate_source(source_top_dir=None,
target_source_path=None,
header_include_path=None ):
"""Produces amalgated source.
header_include_path=None):
"""Produces amalgamated source.
Parameters:
source_top_dir: top-directory
target_source_path: output .cpp path
header_include_path: generated header path relative to target_source_path.
"""
print("Amalgating header...")
header = AmalgamationFile( source_top_dir )
header.add_text( "/// Json-cpp amalgated header (http://jsoncpp.sourceforge.net/)." )
header.add_text( "/// It is intented to be used with #include <%s>" % header_include_path )
header.add_file( "LICENSE", wrap_in_comment=True )
header.add_text( "#ifndef JSON_AMALGATED_H_INCLUDED" )
header.add_text( "# define JSON_AMALGATED_H_INCLUDED" )
header.add_text( "/// If defined, indicates that the source file is amalgated" )
header.add_text( "/// to prevent private header inclusion." )
header.add_text( "#define JSON_IS_AMALGAMATION" )
header.add_file( "include/json/version.h" )
header.add_file( "include/json/config.h" )
header.add_file( "include/json/forwards.h" )
header.add_file( "include/json/features.h" )
header.add_file( "include/json/value.h" )
header.add_file( "include/json/reader.h" )
header.add_file( "include/json/writer.h" )
header.add_file( "include/json/assertions.h" )
header.add_text( "#endif //ifndef JSON_AMALGATED_H_INCLUDED" )
print("Amalgamating header...")
header = AmalgamationFile(source_top_dir)
header.add_text("/// Json-cpp amalgamated header (http://jsoncpp.sourceforge.net/).")
header.add_text('/// It is intended to be used with #include "%s"' % header_include_path)
header.add_file("LICENSE", wrap_in_comment=True)
header.add_text("#ifndef JSON_AMALGAMATED_H_INCLUDED")
header.add_text("# define JSON_AMALGAMATED_H_INCLUDED")
header.add_text("/// If defined, indicates that the source file is amalgamated")
header.add_text("/// to prevent private header inclusion.")
header.add_text("#define JSON_IS_AMALGAMATION")
header.add_file("include/json/version.h")
header.add_file("include/json/allocator.h")
header.add_file("include/json/config.h")
header.add_file("include/json/forwards.h")
header.add_file("include/json/features.h")
header.add_file("include/json/value.h")
header.add_file("include/json/reader.h")
header.add_file("include/json/writer.h")
header.add_file("include/json/assertions.h")
header.add_text("#endif //ifndef JSON_AMALGAMATED_H_INCLUDED")
target_header_path = os.path.join( os.path.dirname(target_source_path), header_include_path )
print("Writing amalgated header to %r" % target_header_path)
header.write_to( target_header_path )
target_header_path = os.path.join(os.path.dirname(target_source_path), header_include_path)
print("Writing amalgamated header to %r" % target_header_path)
header.write_to(target_header_path)
base, ext = os.path.splitext( header_include_path )
base, ext = os.path.splitext(header_include_path)
forward_header_include_path = base + "-forwards" + ext
print("Amalgating forward header...")
header = AmalgamationFile( source_top_dir )
header.add_text( "/// Json-cpp amalgated forward header (http://jsoncpp.sourceforge.net/)." )
header.add_text( "/// It is intented to be used with #include <%s>" % forward_header_include_path )
header.add_text( "/// This header provides forward declaration for all JsonCpp types." )
header.add_file( "LICENSE", wrap_in_comment=True )
header.add_text( "#ifndef JSON_FORWARD_AMALGATED_H_INCLUDED" )
header.add_text( "# define JSON_FORWARD_AMALGATED_H_INCLUDED" )
header.add_text( "/// If defined, indicates that the source file is amalgated" )
header.add_text( "/// to prevent private header inclusion." )
header.add_text( "#define JSON_IS_AMALGAMATION" )
header.add_file( "include/json/config.h" )
header.add_file( "include/json/forwards.h" )
header.add_text( "#endif //ifndef JSON_FORWARD_AMALGATED_H_INCLUDED" )
print("Amalgamating forward header...")
header = AmalgamationFile(source_top_dir)
header.add_text("/// Json-cpp amalgamated forward header (http://jsoncpp.sourceforge.net/).")
header.add_text('/// It is intended to be used with #include "%s"' % forward_header_include_path)
header.add_text("/// This header provides forward declaration for all JsonCpp types.")
header.add_file("LICENSE", wrap_in_comment=True)
header.add_text("#ifndef JSON_FORWARD_AMALGAMATED_H_INCLUDED")
header.add_text("# define JSON_FORWARD_AMALGAMATED_H_INCLUDED")
header.add_text("/// If defined, indicates that the source file is amalgamated")
header.add_text("/// to prevent private header inclusion.")
header.add_text("#define JSON_IS_AMALGAMATION")
header.add_file("include/json/config.h")
header.add_file("include/json/forwards.h")
header.add_text("#endif //ifndef JSON_FORWARD_AMALGAMATED_H_INCLUDED")
target_forward_header_path = os.path.join( os.path.dirname(target_source_path),
forward_header_include_path )
print("Writing amalgated forward header to %r" % target_forward_header_path)
header.write_to( target_forward_header_path )
target_forward_header_path = os.path.join(os.path.dirname(target_source_path),
forward_header_include_path)
print("Writing amalgamated forward header to %r" % target_forward_header_path)
header.write_to(target_forward_header_path)
print("Amalgating source...")
source = AmalgamationFile( source_top_dir )
source.add_text( "/// Json-cpp amalgated source (http://jsoncpp.sourceforge.net/)." )
source.add_text( "/// It is intented to be used with #include <%s>" % header_include_path )
source.add_file( "LICENSE", wrap_in_comment=True )
source.add_text( "" )
source.add_text( "#include <%s>" % header_include_path )
source.add_text( "" )
print("Amalgamating source...")
source = AmalgamationFile(source_top_dir)
source.add_text("/// Json-cpp amalgamated source (http://jsoncpp.sourceforge.net/).")
source.add_text('/// It is intended to be used with #include "%s"' % header_include_path)
source.add_file("LICENSE", wrap_in_comment=True)
source.add_text("")
source.add_text('#include "%s"' % header_include_path)
source.add_text("""
#ifndef JSON_IS_AMALGAMATION
#error "Compile with -I PATH_TO_JSON_DIRECTORY"
#endif
""")
source.add_text("")
lib_json = "src/lib_json"
source.add_file( os.path.join(lib_json, "json_tool.h") )
source.add_file( os.path.join(lib_json, "json_reader.cpp") )
source.add_file( os.path.join(lib_json, "json_batchallocator.h") )
source.add_file( os.path.join(lib_json, "json_valueiterator.inl") )
source.add_file( os.path.join(lib_json, "json_value.cpp") )
source.add_file( os.path.join(lib_json, "json_writer.cpp") )
source.add_file(os.path.join(lib_json, "json_tool.h"))
source.add_file(os.path.join(lib_json, "json_reader.cpp"))
source.add_file(os.path.join(lib_json, "json_valueiterator.inl"))
source.add_file(os.path.join(lib_json, "json_value.cpp"))
source.add_file(os.path.join(lib_json, "json_writer.cpp"))
print("Writing amalgated source to %r" % target_source_path)
source.write_to( target_source_path )
print("Writing amalgamated source to %r" % target_source_path)
source.write_to(target_source_path)
def main():
usage = """%prog [options]
Generate a single amalgated source and header file from the sources.
Generate a single amalgamated source and header file from the sources.
"""
from optparse import OptionParser
parser = OptionParser(usage=usage)
@@ -131,20 +136,20 @@ Generate a single amalgated source and header file from the sources.
parser.add_option("-s", "--source", dest="target_source_path", action="store", default="dist/jsoncpp.cpp",
help="""Output .cpp source path. [Default: %default]""")
parser.add_option("-i", "--include", dest="header_include_path", action="store", default="json/json.h",
help="""Header include path. Used to include the header from the amalgated source file. [Default: %default]""")
help="""Header include path. Used to include the header from the amalgamated source file. [Default: %default]""")
parser.add_option("-t", "--top-dir", dest="top_dir", action="store", default=os.getcwd(),
help="""Source top-directory. [Default: %default]""")
parser.enable_interspersed_args()
options, args = parser.parse_args()
msg = amalgamate_source( source_top_dir=options.top_dir,
msg = amalgamate_source(source_top_dir=options.top_dir,
target_source_path=options.target_source_path,
header_include_path=options.header_include_path )
header_include_path=options.header_include_path)
if msg:
sys.stderr.write( msg + "\n" )
sys.exit( 1 )
sys.stderr.write(msg + "\n")
sys.exit(1)
else:
print("Source succesfully amalagated")
print("Source successfully amalgamated")
if __name__ == "__main__":
main()

appveyor.yml Normal file

@@ -0,0 +1,32 @@
clone_folder: c:\projects\jsoncpp
environment:
matrix:
- APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015
CMAKE_GENERATOR: Visual Studio 14 2015
- APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015
CMAKE_GENERATOR: Visual Studio 14 2015 Win64
- APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2017
CMAKE_GENERATOR: Visual Studio 15 2017
- APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2017
CMAKE_GENERATOR: Visual Studio 15 2017 Win64
build_script:
- cmake --version
- cd c:\projects\jsoncpp
- cmake -G "%CMAKE_GENERATOR%" -DCMAKE_INSTALL_PREFIX:PATH=%CD:\=/%/install -DBUILD_SHARED_LIBS:BOOL=ON .
# Use ctest to make a dashboard build:
# - ctest -D Experimental(Start|Update|Configure|Build|Test|Coverage|MemCheck|Submit)
# NOTE: Testing on Windows is not yet finished:
# - ctest -C Release -D ExperimentalStart -D ExperimentalConfigure -D ExperimentalBuild -D ExperimentalTest -D ExperimentalSubmit
- ctest -C Release -D ExperimentalStart -D ExperimentalConfigure -D ExperimentalBuild -D ExperimentalSubmit
# Final step is to verify that installation succeeds
- cmake --build . --config Release --target install
deploy:
provider: GitHub
auth_token:
secure: K2Tp1q8pIZ7rs0Ot24ZMWuwr12Ev6Tc6QkhMjGQxoQG3ng1pXtgPasiJ45IDXGdg
on:
branch: master
appveyor_repo_tag: true


@@ -1,14 +1,35 @@
all: build test-amalgamate
# This is only for jsoncpp developers/contributors.
# We use this to sign releases, generate documentation, etc.
VER?=$(shell cat version.txt)
default:
@echo "VER=${VER}"
sign: jsoncpp-${VER}.tar.gz
gpg --armor --detach-sign $<
gpg --verify $<.asc
# Then upload .asc to the release.
jsoncpp-%.tar.gz:
curl https://github.com/open-source-parsers/jsoncpp/archive/$*.tar.gz -o $@
dox:
python doxybuild.py --doxygen=$$(which doxygen) --in doc/web_doxyfile.in
rsync -va -c --delete dist/doxygen/jsoncpp-api-html-${VER}/ ../jsoncpp-docs/doxygen/
# Then 'git add -A' and 'git push' in jsoncpp-docs.
build:
mkdir -p build/debug
cd build/debug; cmake -DCMAKE_BUILD_TYPE=debug -DJSONCPP_LIB_BUILD_SHARED=ON -G "Unix Makefiles" ../..
cd build/debug; cmake -DCMAKE_BUILD_TYPE=debug -DBUILD_SHARED_LIBS=ON -G "Unix Makefiles" ../..
make -C build/debug
# Currently, this depends on include/json/version.h generated
# by cmake.
test-amalgamate: build
test-amalgamate:
python2.7 amalgamate.py
python3.4 amalgamate.py
cd dist; gcc -I. -c jsoncpp.cpp
valgrind:
valgrind --error-exitcode=42 --leak-check=full ./build/debug/src/test_lib_json/jsoncpp_test
clean:
\rm -rf *.gz *.asc dist/
.PHONY: build


@@ -1 +1,6 @@
# module
# Copyright 2010 Baptiste Lepilleur and The JsonCpp Authors
# Distributed under MIT license, or public domain if desired and
# recognized in your jurisdiction.
# See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
# module


@@ -1,33 +1,33 @@
{
"cmake_variants" : [
{"name": "generator",
"generators": [
{"generator": [
"Visual Studio 7 .NET 2003",
"Visual Studio 9 2008",
"Visual Studio 9 2008 Win64",
"Visual Studio 10",
"Visual Studio 10 Win64",
"Visual Studio 11",
"Visual Studio 11 Win64"
]
},
{"generator": ["MinGW Makefiles"],
"env_prepend": [{"path": "c:/wut/prg/MinGW/bin"}]
}
]
},
{"name": "shared_dll",
"variables": [
["JSONCPP_LIB_BUILD_SHARED=true"],
["JSONCPP_LIB_BUILD_SHARED=false"]
]
},
{"name": "build_type",
"build_types": [
"debug",
"release"
]
}
]
}
{
"cmake_variants" : [
{"name": "generator",
"generators": [
{"generator": [
"Visual Studio 7 .NET 2003",
"Visual Studio 9 2008",
"Visual Studio 9 2008 Win64",
"Visual Studio 10",
"Visual Studio 10 Win64",
"Visual Studio 11",
"Visual Studio 11 Win64"
]
},
{"generator": ["MinGW Makefiles"],
"env_prepend": [{"path": "c:/wut/prg/MinGW/bin"}]
}
]
},
{"name": "shared_dll",
"variables": [
["BUILD_SHARED_LIBS=true"],
["BUILD_SHARED_LIBS=false"]
]
},
{"name": "build_type",
"build_types": [
"debug",
"release"
]
}
]
}


@@ -1,26 +1,26 @@
{
"cmake_variants" : [
{"name": "generator",
"generators": [
{"generator": [
"Visual Studio 6",
"Visual Studio 7",
"Visual Studio 8 2005"
]
}
]
},
{"name": "shared_dll",
"variables": [
["JSONCPP_LIB_BUILD_SHARED=true"],
["JSONCPP_LIB_BUILD_SHARED=false"]
]
},
{"name": "build_type",
"build_types": [
"debug",
"release"
]
}
]
}
{
"cmake_variants" : [
{"name": "generator",
"generators": [
{"generator": [
"Visual Studio 6",
"Visual Studio 7",
"Visual Studio 8 2005"
]
}
]
},
{"name": "shared_dll",
"variables": [
["BUILD_SHARED_LIBS=true"],
["BUILD_SHARED_LIBS=false"]
]
},
{"name": "build_type",
"build_types": [
"debug",
"release"
]
}
]
}


@@ -1,6 +1,9 @@
#!/usr/bin/env python
# encoding: utf-8
# Baptiste Lepilleur, 2009
# Copyright 2009 Baptiste Lepilleur and The JsonCpp Authors
# Distributed under MIT license, or public domain if desired and
# recognized in your jurisdiction.
# See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
from __future__ import print_function
from dircache import listdir
@@ -54,9 +57,9 @@ LINKS = DIR_LINK | FILE_LINK
ALL_NO_LINK = DIR | FILE
ALL = DIR | FILE | LINKS
_ANT_RE = re.compile( r'(/\*\*/)|(\*\*/)|(/\*\*)|(\*)|(/)|([^\*/]*)' )
_ANT_RE = re.compile(r'(/\*\*/)|(\*\*/)|(/\*\*)|(\*)|(/)|([^\*/]*)')
def ant_pattern_to_re( ant_pattern ):
def ant_pattern_to_re(ant_pattern):
"""Generates a regular expression from the ant pattern.
Matching convention:
**/a: match 'a', 'dir/a', 'dir1/dir2/a'
@@ -65,30 +68,30 @@ def ant_pattern_to_re( ant_pattern ):
"""
rex = ['^']
next_pos = 0
sep_rex = r'(?:/|%s)' % re.escape( os.path.sep )
sep_rex = r'(?:/|%s)' % re.escape(os.path.sep)
## print 'Converting', ant_pattern
for match in _ANT_RE.finditer( ant_pattern ):
for match in _ANT_RE.finditer(ant_pattern):
## print 'Matched', match.group()
## print match.start(0), next_pos
if match.start(0) != next_pos:
raise ValueError( "Invalid ant pattern" )
raise ValueError("Invalid ant pattern")
if match.group(1): # /**/
rex.append( sep_rex + '(?:.*%s)?' % sep_rex )
rex.append(sep_rex + '(?:.*%s)?' % sep_rex)
elif match.group(2): # **/
rex.append( '(?:.*%s)?' % sep_rex )
rex.append('(?:.*%s)?' % sep_rex)
elif match.group(3): # /**
rex.append( sep_rex + '.*' )
rex.append(sep_rex + '.*')
elif match.group(4): # *
rex.append( '[^/%s]*' % re.escape(os.path.sep) )
rex.append('[^/%s]*' % re.escape(os.path.sep))
elif match.group(5): # /
rex.append( sep_rex )
rex.append(sep_rex)
else: # somepath
rex.append( re.escape(match.group(6)) )
rex.append(re.escape(match.group(6)))
next_pos = match.end()
rex.append('$')
return re.compile( ''.join( rex ) )
return re.compile(''.join(rex))
def _as_list( l ):
def _as_list(l):
if isinstance(l, basestring):
return l.split()
return l
@@ -105,37 +108,37 @@ def glob(dir_path,
dir_path = dir_path.replace('/',os.path.sep)
entry_type_filter = entry_type
def is_pruned_dir( dir_name ):
def is_pruned_dir(dir_name):
for pattern in prune_dirs:
if fnmatch.fnmatch( dir_name, pattern ):
if fnmatch.fnmatch(dir_name, pattern):
return True
return False
def apply_filter( full_path, filter_rexs ):
def apply_filter(full_path, filter_rexs):
"""Return True if at least one of the filter regular expression match full_path."""
for rex in filter_rexs:
if rex.match( full_path ):
if rex.match(full_path):
return True
return False
def glob_impl( root_dir_path ):
def glob_impl(root_dir_path):
child_dirs = [root_dir_path]
while child_dirs:
dir_path = child_dirs.pop()
for entry in listdir( dir_path ):
full_path = os.path.join( dir_path, entry )
for entry in listdir(dir_path):
full_path = os.path.join(dir_path, entry)
## print 'Testing:', full_path,
is_dir = os.path.isdir( full_path )
if is_dir and not is_pruned_dir( entry ): # explore child directory ?
is_dir = os.path.isdir(full_path)
if is_dir and not is_pruned_dir(entry): # explore child directory ?
## print '===> marked for recursion',
child_dirs.append( full_path )
included = apply_filter( full_path, include_filter )
rejected = apply_filter( full_path, exclude_filter )
child_dirs.append(full_path)
included = apply_filter(full_path, include_filter)
rejected = apply_filter(full_path, exclude_filter)
if not included or rejected: # do not include entry ?
## print '=> not included or rejected'
continue
link = os.path.islink( full_path )
is_file = os.path.isfile( full_path )
link = os.path.islink(full_path)
is_file = os.path.isfile(full_path)
if not is_file and not is_dir:
## print '=> unknown entry type'
continue
@@ -146,57 +149,57 @@ def glob(dir_path,
## print '=> type: %d' % entry_type,
if (entry_type & entry_type_filter) != 0:
## print ' => KEEP'
yield os.path.join( dir_path, entry )
yield os.path.join(dir_path, entry)
## else:
## print ' => TYPE REJECTED'
return list( glob_impl( dir_path ) )
return list(glob_impl(dir_path))
if __name__ == "__main__":
import unittest
class AntPatternToRETest(unittest.TestCase):
## def test_conversion( self ):
## self.assertEqual( '^somepath$', ant_pattern_to_re( 'somepath' ).pattern )
## def test_conversion(self):
## self.assertEqual('^somepath$', ant_pattern_to_re('somepath').pattern)
def test_matching( self ):
test_cases = [ ( 'path',
def test_matching(self):
test_cases = [ ('path',
['path'],
['somepath', 'pathsuffix', '/path', '/path'] ),
( '*.py',
['somepath', 'pathsuffix', '/path', '/path']),
('*.py',
['source.py', 'source.ext.py', '.py'],
['path/source.py', '/.py', 'dir.py/z', 'z.pyc', 'z.c'] ),
( '**/path',
['path/source.py', '/.py', 'dir.py/z', 'z.pyc', 'z.c']),
('**/path',
['path', '/path', '/a/path', 'c:/a/path', '/a/b/path', '//a/path', '/a/path/b/path'],
['path/', 'a/path/b', 'dir.py/z', 'somepath', 'pathsuffix', 'a/somepath'] ),
( 'path/**',
['path/', 'a/path/b', 'dir.py/z', 'somepath', 'pathsuffix', 'a/somepath']),
('path/**',
['path/a', 'path/path/a', 'path//'],
['path', 'somepath/a', 'a/path', 'a/path/a', 'pathsuffix/a'] ),
( '/**/path',
['path', 'somepath/a', 'a/path', 'a/path/a', 'pathsuffix/a']),
('/**/path',
['/path', '/a/path', '/a/b/path/path', '/path/path'],
['path', 'path/', 'a/path', '/pathsuffix', '/somepath'] ),
( 'a/b',
['path', 'path/', 'a/path', '/pathsuffix', '/somepath']),
('a/b',
['a/b'],
['somea/b', 'a/bsuffix', 'a/b/c'] ),
( '**/*.py',
['somea/b', 'a/bsuffix', 'a/b/c']),
('**/*.py',
['script.py', 'src/script.py', 'a/b/script.py', '/a/b/script.py'],
['script.pyc', 'script.pyo', 'a.py/b'] ),
( 'src/**/*.py',
['script.pyc', 'script.pyo', 'a.py/b']),
('src/**/*.py',
['src/a.py', 'src/dir/a.py'],
['a/src/a.py', '/src/a.py'] ),
['a/src/a.py', '/src/a.py']),
]
for ant_pattern, accepted_matches, rejected_matches in list(test_cases):
def local_path( paths ):
def local_path(paths):
return [ p.replace('/',os.path.sep) for p in paths ]
test_cases.append( (ant_pattern, local_path(accepted_matches), local_path( rejected_matches )) )
test_cases.append((ant_pattern, local_path(accepted_matches), local_path(rejected_matches)))
for ant_pattern, accepted_matches, rejected_matches in test_cases:
rex = ant_pattern_to_re( ant_pattern )
rex = ant_pattern_to_re(ant_pattern)
print('ant_pattern:', ant_pattern, ' => ', rex.pattern)
for accepted_match in accepted_matches:
print('Accepted?:', accepted_match)
self.assertTrue( rex.match( accepted_match ) is not None )
self.assertTrue(rex.match(accepted_match) is not None)
for rejected_match in rejected_matches:
print('Rejected?:', rejected_match)
self.assertTrue( rex.match( rejected_match ) is None )
self.assertTrue(rex.match(rejected_match) is None)
unittest.main()


@@ -18,62 +18,62 @@ class BuildDesc:
self.build_type = build_type
self.generator = generator
def merged_with( self, build_desc ):
def merged_with(self, build_desc):
"""Returns a new BuildDesc by merging field content.
Prefer build_desc fields to self fields for single valued field.
"""
return BuildDesc( self.prepend_envs + build_desc.prepend_envs,
return BuildDesc(self.prepend_envs + build_desc.prepend_envs,
self.variables + build_desc.variables,
build_desc.build_type or self.build_type,
build_desc.generator or self.generator )
build_desc.generator or self.generator)
def env( self ):
def env(self):
environ = os.environ.copy()
for values_by_name in self.prepend_envs:
for var, value in list(values_by_name.items()):
var = var.upper()
if type(value) is unicode:
value = value.encode( sys.getdefaultencoding() )
value = value.encode(sys.getdefaultencoding())
if var in environ:
environ[var] = value + os.pathsep + environ[var]
else:
environ[var] = value
return environ
def cmake_args( self ):
def cmake_args(self):
args = ["-D%s" % var for var in self.variables]
# skip build type for Visual Studio solution as it cause warning
if self.build_type and 'Visual' not in self.generator:
args.append( "-DCMAKE_BUILD_TYPE=%s" % self.build_type )
args.append("-DCMAKE_BUILD_TYPE=%s" % self.build_type)
if self.generator:
args.extend( ['-G', self.generator] )
args.extend(['-G', self.generator])
return args
def __repr__( self ):
return "BuildDesc( %s, build_type=%s )" % (" ".join( self.cmake_args()), self.build_type)
def __repr__(self):
return "BuildDesc(%s, build_type=%s)" % (" ".join(self.cmake_args()), self.build_type)
class BuildData:
def __init__( self, desc, work_dir, source_dir ):
def __init__(self, desc, work_dir, source_dir):
self.desc = desc
self.work_dir = work_dir
self.source_dir = source_dir
self.cmake_log_path = os.path.join( work_dir, 'batchbuild_cmake.log' )
self.build_log_path = os.path.join( work_dir, 'batchbuild_build.log' )
self.cmake_log_path = os.path.join(work_dir, 'batchbuild_cmake.log')
self.build_log_path = os.path.join(work_dir, 'batchbuild_build.log')
self.cmake_succeeded = False
self.build_succeeded = False
def execute_build(self):
print('Build %s' % self.desc)
self._make_new_work_dir( )
self.cmake_succeeded = self._generate_makefiles( )
self._make_new_work_dir()
self.cmake_succeeded = self._generate_makefiles()
if self.cmake_succeeded:
self.build_succeeded = self._build_using_makefiles( )
self.build_succeeded = self._build_using_makefiles()
return self.build_succeeded
def _generate_makefiles(self):
print(' Generating makefiles: ', end=' ')
cmd = ['cmake'] + self.desc.cmake_args( ) + [os.path.abspath( self.source_dir )]
succeeded = self._execute_build_subprocess( cmd, self.desc.env(), self.cmake_log_path )
cmd = ['cmake'] + self.desc.cmake_args() + [os.path.abspath(self.source_dir)]
succeeded = self._execute_build_subprocess(cmd, self.desc.env(), self.cmake_log_path)
print('done' if succeeded else 'FAILED')
return succeeded
@@ -82,58 +82,58 @@ class BuildData:
cmd = ['cmake', '--build', self.work_dir]
if self.desc.build_type:
cmd += ['--config', self.desc.build_type]
succeeded = self._execute_build_subprocess( cmd, self.desc.env(), self.build_log_path )
succeeded = self._execute_build_subprocess(cmd, self.desc.env(), self.build_log_path)
print('done' if succeeded else 'FAILED')
return succeeded
def _execute_build_subprocess(self, cmd, env, log_path):
process = subprocess.Popen( cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, cwd=self.work_dir,
env=env )
stdout, _ = process.communicate( )
process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, cwd=self.work_dir,
env=env)
stdout, _ = process.communicate()
succeeded = (process.returncode == 0)
with open( log_path, 'wb' ) as flog:
log = ' '.join( cmd ) + '\n' + stdout + '\nExit code: %r\n' % process.returncode
flog.write( fix_eol( log ) )
with open(log_path, 'wb') as flog:
log = ' '.join(cmd) + '\n' + stdout + '\nExit code: %r\n' % process.returncode
flog.write(fix_eol(log))
return succeeded
def _make_new_work_dir(self):
if os.path.isdir( self.work_dir ):
if os.path.isdir(self.work_dir):
print(' Removing work directory', self.work_dir)
shutil.rmtree( self.work_dir, ignore_errors=True )
if not os.path.isdir( self.work_dir ):
os.makedirs( self.work_dir )
shutil.rmtree(self.work_dir, ignore_errors=True)
if not os.path.isdir(self.work_dir):
os.makedirs(self.work_dir)
def fix_eol( stdout ):
def fix_eol(stdout):
"""Fixes wrong EOL produced by cmake --build on Windows (\r\r\n instead of \r\n).
"""
return re.sub( '\r*\n', os.linesep, stdout )
return re.sub('\r*\n', os.linesep, stdout)
def load_build_variants_from_config( config_path ):
with open( config_path, 'rb' ) as fconfig:
data = json.load( fconfig )
def load_build_variants_from_config(config_path):
with open(config_path, 'rb') as fconfig:
data = json.load(fconfig)
variants = data[ 'cmake_variants' ]
build_descs_by_axis = collections.defaultdict( list )
build_descs_by_axis = collections.defaultdict(list)
for axis in variants:
axis_name = axis["name"]
build_descs = []
if "generators" in axis:
for generator_data in axis["generators"]:
for generator in generator_data["generator"]:
build_desc = BuildDesc( generator=generator,
prepend_envs=generator_data.get("env_prepend") )
build_descs.append( build_desc )
build_desc = BuildDesc(generator=generator,
prepend_envs=generator_data.get("env_prepend"))
build_descs.append(build_desc)
elif "variables" in axis:
for variables in axis["variables"]:
build_desc = BuildDesc( variables=variables )
build_descs.append( build_desc )
build_desc = BuildDesc(variables=variables)
build_descs.append(build_desc)
elif "build_types" in axis:
for build_type in axis["build_types"]:
build_desc = BuildDesc( build_type=build_type )
build_descs.append( build_desc )
build_descs_by_axis[axis_name].extend( build_descs )
build_desc = BuildDesc(build_type=build_type)
build_descs.append(build_desc)
build_descs_by_axis[axis_name].extend(build_descs)
return build_descs_by_axis
def generate_build_variants( build_descs_by_axis ):
def generate_build_variants(build_descs_by_axis):
"""Returns a list of BuildDesc generated for the partial BuildDesc for each axis."""
axis_names = list(build_descs_by_axis.keys())
build_descs = []
@@ -141,8 +141,8 @@ def generate_build_variants( build_descs_by_axis ):
if len(build_descs):
# for each existing build_desc and each axis build desc, create a new build_desc
new_build_descs = []
for prototype_build_desc, axis_build_desc in itertools.product( build_descs, axis_build_descs):
new_build_descs.append( prototype_build_desc.merged_with( axis_build_desc ) )
for prototype_build_desc, axis_build_desc in itertools.product(build_descs, axis_build_descs):
new_build_descs.append(prototype_build_desc.merged_with(axis_build_desc))
build_descs = new_build_descs
else:
build_descs = axis_build_descs
@@ -174,60 +174,57 @@ $tr_builds
</table>
</body></html>''')
def generate_html_report( html_report_path, builds ):
report_dir = os.path.dirname( html_report_path )
def generate_html_report(html_report_path, builds):
report_dir = os.path.dirname(html_report_path)
# Vertical axis: generator
# Horizontal: variables, then build_type
builds_by_generator = collections.defaultdict( list )
builds_by_generator = collections.defaultdict(list)
variables = set()
build_types_by_variable = collections.defaultdict( set )
build_types_by_variable = collections.defaultdict(set)
build_by_pos_key = {} # { (generator, var_key, build_type): build }
for build in builds:
builds_by_generator[build.desc.generator].append( build )
builds_by_generator[build.desc.generator].append(build)
var_key = tuple(sorted(build.desc.variables))
variables.add( var_key )
build_types_by_variable[var_key].add( build.desc.build_type )
variables.add(var_key)
build_types_by_variable[var_key].add(build.desc.build_type)
pos_key = (build.desc.generator, var_key, build.desc.build_type)
build_by_pos_key[pos_key] = build
variables = sorted( variables )
variables = sorted(variables)
th_vars = []
th_build_types = []
for variable in variables:
build_types = sorted( build_types_by_variable[variable] )
build_types = sorted(build_types_by_variable[variable])
nb_build_type = len(build_types_by_variable[variable])
th_vars.append( '<th colspan="%d">%s</th>' % (nb_build_type, cgi.escape( ' '.join( variable ) ) ) )
th_vars.append('<th colspan="%d">%s</th>' % (nb_build_type, cgi.escape(' '.join(variable))))
for build_type in build_types:
th_build_types.append( '<th>%s</th>' % cgi.escape(build_type) )
th_build_types.append('<th>%s</th>' % cgi.escape(build_type))
tr_builds = []
for generator in sorted( builds_by_generator ):
tds = [ '<td>%s</td>\n' % cgi.escape( generator ) ]
for generator in sorted(builds_by_generator):
tds = [ '<td>%s</td>\n' % cgi.escape(generator) ]
for variable in variables:
build_types = sorted( build_types_by_variable[variable] )
build_types = sorted(build_types_by_variable[variable])
for build_type in build_types:
pos_key = (generator, variable, build_type)
build = build_by_pos_key.get(pos_key)
if build:
cmake_status = 'ok' if build.cmake_succeeded else 'FAILED'
build_status = 'ok' if build.build_succeeded else 'FAILED'
cmake_log_url = os.path.relpath( build.cmake_log_path, report_dir )
build_log_url = os.path.relpath( build.build_log_path, report_dir )
td = '<td class="%s"><a href="%s" class="%s">CMake: %s</a>' % (
build_status.lower(), cmake_log_url, cmake_status.lower(), cmake_status)
cmake_log_url = os.path.relpath(build.cmake_log_path, report_dir)
build_log_url = os.path.relpath(build.build_log_path, report_dir)
td = '<td class="%s"><a href="%s" class="%s">CMake: %s</a>' % ( build_status.lower(), cmake_log_url, cmake_status.lower(), cmake_status)
if build.cmake_succeeded:
td += '<br><a href="%s" class="%s">Build: %s</a>' % (
build_log_url, build_status.lower(), build_status)
td += '<br><a href="%s" class="%s">Build: %s</a>' % ( build_log_url, build_status.lower(), build_status)
td += '</td>'
else:
td = '<td></td>'
tds.append( td )
tr_builds.append( '<tr>%s</tr>' % '\n'.join( tds ) )
html = HTML_TEMPLATE.substitute(
title='Batch build report',
tds.append(td)
tr_builds.append('<tr>%s</tr>' % '\n'.join(tds))
html = HTML_TEMPLATE.substitute( title='Batch build report',
th_vars=' '.join(th_vars),
th_build_types=' '.join( th_build_types),
tr_builds='\n'.join( tr_builds ) )
with open( html_report_path, 'wt' ) as fhtml:
fhtml.write( html )
th_build_types=' '.join(th_build_types),
tr_builds='\n'.join(tr_builds))
with open(html_report_path, 'wt') as fhtml:
fhtml.write(html)
print('HTML report generated in:', html_report_path)
def main():
@@ -246,33 +243,33 @@ python devtools\batchbuild.py e:\buildbots\jsoncpp\build . devtools\agent_vmw7.j
parser.enable_interspersed_args()
options, args = parser.parse_args()
if len(args) < 3:
parser.error( "Missing one of WORK_DIR SOURCE_DIR CONFIG_JSON_PATH." )
parser.error("Missing one of WORK_DIR SOURCE_DIR CONFIG_JSON_PATH.")
work_dir = args[0]
source_dir = args[1].rstrip('/\\')
config_paths = args[2:]
for config_path in config_paths:
if not os.path.isfile( config_path ):
parser.error( "Can not read: %r" % config_path )
if not os.path.isfile(config_path):
parser.error("Can not read: %r" % config_path)
# generate build variants
build_descs = []
for config_path in config_paths:
build_descs_by_axis = load_build_variants_from_config( config_path )
build_descs.extend( generate_build_variants( build_descs_by_axis ) )
build_descs_by_axis = load_build_variants_from_config(config_path)
build_descs.extend(generate_build_variants(build_descs_by_axis))
print('Build variants (%d):' % len(build_descs))
# assign build directory for each variant
if not os.path.isdir( work_dir ):
os.makedirs( work_dir )
if not os.path.isdir(work_dir):
os.makedirs(work_dir)
builds = []
with open( os.path.join( work_dir, 'matrix-dir-map.txt' ), 'wt' ) as fmatrixmap:
for index, build_desc in enumerate( build_descs ):
build_desc_work_dir = os.path.join( work_dir, '%03d' % (index+1) )
builds.append( BuildData( build_desc, build_desc_work_dir, source_dir ) )
fmatrixmap.write( '%s: %s\n' % (build_desc_work_dir, build_desc) )
with open(os.path.join(work_dir, 'matrix-dir-map.txt'), 'wt') as fmatrixmap:
for index, build_desc in enumerate(build_descs):
build_desc_work_dir = os.path.join(work_dir, '%03d' % (index+1))
builds.append(BuildData(build_desc, build_desc_work_dir, source_dir))
fmatrixmap.write('%s: %s\n' % (build_desc_work_dir, build_desc))
for build in builds:
build.execute_build()
html_report_path = os.path.join( work_dir, 'batchbuild-report.html' )
generate_html_report( html_report_path, builds )
html_report_path = os.path.join(work_dir, 'batchbuild-report.html')
generate_html_report(html_report_path, builds)
print('Done')


@@ -1,10 +1,16 @@
# Copyright 2010 Baptiste Lepilleur and The JsonCpp Authors
# Distributed under MIT license, or public domain if desired and
# recognized in your jurisdiction.
# See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
from __future__ import print_function
import os.path
import sys
def fix_source_eol( path, is_dry_run = True, verbose = True, eol = '\n' ):
def fix_source_eol(path, is_dry_run = True, verbose = True, eol = '\n'):
"""Makes sure that all sources have the specified eol sequence (default: unix)."""
if not os.path.isfile( path ):
raise ValueError( 'Path "%s" is not a file' % path )
if not os.path.isfile(path):
raise ValueError('Path "%s" is not a file' % path)
try:
f = open(path, 'rb')
except IOError as msg:
@@ -29,27 +35,27 @@ def fix_source_eol( path, is_dry_run = True, verbose = True, eol = '\n' ):
##
##
##
##def _do_fix( is_dry_run = True ):
##def _do_fix(is_dry_run = True):
## from waftools import antglob
## python_sources = antglob.glob( '.',
## python_sources = antglob.glob('.',
## includes = '**/*.py **/wscript **/wscript_build',
## excludes = antglob.default_excludes + './waf.py',
## prune_dirs = antglob.prune_dirs + 'waf-* ./build' )
## prune_dirs = antglob.prune_dirs + 'waf-* ./build')
## for path in python_sources:
## _fix_python_source( path, is_dry_run )
## _fix_python_source(path, is_dry_run)
##
## cpp_sources = antglob.glob( '.',
## cpp_sources = antglob.glob('.',
## includes = '**/*.cpp **/*.h **/*.inl',
## prune_dirs = antglob.prune_dirs + 'waf-* ./build' )
## prune_dirs = antglob.prune_dirs + 'waf-* ./build')
## for path in cpp_sources:
## _fix_source_eol( path, is_dry_run )
## _fix_source_eol(path, is_dry_run)
##
##
##def dry_fix(context):
## _do_fix( is_dry_run = True )
## _do_fix(is_dry_run = True)
##
##def fix(context):
## _do_fix( is_dry_run = False )
## _do_fix(is_dry_run = False)
##
##def shutdown():
## pass


@@ -6,14 +6,14 @@ from __future__ import print_function
# and ends with the first blank line.
LICENSE_BEGIN = "// Copyright "
BRIEF_LICENSE = LICENSE_BEGIN + """2007-2010 Baptiste Lepilleur
BRIEF_LICENSE = LICENSE_BEGIN + """2007-2010 Baptiste Lepilleur and The JsonCpp Authors
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
""".replace('\r\n','\n')
def update_license( path, dry_run, show_diff ):
def update_license(path, dry_run, show_diff):
"""Update the license statement in the specified file.
Parameters:
path: path of the C++ source file to update.
@@ -22,28 +22,28 @@ def update_license( path, dry_run, show_diff ):
show_diff: if True, print the path of the file that would be modified,
as well as the change made to the file.
"""
with open( path, 'rt' ) as fin:
with open(path, 'rt') as fin:
original_text = fin.read().replace('\r\n','\n')
newline = fin.newlines and fin.newlines[0] or '\n'
if not original_text.startswith( LICENSE_BEGIN ):
if not original_text.startswith(LICENSE_BEGIN):
# No existing license found => prepend it
new_text = BRIEF_LICENSE + original_text
else:
license_end_index = original_text.index( '\n\n' ) # search first blank line
license_end_index = original_text.index('\n\n') # search first blank line
new_text = BRIEF_LICENSE + original_text[license_end_index+2:]
if original_text != new_text:
if not dry_run:
with open( path, 'wb' ) as fout:
fout.write( new_text.replace('\n', newline ) )
with open(path, 'wb') as fout:
fout.write(new_text.replace('\n', newline))
print('Updated', path)
if show_diff:
import difflib
print('\n'.join( difflib.unified_diff( original_text.split('\n'),
new_text.split('\n') ) ))
print('\n'.join(difflib.unified_diff(original_text.split('\n'),
new_text.split('\n'))))
return True
return False
def update_license_in_source_directories( source_dirs, dry_run, show_diff ):
def update_license_in_source_directories(source_dirs, dry_run, show_diff):
"""Updates license text in C++ source files found in directory source_dirs.
Parameters:
source_dirs: list of directory to scan for C++ sources. Directories are
@@ -56,11 +56,11 @@ def update_license_in_source_directories( source_dirs, dry_run, show_diff ):
from devtools import antglob
prune_dirs = antglob.prune_dirs + 'scons-local* ./build* ./libs ./dist'
for source_dir in source_dirs:
cpp_sources = antglob.glob( source_dir,
cpp_sources = antglob.glob(source_dir,
includes = '''**/*.h **/*.cpp **/*.inl''',
prune_dirs = prune_dirs )
prune_dirs = prune_dirs)
for source in cpp_sources:
update_license( source, dry_run, show_diff )
update_license(source, dry_run, show_diff)
def main():
usage = """%prog DIR [DIR2...]
@@ -83,7 +83,7 @@ python devtools\licenseupdater.py include src
help="""On update, show change made to the file.""")
parser.enable_interspersed_args()
options, args = parser.parse_args()
update_license_in_source_directories( args, options.dry_run, options.show_diff )
update_license_in_source_directories(args, options.dry_run, options.show_diff)
print('Done')
if __name__ == '__main__':


@@ -1,5 +1,10 @@
import os.path
import gzip
# Copyright 2010 Baptiste Lepilleur and The JsonCpp Authors
# Distributed under MIT license, or public domain if desired and
# recognized in your jurisdiction.
# See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
from contextlib import closing
import os
import tarfile
TARGZ_DEFAULT_COMPRESSION_LEVEL = 9
@@ -13,41 +18,35 @@ def make_tarball(tarball_path, sources, base_dir, prefix_dir=''):
prefix_dir: all files stored in the tarball be sub-directory of prefix_dir. Set to ''
to make them child of root.
"""
base_dir = os.path.normpath( os.path.abspath( base_dir ) )
def archive_name( path ):
base_dir = os.path.normpath(os.path.abspath(base_dir))
def archive_name(path):
"""Makes path relative to base_dir."""
path = os.path.normpath( os.path.abspath( path ) )
common_path = os.path.commonprefix( (base_dir, path) )
path = os.path.normpath(os.path.abspath(path))
common_path = os.path.commonprefix((base_dir, path))
archive_name = path[len(common_path):]
if os.path.isabs( archive_name ):
if os.path.isabs(archive_name):
archive_name = archive_name[1:]
return os.path.join( prefix_dir, archive_name )
return os.path.join(prefix_dir, archive_name)
def visit(tar, dirname, names):
for name in names:
path = os.path.join(dirname, name)
if os.path.isfile(path):
path_in_tar = archive_name(path)
tar.add(path, path_in_tar )
tar.add(path, path_in_tar)
compression = TARGZ_DEFAULT_COMPRESSION_LEVEL
tar = tarfile.TarFile.gzopen( tarball_path, 'w', compresslevel=compression )
try:
with closing(tarfile.TarFile.open(tarball_path, 'w:gz',
compresslevel=compression)) as tar:
for source in sources:
source_path = source
if os.path.isdir( source ):
os.path.walk(source_path, visit, tar)
if os.path.isdir(source):
for dirpath, dirnames, filenames in os.walk(source_path):
visit(tar, dirpath, filenames)
else:
path_in_tar = archive_name(source_path)
tar.add(source_path, path_in_tar ) # filename, arcname
finally:
tar.close()
tar.add(source_path, path_in_tar) # filename, arcname
def decompress( tarball_path, base_dir ):
def decompress(tarball_path, base_dir):
"""Decompress the gzipped tarball into directory base_dir.
"""
# !!! This class method is not documented in the online doc
# nor is bz2open!
tar = tarfile.TarFile.gzopen(tarball_path, mode='r')
try:
tar.extractall( base_dir )
finally:
tar.close()
with closing(tarfile.TarFile.open(tarball_path)) as tar:
tar.extractall(base_dir)


@@ -271,7 +271,7 @@ OPTIMIZE_OUTPUT_VHDL = NO
# parses. With this tag you can assign which parser to use for a given
# extension. Doxygen has a built-in mapping, but you can override or extend it
# using this tag. The format is ext=language, where ext is a file extension, and
# language is one of the parsers supported by doxygen: IDL, Java, Javascript,
# language is one of the parsers supported by doxygen: IDL, Java, JavaScript,
# C#, C, C++, D, PHP, Objective-C, Python, Fortran, VHDL. For instance to make
# doxygen treat .inc files as Fortran files (default is PHP), and .f files as C
# (default is Fortran), use: inc=Fortran f=C.
@@ -819,7 +819,7 @@ EXCLUDE_SYMBOLS =
# that contain example code fragments that are included (see the \include
# command).
EXAMPLE_PATH =
EXAMPLE_PATH = ..
# If the value of the EXAMPLE_PATH tag contains directories, you can use the
# EXAMPLE_PATTERNS tag to specify one or more wildcard pattern (like *.cpp and
@@ -984,7 +984,8 @@ VERBATIM_HEADERS = YES
# classes, structs, unions or interfaces.
# The default value is: YES.
ALPHABETICAL_INDEX = NO
ALPHABETICAL_INDEX = YES
TOC_INCLUDE_HEADINGS = 2
# The COLS_IN_ALPHA_INDEX tag can be used to specify the number of columns in
# which the alphabetical index list will be split.
@@ -1071,7 +1072,7 @@ HTML_STYLESHEET =
# defined cascading style sheet that is included after the standard style sheets
# created by doxygen. Using this option one can overrule certain style aspects.
# This is preferred over using HTML_STYLESHEET since it does not replace the
# standard style sheet and is therefor more robust against future updates.
# standard style sheet and is therefore more robust against future updates.
# Doxygen will copy the style sheet file to the output directory. For an example
# see the documentation.
# This tag requires that the tag GENERATE_HTML is set to YES.
@@ -1407,7 +1408,7 @@ FORMULA_FONTSIZE = 10
FORMULA_TRANSPARENT = YES
# Enable the USE_MATHJAX option to render LaTeX formulas using MathJax (see
# http://www.mathjax.org) which uses client side Javascript for the rendering
# http://www.mathjax.org) which uses client side JavaScript for the rendering
# instead of using prerendered bitmaps. Use this if you do not have LaTeX
# installed or if you want to formulas look prettier in the HTML output. When
# enabled you may also need to install MathJax separately and configure the path
@@ -1477,7 +1478,7 @@ MATHJAX_CODEFILE =
SEARCHENGINE = NO
# When the SERVER_BASED_SEARCH tag is enabled the search engine will be
# implemented using a web server instead of a web client using Javascript. There
# implemented using a web server instead of a web client using JavaScript. There
# are two flavours of web server based searching depending on the
# EXTERNAL_SEARCH setting. When disabled, doxygen will generate a PHP script for
# searching and an index file used by the script. When EXTERNAL_SEARCH is
@@ -1943,11 +1944,10 @@ INCLUDE_FILE_PATTERNS = *.h
# recursively expanded use the := operator instead of the = operator.
# This tag requires that the tag ENABLE_PREPROCESSING is set to YES.
PREDEFINED = "_MSC_VER=1400" \
PREDEFINED = "_MSC_VER=1800" \
_CPPRTTI \
_WIN32 \
JSONCPP_DOC_EXCLUDE_IMPLEMENTATION \
JSON_VALUE_USE_INTERNAL_MAP
JSONCPP_DOC_EXCLUDE_IMPLEMENTATION
# If the MACRO_EXPANSION and EXPAND_ONLY_PREDEF tags are set to YES then this
# tag can be used to specify a list of macro names that should be expanded. The
@@ -1959,7 +1959,7 @@ PREDEFINED = "_MSC_VER=1400" \
EXPAND_AS_DEFINED =
# If the SKIP_FUNCTION_MACROS tag is set to YES then doxygen's preprocessor will
# remove all refrences to function-like macros that are alone on a line, have an
# remove all references to function-like macros that are alone on a line, have an
# all uppercase name, and do not end with a semicolon. Such function macros are
# typically used for boiler-plate code, and will confuse the parser if not
# removed.


@@ -1,3 +1,21 @@
<hr>
</body>
<!-- HTML footer for doxygen 1.8.13-->
<!-- start footer part -->
<!--BEGIN GENERATE_TREEVIEW-->
<div id="nav-path" class="navpath"><!-- id is needed for treeview function! -->
<ul>
$navpath
<li class="footer">$generatedby
<a href="http://www.doxygen.org/index.html">
<img class="footer" src="$relpath^doxygen.png" alt="doxygen"/></a> $doxygenversion </li>
</ul>
</div>
<!--END GENERATE_TREEVIEW-->
<!--BEGIN !GENERATE_TREEVIEW-->
<hr class="footer"/><address class="footer"><small>
$generatedby &#160;<a href="http://www.doxygen.org/index.html">
<img class="footer" src="$relpath^doxygen.png" alt="doxygen"/>
</a> $doxygenversion
</small></address>
<!--END !GENERATE_TREEVIEW-->
</body>
</html>


@@ -1,24 +1,64 @@
<html>
<!-- HTML header for doxygen 1.8.13-->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<title>
JsonCpp - JSON data format manipulation library
</title>
<link href="doxygen.css" rel="stylesheet" type="text/css">
<link href="tabs.css" rel="stylesheet" type="text/css">
<meta http-equiv="Content-Type" content="text/xhtml;charset=UTF-8"/>
<meta http-equiv="X-UA-Compatible" content="IE=9"/>
<meta name="generator" content="Doxygen $doxygenversion"/>
<meta name="viewport" content="width=device-width, initial-scale=1"/>
<link href="$relpath^tabs.css" rel="stylesheet" type="text/css"/>
<script type="text/javascript" src="$relpath^jquery.js"></script>
<script type="text/javascript" src="$relpath^dynsections.js"></script>
$treeview
$search
$mathjax
<link href="$relpath^$stylesheet" rel="stylesheet" type="text/css" />
$extrastylesheet
</head>
<body>
<div id="top"><!-- do not remove this div, it is closed by doxygen! -->
<body bgcolor="#ffffff">
<!--BEGIN TITLEAREA-->
<div id="titlearea">
<table cellspacing="0" cellpadding="0">
<tbody>
<tr style="height: 56px;">
<!--BEGIN PROJECT_LOGO-->
<td id="projectlogo"><img alt="Logo" src="$relpath^$projectlogo"/></td>
<!--END PROJECT_LOGO-->
<!--BEGIN DISABLE_INDEX-->
<!--BEGIN SEARCHENGINE-->
<td>$searchbox</td>
<!--END SEARCHENGINE-->
<!--END DISABLE_INDEX-->
</tr>
</tbody>
</table>
</div>
<!--END TITLEAREA-->
<body bgcolor="#ffffff">
<table width="100%">
<tr>
<td width="40%" align="left" valign="center">
<td width="30%" align="left" valign="center">
<a href="https://github.com/open-source-parsers/jsoncpp">
JsonCpp project page
</a>
</td>
<td width="40%" align="right" valign="center">
<a href="https://github.com/open-source-parsers/jsoncpp">JsonCpp home page</a>
<td width="20%" align="center" valign="center">
<a href="hierarchy.html">
Classes
</a>
</td>
<td width="20%" align="center" valign="center">
<a href="namespace_json.html">
Namespace
</a>
</td>
<td width="30%" align="right" valign="center">
<a href="http://open-source-parsers.github.io/jsoncpp-docs/doxygen/">JsonCpp home page</a>
</td>
</tr>
</table>
<hr>
<!-- end header part -->


@@ -4,11 +4,21 @@
<a HREF="http://www.json.org/">JSON (JavaScript Object Notation)</a>
is a lightweight data-interchange format.
It can represent integer, real number, string, an ordered sequence of value, and
a collection of name/value pairs.
Here is an example of JSON data:
\verbatim
{
"encoding" : "UTF-8",
"plug-ins" : [
"python",
"c++",
"ruby"
],
"indent" : { "length" : 3, "use_space": true }
}
\endverbatim
<b>JsonCpp</b> supports comments as <i>meta-data</i>:
\code
// Configuration options
{
// Default encoding for text
@@ -17,22 +27,22 @@ Here is an example of JSON data:
// Plug-ins loaded at start-up
"plug-ins" : [
"python",
"c++",
"c++", // trailing comment
"ruby"
],
// Tab indent size
"indent" : { "length" : 3, "use_space": true }
// (multi-line comment)
"indent" : { /*embedded comment*/ "length" : 3, "use_space": true }
}
\endverbatim
<code>jsoncpp</code> supports comments as <i>meta-data</i>.
\endcode
\section _features Features
- read and write JSON document
- attach C++ style comments to element during parsing
- rewrite JSON document preserving original comments
Notes: Comments used to be supported in JSON but where removed for
Notes: Comments used to be supported in JSON but were removed for
portability (C like comments are not supported in Python). Since
comments are useful in configuration/input file, this feature was
preserved.
@@ -40,47 +50,77 @@ preserved.
\section _example Code example
\code
Json::Value root; // will contains the root value after parsing.
Json::Reader reader;
bool parsingSuccessful = reader.parse( config_doc, root );
if ( !parsingSuccessful )
{
// report to the user the failure and their locations in the document.
std::cout << "Failed to parse configuration\n"
<< reader.getFormattedErrorMessages();
return;
}
Json::Value root; // 'root' will contain the root value after parsing.
std::cin >> root;
// Get the value of the member of root named 'encoding', return 'UTF-8' if there is no
// such member.
std::string encoding = root.get("encoding", "UTF-8" ).asString();
// Get the value of the member of root named 'encoding', return a 'null' value if
// there is no such member.
const Json::Value plugins = root["plug-ins"];
for ( int index = 0; index < plugins.size(); ++index ) // Iterates over the sequence elements.
loadPlugIn( plugins[index].asString() );
setIndentLength( root["indent"].get("length", 3).asInt() );
setIndentUseSpace( root["indent"].get("use_space", true).asBool() );
// ...
// At application shutdown to make the new configuration document:
// Since Json::Value has implicit constructor for all value types, it is not
// necessary to explicitly construct the Json::Value object:
root["encoding"] = getCurrentEncoding();
root["indent"]["length"] = getCurrentIndentLength();
root["indent"]["use_space"] = getCurrentIndentUseSpace();
Json::StyledWriter writer;
// Make a new JSON document for the configuration. Preserve original comments.
std::string outputConfig = writer.write( root );
// You can also use streams. This will put the contents of any JSON
// stream at a particular sub-value, if you'd like.
// You can also read into a particular sub-value.
std::cin >> root["subtree"];
// And you can write to a stream, using the StyledWriter automatically.
// Get the value of the member of root named 'encoding',
// and return 'UTF-8' if there is no such member.
std::string encoding = root.get("encoding", "UTF-8" ).asString();
// Get the value of the member of root named 'plug-ins'; return a 'null' value if
// there is no such member.
const Json::Value plugins = root["plug-ins"];
// Iterate over the sequence elements.
for ( int index = 0; index < plugins.size(); ++index )
loadPlugIn( plugins[index].asString() );
// Try other datatypes. Some are auto-convertible to others.
foo::setIndentLength( root["indent"].get("length", 3).asInt() );
foo::setIndentUseSpace( root["indent"].get("use_space", true).asBool() );
// Since Json::Value has an implicit constructor for all value types, it is not
// necessary to explicitly construct the Json::Value object.
root["encoding"] = foo::getCurrentEncoding();
root["indent"]["length"] = foo::getCurrentIndentLength();
root["indent"]["use_space"] = foo::getCurrentIndentUseSpace();
// If you like the defaults, you can insert directly into a stream.
std::cout << root;
// Of course, you can write to `std::ostringstream` if you prefer.
// If desired, remember to add a linefeed and flush.
std::cout << std::endl;
\endcode
\section _advanced Advanced usage
Configure *builders* to create *readers* and *writers*. For
configuration, we use our own `Json::Value` (rather than
standard setters/getters) so that we can add
features without losing binary-compatibility.
\code
// For convenience, use `writeString()` with a specialized builder.
Json::StreamWriterBuilder wbuilder;
wbuilder["indentation"] = "\t";
std::string document = Json::writeString(wbuilder, root);
// Here, using a specialized Builder, we discard comments and
// record errors as we parse.
Json::CharReaderBuilder rbuilder;
rbuilder["collectComments"] = false;
std::string errs;
bool ok = Json::parseFromStream(rbuilder, std::cin, &root, &errs);
\endcode
Yes, compile-time configuration-checking would be helpful,
but `Json::Value` lets you
write and read the builder configuration, which is better! In other words,
you can configure your JSON parser using JSON.
CharReaders and StreamWriters are not thread-safe, but they are re-usable.
\code
Json::CharReaderBuilder rbuilder;
cfg >> rbuilder.settings_;
std::unique_ptr<Json::CharReader> const reader(rbuilder.newCharReader());
reader->parse(start, stop, &value1, &errs);
// ...
reader->parse(start, stop, &value2, &errs);
// etc.
\endcode
\section _pbuild Build instructions
@@ -116,4 +156,9 @@ Basically JsonCpp is licensed under MIT license, or public domain if desired
and recognized in your jurisdiction.
\author Baptiste Lepilleur <blep@users.sourceforge.net> (originator)
\author Christopher Dunn <cdunn2001@gmail.com> (primary maintainer)
\version \include version
We make strong guarantees about binary-compatibility, consistent with
<a href="http://apr.apache.org/versioning.html">the Apache versioning scheme</a>.
\sa version.h
*/

doc/web_doxyfile.in (new file, 2290 lines)

File diff suppressed because it is too large.


@@ -1,22 +1,37 @@
"""Script to generate doxygen documentation.
"""
from __future__ import print_function
from __future__ import unicode_literals
from devtools import tarball
from contextlib import contextmanager
import subprocess
import traceback
import re
import os
import os.path
import sys
import shutil
@contextmanager
def cd(newdir):
"""
http://stackoverflow.com/questions/431684/how-do-i-cd-in-python
"""
prevdir = os.getcwd()
os.chdir(newdir)
try:
yield
finally:
os.chdir(prevdir)
def find_program(*filenames):
"""find a program in folders path_lst, and sets env[var]
@param filenames: a list of possible names of the program to search for
@return: the full path of the filename if found, or '' if filename could not be found
"""
paths = os.environ.get('PATH', '').split(os.pathsep)
suffixes = ('win32' in sys.platform ) and '.exe .com .bat .cmd' or ''
suffixes = ('win32' in sys.platform) and '.exe .com .bat .cmd' or ''
for filename in filenames:
for name in [filename+ext for ext in suffixes.split()]:
for name in [filename+ext for ext in suffixes.split(' ')]:
for directory in paths:
full_path = os.path.join(directory, name)
if os.path.isfile(full_path):
@@ -28,53 +43,56 @@ def do_subst_in_file(targetfile, sourcefile, dict):
For example, if dict is {'%VERSION%': '1.2345', '%BASE%': 'MyProg'},
then all instances of %VERSION% in the file will be replaced with 1.2345 etc.
"""
try:
f = open(sourcefile, 'rb')
with open(sourcefile, 'r') as f:
contents = f.read()
f.close()
except:
print("Can't read source file %s"%sourcefile)
raise
for (k,v) in list(dict.items()):
v = v.replace('\\','\\\\')
contents = re.sub(k, v, contents)
try:
f = open(targetfile, 'wb')
with open(targetfile, 'w') as f:
f.write(contents)
f.close()
def getstatusoutput(cmd):
"""cmd is a list.
"""
try:
process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
output, _ = process.communicate()
status = process.returncode
except:
print("Can't write target file %s"%targetfile)
raise
status = -1
output = traceback.format_exc()
return status, output
def run_cmd(cmd, silent=False):
"""Raise exception on failure.
"""
info = 'Running: %r in %r' %(' '.join(cmd), os.getcwd())
print(info)
sys.stdout.flush()
if silent:
status, output = getstatusoutput(cmd)
else:
status, output = subprocess.call(cmd), ''
if status:
msg = 'Error while %s ...\n\terror=%d, output="""%s"""' %(info, status, output)
raise Exception(msg)
def assert_is_exe(path):
if not path:
raise Exception('path is empty.')
if not os.path.isfile(path):
raise Exception('%r is not a file.' %path)
if not os.access(path, os.X_OK):
raise Exception('%r is not executable by this user.' %path)
def run_doxygen(doxygen_path, config_file, working_dir, is_silent):
config_file = os.path.abspath( config_file )
doxygen_path = doxygen_path
old_cwd = os.getcwd()
try:
os.chdir( working_dir )
assert_is_exe(doxygen_path)
config_file = os.path.abspath(config_file)
with cd(working_dir):
cmd = [doxygen_path, config_file]
print('Running:', ' '.join( cmd ))
try:
import subprocess
except:
if os.system( ' '.join( cmd ) ) != 0:
print('Documentation generation failed')
return False
else:
if is_silent:
process = subprocess.Popen( cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT )
else:
process = subprocess.Popen( cmd )
stdout, _ = process.communicate()
if process.returncode:
print('Documentation generation failed:')
print(stdout)
return False
return True
finally:
os.chdir( old_cwd )
run_cmd(cmd, is_silent)
def build_doc( options, make_release=False ):
def build_doc(options, make_release=False):
if make_release:
options.make_tarball = True
options.with_dot = True
@@ -83,62 +101,62 @@ def build_doc( options, make_release=False ):
options.open = False
options.silent = True
version = open('version','rt').read().strip()
version = open('version', 'rt').read().strip()
output_dir = 'dist/doxygen' # relative to doc/doxyfile location.
if not os.path.isdir( output_dir ):
os.makedirs( output_dir )
top_dir = os.path.abspath( '.' )
if not os.path.isdir(output_dir):
os.makedirs(output_dir)
top_dir = os.path.abspath('.')
html_output_dirname = 'jsoncpp-api-html-' + version
tarball_path = os.path.join( 'dist', html_output_dirname + '.tar.gz' )
warning_log_path = os.path.join( output_dir, '../jsoncpp-doxygen-warning.log' )
html_output_path = os.path.join( output_dir, html_output_dirname )
def yesno( bool ):
tarball_path = os.path.join('dist', html_output_dirname + '.tar.gz')
warning_log_path = os.path.join(output_dir, '../jsoncpp-doxygen-warning.log')
html_output_path = os.path.join(output_dir, html_output_dirname)
def yesno(bool):
return bool and 'YES' or 'NO'
subst_keys = {
'%JSONCPP_VERSION%': version,
'%DOC_TOPDIR%': '',
'%TOPDIR%': top_dir,
'%HTML_OUTPUT%': os.path.join( '..', output_dir, html_output_dirname ),
'%HTML_OUTPUT%': os.path.join('..', output_dir, html_output_dirname),
'%HAVE_DOT%': yesno(options.with_dot),
'%DOT_PATH%': os.path.split(options.dot_path)[0],
'%HTML_HELP%': yesno(options.with_html_help),
'%UML_LOOK%': yesno(options.with_uml_look),
'%WARNING_LOG_PATH%': os.path.join( '..', warning_log_path )
'%WARNING_LOG_PATH%': os.path.join('..', warning_log_path)
}
if os.path.isdir( output_dir ):
if os.path.isdir(output_dir):
print('Deleting directory:', output_dir)
shutil.rmtree( output_dir )
if not os.path.isdir( output_dir ):
os.makedirs( output_dir )
shutil.rmtree(output_dir)
if not os.path.isdir(output_dir):
os.makedirs(output_dir)
do_subst_in_file( 'doc/doxyfile', 'doc/doxyfile.in', subst_keys )
ok = run_doxygen( options.doxygen_path, 'doc/doxyfile', 'doc', is_silent=options.silent )
do_subst_in_file('doc/doxyfile', options.doxyfile_input_path, subst_keys)
run_doxygen(options.doxygen_path, 'doc/doxyfile', 'doc', is_silent=options.silent)
if not options.silent:
print(open(warning_log_path, 'rb').read())
print(open(warning_log_path, 'r').read())
index_path = os.path.abspath(os.path.join('doc', subst_keys['%HTML_OUTPUT%'], 'index.html'))
print('Generated documentation can be found in:')
print(index_path)
if options.open:
import webbrowser
webbrowser.open( 'file://' + index_path )
webbrowser.open('file://' + index_path)
if options.make_tarball:
print('Generating doc tarball to', tarball_path)
tarball_sources = [
output_dir,
'README.txt',
'README.md',
'LICENSE',
'NEWS.txt',
'version'
]
tarball_basedir = os.path.join( output_dir, html_output_dirname )
tarball.make_tarball( tarball_path, tarball_sources, tarball_basedir, html_output_dirname )
tarball_basedir = os.path.join(output_dir, html_output_dirname)
tarball.make_tarball(tarball_path, tarball_sources, tarball_basedir, html_output_dirname)
return tarball_path, html_output_dirname
def main():
usage = """%prog
Generates doxygen documentation in build/doxygen.
Optionaly makes a tarball of the documentation to dist/.
Optionally makes a tarball of the documentation to dist/.
Must be started in the project top directory.
"""
@@ -151,6 +169,8 @@ def main():
help="""Path to GraphViz dot tool. Must be full qualified path. [Default: %default]""")
parser.add_option('--doxygen', dest="doxygen_path", action='store', default=find_program('doxygen'),
help="""Path to Doxygen tool. [Default: %default]""")
parser.add_option('--in', dest="doxyfile_input_path", action='store', default='doc/doxyfile.in',
help="""Path to doxygen inputs. [Default: %default]""")
parser.add_option('--with-html-help', dest="with_html_help", action='store_true', default=False,
help="""Enable generation of Microsoft HTML HELP""")
parser.add_option('--no-uml-look', dest="with_uml_look", action='store_false', default=True,
@@ -163,7 +183,7 @@ def main():
help="""Hides doxygen output""")
parser.enable_interspersed_args()
options, args = parser.parse_args()
build_doc( options )
build_doc(options)
if __name__ == '__main__':
main()


@@ -1,2 +1,6 @@
FILE(GLOB INCLUDE_FILES "json/*.h")
INSTALL(FILES ${INCLUDE_FILES} DESTINATION ${INCLUDE_INSTALL_DIR}/json)
file(GLOB INCLUDE_FILES "json/*.h")
install(FILES
${INCLUDE_FILES}
${PROJECT_BINARY_DIR}/include/json/version.h
DESTINATION ${CMAKE_INSTALL_INCLUDEDIR}/json)

include/json/allocator.h (new file, 89 lines)

@@ -0,0 +1,89 @@
// Copyright 2007-2010 Baptiste Lepilleur and The JsonCpp Authors
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
#ifndef CPPTL_JSON_ALLOCATOR_H_INCLUDED
#define CPPTL_JSON_ALLOCATOR_H_INCLUDED
#include <cstring>
#include <memory>
#pragma pack(push, 8)
namespace Json {
template <typename T> class SecureAllocator {
public:
// Type definitions
using value_type = T;
using pointer = T*;
using const_pointer = const T*;
using reference = T&;
using const_reference = const T&;
using size_type = std::size_t;
using difference_type = std::ptrdiff_t;
/**
* Allocate memory for N items using the standard allocator.
*/
pointer allocate(size_type n) {
// allocate using "global operator new"
return static_cast<pointer>(::operator new(n * sizeof(T)));
}
/**
* Release memory which was allocated for N items at pointer P.
*
* The memory block is filled with zeroes before being released.
* The pointer argument is tagged as "volatile" to prevent the
* compiler optimizing out this critical step.
*/
void deallocate(volatile pointer p, size_type n) {
std::memset(p, 0, n * sizeof(T));
// free using "global operator delete"
::operator delete(p);
}
/**
* Construct an item in-place at pointer P.
*/
template <typename... Args> void construct(pointer p, Args&&... args) {
// construct using "placement new" and "perfect forwarding"
::new (static_cast<void*>(p)) T(std::forward<Args>(args)...);
}
size_type max_size() const { return size_t(-1) / sizeof(T); }
pointer address(reference x) const { return std::addressof(x); }
const_pointer address(const_reference x) const { return std::addressof(x); }
/**
* Destroy an item in-place at pointer P.
*/
void destroy(pointer p) {
// destroy using "explicit destructor"
p->~T();
}
// Boilerplate
SecureAllocator() {}
template <typename U> SecureAllocator(const SecureAllocator<U>&) {}
template <typename U> struct rebind { using other = SecureAllocator<U>; };
};
template <typename T, typename U>
bool operator==(const SecureAllocator<T>&, const SecureAllocator<U>&) {
return true;
}
template <typename T, typename U>
bool operator!=(const SecureAllocator<T>&, const SecureAllocator<U>&) {
return false;
}
} // namespace Json
#pragma pack(pop)
#endif // CPPTL_JSON_ALLOCATOR_H_INCLUDED
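The new include/json/allocator.h above defines SecureAllocator, which zeroes memory before releasing it. A minimal sketch of plugging it into a standard container follows; the SecureString alias and the <json/allocator.h> include path are our assumptions, not part of the diff.

// Sketch only: wire SecureAllocator into std::basic_string so that any heap
// block the string releases is wiped in deallocate(). `SecureString` is a
// name invented here for illustration.
#include <string>
#include <json/allocator.h>   // assumed installed include path

using SecureString =
    std::basic_string<char, std::char_traits<char>, Json::SecureAllocator<char>>;

int main() {
  SecureString secret = "hunter2";
  secret += "-salt";          // reallocations go through allocate()
  return static_cast<int>(secret.size());
}                             // released buffers are memset to zero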


@@ -1,4 +1,4 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Copyright 2007-2010 Baptiste Lepilleur and The JsonCpp Authors
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
@@ -6,31 +6,48 @@
#ifndef CPPTL_JSON_ASSERTIONS_H_INCLUDED
#define CPPTL_JSON_ASSERTIONS_H_INCLUDED
#include <stdlib.h>
#include <cstdlib>
#include <sstream>
#if !defined(JSON_IS_AMALGAMATION)
#include "config.h"
#endif // if !defined(JSON_IS_AMALGAMATION)
/** It should not be possible for a maliciously designed file to
* cause an abort() or seg-fault, so these macros are used only
* for pre-condition violations and internal logic errors.
*/
#if JSON_USE_EXCEPTION
#include <stdexcept>
#define JSON_ASSERT(condition) \
assert(condition); // @todo <= change this into an exception throw
#define JSON_FAIL_MESSAGE(message) do{std::ostringstream oss; oss << message; throw std::runtime_error(oss.str());}while(0)
//#define JSON_FAIL_MESSAGE(message) throw std::runtime_error(message)
#else // JSON_USE_EXCEPTION
#define JSON_ASSERT(condition) assert(condition);
// The call to assert() will show the failure message in debug builds. In
// release bugs we abort, for a core-dump or debugger.
// @todo <= add detail about condition in exception
#define JSON_ASSERT(condition) \
{ \
if (!(condition)) { \
Json::throwLogicError("assert json failed"); \
} \
}
#define JSON_FAIL_MESSAGE(message) \
{ \
std::ostringstream oss; oss << message; \
assert(false && oss.str().c_str()); \
OStringStream oss; \
oss << message; \
Json::throwLogicError(oss.str()); \
abort(); \
}
#else // JSON_USE_EXCEPTION
#define JSON_ASSERT(condition) assert(condition)
// The call to assert() will show the failure message in debug builds. In
// release builds we abort, for a core-dump or debugger.
#define JSON_FAIL_MESSAGE(message) \
{ \
OStringStream oss; \
oss << message; \
assert(false && oss.str().c_str()); \
abort(); \
}
#endif
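The reworked JSON_ASSERT/JSON_FAIL_MESSAGE macros above route failures through Json::throwLogicError() when JSON_USE_EXCEPTION is left at its default of 1. A minimal sketch of what that means for callers (the exception types come from value.h, not from this diff):

// Sketch only: with exceptions enabled, a bad conversion surfaces as a
// Json::LogicError (derived from Json::Exception) instead of assert()/abort().
#include <iostream>
#include <json/json.h>

int main() {
  Json::Value v = "not a number";
  try {
    std::cout << v.asInt() << '\n';      // triggers JSON_FAIL_MESSAGE internally
  } catch (const Json::Exception& e) {   // Json::LogicError derives from this
    std::cout << "caught: " << e.what() << '\n';
  }
  return 0;
}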


@@ -1,4 +1,4 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Copyright 2007-2010 Baptiste Lepilleur and The JsonCpp Authors
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE


@@ -1,10 +1,18 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Copyright 2007-2010 Baptiste Lepilleur and The JsonCpp Authors
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
#ifndef JSON_CONFIG_H_INCLUDED
#define JSON_CONFIG_H_INCLUDED
#include <cstddef>
#include <cstdint>
#include <istream>
#include <memory>
#include <ostream>
#include <sstream>
#include <string>
#include <type_traits>
/// If defined, indicates that json library is embedded in CppTL library.
//# define JSON_IN_CPPTL 1
@@ -15,17 +23,6 @@
/// std::map
/// as Value container.
//# define JSON_USE_CPPTL_SMALLMAP 1
/// If defined, indicates that Json specific container should be used
/// (hash table & simple deque container with customizable allocator).
/// THIS FEATURE IS STILL EXPERIMENTAL! There is know bugs: See #3177332
//# define JSON_VALUE_USE_INTERNAL_MAP 1
/// Force usage of standard new/malloc based allocator instead of memory pool
/// based allocator.
/// The memory pools allocator used optimization (initializing Value and
/// ValueInternalLink
/// as if it was a POD) that may cause some validation tool to report errors.
/// Only has effects if JSON_VALUE_USE_INTERNAL_MAP is defined.
//# define JSON_USE_SIMPLE_INTERNAL_ALLOCATOR 1
// If non-zero, the library uses exceptions to report bad input instead of C
// assertion macros. The default is to use exceptions.
@@ -33,9 +30,9 @@
#define JSON_USE_EXCEPTION 1
#endif
/// If defined, indicates that the source file is amalgated
/// If defined, indicates that the source file is amalgamated
/// to prevent private header inclusion.
/// Remarks: it is automatically defined in the generated amalgated header.
/// Remarks: it is automatically defined in the generated amalgamated header.
// #define JSON_IS_AMALGAMATION
#ifdef JSON_IN_CPPTL
@@ -48,12 +45,14 @@
#ifdef JSON_IN_CPPTL
#define JSON_API CPPTL_API
#elif defined(JSON_DLL_BUILD)
#if defined(_MSC_VER)
#if defined(_MSC_VER) || defined(__MINGW32__)
#define JSON_API __declspec(dllexport)
#define JSONCPP_DISABLE_DLL_INTERFACE_WARNING
#elif defined(__GNUC__) || defined(__clang__)
#define JSON_API __attribute__((visibility("default")))
#endif // if defined(_MSC_VER)
#elif defined(JSON_DLL)
#if defined(_MSC_VER)
#if defined(_MSC_VER) || defined(__MINGW32__)
#define JSON_API __declspec(dllimport)
#define JSONCPP_DISABLE_DLL_INTERFACE_WARNING
#endif // if defined(_MSC_VER)
@@ -62,31 +61,74 @@
#define JSON_API
#endif
#if defined(_MSC_VER) && _MSC_VER < 1800
#error \
"ERROR: Visual Studio 12 (2013) with _MSC_VER=1800 is the oldest supported compiler with sufficient C++11 capabilities"
#endif
#if defined(_MSC_VER) && _MSC_VER < 1900
// As recommended at
// https://stackoverflow.com/questions/2915672/snprintf-and-visual-studio-2010
extern JSON_API int
msvc_pre1900_c99_snprintf(char* outBuf, size_t size, const char* format, ...);
#define jsoncpp_snprintf msvc_pre1900_c99_snprintf
#else
#define jsoncpp_snprintf std::snprintf
#endif
// If JSON_NO_INT64 is defined, then Json only support C++ "int" type for
// integer
// Storages, and 64 bits integer support is disabled.
// #define JSON_NO_INT64 1
#if defined(_MSC_VER) && _MSC_VER <= 1200 // MSVC 6
// Microsoft Visual Studio 6 only support conversion from __int64 to double
// (no conversion from unsigned __int64).
#define JSON_USE_INT64_DOUBLE_CONVERSION 1
// Disable warning 4786 for VS6 caused by STL (identifier was truncated to '255'
// characters in the debug information)
// All projects I've ever seen with VS6 were using this globally (not bothering
// with pragma push/pop).
#pragma warning(disable : 4786)
#endif // if defined(_MSC_VER) && _MSC_VER < 1200 // MSVC 6
// JSONCPP_OVERRIDE is maintained for backwards compatibility of external tools.
// C++11 should be used directly in JSONCPP.
#define JSONCPP_OVERRIDE override
#if defined(_MSC_VER) && _MSC_VER >= 1500 // MSVC 2008
/// Indicates that the following function is deprecated.
#define JSONCPP_DEPRECATED(message) __declspec(deprecated(message))
#if __cplusplus >= 201103L
#define JSONCPP_NOEXCEPT noexcept
#define JSONCPP_OP_EXPLICIT explicit
#elif defined(_MSC_VER) && _MSC_VER < 1900
#define JSONCPP_NOEXCEPT throw()
#define JSONCPP_OP_EXPLICIT explicit
#elif defined(_MSC_VER) && _MSC_VER >= 1900
#define JSONCPP_NOEXCEPT noexcept
#define JSONCPP_OP_EXPLICIT explicit
#else
#define JSONCPP_NOEXCEPT throw()
#define JSONCPP_OP_EXPLICIT
#endif
#ifdef __clang__
#if __has_extension(attribute_deprecated_with_message)
#define JSONCPP_DEPRECATED(message) __attribute__((deprecated(message)))
#endif
#elif defined __GNUC__ // not clang (gcc comes later since clang emulates gcc)
#if (__GNUC__ > 4 || (__GNUC__ == 4 && __GNUC_MINOR__ >= 5))
#define JSONCPP_DEPRECATED(message) __attribute__((deprecated(message)))
#elif (__GNUC__ > 3 || (__GNUC__ == 3 && __GNUC_MINOR__ >= 1))
#define JSONCPP_DEPRECATED(message) __attribute__((__deprecated__))
#endif // GNUC version
#elif defined(_MSC_VER) // MSVC (after clang because clang on Windows emulates
// MSVC)
#define JSONCPP_DEPRECATED(message) __declspec(deprecated(message))
#endif // __clang__ || __GNUC__ || _MSC_VER
#if !defined(JSONCPP_DEPRECATED)
#define JSONCPP_DEPRECATED(message)
#endif // if !defined(JSONCPP_DEPRECATED)
#if __GNUC__ >= 6
#define JSON_USE_INT64_DOUBLE_CONVERSION 1
#endif
#if !defined(JSON_IS_AMALGAMATION)
#include "allocator.h"
#include "version.h"
#endif // if !defined(JSON_IS_AMALGAMATION)
namespace Json {
typedef int Int;
typedef unsigned int UInt;
@@ -100,13 +142,34 @@ typedef unsigned int LargestUInt;
typedef __int64 Int64;
typedef unsigned __int64 UInt64;
#else // if defined(_MSC_VER) // Other platforms, use long long
typedef long long int Int64;
typedef unsigned long long int UInt64;
#endif // if defined(_MSC_VER)
typedef int64_t Int64;
typedef uint64_t UInt64;
#endif // if defined(_MSC_VER)
typedef Int64 LargestInt;
typedef UInt64 LargestUInt;
#define JSON_HAS_INT64
#endif // if defined(JSON_NO_INT64)
} // end namespace Json
template <typename T>
using Allocator = typename std::conditional<JSONCPP_USING_SECURE_MEMORY,
SecureAllocator<T>,
std::allocator<T>>::type;
using String = std::basic_string<char, std::char_traits<char>, Allocator<char>>;
using IStringStream = std::basic_istringstream<String::value_type,
String::traits_type,
String::allocator_type>;
using OStringStream = std::basic_ostringstream<String::value_type,
String::traits_type,
String::allocator_type>;
using IStream = std::istream;
using OStream = std::ostream;
} // namespace Json
// Legacy names (formerly macros).
using JSONCPP_STRING = Json::String;
using JSONCPP_ISTRINGSTREAM = Json::IStringStream;
using JSONCPP_OSTRINGSTREAM = Json::OStringStream;
using JSONCPP_ISTREAM = Json::IStream;
using JSONCPP_OSTREAM = Json::OStream;
#endif // JSON_CONFIG_H_INCLUDED
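config.h now centralises the string and stream aliases (Json::String, Json::IStringStream, Json::OStringStream) plus the legacy JSONCPP_* names. A minimal sketch of using them, assuming JSONCPP_USING_SECURE_MEMORY is off (its default), so Json::String behaves like std::string:

// Sketch only: the aliases declared at the end of the new config.h.
#include <iostream>
#include <json/json.h>

int main() {
  Json::Value root;
  root["name"] = "jsoncpp";

  Json::OStringStream oss;        // basic_ostringstream over Json::String
  oss << root;                    // streams via the library's operator<<
  JSONCPP_STRING s = oss.str();   // legacy alias; same type as Json::String
  std::cout << s;
  return 0;
}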


@@ -1,4 +1,4 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Copyright 2007-2010 Baptiste Lepilleur and The JsonCpp Authors
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
@@ -10,6 +10,8 @@
#include "forwards.h"
#endif // if !defined(JSON_IS_AMALGAMATION)
#pragma pack(push, 8)
namespace Json {
/** \brief Configuration passed to reader and writer.
@@ -39,19 +41,21 @@ public:
Features();
/// \c true if comments are allowed. Default: \c true.
bool allowComments_;
bool allowComments_{true};
/// \c true if root must be either an array or an object value. Default: \c
/// false.
bool strictRoot_;
bool strictRoot_{false};
/// \c true if dropped null placeholders are allowed. Default: \c false.
bool allowDroppedNullPlaceholders_;
bool allowDroppedNullPlaceholders_{false};
/// \c true if numeric object key are allowed. Default: \c false.
bool allowNumericKeys_;
bool allowNumericKeys_{false};
};
} // namespace Json
#pragma pack(pop)
#endif // CPPTL_JSON_FEATURES_H_INCLUDED
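features.h keeps the old Features knobs, now with in-class default initialisers. They only matter for the deprecated Json::Reader path; a minimal sketch follows, using the reader.h API shown further down this compare (expect deprecation warnings when compiling):

// Sketch only: legacy Features + Reader, both deprecated in 1.9.0 in favour
// of CharReaderBuilder (see reader.h below).
#include <iostream>
#include <json/json.h>

int main() {
  Json::Features features;          // defaults: comments allowed, non-strict root
  features.strictRoot_ = true;      // require the root to be an array or object

  Json::Reader reader(features);
  Json::Value root;
  if (!reader.parse("{\"ok\": true}", root)) {
    std::cerr << reader.getFormattedErrorMessages();
    return 1;
  }
  std::cout << root["ok"].asBool() << '\n';
  return 0;
}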


@@ -1,4 +1,4 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Copyright 2007-2010 Baptiste Lepilleur and The JsonCpp Authors
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
@@ -13,11 +13,17 @@
namespace Json {
// writer.h
class StreamWriter;
class StreamWriterBuilder;
class Writer;
class FastWriter;
class StyledWriter;
class StyledStreamWriter;
// reader.h
class Reader;
class CharReader;
class CharReaderBuilder;
// features.h
class Features;
@@ -31,12 +37,6 @@ class Value;
class ValueIteratorBase;
class ValueIterator;
class ValueConstIterator;
#ifdef JSON_VALUE_USE_INTERNAL_MAP
class ValueMapAllocator;
class ValueInternalLink;
class ValueInternalArray;
class ValueInternalMap;
#endif // #ifdef JSON_VALUE_USE_INTERNAL_MAP
} // namespace Json


@@ -1,4 +1,4 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Copyright 2007-2010 Baptiste Lepilleur and The JsonCpp Authors
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
@@ -7,9 +7,9 @@
#define JSON_JSON_H_INCLUDED
#include "autolink.h"
#include "value.h"
#include "reader.h"
#include "writer.h"
#include "features.h"
#include "reader.h"
#include "value.h"
#include "writer.h"
#endif // JSON_JSON_H_INCLUDED


@@ -1,4 +1,4 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Copyright 2007-2010 Baptiste Lepilleur and The JsonCpp Authors
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
@@ -12,6 +12,7 @@
#endif // if !defined(JSON_IS_AMALGAMATION)
#include <deque>
#include <iosfwd>
#include <istream>
#include <stack>
#include <string>
@@ -22,11 +23,14 @@
#pragma warning(disable : 4251)
#endif // if defined(JSONCPP_DISABLE_DLL_INTERFACE_WARNING)
#pragma pack(push, 8)
namespace Json {
/** \brief Unserialize a <a HREF="http://www.json.org">JSON</a> document into a
*Value.
*
* \deprecated Use CharReader and CharReaderBuilder.
*/
class JSON_API Reader {
public:
@@ -40,19 +44,21 @@ public:
*
*/
struct StructuredError {
size_t offset_start;
size_t offset_limit;
std::string message;
ptrdiff_t offset_start;
ptrdiff_t offset_limit;
String message;
};
/** \brief Constructs a Reader allowing all features
* for parsing.
*/
JSONCPP_DEPRECATED("Use CharReader and CharReaderBuilder instead")
Reader();
/** \brief Constructs a Reader allowing the specified feature set
* for parsing.
*/
JSONCPP_DEPRECATED("Use CharReader and CharReaderBuilder instead")
Reader(const Features& features);
/** \brief Read a Value from a <a HREF="http://www.json.org">JSON</a>
@@ -78,7 +84,7 @@ public:
document to read.
* \param endDoc Pointer on the end of the UTF-8 encoded string of the
document to read.
\ Must be >= beginDoc.
* Must be >= beginDoc.
* \param root [out] Contains the root value of the document if it was
* successfully parsed.
* \param collectComments \c true to collect comment and allow writing them
@@ -97,7 +103,7 @@ public:
/// \brief Parse from input stream.
/// \see Json::operator>>(std::istream&, Json::Value&).
bool parse(std::istream& is, Value& root, bool collectComments = true);
bool parse(IStream& is, Value& root, bool collectComments = true);
/** \brief Returns a user friendly string that lists errors in the parsed
* document.
@@ -108,8 +114,8 @@ public:
* during parsing.
* \deprecated Use getFormattedErrorMessages() instead (typo fix).
*/
JSONCPP_DEPRECATED("Use getFormattedErrorMessages instead")
std::string getFormatedErrorMessages() const;
JSONCPP_DEPRECATED("Use getFormattedErrorMessages() instead.")
String getFormatedErrorMessages() const;
/** \brief Returns a user friendly string that lists errors in the parsed
* document.
@@ -119,7 +125,7 @@ public:
* occurred
* during parsing.
*/
std::string getFormattedErrorMessages() const;
String getFormattedErrorMessages() const;
/** \brief Returns a vector of structured errors encountered while parsing.
* \return A (possibly empty) vector of StructuredError objects. Currently
@@ -136,7 +142,7 @@ public:
* \return \c true if the error was successfully added, \c false if the
* Value offset exceeds the document size.
*/
bool pushError(const Value& value, const std::string& message);
bool pushError(const Value& value, const String& message);
/** \brief Add a semantic error message with extra context.
* \param value JSON Value location associated with the error
@@ -145,7 +151,7 @@ public:
* \return \c true if the error was successfully added, \c false if either
* Value offset exceeds the document size.
*/
bool pushError(const Value& value, const std::string& message, const Value& extra);
bool pushError(const Value& value, const String& message, const Value& extra);
/** \brief Return whether there are any errors.
* \return \c true if there are no errors to report \c false if
@@ -181,7 +187,7 @@ private:
class ErrorInfo {
public:
Token token_;
std::string message_;
String message_;
Location extra_;
};
@@ -201,7 +207,7 @@ private:
bool decodeNumber(Token& token);
bool decodeNumber(Token& token, Value& decoded);
bool decodeString(Token& token);
bool decodeString(Token& token, std::string& decoded);
bool decodeString(Token& token, String& decoded);
bool decodeDouble(Token& token);
bool decodeDouble(Token& token, Value& decoded);
bool decodeUnicodeCodePoint(Token& token,
@@ -212,9 +218,9 @@ private:
Location& current,
Location end,
unsigned int& unicode);
bool addError(const std::string& message, Token& token, Location extra = 0);
bool addError(const String& message, Token& token, Location extra = nullptr);
bool recoverFromError(TokenType skipUntilToken);
bool addErrorAndRecover(const std::string& message,
bool addErrorAndRecover(const String& message,
Token& token,
TokenType skipUntilToken);
void skipUntilSpace();
@@ -222,24 +228,158 @@ private:
Char getNextChar();
void
getLocationLineAndColumn(Location location, int& line, int& column) const;
std::string getLocationLineAndColumn(Location location) const;
String getLocationLineAndColumn(Location location) const;
void addComment(Location begin, Location end, CommentPlacement placement);
void skipCommentTokens(Token& token);
static bool containsNewLine(Location begin, Location end);
static String normalizeEOL(Location begin, Location end);
typedef std::stack<Value*> Nodes;
Nodes nodes_;
Errors errors_;
std::string document_;
Location begin_;
Location end_;
Location current_;
Location lastValueEnd_;
Value* lastValue_;
std::string commentsBefore_;
String document_;
Location begin_{};
Location end_{};
Location current_{};
Location lastValueEnd_{};
Value* lastValue_{};
String commentsBefore_;
Features features_;
bool collectComments_;
bool collectComments_{};
}; // Reader
/** Interface for reading JSON from a char array.
*/
class JSON_API CharReader {
public:
virtual ~CharReader() = default;
/** \brief Read a Value from a <a HREF="http://www.json.org">JSON</a>
document.
* The document must be a UTF-8 encoded string containing the document to
read.
*
* \param beginDoc Pointer on the beginning of the UTF-8 encoded string of the
document to read.
* \param endDoc Pointer on the end of the UTF-8 encoded string of the
document to read.
* Must be >= beginDoc.
* \param root [out] Contains the root value of the document if it was
* successfully parsed.
* \param errs [out] Formatted error messages (if not NULL)
* a user friendly string that lists errors in the parsed
* document.
* \return \c true if the document was successfully parsed, \c false if an
error occurred.
*/
virtual bool parse(char const* beginDoc,
char const* endDoc,
Value* root,
String* errs) = 0;
class JSON_API Factory {
public:
virtual ~Factory() = default;
/** \brief Allocate a CharReader via operator new().
* \throw std::exception if something goes wrong (e.g. invalid settings)
*/
virtual CharReader* newCharReader() const = 0;
}; // Factory
}; // CharReader
/** \brief Build a CharReader implementation.
Usage:
\code
using namespace Json;
CharReaderBuilder builder;
builder["collectComments"] = false;
Value value;
String errs;
bool ok = parseFromStream(builder, std::cin, &value, &errs);
\endcode
*/
class JSON_API CharReaderBuilder : public CharReader::Factory {
public:
// Note: We use a Json::Value so that we can add data-members to this class
// without a major version bump.
/** Configuration of this builder.
These are case-sensitive.
Available settings (case-sensitive):
- `"collectComments": false or true`
- true to collect comment and allow writing them
back during serialization, false to discard comments.
This parameter is ignored if allowComments is false.
- `"allowComments": false or true`
- true if comments are allowed.
- `"strictRoot": false or true`
- true if root must be either an array or an object value
- `"allowDroppedNullPlaceholders": false or true`
- true if dropped null placeholders are allowed. (See
StreamWriterBuilder.)
- `"allowNumericKeys": false or true`
- true if numeric object keys are allowed.
- `"allowSingleQuotes": false or true`
- true if '' are allowed for strings (both keys and values)
- `"stackLimit": integer`
- Exceeding stackLimit (recursive depth of `readValue()`) will
cause an exception.
- This is a security issue (seg-faults caused by deeply nested JSON),
so the default is low.
- `"failIfExtra": false or true`
- If true, `parse()` returns false when extra non-whitespace trails
the JSON value in the input string.
- `"rejectDupKeys": false or true`
- If true, `parse()` returns false when a key is duplicated within an
object.
- `"allowSpecialFloats": false or true`
- If true, special float values (NaNs and infinities) are allowed
and their values are losslessly restorable.
You can examine 'settings_` yourself
to see the defaults. You can also write and read them just like any
JSON Value.
\sa setDefaults()
*/
Json::Value settings_;
CharReaderBuilder();
~CharReaderBuilder() override;
CharReader* newCharReader() const override;
/** \return true if 'settings' are legal and consistent;
* otherwise, indicate bad settings via 'invalid'.
*/
bool validate(Json::Value* invalid) const;
/** A simple way to update a specific setting.
*/
Value& operator[](const String& key);
/** Called by ctor, but you can use this to reset settings_.
* \pre 'settings' != NULL (but Json::null is fine)
* \remark Defaults:
* \snippet src/lib_json/json_reader.cpp CharReaderBuilderDefaults
*/
static void setDefaults(Json::Value* settings);
/** Same as old Features::strictMode().
* \pre 'settings' != NULL (but Json::null is fine)
* \remark Defaults:
* \snippet src/lib_json/json_reader.cpp CharReaderBuilderStrictMode
*/
static void strictMode(Json::Value* settings);
};
/** Consume entire stream and use its begin/end.
* Someday we might have a real StreamReader, but for now this
* is convenient.
*/
bool JSON_API parseFromStream(CharReader::Factory const&,
IStream&,
Value* root,
std::string* errs);
/** \brief Read from 'sin' into 'root'.
Always keep comments from the input JSON.
@@ -264,10 +404,12 @@ private:
\throw std::exception on parse error.
\see Json::operator<<()
*/
JSON_API std::istream& operator>>(std::istream&, Value&);
JSON_API IStream& operator>>(IStream&, Value&);
} // namespace Json
#pragma pack(pop)
#if defined(JSONCPP_DISABLE_DLL_INTERFACE_WARNING)
#pragma warning(pop)
#endif // if defined(JSONCPP_DISABLE_DLL_INTERFACE_WARNING)
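The new CharReader/CharReaderBuilder interface above documents its settings inline. A minimal sketch that exercises two of them together with the char-range parse() declared in the diff (the sample document is ours):

// Sketch only: configure a CharReaderBuilder and parse from a char range.
#include <iostream>
#include <memory>
#include <string>
#include <json/json.h>

int main() {
  const std::string doc = "{\"name\": \"jsoncpp\", \"major\": 1}";

  Json::CharReaderBuilder builder;
  builder["collectComments"] = false;   // discard comments while parsing
  builder["failIfExtra"] = true;        // reject trailing non-whitespace

  const std::unique_ptr<Json::CharReader> reader(builder.newCharReader());
  Json::Value root;
  Json::String errs;
  if (!reader->parse(doc.data(), doc.data() + doc.size(), &root, &errs)) {
    std::cerr << errs;
    return 1;
  }
  std::cout << root["name"].asString() << '\n';
  return 0;
}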

File diff suppressed because it is too large.


@@ -1,14 +0,0 @@
// DO NOT EDIT. This file is generated by CMake from "version"
// and "version.h.in" files.
// Run CMake configure step to update it.
#ifndef JSON_VERSION_H_INCLUDED
# define JSON_VERSION_H_INCLUDED
# define JSONCPP_VERSION_STRING "1.3.0"
# define JSONCPP_VERSION_MAJOR 1
# define JSONCPP_VERSION_MINOR 3
# define JSONCPP_VERSION_PATCH 0
# define JSONCPP_VERSION_QUALIFIER
# define JSONCPP_VERSION_HEXA ((JSONCPP_VERSION_MAJOR << 24) | (JSONCPP_VERSION_MINOR << 16) | (JSONCPP_VERSION_PATCH << 8))
#endif // JSON_VERSION_H_INCLUDED


@@ -1,4 +1,4 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Copyright 2007-2010 Baptiste Lepilleur and The JsonCpp Authors
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
@@ -9,27 +9,147 @@
#if !defined(JSON_IS_AMALGAMATION)
#include "value.h"
#endif // if !defined(JSON_IS_AMALGAMATION)
#include <vector>
#include <ostream>
#include <string>
#include <vector>
// Disable warning C4251: <data member>: <type> needs to have dll-interface to
// be used by...
#if defined(JSONCPP_DISABLE_DLL_INTERFACE_WARNING)
#if defined(JSONCPP_DISABLE_DLL_INTERFACE_WARNING) && defined(_MSC_VER)
#pragma warning(push)
#pragma warning(disable : 4251)
#endif // if defined(JSONCPP_DISABLE_DLL_INTERFACE_WARNING)
#pragma pack(push, 8)
namespace Json {
class Value;
/** \brief Abstract class for writers.
/**
Usage:
\code
using namespace Json;
void writeToStdout(StreamWriter::Factory const& factory, Value const& value) {
std::unique_ptr<StreamWriter> const writer(
factory.newStreamWriter());
writer->write(value, &std::cout);
std::cout << std::endl; // add lf and flush
}
\endcode
*/
class JSON_API StreamWriter {
protected:
OStream* sout_; // not owned; will not delete
public:
StreamWriter();
virtual ~StreamWriter();
/** Write Value into document as configured in sub-class.
Do not take ownership of sout, but maintain a reference during function.
\pre sout != NULL
\return zero on success (For now, we always return zero, so check the
stream instead.) \throw std::exception possibly, depending on configuration
*/
virtual int write(Value const& root, OStream* sout) = 0;
/** \brief A simple abstract factory.
*/
class JSON_API Factory {
public:
virtual ~Factory();
/** \brief Allocate a CharReader via operator new().
* \throw std::exception if something goes wrong (e.g. invalid settings)
*/
virtual StreamWriter* newStreamWriter() const = 0;
}; // Factory
}; // StreamWriter
/** \brief Write into stringstream, then return string, for convenience.
* A StreamWriter will be created from the factory, used, and then deleted.
*/
class JSON_API Writer {
String JSON_API writeString(StreamWriter::Factory const& factory,
Value const& root);
/** \brief Build a StreamWriter implementation.
Usage:
\code
using namespace Json;
Value value = ...;
StreamWriterBuilder builder;
builder["commentStyle"] = "None";
builder["indentation"] = " "; // or whatever you like
std::unique_ptr<Json::StreamWriter> writer(
builder.newStreamWriter());
writer->write(value, &std::cout);
std::cout << std::endl; // add lf and flush
\endcode
*/
class JSON_API StreamWriterBuilder : public StreamWriter::Factory {
public:
// Note: We use a Json::Value so that we can add data-members to this class
// without a major version bump.
/** Configuration of this builder.
Available settings (case-sensitive):
- "commentStyle": "None" or "All"
- "indentation": "<anything>".
- Setting this to an empty string also omits newline characters.
- "enableYAMLCompatibility": false or true
- slightly change the whitespace around colons
- "dropNullPlaceholders": false or true
- Drop the "null" string from the writer's output for nullValues.
Strictly speaking, this is not valid JSON. But when the output is being
fed to a browser's JavaScript, it makes for smaller output and the
browser can handle the output just fine.
- "useSpecialFloats": false or true
- If true, outputs non-finite floating point values in the following way:
NaN values as "NaN", positive infinity as "Infinity", and negative
infinity as "-Infinity".
- "precision": int
- Number of precision digits for formatting of real values.
- "precisionType": "significant"(default) or "decimal"
- Type of precision for formatting of real values.
You can examine 'settings_` yourself
to see the defaults. You can also write and read them just like any
JSON Value.
\sa setDefaults()
*/
Json::Value settings_;
StreamWriterBuilder();
~StreamWriterBuilder() override;
/**
* \throw std::exception if something goes wrong (e.g. invalid settings)
*/
StreamWriter* newStreamWriter() const override;
/** \return true if 'settings' are legal and consistent;
* otherwise, indicate bad settings via 'invalid'.
*/
bool validate(Json::Value* invalid) const;
/** A simple way to update a specific setting.
*/
Value& operator[](const String& key);
/** Called by ctor, but you can use this to reset settings_.
* \pre 'settings' != NULL (but Json::null is fine)
* \remark Defaults:
* \snippet src/lib_json/json_writer.cpp StreamWriterBuilderDefaults
*/
static void setDefaults(Json::Value* settings);
};
/** \brief Abstract class for writers.
* \deprecated Use StreamWriter. (And really, this is an implementation detail.)
*/
class JSONCPP_DEPRECATED("Use StreamWriter instead") JSON_API Writer {
public:
virtual ~Writer();
virtual std::string write(const Value& root) = 0;
virtual String write(const Value& root) = 0;
};
/** \brief Outputs a Value in <a HREF="http://www.json.org">JSON</a> format
@@ -37,19 +157,25 @@ public:
*
* The JSON document is written in a single line. It is not intended for 'human'
*consumption,
* but may be usefull to support feature such as RPC where bandwith is limited.
* but may be useful to support features such as RPC where bandwidth is limited.
* \sa Reader, Value
* \deprecated Use StreamWriterBuilder.
*/
class JSON_API FastWriter : public Writer {
#if defined(_MSC_VER)
#pragma warning(push)
#pragma warning(disable : 4996) // Deriving from deprecated class
#endif
class JSONCPP_DEPRECATED("Use StreamWriterBuilder instead") JSON_API FastWriter
: public Writer {
public:
FastWriter();
virtual ~FastWriter() {}
~FastWriter() override = default;
void enableYAMLCompatibility();
/** \brief Drop the "null" string from the writer's output for nullValues.
* Strictly speaking, this is not valid JSON. But when the output is being
* fed to a browser's Javascript, it makes for smaller output and the
* fed to a browser's JavaScript, it makes for smaller output and the
* browser can handle the output just fine.
*/
void dropNullPlaceholders();
@@ -57,16 +183,19 @@ public:
void omitEndingLineFeed();
public: // overridden from Writer
virtual std::string write(const Value& root);
String write(const Value& root) override;
private:
void writeValue(const Value& value);
std::string document_;
bool yamlCompatiblityEnabled_;
bool dropNullPlaceholders_;
bool omitEndingLineFeed_;
String document_;
bool yamlCompatibilityEnabled_{false};
bool dropNullPlaceholders_{false};
bool omitEndingLineFeed_{false};
};
#if defined(_MSC_VER)
#pragma warning(pop)
#endif
/** \brief Writes a Value in <a HREF="http://www.json.org">JSON</a> format in a
*human friendly way.
@@ -90,42 +219,51 @@ private:
*#CommentPlacement.
*
* \sa Reader, Value, Value::setComment()
* \deprecated Use StreamWriterBuilder.
*/
class JSON_API StyledWriter : public Writer {
#if defined(_MSC_VER)
#pragma warning(push)
#pragma warning(disable : 4996) // Deriving from deprecated class
#endif
class JSONCPP_DEPRECATED("Use StreamWriterBuilder instead") JSON_API
StyledWriter : public Writer {
public:
StyledWriter();
virtual ~StyledWriter() {}
~StyledWriter() override = default;
public: // overridden from Writer
/** \brief Serialize a Value in <a HREF="http://www.json.org">JSON</a> format.
* \param root Value to serialize.
* \return String containing the JSON document that represents the root value.
*/
virtual std::string write(const Value& root);
String write(const Value& root) override;
private:
void writeValue(const Value& value);
void writeArrayValue(const Value& value);
bool isMultineArray(const Value& value);
void pushValue(const std::string& value);
bool isMultilineArray(const Value& value);
void pushValue(const String& value);
void writeIndent();
void writeWithIndent(const std::string& value);
void writeWithIndent(const String& value);
void indent();
void unindent();
void writeCommentBeforeValue(const Value& root);
void writeCommentAfterValueOnSameLine(const Value& root);
bool hasCommentForValue(const Value& value);
static std::string normalizeEOL(const std::string& text);
static bool hasCommentForValue(const Value& value);
static String normalizeEOL(const String& text);
typedef std::vector<std::string> ChildValues;
typedef std::vector<String> ChildValues;
ChildValues childValues_;
std::string document_;
std::string indentString_;
int rightMargin_;
int indentSize_;
bool addChildValues_;
String document_;
String indentString_;
unsigned int rightMargin_{74};
unsigned int indentSize_{3};
bool addChildValues_{false};
};
#if defined(_MSC_VER)
#pragma warning(pop)
#endif
/** \brief Writes a Value in <a HREF="http://www.json.org">JSON</a> format in a
human friendly way,
@@ -149,13 +287,21 @@ private:
* If the Value has comments then they are output according to their
#CommentPlacement.
*
* \param indentation Each level will be indented by this amount extra.
* \sa Reader, Value, Value::setComment()
* \deprecated Use StreamWriterBuilder.
*/
class JSON_API StyledStreamWriter {
#if defined(_MSC_VER)
#pragma warning(push)
#pragma warning(disable : 4996) // Deriving from deprecated class
#endif
class JSONCPP_DEPRECATED("Use StreamWriterBuilder instead") JSON_API
StyledStreamWriter {
public:
StyledStreamWriter(std::string indentation = "\t");
~StyledStreamWriter() {}
/**
* \param indentation Each level will be indented by this amount extra.
*/
StyledStreamWriter(String indentation = "\t");
~StyledStreamWriter() = default;
public:
/** \brief Serialize a Value in <a HREF="http://www.json.org">JSON</a> format.
@@ -164,48 +310,57 @@ public:
* \note There is no point in deriving from Writer, since write() should not
* return a value.
*/
void write(std::ostream& out, const Value& root);
void write(OStream& out, const Value& root);
private:
void writeValue(const Value& value);
void writeArrayValue(const Value& value);
bool isMultineArray(const Value& value);
void pushValue(const std::string& value);
bool isMultilineArray(const Value& value);
void pushValue(const String& value);
void writeIndent();
void writeWithIndent(const std::string& value);
void writeWithIndent(const String& value);
void indent();
void unindent();
void writeCommentBeforeValue(const Value& root);
void writeCommentAfterValueOnSameLine(const Value& root);
bool hasCommentForValue(const Value& value);
static std::string normalizeEOL(const std::string& text);
static bool hasCommentForValue(const Value& value);
static String normalizeEOL(const String& text);
typedef std::vector<std::string> ChildValues;
typedef std::vector<String> ChildValues;
ChildValues childValues_;
std::ostream* document_;
std::string indentString_;
int rightMargin_;
std::string indentation_;
bool addChildValues_;
OStream* document_;
String indentString_;
unsigned int rightMargin_{74};
String indentation_;
bool addChildValues_ : 1;
bool indented_ : 1;
};
#if defined(_MSC_VER)
#pragma warning(pop)
#endif
#if defined(JSON_HAS_INT64)
std::string JSON_API valueToString(Int value);
std::string JSON_API valueToString(UInt value);
String JSON_API valueToString(Int value);
String JSON_API valueToString(UInt value);
#endif // if defined(JSON_HAS_INT64)
std::string JSON_API valueToString(LargestInt value);
std::string JSON_API valueToString(LargestUInt value);
std::string JSON_API valueToString(double value);
std::string JSON_API valueToString(bool value);
std::string JSON_API valueToQuotedString(const char* value);
String JSON_API valueToString(LargestInt value);
String JSON_API valueToString(LargestUInt value);
String JSON_API
valueToString(double value,
unsigned int precision = Value::defaultRealPrecision,
PrecisionType precisionType = PrecisionType::significantDigits);
String JSON_API valueToString(bool value);
String JSON_API valueToQuotedString(const char* value);
/// \brief Output using the StyledStreamWriter.
/// \see Json::operator>>()
JSON_API std::ostream& operator<<(std::ostream&, const Value& root);
JSON_API OStream& operator<<(OStream&, const Value& root);
} // namespace Json
#pragma pack(pop)
#if defined(JSONCPP_DISABLE_DLL_INTERFACE_WARNING)
#pragma warning(pop)
#endif // if defined(JSONCPP_DISABLE_DLL_INTERFACE_WARNING)
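
For code migrating across this change: the hunks above deprecate StyledStreamWriter in favor of StreamWriterBuilder and give valueToString(double) an optional precision plus a PrecisionType. A minimal sketch of the replacement calls follows, based on the declarations shown here together with Json::writeString (declared elsewhere in writer.h, not in this excerpt); the values and output stream are illustrative only and assume the 1.9.0 headers.

#include <iostream>
#include <json/json.h>

int main() {
  Json::Value root;
  root["pi"] = 3.14159265358979;
  root["name"] = "jsoncpp";

  // Replacement for the deprecated StyledStreamWriter: configure a
  // StreamWriterBuilder, then serialize with writeString().
  Json::StreamWriterBuilder builder;
  builder["indentation"] = "\t"; // same default indentation the old class used
  std::cout << Json::writeString(builder, root) << "\n";

  // valueToString(double) now accepts a precision and a PrecisionType,
  // per the declaration above; here, 5 significant digits.
  std::cout << Json::valueToString(3.14159265358979, 5,
                                   Json::PrecisionType::significantDigits)
            << "\n";
  return 0;
}

The old StyledStreamWriter path still compiles, but the JSONCPP_DEPRECATED annotation added above flags it on compilers that map the macro to a deprecation attribute.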


@@ -1,42 +0,0 @@

Microsoft Visual Studio Solution File, Format Version 11.00
# Visual Studio 2010
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "lib_json", "lib_json.vcxproj", "{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}"
EndProject
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "jsontest", "jsontest.vcxproj", "{25AF2DD2-D396-4668-B188-488C33B8E620}"
EndProject
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "test_lib_json", "test_lib_json.vcxproj", "{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}"
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
Debug|Win32 = Debug|Win32
Debug|x64 = Debug|x64
Release|Win32 = Release|Win32
Release|x64 = Release|x64
EndGlobalSection
GlobalSection(ProjectConfigurationPlatforms) = postSolution
{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}.Debug|Win32.ActiveCfg = Debug|Win32
{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}.Debug|Win32.Build.0 = Debug|Win32
{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}.Debug|x64.ActiveCfg = Debug|x64
{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}.Debug|x64.Build.0 = Debug|x64
{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}.Release|Win32.ActiveCfg = Release|Win32
{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}.Release|Win32.Build.0 = Release|Win32
{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}.Release|x64.ActiveCfg = Release|x64
{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}.Release|x64.Build.0 = Release|x64
{25AF2DD2-D396-4668-B188-488C33B8E620}.Debug|Win32.ActiveCfg = Debug|Win32
{25AF2DD2-D396-4668-B188-488C33B8E620}.Debug|Win32.Build.0 = Debug|Win32
{25AF2DD2-D396-4668-B188-488C33B8E620}.Debug|x64.ActiveCfg = Debug|Win32
{25AF2DD2-D396-4668-B188-488C33B8E620}.Release|Win32.ActiveCfg = Release|Win32
{25AF2DD2-D396-4668-B188-488C33B8E620}.Release|Win32.Build.0 = Release|Win32
{25AF2DD2-D396-4668-B188-488C33B8E620}.Release|x64.ActiveCfg = Release|Win32
{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Debug|Win32.ActiveCfg = Debug|Win32
{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Debug|Win32.Build.0 = Debug|Win32
{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Debug|x64.ActiveCfg = Debug|Win32
{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Release|Win32.ActiveCfg = Release|Win32
{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Release|Win32.Build.0 = Release|Win32
{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Release|x64.ActiveCfg = Release|Win32
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
EndGlobalSection
EndGlobal


@@ -1,96 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup Label="ProjectConfigurations">
<ProjectConfiguration Include="Debug|Win32">
<Configuration>Debug</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|Win32">
<Configuration>Release</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
</ItemGroup>
<PropertyGroup Label="Globals">
<ProjectGuid>{25AF2DD2-D396-4668-B188-488C33B8E620}</ProjectGuid>
<Keyword>Win32Proj</Keyword>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.Default.props" />
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.props" />
<ImportGroup Label="ExtensionSettings">
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<PropertyGroup Label="UserMacros" />
<PropertyGroup>
<_ProjectFileVersion>10.0.40219.1</_ProjectFileVersion>
<OutDir Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">../../build/vs71/debug/jsontest\</OutDir>
<IntDir Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">../../build/vs71/debug/jsontest\</IntDir>
<LinkIncremental Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">true</LinkIncremental>
<OutDir Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">../../build/vs71/release/jsontest\</OutDir>
<IntDir Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">../../build/vs71/release/jsontest\</IntDir>
<LinkIncremental Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">false</LinkIncremental>
</PropertyGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
<ClCompile>
<Optimization>Disabled</Optimization>
<AdditionalIncludeDirectories>../../include;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
<PreprocessorDefinitions>WIN32;_DEBUG;_CONSOLE;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<MinimalRebuild>true</MinimalRebuild>
<BasicRuntimeChecks>EnableFastChecks</BasicRuntimeChecks>
<RuntimeLibrary>MultiThreadedDebug</RuntimeLibrary>
<PrecompiledHeader>
</PrecompiledHeader>
<WarningLevel>Level3</WarningLevel>
<DebugInformationFormat>EditAndContinue</DebugInformationFormat>
</ClCompile>
<Link>
<OutputFile>$(OutDir)jsontest.exe</OutputFile>
<GenerateDebugInformation>true</GenerateDebugInformation>
<ProgramDatabaseFile>$(OutDir)jsontest.pdb</ProgramDatabaseFile>
<SubSystem>Console</SubSystem>
<TargetMachine>MachineX86</TargetMachine>
</Link>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
<ClCompile>
<AdditionalIncludeDirectories>../../include;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
<PreprocessorDefinitions>WIN32;NDEBUG;_CONSOLE;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<RuntimeLibrary>MultiThreaded</RuntimeLibrary>
<PrecompiledHeader>
</PrecompiledHeader>
<WarningLevel>Level3</WarningLevel>
<DebugInformationFormat>ProgramDatabase</DebugInformationFormat>
</ClCompile>
<Link>
<OutputFile>$(OutDir)jsontest.exe</OutputFile>
<GenerateDebugInformation>true</GenerateDebugInformation>
<SubSystem>Console</SubSystem>
<OptimizeReferences>true</OptimizeReferences>
<EnableCOMDATFolding>true</EnableCOMDATFolding>
<TargetMachine>MachineX86</TargetMachine>
</Link>
</ItemDefinitionGroup>
<ItemGroup>
<ClCompile Include="..\..\src\jsontestrunner\main.cpp" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="lib_json.vcxproj">
<Project>{1e6c2c1c-6453-4129-ae3f-0ee8e6599c89}</Project>
</ProjectReference>
</ItemGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.targets" />
<ImportGroup Label="ExtensionTargets">
</ImportGroup>
</Project>


@@ -1,13 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup>
<Filter Include="Source Files">
<UniqueIdentifier>{903591b3-ade3-4ce4-b1f9-1e175e62b014}</UniqueIdentifier>
</Filter>
</ItemGroup>
<ItemGroup>
<ClCompile Include="..\..\src\jsontestrunner\main.cpp">
<Filter>Source Files</Filter>
</ClCompile>
</ItemGroup>
</Project>


@@ -1,143 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup Label="ProjectConfigurations">
<ProjectConfiguration Include="Debug|Win32">
<Configuration>Debug</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Debug|x64">
<Configuration>Debug</Configuration>
<Platform>x64</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|Win32">
<Configuration>Release</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|x64">
<Configuration>Release</Configuration>
<Platform>x64</Platform>
</ProjectConfiguration>
</ItemGroup>
<ItemGroup>
<ClCompile Include="..\..\src\lib_json\json_reader.cpp" />
<ClCompile Include="..\..\src\lib_json\json_value.cpp" />
<ClCompile Include="..\..\src\lib_json\json_writer.cpp" />
</ItemGroup>
<ItemGroup>
<ClInclude Include="..\..\include\json\reader.h" />
<ClInclude Include="..\..\include\json\value.h" />
<ClInclude Include="..\..\include\json\writer.h" />
</ItemGroup>
<PropertyGroup Label="Globals">
<ProjectGuid>{1E6C2C1C-6453-4129-AE3F-0EE8E6599C89}</ProjectGuid>
<Keyword>Win32Proj</Keyword>
<RootNamespace>jsoncpp</RootNamespace>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.Default.props" />
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" Label="Configuration">
<ConfigurationType>StaticLibrary</ConfigurationType>
<UseDebugLibraries>true</UseDebugLibraries>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'" Label="Configuration">
<ConfigurationType>StaticLibrary</ConfigurationType>
<UseDebugLibraries>true</UseDebugLibraries>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="Configuration">
<ConfigurationType>StaticLibrary</ConfigurationType>
<UseDebugLibraries>false</UseDebugLibraries>
<WholeProgramOptimization>true</WholeProgramOptimization>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'" Label="Configuration">
<ConfigurationType>StaticLibrary</ConfigurationType>
<UseDebugLibraries>false</UseDebugLibraries>
<WholeProgramOptimization>true</WholeProgramOptimization>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.props" />
<ImportGroup Label="ExtensionSettings">
</ImportGroup>
<ImportGroup Label="PropertySheets" Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Label="PropertySheets" Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<PropertyGroup Label="UserMacros" />
<PropertyGroup />
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
<ClCompile>
<PrecompiledHeader>NotUsing</PrecompiledHeader>
<WarningLevel>Level3</WarningLevel>
<Optimization>Disabled</Optimization>
<PreprocessorDefinitions>WIN32;_DEBUG;_LIB;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<AdditionalIncludeDirectories>../../include</AdditionalIncludeDirectories>
<RuntimeLibrary>MultiThreadedDebug</RuntimeLibrary>
</ClCompile>
<Link>
<SubSystem>Windows</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
</Link>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">
<ClCompile>
<PrecompiledHeader>NotUsing</PrecompiledHeader>
<WarningLevel>Level3</WarningLevel>
<Optimization>Disabled</Optimization>
<PreprocessorDefinitions>WIN32;_DEBUG;_LIB;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<AdditionalIncludeDirectories>../../include</AdditionalIncludeDirectories>
<RuntimeLibrary>MultiThreadedDebug</RuntimeLibrary>
</ClCompile>
<Link>
<SubSystem>Windows</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
</Link>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
<ClCompile>
<WarningLevel>Level3</WarningLevel>
<PrecompiledHeader>NotUsing</PrecompiledHeader>
<Optimization>MaxSpeed</Optimization>
<FunctionLevelLinking>true</FunctionLevelLinking>
<IntrinsicFunctions>true</IntrinsicFunctions>
<PreprocessorDefinitions>WIN32;NDEBUG;_LIB;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<AdditionalIncludeDirectories>../../include</AdditionalIncludeDirectories>
<RuntimeLibrary>MultiThreaded</RuntimeLibrary>
</ClCompile>
<Link>
<SubSystem>Windows</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
<EnableCOMDATFolding>true</EnableCOMDATFolding>
<OptimizeReferences>true</OptimizeReferences>
</Link>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'">
<ClCompile>
<WarningLevel>Level3</WarningLevel>
<PrecompiledHeader>NotUsing</PrecompiledHeader>
<Optimization>MaxSpeed</Optimization>
<FunctionLevelLinking>true</FunctionLevelLinking>
<IntrinsicFunctions>true</IntrinsicFunctions>
<PreprocessorDefinitions>WIN32;NDEBUG;_LIB;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<AdditionalIncludeDirectories>../../include</AdditionalIncludeDirectories>
<RuntimeLibrary>MultiThreaded</RuntimeLibrary>
</ClCompile>
<Link>
<SubSystem>Windows</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
<EnableCOMDATFolding>true</EnableCOMDATFolding>
<OptimizeReferences>true</OptimizeReferences>
</Link>
</ItemDefinitionGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.targets" />
<ImportGroup Label="ExtensionTargets">
</ImportGroup>
</Project>


@@ -1,33 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup>
<Filter Include="Header Files">
<UniqueIdentifier>{c110bc57-c46e-476c-97ea-84d8014f431c}</UniqueIdentifier>
</Filter>
<Filter Include="Source Files">
<UniqueIdentifier>{ed718592-5acf-47b5-8f2b-b8224590da6a}</UniqueIdentifier>
</Filter>
</ItemGroup>
<ItemGroup>
<ClCompile Include="..\..\src\lib_json\json_reader.cpp">
<Filter>Source Files</Filter>
</ClCompile>
<ClCompile Include="..\..\src\lib_json\json_value.cpp">
<Filter>Source Files</Filter>
</ClCompile>
<ClCompile Include="..\..\src\lib_json\json_writer.cpp">
<Filter>Source Files</Filter>
</ClCompile>
</ItemGroup>
<ItemGroup>
<ClInclude Include="..\..\include\json\reader.h">
<Filter>Header Files</Filter>
</ClInclude>
<ClInclude Include="..\..\include\json\value.h">
<Filter>Header Files</Filter>
</ClInclude>
<ClInclude Include="..\..\include\json\writer.h">
<Filter>Header Files</Filter>
</ClInclude>
</ItemGroup>
</Project>


@@ -1,109 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup Label="ProjectConfigurations">
<ProjectConfiguration Include="Debug|Win32">
<Configuration>Debug</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|Win32">
<Configuration>Release</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
</ItemGroup>
<PropertyGroup Label="Globals">
<ProjectGuid>{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}</ProjectGuid>
<RootNamespace>test_lib_json</RootNamespace>
<Keyword>Win32Proj</Keyword>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.Default.props" />
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.props" />
<ImportGroup Label="ExtensionSettings">
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<PropertyGroup Label="UserMacros" />
<PropertyGroup>
<_ProjectFileVersion>10.0.40219.1</_ProjectFileVersion>
<OutDir Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">../../build/vs71/debug/test_lib_json\</OutDir>
<IntDir Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">../../build/vs71/debug/test_lib_json\</IntDir>
<LinkIncremental Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">true</LinkIncremental>
<OutDir Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">../../build/vs71/release/test_lib_json\</OutDir>
<IntDir Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">../../build/vs71/release/test_lib_json\</IntDir>
<LinkIncremental Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">false</LinkIncremental>
</PropertyGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
<ClCompile>
<Optimization>Disabled</Optimization>
<AdditionalIncludeDirectories>../../include;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
<PreprocessorDefinitions>WIN32;_DEBUG;_CONSOLE;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<MinimalRebuild>true</MinimalRebuild>
<BasicRuntimeChecks>EnableFastChecks</BasicRuntimeChecks>
<RuntimeLibrary>MultiThreadedDebug</RuntimeLibrary>
<PrecompiledHeader>
</PrecompiledHeader>
<WarningLevel>Level3</WarningLevel>
<DebugInformationFormat>EditAndContinue</DebugInformationFormat>
</ClCompile>
<Link>
<OutputFile>$(OutDir)test_lib_json.exe</OutputFile>
<GenerateDebugInformation>true</GenerateDebugInformation>
<ProgramDatabaseFile>$(OutDir)test_lib_json.pdb</ProgramDatabaseFile>
<SubSystem>Console</SubSystem>
<TargetMachine>MachineX86</TargetMachine>
</Link>
<PostBuildEvent>
<Message>Running all unit tests</Message>
<Command>$(TargetPath)</Command>
</PostBuildEvent>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
<ClCompile>
<AdditionalIncludeDirectories>../../include;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
<PreprocessorDefinitions>WIN32;NDEBUG;_CONSOLE;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<RuntimeLibrary>MultiThreaded</RuntimeLibrary>
<PrecompiledHeader>
</PrecompiledHeader>
<WarningLevel>Level3</WarningLevel>
<DebugInformationFormat>ProgramDatabase</DebugInformationFormat>
</ClCompile>
<Link>
<OutputFile>$(OutDir)test_lib_json.exe</OutputFile>
<GenerateDebugInformation>true</GenerateDebugInformation>
<SubSystem>Console</SubSystem>
<OptimizeReferences>true</OptimizeReferences>
<EnableCOMDATFolding>true</EnableCOMDATFolding>
<TargetMachine>MachineX86</TargetMachine>
</Link>
<PostBuildEvent>
<Message>Running all unit tests</Message>
<Command>$(TargetPath)</Command>
</PostBuildEvent>
</ItemDefinitionGroup>
<ItemGroup>
<ClCompile Include="..\..\src\test_lib_json\jsontest.cpp" />
<ClCompile Include="..\..\src\test_lib_json\main.cpp" />
</ItemGroup>
<ItemGroup>
<ClInclude Include="..\..\src\test_lib_json\jsontest.h" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="lib_json.vcxproj">
<Project>{1e6c2c1c-6453-4129-ae3f-0ee8e6599c89}</Project>
</ProjectReference>
</ItemGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.targets" />
<ImportGroup Label="ExtensionTargets">
</ImportGroup>
</Project>


@@ -1,24 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup>
<ClCompile Include="..\..\src\test_lib_json\jsontest.cpp">
<Filter>Source Filter</Filter>
</ClCompile>
<ClCompile Include="..\..\src\test_lib_json\main.cpp">
<Filter>Source Filter</Filter>
</ClCompile>
</ItemGroup>
<ItemGroup>
<Filter Include="Source Filter">
<UniqueIdentifier>{bf40cbfc-8e98-40b4-b9f3-7e8d579cbae2}</UniqueIdentifier>
</Filter>
<Filter Include="Header Files">
<UniqueIdentifier>{5fd39074-89e6-4939-aa3f-694fefd296b1}</UniqueIdentifier>
</Filter>
</ItemGroup>
<ItemGroup>
<ClInclude Include="..\..\src\test_lib_json\jsontest.h">
<Filter>Header Files</Filter>
</ClInclude>
</ItemGroup>
</Project>


@@ -1,46 +0,0 @@
Microsoft Visual Studio Solution File, Format Version 8.00
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "lib_json", "lib_json.vcproj", "{B84F7231-16CE-41D8-8C08-7B523FF4225B}"
ProjectSection(ProjectDependencies) = postProject
EndProjectSection
EndProject
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "jsontest", "jsontest.vcproj", "{25AF2DD2-D396-4668-B188-488C33B8E620}"
ProjectSection(ProjectDependencies) = postProject
{B84F7231-16CE-41D8-8C08-7B523FF4225B} = {B84F7231-16CE-41D8-8C08-7B523FF4225B}
EndProjectSection
EndProject
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "test_lib_json", "test_lib_json.vcproj", "{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}"
ProjectSection(ProjectDependencies) = postProject
{B84F7231-16CE-41D8-8C08-7B523FF4225B} = {B84F7231-16CE-41D8-8C08-7B523FF4225B}
EndProjectSection
EndProject
Global
GlobalSection(SolutionConfiguration) = preSolution
Debug = Debug
dummy = dummy
Release = Release
EndGlobalSection
GlobalSection(ProjectConfiguration) = postSolution
{B84F7231-16CE-41D8-8C08-7B523FF4225B}.Debug.ActiveCfg = Debug|Win32
{B84F7231-16CE-41D8-8C08-7B523FF4225B}.Debug.Build.0 = Debug|Win32
{B84F7231-16CE-41D8-8C08-7B523FF4225B}.dummy.ActiveCfg = dummy|Win32
{B84F7231-16CE-41D8-8C08-7B523FF4225B}.dummy.Build.0 = dummy|Win32
{B84F7231-16CE-41D8-8C08-7B523FF4225B}.Release.ActiveCfg = Release|Win32
{B84F7231-16CE-41D8-8C08-7B523FF4225B}.Release.Build.0 = Release|Win32
{25AF2DD2-D396-4668-B188-488C33B8E620}.Debug.ActiveCfg = Debug|Win32
{25AF2DD2-D396-4668-B188-488C33B8E620}.Debug.Build.0 = Debug|Win32
{25AF2DD2-D396-4668-B188-488C33B8E620}.dummy.ActiveCfg = Debug|Win32
{25AF2DD2-D396-4668-B188-488C33B8E620}.dummy.Build.0 = Debug|Win32
{25AF2DD2-D396-4668-B188-488C33B8E620}.Release.ActiveCfg = Release|Win32
{25AF2DD2-D396-4668-B188-488C33B8E620}.Release.Build.0 = Release|Win32
{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Debug.ActiveCfg = Debug|Win32
{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Debug.Build.0 = Debug|Win32
{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.dummy.ActiveCfg = Debug|Win32
{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.dummy.Build.0 = Debug|Win32
{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Release.ActiveCfg = Release|Win32
{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Release.Build.0 = Release|Win32
EndGlobalSection
GlobalSection(ExtensibilityGlobals) = postSolution
EndGlobalSection
GlobalSection(ExtensibilityAddIns) = postSolution
EndGlobalSection
EndGlobal


@@ -1,119 +0,0 @@
<?xml version="1.0" encoding="Windows-1252"?>
<VisualStudioProject
ProjectType="Visual C++"
Version="7.10"
Name="jsontest"
ProjectGUID="{25AF2DD2-D396-4668-B188-488C33B8E620}"
Keyword="Win32Proj">
<Platforms>
<Platform
Name="Win32"/>
</Platforms>
<Configurations>
<Configuration
Name="Debug|Win32"
OutputDirectory="../../build/vs71/debug/jsontest"
IntermediateDirectory="../../build/vs71/debug/jsontest"
ConfigurationType="1"
CharacterSet="2">
<Tool
Name="VCCLCompilerTool"
Optimization="0"
AdditionalIncludeDirectories="../../include"
PreprocessorDefinitions="WIN32;_DEBUG;_CONSOLE"
MinimalRebuild="TRUE"
BasicRuntimeChecks="3"
RuntimeLibrary="1"
UsePrecompiledHeader="0"
WarningLevel="3"
Detect64BitPortabilityProblems="TRUE"
DebugInformationFormat="4"/>
<Tool
Name="VCCustomBuildTool"/>
<Tool
Name="VCLinkerTool"
OutputFile="$(OutDir)/jsontest.exe"
LinkIncremental="2"
GenerateDebugInformation="TRUE"
ProgramDatabaseFile="$(OutDir)/jsontest.pdb"
SubSystem="1"
TargetMachine="1"/>
<Tool
Name="VCMIDLTool"/>
<Tool
Name="VCPostBuildEventTool"/>
<Tool
Name="VCPreBuildEventTool"/>
<Tool
Name="VCPreLinkEventTool"/>
<Tool
Name="VCResourceCompilerTool"/>
<Tool
Name="VCWebServiceProxyGeneratorTool"/>
<Tool
Name="VCXMLDataGeneratorTool"/>
<Tool
Name="VCWebDeploymentTool"/>
<Tool
Name="VCManagedWrapperGeneratorTool"/>
<Tool
Name="VCAuxiliaryManagedWrapperGeneratorTool"/>
</Configuration>
<Configuration
Name="Release|Win32"
OutputDirectory="../../build/vs71/release/jsontest"
IntermediateDirectory="../../build/vs71/release/jsontest"
ConfigurationType="1"
CharacterSet="2">
<Tool
Name="VCCLCompilerTool"
AdditionalIncludeDirectories="../../include"
PreprocessorDefinitions="WIN32;NDEBUG;_CONSOLE"
RuntimeLibrary="0"
UsePrecompiledHeader="0"
WarningLevel="3"
Detect64BitPortabilityProblems="TRUE"
DebugInformationFormat="3"/>
<Tool
Name="VCCustomBuildTool"/>
<Tool
Name="VCLinkerTool"
OutputFile="$(OutDir)/jsontest.exe"
LinkIncremental="1"
GenerateDebugInformation="TRUE"
SubSystem="1"
OptimizeReferences="2"
EnableCOMDATFolding="2"
TargetMachine="1"/>
<Tool
Name="VCMIDLTool"/>
<Tool
Name="VCPostBuildEventTool"/>
<Tool
Name="VCPreBuildEventTool"/>
<Tool
Name="VCPreLinkEventTool"/>
<Tool
Name="VCResourceCompilerTool"/>
<Tool
Name="VCWebServiceProxyGeneratorTool"/>
<Tool
Name="VCXMLDataGeneratorTool"/>
<Tool
Name="VCWebDeploymentTool"/>
<Tool
Name="VCManagedWrapperGeneratorTool"/>
<Tool
Name="VCAuxiliaryManagedWrapperGeneratorTool"/>
</Configuration>
</Configurations>
<References>
</References>
<Files>
<File
RelativePath="..\..\src\jsontestrunner\main.cpp">
</File>
</Files>
<Globals>
</Globals>
</VisualStudioProject>


@@ -1,214 +0,0 @@
<?xml version="1.0" encoding="Windows-1252"?>
<VisualStudioProject
ProjectType="Visual C++"
Version="7.10"
Name="lib_json"
ProjectGUID="{B84F7231-16CE-41D8-8C08-7B523FF4225B}"
Keyword="Win32Proj">
<Platforms>
<Platform
Name="Win32"/>
</Platforms>
<Configurations>
<Configuration
Name="Debug|Win32"
OutputDirectory="../../build/vs71/debug/lib_json"
IntermediateDirectory="../../build/vs71/debug/lib_json"
ConfigurationType="4"
CharacterSet="2">
<Tool
Name="VCCLCompilerTool"
Optimization="0"
AdditionalIncludeDirectories="../../include"
PreprocessorDefinitions="WIN32;_DEBUG;_LIB"
StringPooling="TRUE"
MinimalRebuild="TRUE"
BasicRuntimeChecks="3"
RuntimeLibrary="1"
EnableFunctionLevelLinking="TRUE"
DisableLanguageExtensions="TRUE"
ForceConformanceInForLoopScope="FALSE"
RuntimeTypeInfo="TRUE"
UsePrecompiledHeader="0"
WarningLevel="3"
Detect64BitPortabilityProblems="TRUE"
DebugInformationFormat="4"/>
<Tool
Name="VCCustomBuildTool"/>
<Tool
Name="VCLibrarianTool"
OutputFile="$(OutDir)/json_vc71_libmtd.lib"/>
<Tool
Name="VCMIDLTool"/>
<Tool
Name="VCPostBuildEventTool"/>
<Tool
Name="VCPreBuildEventTool"/>
<Tool
Name="VCPreLinkEventTool"/>
<Tool
Name="VCResourceCompilerTool"/>
<Tool
Name="VCWebServiceProxyGeneratorTool"/>
<Tool
Name="VCXMLDataGeneratorTool"/>
<Tool
Name="VCManagedWrapperGeneratorTool"/>
<Tool
Name="VCAuxiliaryManagedWrapperGeneratorTool"/>
</Configuration>
<Configuration
Name="Release|Win32"
OutputDirectory="../../build/vs71/release/lib_json"
IntermediateDirectory="../../build/vs71/release/lib_json"
ConfigurationType="4"
CharacterSet="2"
WholeProgramOptimization="TRUE">
<Tool
Name="VCCLCompilerTool"
GlobalOptimizations="TRUE"
EnableIntrinsicFunctions="TRUE"
AdditionalIncludeDirectories="../../include"
PreprocessorDefinitions="WIN32;NDEBUG;_LIB"
StringPooling="TRUE"
RuntimeLibrary="0"
EnableFunctionLevelLinking="TRUE"
DisableLanguageExtensions="TRUE"
ForceConformanceInForLoopScope="FALSE"
RuntimeTypeInfo="TRUE"
UsePrecompiledHeader="0"
AssemblerOutput="4"
WarningLevel="3"
Detect64BitPortabilityProblems="TRUE"
DebugInformationFormat="3"/>
<Tool
Name="VCCustomBuildTool"/>
<Tool
Name="VCLibrarianTool"
OutputFile="$(OutDir)/json_vc71_libmt.lib"/>
<Tool
Name="VCMIDLTool"/>
<Tool
Name="VCPostBuildEventTool"/>
<Tool
Name="VCPreBuildEventTool"/>
<Tool
Name="VCPreLinkEventTool"/>
<Tool
Name="VCResourceCompilerTool"/>
<Tool
Name="VCWebServiceProxyGeneratorTool"/>
<Tool
Name="VCXMLDataGeneratorTool"/>
<Tool
Name="VCManagedWrapperGeneratorTool"/>
<Tool
Name="VCAuxiliaryManagedWrapperGeneratorTool"/>
</Configuration>
<Configuration
Name="dummy|Win32"
OutputDirectory="$(ConfigurationName)"
IntermediateDirectory="$(ConfigurationName)"
ConfigurationType="2"
CharacterSet="2"
WholeProgramOptimization="TRUE">
<Tool
Name="VCCLCompilerTool"
GlobalOptimizations="TRUE"
EnableIntrinsicFunctions="TRUE"
AdditionalIncludeDirectories="../../include"
PreprocessorDefinitions="WIN32;NDEBUG;_LIB"
StringPooling="TRUE"
RuntimeLibrary="4"
EnableFunctionLevelLinking="TRUE"
DisableLanguageExtensions="TRUE"
ForceConformanceInForLoopScope="FALSE"
RuntimeTypeInfo="TRUE"
UsePrecompiledHeader="0"
AssemblerOutput="4"
WarningLevel="3"
Detect64BitPortabilityProblems="TRUE"
DebugInformationFormat="3"/>
<Tool
Name="VCCustomBuildTool"/>
<Tool
Name="VCLinkerTool"
GenerateDebugInformation="TRUE"
SubSystem="2"
OptimizeReferences="2"
EnableCOMDATFolding="2"
TargetMachine="1"/>
<Tool
Name="VCMIDLTool"/>
<Tool
Name="VCPostBuildEventTool"/>
<Tool
Name="VCPreBuildEventTool"/>
<Tool
Name="VCPreLinkEventTool"/>
<Tool
Name="VCResourceCompilerTool"/>
<Tool
Name="VCWebServiceProxyGeneratorTool"/>
<Tool
Name="VCXMLDataGeneratorTool"/>
<Tool
Name="VCWebDeploymentTool"/>
<Tool
Name="VCManagedWrapperGeneratorTool"/>
<Tool
Name="VCAuxiliaryManagedWrapperGeneratorTool"/>
</Configuration>
</Configurations>
<References>
</References>
<Files>
<File
RelativePath="..\..\include\json\autolink.h">
</File>
<File
RelativePath="..\..\include\json\config.h">
</File>
<File
RelativePath="..\..\include\json\features.h">
</File>
<File
RelativePath="..\..\include\json\forwards.h">
</File>
<File
RelativePath="..\..\include\json\json.h">
</File>
<File
RelativePath="..\..\src\lib_json\json_batchallocator.h">
</File>
<File
RelativePath="..\..\src\lib_json\json_internalarray.inl">
</File>
<File
RelativePath="..\..\src\lib_json\json_internalmap.inl">
</File>
<File
RelativePath="..\..\src\lib_json\json_reader.cpp">
</File>
<File
RelativePath="..\..\src\lib_json\json_value.cpp">
</File>
<File
RelativePath="..\..\src\lib_json\json_valueiterator.inl">
</File>
<File
RelativePath="..\..\src\lib_json\json_writer.cpp">
</File>
<File
RelativePath="..\..\include\json\reader.h">
</File>
<File
RelativePath="..\..\include\json\value.h">
</File>
<File
RelativePath="..\..\include\json\writer.h">
</File>
</Files>
<Globals>
</Globals>
</VisualStudioProject>


@@ -1,130 +0,0 @@
<?xml version="1.0" encoding="Windows-1252"?>
<VisualStudioProject
ProjectType="Visual C++"
Version="7.10"
Name="test_lib_json"
ProjectGUID="{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}"
RootNamespace="test_lib_json"
Keyword="Win32Proj">
<Platforms>
<Platform
Name="Win32"/>
</Platforms>
<Configurations>
<Configuration
Name="Debug|Win32"
OutputDirectory="../../build/vs71/debug/test_lib_json"
IntermediateDirectory="../../build/vs71/debug/test_lib_json"
ConfigurationType="1"
CharacterSet="2">
<Tool
Name="VCCLCompilerTool"
Optimization="0"
AdditionalIncludeDirectories="../../include"
PreprocessorDefinitions="WIN32;_DEBUG;_CONSOLE"
MinimalRebuild="TRUE"
BasicRuntimeChecks="3"
RuntimeLibrary="1"
UsePrecompiledHeader="0"
WarningLevel="3"
Detect64BitPortabilityProblems="TRUE"
DebugInformationFormat="4"/>
<Tool
Name="VCCustomBuildTool"/>
<Tool
Name="VCLinkerTool"
OutputFile="$(OutDir)/test_lib_json.exe"
LinkIncremental="2"
GenerateDebugInformation="TRUE"
ProgramDatabaseFile="$(OutDir)/test_lib_json.pdb"
SubSystem="1"
TargetMachine="1"/>
<Tool
Name="VCMIDLTool"/>
<Tool
Name="VCPostBuildEventTool"
Description="Running all unit tests"
CommandLine="$(TargetPath)"/>
<Tool
Name="VCPreBuildEventTool"/>
<Tool
Name="VCPreLinkEventTool"/>
<Tool
Name="VCResourceCompilerTool"/>
<Tool
Name="VCWebServiceProxyGeneratorTool"/>
<Tool
Name="VCXMLDataGeneratorTool"/>
<Tool
Name="VCWebDeploymentTool"/>
<Tool
Name="VCManagedWrapperGeneratorTool"/>
<Tool
Name="VCAuxiliaryManagedWrapperGeneratorTool"/>
</Configuration>
<Configuration
Name="Release|Win32"
OutputDirectory="../../build/vs71/release/test_lib_json"
IntermediateDirectory="../../build/vs71/release/test_lib_json"
ConfigurationType="1"
CharacterSet="2">
<Tool
Name="VCCLCompilerTool"
AdditionalIncludeDirectories="../../include"
PreprocessorDefinitions="WIN32;NDEBUG;_CONSOLE"
RuntimeLibrary="0"
UsePrecompiledHeader="0"
WarningLevel="3"
Detect64BitPortabilityProblems="TRUE"
DebugInformationFormat="3"/>
<Tool
Name="VCCustomBuildTool"/>
<Tool
Name="VCLinkerTool"
OutputFile="$(OutDir)/test_lib_json.exe"
LinkIncremental="1"
GenerateDebugInformation="TRUE"
SubSystem="1"
OptimizeReferences="2"
EnableCOMDATFolding="2"
TargetMachine="1"/>
<Tool
Name="VCMIDLTool"/>
<Tool
Name="VCPostBuildEventTool"
Description="Running all unit tests"
CommandLine="$(TargetPath)"/>
<Tool
Name="VCPreBuildEventTool"/>
<Tool
Name="VCPreLinkEventTool"/>
<Tool
Name="VCResourceCompilerTool"/>
<Tool
Name="VCWebServiceProxyGeneratorTool"/>
<Tool
Name="VCXMLDataGeneratorTool"/>
<Tool
Name="VCWebDeploymentTool"/>
<Tool
Name="VCManagedWrapperGeneratorTool"/>
<Tool
Name="VCAuxiliaryManagedWrapperGeneratorTool"/>
</Configuration>
</Configurations>
<References>
</References>
<Files>
<File
RelativePath="..\..\src\test_lib_json\jsontest.cpp">
</File>
<File
RelativePath="..\..\src\test_lib_json\jsontest.h">
</File>
<File
RelativePath="..\..\src\test_lib_json\main.cpp">
</File>
</Files>
<Globals>
</Globals>
</VisualStudioProject>


@@ -1,3 +1,8 @@
# Copyright 2010 Baptiste Lepilleur and The JsonCpp Authors
# Distributed under MIT license, or public domain if desired and
# recognized in your jurisdiction.
# See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
"""Tag the sandbox for release, make source and doc tarballs.
Requires Python 2.6
@@ -14,6 +19,7 @@ python makerelease.py 0.5.0 0.6.0-dev
Note: This was for Subversion. Now that we are in GitHub, we do not
need to build versioned tarballs anymore, so makerelease.py is defunct.
"""
from __future__ import print_function
import os.path
import subprocess
@@ -34,57 +40,57 @@ SVN_TAG_ROOT = SVN_ROOT + 'tags/jsoncpp'
SCONS_LOCAL_URL = 'http://sourceforge.net/projects/scons/files/scons-local/1.2.0/scons-local-1.2.0.tar.gz/download'
SOURCEFORGE_PROJECT = 'jsoncpp'
def set_version( version ):
def set_version(version):
with open('version','wb') as f:
f.write( version.strip() )
f.write(version.strip())
def rmdir_if_exist( dir_path ):
if os.path.isdir( dir_path ):
shutil.rmtree( dir_path )
def rmdir_if_exist(dir_path):
if os.path.isdir(dir_path):
shutil.rmtree(dir_path)
class SVNError(Exception):
pass
def svn_command( command, *args ):
def svn_command(command, *args):
cmd = ['svn', '--non-interactive', command] + list(args)
print('Running:', ' '.join( cmd ))
process = subprocess.Popen( cmd,
print('Running:', ' '.join(cmd))
process = subprocess.Popen(cmd,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT )
stderr=subprocess.STDOUT)
stdout = process.communicate()[0]
if process.returncode:
error = SVNError( 'SVN command failed:\n' + stdout )
error = SVNError('SVN command failed:\n' + stdout)
error.returncode = process.returncode
raise error
return stdout
def check_no_pending_commit():
"""Checks that there is no pending commit in the sandbox."""
stdout = svn_command( 'status', '--xml' )
etree = ElementTree.fromstring( stdout )
stdout = svn_command('status', '--xml')
etree = ElementTree.fromstring(stdout)
msg = []
for entry in etree.getiterator( 'entry' ):
for entry in etree.getiterator('entry'):
path = entry.get('path')
status = entry.find('wc-status').get('item')
if status != 'unversioned' and path != 'version':
msg.append( 'File "%s" has pending change (status="%s")' % (path, status) )
msg.append('File "%s" has pending change (status="%s")' % (path, status))
if msg:
msg.insert(0, 'Pending change to commit found in sandbox. Commit them first!' )
return '\n'.join( msg )
msg.insert(0, 'Pending change to commit found in sandbox. Commit them first!')
return '\n'.join(msg)
def svn_join_url( base_url, suffix ):
def svn_join_url(base_url, suffix):
if not base_url.endswith('/'):
base_url += '/'
if suffix.startswith('/'):
suffix = suffix[1:]
return base_url + suffix
def svn_check_if_tag_exist( tag_url ):
def svn_check_if_tag_exist(tag_url):
"""Checks if a tag exist.
Returns: True if the tag exist, False otherwise.
"""
try:
list_stdout = svn_command( 'list', tag_url )
list_stdout = svn_command('list', tag_url)
except SVNError as e:
if e.returncode != 1 or not str(e).find('tag_url'):
raise e
@@ -92,82 +98,82 @@ def svn_check_if_tag_exist( tag_url ):
return False
return True
def svn_commit( message ):
def svn_commit(message):
"""Commit the sandbox, providing the specified comment.
"""
svn_command( 'ci', '-m', message )
svn_command('ci', '-m', message)
def svn_tag_sandbox( tag_url, message ):
def svn_tag_sandbox(tag_url, message):
"""Makes a tag based on the sandbox revisions.
"""
svn_command( 'copy', '-m', message, '.', tag_url )
svn_command('copy', '-m', message, '.', tag_url)
def svn_remove_tag( tag_url, message ):
def svn_remove_tag(tag_url, message):
"""Removes an existing tag.
"""
svn_command( 'delete', '-m', message, tag_url )
svn_command('delete', '-m', message, tag_url)
def svn_export( tag_url, export_dir ):
def svn_export(tag_url, export_dir):
"""Exports the tag_url revision to export_dir.
Target directory, including its parent, is created if it does not exist.
If the directory export_dir exists, it is deleted before the export proceeds.
"""
rmdir_if_exist( export_dir )
svn_command( 'export', tag_url, export_dir )
rmdir_if_exist(export_dir)
svn_command('export', tag_url, export_dir)
def fix_sources_eol( dist_dir ):
def fix_sources_eol(dist_dir):
"""Set file EOL for tarball distribution.
"""
print('Preparing exported source file EOL for distribution...')
prune_dirs = antglob.prune_dirs + 'scons-local* ./build* ./libs ./dist'
win_sources = antglob.glob( dist_dir,
win_sources = antglob.glob(dist_dir,
includes = '**/*.sln **/*.vcproj',
prune_dirs = prune_dirs )
unix_sources = antglob.glob( dist_dir,
prune_dirs = prune_dirs)
unix_sources = antglob.glob(dist_dir,
includes = '''**/*.h **/*.cpp **/*.inl **/*.txt **/*.dox **/*.py **/*.html **/*.in
sconscript *.json *.expected AUTHORS LICENSE''',
excludes = antglob.default_excludes + 'scons.py sconsign.py scons-*',
prune_dirs = prune_dirs )
prune_dirs = prune_dirs)
for path in win_sources:
fixeol.fix_source_eol( path, is_dry_run = False, verbose = True, eol = '\r\n' )
fixeol.fix_source_eol(path, is_dry_run = False, verbose = True, eol = '\r\n')
for path in unix_sources:
fixeol.fix_source_eol( path, is_dry_run = False, verbose = True, eol = '\n' )
fixeol.fix_source_eol(path, is_dry_run = False, verbose = True, eol = '\n')
def download( url, target_path ):
def download(url, target_path):
"""Download file represented by url to target_path.
"""
f = urllib2.urlopen( url )
f = urllib2.urlopen(url)
try:
data = f.read()
finally:
f.close()
fout = open( target_path, 'wb' )
fout = open(target_path, 'wb')
try:
fout.write( data )
fout.write(data)
finally:
fout.close()
def check_compile( distcheck_top_dir, platform ):
def check_compile(distcheck_top_dir, platform):
cmd = [sys.executable, 'scons.py', 'platform=%s' % platform, 'check']
print('Running:', ' '.join( cmd ))
log_path = os.path.join( distcheck_top_dir, 'build-%s.log' % platform )
flog = open( log_path, 'wb' )
print('Running:', ' '.join(cmd))
log_path = os.path.join(distcheck_top_dir, 'build-%s.log' % platform)
flog = open(log_path, 'wb')
try:
process = subprocess.Popen( cmd,
process = subprocess.Popen(cmd,
stdout=flog,
stderr=subprocess.STDOUT,
cwd=distcheck_top_dir )
cwd=distcheck_top_dir)
stdout = process.communicate()[0]
status = (process.returncode == 0)
finally:
flog.close()
return (status, log_path)
def write_tempfile( content, **kwargs ):
fd, path = tempfile.mkstemp( **kwargs )
f = os.fdopen( fd, 'wt' )
def write_tempfile(content, **kwargs):
fd, path = tempfile.mkstemp(**kwargs)
f = os.fdopen(fd, 'wt')
try:
f.write( content )
f.write(content)
finally:
f.close()
return path
@@ -175,34 +181,34 @@ def write_tempfile( content, **kwargs ):
class SFTPError(Exception):
pass
def run_sftp_batch( userhost, sftp, batch, retry=0 ):
path = write_tempfile( batch, suffix='.sftp', text=True )
def run_sftp_batch(userhost, sftp, batch, retry=0):
path = write_tempfile(batch, suffix='.sftp', text=True)
# psftp -agent -C blep,jsoncpp@web.sourceforge.net -batch -b batch.sftp -bc
cmd = [sftp, '-agent', '-C', '-batch', '-b', path, '-bc', userhost]
error = None
for retry_index in range(0, max(1,retry)):
heading = retry_index == 0 and 'Running:' or 'Retrying:'
print(heading, ' '.join( cmd ))
process = subprocess.Popen( cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT )
print(heading, ' '.join(cmd))
process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
stdout = process.communicate()[0]
if process.returncode != 0:
error = SFTPError( 'SFTP batch failed:\n' + stdout )
error = SFTPError('SFTP batch failed:\n' + stdout)
else:
break
if error:
raise error
return stdout
def sourceforge_web_synchro( sourceforge_project, doc_dir,
user=None, sftp='sftp' ):
def sourceforge_web_synchro(sourceforge_project, doc_dir,
user=None, sftp='sftp'):
"""Notes: does not synchronize sub-directory of doc-dir.
"""
userhost = '%s,%s@web.sourceforge.net' % (user, sourceforge_project)
stdout = run_sftp_batch( userhost, sftp, """
stdout = run_sftp_batch(userhost, sftp, """
cd htdocs
dir
exit
""" )
""")
existing_paths = set()
collect = 0
for line in stdout.split('\n'):
@@ -216,15 +222,15 @@ exit
elif collect == 2:
path = line.strip().split()[-1:]
if path and path[0] not in ('.', '..'):
existing_paths.add( path[0] )
upload_paths = set( [os.path.basename(p) for p in antglob.glob( doc_dir )] )
existing_paths.add(path[0])
upload_paths = set([os.path.basename(p) for p in antglob.glob(doc_dir)])
paths_to_remove = existing_paths - upload_paths
if paths_to_remove:
print('Removing the following file from web:')
print('\n'.join( paths_to_remove ))
stdout = run_sftp_batch( userhost, sftp, """cd htdocs
print('\n'.join(paths_to_remove))
stdout = run_sftp_batch(userhost, sftp, """cd htdocs
rm %s
exit""" % ' '.join(paths_to_remove) )
exit""" % ' '.join(paths_to_remove))
print('Uploading %d files:' % len(upload_paths))
batch_size = 10
upload_paths = list(upload_paths)
@@ -235,17 +241,17 @@ exit""" % ' '.join(paths_to_remove) )
remaining_files = len(upload_paths) - index
remaining_sec = file_per_sec * remaining_files
print('%d/%d, ETA=%.1fs' % (index+1, len(upload_paths), remaining_sec))
run_sftp_batch( userhost, sftp, """cd htdocs
run_sftp_batch(userhost, sftp, """cd htdocs
lcd %s
mput %s
exit""" % (doc_dir, ' '.join(paths) ), retry=3 )
exit""" % (doc_dir, ' '.join(paths)), retry=3)
def sourceforge_release_tarball( sourceforge_project, paths, user=None, sftp='sftp' ):
def sourceforge_release_tarball(sourceforge_project, paths, user=None, sftp='sftp'):
userhost = '%s,%s@frs.sourceforge.net' % (user, sourceforge_project)
run_sftp_batch( userhost, sftp, """
run_sftp_batch(userhost, sftp, """
mput %s
exit
""" % (' '.join(paths),) )
""" % (' '.join(paths),))
def main():
@@ -259,7 +265,7 @@ Performs an svn export of tag release version, and build a source tarball.
Must be started in the project top directory.
Warning: --force should only be used when developping/testing the release script.
Warning: --force should only be used when developing/testing the release script.
"""
from optparse import OptionParser
parser = OptionParser(usage=usage)
@@ -286,12 +292,12 @@ Warning: --force should only be used when developping/testing the release script
options, args = parser.parse_args()
if len(args) != 2:
parser.error( 'release_version missing on command-line.' )
parser.error('release_version missing on command-line.')
release_version = args[0]
next_version = args[1]
if not options.platforms and not options.no_test:
parser.error( 'You must specify either --platform or --no-test option.' )
parser.error('You must specify either --platform or --no-test option.')
if options.ignore_pending_commit:
msg = ''
@@ -299,86 +305,86 @@ Warning: --force should only be used when developping/testing the release script
msg = check_no_pending_commit()
if not msg:
print('Setting version to', release_version)
set_version( release_version )
svn_commit( 'Release ' + release_version )
tag_url = svn_join_url( SVN_TAG_ROOT, release_version )
if svn_check_if_tag_exist( tag_url ):
set_version(release_version)
svn_commit('Release ' + release_version)
tag_url = svn_join_url(SVN_TAG_ROOT, release_version)
if svn_check_if_tag_exist(tag_url):
if options.retag_release:
svn_remove_tag( tag_url, 'Overwriting previous tag' )
svn_remove_tag(tag_url, 'Overwriting previous tag')
else:
print('Aborting, tag %s already exist. Use --retag to overwrite it!' % tag_url)
sys.exit( 1 )
svn_tag_sandbox( tag_url, 'Release ' + release_version )
sys.exit(1)
svn_tag_sandbox(tag_url, 'Release ' + release_version)
print('Generated doxygen document...')
## doc_dirname = r'jsoncpp-api-html-0.5.0'
## doc_tarball_path = r'e:\prg\vc\Lib\jsoncpp-trunk\dist\jsoncpp-api-html-0.5.0.tar.gz'
doc_tarball_path, doc_dirname = doxybuild.build_doc( options, make_release=True )
doc_tarball_path, doc_dirname = doxybuild.build_doc(options, make_release=True)
doc_distcheck_dir = 'dist/doccheck'
tarball.decompress( doc_tarball_path, doc_distcheck_dir )
doc_distcheck_top_dir = os.path.join( doc_distcheck_dir, doc_dirname )
tarball.decompress(doc_tarball_path, doc_distcheck_dir)
doc_distcheck_top_dir = os.path.join(doc_distcheck_dir, doc_dirname)
export_dir = 'dist/export'
svn_export( tag_url, export_dir )
fix_sources_eol( export_dir )
svn_export(tag_url, export_dir)
fix_sources_eol(export_dir)
source_dir = 'jsoncpp-src-' + release_version
source_tarball_path = 'dist/%s.tar.gz' % source_dir
print('Generating source tarball to', source_tarball_path)
tarball.make_tarball( source_tarball_path, [export_dir], export_dir, prefix_dir=source_dir )
tarball.make_tarball(source_tarball_path, [export_dir], export_dir, prefix_dir=source_dir)
amalgamation_tarball_path = 'dist/%s-amalgamation.tar.gz' % source_dir
print('Generating amalgamation source tarball to', amalgamation_tarball_path)
amalgamation_dir = 'dist/amalgamation'
amalgamate.amalgamate_source( export_dir, '%s/jsoncpp.cpp' % amalgamation_dir, 'json/json.h' )
amalgamate.amalgamate_source(export_dir, '%s/jsoncpp.cpp' % amalgamation_dir, 'json/json.h')
amalgamation_source_dir = 'jsoncpp-src-amalgamation' + release_version
tarball.make_tarball( amalgamation_tarball_path, [amalgamation_dir],
amalgamation_dir, prefix_dir=amalgamation_source_dir )
tarball.make_tarball(amalgamation_tarball_path, [amalgamation_dir],
amalgamation_dir, prefix_dir=amalgamation_source_dir)
# Decompress source tarball, download and install scons-local
distcheck_dir = 'dist/distcheck'
distcheck_top_dir = distcheck_dir + '/' + source_dir
print('Decompressing source tarball to', distcheck_dir)
rmdir_if_exist( distcheck_dir )
tarball.decompress( source_tarball_path, distcheck_dir )
rmdir_if_exist(distcheck_dir)
tarball.decompress(source_tarball_path, distcheck_dir)
scons_local_path = 'dist/scons-local.tar.gz'
print('Downloading scons-local to', scons_local_path)
download( SCONS_LOCAL_URL, scons_local_path )
download(SCONS_LOCAL_URL, scons_local_path)
print('Decompressing scons-local to', distcheck_top_dir)
tarball.decompress( scons_local_path, distcheck_top_dir )
tarball.decompress(scons_local_path, distcheck_top_dir)
# Run compilation
print('Compiling decompressed tarball')
all_build_status = True
for platform in options.platforms.split(','):
print('Testing platform:', platform)
build_status, log_path = check_compile( distcheck_top_dir, platform )
build_status, log_path = check_compile(distcheck_top_dir, platform)
print('see build log:', log_path)
print(build_status and '=> ok' or '=> FAILED')
all_build_status = all_build_status and build_status
if not build_status:
print('Testing failed on at least one platform, aborting...')
svn_remove_tag( tag_url, 'Removing tag due to failed testing' )
svn_remove_tag(tag_url, 'Removing tag due to failed testing')
sys.exit(1)
if options.user:
if not options.no_web:
print('Uploading documentation using user', options.user)
sourceforge_web_synchro( SOURCEFORGE_PROJECT, doc_distcheck_top_dir, user=options.user, sftp=options.sftp )
sourceforge_web_synchro(SOURCEFORGE_PROJECT, doc_distcheck_top_dir, user=options.user, sftp=options.sftp)
print('Completed documentation upload')
print('Uploading source and documentation tarballs for release using user', options.user)
sourceforge_release_tarball( SOURCEFORGE_PROJECT,
sourceforge_release_tarball(SOURCEFORGE_PROJECT,
[source_tarball_path, doc_tarball_path],
user=options.user, sftp=options.sftp )
user=options.user, sftp=options.sftp)
print('Source and doc release tarballs uploaded')
else:
print('No upload user specified. Web site and download tarbal were not uploaded.')
print('No upload user specified. Web site and download tarball were not uploaded.')
print('Tarball can be found at:', doc_tarball_path)
# Set next version number and commit
set_version( next_version )
svn_commit( 'Released ' + release_version )
set_version(next_version)
svn_commit('Released ' + release_version)
else:
sys.stderr.write( msg + '\n' )
sys.stderr.write(msg + '\n')
if __name__ == '__main__':
main()

meson.build (new file, 116 lines)

@@ -0,0 +1,116 @@
project(
'jsoncpp',
'cpp',
version : '1.9.0',
default_options : [
'buildtype=release',
'cpp_std=c++11',
'warning_level=1'],
license : 'Public Domain',
meson_version : '>= 0.50.0')
jsoncpp_ver_arr = meson.project_version().split('.')
jsoncpp_major_version = jsoncpp_ver_arr[0]
jsoncpp_minor_version = jsoncpp_ver_arr[1]
jsoncpp_patch_version = jsoncpp_ver_arr[2]
jsoncpp_cdata = configuration_data()
jsoncpp_cdata.set('JSONCPP_VERSION', meson.project_version())
jsoncpp_cdata.set('JSONCPP_VERSION_MAJOR', jsoncpp_major_version)
jsoncpp_cdata.set('JSONCPP_VERSION_MINOR', jsoncpp_minor_version)
jsoncpp_cdata.set('JSONCPP_VERSION_PATCH', jsoncpp_patch_version)
jsoncpp_cdata.set('JSONCPP_USE_SECURE_MEMORY',0)
jsoncpp_gen_sources = configure_file(
input : 'src/lib_json/version.h.in',
output : 'version.h',
configuration : jsoncpp_cdata,
install : true,
install_dir : join_paths(get_option('prefix'), get_option('includedir'), 'json')
)
jsoncpp_headers = [
'include/json/allocator.h',
'include/json/assertions.h',
'include/json/autolink.h',
'include/json/config.h',
'include/json/features.h',
'include/json/forwards.h',
'include/json/json.h',
'include/json/reader.h',
'include/json/value.h',
'include/json/writer.h']
jsoncpp_include_directories = include_directories('include')
install_headers(
jsoncpp_headers,
subdir : 'json')
if get_option('default_library') == 'shared' and meson.get_compiler('cpp').get_id() == 'msvc'
dll_export_flag = '-DJSON_DLL_BUILD'
dll_import_flag = '-DJSON_DLL'
else
dll_export_flag = []
dll_import_flag = []
endif
jsoncpp_lib = library(
'jsoncpp',
[ jsoncpp_gen_sources,
jsoncpp_headers,
'src/lib_json/json_tool.h',
'src/lib_json/json_reader.cpp',
'src/lib_json/json_value.cpp',
'src/lib_json/json_writer.cpp'],
soversion : 21,
install : true,
include_directories : jsoncpp_include_directories,
cpp_args: dll_export_flag)
import('pkgconfig').generate(
libraries : jsoncpp_lib,
version : meson.project_version(),
name : 'jsoncpp',
filebase : 'jsoncpp',
description : 'A C++ library for interacting with JSON')
# for libraries bundling jsoncpp
jsoncpp_dep = declare_dependency(
include_directories : jsoncpp_include_directories,
link_with : jsoncpp_lib,
version : meson.project_version(),
sources : jsoncpp_gen_sources)
# tests
python = import('python3').find_python()
jsoncpp_test = executable(
'jsoncpp_test',
[ 'src/test_lib_json/jsontest.cpp',
'src/test_lib_json/jsontest.h',
'src/test_lib_json/main.cpp',
'src/test_lib_json/fuzz.cpp'],
include_directories : jsoncpp_include_directories,
link_with : jsoncpp_lib,
install : false,
cpp_args: dll_import_flag)
test(
'unittest_jsoncpp_test',
jsoncpp_test)
jsontestrunner = executable(
'jsontestrunner',
'src/jsontestrunner/main.cpp',
include_directories : jsoncpp_include_directories,
link_with : jsoncpp_lib,
install : false,
cpp_args: dll_import_flag)
test(
'unittest_jsontestrunner',
python,
args : [
'-B',
join_paths(meson.current_source_dir(), 'test/runjsontests.py'),
jsontestrunner,
join_paths(meson.current_source_dir(), 'test/data')]
)


@@ -1,7 +1,5 @@
prefix=@CMAKE_INSTALL_PREFIX@
exec_prefix=${prefix}
libdir=${exec_prefix}/@LIBRARY_INSTALL_DIR@
includedir=${prefix}/@INCLUDE_INSTALL_DIR@
libdir=@CMAKE_INSTALL_FULL_LIBDIR@
includedir=@CMAKE_INSTALL_FULL_INCLUDEDIR@
Name: jsoncpp
Description: A C++ library for interacting with JSON


@@ -1,53 +0,0 @@
import fnmatch
import os
def generate( env ):
def Glob( env, includes = None, excludes = None, dir = '.' ):
"""Adds Glob( includes = Split( '*' ), excludes = None, dir = '.')
helper function to environment.
Glob the file-system files.
includes: list of file name patterns included in the return list when matched.
excludes: list of file name patterns excluded from the return list.
Example:
sources = env.Glob( ("*.cpp", '*.h'), "~*.cpp", "#src" )
"""
def filterFilename(path):
abs_path = os.path.join( dir, path )
if not os.path.isfile(abs_path):
return 0
fn = os.path.basename(path)
match = 0
for include in includes:
if fnmatch.fnmatchcase( fn, include ):
match = 1
break
if match == 1 and not excludes is None:
for exclude in excludes:
if fnmatch.fnmatchcase( fn, exclude ):
match = 0
break
return match
if includes is None:
includes = ('*',)
elif type(includes) in ( type(''), type(u'') ):
includes = (includes,)
if type(excludes) in ( type(''), type(u'') ):
excludes = (excludes,)
dir = env.Dir(dir).abspath
paths = os.listdir( dir )
def makeAbsFileNode( path ):
return env.File( os.path.join( dir, path ) )
nodes = filter( filterFilename, paths )
return map( makeAbsFileNode, nodes )
from SCons.Script import Environment
Environment.Glob = Glob
def exists(env):
"""
Tool always exists.
"""
return True


@@ -1,179 +0,0 @@
import os
import os.path
from fnmatch import fnmatch
import targz
##def DoxyfileParse(file_contents):
## """
## Parse a Doxygen source file and return a dictionary of all the values.
## Values will be strings and lists of strings.
## """
## data = {}
##
## import shlex
## lex = shlex.shlex(instream = file_contents, posix = True)
## lex.wordchars += "*+./-:"
## lex.whitespace = lex.whitespace.replace("\n", "")
## lex.escape = ""
##
## lineno = lex.lineno
## last_backslash_lineno = lineno
## token = lex.get_token()
## key = token # the first token should be a key
## last_token = ""
## key_token = False
## next_key = False
## new_data = True
##
## def append_data(data, key, new_data, token):
## if new_data or len(data[key]) == 0:
## data[key].append(token)
## else:
## data[key][-1] += token
##
## while token:
## if token in ['\n']:
## if last_token not in ['\\']:
## key_token = True
## elif token in ['\\']:
## pass
## elif key_token:
## key = token
## key_token = False
## else:
## if token == "+=":
## if not data.has_key(key):
## data[key] = list()
## elif token == "=":
## data[key] = list()
## else:
## append_data( data, key, new_data, token )
## new_data = True
##
## last_token = token
## token = lex.get_token()
##
## if last_token == '\\' and token != '\n':
## new_data = False
## append_data( data, key, new_data, '\\' )
##
## # compress lists of len 1 into single strings
## for (k, v) in data.items():
## if len(v) == 0:
## data.pop(k)
##
## # items in the following list will be kept as lists and not converted to strings
## if k in ["INPUT", "FILE_PATTERNS", "EXCLUDE_PATTERNS"]:
## continue
##
## if len(v) == 1:
## data[k] = v[0]
##
## return data
##
##def DoxySourceScan(node, env, path):
## """
## Doxygen Doxyfile source scanner. This should scan the Doxygen file and add
## any files used to generate docs to the list of source files.
## """
## default_file_patterns = [
## '*.c', '*.cc', '*.cxx', '*.cpp', '*.c++', '*.java', '*.ii', '*.ixx',
## '*.ipp', '*.i++', '*.inl', '*.h', '*.hh ', '*.hxx', '*.hpp', '*.h++',
## '*.idl', '*.odl', '*.cs', '*.php', '*.php3', '*.inc', '*.m', '*.mm',
## '*.py',
## ]
##
## default_exclude_patterns = [
## '*~',
## ]
##
## sources = []
##
## data = DoxyfileParse(node.get_contents())
##
## if data.get("RECURSIVE", "NO") == "YES":
## recursive = True
## else:
## recursive = False
##
## file_patterns = data.get("FILE_PATTERNS", default_file_patterns)
## exclude_patterns = data.get("EXCLUDE_PATTERNS", default_exclude_patterns)
##
## for node in data.get("INPUT", []):
## if os.path.isfile(node):
## sources.add(node)
## elif os.path.isdir(node):
## if recursive:
## for root, dirs, files in os.walk(node):
## for f in files:
## filename = os.path.join(root, f)
##
## pattern_check = reduce(lambda x, y: x or bool(fnmatch(filename, y)), file_patterns, False)
## exclude_check = reduce(lambda x, y: x and fnmatch(filename, y), exclude_patterns, True)
##
## if pattern_check and not exclude_check:
## sources.append(filename)
## else:
## for pattern in file_patterns:
## sources.extend(glob.glob("/".join([node, pattern])))
## sources = map( lambda path: env.File(path), sources )
## return sources
##
##
##def DoxySourceScanCheck(node, env):
## """Check if we should scan this file"""
## return os.path.isfile(node.path)
def srcDistEmitter(source, target, env):
## """Doxygen Doxyfile emitter"""
## # possible output formats and their default values and output locations
## output_formats = {
## "HTML": ("YES", "html"),
## "LATEX": ("YES", "latex"),
## "RTF": ("NO", "rtf"),
## "MAN": ("YES", "man"),
## "XML": ("NO", "xml"),
## }
##
## data = DoxyfileParse(source[0].get_contents())
##
## targets = []
## out_dir = data.get("OUTPUT_DIRECTORY", ".")
##
## # add our output locations
## for (k, v) in output_formats.items():
## if data.get("GENERATE_" + k, v[0]) == "YES":
## targets.append(env.Dir( os.path.join(out_dir, data.get(k + "_OUTPUT", v[1]))) )
##
## # don't clobber targets
## for node in targets:
## env.Precious(node)
##
## # set up cleaning stuff
## for node in targets:
## env.Clean(node, node)
##
## return (targets, source)
return (target,source)
def generate(env):
"""
Add builders and construction variables for the
SrcDist tool.
"""
## doxyfile_scanner = env.Scanner(
## DoxySourceScan,
## "DoxySourceScan",
## scan_check = DoxySourceScanCheck,
## )
if targz.exists(env):
srcdist_builder = targz.makeBuilder( srcDistEmitter )
env['BUILDERS']['SrcDist'] = srcdist_builder
def exists(env):
"""
Make sure srcdist exists.
"""
return targz.exists(env)

@@ -1,80 +0,0 @@
import re
from SCons.Script import * # the usual scons stuff you get in a SConscript
import collections
def generate(env):
"""
Add builders and construction variables for the
SubstInFile tool.
Adds SubstInFile builder, which substitutes the keys->values of SUBST_DICT
from the source to the target.
The values of SUBST_DICT first have any construction variables expanded
(its keys are not expanded).
If a value of SUBST_DICT is a python callable function, it is called and
the result is expanded as the value.
If there's more than one source and more than one target, each target gets
substituted from the corresponding source.
"""
def do_subst_in_file(targetfile, sourcefile, dict):
"""Replace all instances of the keys of dict with their values.
For example, if dict is {'%VERSION%': '1.2345', '%BASE%': 'MyProg'},
then all instances of %VERSION% in the file will be replaced with 1.2345 etc.
"""
try:
f = open(sourcefile, 'rb')
contents = f.read()
f.close()
except:
raise SCons.Errors.UserError("Can't read source file %s"%sourcefile)
for (k,v) in list(dict.items()):
contents = re.sub(k, v, contents)
try:
f = open(targetfile, 'wb')
f.write(contents)
f.close()
except:
raise SCons.Errors.UserError("Can't write target file %s"%targetfile)
return 0 # success
def subst_in_file(target, source, env):
if 'SUBST_DICT' not in env:
raise SCons.Errors.UserError("SubstInFile requires SUBST_DICT to be set.")
d = dict(env['SUBST_DICT']) # copy it
for (k,v) in list(d.items()):
if isinstance(v, collections.Callable):
d[k] = env.subst(v()).replace('\\','\\\\')
elif SCons.Util.is_String(v):
d[k] = env.subst(v).replace('\\','\\\\')
else:
raise SCons.Errors.UserError("SubstInFile: key %s: %s must be a string or callable"%(k, repr(v)))
for (t,s) in zip(target, source):
return do_subst_in_file(str(t), str(s), d)
def subst_in_file_string(target, source, env):
"""This is what gets printed on the console."""
return '\n'.join(['Substituting vars from %s into %s'%(str(s), str(t))
for (t,s) in zip(target, source)])
def subst_emitter(target, source, env):
"""Add dependency from substituted SUBST_DICT to target.
Returns original target, source tuple unchanged.
"""
d = env['SUBST_DICT'].copy() # copy it
for (k,v) in list(d.items()):
if isinstance(v, collections.Callable):
d[k] = env.subst(v())
elif SCons.Util.is_String(v):
d[k]=env.subst(v)
Depends(target, SCons.Node.Python.Value(d))
return target, source
## env.Append(TOOLS = 'substinfile') # this should be done automatically by SCons
subst_action = SCons.Action.Action( subst_in_file, subst_in_file_string )
env['BUILDERS']['SubstInFile'] = Builder(action=subst_action, emitter=subst_emitter)
def exists(env):
"""
Make sure tool exists.
"""
return True

@@ -1,82 +0,0 @@
"""tarball
Tool-specific initialization for tarball.
"""
## Commands to tackle a command based implementation:
##to unpack on the fly...
##gunzip < FILE.tar.gz | tar xvf -
##to pack on the fly...
##tar cvf - FILE-LIST | gzip -c > FILE.tar.gz
import os.path
import SCons.Builder
import SCons.Node.FS
import SCons.Util
try:
import gzip
import tarfile
internal_targz = 1
except ImportError:
internal_targz = 0
TARGZ_DEFAULT_COMPRESSION_LEVEL = 9
if internal_targz:
def targz(target, source, env):
def archive_name( path ):
path = os.path.normpath( os.path.abspath( path ) )
common_path = os.path.commonprefix( (base_dir, path) )
archive_name = path[len(common_path):]
return archive_name
def visit(tar, dirname, names):
for name in names:
path = os.path.join(dirname, name)
if os.path.isfile(path):
tar.add(path, archive_name(path) )
compression = env.get('TARGZ_COMPRESSION_LEVEL',TARGZ_DEFAULT_COMPRESSION_LEVEL)
base_dir = os.path.normpath( env.get('TARGZ_BASEDIR', env.Dir('.')).abspath )
target_path = str(target[0])
fileobj = gzip.GzipFile( target_path, 'wb', compression )
tar = tarfile.TarFile(os.path.splitext(target_path)[0], 'w', fileobj)
for source in source:
source_path = str(source)
if source.isdir():
os.path.walk(source_path, visit, tar)
else:
tar.add(source_path, archive_name(source_path) ) # filename, arcname
tar.close()
targzAction = SCons.Action.Action(targz, varlist=['TARGZ_COMPRESSION_LEVEL','TARGZ_BASEDIR'])
def makeBuilder( emitter = None ):
return SCons.Builder.Builder(action = SCons.Action.Action('$TARGZ_COM', '$TARGZ_COMSTR'),
source_factory = SCons.Node.FS.Entry,
source_scanner = SCons.Defaults.DirScanner,
suffix = '$TARGZ_SUFFIX',
multi = 1)
TarGzBuilder = makeBuilder()
def generate(env):
"""Add Builders and construction variables for zip to an Environment.
The following environment variables may be set:
TARGZ_COMPRESSION_LEVEL: integer, [0-9]. 0: no compression, 9: best compression (same as gzip compression level).
TARGZ_BASEDIR: base directory used to determine archive names (this allows the archive name to be relative
to something other than top-dir).
"""
env['BUILDERS']['TarGz'] = TarGzBuilder
env['TARGZ_COM'] = targzAction
env['TARGZ_COMPRESSION_LEVEL'] = TARGZ_DEFAULT_COMPRESSION_LEVEL # range 0-9
env['TARGZ_SUFFIX'] = '.tar.gz'
env['TARGZ_BASEDIR'] = env.Dir('.') # Source archive names are made relative to this directory.
else:
def generate(env):
pass
def exists(env):
return internal_targz

@@ -1,5 +1,5 @@
ADD_SUBDIRECTORY(lib_json)
IF(JSONCPP_WITH_TESTS)
ADD_SUBDIRECTORY(jsontestrunner)
ADD_SUBDIRECTORY(test_lib_json)
ENDIF(JSONCPP_WITH_TESTS)
add_subdirectory(lib_json)
if(JSONCPP_WITH_TESTS)
add_subdirectory(jsontestrunner)
add_subdirectory(test_lib_json)
endif()

@@ -1,22 +1,36 @@
FIND_PACKAGE(PythonInterp 2.6 REQUIRED)
find_package(PythonInterp 2.6)
IF(JSONCPP_LIB_BUILD_SHARED)
ADD_DEFINITIONS( -DJSON_DLL )
ENDIF(JSONCPP_LIB_BUILD_SHARED)
ADD_EXECUTABLE(jsontestrunner_exe
add_executable(jsontestrunner_exe
main.cpp
)
TARGET_LINK_LIBRARIES(jsontestrunner_exe jsoncpp_lib)
SET_TARGET_PROPERTIES(jsontestrunner_exe PROPERTIES OUTPUT_NAME jsontestrunner_exe)
IF(PYTHONINTERP_FOUND)
if(BUILD_SHARED_LIBS)
add_compile_definitions( JSON_DLL )
endif()
target_link_libraries(jsontestrunner_exe jsoncpp_lib)
set_target_properties(jsontestrunner_exe PROPERTIES OUTPUT_NAME jsontestrunner_exe)
if(PYTHONINTERP_FOUND)
# Run end to end parser/writer tests
SET(TEST_DIR ${CMAKE_CURRENT_SOURCE_DIR}/../../test)
SET(RUNJSONTESTS_PATH ${TEST_DIR}/runjsontests.py)
ADD_CUSTOM_TARGET(jsoncpp_readerwriter_tests ALL
set(TEST_DIR ${CMAKE_CURRENT_SOURCE_DIR}/../../test)
set(RUNJSONTESTS_PATH ${TEST_DIR}/runjsontests.py)
# Run unit tests in post-build
# (the default CMake workflow hides test results away in a file, which makes for a poor dev workflow)
add_custom_target(jsoncpp_readerwriter_tests
"${PYTHON_EXECUTABLE}" -B "${RUNJSONTESTS_PATH}" $<TARGET_FILE:jsontestrunner_exe> "${TEST_DIR}/data"
DEPENDS jsontestrunner_exe jsoncpp_test
)
ADD_CUSTOM_TARGET(jsoncpp_check DEPENDS jsoncpp_readerwriter_tests)
ENDIF(PYTHONINTERP_FOUND)
add_custom_target(jsoncpp_check DEPENDS jsoncpp_readerwriter_tests)
## Create tests for dashboard submission; this allows easy review of CI results at https://my.cdash.org/index.php?project=jsoncpp
add_test(NAME jsoncpp_readerwriter
COMMAND "${PYTHON_EXECUTABLE}" -B "${RUNJSONTESTS_PATH}" $<TARGET_FILE:jsontestrunner_exe> "${TEST_DIR}/data"
WORKING_DIRECTORY "${TEST_DIR}/data"
)
add_test(NAME jsoncpp_readerwriter_json_checker
COMMAND "${PYTHON_EXECUTABLE}" -B "${RUNJSONTESTS_PATH}" --with-json-checker $<TARGET_FILE:jsontestrunner_exe> "${TEST_DIR}/data"
WORKING_DIRECTORY "${TEST_DIR}/data"
)
endif()

@@ -1,39 +1,47 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Copyright 2007-2010 Baptiste Lepilleur and The JsonCpp Authors
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
#if defined(__GNUC__)
#pragma GCC diagnostic push
#pragma GCC diagnostic ignored "-Wdeprecated-declarations"
#elif defined(_MSC_VER)
#pragma warning(disable : 4996)
#endif
/* This executable is used for testing parser/writer using real JSON files.
*/
#include <json/json.h>
#include <algorithm> // sort
#include <stdio.h>
#include <cstdio>
#include <json/json.h>
#include <sstream>
#if defined(_MSC_VER) && _MSC_VER >= 1310
#pragma warning(disable : 4996) // disable fopen deprecation warning
#endif
struct Options {
Json::String path;
Json::Features features;
bool parseOnly;
using writeFuncType = Json::String (*)(Json::Value const&);
writeFuncType write;
};
static std::string normalizeFloatingPointStr(double value) {
static Json::String normalizeFloatingPointStr(double value) {
char buffer[32];
#if defined(_MSC_VER) && defined(__STDC_SECURE_LIB__)
sprintf_s(buffer, sizeof(buffer), "%.16g", value);
#else
snprintf(buffer, sizeof(buffer), "%.16g", value);
#endif
jsoncpp_snprintf(buffer, sizeof(buffer), "%.16g", value);
buffer[sizeof(buffer) - 1] = 0;
std::string s(buffer);
std::string::size_type index = s.find_last_of("eE");
if (index != std::string::npos) {
std::string::size_type hasSign =
Json::String s(buffer);
Json::String::size_type index = s.find_last_of("eE");
if (index != Json::String::npos) {
Json::String::size_type hasSign =
(s[index + 1] == '+' || s[index + 1] == '-') ? 1 : 0;
std::string::size_type exponentStartIndex = index + 1 + hasSign;
std::string normalized = s.substr(0, exponentStartIndex);
std::string::size_type indexDigit =
Json::String::size_type exponentStartIndex = index + 1 + hasSign;
Json::String normalized = s.substr(0, exponentStartIndex);
Json::String::size_type indexDigit =
s.find_first_not_of('0', exponentStartIndex);
std::string exponent = "0";
if (indexDigit !=
std::string::npos) // There is an exponent different from 0
Json::String exponent = "0";
if (indexDigit != Json::String::npos) // There is an exponent different
// from 0
{
exponent = s.substr(indexDigit);
}
@@ -42,17 +50,18 @@ static std::string normalizeFloatingPointStr(double value) {
return s;
}
static std::string readInputTestFile(const char* path) {
static Json::String readInputTestFile(const char* path) {
FILE* file = fopen(path, "rb");
if (!file)
return std::string("");
return "";
fseek(file, 0, SEEK_END);
long size = ftell(file);
long const size = ftell(file);
size_t const usize = static_cast<unsigned long>(size);
fseek(file, 0, SEEK_SET);
std::string text;
char* buffer = new char[size + 1];
buffer[size] = 0;
if (fread(buffer, 1, size, file) == (unsigned long)size)
Json::String text;
if (fread(buffer, 1, usize, file) == usize)
text = buffer;
fclose(file);
delete[] buffer;
@@ -60,7 +69,7 @@ static std::string readInputTestFile(const char* path) {
}
static void
printValueTree(FILE* fout, Json::Value& value, const std::string& path = ".") {
printValueTree(FILE* fout, Json::Value& value, const Json::String& path = ".") {
if (value.hasComment(Json::commentBefore)) {
fprintf(fout, "%s\n", value.getComment(Json::commentBefore).c_str());
}
@@ -69,21 +78,15 @@ printValueTree(FILE* fout, Json::Value& value, const std::string& path = ".") {
fprintf(fout, "%s=null\n", path.c_str());
break;
case Json::intValue:
fprintf(fout,
"%s=%s\n",
path.c_str(),
fprintf(fout, "%s=%s\n", path.c_str(),
Json::valueToString(value.asLargestInt()).c_str());
break;
case Json::uintValue:
fprintf(fout,
"%s=%s\n",
path.c_str(),
fprintf(fout, "%s=%s\n", path.c_str(),
Json::valueToString(value.asLargestUInt()).c_str());
break;
case Json::realValue:
fprintf(fout,
"%s=%s\n",
path.c_str(),
fprintf(fout, "%s=%s\n", path.c_str(),
normalizeFloatingPointStr(value.asDouble()).c_str());
break;
case Json::stringValue:
@@ -94,14 +97,10 @@ printValueTree(FILE* fout, Json::Value& value, const std::string& path = ".") {
break;
case Json::arrayValue: {
fprintf(fout, "%s=[]\n", path.c_str());
int size = value.size();
for (int index = 0; index < size; ++index) {
Json::ArrayIndex size = value.size();
for (Json::ArrayIndex index = 0; index < size; ++index) {
static char buffer[16];
#if defined(_MSC_VER) && defined(__STDC_SECURE_LIB__)
sprintf_s(buffer, sizeof(buffer), "[%d]", index);
#else
snprintf(buffer, sizeof(buffer), "[%d]", index);
#endif
jsoncpp_snprintf(buffer, sizeof(buffer), "[%u]", index);
printValueTree(fout, value[index], path + buffer);
}
} break;
@@ -109,11 +108,8 @@ printValueTree(FILE* fout, Json::Value& value, const std::string& path = ".") {
fprintf(fout, "%s={}\n", path.c_str());
Json::Value::Members members(value.getMemberNames());
std::sort(members.begin(), members.end());
std::string suffix = *(path.end() - 1) == '.' ? "" : ".";
for (Json::Value::Members::iterator it = members.begin();
it != members.end();
++it) {
const std::string& name = *it;
Json::String suffix = *(path.end() - 1) == '.' ? "" : ".";
for (auto name : members) {
printValueTree(fout, value[name], path + suffix + name);
}
} break;
@@ -126,57 +122,72 @@ printValueTree(FILE* fout, Json::Value& value, const std::string& path = ".") {
}
}
static int parseAndSaveValueTree(const std::string& input,
const std::string& actual,
const std::string& kind,
Json::Value& root,
static int parseAndSaveValueTree(const Json::String& input,
const Json::String& actual,
const Json::String& kind,
const Json::Features& features,
bool parseOnly) {
bool parseOnly,
Json::Value* root) {
Json::Reader reader(features);
bool parsingSuccessful = reader.parse(input, root);
bool parsingSuccessful =
reader.parse(input.data(), input.data() + input.size(), *root);
if (!parsingSuccessful) {
printf("Failed to parse %s file: \n%s\n",
kind.c_str(),
printf("Failed to parse %s file: \n%s\n", kind.c_str(),
reader.getFormattedErrorMessages().c_str());
return 1;
}
if (!parseOnly) {
FILE* factual = fopen(actual.c_str(), "wt");
if (!factual) {
printf("Failed to create %s actual file.\n", kind.c_str());
return 2;
}
printValueTree(factual, root);
printValueTree(factual, *root);
fclose(factual);
}
return 0;
}
static int rewriteValueTree(const std::string& rewritePath,
const Json::Value& root,
std::string& rewrite) {
// Json::FastWriter writer;
// writer.enableYAMLCompatibility();
// static Json::String useFastWriter(Json::Value const& root) {
// Json::FastWriter writer;
// writer.enableYAMLCompatibility();
// return writer.write(root);
// }
static Json::String useStyledWriter(Json::Value const& root) {
Json::StyledWriter writer;
rewrite = writer.write(root);
return writer.write(root);
}
static Json::String useStyledStreamWriter(Json::Value const& root) {
Json::StyledStreamWriter writer;
Json::OStringStream sout;
writer.write(sout, root);
return sout.str();
}
static Json::String useBuiltStyledStreamWriter(Json::Value const& root) {
Json::StreamWriterBuilder builder;
return Json::writeString(builder, root);
}
static int rewriteValueTree(const Json::String& rewritePath,
const Json::Value& root,
Options::writeFuncType write,
Json::String* rewrite) {
*rewrite = write(root);
FILE* fout = fopen(rewritePath.c_str(), "wt");
if (!fout) {
printf("Failed to create rewrite file: %s\n", rewritePath.c_str());
return 2;
}
fprintf(fout, "%s\n", rewrite.c_str());
fprintf(fout, "%s\n", rewrite->c_str());
fclose(fout);
return 0;
}
static std::string removeSuffix(const std::string& path,
const std::string& extension) {
static Json::String removeSuffix(const Json::String& path,
const Json::String& extension) {
if (extension.length() >= path.length())
return std::string("");
std::string suffix = path.substr(path.length() - extension.length());
return Json::String("");
Json::String suffix = path.substr(path.length() - extension.length());
if (suffix != extension)
return std::string("");
return Json::String("");
return path.substr(0, path.length() - extension.length());
}
@@ -194,84 +205,96 @@ static int printUsage(const char* argv[]) {
return 3;
}
int parseCommandLine(int argc,
const char* argv[],
Json::Features& features,
std::string& path,
bool& parseOnly) {
parseOnly = false;
static int parseCommandLine(int argc, const char* argv[], Options* opts) {
opts->parseOnly = false;
opts->write = &useStyledWriter;
if (argc < 2) {
return printUsage(argv);
}
int index = 1;
if (std::string(argv[1]) == "--json-checker") {
features = Json::Features::strictMode();
parseOnly = true;
if (Json::String(argv[index]) == "--json-checker") {
opts->features = Json::Features::strictMode();
opts->parseOnly = true;
++index;
}
if (std::string(argv[1]) == "--json-config") {
if (Json::String(argv[index]) == "--json-config") {
printConfig();
return 3;
}
if (Json::String(argv[index]) == "--json-writer") {
++index;
Json::String const writerName(argv[index++]);
if (writerName == "StyledWriter") {
opts->write = &useStyledWriter;
} else if (writerName == "StyledStreamWriter") {
opts->write = &useStyledStreamWriter;
} else if (writerName == "BuiltStyledStreamWriter") {
opts->write = &useBuiltStyledStreamWriter;
} else {
printf("Unknown '--json-writer %s'\n", writerName.c_str());
return 4;
}
}
if (index == argc || index + 1 < argc) {
return printUsage(argv);
}
path = argv[index];
opts->path = argv[index];
return 0;
}
static int runTest(Options const& opts) {
int exitCode = 0;
int main(int argc, const char* argv[]) {
std::string path;
Json::Features features;
bool parseOnly;
int exitCode = parseCommandLine(argc, argv, features, path, parseOnly);
if (exitCode != 0) {
Json::String input = readInputTestFile(opts.path.c_str());
if (input.empty()) {
printf("Failed to read input or empty input: %s\n", opts.path.c_str());
return 3;
}
Json::String basePath = removeSuffix(opts.path, ".json");
if (!opts.parseOnly && basePath.empty()) {
printf("Bad input path. Path does not end with '.expected':\n%s\n",
opts.path.c_str());
return 3;
}
Json::String const actualPath = basePath + ".actual";
Json::String const rewritePath = basePath + ".rewrite";
Json::String const rewriteActualPath = basePath + ".actual-rewrite";
Json::Value root;
exitCode = parseAndSaveValueTree(input, actualPath, "input", opts.features,
opts.parseOnly, &root);
if (exitCode || opts.parseOnly) {
return exitCode;
}
try {
std::string input = readInputTestFile(path.c_str());
if (input.empty()) {
printf("Failed to read input or empty input: %s\n", path.c_str());
return 3;
}
std::string basePath = removeSuffix(argv[1], ".json");
if (!parseOnly && basePath.empty()) {
printf("Bad input path. Path does not end with '.expected':\n%s\n",
path.c_str());
return 3;
}
std::string actualPath = basePath + ".actual";
std::string rewritePath = basePath + ".rewrite";
std::string rewriteActualPath = basePath + ".actual-rewrite";
Json::Value root;
exitCode = parseAndSaveValueTree(
input, actualPath, "input", root, features, parseOnly);
if (exitCode == 0 && !parseOnly) {
std::string rewrite;
exitCode = rewriteValueTree(rewritePath, root, rewrite);
if (exitCode == 0) {
Json::Value rewriteRoot;
exitCode = parseAndSaveValueTree(rewrite,
rewriteActualPath,
"rewrite",
rewriteRoot,
features,
parseOnly);
}
}
Json::String rewrite;
exitCode = rewriteValueTree(rewritePath, root, opts.write, &rewrite);
if (exitCode) {
return exitCode;
}
catch (const std::exception& e) {
printf("Unhandled exception:\n%s\n", e.what());
exitCode = 1;
Json::Value rewriteRoot;
exitCode = parseAndSaveValueTree(rewrite, rewriteActualPath, "rewrite",
opts.features, opts.parseOnly, &rewriteRoot);
if (exitCode) {
return exitCode;
}
return exitCode;
return 0;
}
int main(int argc, const char* argv[]) {
Options opts;
try {
int exitCode = parseCommandLine(argc, argv, &opts);
if (exitCode != 0) {
printf("Failed to parse command-line.");
return exitCode;
}
return runTest(opts);
} catch (const std::exception& e) {
printf("Unhandled exception:\n%s\n", e.what());
return 1;
}
}
#if defined(__GNUC__)
#pragma GCC diagnostic pop
#endif
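
For readers following the jsontestrunner changes above, the new Options::write function pointer simply selects one of three public writer back-ends. Below is a minimal, self-contained sketch of that dispatch; it is illustrative only and not code from the repository (the helper name writeWithBackend and the sample values are made up), but it uses only the jsoncpp APIs already visible in the diff above.

// Sketch only: choose a writer back-end by name, mirroring --json-writer.
#include <json/json.h>
#include <iostream>
#include <sstream>

static Json::String writeWithBackend(const Json::Value& root,
                                     const Json::String& name) {
  if (name == "StyledWriter") {
    Json::StyledWriter writer;        // legacy writer, returns a string directly
    return writer.write(root);
  }
  if (name == "StyledStreamWriter") {
    Json::StyledStreamWriter writer;  // legacy writer targeting an ostream
    std::ostringstream sout;
    writer.write(sout, root);
    return sout.str();
  }
  // Default: the newer StreamWriterBuilder path ("BuiltStyledStreamWriter" above).
  Json::StreamWriterBuilder builder;
  return Json::writeString(builder, root);
}

int main() {
  Json::Value root;
  root["name"] = "jsoncpp";
  root["pi"] = 3.1415926535897931;
  std::cout << writeWithBackend(root, "BuiltStyledStreamWriter") << std::endl;
  return 0;
}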

@@ -1,9 +0,0 @@
Import( 'env_testing buildJSONTests' )
buildJSONTests( env_testing, Split( """
main.cpp
""" ),
'jsontestrunner' )
# For 'check' to work, 'libs' must be built first.
env_testing.Depends('jsontestrunner', '#libs')

@@ -1,29 +1,45 @@
OPTION(JSONCPP_LIB_BUILD_SHARED "Build jsoncpp_lib as a shared library." OFF)
IF(BUILD_SHARED_LIBS)
SET(JSONCPP_LIB_BUILD_SHARED ON)
ENDIF(BUILD_SHARED_LIBS)
IF(JSONCPP_LIB_BUILD_SHARED)
SET(JSONCPP_LIB_TYPE SHARED)
ADD_DEFINITIONS( -DJSON_DLL_BUILD )
ELSE(JSONCPP_LIB_BUILD_SHARED)
SET(JSONCPP_LIB_TYPE STATIC)
ENDIF(JSONCPP_LIB_BUILD_SHARED)
if( CMAKE_COMPILER_IS_GNUCXX )
#Get compiler version.
execute_process( COMMAND ${CMAKE_CXX_COMPILER} -dumpversion
OUTPUT_VARIABLE GNUCXX_VERSION )
#-Werror=* was introduced -after- GCC 4.1.2
if( GNUCXX_VERSION VERSION_GREATER 4.1.2 )
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Werror=strict-aliasing")
endif()
#Get compiler version.
execute_process( COMMAND ${CMAKE_CXX_COMPILER} -dumpversion
OUTPUT_VARIABLE GNUCXX_VERSION )
#-Werror=* was introduced -after- GCC 4.1.2
if( GNUCXX_VERSION VERSION_GREATER 4.1.2 )
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Werror=strict-aliasing")
endif()
endif( CMAKE_COMPILER_IS_GNUCXX )
SET( JSONCPP_INCLUDE_DIR ../../include )
include(CheckIncludeFileCXX)
include(CheckTypeSize)
include(CheckStructHasMember)
include(CheckCXXSymbolExists)
SET( PUBLIC_HEADERS
check_include_file_cxx(clocale HAVE_CLOCALE)
check_cxx_symbol_exists(localeconv clocale HAVE_LOCALECONV)
if(CMAKE_VERSION VERSION_LESS 3.0.0)
# The "LANGUAGE CXX" parameter is not supported in CMake versions below 3,
# so the C compiler and header have to be used.
check_include_file(locale.h HAVE_LOCALE_H)
set(CMAKE_EXTRA_INCLUDE_FILES locale.h)
check_type_size("struct lconv" LCONV_SIZE)
unset(CMAKE_EXTRA_INCLUDE_FILES)
check_struct_has_member("struct lconv" decimal_point locale.h HAVE_DECIMAL_POINT)
else()
set(CMAKE_EXTRA_INCLUDE_FILES clocale)
check_type_size(lconv LCONV_SIZE LANGUAGE CXX)
unset(CMAKE_EXTRA_INCLUDE_FILES)
check_struct_has_member(lconv decimal_point clocale HAVE_DECIMAL_POINT LANGUAGE CXX)
endif()
if(NOT (HAVE_CLOCALE AND HAVE_LCONV_SIZE AND HAVE_DECIMAL_POINT AND HAVE_LOCALECONV))
message(WARNING "Locale functionality is not supported")
add_compile_definitions(JSONCPP_NO_LOCALE_SUPPORT)
endif()
set( JSONCPP_INCLUDE_DIR ../../include )
set( PUBLIC_HEADERS
${JSONCPP_INCLUDE_DIR}/json/config.h
${JSONCPP_INCLUDE_DIR}/json/forwards.h
${JSONCPP_INCLUDE_DIR}/json/features.h
@@ -31,40 +47,100 @@ SET( PUBLIC_HEADERS
${JSONCPP_INCLUDE_DIR}/json/reader.h
${JSONCPP_INCLUDE_DIR}/json/writer.h
${JSONCPP_INCLUDE_DIR}/json/assertions.h
${JSONCPP_INCLUDE_DIR}/json/version.h
${PROJECT_BINARY_DIR}/include/json/version.h
)
SOURCE_GROUP( "Public API" FILES ${PUBLIC_HEADERS} )
source_group( "Public API" FILES ${PUBLIC_HEADERS} )
ADD_LIBRARY( jsoncpp_lib ${JSONCPP_LIB_TYPE}
${PUBLIC_HEADERS}
json_tool.h
json_reader.cpp
json_batchallocator.h
json_valueiterator.inl
json_value.cpp
json_writer.cpp
version.h.in
)
SET_TARGET_PROPERTIES( jsoncpp_lib PROPERTIES OUTPUT_NAME jsoncpp )
SET_TARGET_PROPERTIES( jsoncpp_lib PROPERTIES VERSION ${JSONCPP_VERSION} SOVERSION ${JSONCPP_VERSION_MAJOR} )
IF(NOT CMAKE_VERSION VERSION_LESS 2.8.11)
TARGET_INCLUDE_DIRECTORIES( jsoncpp_lib PUBLIC
$<INSTALL_INTERFACE:${INCLUDE_INSTALL_DIR}>
$<BUILD_INTERFACE:${CMAKE_CURRENT_LIST_DIR}/${JSONCPP_INCLUDE_DIR}>
)
ENDIF(NOT CMAKE_VERSION VERSION_LESS 2.8.11)
set(jsoncpp_sources
json_tool.h
json_reader.cpp
json_valueiterator.inl
json_value.cpp
json_writer.cpp
version.h.in)
# Install instructions for this target
IF(JSONCPP_WITH_CMAKE_PACKAGE)
SET(INSTALL_EXPORT EXPORT jsoncpp)
ELSE(JSONCPP_WITH_CMAKE_PACKAGE)
SET(INSTALL_EXPORT)
ENDIF(JSONCPP_WITH_CMAKE_PACKAGE)
if(JSONCPP_WITH_CMAKE_PACKAGE)
set(INSTALL_EXPORT EXPORT jsoncpp)
else(JSONCPP_WITH_CMAKE_PACKAGE)
set(INSTALL_EXPORT)
endif()
INSTALL( TARGETS jsoncpp_lib ${INSTALL_EXPORT}
RUNTIME DESTINATION ${RUNTIME_INSTALL_DIR}
LIBRARY DESTINATION ${LIBRARY_INSTALL_DIR}
ARCHIVE DESTINATION ${ARCHIVE_INSTALL_DIR}
if(BUILD_SHARED_LIBS)
add_compile_definitions( JSON_DLL_BUILD )
endif()
add_library(jsoncpp_lib ${PUBLIC_HEADERS} ${jsoncpp_sources})
set_target_properties( jsoncpp_lib PROPERTIES VERSION ${JSONCPP_VERSION} SOVERSION ${JSONCPP_SOVERSION})
set_target_properties( jsoncpp_lib PROPERTIES OUTPUT_NAME jsoncpp
DEBUG_OUTPUT_NAME jsoncpp${DEBUG_LIBNAME_SUFFIX} )
set_target_properties( jsoncpp_lib PROPERTIES POSITION_INDEPENDENT_CODE ON)
# Set library's runtime search path on OSX
if(APPLE)
set_target_properties( jsoncpp_lib PROPERTIES INSTALL_RPATH "@loader_path/." )
endif()
# Specify compiler features required when compiling a given target.
# See https://cmake.org/cmake/help/v3.1/prop_gbl/CMAKE_CXX_KNOWN_FEATURES.html#prop_gbl:CMAKE_CXX_KNOWN_FEATURES
# for complete list of features available
target_compile_features(jsoncpp_lib PUBLIC
cxx_std_11 # Compiler mode is aware of C++ 11.
#MSVC 1900 cxx_alignas # Alignment control alignas, as defined in N2341.
#MSVC 1900 cxx_alignof # Alignment control alignof, as defined in N2341.
#MSVC 1900 cxx_attributes # Generic attributes, as defined in N2761.
cxx_auto_type # Automatic type deduction, as defined in N1984.
#MSVC 1900 cxx_constexpr # Constant expressions, as defined in N2235.
cxx_decltype # Decltype, as defined in N2343.
cxx_default_function_template_args # Default template arguments for function templates, as defined in DR226
cxx_defaulted_functions # Defaulted functions, as defined in N2346.
#MSVC 1900 cxx_defaulted_move_initializers # Defaulted move initializers, as defined in N3053.
cxx_delegating_constructors # Delegating constructors, as defined in N1986.
#MSVC 1900 cxx_deleted_functions # Deleted functions, as defined in N2346.
cxx_enum_forward_declarations # Enum forward declarations, as defined in N2764.
cxx_explicit_conversions # Explicit conversion operators, as defined in N2437.
cxx_extended_friend_declarations # Extended friend declarations, as defined in N1791.
cxx_extern_templates # Extern templates, as defined in N1987.
cxx_final # Override control final keyword, as defined in N2928, N3206 and N3272.
#MSVC 1900 cxx_func_identifier # Predefined __func__ identifier, as defined in N2340.
#MSVC 1900 cxx_generalized_initializers # Initializer lists, as defined in N2672.
#MSVC 1900 cxx_inheriting_constructors # Inheriting constructors, as defined in N2540.
#MSVC 1900 cxx_inline_namespaces # Inline namespaces, as defined in N2535.
cxx_lambdas # Lambda functions, as defined in N2927.
#MSVC 1900 cxx_local_type_template_args # Local and unnamed types as template arguments, as defined in N2657.
cxx_long_long_type # long long type, as defined in N1811.
#MSVC 1900 cxx_noexcept # Exception specifications, as defined in N3050.
#MSVC 1900 cxx_nonstatic_member_init # Non-static data member initialization, as defined in N2756.
cxx_nullptr # Null pointer, as defined in N2431.
cxx_override # Override control override keyword, as defined in N2928, N3206 and N3272.
cxx_range_for # Range-based for, as defined in N2930.
cxx_raw_string_literals # Raw string literals, as defined in N2442.
#MSVC 1900 cxx_reference_qualified_functions # Reference qualified functions, as defined in N2439.
cxx_right_angle_brackets # Right angle bracket parsing, as defined in N1757.
cxx_rvalue_references # R-value references, as defined in N2118.
#MSVC 1900 cxx_sizeof_member # Size of non-static data members, as defined in N2253.
cxx_static_assert # Static assert, as defined in N1720.
cxx_strong_enums # Strongly typed enums, as defined in N2347.
#MSVC 1900 cxx_thread_local # Thread-local variables, as defined in N2659.
cxx_trailing_return_types # Automatic function return type, as defined in N2541.
#MSVC 1900 cxx_unicode_literals # Unicode string literals, as defined in N2442.
cxx_uniform_initialization # Uniform initialization, as defined in N2640.
#MSVC 1900 cxx_unrestricted_unions # Unrestricted unions, as defined in N2544.
#MSVC 1900 cxx_user_literals # User-defined literals, as defined in N2765.
cxx_variadic_macros # Variadic macros, as defined in N1653.
cxx_variadic_templates # Variadic templates, as defined in N2242.
)
install( TARGETS jsoncpp_lib ${INSTALL_EXPORT}
RUNTIME DESTINATION ${CMAKE_INSTALL_BINDIR}
LIBRARY DESTINATION ${CMAKE_INSTALL_LIBDIR}
ARCHIVE DESTINATION ${CMAKE_INSTALL_LIBDIR})
if(NOT CMAKE_VERSION VERSION_LESS 2.8.11)
target_include_directories( jsoncpp_lib PUBLIC
$<INSTALL_INTERFACE:${CMAKE_INSTALL_INCLUDEDIR}>
$<BUILD_INTERFACE:${CMAKE_CURRENT_LIST_DIR}/${JSONCPP_INCLUDE_DIR}>
$<BUILD_INTERFACE:${PROJECT_BINARY_DIR}/include/json>)
endif()
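
The locale checks added above map onto a small amount of standard C++ functionality. The following rough sketch shows what the feature tests probe for; it is illustrative only, not code from the library.

#include <clocale>   // HAVE_CLOCALE
#include <cstdio>

int main() {
  // HAVE_LOCALECONV / LCONV_SIZE: localeconv() returns a struct lconv*.
  std::lconv* lc = std::localeconv();
  // HAVE_DECIMAL_POINT: struct lconv exposes a decimal_point member.
  const char decimalPoint = (lc && lc->decimal_point) ? *lc->decimal_point : '.';
  std::printf("decimal point: %c\n", decimalPoint);
  return 0;
}

If any of these checks fails, the build defines JSONCPP_NO_LOCALE_SUPPORT, as shown in the CMake block above.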

@@ -1,121 +0,0 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
#ifndef JSONCPP_BATCHALLOCATOR_H_INCLUDED
#define JSONCPP_BATCHALLOCATOR_H_INCLUDED
#include <stdlib.h>
#include <assert.h>
#ifndef JSONCPP_DOC_EXCLUDE_IMPLEMENTATION
namespace Json {
/* Fast memory allocator.
*
* This memory allocator allocates memory for a batch of objects (specified by
* the page size, i.e. the number of objects in each page).
*
* It does not allow the destruction of a single object. All the allocated
* objects can be destroyed at once. The memory can be either released or reused
* for future allocation.
*
* The in-place new operator must be used to construct the object using the
* pointer returned by allocate.
*/
template <typename AllocatedType, const unsigned int objectPerAllocation>
class BatchAllocator {
public:
BatchAllocator(unsigned int objectsPerPage = 255)
: freeHead_(0), objectsPerPage_(objectsPerPage) {
// printf( "Size: %d => %s\n", sizeof(AllocatedType),
// typeid(AllocatedType).name() );
assert(sizeof(AllocatedType) * objectPerAllocation >=
sizeof(AllocatedType*)); // We must be able to store a slist in the
// object free space.
assert(objectsPerPage >= 16);
batches_ = allocateBatch(0); // allocated a dummy page
currentBatch_ = batches_;
}
~BatchAllocator() {
for (BatchInfo* batch = batches_; batch;) {
BatchInfo* nextBatch = batch->next_;
free(batch);
batch = nextBatch;
}
}
/// allocate space for an array of objectPerAllocation object.
/// @warning it is the responsibility of the caller to call the objects'
/// constructors.
AllocatedType* allocate() {
if (freeHead_) // returns node from free list.
{
AllocatedType* object = freeHead_;
freeHead_ = *(AllocatedType**)object;
return object;
}
if (currentBatch_->used_ == currentBatch_->end_) {
currentBatch_ = currentBatch_->next_;
while (currentBatch_ && currentBatch_->used_ == currentBatch_->end_)
currentBatch_ = currentBatch_->next_;
if (!currentBatch_) // no free batch found, allocate a new one
{
currentBatch_ = allocateBatch(objectsPerPage_);
currentBatch_->next_ = batches_; // insert at the head of the list
batches_ = currentBatch_;
}
}
AllocatedType* allocated = currentBatch_->used_;
currentBatch_->used_ += objectPerAllocation;
return allocated;
}
/// Release the object.
/// @warning it is the responsibility of the caller to actually destruct the
/// object.
void release(AllocatedType* object) {
assert(object != 0);
*(AllocatedType**)object = freeHead_;
freeHead_ = object;
}
private:
struct BatchInfo {
BatchInfo* next_;
AllocatedType* used_;
AllocatedType* end_;
AllocatedType buffer_[objectPerAllocation];
};
// disabled copy constructor and assignment operator.
BatchAllocator(const BatchAllocator&);
void operator=(const BatchAllocator&);
static BatchInfo* allocateBatch(unsigned int objectsPerPage) {
const unsigned int mallocSize =
sizeof(BatchInfo) - sizeof(AllocatedType) * objectPerAllocation +
sizeof(AllocatedType) * objectPerAllocation * objectsPerPage;
BatchInfo* batch = static_cast<BatchInfo*>(malloc(mallocSize));
batch->next_ = 0;
batch->used_ = batch->buffer_;
batch->end_ = batch->buffer_ + objectsPerPage;
return batch;
}
BatchInfo* batches_;
BatchInfo* currentBatch_;
/// Head of a singly linked list within the allocated space of freed objects
AllocatedType* freeHead_;
unsigned int objectsPerPage_;
};
} // namespace Json
#endif // ifndef JSONCPP_DOC_INCLUDE_IMPLEMENTATION
#endif // JSONCPP_BATCHALLOCATOR_H_INCLUDED
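
The header above (removed in this range) documents that allocate() only hands back raw storage and that placement new must be used to construct objects in it. The following minimal usage sketch follows that contract; it is illustrative only, Widget is a made-up payload type, and it assumes the BatchAllocator declaration shown above is visible.

#include <new>       // placement new
#include <string>

struct Widget {
  int id;
  std::string name;
};

void batchAllocatorExample() {
  // One Widget per allocation, 256 objects per page (must be >= 16 per the assert above).
  Json::BatchAllocator<Widget, 1> allocator(256);

  // allocate() only reserves space; the caller runs the constructor via placement new.
  Widget* w = new (allocator.allocate()) Widget{42, "answer"};

  // Likewise the caller runs the destructor before handing the slot back.
  w->~Widget();
  allocator.release(w);
}                      // ~BatchAllocator frees every page at once.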

@@ -1,360 +0,0 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
// included by json_value.cpp
namespace Json {
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// class ValueInternalArray
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
ValueArrayAllocator::~ValueArrayAllocator() {}
// //////////////////////////////////////////////////////////////////
// class DefaultValueArrayAllocator
// //////////////////////////////////////////////////////////////////
#ifdef JSON_USE_SIMPLE_INTERNAL_ALLOCATOR
class DefaultValueArrayAllocator : public ValueArrayAllocator {
public: // overridden from ValueArrayAllocator
virtual ~DefaultValueArrayAllocator() {}
virtual ValueInternalArray* newArray() { return new ValueInternalArray(); }
virtual ValueInternalArray* newArrayCopy(const ValueInternalArray& other) {
return new ValueInternalArray(other);
}
virtual void destructArray(ValueInternalArray* array) { delete array; }
virtual void
reallocateArrayPageIndex(Value**& indexes,
ValueInternalArray::PageIndex& indexCount,
ValueInternalArray::PageIndex minNewIndexCount) {
ValueInternalArray::PageIndex newIndexCount = (indexCount * 3) / 2 + 1;
if (minNewIndexCount > newIndexCount)
newIndexCount = minNewIndexCount;
void* newIndexes = realloc(indexes, sizeof(Value*) * newIndexCount);
JSON_ASSERT_MESSAGE(newIndexes, "Couldn't realloc.");
indexCount = newIndexCount;
indexes = static_cast<Value**>(newIndexes);
}
virtual void releaseArrayPageIndex(Value** indexes,
ValueInternalArray::PageIndex indexCount) {
if (indexes)
free(indexes);
}
virtual Value* allocateArrayPage() {
return static_cast<Value*>(
malloc(sizeof(Value) * ValueInternalArray::itemsPerPage));
}
virtual void releaseArrayPage(Value* value) {
if (value)
free(value);
}
};
#else // #ifdef JSON_USE_SIMPLE_INTERNAL_ALLOCATOR
/// @todo make this thread-safe (lock when accessing batch allocator)
class DefaultValueArrayAllocator : public ValueArrayAllocator {
public: // overridden from ValueArrayAllocator
virtual ~DefaultValueArrayAllocator() {}
virtual ValueInternalArray* newArray() {
ValueInternalArray* array = arraysAllocator_.allocate();
new (array) ValueInternalArray(); // placement new
return array;
}
virtual ValueInternalArray* newArrayCopy(const ValueInternalArray& other) {
ValueInternalArray* array = arraysAllocator_.allocate();
new (array) ValueInternalArray(other); // placement new
return array;
}
virtual void destructArray(ValueInternalArray* array) {
if (array) {
array->~ValueInternalArray();
arraysAllocator_.release(array);
}
}
virtual void
reallocateArrayPageIndex(Value**& indexes,
ValueInternalArray::PageIndex& indexCount,
ValueInternalArray::PageIndex minNewIndexCount) {
ValueInternalArray::PageIndex newIndexCount = (indexCount * 3) / 2 + 1;
if (minNewIndexCount > newIndexCount)
newIndexCount = minNewIndexCount;
void* newIndexes = realloc(indexes, sizeof(Value*) * newIndexCount);
JSON_ASSERT_MESSAGE(newIndexes, "Couldn't realloc.");
indexCount = newIndexCount;
indexes = static_cast<Value**>(newIndexes);
}
virtual void releaseArrayPageIndex(Value** indexes,
ValueInternalArray::PageIndex indexCount) {
if (indexes)
free(indexes);
}
virtual Value* allocateArrayPage() {
return static_cast<Value*>(pagesAllocator_.allocate());
}
virtual void releaseArrayPage(Value* value) {
if (value)
pagesAllocator_.release(value);
}
private:
BatchAllocator<ValueInternalArray, 1> arraysAllocator_;
BatchAllocator<Value, ValueInternalArray::itemsPerPage> pagesAllocator_;
};
#endif // #ifdef JSON_USE_SIMPLE_INTERNAL_ALLOCATOR
static ValueArrayAllocator*& arrayAllocator() {
static DefaultValueArrayAllocator defaultAllocator;
static ValueArrayAllocator* arrayAllocator = &defaultAllocator;
return arrayAllocator;
}
static struct DummyArrayAllocatorInitializer {
DummyArrayAllocatorInitializer() {
arrayAllocator(); // ensure arrayAllocator() statics are initialized before
// main().
}
} dummyArrayAllocatorInitializer;
// //////////////////////////////////////////////////////////////////
// class ValueInternalArray
// //////////////////////////////////////////////////////////////////
bool ValueInternalArray::equals(const IteratorState& x,
const IteratorState& other) {
return x.array_ == other.array_ &&
x.currentItemIndex_ == other.currentItemIndex_ &&
x.currentPageIndex_ == other.currentPageIndex_;
}
void ValueInternalArray::increment(IteratorState& it) {
JSON_ASSERT_MESSAGE(
it.array_ && (it.currentPageIndex_ - it.array_->pages_) * itemsPerPage +
it.currentItemIndex_ !=
it.array_->size_,
"ValueInternalArray::increment(): moving iterator beyond end");
++(it.currentItemIndex_);
if (it.currentItemIndex_ == itemsPerPage) {
it.currentItemIndex_ = 0;
++(it.currentPageIndex_);
}
}
void ValueInternalArray::decrement(IteratorState& it) {
JSON_ASSERT_MESSAGE(
it.array_ && it.currentPageIndex_ == it.array_->pages_ &&
it.currentItemIndex_ == 0,
"ValueInternalArray::decrement(): moving iterator beyond end");
if (it.currentItemIndex_ == 0) {
it.currentItemIndex_ = itemsPerPage - 1;
--(it.currentPageIndex_);
} else {
--(it.currentItemIndex_);
}
}
Value& ValueInternalArray::unsafeDereference(const IteratorState& it) {
return (*(it.currentPageIndex_))[it.currentItemIndex_];
}
Value& ValueInternalArray::dereference(const IteratorState& it) {
JSON_ASSERT_MESSAGE(
it.array_ && (it.currentPageIndex_ - it.array_->pages_) * itemsPerPage +
it.currentItemIndex_ <
it.array_->size_,
"ValueInternalArray::dereference(): dereferencing invalid iterator");
return unsafeDereference(it);
}
void ValueInternalArray::makeBeginIterator(IteratorState& it) const {
it.array_ = const_cast<ValueInternalArray*>(this);
it.currentItemIndex_ = 0;
it.currentPageIndex_ = pages_;
}
void ValueInternalArray::makeIterator(IteratorState& it,
ArrayIndex index) const {
it.array_ = const_cast<ValueInternalArray*>(this);
it.currentItemIndex_ = index % itemsPerPage;
it.currentPageIndex_ = pages_ + index / itemsPerPage;
}
void ValueInternalArray::makeEndIterator(IteratorState& it) const {
makeIterator(it, size_);
}
ValueInternalArray::ValueInternalArray() : pages_(0), size_(0), pageCount_(0) {}
ValueInternalArray::ValueInternalArray(const ValueInternalArray& other)
: pages_(0), size_(other.size_), pageCount_(0) {
PageIndex minNewPages = other.size_ / itemsPerPage;
arrayAllocator()->reallocateArrayPageIndex(pages_, pageCount_, minNewPages);
JSON_ASSERT_MESSAGE(pageCount_ >= minNewPages,
"ValueInternalArray::reserve(): bad reallocation");
IteratorState itOther;
other.makeBeginIterator(itOther);
Value* value;
for (ArrayIndex index = 0; index < size_; ++index, increment(itOther)) {
if (index % itemsPerPage == 0) {
PageIndex pageIndex = index / itemsPerPage;
value = arrayAllocator()->allocateArrayPage();
pages_[pageIndex] = value;
}
new (value) Value(dereference(itOther));
}
}
ValueInternalArray& ValueInternalArray::operator=(ValueInternalArray other) {
swap(other);
return *this;
}
ValueInternalArray::~ValueInternalArray() {
// destroy all constructed items
IteratorState it;
IteratorState itEnd;
makeBeginIterator(it);
makeEndIterator(itEnd);
for (; !equals(it, itEnd); increment(it)) {
Value* value = &dereference(it);
value->~Value();
}
// release all pages
PageIndex lastPageIndex = size_ / itemsPerPage;
for (PageIndex pageIndex = 0; pageIndex < lastPageIndex; ++pageIndex)
arrayAllocator()->releaseArrayPage(pages_[pageIndex]);
// release pages index
arrayAllocator()->releaseArrayPageIndex(pages_, pageCount_);
}
void ValueInternalArray::swap(ValueInternalArray& other) {
Value** tempPages = pages_;
pages_ = other.pages_;
other.pages_ = tempPages;
ArrayIndex tempSize = size_;
size_ = other.size_;
other.size_ = tempSize;
PageIndex tempPageCount = pageCount_;
pageCount_ = other.pageCount_;
other.pageCount_ = tempPageCount;
}
void ValueInternalArray::clear() {
ValueInternalArray dummy;
swap(dummy);
}
void ValueInternalArray::resize(ArrayIndex newSize) {
if (newSize == 0)
clear();
else if (newSize < size_) {
IteratorState it;
IteratorState itEnd;
makeIterator(it, newSize);
makeIterator(itEnd, size_);
for (; !equals(it, itEnd); increment(it)) {
Value* value = &dereference(it);
value->~Value();
}
PageIndex pageIndex = (newSize + itemsPerPage - 1) / itemsPerPage;
PageIndex lastPageIndex = size_ / itemsPerPage;
for (; pageIndex < lastPageIndex; ++pageIndex)
arrayAllocator()->releaseArrayPage(pages_[pageIndex]);
size_ = newSize;
} else if (newSize > size_)
resolveReference(newSize);
}
void ValueInternalArray::makeIndexValid(ArrayIndex index) {
// Need to enlarge page index ?
if (index >= pageCount_ * itemsPerPage) {
PageIndex minNewPages = (index + 1) / itemsPerPage;
arrayAllocator()->reallocateArrayPageIndex(pages_, pageCount_, minNewPages);
JSON_ASSERT_MESSAGE(pageCount_ >= minNewPages,
"ValueInternalArray::reserve(): bad reallocation");
}
// Need to allocate new pages ?
ArrayIndex nextPageIndex = (size_ % itemsPerPage) != 0
? size_ - (size_ % itemsPerPage) + itemsPerPage
: size_;
if (nextPageIndex <= index) {
PageIndex pageIndex = nextPageIndex / itemsPerPage;
PageIndex pageToAllocate = (index - nextPageIndex) / itemsPerPage + 1;
for (; pageToAllocate-- > 0; ++pageIndex)
pages_[pageIndex] = arrayAllocator()->allocateArrayPage();
}
// Initialize all new entries
IteratorState it;
IteratorState itEnd;
makeIterator(it, size_);
size_ = index + 1;
makeIterator(itEnd, size_);
for (; !equals(it, itEnd); increment(it)) {
Value* value = &dereference(it);
new (value) Value(); // Construct a default value using placement new
}
}
Value& ValueInternalArray::resolveReference(ArrayIndex index) {
if (index >= size_)
makeIndexValid(index);
return pages_[index / itemsPerPage][index % itemsPerPage];
}
Value* ValueInternalArray::find(ArrayIndex index) const {
if (index >= size_)
return 0;
return &(pages_[index / itemsPerPage][index % itemsPerPage]);
}
ValueInternalArray::ArrayIndex ValueInternalArray::size() const {
return size_;
}
int ValueInternalArray::distance(const IteratorState& x,
const IteratorState& y) {
return indexOf(y) - indexOf(x);
}
ValueInternalArray::ArrayIndex
ValueInternalArray::indexOf(const IteratorState& iterator) {
if (!iterator.array_)
return ArrayIndex(-1);
return ArrayIndex((iterator.currentPageIndex_ - iterator.array_->pages_) *
itemsPerPage +
iterator.currentItemIndex_);
}
int ValueInternalArray::compare(const ValueInternalArray& other) const {
int sizeDiff(size_ - other.size_);
if (sizeDiff != 0)
return sizeDiff;
for (ArrayIndex index = 0; index < size_; ++index) {
int diff = pages_[index / itemsPerPage][index % itemsPerPage].compare(
other.pages_[index / itemsPerPage][index % itemsPerPage]);
if (diff != 0)
return diff;
}
return 0;
}
} // namespace Json
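
ValueInternalArray above stores Values in fixed-size pages, so an element's location is always pages_[index / itemsPerPage][index % itemsPerPage], as in resolveReference() and find(). Here is a standalone sketch of that index arithmetic; it is illustrative only and the page size is made up.

#include <cstdio>

int main() {
  const unsigned itemsPerPage = 8;  // hypothetical page size
  const unsigned index = 21;
  // Element 21 lives in page 2, slot 5 (21 = 2*8 + 5).
  std::printf("page %u, slot %u\n", index / itemsPerPage, index % itemsPerPage);
  return 0;
}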

@@ -1,473 +0,0 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
// included by json_value.cpp
namespace Json {
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// class ValueInternalMap
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
/** \internal MUST be safely initialized using memset( this, 0,
* sizeof(ValueInternalLink) );
* This optimization is used by the fast allocator.
*/
ValueInternalLink::ValueInternalLink() : previous_(0), next_(0) {}
ValueInternalLink::~ValueInternalLink() {
for (int index = 0; index < itemPerLink; ++index) {
if (!items_[index].isItemAvailable()) {
if (!items_[index].isMemberNameStatic())
free(keys_[index]);
} else
break;
}
}
ValueMapAllocator::~ValueMapAllocator() {}
#ifdef JSON_USE_SIMPLE_INTERNAL_ALLOCATOR
class DefaultValueMapAllocator : public ValueMapAllocator {
public: // overridden from ValueMapAllocator
virtual ValueInternalMap* newMap() { return new ValueInternalMap(); }
virtual ValueInternalMap* newMapCopy(const ValueInternalMap& other) {
return new ValueInternalMap(other);
}
virtual void destructMap(ValueInternalMap* map) { delete map; }
virtual ValueInternalLink* allocateMapBuckets(unsigned int size) {
return new ValueInternalLink[size];
}
virtual void releaseMapBuckets(ValueInternalLink* links) { delete[] links; }
virtual ValueInternalLink* allocateMapLink() {
return new ValueInternalLink();
}
virtual void releaseMapLink(ValueInternalLink* link) { delete link; }
};
#else
/// @todo make this thread-safe (lock when accessing batch allocator)
class DefaultValueMapAllocator : public ValueMapAllocator {
public: // overridden from ValueMapAllocator
virtual ValueInternalMap* newMap() {
ValueInternalMap* map = mapsAllocator_.allocate();
new (map) ValueInternalMap(); // placement new
return map;
}
virtual ValueInternalMap* newMapCopy(const ValueInternalMap& other) {
ValueInternalMap* map = mapsAllocator_.allocate();
new (map) ValueInternalMap(other); // placement new
return map;
}
virtual void destructMap(ValueInternalMap* map) {
if (map) {
map->~ValueInternalMap();
mapsAllocator_.release(map);
}
}
virtual ValueInternalLink* allocateMapBuckets(unsigned int size) {
return new ValueInternalLink[size];
}
virtual void releaseMapBuckets(ValueInternalLink* links) { delete[] links; }
virtual ValueInternalLink* allocateMapLink() {
ValueInternalLink* link = linksAllocator_.allocate();
memset(link, 0, sizeof(ValueInternalLink));
return link;
}
virtual void releaseMapLink(ValueInternalLink* link) {
link->~ValueInternalLink();
linksAllocator_.release(link);
}
private:
BatchAllocator<ValueInternalMap, 1> mapsAllocator_;
BatchAllocator<ValueInternalLink, 1> linksAllocator_;
};
#endif
static ValueMapAllocator*& mapAllocator() {
static DefaultValueMapAllocator defaultAllocator;
static ValueMapAllocator* mapAllocator = &defaultAllocator;
return mapAllocator;
}
static struct DummyMapAllocatorInitializer {
DummyMapAllocatorInitializer() {
mapAllocator(); // ensure mapAllocator() statics are initialized before
// main().
}
} dummyMapAllocatorInitializer;
// h(K) = value * K >> w ; with w = 32 & K prime w.r.t. 2^32.
/*
Uses a linked-list hash map.
The buckets array is a container.
Each linked-list element contains 6 key/values (memory = (16+4) * 6 + 4 = 124).
Values have extra state: valid, available, deleted.
*/
ValueInternalMap::ValueInternalMap()
: buckets_(0), tailLink_(0), bucketsSize_(0), itemCount_(0) {}
ValueInternalMap::ValueInternalMap(const ValueInternalMap& other)
: buckets_(0), tailLink_(0), bucketsSize_(0), itemCount_(0) {
reserve(other.itemCount_);
IteratorState it;
IteratorState itEnd;
other.makeBeginIterator(it);
other.makeEndIterator(itEnd);
for (; !equals(it, itEnd); increment(it)) {
bool isStatic;
const char* memberName = key(it, isStatic);
const Value& aValue = value(it);
resolveReference(memberName, isStatic) = aValue;
}
}
ValueInternalMap& ValueInternalMap::operator=(ValueInternalMap other) {
swap(other);
return *this;
}
ValueInternalMap::~ValueInternalMap() {
if (buckets_) {
for (BucketIndex bucketIndex = 0; bucketIndex < bucketsSize_;
++bucketIndex) {
ValueInternalLink* link = buckets_[bucketIndex].next_;
while (link) {
ValueInternalLink* linkToRelease = link;
link = link->next_;
mapAllocator()->releaseMapLink(linkToRelease);
}
}
mapAllocator()->releaseMapBuckets(buckets_);
}
}
void ValueInternalMap::swap(ValueInternalMap& other) {
ValueInternalLink* tempBuckets = buckets_;
buckets_ = other.buckets_;
other.buckets_ = tempBuckets;
ValueInternalLink* tempTailLink = tailLink_;
tailLink_ = other.tailLink_;
other.tailLink_ = tempTailLink;
BucketIndex tempBucketsSize = bucketsSize_;
bucketsSize_ = other.bucketsSize_;
other.bucketsSize_ = tempBucketsSize;
BucketIndex tempItemCount = itemCount_;
itemCount_ = other.itemCount_;
other.itemCount_ = tempItemCount;
}
void ValueInternalMap::clear() {
ValueInternalMap dummy;
swap(dummy);
}
ValueInternalMap::BucketIndex ValueInternalMap::size() const {
return itemCount_;
}
bool ValueInternalMap::reserveDelta(BucketIndex growth) {
return reserve(itemCount_ + growth);
}
bool ValueInternalMap::reserve(BucketIndex newItemCount) {
if (!buckets_ && newItemCount > 0) {
buckets_ = mapAllocator()->allocateMapBuckets(1);
bucketsSize_ = 1;
tailLink_ = &buckets_[0];
}
// BucketIndex idealBucketCount = (newItemCount +
// ValueInternalLink::itemPerLink) / ValueInternalLink::itemPerLink;
return true;
}
const Value* ValueInternalMap::find(const char* key) const {
if (!bucketsSize_)
return 0;
HashKey hashedKey = hash(key);
BucketIndex bucketIndex = hashedKey % bucketsSize_;
for (const ValueInternalLink* current = &buckets_[bucketIndex]; current != 0;
current = current->next_) {
for (BucketIndex index = 0; index < ValueInternalLink::itemPerLink;
++index) {
if (current->items_[index].isItemAvailable())
return 0;
if (strcmp(key, current->keys_[index]) == 0)
return &current->items_[index];
}
}
return 0;
}
Value* ValueInternalMap::find(const char* key) {
const ValueInternalMap* constThis = this;
return const_cast<Value*>(constThis->find(key));
}
Value& ValueInternalMap::resolveReference(const char* key, bool isStatic) {
HashKey hashedKey = hash(key);
if (bucketsSize_) {
BucketIndex bucketIndex = hashedKey % bucketsSize_;
ValueInternalLink** previous = 0;
BucketIndex index;
for (ValueInternalLink* current = &buckets_[bucketIndex]; current != 0;
previous = &current->next_, current = current->next_) {
for (index = 0; index < ValueInternalLink::itemPerLink; ++index) {
if (current->items_[index].isItemAvailable())
return setNewItem(key, isStatic, current, index);
if (strcmp(key, current->keys_[index]) == 0)
return current->items_[index];
}
}
}
reserveDelta(1);
return unsafeAdd(key, isStatic, hashedKey);
}
void ValueInternalMap::remove(const char* key) {
HashKey hashedKey = hash(key);
if (!bucketsSize_)
return;
BucketIndex bucketIndex = hashedKey % bucketsSize_;
for (ValueInternalLink* link = &buckets_[bucketIndex]; link != 0;
link = link->next_) {
BucketIndex index;
for (index = 0; index < ValueInternalLink::itemPerLink; ++index) {
if (link->items_[index].isItemAvailable())
return;
if (strcmp(key, link->keys_[index]) == 0) {
doActualRemove(link, index, bucketIndex);
return;
}
}
}
}
void ValueInternalMap::doActualRemove(ValueInternalLink* link,
BucketIndex index,
BucketIndex bucketIndex) {
// find last item of the bucket and swap it with the 'removed' one.
// set removed items flags to 'available'.
// if the last page only contains 'available' items, then deallocate it (it's
// empty)
ValueInternalLink*& lastLink = getLastLinkInBucket(index);
BucketIndex lastItemIndex = 1; // a link can never be empty, so start at 1
for (; lastItemIndex < ValueInternalLink::itemPerLink;
++lastItemIndex) // may be optimized with a dichotomic (binary) search
{
if (lastLink->items_[lastItemIndex].isItemAvailable())
break;
}
BucketIndex lastUsedIndex = lastItemIndex - 1;
Value* valueToDelete = &link->items_[index];
Value* valueToPreserve = &lastLink->items_[lastUsedIndex];
if (valueToDelete != valueToPreserve)
valueToDelete->swap(*valueToPreserve);
if (lastUsedIndex == 0) // page is now empty
{ // remove it from bucket linked list and delete it.
ValueInternalLink* linkPreviousToLast = lastLink->previous_;
if (linkPreviousToLast != 0) // cannot delete the bucket link.
{
mapAllocator()->releaseMapLink(lastLink);
linkPreviousToLast->next_ = 0;
lastLink = linkPreviousToLast;
}
} else {
Value dummy;
valueToPreserve->swap(dummy); // restore deleted to default Value.
valueToPreserve->setItemUsed(false);
}
--itemCount_;
}
ValueInternalLink*&
ValueInternalMap::getLastLinkInBucket(BucketIndex bucketIndex) {
if (bucketIndex == bucketsSize_ - 1)
return tailLink_;
ValueInternalLink*& previous = buckets_[bucketIndex + 1].previous_;
if (!previous)
previous = &buckets_[bucketIndex];
return previous;
}
Value& ValueInternalMap::setNewItem(const char* key,
bool isStatic,
ValueInternalLink* link,
BucketIndex index) {
char* duplicatedKey = makeMemberName(key);
++itemCount_;
link->keys_[index] = duplicatedKey;
link->items_[index].setItemUsed();
link->items_[index].setMemberNameIsStatic(isStatic);
return link->items_[index]; // items already default constructed.
}
Value&
ValueInternalMap::unsafeAdd(const char* key, bool isStatic, HashKey hashedKey) {
JSON_ASSERT_MESSAGE(bucketsSize_ > 0,
"ValueInternalMap::unsafeAdd(): internal logic error.");
BucketIndex bucketIndex = hashedKey % bucketsSize_;
ValueInternalLink*& previousLink = getLastLinkInBucket(bucketIndex);
ValueInternalLink* link = previousLink;
BucketIndex index;
for (index = 0; index < ValueInternalLink::itemPerLink; ++index) {
if (link->items_[index].isItemAvailable())
break;
}
if (index == ValueInternalLink::itemPerLink) // need to add a new page
{
ValueInternalLink* newLink = mapAllocator()->allocateMapLink();
index = 0;
link->next_ = newLink;
previousLink = newLink;
link = newLink;
}
return setNewItem(key, isStatic, link, index);
}
ValueInternalMap::HashKey ValueInternalMap::hash(const char* key) const {
HashKey hash = 0;
while (*key)
hash += *key++ * 37;
return hash;
}
int ValueInternalMap::compare(const ValueInternalMap& other) const {
int sizeDiff(itemCount_ - other.itemCount_);
if (sizeDiff != 0)
return sizeDiff;
// A strict order guarantee is required. Compare all keys FIRST, then compare
// values.
IteratorState it;
IteratorState itEnd;
makeBeginIterator(it);
makeEndIterator(itEnd);
for (; !equals(it, itEnd); increment(it)) {
if (!other.find(key(it)))
return 1;
}
// All keys are equal, let's compare values
makeBeginIterator(it);
for (; !equals(it, itEnd); increment(it)) {
const Value* otherValue = other.find(key(it));
int valueDiff = value(it).compare(*otherValue);
if (valueDiff != 0)
return valueDiff;
}
return 0;
}
void ValueInternalMap::makeBeginIterator(IteratorState& it) const {
it.map_ = const_cast<ValueInternalMap*>(this);
it.bucketIndex_ = 0;
it.itemIndex_ = 0;
it.link_ = buckets_;
}
void ValueInternalMap::makeEndIterator(IteratorState& it) const {
it.map_ = const_cast<ValueInternalMap*>(this);
it.bucketIndex_ = bucketsSize_;
it.itemIndex_ = 0;
it.link_ = 0;
}
bool ValueInternalMap::equals(const IteratorState& x,
const IteratorState& other) {
return x.map_ == other.map_ && x.bucketIndex_ == other.bucketIndex_ &&
x.link_ == other.link_ && x.itemIndex_ == other.itemIndex_;
}
void ValueInternalMap::incrementBucket(IteratorState& iterator) {
++iterator.bucketIndex_;
JSON_ASSERT_MESSAGE(
iterator.bucketIndex_ <= iterator.map_->bucketsSize_,
"ValueInternalMap::increment(): attempting to iterate beyond end.");
if (iterator.bucketIndex_ == iterator.map_->bucketsSize_)
iterator.link_ = 0;
else
iterator.link_ = &(iterator.map_->buckets_[iterator.bucketIndex_]);
iterator.itemIndex_ = 0;
}
void ValueInternalMap::increment(IteratorState& iterator) {
JSON_ASSERT_MESSAGE(iterator.map_,
"Attempting to iterator using invalid iterator.");
++iterator.itemIndex_;
if (iterator.itemIndex_ == ValueInternalLink::itemPerLink) {
JSON_ASSERT_MESSAGE(
iterator.link_ != 0,
"ValueInternalMap::increment(): attempting to iterate beyond end.");
iterator.link_ = iterator.link_->next_;
if (iterator.link_ == 0)
incrementBucket(iterator);
} else if (iterator.link_->items_[iterator.itemIndex_].isItemAvailable()) {
incrementBucket(iterator);
}
}
void ValueInternalMap::decrement(IteratorState& iterator) {
if (iterator.itemIndex_ == 0) {
JSON_ASSERT_MESSAGE(iterator.map_,
"Attempting to iterate using invalid iterator.");
if (iterator.link_ == &iterator.map_->buckets_[iterator.bucketIndex_]) {
JSON_ASSERT_MESSAGE(iterator.bucketIndex_ > 0,
"Attempting to iterate beyond beginning.");
--(iterator.bucketIndex_);
}
iterator.link_ = iterator.link_->previous_;
iterator.itemIndex_ = ValueInternalLink::itemPerLink - 1;
}
}
const char* ValueInternalMap::key(const IteratorState& iterator) {
JSON_ASSERT_MESSAGE(iterator.link_,
"Attempting to iterate using invalid iterator.");
return iterator.link_->keys_[iterator.itemIndex_];
}
const char* ValueInternalMap::key(const IteratorState& iterator,
bool& isStatic) {
JSON_ASSERT_MESSAGE(iterator.link_,
"Attempting to iterate using invalid iterator.");
isStatic = iterator.link_->items_[iterator.itemIndex_].isMemberNameStatic();
return iterator.link_->keys_[iterator.itemIndex_];
}
Value& ValueInternalMap::value(const IteratorState& iterator) {
JSON_ASSERT_MESSAGE(iterator.link_,
"Attempting to iterate using invalid iterator.");
return iterator.link_->items_[iterator.itemIndex_];
}
int ValueInternalMap::distance(const IteratorState& x, const IteratorState& y) {
int offset = 0;
IteratorState it = x;
while (!equals(it, y)) {
increment(it);
++offset; // count the number of steps between the two iterators
}
return offset;
}
} // namespace Json
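For orientation, here is a minimal stand-alone sketch of the bucket hashing used by ValueInternalMap::hash above. The helper names and the bucket count of 8 are illustrative only.

#include <cstdio>

typedef unsigned int HashKey;
typedef unsigned int BucketIndex;

// Same accumulation scheme as ValueInternalMap::hash: sum the characters,
// each weighted by 37, then reduce modulo the current number of buckets.
static HashKey hashKey(const char* key) {
  HashKey hash = 0;
  while (*key)
    hash += static_cast<unsigned char>(*key++) * 37;
  return hash;
}

int main() {
  const BucketIndex bucketsSize = 8; // illustrative bucket count
  const char* key = "ab";            // 'a' = 97, 'b' = 98
  HashKey h = hashKey(key);          // (97 + 98) * 37 = 7215
  std::printf("hash=%u bucket=%u\n", h, h % bucketsSize); // hash=7215 bucket=7
  return 0;
}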

File diff suppressed because it is too large.


@@ -1,4 +1,4 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Copyright 2007-2010 Baptiste Lepilleur and The JsonCpp Authors
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
@@ -6,6 +6,19 @@
#ifndef LIB_JSONCPP_JSON_TOOL_H_INCLUDED
#define LIB_JSONCPP_JSON_TOOL_H_INCLUDED
#if !defined(JSON_IS_AMALGAMATION)
#include <json/config.h>
#endif
// Also support old flag NO_LOCALE_SUPPORT
#ifdef NO_LOCALE_SUPPORT
#define JSONCPP_NO_LOCALE_SUPPORT
#endif
#ifndef JSONCPP_NO_LOCALE_SUPPORT
#include <clocale>
#endif
/* This header provides common string manipulation support, such as UTF-8,
* portable conversion from/to string...
*
@@ -13,10 +26,18 @@
*/
namespace Json {
static inline char getDecimalPoint() {
#ifdef JSONCPP_NO_LOCALE_SUPPORT
return '\0';
#else
struct lconv* lc = localeconv();
return lc ? *(lc->decimal_point) : '\0';
#endif
}
/// Converts a unicode code-point to UTF-8.
static inline std::string codePointToUTF8(unsigned int cp) {
std::string result;
static inline String codePointToUTF8(unsigned int cp) {
String result;
// based on description from http://en.wikipedia.org/wiki/UTF-8
@@ -30,8 +51,8 @@ static inline std::string codePointToUTF8(unsigned int cp) {
} else if (cp <= 0xFFFF) {
result.resize(3);
result[2] = static_cast<char>(0x80 | (0x3f & cp));
result[1] = 0x80 | static_cast<char>((0x3f & (cp >> 6)));
result[0] = 0xE0 | static_cast<char>((0xf & (cp >> 12)));
result[1] = static_cast<char>(0x80 | (0x3f & (cp >> 6)));
result[0] = static_cast<char>(0xE0 | (0xf & (cp >> 12)));
} else if (cp <= 0x10FFFF) {
result.resize(4);
result[3] = static_cast<char>(0x80 | (0x3f & cp));
@@ -43,9 +64,6 @@ static inline std::string codePointToUTF8(unsigned int cp) {
return result;
}
/// Returns true if ch is a control character (in range [0,32[).
static inline bool isControlCharacter(char ch) { return ch > 0 && ch <= 0x1F; }
enum {
/// Constant that specify the size of the buffer that must be passed to
/// uintToString.
@@ -56,14 +74,14 @@ enum {
typedef char UIntToStringBuffer[uintToStringBufferSize];
/** Converts an unsigned integer to string.
* @param value Unsigned interger to convert to string
* @param value Unsigned integer to convert to string
* @param current Input/Output string buffer.
* Must have at least uintToStringBufferSize chars free.
*/
static inline void uintToString(LargestUInt value, char*& current) {
*--current = 0;
do {
*--current = char(value % 10) + '0';
*--current = static_cast<char>(value % 10U + static_cast<unsigned>('0'));
value /= 10;
} while (value != 0);
}
@@ -73,15 +91,44 @@ static inline void uintToString(LargestUInt value, char*& current) {
* We had a sophisticated way, but it did not work in WinCE.
* @see https://github.com/open-source-parsers/jsoncpp/pull/9
*/
static inline void fixNumericLocale(char* begin, char* end) {
while (begin < end) {
template <typename Iter> Iter fixNumericLocale(Iter begin, Iter end) {
for (; begin != end; ++begin) {
if (*begin == ',') {
*begin = '.';
}
++begin;
}
return begin;
}
template <typename Iter> void fixNumericLocaleInput(Iter begin, Iter end) {
char decimalPoint = getDecimalPoint();
if (decimalPoint == '\0' || decimalPoint == '.') {
return;
}
for (; begin != end; ++begin) {
if (*begin == '.') {
*begin = decimalPoint;
}
}
}
} // namespace Json {
/**
* Return iterator that would be the new end of the range [begin,end), if we
* were to delete zeros in the end of string, but not the last zero before '.'.
*/
template <typename Iter> Iter fixZerosInTheEnd(Iter begin, Iter end) {
for (; begin != end; --end) {
if (*(end - 1) != '0') {
return end;
}
// Don't delete the last zero before the decimal point.
if (begin != (end - 1) && *(end - 2) == '.') {
return end;
}
}
return end;
}
} // namespace Json
#endif // LIB_JSONCPP_JSON_TOOL_H_INCLUDED
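The trimming helper added above is easiest to see with a small driver. The sketch below reproduces the fixZerosInTheEnd logic on a std::string so the expected results can be shown inline; the helper name is illustrative and the examples assume ordinary '.'-formatted numbers.

#include <iostream>
#include <string>

// Reproduces the behaviour of Json::fixZerosInTheEnd shown above: drop
// trailing zeros, but keep the first zero right after the decimal point.
template <typename Iter> Iter trimTrailingZeros(Iter begin, Iter end) {
  for (; begin != end; --end) {
    if (*(end - 1) != '0')
      return end;
    if (begin != (end - 1) && *(end - 2) == '.')
      return end;
  }
  return end;
}

static std::string trimmed(std::string s) {
  s.erase(trimTrailingZeros(s.begin(), s.end()), s.end());
  return s;
}

int main() {
  std::cout << trimmed("1.2300") << "\n"; // prints 1.23
  std::cout << trimmed("10.000") << "\n"; // prints 10.0
  return 0;
}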

File diff suppressed because it is too large.


@@ -1,4 +1,4 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Copyright 2007-2010 Baptiste Lepilleur and The JsonCpp Authors
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
@@ -15,69 +15,22 @@ namespace Json {
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
ValueIteratorBase::ValueIteratorBase()
#ifndef JSON_VALUE_USE_INTERNAL_MAP
: current_(), isNull_(true) {
}
#else
: isArray_(true), isNull_(true) {
iterator_.array_ = ValueInternalArray::IteratorState();
}
#endif
ValueIteratorBase::ValueIteratorBase() : current_() {}
#ifndef JSON_VALUE_USE_INTERNAL_MAP
ValueIteratorBase::ValueIteratorBase(
const Value::ObjectValues::iterator& current)
: current_(current), isNull_(false) {}
#else
ValueIteratorBase::ValueIteratorBase(
const ValueInternalArray::IteratorState& state)
: isArray_(true) {
iterator_.array_ = state;
}
ValueIteratorBase::ValueIteratorBase(
const ValueInternalMap::IteratorState& state)
: isArray_(false) {
iterator_.map_ = state;
}
#endif
Value& ValueIteratorBase::deref() const { return current_->second; }
Value& ValueIteratorBase::deref() const {
#ifndef JSON_VALUE_USE_INTERNAL_MAP
return current_->second;
#else
if (isArray_)
return ValueInternalArray::dereference(iterator_.array_);
return ValueInternalMap::value(iterator_.map_);
#endif
}
void ValueIteratorBase::increment() { ++current_; }
void ValueIteratorBase::increment() {
#ifndef JSON_VALUE_USE_INTERNAL_MAP
++current_;
#else
if (isArray_)
ValueInternalArray::increment(iterator_.array_);
ValueInternalMap::increment(iterator_.map_);
#endif
}
void ValueIteratorBase::decrement() {
#ifndef JSON_VALUE_USE_INTERNAL_MAP
--current_;
#else
if (isArray_)
ValueInternalArray::decrement(iterator_.array_);
ValueInternalMap::decrement(iterator_.map_);
#endif
}
void ValueIteratorBase::decrement() { --current_; }
ValueIteratorBase::difference_type
ValueIteratorBase::computeDistance(const SelfType& other) const {
#ifndef JSON_VALUE_USE_INTERNAL_MAP
#ifdef JSON_USE_CPPTL_SMALLMAP
return current_ - other.current_;
return other.current_ - current_;
#else
// Iterator for null value are initialized using the default
// constructor, which initialize current_ to the default
@@ -100,80 +53,59 @@ ValueIteratorBase::computeDistance(const SelfType& other) const {
}
return myDistance;
#endif
#else
if (isArray_)
return ValueInternalArray::distance(iterator_.array_,
other.iterator_.array_);
return ValueInternalMap::distance(iterator_.map_, other.iterator_.map_);
#endif
}
bool ValueIteratorBase::isEqual(const SelfType& other) const {
#ifndef JSON_VALUE_USE_INTERNAL_MAP
if (isNull_) {
return other.isNull_;
}
return current_ == other.current_;
#else
if (isArray_)
return ValueInternalArray::equals(iterator_.array_, other.iterator_.array_);
return ValueInternalMap::equals(iterator_.map_, other.iterator_.map_);
#endif
}
void ValueIteratorBase::copy(const SelfType& other) {
#ifndef JSON_VALUE_USE_INTERNAL_MAP
current_ = other.current_;
isNull_ = other.isNull_;
#else
if (isArray_)
iterator_.array_ = other.iterator_.array_;
iterator_.map_ = other.iterator_.map_;
#endif
}
Value ValueIteratorBase::key() const {
#ifndef JSON_VALUE_USE_INTERNAL_MAP
const Value::CZString czstring = (*current_).first;
if (czstring.c_str()) {
if (czstring.data()) {
if (czstring.isStaticString())
return Value(StaticString(czstring.c_str()));
return Value(czstring.c_str());
return Value(StaticString(czstring.data()));
return Value(czstring.data(), czstring.data() + czstring.length());
}
return Value(czstring.index());
#else
if (isArray_)
return Value(ValueInternalArray::indexOf(iterator_.array_));
bool isStatic;
const char* memberName = ValueInternalMap::key(iterator_.map_, isStatic);
if (isStatic)
return Value(StaticString(memberName));
return Value(memberName);
#endif
}
UInt ValueIteratorBase::index() const {
#ifndef JSON_VALUE_USE_INTERNAL_MAP
const Value::CZString czstring = (*current_).first;
if (!czstring.c_str())
if (!czstring.data())
return czstring.index();
return Value::UInt(-1);
#else
if (isArray_)
return Value::UInt(ValueInternalArray::indexOf(iterator_.array_));
return Value::UInt(-1);
#endif
}
const char* ValueIteratorBase::memberName() const {
#ifndef JSON_VALUE_USE_INTERNAL_MAP
const char* name = (*current_).first.c_str();
return name ? name : "";
#else
if (!isArray_)
return ValueInternalMap::key(iterator_.map_);
return "";
#endif
String ValueIteratorBase::name() const {
char const* keey;
char const* end;
keey = memberName(&end);
if (!keey)
return String();
return String(keey, end);
}
char const* ValueIteratorBase::memberName() const {
const char* cname = (*current_).first.data();
return cname ? cname : "";
}
char const* ValueIteratorBase::memberName(char const** end) const {
const char* cname = (*current_).first.data();
if (!cname) {
*end = nullptr;
return nullptr;
}
*end = cname + (*current_).first.length();
return cname;
}
// //////////////////////////////////////////////////////////////////
@@ -184,21 +116,14 @@ const char* ValueIteratorBase::memberName() const {
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
ValueConstIterator::ValueConstIterator() {}
ValueConstIterator::ValueConstIterator() = default;
#ifndef JSON_VALUE_USE_INTERNAL_MAP
ValueConstIterator::ValueConstIterator(
const Value::ObjectValues::iterator& current)
: ValueIteratorBase(current) {}
#else
ValueConstIterator::ValueConstIterator(
const ValueInternalArray::IteratorState& state)
: ValueIteratorBase(state) {}
ValueConstIterator::ValueConstIterator(
const ValueInternalMap::IteratorState& state)
: ValueIteratorBase(state) {}
#endif
ValueConstIterator::ValueConstIterator(ValueIterator const& other)
: ValueIteratorBase(other) {}
ValueConstIterator& ValueConstIterator::
operator=(const ValueIteratorBase& other) {
@@ -214,24 +139,17 @@ operator=(const ValueIteratorBase& other) {
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
ValueIterator::ValueIterator() {}
ValueIterator::ValueIterator() = default;
#ifndef JSON_VALUE_USE_INTERNAL_MAP
ValueIterator::ValueIterator(const Value::ObjectValues::iterator& current)
: ValueIteratorBase(current) {}
#else
ValueIterator::ValueIterator(const ValueInternalArray::IteratorState& state)
: ValueIteratorBase(state) {}
ValueIterator::ValueIterator(const ValueInternalMap::IteratorState& state)
: ValueIteratorBase(state) {}
#endif
ValueIterator::ValueIterator(const ValueConstIterator& other)
: ValueIteratorBase(other) {}
: ValueIteratorBase(other) {
throwRuntimeError("ConstIterator to Iterator should never be allowed.");
}
ValueIterator::ValueIterator(const ValueIterator& other)
: ValueIteratorBase(other) {}
ValueIterator::ValueIterator(const ValueIterator& other) = default;
ValueIterator& ValueIterator::operator=(const SelfType& other) {
copy(other);
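As a quick orientation for the iterator API reworked above, here is a hedged usage sketch; it assumes the usual Json::Value object-iteration interface (begin()/end(), name(), operator*) and simply prints each member.

#include <iostream>
#include <json/json.h>

// Iterates an object's members: name() returns the member name as a
// Json::String, and dereferencing the iterator yields the member's Value.
int main() {
  Json::Value obj;
  obj["alpha"] = 1;
  obj["beta"] = 2;
  for (Json::Value::const_iterator it = obj.begin(); it != obj.end(); ++it)
    std::cout << it.name() << " = " << (*it).asInt() << "\n";
  return 0;
}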

File diff suppressed because it is too large.


@@ -1,8 +0,0 @@
Import( 'env buildLibrary' )
buildLibrary( env, Split( """
json_reader.cpp
json_value.cpp
json_writer.cpp
""" ),
'json' )


@@ -1,14 +1,22 @@
// DO NOT EDIT. This file is generated by CMake from "version"
// and "version.h.in" files.
// Run CMake configure step to update it.
// DO NOT EDIT. This file (and "version") is a template used by the build system
// (either CMake or Meson) to generate a "version.h" header file.
#ifndef JSON_VERSION_H_INCLUDED
# define JSON_VERSION_H_INCLUDED
#define JSON_VERSION_H_INCLUDED
# define JSONCPP_VERSION_STRING "@JSONCPP_VERSION@"
# define JSONCPP_VERSION_MAJOR @JSONCPP_VERSION_MAJOR@
# define JSONCPP_VERSION_MINOR @JSONCPP_VERSION_MINOR@
# define JSONCPP_VERSION_PATCH @JSONCPP_VERSION_PATCH@
# define JSONCPP_VERSION_QUALIFIER
# define JSONCPP_VERSION_HEXA ((JSONCPP_VERSION_MAJOR << 24) | (JSONCPP_VERSION_MINOR << 16) | (JSONCPP_VERSION_PATCH << 8))
#define JSONCPP_VERSION_STRING "@JSONCPP_VERSION@"
#define JSONCPP_VERSION_MAJOR @JSONCPP_VERSION_MAJOR@
#define JSONCPP_VERSION_MINOR @JSONCPP_VERSION_MINOR@
#define JSONCPP_VERSION_PATCH @JSONCPP_VERSION_PATCH@
#define JSONCPP_VERSION_QUALIFIER
#define JSONCPP_VERSION_HEXA ((JSONCPP_VERSION_MAJOR << 24) \
| (JSONCPP_VERSION_MINOR << 16) \
| (JSONCPP_VERSION_PATCH << 8))
#ifdef JSONCPP_USING_SECURE_MEMORY
#undef JSONCPP_USING_SECURE_MEMORY
#endif
#define JSONCPP_USING_SECURE_MEMORY @JSONCPP_USE_SECURE_MEMORY@
// If non-zero, the library zeroes any memory that it has allocated before
// it frees its memory.
#endif // JSON_VERSION_H_INCLUDED
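The packed JSONCPP_VERSION_HEXA value is easiest to read with concrete numbers. The sketch below assumes the 1.9.0 release values in place of the @...@ placeholders that CMake or Meson would normally substitute.

#include <cstdio>

// Illustrative expansion of the version macros for release 1.9.0.
#define JSONCPP_VERSION_MAJOR 1
#define JSONCPP_VERSION_MINOR 9
#define JSONCPP_VERSION_PATCH 0
#define JSONCPP_VERSION_HEXA ((JSONCPP_VERSION_MAJOR << 24) \
                              | (JSONCPP_VERSION_MINOR << 16) \
                              | (JSONCPP_VERSION_PATCH << 8))

int main() {
  // Major in the top byte, then minor, then patch: 0x01090000 for 1.9.0.
  std::printf("0x%08X\n", static_cast<unsigned>(JSONCPP_VERSION_HEXA));
  return 0;
}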


@@ -1,22 +1,42 @@
# vim: et ts=4 sts=4 sw=4 tw=0
IF(JSONCPP_LIB_BUILD_SHARED)
ADD_DEFINITIONS( -DJSON_DLL )
ENDIF(JSONCPP_LIB_BUILD_SHARED)
ADD_EXECUTABLE( jsoncpp_test
add_executable( jsoncpp_test
jsontest.cpp
jsontest.h
fuzz.cpp
fuzz.h
main.cpp
)
TARGET_LINK_LIBRARIES(jsoncpp_test jsoncpp_lib)
if(BUILD_SHARED_LIBS)
add_compile_definitions( JSON_DLL )
endif()
target_link_libraries(jsoncpp_test jsoncpp_lib)
# another way to solve issue #90
#set_target_properties(jsoncpp_test PROPERTIES COMPILE_FLAGS -ffloat-store)
# Run unit tests in post-build
# (default cmake workflow hides away the test result into a file, resulting in poor dev workflow?!?)
IF(JSONCPP_WITH_POST_BUILD_UNITTEST)
ADD_CUSTOM_COMMAND( TARGET jsoncpp_test
POST_BUILD
COMMAND $<TARGET_FILE:jsoncpp_test>)
ENDIF(JSONCPP_WITH_POST_BUILD_UNITTEST)
if(JSONCPP_WITH_POST_BUILD_UNITTEST)
if(BUILD_SHARED_LIBS)
# First, copy the shared lib, for Microsoft.
# Then, run the test executable.
add_custom_command( TARGET jsoncpp_test
POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy_if_different $<TARGET_FILE:jsoncpp_lib> $<TARGET_FILE_DIR:jsoncpp_test>
COMMAND ${CMAKE_CROSSCOMPILING_EMULATOR} $<TARGET_FILE:jsoncpp_test>)
else(BUILD_SHARED_LIBS)
# Just run the test executable.
add_custom_command( TARGET jsoncpp_test
POST_BUILD
COMMAND ${CMAKE_CROSSCOMPILING_EMULATOR} $<TARGET_FILE:jsoncpp_test>)
endif()
## Create tests for dashboard submission, allows easy review of CI results https://my.cdash.org/index.php?project=jsoncpp
add_test(NAME jsoncpp_test
COMMAND ${CMAKE_CROSSCOMPILING_EMULATOR} $<TARGET_FILE:jsoncpp_test>
)
endif()
SET_TARGET_PROPERTIES(jsoncpp_test PROPERTIES OUTPUT_NAME jsoncpp_test)
set_target_properties(jsoncpp_test PROPERTIES OUTPUT_NAME jsoncpp_test)


@@ -0,0 +1,49 @@
// Copyright 2007-2019 The JsonCpp Authors
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
#include "fuzz.h"
#include <cstdint>
#include <json/config.h>
#include <json/json.h>
#include <memory>
#include <stdint.h>
#include <string>
namespace Json {
class Exception;
}
extern "C" int LLVMFuzzerTestOneInput(const uint8_t* data, size_t size) {
Json::CharReaderBuilder builder;
if (size < sizeof(uint32_t)) {
return 0;
}
uint32_t hash_settings = *(const uint32_t*)data;
data += sizeof(uint32_t);
builder.settings_["failIfExtra"] = hash_settings & (1 << 0);
builder.settings_["allowComments_"] = hash_settings & (1 << 1);
builder.settings_["strictRoot_"] = hash_settings & (1 << 2);
builder.settings_["allowDroppedNullPlaceholders_"] = hash_settings & (1 << 3);
builder.settings_["allowNumericKeys_"] = hash_settings & (1 << 4);
builder.settings_["allowSingleQuotes_"] = hash_settings & (1 << 5);
builder.settings_["failIfExtra_"] = hash_settings & (1 << 6);
builder.settings_["rejectDupKeys_"] = hash_settings & (1 << 7);
builder.settings_["allowSpecialFloats_"] = hash_settings & (1 << 8);
std::unique_ptr<Json::CharReader> reader(builder.newCharReader());
Json::Value root;
const char* data_str = reinterpret_cast<const char*>(data);
try {
reader->parse(data_str, data_str + size, &root, nullptr);
} catch (Json::Exception const&) {
}
// Whether it succeeded or not doesn't matter.
return 0;
}
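The entry point above follows the standard libFuzzer convention, so it is normally linked against a fuzzing runtime (for example clang's -fsanitize=fuzzer). When that is not available, a plain driver can replay saved inputs through the same function; the sketch below is such a stand-in, with illustrative file handling.

#include "fuzz.h"
#include <cstdint>
#include <fstream>
#include <iterator>
#include <vector>

// Reads each file named on the command line and feeds its bytes to the
// fuzz target once, mimicking what a fuzzing engine would do repeatedly.
int main(int argc, char* argv[]) {
  for (int i = 1; i < argc; ++i) {
    std::ifstream in(argv[i], std::ios::binary);
    std::vector<std::uint8_t> data((std::istreambuf_iterator<char>(in)),
                                   std::istreambuf_iterator<char>());
    LLVMFuzzerTestOneInput(data.data(), data.size());
  }
  return 0;
}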

src/test_lib_json/fuzz.h (new file)

@@ -0,0 +1,14 @@
// Copyright 2007-2010 The JsonCpp Authors
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
#ifndef FUZZ_H_INCLUDED
#define FUZZ_H_INCLUDED
#include <cstddef>
#include <stdint.h>
extern "C" int LLVMFuzzerTestOneInput(const uint8_t* data, size_t size);
#endif // ifndef FUZZ_H_INCLUDED


@@ -1,11 +1,11 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Copyright 2007-2010 Baptiste Lepilleur and The JsonCpp Authors
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
#define _CRT_SECURE_NO_WARNINGS 1 // Prevents deprecation warning with MSVC
#include "jsontest.h"
#include <stdio.h>
#include <cstdio>
#include <string>
#if defined(_MSC_VER)
@@ -73,15 +73,14 @@ namespace JsonTest {
// class TestResult
// //////////////////////////////////////////////////////////////////
TestResult::TestResult()
: predicateId_(1), lastUsedPredicateId_(0), messageTarget_(0) {
TestResult::TestResult() {
// The root predicate has id 0
rootPredicateNode_.id_ = 0;
rootPredicateNode_.next_ = 0;
rootPredicateNode_.next_ = nullptr;
predicateStackTail_ = &rootPredicateNode_;
}
void TestResult::setTestName(const std::string& name) { name_ = name; }
void TestResult::setTestName(const Json::String& name) { name_ = name; }
TestResult&
TestResult::addFailure(const char* file, unsigned int line, const char* expr) {
@@ -89,12 +88,12 @@ TestResult::addFailure(const char* file, unsigned int line, const char* expr) {
/// added.
unsigned int nestingLevel = 0;
PredicateContext* lastNode = rootPredicateNode_.next_;
for (; lastNode != 0; lastNode = lastNode->next_) {
for (; lastNode != nullptr; lastNode = lastNode->next_) {
if (lastNode->id_ > lastUsedPredicateId_) // new PredicateContext
{
lastUsedPredicateId_ = lastNode->id_;
addFailureInfo(
lastNode->file_, lastNode->line_, lastNode->expr_, nestingLevel);
addFailureInfo(lastNode->file_, lastNode->line_, lastNode->expr_,
nestingLevel);
// Link the PredicateContext to the failure for message target when
// popping the PredicateContext.
lastNode->failure_ = &(failures_.back());
@@ -124,32 +123,22 @@ void TestResult::addFailureInfo(const char* file,
TestResult& TestResult::popPredicateContext() {
PredicateContext* lastNode = &rootPredicateNode_;
while (lastNode->next_ != 0 && lastNode->next_->next_ != 0) {
while (lastNode->next_ != nullptr && lastNode->next_->next_ != nullptr) {
lastNode = lastNode->next_;
}
// Set message target to popped failure
PredicateContext* tail = lastNode->next_;
if (tail != 0 && tail->failure_ != 0) {
if (tail != nullptr && tail->failure_ != nullptr) {
messageTarget_ = tail->failure_;
}
// Remove tail from list
predicateStackTail_ = lastNode;
lastNode->next_ = 0;
lastNode->next_ = nullptr;
return *this;
}
bool TestResult::failed() const { return !failures_.empty(); }
unsigned int TestResult::getAssertionNestingLevel() const {
unsigned int level = 0;
const PredicateContext* lastNode = &rootPredicateNode_;
while (lastNode->next_ != 0) {
lastNode = lastNode->next_;
++level;
}
return level;
}
void TestResult::printFailure(bool printTestName) const {
if (failures_.empty()) {
return;
@@ -160,12 +149,10 @@ void TestResult::printFailure(bool printTestName) const {
}
// Print in reverse to display the callstack in the right order
Failures::const_iterator itEnd = failures_.end();
for (Failures::const_iterator it = failures_.begin(); it != itEnd; ++it) {
const Failure& failure = *it;
std::string indent(failure.nestingLevel_ * 2, ' ');
for (const auto& failure : failures_) {
Json::String indent(failure.nestingLevel_ * 2, ' ');
if (failure.file_) {
printf("%s%s(%d): ", indent.c_str(), failure.file_, failure.line_);
printf("%s%s(%u): ", indent.c_str(), failure.file_, failure.line_);
}
if (!failure.expr_.empty()) {
printf("%s\n", failure.expr_.c_str());
@@ -173,19 +160,19 @@ void TestResult::printFailure(bool printTestName) const {
printf("\n");
}
if (!failure.message_.empty()) {
std::string reindented = indentText(failure.message_, indent + " ");
Json::String reindented = indentText(failure.message_, indent + " ");
printf("%s\n", reindented.c_str());
}
}
}
std::string TestResult::indentText(const std::string& text,
const std::string& indent) {
std::string reindented;
std::string::size_type lastIndex = 0;
Json::String TestResult::indentText(const Json::String& text,
const Json::String& indent) {
Json::String reindented;
Json::String::size_type lastIndex = 0;
while (lastIndex < text.size()) {
std::string::size_type nextIndex = text.find('\n', lastIndex);
if (nextIndex == std::string::npos) {
Json::String::size_type nextIndex = text.find('\n', lastIndex);
if (nextIndex == Json::String::npos) {
nextIndex = text.size() - 1;
}
reindented += indent;
@@ -195,8 +182,8 @@ std::string TestResult::indentText(const std::string& text,
return reindented;
}
TestResult& TestResult::addToLastFailure(const std::string& message) {
if (messageTarget_ != 0) {
TestResult& TestResult::addToLastFailure(const Json::String& message) {
if (messageTarget_ != nullptr) {
messageTarget_->message_ += message;
}
return *this;
@@ -217,9 +204,9 @@ TestResult& TestResult::operator<<(bool value) {
// class TestCase
// //////////////////////////////////////////////////////////////////
TestCase::TestCase() : result_(0) {}
TestCase::TestCase() = default;
TestCase::~TestCase() {}
TestCase::~TestCase() = default;
void TestCase::run(TestResult& result) {
result_ = &result;
@@ -229,25 +216,23 @@ void TestCase::run(TestResult& result) {
// class Runner
// //////////////////////////////////////////////////////////////////
Runner::Runner() {}
Runner::Runner() = default;
Runner& Runner::add(TestCaseFactory factory) {
tests_.push_back(factory);
return *this;
}
unsigned int Runner::testCount() const {
return static_cast<unsigned int>(tests_.size());
}
size_t Runner::testCount() const { return tests_.size(); }
std::string Runner::testNameAt(unsigned int index) const {
Json::String Runner::testNameAt(size_t index) const {
TestCase* test = tests_[index]();
std::string name = test->testName();
Json::String name = test->testName();
delete test;
return name;
}
void Runner::runTestAt(unsigned int index, TestResult& result) const {
void Runner::runTestAt(size_t index, TestResult& result) const {
TestCase* test = tests_[index]();
result.setTestName(test->testName());
printf("Testing %s: ", test->testName());
@@ -257,8 +242,7 @@ void Runner::runTestAt(unsigned int index, TestResult& result) const {
#endif // if JSON_USE_EXCEPTION
test->run(result);
#if JSON_USE_EXCEPTION
}
catch (const std::exception& e) {
} catch (const std::exception& e) {
result.addFailure(__FILE__, __LINE__, "Unexpected exception caught:")
<< e.what();
}
@@ -270,9 +254,9 @@ void Runner::runTestAt(unsigned int index, TestResult& result) const {
}
bool Runner::runAllTest(bool printSummary) const {
unsigned int count = testCount();
size_t const count = testCount();
std::deque<TestResult> failures;
for (unsigned int index = 0; index < count; ++index) {
for (size_t index = 0; index < count; ++index) {
TestResult result;
runTestAt(index, result);
if (result.failed()) {
@@ -282,31 +266,27 @@ bool Runner::runAllTest(bool printSummary) const {
if (failures.empty()) {
if (printSummary) {
printf("All %d tests passed\n", count);
printf("All %zu tests passed\n", count);
}
return true;
} else {
for (unsigned int index = 0; index < failures.size(); ++index) {
TestResult& result = failures[index];
for (auto& result : failures) {
result.printFailure(count > 1);
}
if (printSummary) {
unsigned int failedCount = static_cast<unsigned int>(failures.size());
unsigned int passedCount = count - failedCount;
printf("%d/%d tests passed (%d failure(s))\n",
passedCount,
count,
size_t const failedCount = failures.size();
size_t const passedCount = count - failedCount;
printf("%zu/%zu tests passed (%zu failure(s))\n", passedCount, count,
failedCount);
}
return false;
}
}
bool Runner::testIndex(const std::string& testName,
unsigned int& indexOut) const {
unsigned int count = testCount();
for (unsigned int index = 0; index < count; ++index) {
bool Runner::testIndex(const Json::String& testName, size_t& indexOut) const {
const size_t count = testCount();
for (size_t index = 0; index < count; ++index) {
if (testNameAt(index) == testName) {
indexOut = index;
return true;
@@ -316,17 +296,17 @@ bool Runner::testIndex(const std::string& testName,
}
void Runner::listTests() const {
unsigned int count = testCount();
for (unsigned int index = 0; index < count; ++index) {
const size_t count = testCount();
for (size_t index = 0; index < count; ++index) {
printf("%s\n", testNameAt(index).c_str());
}
}
int Runner::runCommandLine(int argc, const char* argv[]) const {
typedef std::deque<std::string> TestNames;
// typedef std::deque<String> TestNames;
Runner subrunner;
for (int index = 1; index < argc; ++index) {
std::string opt = argv[index];
Json::String opt = argv[index];
if (opt == "--list-tests") {
listTests();
return 0;
@@ -335,7 +315,7 @@ int Runner::runCommandLine(int argc, const char* argv[]) const {
} else if (opt == "--test") {
++index;
if (index < argc) {
unsigned int testNameIndex;
size_t testNameIndex;
if (testIndex(argv[index], testNameIndex)) {
subrunner.add(tests_[testNameIndex]);
} else {
@@ -398,8 +378,8 @@ void Runner::preventDialogOnCrash() {
_CrtSetReportHook(&msvcrtSilentReportHook);
#endif // if defined(_MSC_VER)
// @todo investiguate this handler (for buffer overflow)
// _set_security_error_handler
// @todo investigate this handler (for buffer overflow)
// _set_security_error_handler
#if defined(_WIN32)
// Prevents the system from popping a dialog for debugging if the
@@ -426,9 +406,21 @@ void Runner::printUsage(const char* appName) {
// Assertion functions
// //////////////////////////////////////////////////////////////////
Json::String ToJsonString(const char* toConvert) {
return Json::String(toConvert);
}
Json::String ToJsonString(Json::String in) { return in; }
#if JSONCPP_USING_SECURE_MEMORY
Json::String ToJsonString(std::string in) {
return Json::String(in.data(), in.data() + in.length());
}
#endif
TestResult& checkStringEqual(TestResult& result,
const std::string& expected,
const std::string& actual,
const Json::String& expected,
const Json::String& actual,
const char* file,
unsigned int line,
const char* expr) {


@@ -1,4 +1,4 @@
// Copyright 2007-2010 Baptiste Lepilleur
// Copyright 2007-2010 Baptiste Lepilleur and The JsonCpp Authors
// Distributed under MIT license, or public domain if desired and
// recognized in your jurisdiction.
// See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
@@ -6,11 +6,11 @@
#ifndef JSONTEST_H_INCLUDED
#define JSONTEST_H_INCLUDED
#include <cstdio>
#include <deque>
#include <json/config.h>
#include <json/value.h>
#include <json/writer.h>
#include <stdio.h>
#include <deque>
#include <sstream>
#include <string>
@@ -32,8 +32,8 @@ class Failure {
public:
const char* file_;
unsigned int line_;
std::string expr_;
std::string message_;
Json::String expr_;
Json::String message_;
unsigned int nestingLevel_;
};
@@ -60,16 +60,16 @@ public:
/// Not encapsulated to prevent step into when debugging failed assertions
/// Incremented by one on assertion predicate entry, decreased by one
/// by addPredicateContext().
PredicateContext::Id predicateId_;
PredicateContext::Id predicateId_{1};
/// \internal Implementation detail for predicate macros
PredicateContext* predicateStackTail_;
void setTestName(const std::string& name);
void setTestName(const Json::String& name);
/// Adds an assertion failure.
TestResult&
addFailure(const char* file, unsigned int line, const char* expr = 0);
addFailure(const char* file, unsigned int line, const char* expr = nullptr);
/// Removes the last PredicateContext added to the predicate stack
/// chained list.
@@ -82,7 +82,7 @@ public:
// Generic operator that will work with anything ostream can deal with.
template <typename T> TestResult& operator<<(const T& value) {
std::ostringstream oss;
Json::OStringStream oss;
oss.precision(16);
oss.setf(std::ios_base::floatfield);
oss << value;
@@ -96,23 +96,22 @@ public:
TestResult& operator<<(Json::UInt64 value);
private:
TestResult& addToLastFailure(const std::string& message);
unsigned int getAssertionNestingLevel() const;
TestResult& addToLastFailure(const Json::String& message);
/// Adds a failure or a predicate context
void addFailureInfo(const char* file,
unsigned int line,
const char* expr,
unsigned int nestingLevel);
static std::string indentText(const std::string& text,
const std::string& indent);
static Json::String indentText(const Json::String& text,
const Json::String& indent);
typedef std::deque<Failure> Failures;
Failures failures_;
std::string name_;
Json::String name_;
PredicateContext rootPredicateNode_;
PredicateContext::Id lastUsedPredicateId_;
PredicateContext::Id lastUsedPredicateId_{0};
/// Failure which is the target of the messages added using operator <<
Failure* messageTarget_;
Failure* messageTarget_{nullptr};
};
class TestCase {
@@ -126,7 +125,7 @@ public:
virtual const char* testName() const = 0;
protected:
TestResult* result_;
TestResult* result_{nullptr};
private:
virtual void runTestCase() = 0;
@@ -152,23 +151,23 @@ public:
bool runAllTest(bool printSummary) const;
/// Returns the number of test case in the suite
unsigned int testCount() const;
size_t testCount() const;
/// Returns the name of the test case at the specified index
std::string testNameAt(unsigned int index) const;
Json::String testNameAt(size_t index) const;
/// Runs the test case at the specified index using the specified TestResult
void runTestAt(unsigned int index, TestResult& result) const;
void runTestAt(size_t index, TestResult& result) const;
static void printUsage(const char* appName);
private: // prevents copy construction and assignment
Runner(const Runner& other);
Runner& operator=(const Runner& other);
Runner(const Runner& other) = delete;
Runner& operator=(const Runner& other) = delete;
private:
void listTests() const;
bool testIndex(const std::string& testName, unsigned int& index) const;
bool testIndex(const Json::String& testName, size_t& indexOut) const;
static void preventDialogOnCrash();
private:
@@ -178,8 +177,8 @@ private:
template <typename T, typename U>
TestResult& checkEqual(TestResult& result,
const T& expected,
const U& actual,
T expected,
U actual,
const char* file,
unsigned int line,
const char* expr) {
@@ -191,9 +190,15 @@ TestResult& checkEqual(TestResult& result,
return result;
}
Json::String ToJsonString(const char* toConvert);
Json::String ToJsonString(Json::String in);
#if JSONCPP_USING_SECURE_MEMORY
Json::String ToJsonString(std::string in);
#endif
TestResult& checkStringEqual(TestResult& result,
const std::string& expected,
const std::string& actual,
const Json::String& expected,
const Json::String& actual,
const char* file,
unsigned int line,
const char* expr);
@@ -206,7 +211,7 @@ TestResult& checkStringEqual(TestResult& result,
#define JSONTEST_ASSERT(expr) \
if (expr) { \
} else \
result_->addFailure(__FILE__, __LINE__, #expr)
result_->addFailure(__FILE__, __LINE__, #expr)
/// \brief Asserts that the given predicate is true.
/// The predicate may do other assertions and be a member function of the
@@ -214,8 +219,7 @@ TestResult& checkStringEqual(TestResult& result,
#define JSONTEST_ASSERT_PRED(expr) \
{ \
JsonTest::PredicateContext _minitest_Context = { \
result_->predicateId_, __FILE__, __LINE__, #expr \
}; \
result_->predicateId_, __FILE__, __LINE__, #expr, NULL, NULL}; \
result_->predicateStackTail_->next_ = &_minitest_Context; \
result_->predicateId_ += 1; \
result_->predicateStackTail_ = &_minitest_Context; \
@@ -225,21 +229,14 @@ TestResult& checkStringEqual(TestResult& result,
/// \brief Asserts that two values are equals.
#define JSONTEST_ASSERT_EQUAL(expected, actual) \
JsonTest::checkEqual(*result_, \
expected, \
actual, \
__FILE__, \
__LINE__, \
JsonTest::checkEqual(*result_, expected, actual, __FILE__, __LINE__, \
#expected " == " #actual)
/// \brief Asserts that two values are equals.
#define JSONTEST_ASSERT_STRING_EQUAL(expected, actual) \
JsonTest::checkStringEqual(*result_, \
std::string(expected), \
std::string(actual), \
__FILE__, \
__LINE__, \
#expected " == " #actual)
JsonTest::checkStringEqual(*result_, JsonTest::ToJsonString(expected), \
JsonTest::ToJsonString(actual), __FILE__, \
__LINE__, #expected " == " #actual)
/// \brief Asserts that a given expression throws an exception
#define JSONTEST_ASSERT_THROWS(expr) \
@@ -247,13 +244,12 @@ TestResult& checkStringEqual(TestResult& result,
bool _threw = false; \
try { \
expr; \
} \
catch (...) { \
} catch (...) { \
_threw = true; \
} \
if (!_threw) \
result_->addFailure( \
__FILE__, __LINE__, "expected exception thrown: " #expr); \
result_->addFailure(__FILE__, __LINE__, \
"expected exception thrown: " #expr); \
}
/// \brief Begin a fixture test case.
@@ -264,9 +260,9 @@ TestResult& checkStringEqual(TestResult& result,
return new Test##FixtureType##name(); \
} \
\
public: /* overidden from TestCase */ \
virtual const char* testName() const { return #FixtureType "/" #name; } \
virtual void runTestCase(); \
public: /* overridden from TestCase */ \
const char* testName() const override { return #FixtureType "/" #name; } \
void runTestCase() override; \
}; \
\
void Test##FixtureType##name::runTestCase()
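For context, the fixture macro above is used by declaring a fixture type and writing the test body directly after the macro invocation. A hedged sketch follows: the fixture and test names are illustrative, and registration with JsonTest::Runner is assumed to go through the factory the macro generates, which is not shown in this excerpt.

#include "jsontest.h"
#include <json/value.h>

struct ValueSmokeTest : JsonTest::TestCase {}; // illustrative fixture type

// The macro declares TestValueSmokeTestintegerRoundTrip and leaves its
// runTestCase() signature open; the braces below become the test body.
JSONTEST_FIXTURE(ValueSmokeTest, integerRoundTrip) {
  Json::Value v(42);
  JSONTEST_ASSERT(v.isInt());
  JSONTEST_ASSERT_EQUAL(42, v.asInt());
  JSONTEST_ASSERT_STRING_EQUAL("ValueSmokeTest/integerRoundTrip", testName());
}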

File diff suppressed because it is too large.


@@ -1,10 +0,0 @@
Import( 'env_testing buildUnitTests' )
buildUnitTests( env_testing, Split( """
main.cpp
jsontest.cpp
""" ),
'test_lib_json' )
# For 'check' to work, 'libs' must be built first.
env_testing.Depends('test_lib_json', '#libs')


@@ -1,10 +1,16 @@
# removes all files created during testing
# Copyright 2007 Baptiste Lepilleur and The JsonCpp Authors
# Distributed under MIT license, or public domain if desired and
# recognized in your jurisdiction.
# See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
"""Removes all files created during testing."""
import glob
import os
paths = []
for pattern in [ '*.actual', '*.actual-rewrite', '*.rewrite', '*.process-output' ]:
paths += glob.glob( 'data/' + pattern )
paths += glob.glob('data/' + pattern)
for path in paths:
os.unlink( path )
os.unlink(path)


@@ -0,0 +1 @@
[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]
]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]


@@ -0,0 +1,4 @@
// Comment for array
.=[]
// Comment within array
.[0]="one-element"


@@ -0,0 +1,5 @@
// Comment for array
[
// Comment within array
"one-element"
]


@@ -1,10 +1,15 @@
# Copyright 2007 Baptiste Lepilleur and The JsonCpp Authors
# Distributed under MIT license, or public domain if desired and
# recognized in your jurisdiction.
# See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
from __future__ import print_function
import glob
import os.path
for path in glob.glob( '*.json' ):
for path in glob.glob('*.json'):
text = file(path,'rt').read()
target = os.path.splitext(path)[0] + '.expected'
if os.path.exists( target ):
if os.path.exists(target):
print('skipping:', target)
else:
print('creating:', target)


@@ -1,4 +1,11 @@
# Simple implementation of a json test runner to run the test against json-py.
# Copyright 2007 Baptiste Lepilleur and The JsonCpp Authors
# Distributed under MIT license, or public domain if desired and
# recognized in your jurisdiction.
# See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
"""Simple implementation of a json test runner to run the test against
json-py."""
from __future__ import print_function
import sys
import os.path
@@ -15,50 +22,50 @@ actual_path = base_path + '.actual'
rewrite_path = base_path + '.rewrite'
rewrite_actual_path = base_path + '.actual-rewrite'
def valueTreeToString( fout, value, path = '.' ):
def valueTreeToString(fout, value, path = '.'):
ty = type(value)
if ty is types.DictType:
fout.write( '%s={}\n' % path )
fout.write('%s={}\n' % path)
suffix = path[-1] != '.' and '.' or ''
names = value.keys()
names.sort()
for name in names:
valueTreeToString( fout, value[name], path + suffix + name )
valueTreeToString(fout, value[name], path + suffix + name)
elif ty is types.ListType:
fout.write( '%s=[]\n' % path )
for index, childValue in zip( xrange(0,len(value)), value ):
valueTreeToString( fout, childValue, path + '[%d]' % index )
fout.write('%s=[]\n' % path)
for index, childValue in zip(xrange(0,len(value)), value):
valueTreeToString(fout, childValue, path + '[%d]' % index)
elif ty is types.StringType:
fout.write( '%s="%s"\n' % (path,value) )
fout.write('%s="%s"\n' % (path,value))
elif ty is types.IntType:
fout.write( '%s=%d\n' % (path,value) )
fout.write('%s=%d\n' % (path,value))
elif ty is types.FloatType:
fout.write( '%s=%.16g\n' % (path,value) )
fout.write('%s=%.16g\n' % (path,value))
elif value is True:
fout.write( '%s=true\n' % path )
fout.write('%s=true\n' % path)
elif value is False:
fout.write( '%s=false\n' % path )
fout.write('%s=false\n' % path)
elif value is None:
fout.write( '%s=null\n' % path )
fout.write('%s=null\n' % path)
else:
assert False and "Unexpected value type"
def parseAndSaveValueTree( input, actual_path ):
root = json.loads( input )
fout = file( actual_path, 'wt' )
valueTreeToString( fout, root )
def parseAndSaveValueTree(input, actual_path):
root = json.loads(input)
fout = file(actual_path, 'wt')
valueTreeToString(fout, root)
fout.close()
return root
def rewriteValueTree( value, rewrite_path ):
rewrite = json.dumps( value )
def rewriteValueTree(value, rewrite_path):
rewrite = json.dumps(value)
#rewrite = rewrite[1:-1] # Somehow the string is quoted ! jsonpy bug ?
file( rewrite_path, 'wt').write( rewrite + '\n' )
file(rewrite_path, 'wt').write(rewrite + '\n')
return rewrite
input = file( input_path, 'rt' ).read()
root = parseAndSaveValueTree( input, actual_path )
rewrite = rewriteValueTree( json.write( root ), rewrite_path )
rewrite_root = parseAndSaveValueTree( rewrite, rewrite_actual_path )
input = file(input_path, 'rt').read()
root = parseAndSaveValueTree(input, actual_path)
rewrite = rewriteValueTree(json.write(root), rewrite_path)
rewrite_root = parseAndSaveValueTree(rewrite, rewrite_actual_path)
sys.exit( 0 )
sys.exit(0)


@@ -1,3 +1,8 @@
# Copyright 2007 Baptiste Lepilleur and The JsonCpp Authors
# Distributed under MIT license, or public domain if desired and
# recognized in your jurisdiction.
# See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
from __future__ import print_function
from __future__ import unicode_literals
from io import open
@@ -14,6 +19,7 @@ def getStatusOutput(cmd):
Return int, unicode (for both Python 2 and 3).
Note: os.popen().close() would return None for 0.
"""
print(cmd, file=sys.stderr)
pipe = os.popen(cmd)
process_output = pipe.read()
try:
@@ -25,11 +31,11 @@ def getStatusOutput(cmd):
pass # python3
status = pipe.close()
return status, process_output
def compareOutputs( expected, actual, message ):
def compareOutputs(expected, actual, message):
expected = expected.strip().replace('\r','').split('\n')
actual = actual.strip().replace('\r','').split('\n')
diff_line = 0
max_line_to_compare = min( len(expected), len(actual) )
max_line_to_compare = min(len(expected), len(actual))
for index in range(0,max_line_to_compare):
if expected[index].strip() != actual[index].strip():
diff_line = index + 1
@@ -38,7 +44,7 @@ def compareOutputs( expected, actual, message ):
diff_line = max_line_to_compare+1
if diff_line == 0:
return None
def safeGetLine( lines, index ):
def safeGetLine(lines, index):
index += -1
if index >= len(lines):
return ''
@@ -48,64 +54,101 @@ def compareOutputs( expected, actual, message ):
Actual: '%s'
""" % (message, diff_line,
safeGetLine(expected,diff_line),
safeGetLine(actual,diff_line) )
def safeReadFile( path ):
safeGetLine(actual,diff_line))
def safeReadFile(path):
try:
return open( path, 'rt', encoding = 'utf-8' ).read()
return open(path, 'rt', encoding = 'utf-8').read()
except IOError as e:
return '<File "%s" is missing: %s>' % (path,e)
def runAllTests( jsontest_executable_path, input_dir = None,
use_valgrind=False, with_json_checker=False ):
def runAllTests(jsontest_executable_path, input_dir = None,
use_valgrind=False, with_json_checker=False,
writerClass='StyledWriter'):
if not input_dir:
input_dir = os.path.join( os.getcwd(), 'data' )
tests = glob( os.path.join( input_dir, '*.json' ) )
input_dir = os.path.join(os.getcwd(), 'data')
tests = glob(os.path.join(input_dir, '*.json'))
if with_json_checker:
test_jsonchecker = glob( os.path.join( input_dir, '../jsonchecker', '*.json' ) )
all_test_jsonchecker = glob(os.path.join(input_dir, '../jsonchecker', '*.json'))
# These tests fail with strict json support, but pass with jsoncpp extra lieniency
"""
Failure details:
* Test ../jsonchecker/fail25.json
Parsing should have failed:
[" tab character in string "]
* Test ../jsonchecker/fail13.json
Parsing should have failed:
{"Numbers cannot have leading zeroes": 013}
* Test ../jsonchecker/fail18.json
Parsing should have failed:
[[[[[[[[[[[[[[[[[[[["Too deep"]]]]]]]]]]]]]]]]]]]]
* Test ../jsonchecker/fail8.json
Parsing should have failed:
["Extra close"]]
* Test ../jsonchecker/fail7.json
Parsing should have failed:
["Comma after the close"],
* Test ../jsonchecker/fail10.json
Parsing should have failed:
{"Extra value after close": true} "misplaced quoted value"
* Test ../jsonchecker/fail27.json
Parsing should have failed:
["line
break"]
"""
known_differences_withjsonchecker = [ "fail25.json", "fail13.json", "fail18.json", "fail8.json",
"fail7.json", "fail10.json", "fail27.json" ]
test_jsonchecker = [ test for test in all_test_jsonchecker if os.path.basename(test) not in known_differences_withjsonchecker ]
else:
test_jsonchecker = []
failed_tests = []
valgrind_path = use_valgrind and VALGRIND_CMD or ''
for input_path in tests + test_jsonchecker:
expect_failure = os.path.basename( input_path ).startswith( 'fail' )
expect_failure = os.path.basename(input_path).startswith('fail')
is_json_checker_test = (input_path in test_jsonchecker) or expect_failure
print('TESTING:', input_path, end=' ')
options = is_json_checker_test and '--json-checker' or ''
cmd = '%s%s %s "%s"' % (
valgrind_path, jsontest_executable_path, options,
options += ' --json-writer %s'%writerClass
cmd = '%s%s %s "%s"' % ( valgrind_path, jsontest_executable_path, options,
input_path)
status, process_output = getStatusOutput(cmd)
if is_json_checker_test:
if expect_failure:
if not status:
print('FAILED')
failed_tests.append( (input_path, 'Parsing should have failed:\n%s' %
safeReadFile(input_path)) )
failed_tests.append((input_path, 'Parsing should have failed:\n%s' %
safeReadFile(input_path)))
else:
print('OK')
else:
if status:
print('FAILED')
failed_tests.append( (input_path, 'Parsing failed:\n' + process_output) )
failed_tests.append((input_path, 'Parsing failed:\n' + process_output))
else:
print('OK')
else:
base_path = os.path.splitext(input_path)[0]
actual_output = safeReadFile( base_path + '.actual' )
actual_rewrite_output = safeReadFile( base_path + '.actual-rewrite' )
open(base_path + '.process-output', 'wt', encoding = 'utf-8').write( process_output )
actual_output = safeReadFile(base_path + '.actual')
actual_rewrite_output = safeReadFile(base_path + '.actual-rewrite')
open(base_path + '.process-output', 'wt', encoding = 'utf-8').write(process_output)
if status:
print('parsing failed')
failed_tests.append( (input_path, 'Parsing failed:\n' + process_output) )
failed_tests.append((input_path, 'Parsing failed:\n' + process_output))
else:
expected_output_path = os.path.splitext(input_path)[0] + '.expected'
expected_output = open( expected_output_path, 'rt', encoding = 'utf-8' ).read()
detail = ( compareOutputs( expected_output, actual_output, 'input' )
or compareOutputs( expected_output, actual_rewrite_output, 'rewrite' ) )
expected_output = open(expected_output_path, 'rt', encoding = 'utf-8').read()
detail = (compareOutputs(expected_output, actual_output, 'input')
or compareOutputs(expected_output, actual_rewrite_output, 'rewrite'))
if detail:
print('FAILED')
failed_tests.append( (input_path, detail) )
failed_tests.append((input_path, detail))
else:
print('OK')
@@ -117,7 +160,7 @@ def runAllTests( jsontest_executable_path, input_dir = None,
print(failed_test[1])
print()
print('Test results: %d passed, %d failed.' % (len(tests)-len(failed_tests),
len(failed_tests) ))
len(failed_tests)))
return 1
else:
print('All %d tests passed.' % len(tests))
@@ -125,7 +168,7 @@ def runAllTests( jsontest_executable_path, input_dir = None,
def main():
from optparse import OptionParser
parser = OptionParser( usage="%prog [options] <path to jsontestrunner.exe> [test case directory]" )
parser = OptionParser(usage="%prog [options] <path to jsontestrunner.exe> [test case directory]")
parser.add_option("--valgrind",
action="store_true", dest="valgrind", default=False,
help="run all the tests using valgrind to detect memory leaks")
@@ -136,17 +179,32 @@ def main():
options, args = parser.parse_args()
if len(args) < 1 or len(args) > 2:
parser.error( 'Must provides at least path to jsontestrunner executable.' )
sys.exit( 1 )
parser.error('Must provides at least path to jsontestrunner executable.')
sys.exit(1)
jsontest_executable_path = os.path.normpath( os.path.abspath( args[0] ) )
jsontest_executable_path = os.path.normpath(os.path.abspath(args[0]))
if len(args) > 1:
input_path = os.path.normpath( os.path.abspath( args[1] ) )
input_path = os.path.normpath(os.path.abspath(args[1]))
else:
input_path = None
status = runAllTests( jsontest_executable_path, input_path,
use_valgrind=options.valgrind, with_json_checker=options.with_json_checker )
sys.exit( status )
status = runAllTests(jsontest_executable_path, input_path,
use_valgrind=options.valgrind,
with_json_checker=options.with_json_checker,
writerClass='StyledWriter')
if status:
sys.exit(status)
status = runAllTests(jsontest_executable_path, input_path,
use_valgrind=options.valgrind,
with_json_checker=options.with_json_checker,
writerClass='StyledStreamWriter')
if status:
sys.exit(status)
status = runAllTests(jsontest_executable_path, input_path,
use_valgrind=options.valgrind,
with_json_checker=options.with_json_checker,
writerClass='BuiltStyledStreamWriter')
if status:
sys.exit(status)
if __name__ == '__main__':
main()
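The runner above now exercises three writer flavours via its --json-writer option. Below is a hedged C++ sketch of what those roughly correspond to on the library side (the third runner option is assumed to map to the StreamWriterBuilder-based API, and the exact output whitespace may differ).

#include <iostream>
#include <json/json.h>

// Writes the same value with the legacy StyledWriter / StyledStreamWriter
// classes and with the builder-based API.
int main() {
  Json::Value root;
  root["name"] = "jsoncpp";
  root["version"] = "1.9.0";

  Json::StyledWriter styled;              // legacy, returns a string
  std::cout << styled.write(root);

  Json::StyledStreamWriter styledStream;  // legacy, writes to a stream
  styledStream.write(std::cout, root);

  Json::StreamWriterBuilder builder;      // current builder-based API
  std::cout << Json::writeString(builder, root) << "\n";
  return 0;
}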


@@ -1,3 +1,8 @@
# Copyright 2009 Baptiste Lepilleur and The JsonCpp Authors
# Distributed under MIT license, or public domain if desired and
# recognized in your jurisdiction.
# See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
from __future__ import print_function
from __future__ import unicode_literals
from io import open
@@ -11,18 +16,18 @@ import optparse
VALGRIND_CMD = 'valgrind --tool=memcheck --leak-check=yes --undef-value-errors=yes'
class TestProxy(object):
def __init__( self, test_exe_path, use_valgrind=False ):
self.test_exe_path = os.path.normpath( os.path.abspath( test_exe_path ) )
def __init__(self, test_exe_path, use_valgrind=False):
self.test_exe_path = os.path.normpath(os.path.abspath(test_exe_path))
self.use_valgrind = use_valgrind
def run( self, options ):
def run(self, options):
if self.use_valgrind:
cmd = VALGRIND_CMD.split()
else:
cmd = []
cmd.extend( [self.test_exe_path, '--test-auto'] + options )
cmd.extend([self.test_exe_path, '--test-auto'] + options)
try:
process = subprocess.Popen( cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT )
process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
except:
print(cmd)
raise
@@ -31,9 +36,9 @@ class TestProxy(object):
return False, stdout
return True, stdout
def runAllTests( exe_path, use_valgrind=False ):
test_proxy = TestProxy( exe_path, use_valgrind=use_valgrind )
status, test_names = test_proxy.run( ['--list-tests'] )
def runAllTests(exe_path, use_valgrind=False):
test_proxy = TestProxy(exe_path, use_valgrind=use_valgrind)
status, test_names = test_proxy.run(['--list-tests'])
if not status:
print("Failed to obtain unit tests list:\n" + test_names, file=sys.stderr)
return 1
@@ -41,11 +46,11 @@ def runAllTests( exe_path, use_valgrind=False ):
failures = []
for name in test_names:
print('TESTING %s:' % name, end=' ')
succeed, result = test_proxy.run( ['--test', name] )
succeed, result = test_proxy.run(['--test', name])
if succeed:
print('OK')
else:
failures.append( (name, result) )
failures.append((name, result))
print('FAILED')
failed_count = len(failures)
pass_count = len(test_names) - failed_count
@@ -53,8 +58,7 @@ def runAllTests( exe_path, use_valgrind=False ):
print()
for name, result in failures:
print(result)
print('%d/%d tests passed (%d failure(s))' % (
pass_count, len(test_names), failed_count))
print('%d/%d tests passed (%d failure(s))' % ( pass_count, len(test_names), failed_count))
return 1
else:
print('All %d tests passed' % len(test_names))
@@ -62,7 +66,7 @@ def runAllTests( exe_path, use_valgrind=False ):
def main():
from optparse import OptionParser
parser = OptionParser( usage="%prog [options] <path to test_lib_json.exe>" )
parser = OptionParser(usage="%prog [options] <path to test_lib_json.exe>")
parser.add_option("--valgrind",
action="store_true", dest="valgrind", default=False,
help="run all the tests using valgrind to detect memory leaks")
@@ -70,11 +74,11 @@ def main():
options, args = parser.parse_args()
if len(args) != 1:
parser.error( 'Must provides at least path to test_lib_json executable.' )
sys.exit( 1 )
parser.error('Must provides at least path to test_lib_json executable.')
sys.exit(1)
exit_code = runAllTests( args[0], use_valgrind=options.valgrind )
sys.exit( exit_code )
exit_code = runAllTests(args[0], use_valgrind=options.valgrind)
sys.exit(exit_code)
if __name__ == '__main__':
main()


@@ -1 +0,0 @@
1.3.0

version.in (new file)

@@ -0,0 +1 @@
@JSONCPP_VERSION@

version.txt (new file)

@@ -0,0 +1 @@
1.8.4