Compare commits

63 Commits
1.9.1 ... 1.9.2

Author SHA1 Message Date
Billy Donahue
d2e6a971f4 some test coverage for Value::iterator (#1093) 2019-11-12 02:16:54 -05:00
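As an aside on the surface those tests exercise: a minimal sketch of walking a Json::Value with its iterator API, assuming the usual begin()/end()/key() members from the public headers (illustrative, not taken from the commit itself):

```cpp
#include <json/json.h>
#include <iostream>

int main() {
  Json::Value obj;
  obj["a"] = 1;
  obj["b"] = 2;
  const Json::Value& view = obj;
  // const_iterator walks the members; key() yields the member name as a Value.
  for (Json::Value::const_iterator it = view.begin(); it != view.end(); ++it)
    std::cout << it.key().asString() << " = " << it->asInt() << "\n";
  return 0;
}
```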
Billy Donahue
53c8e2cb3b Issue 1066 (#1080)
Implemented `as<T>()` and `is<T>()` with accompanying tests
2019-11-11 22:43:52 -05:00
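A minimal sketch of the templated accessors mentioned above, assuming specializations exist for the common types (bool, Json::Int, double, Json::String); it mirrors the non-template isInt()/asString() helpers rather than documenting the exact set added in #1080:

```cpp
#include <json/json.h>
#include <iostream>

int main() {
  Json::Value v;
  v["age"] = 20;
  v["name"] = "colin";

  // is<T>() parallels isInt()/isString(); as<T>() parallels asInt()/asString().
  if (v["age"].is<Json::Int>())
    std::cout << v["age"].as<Json::Int>() << "\n";
  if (v["name"].is<Json::String>())
    std::cout << v["name"].as<Json::String>() << "\n";
  return 0;
}
```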
Jordan Bayles
645cd0412c Number fixes (#1053)
* cleaning up the logic for parsing numbers

* Add Testcases for new Reader in jsontestrunner
2019-11-09 11:49:16 +08:00
dota17
ff58fdcc75 Update coverage badge (#1088)
update coverage badge
change int to Json::ArrayIndex in for-loop
2019-11-07 15:25:06 +08:00
chenguoping
82b736734d add testcase for json_value.cpp to improve coverage [90+%] 2019-11-06 21:38:47 -08:00
Hans Johnson
a86e129983 ENH: Move to requiring python 3
Resolves #1081

See the following for more details:
https://devguide.python.org/devcycle/#end-of-life-branches
https://pythonclock.org/
2019-11-06 21:06:57 -08:00
Hans Johnson
5fb17a66b8 COMP: Remove shadow variable warning
jsoncpp/src/test_lib_json/main.cpp:2261:30: warning: declaration shadows a local variable [-Wshadow]
    Json::StyledStreamWriter writer;
                             ^
jsoncpp/src/test_lib_json/main.cpp:2237:28: note: previous declaration is here
  Json::StyledStreamWriter writer;
                           ^
2019-11-06 21:06:57 -08:00
Hans Johnson
7429bb2bfa COMP: Fix type mismatch ambiguity
jsoncpp/src/test_lib_json/main.cpp:354:31: error:
  implicit conversion changes signedness: 'int' to
  'std::__1::vector<Json::Value *, std::__1::allocator<Json::Value *> >::size_type'
  aka 'unsigned long') [-Werror,-Wsign-conversion]
    JSONTEST_ASSERT_EQUAL(vec[i], &array[i]);
                          ~~~ ^
2019-11-06 21:06:57 -08:00
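The usual remedy for this class of -Wsign-conversion warning is to give the loop index the container's unsigned size type; a hedged illustration of the pattern (not the exact patch in this commit):

```cpp
#include <vector>

// Index with the container's size_type so vec[i] needs no implicit
// int -> unsigned conversion.
void walk(const std::vector<int*>& vec) {
  for (std::vector<int*>::size_type i = 0; i < vec.size(); ++i)
    (void)vec[i];
}

int main() {
  std::vector<int*> v(3, nullptr);
  walk(v);
  return 0;
}
```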
Christopher Dunn
2eb20a938c Remove deprecated makerelease.py
re: #1081
2019-11-04 01:41:35 -08:00
Billy Donahue
638ad269e7 Explicitly specify hexfloat in TestResult operator<< (#1078) 2019-11-04 01:37:02 -08:00
Christopher Dunn
ec9302c4ed Avoid deprecated Meson feature
* https://mesonbuild.com/Python-3-module.html

> This module is deprecated and replaced by the python module.
2019-11-04 01:29:02 -08:00
Christopher Dunn
fb9aaf8112 Update meson/python
```
DEPRECATION: Project targetting '>= 0.41.1' but tried to use feature deprecated since '0.48.0': python3 module
Build targets in project: 3
WARNING: Deprecated features used:
 * 0.48.0: {'python3 module'}
```
2019-11-04 01:29:02 -08:00
dota17
2703c306a3 add coverage badge in readme (#1072) 2019-10-26 17:03:05 +08:00
dota17
c634b98e7d modify README.md: add some badges (#1070) 2019-10-25 17:12:11 +08:00
dota17
6c9408d128 remove pushError in CharReader (#1055) 2019-10-23 15:31:25 -07:00
dota17
54bd178bd8 update testcases to improve coverage (#1061) 2019-10-23 15:30:34 -07:00
Jacob Bundgaard
41ffff01d3 Fix link to Amalgamated wiki article (#1064) 2019-10-18 11:06:56 -07:00
Hans Johnson
b082693b9e COMP: Improve const correctness for ValueIterators (#1056)
The protected deref method had an inconsistent interface: a const
function that returned a non-const reference. Resolves #914.
2019-10-17 10:52:13 -07:00
nicolaswilson
a955529e47 Added emitUTF8 setting. (#1045)
* Added emitUTF8 setting to emit UTF8 format JSON.

* Added a test for emitUTF8, with it in default, on and off states.

* Review comments addressed.

* Merged master into my branch & resolved conflicts.

* Fix clang-format errors.

* Fix clang-format errors.

* Fixed clang-format errors.

* Fixed clang-format errors.
2019-10-17 10:47:51 -07:00
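A hedged sketch of the new setting in use: the emitUTF8 key name comes from the commit above, and the default behaviour (escaping non-ASCII characters as \uXXXX unless emitUTF8 is enabled) is assumed rather than quoted from the patch:

```cpp
#include <json/json.h>
#include <iostream>

int main() {
  Json::Value root;
  root["greeting"] = "grüß dich"; // contains non-ASCII characters

  Json::StreamWriterBuilder builder;
  builder["emitUTF8"] = true; // write raw UTF-8 instead of \uXXXX escapes
  std::cout << Json::writeString(builder, root) << std::endl;
  return 0;
}
```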
dota17
f59ac2a1d7 add coveralls to test coverage (#1060) 2019-10-17 10:46:41 -07:00
Jordan Bayles
a07b37e4ec Improve performance for comment parsing (#1052)
* Improve performance for comment parsing

* Fix weird main.cpp issue

* Readd newline

* remove carriage return feed char

* Remove unnecessary checks
2019-10-17 10:43:25 -07:00
Hans Johnson
aebc7faa4f BUG: New CMake features used that break backward compatibility
We want jsoncpp to compile and be readily available with older
versions of CMake. The use of newer CMake commands requires
conditional statements so that older strategies can still be used
with older versions of CMake.

Resolves: #1018
2019-10-17 11:11:34 -05:00
dota17
bdacfd7bc0 add a new method to insert a new value in an array at specific index. (#949)
* add a new method to insert a new value in an array at specific index.

* update: index > length, return false;

* fix clang-format
2019-10-16 14:33:53 +08:00
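A minimal sketch of the new call, with the signature (an ArrayIndex position plus the value, returning bool on success) assumed from the commit notes rather than quoted from the header:

```cpp
#include <json/json.h>
#include <iostream>

int main() {
  Json::Value arr(Json::arrayValue);
  arr.append("a");
  arr.append("c");

  // Insert "b" at index 1, shifting "c" right; an index > length() returns false.
  if (arr.insert(1, "b"))
    std::cout << arr << std::endl; // ["a", "b", "c"]
  else
    std::cout << "index out of range" << std::endl;
  return 0;
}
```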
dota17
c5f66ab816 Test Framework Modify : Remove JSONTEST_REGISTER_FIXTURE (#1050)
* add JSONTEST_FIXTURE_V2 to automatically register

* fix clang-format

* revert singleton
2019-10-15 15:48:50 -07:00
Rosen Penev
bcad4e4de2 clang-tidy cleanups 2 (#1048)
* [clang-tidy] Add explicit to single argument constructor

Found with hicpp-explicit-conversions

Signed-off-by: Rosen Penev <rosenp@gmail.com>

* [clang-tidy] Fix mismatching declaration

Found with readability-inconsistent-declaration-parameter-name

Signed-off-by: Rosen Penev <rosenp@gmail.com>

* [clang-tidy] Replace {} with = default

Found with modernize-use-equals-default

Signed-off-by: Rosen Penev <rosenp@gmail.com>

* [clang-tidy] Remove redundant .c_str()

Found with readability-redundant-string-cstr

Signed-off-by: Rosen Penev <rosenp@gmail.com>

* [clang-tidy] Simplify boolean expressions

Found with readability-simplify-boolean-expr

Signed-off-by: Rosen Penev <rosenp@gmail.com>

* [clang-tidy] Use std::move

Found with modernize-pass-by-value

Signed-off-by: Rosen Penev <rosenp@gmail.com>

* [clang-tidy] Uppercase literal suffixes

Found with hicpp-uppercase-literal-suffix

Signed-off-by: Rosen Penev <rosenp@gmail.com>
2019-10-15 15:27:23 +08:00
dota17
7329223f58 fix clang-format (#1049)
fix clang-format for #1039
2019-10-14 09:42:47 +08:00
Jordan Bayles
2e33c218cb Fix fuzzer off by one error (#1047)
* Fix fuzzer off by one error

Currently the fuzzer has an off-by-one error, as it passes a bad length
to the CharReader::parse method, resulting in a heap buffer overflow.

* Rebase master, rerun clang format
2019-10-11 15:08:42 -07:00
dota17
ddc0748c4f add testcases for writerTest [improve coverage] (#1039)
* add testcases for writerTest

* update StyledWriterTest, StyledStreamWriterTest and StreamWriterTest

* run clang-format

* add FastWriter Test

* Improve Coverage to 90+%
2019-10-11 14:39:09 -07:00
es1x
2ee3b1dbb1 Re-add JSONCPP_NORETURN (#1041)
Fixes Visual Studio 2013 compatibility.
2019-10-11 11:33:36 -07:00
Jordan Bayles
f34bf24bbd Issue #958: Travis CI should enforce clang-format standards (#1026)
* Issue #958: Travis CI should enforce clang-format standards

This patch adds clang format support to the travis bots.

* Update path

* Roll back to version 8 since 9 is in test

* Cleanup clang

* Revert "Delete JSONCPP_DEPRECATED, use [[deprecated]] instead. (#978)" (#1029)

This reverts commit b27c83f691.
2019-10-11 11:19:00 -07:00
Jacob Bundgaard
c4bc6da87d Fix dead link in CONTRIBUTING.md (#1044) 2019-10-10 10:22:25 +08:00
dota17
736409f1b5 fix clang-format error for ci (#1036)
* fix clang-format error for ci

* update
2019-10-01 12:54:07 -07:00
Griffin Downs
7e97345e26 Add vcpkg installation instructions (#1037) 2019-10-01 12:53:42 -07:00
Vincent
227c7cdfa5 Supplement the testcase for comparing object (#1032)
* supplement the testcase for comparing object

* update testcase

* add a new test scenarios in compareObject
2019-09-25 14:07:34 -07:00
Vincent
00c2c9f6e4 Supplement the testcase for comparing the Array and Null (#1031)
* supplement the testcase for comparing the Array and Null

* update testcase
2019-09-25 14:07:08 -07:00
Jordan Bayles
d448610770 Fixup Json::Value append methods, run clang format. (#1022) 2019-09-25 14:05:45 -07:00
Jordan Bayles
00b979f086 Issue #970: Rename features.h to json_features.h (#1024)
This patch fixes a build issue on CMake, presumably due to the new glibc
having a features.h include file. This patch renames our features.h file
to avoid a name collision.
2019-09-25 14:04:53 -07:00
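For downstream code that included the old header directly, the rename only means updating the include path; a minimal sketch assuming a non-amalgamated include layout:

```cpp
// Previously: #include <json/features.h>
#include <json/json_features.h>

int main() {
  // The Features class itself is unchanged by the rename.
  Json::Features strict = Json::Features::strictMode();
  (void)strict;
  return 0;
}
```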
Rosen Penev
ae4dc9aa62 clang-tidy fixes (#1033)
* [clang-tidy] Replace C typedef with C++ using

Found with modernize-use-using

Added to .clang-tidy file.

Signed-off-by: Rosen Penev <rosenp@gmail.com>

* [clang-tidy] Remove redundant member init

Found with readability-redundant-member-init

Added to .clang-tidy

* [clang-tidy] Replace C casts with C++ ones

Found with google-readability-casting

Signed-off-by: Rosen Penev <rosenp@gmail.com>

* [clang-tidy] Use default member init

Found with modernize-use-default-member-init

Signed-off-by: Rosen Penev <rosenp@gmail.com>
2019-09-25 14:03:30 -07:00
dota17
e9ccbe0145 Create an example directory and add some code examples. (#944)
* update example directory

* modify some compile error.

* update with clang-format

* update

* update

* add_definitions("../include/json")

* change CMakeLists.txt

* update streamWrite.cpp

* update

* Update readFromStream.cpp

* fix typo
2019-09-17 13:30:00 -07:00
Rosen Penev
21e3d21243 pkgconfig: Fix for cross compilation (#1027)
exec_ and prefix must be overridden in such a case.

Makes the .pc file more consistent with other projects.
2019-09-17 12:46:55 -07:00
Vincent
c97bd59ff2 add a testcase in ValueTest:CopyObject (#1028) 2019-09-17 12:46:29 -07:00
Jordan Bayles
81ae1d55f7 Just run clang format (#1025) 2019-09-16 12:37:14 -07:00
Jordan Bayles
18f790fbe7 Issue 1021: Fix clang 10 compilation (#1023)
This patch fixes an implicit long to double conversion, fixing
compilation on the as-of-yet unreleased clang v10.
2019-09-16 12:27:59 -07:00
m-gupta
3013ed48b3 jsoncpp: Define JSON_USE_INT64_DOUBLE_CONVERSION for clang as well. (#1002)
The current check to define JSON_USE_INT64_DOUBLE_CONVERSION
works for GCC but not clang.

Clang does define __GNUC__, but with a value of 4, which fails
the >= 6 check.

This avoids the -Wimplicit-int-float-conversion warning
when jsoncpp is built with a recent version of clang.

Signed-off-by: Manoj Gupta <manojgupta@google.com>
2019-09-16 12:24:13 -07:00
dota17
2cb9a5803e reinforce readToken function and add simple tests (#1012) 2019-09-16 11:25:22 -07:00
Frank Richter
c5cb313ca0 Do not allow tokenError tokens after input if failIfExtra is set. (#1014)
Currently when failIfExtra is set and strictRoot is not set,
OurReader::parse() will accept trailing non-whitespace after the JSON value
as long as the first token is not a valid JSON token. This commit changes
this to disallow any non-whitespace after the JSON value.

This commit also suppresses the "Extra non-whitespace after JSON value."
error message if parsing was aborted after another error.
2019-09-16 10:41:50 -07:00
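A hedged sketch of the behaviour described above: failIfExtra is a real CharReaderBuilder key, the sample document is illustrative, and with the setting enabled the trailing garbage below should make parse() fail:

```cpp
#include <json/json.h>
#include <iostream>
#include <memory>
#include <string>

int main() {
  Json::CharReaderBuilder builder;
  builder["failIfExtra"] = true; // reject non-whitespace after the JSON value

  const std::unique_ptr<Json::CharReader> reader(builder.newCharReader());
  const std::string doc = "{\"key\": \"value\"} trailing-garbage";

  Json::Value root;
  JSONCPP_STRING errs;
  if (!reader->parse(doc.data(), doc.data() + doc.size(), &root, &errs))
    std::cout << errs << std::endl; // expected: extra non-whitespace error
  return 0;
}
```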
dota17
abcd3f7b1f Modify code comments in write.h (#987)
* modify code comments in write.h

* update
2019-09-16 10:40:35 -07:00
dota17
d622250c3e Check the comments array boundary. (#993)
* check the comments array boundary

* remove empty line
2019-09-16 10:40:09 -07:00
dota17
db61dba885 Improving Code Readability (#1004) 2019-09-16 10:35:48 -07:00
dota17
7ef0f9fa5b [Language Conformance] Use constexpr restriction in jsoncpp (#1005)
* use constexpr restriction in jsoncpp

* remove TODO comment
2019-09-16 10:33:47 -07:00
dota17
3550a0a939 add some testcases: WriteTest, StreamWriterTest (#1015) 2019-09-16 10:32:46 -07:00
Google AutoFuzz Team
21ab82916b Add dictionary for fuzzing (#1020) 2019-09-16 10:30:59 -07:00
dota17
fd940255ce change the returned value (#1003) 2019-08-26 12:47:54 -07:00
aliha
472adb60ee Fix a couple of typos (#1007) 2019-08-26 12:37:05 -07:00
dota17
c92c87b47d Add some test cases in ValueTest (#1010)
* add some test cases in ValueTest

* add some test cases in ValueTest
2019-08-26 12:36:51 -07:00
Frank Richter
b941149a37 tests: Improve CharReaderFailIfExtraTest (#1011)
* There was a nonsensical change of 'failIfExtra' before calling strictMode():
  the latter resets the former.
  Dealt with by having one test with pure strictMode and one with strictMode
  but failIfExtra=false.
* The JSONTEST_ASSERT_STRING_EQUAL tests for the error strings swapped
  the 'expected' and 'actual' values.
2019-08-26 12:36:27 -07:00
dota17
2cf939e8c3 change Value::null to Value::nullSingleton() (#1000) 2019-08-13 22:42:10 -07:00
Jordan Bayles
7b28698c5c Cleanup versioning strategy relanding (#989) (#997)
* Cleanup versioning strategy

Currently, versioning is a mess. CMake and Meson have separate build
version number storage locations, with no way of knowing you need to
have both. Plus, due to recent revisions the amalgamate script is broken
unless you build first, and may still be broken afterwards.

This PR fixes some issues with versioning, and adds comments clarifying
what has to be done when doing a release.

* Run clang format

* Update SOVERSION....
2019-08-13 22:41:43 -07:00
Jordan Bayles
0d27381acf Revert "Cleanup versioning strategy (#989)" (#996)
This reverts commit 12325b814f.
2019-07-31 11:26:48 -07:00
Jordan Bayles
12325b814f Cleanup versioning strategy (#989)
* Cleanup versioning strategy

Currently, versioning is a mess. CMake and Meson have separate build
version number storage locations, with no way of knowing you need to
have both. Plus, due to recent revisions the amalgamate script is broken
unless you build first, and may still be broken afterwards.

This PR fixes some issues with versioning, and adds comments clarifying
what has to be done when doing a release.

* Run clang format

* Update SOVERSION....
2019-07-22 15:25:23 -07:00
dota17
b27c83f691 Delete JSONCPP_DEPRECATED, use [[deprecated]] instead. (#978)
* delete JSONCPP_DEPRECATED, use [[deprecated]]

* add pragma warning(disable:4996)

* add error C2416

* update

* update

* update
2019-07-17 13:35:33 -07:00
Billy Donahue
483eba84a7 Improve code comment formatting (Issue #985) 2019-07-17 13:04:53 -07:00
Jordan Bayles
b3507948e2 Fix definition check for GNUC 2019-07-17 13:03:23 -07:00
159 changed files with 2691 additions and 1429 deletions


@@ -1,47 +1,4 @@
---
# BasedOnStyle: LLVM
AccessModifierOffset: -2
ConstructorInitializerIndentWidth: 4
AlignEscapedNewlinesLeft: false
AlignTrailingComments: true
AllowAllParametersOfDeclarationOnNextLine: true
AllowShortIfStatementsOnASingleLine: false
AllowShortLoopsOnASingleLine: false
AlwaysBreakTemplateDeclarations: false
AlwaysBreakBeforeMultilineStrings: false
BreakBeforeBinaryOperators: false
BreakBeforeTernaryOperators: true
BreakConstructorInitializersBeforeComma: false
BinPackParameters: false
ColumnLimit: 80
ConstructorInitializerAllOnOneLineOrOnePerLine: false
DerivePointerBinding: false
ExperimentalAutoDetectBinPacking: false
IndentCaseLabels: false
MaxEmptyLinesToKeep: 1
NamespaceIndentation: None
ObjCSpaceBeforeProtocolList: true
PenaltyBreakBeforeFirstCallParameter: 19
PenaltyBreakComment: 60
PenaltyBreakString: 1000
PenaltyBreakFirstLessLess: 120
PenaltyExcessCharacter: 1000000
PenaltyReturnTypeOnItsOwnLine: 60
PointerBindsToType: true
SpacesBeforeTrailingComments: 1
Cpp11BracedListStyle: true
Standard: Cpp11
IndentWidth: 2
TabWidth: 8
UseTab: Never
BreakBeforeBraces: Attach
IndentFunctionDeclarationAfterType: false
SpacesInParentheses: false
SpacesInAngles: false
SpaceInEmptyParentheses: false
SpacesInCStyleCastParentheses: false
SpaceAfterControlStatementKeyword: true
SpaceBeforeAssignmentOperators: true
ContinuationIndentWidth: 4
...
BasedOnStyle: LLVM
DerivePointerAlignment: false
PointerAlignment: Left

.clang-tidy (new file)

@@ -0,0 +1,11 @@
---
Checks: 'google-readability-casting,modernize-use-default-member-init,modernize-use-using,readability-redundant-member-init'
WarningsAsErrors: ''
HeaderFilterRegex: ''
AnalyzeTemporaryDtors: false
FormatStyle: none
CheckOptions:
- key: modernize-use-using.IgnoreMacros
value: '0'
...

.gitignore

@@ -10,8 +10,6 @@
/libs/
/doc/doxyfile
/dist/
#/version
#/include/json/version.h
# MSVC project files:
*.sln


@@ -9,6 +9,7 @@ sudo: false
addons:
homebrew:
packages:
- clang-format
- meson
- ninja
update: false # do not update homebrew by default
@@ -17,6 +18,7 @@ addons:
- ubuntu-toolchain-r-test
- llvm-toolchain-xenial-8
packages:
- clang-format-8
- clang-8
- valgrind
matrix:
@@ -25,7 +27,7 @@ matrix:
include:
- name: Mac clang meson static release testing
os: osx
osx_image: xcode10.2
osx_image: xcode11
compiler: clang
env:
CXX="clang++"
@@ -60,6 +62,10 @@ matrix:
BUILD_TYPE=Debug
LIB_TYPE=shared
DESTDIR=/tmp/cmake_json_cpp
before_install:
- pip install --user cpp-coveralls
script: ./.travis_scripts/cmake_builder.sh
after_success:
- coveralls --include src/lib_json --include include
notifications:
email: false


@@ -63,9 +63,11 @@ meson --version
ninja --version
_COMPILER_NAME=`basename ${CXX}`
_BUILD_DIR_NAME="build-${BUILD_TYPE}_${LIB_TYPE}_${_COMPILER_NAME}"
meson --buildtype ${BUILD_TYPE} --default-library ${LIB_TYPE} . "${_BUILD_DIR_NAME}"
./.travis_scripts/run-clang-format.sh
meson --fatal-meson-warnings --werror --buildtype ${BUILD_TYPE} --default-library ${LIB_TYPE} . "${_BUILD_DIR_NAME}"
ninja -v -j 2 -C "${_BUILD_DIR_NAME}"
#ninja -v -j 2 -C "${_BUILD_DIR_NAME}" test
cd "${_BUILD_DIR_NAME}"
meson test --no-rebuild --print-errorlogs


@@ -0,0 +1,356 @@
#!/usr/bin/env python
"""A wrapper script around clang-format, suitable for linting multiple files
and to use for continuous integration.
This is an alternative API for the clang-format command line.
It runs over multiple files and directories in parallel.
A diff output is produced and a sensible exit code is returned.
NOTE: pulled from https://github.com/Sarcasm/run-clang-format, which is
licensed under the MIT license.
"""
from __future__ import print_function, unicode_literals
import argparse
import codecs
import difflib
import fnmatch
import io
import multiprocessing
import os
import signal
import subprocess
import sys
import traceback
from functools import partial
try:
from subprocess import DEVNULL # py3k
except ImportError:
DEVNULL = open(os.devnull, "wb")
DEFAULT_EXTENSIONS = 'c,h,C,H,cpp,hpp,cc,hh,c++,h++,cxx,hxx'
class ExitStatus:
SUCCESS = 0
DIFF = 1
TROUBLE = 2
def list_files(files, recursive=False, extensions=None, exclude=None):
if extensions is None:
extensions = []
if exclude is None:
exclude = []
out = []
for file in files:
if recursive and os.path.isdir(file):
for dirpath, dnames, fnames in os.walk(file):
fpaths = [os.path.join(dirpath, fname) for fname in fnames]
for pattern in exclude:
# os.walk() supports trimming down the dnames list
# by modifying it in-place,
# to avoid unnecessary directory listings.
dnames[:] = [
x for x in dnames
if
not fnmatch.fnmatch(os.path.join(dirpath, x), pattern)
]
fpaths = [
x for x in fpaths if not fnmatch.fnmatch(x, pattern)
]
for f in fpaths:
ext = os.path.splitext(f)[1][1:]
if ext in extensions:
out.append(f)
else:
out.append(file)
return out
def make_diff(file, original, reformatted):
return list(
difflib.unified_diff(
original,
reformatted,
fromfile='{}\t(original)'.format(file),
tofile='{}\t(reformatted)'.format(file),
n=3))
class DiffError(Exception):
def __init__(self, message, errs=None):
super(DiffError, self).__init__(message)
self.errs = errs or []
class UnexpectedError(Exception):
def __init__(self, message, exc=None):
super(UnexpectedError, self).__init__(message)
self.formatted_traceback = traceback.format_exc()
self.exc = exc
def run_clang_format_diff_wrapper(args, file):
try:
ret = run_clang_format_diff(args, file)
return ret
except DiffError:
raise
except Exception as e:
raise UnexpectedError('{}: {}: {}'.format(file, e.__class__.__name__,
e), e)
def run_clang_format_diff(args, file):
try:
with io.open(file, 'r', encoding='utf-8') as f:
original = f.readlines()
except IOError as exc:
raise DiffError(str(exc))
invocation = [args.clang_format_executable, file]
# Use of utf-8 to decode the process output.
#
# Hopefully, this is the correct thing to do.
#
# It's done due to the following assumptions (which may be incorrect):
# - clang-format will returns the bytes read from the files as-is,
# without conversion, and it is already assumed that the files use utf-8.
# - if the diagnostics were internationalized, they would use utf-8:
# > Adding Translations to Clang
# >
# > Not possible yet!
# > Diagnostic strings should be written in UTF-8,
# > the client can translate to the relevant code page if needed.
# > Each translation completely replaces the format string
# > for the diagnostic.
# > -- http://clang.llvm.org/docs/InternalsManual.html#internals-diag-translation
#
# It's not pretty, due to Python 2 & 3 compatibility.
encoding_py3 = {}
if sys.version_info[0] >= 3:
encoding_py3['encoding'] = 'utf-8'
try:
proc = subprocess.Popen(
invocation,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
universal_newlines=True,
**encoding_py3)
except OSError as exc:
raise DiffError(
"Command '{}' failed to start: {}".format(
subprocess.list2cmdline(invocation), exc
)
)
proc_stdout = proc.stdout
proc_stderr = proc.stderr
if sys.version_info[0] < 3:
# make the pipes compatible with Python 3,
# reading lines should output unicode
encoding = 'utf-8'
proc_stdout = codecs.getreader(encoding)(proc_stdout)
proc_stderr = codecs.getreader(encoding)(proc_stderr)
# hopefully the stderr pipe won't get full and block the process
outs = list(proc_stdout.readlines())
errs = list(proc_stderr.readlines())
proc.wait()
if proc.returncode:
raise DiffError(
"Command '{}' returned non-zero exit status {}".format(
subprocess.list2cmdline(invocation), proc.returncode
),
errs,
)
return make_diff(file, original, outs), errs
def bold_red(s):
return '\x1b[1m\x1b[31m' + s + '\x1b[0m'
def colorize(diff_lines):
def bold(s):
return '\x1b[1m' + s + '\x1b[0m'
def cyan(s):
return '\x1b[36m' + s + '\x1b[0m'
def green(s):
return '\x1b[32m' + s + '\x1b[0m'
def red(s):
return '\x1b[31m' + s + '\x1b[0m'
for line in diff_lines:
if line[:4] in ['--- ', '+++ ']:
yield bold(line)
elif line.startswith('@@ '):
yield cyan(line)
elif line.startswith('+'):
yield green(line)
elif line.startswith('-'):
yield red(line)
else:
yield line
def print_diff(diff_lines, use_color):
if use_color:
diff_lines = colorize(diff_lines)
if sys.version_info[0] < 3:
sys.stdout.writelines((l.encode('utf-8') for l in diff_lines))
else:
sys.stdout.writelines(diff_lines)
def print_trouble(prog, message, use_colors):
error_text = 'error:'
if use_colors:
error_text = bold_red(error_text)
print("{}: {} {}".format(prog, error_text, message), file=sys.stderr)
def main():
parser = argparse.ArgumentParser(description=__doc__)
parser.add_argument(
'--clang-format-executable',
metavar='EXECUTABLE',
help='path to the clang-format executable',
default='clang-format')
parser.add_argument(
'--extensions',
help='comma separated list of file extensions (default: {})'.format(
DEFAULT_EXTENSIONS),
default=DEFAULT_EXTENSIONS)
parser.add_argument(
'-r',
'--recursive',
action='store_true',
help='run recursively over directories')
parser.add_argument('files', metavar='file', nargs='+')
parser.add_argument(
'-q',
'--quiet',
action='store_true')
parser.add_argument(
'-j',
metavar='N',
type=int,
default=0,
help='run N clang-format jobs in parallel'
' (default number of cpus + 1)')
parser.add_argument(
'--color',
default='auto',
choices=['auto', 'always', 'never'],
help='show colored diff (default: auto)')
parser.add_argument(
'-e',
'--exclude',
metavar='PATTERN',
action='append',
default=[],
help='exclude paths matching the given glob-like pattern(s)'
' from recursive search')
args = parser.parse_args()
# use default signal handling, like diff return SIGINT value on ^C
# https://bugs.python.org/issue14229#msg156446
signal.signal(signal.SIGINT, signal.SIG_DFL)
try:
signal.SIGPIPE
except AttributeError:
# compatibility, SIGPIPE does not exist on Windows
pass
else:
signal.signal(signal.SIGPIPE, signal.SIG_DFL)
colored_stdout = False
colored_stderr = False
if args.color == 'always':
colored_stdout = True
colored_stderr = True
elif args.color == 'auto':
colored_stdout = sys.stdout.isatty()
colored_stderr = sys.stderr.isatty()
version_invocation = [args.clang_format_executable, str("--version")]
try:
subprocess.check_call(version_invocation, stdout=DEVNULL)
except subprocess.CalledProcessError as e:
print_trouble(parser.prog, str(e), use_colors=colored_stderr)
return ExitStatus.TROUBLE
except OSError as e:
print_trouble(
parser.prog,
"Command '{}' failed to start: {}".format(
subprocess.list2cmdline(version_invocation), e
),
use_colors=colored_stderr,
)
return ExitStatus.TROUBLE
retcode = ExitStatus.SUCCESS
files = list_files(
args.files,
recursive=args.recursive,
exclude=args.exclude,
extensions=args.extensions.split(','))
if not files:
return
njobs = args.j
if njobs == 0:
njobs = multiprocessing.cpu_count() + 1
njobs = min(len(files), njobs)
if njobs == 1:
# execute directly instead of in a pool,
# less overhead, simpler stacktraces
it = (run_clang_format_diff_wrapper(args, file) for file in files)
pool = None
else:
pool = multiprocessing.Pool(njobs)
it = pool.imap_unordered(
partial(run_clang_format_diff_wrapper, args), files)
while True:
try:
outs, errs = next(it)
except StopIteration:
break
except DiffError as e:
print_trouble(parser.prog, str(e), use_colors=colored_stderr)
retcode = ExitStatus.TROUBLE
sys.stderr.writelines(e.errs)
except UnexpectedError as e:
print_trouble(parser.prog, str(e), use_colors=colored_stderr)
sys.stderr.write(e.formatted_traceback)
retcode = ExitStatus.TROUBLE
# stop at the first unexpected error,
# something could be very wrong,
# don't process all files unnecessarily
if pool:
pool.terminate()
break
else:
sys.stderr.writelines(errs)
if outs == []:
continue
if not args.quiet:
print_diff(outs, use_color=colored_stdout)
if retcode == ExitStatus.SUCCESS:
retcode = ExitStatus.DIFF
return retcode
if __name__ == '__main__':
sys.exit(main())


@@ -0,0 +1,4 @@
#!/usr/bin/env bash
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
python $DIR/run-clang-format.py -r $DIR/../src/**/ $DIR/../include/**/


@@ -63,11 +63,18 @@ if(NOT DEFINED CMAKE_BUILD_TYPE)
endif()
project(JSONCPP
VERSION 1.9.0 # <major>[.<minor>[.<patch>[.<tweak>]]]
# Note: version must be updated in three places when doing a release. This
# annoying process ensures that amalgamate, CMake, and meson all report the
# correct version.
# 1. /meson.build
# 2. /include/json/version.h
# 3. /CMakeLists.txt
# IMPORTANT: also update the SOVERSION!!
VERSION 1.9.2 # <major>[.<minor>[.<patch>[.<tweak>]]]
LANGUAGES CXX)
message(STATUS "JsonCpp Version: ${JSONCPP_VERSION_MAJOR}.${JSONCPP_VERSION_MINOR}.${JSONCPP_VERSION_PATCH}")
set( JSONCPP_SOVERSION 21 )
set( JSONCPP_SOVERSION 22 )
option(JSONCPP_WITH_TESTS "Compile and (for jsoncpp_check) run JsonCpp test executables" ON)
option(JSONCPP_WITH_POST_BUILD_UNITTEST "Automatically run unit-tests as a post build step" ON)
@@ -89,10 +96,6 @@ set(DEBUG_LIBNAME_SUFFIX "" CACHE STRING "Optional suffix to append to the libra
set(JSONCPP_USE_SECURE_MEMORY "0" CACHE STRING "-D...=1 to use memory-wiping allocator for STL" )
# File version.h is only regenerated on CMake configure step
configure_file( "${PROJECT_SOURCE_DIR}/src/lib_json/version.h.in"
"${PROJECT_BINARY_DIR}/include/json/version.h"
NEWLINE_STYLE UNIX )
configure_file( "${PROJECT_SOURCE_DIR}/version.in"
"${PROJECT_BINARY_DIR}/version"
NEWLINE_STYLE UNIX )
@@ -101,11 +104,23 @@ macro(UseCompilationWarningAsError)
if(MSVC)
# Only enabled in debug because some old versions of VS STL generate
# warnings when compiled in release configuration.
add_compile_options($<$<CONFIG:Debug>:/WX>)
if(CMAKE_VERSION VERSION_GREATER_EQUAL 3.12.0)
add_compile_options($<$<CONFIG:Debug>:/WX>)
else()
set(CMAKE_CXX_FLAGS_DEBUG "${CMAKE_CXX_FLAGS_DEBUG} /WX ")
endif()
elseif(CMAKE_CXX_COMPILER_ID STREQUAL "GNU")
add_compile_options(-Werror)
if(CMAKE_VERSION VERSION_GREATER_EQUAL 3.12.0)
add_compile_options(-Werror)
else()
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Werror")
endif()
if(JSONCPP_WITH_STRICT_ISO)
if(CMAKE_VERSION VERSION_GREATER_EQUAL 3.12.0)
add_compile_options(-pedantic-errors)
else()
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -pedantic-errors")
endif()
endif()
endif()
endmacro()
@@ -116,29 +131,57 @@ include_directories( ${jsoncpp_SOURCE_DIR}/include )
if(MSVC)
# Only enabled in debug because some old versions of VS STL generate
# unreachable code warning when compiled in release configuration.
add_compile_options($<$<CONFIG:Debug>:/W4>)
if(CMAKE_VERSION VERSION_GREATER_EQUAL 3.12.0)
add_compile_options($<$<CONFIG:Debug>:/W4>)
else()
set(CMAKE_CXX_FLAGS_DEBUG "${CMAKE_CXX_FLAGS_DEBUG} /W4 ")
endif()
endif()
if(CMAKE_CXX_COMPILER_ID MATCHES "Clang")
# using regular Clang or AppleClang
add_compile_options(-Wall -Wconversion -Wshadow -Werror=conversion -Werror=sign-compare)
if(CMAKE_VERSION VERSION_GREATER_EQUAL 3.12.0)
add_compile_options(-Wall -Wconversion -Wshadow -Werror=conversion -Werror=sign-compare)
else()
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wall -Wconversion -Wshadow -Werror=conversion -Werror=sign-compare")
endif()
elseif(CMAKE_CXX_COMPILER_ID STREQUAL "GNU")
# using GCC
add_compile_options(-Wall -Wconversion -Wshadow -Wextra)
if(CMAKE_VERSION VERSION_GREATER_EQUAL 3.12.0)
add_compile_options(-Wall -Wconversion -Wshadow -Wextra)
else()
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wall -Wconversion -Wshadow -Wextra")
endif()
# not yet ready for -Wsign-conversion
if(JSONCPP_WITH_STRICT_ISO)
if(CMAKE_VERSION VERSION_GREATER_EQUAL 3.12.0)
add_compile_options(-pedantic)
else()
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -pedantic")
endif()
endif()
if(JSONCPP_WITH_WARNING_AS_ERROR)
if(CMAKE_VERSION VERSION_GREATER_EQUAL 3.12.0)
add_compile_options(-Werror=conversion)
else()
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Werror=conversion")
endif()
endif()
elseif(CMAKE_CXX_COMPILER_ID STREQUAL "Intel")
# using Intel compiler
add_compile_options(-Wall -Wconversion -Wshadow -Wextra -Werror=conversion)
if(CMAKE_VERSION VERSION_GREATER_EQUAL 3.12.0)
add_compile_options(-Wall -Wconversion -Wshadow -Wextra -Werror=conversion)
else()
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wall -Wconversion -Wshadow -Wextra -Werror=conversion")
endif()
if(JSONCPP_WITH_STRICT_ISO AND NOT JSONCPP_WITH_WARNING_AS_ERROR)
if(CMAKE_VERSION VERSION_GREATER_EQUAL 3.12.0)
add_compile_options(-pedantic)
else()
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -pedantic")
endif()
endif()
endif()
@@ -184,3 +227,5 @@ add_subdirectory( src )
#install the includes
add_subdirectory( include )
#install the example
add_subdirectory( example )


@@ -11,7 +11,7 @@ An example of a common Meson/Ninja environment is described next.
Thanks to David Seifert (@SoapGentoo), we (the maintainers) now use
[meson](http://mesonbuild.com/) and [ninja](https://ninja-build.org/) to build
for debugging, as well as for continuous integration (see
[`./travis_scripts/meson_builder.sh`](./travis_scripts/meson_builder.sh) ). Other systems may work, but minor
[`./.travis_scripts/meson_builder.sh`](./.travis_scripts/meson_builder.sh) ). Other systems may work, but minor
things like version strings might break.
First, install both meson (which requires Python3) and ninja.
@@ -26,7 +26,6 @@ Then,
LIB_TYPE=shared
#LIB_TYPE=static
meson --buildtype ${BUILD_TYPE} --default-library ${LIB_TYPE} . build-${LIB_TYPE}
#ninja -v -C build-${LIB_TYPE} test # This stopped working on my Mac.
ninja -v -C build-${LIB_TYPE}
cd build-${LIB_TYPE}
meson test --no-rebuild --print-errorlogs


@@ -1,6 +1,10 @@
# JsonCpp
[![badge](https://img.shields.io/badge/conan.io-jsoncpp%2F1.8.0-green.svg?logo=data:image/png;base64%2CiVBORw0KGgoAAAANSUhEUgAAAA4AAAAOCAMAAAAolt3jAAAA1VBMVEUAAABhlctjlstkl8tlmMtlmMxlmcxmmcxnmsxpnMxpnM1qnc1sn85voM91oM11oc1xotB2oc56pNF6pNJ2ptJ8ptJ8ptN9ptN8p9N5qNJ9p9N9p9R8qtOBqdSAqtOAqtR%2BrNSCrNJ/rdWDrNWCsNWCsNaJs9eLs9iRvNuVvdyVv9yXwd2Zwt6axN6dxt%2Bfx%2BChyeGiyuGjyuCjyuGly%2BGlzOKmzOGozuKoz%2BKqz%2BOq0OOv1OWw1OWw1eWx1eWy1uay1%2Baz1%2Baz1%2Bez2Oe02Oe12ee22ujUGwH3AAAAAXRSTlMAQObYZgAAAAFiS0dEAIgFHUgAAAAJcEhZcwAACxMAAAsTAQCanBgAAAAHdElNRQfgBQkREyOxFIh/AAAAiklEQVQI12NgAAMbOwY4sLZ2NtQ1coVKWNvoc/Eq8XDr2wB5Ig62ekza9vaOqpK2TpoMzOxaFtwqZua2Bm4makIM7OzMAjoaCqYuxooSUqJALjs7o4yVpbowvzSUy87KqSwmxQfnsrPISyFzWeWAXCkpMaBVIC4bmCsOdgiUKwh3JojLgAQ4ZCE0AMm2D29tZwe6AAAAAElFTkSuQmCC)](https://bintray.com/theirix/conan-repo/jsoncpp%3Atheirix)
[![badge](https://img.shields.io/badge/license-MIT-blue)](https://github.com/open-source-parsers/jsoncpp/blob/master/LICENSE)
[![badge](https://img.shields.io/badge/document-doxygen-brightgreen)](http://open-source-parsers.github.io/jsoncpp-docs/doxygen/index.html)
[![Coverage Status](https://coveralls.io/repos/github/open-source-parsers/jsoncpp/badge.svg?branch=master)](https://coveralls.io/github/open-source-parsers/jsoncpp?branch=master)
[JSON][json-org] is a lightweight data-interchange format. It can represent
numbers, strings, ordered sequences of values, and collections of name/value
@@ -31,14 +35,25 @@ format to store user input files.
## Using JsonCpp in your project
### The vcpkg dependency manager
You can download and install JsonCpp using the [vcpkg](https://github.com/Microsoft/vcpkg/) dependency manager:
git clone https://github.com/Microsoft/vcpkg.git
cd vcpkg
./bootstrap-vcpkg.sh
./vcpkg integrate install
vcpkg install jsoncpp
The JsonCpp port in vcpkg is kept up to date by Microsoft team members and community contributors. If the version is out of date, please [create an issue or pull request](https://github.com/Microsoft/vcpkg) on the vcpkg repository.
### Amalgamated source
https://github.com/open-source-parsers/jsoncpp/wiki/Amalgamated
https://github.com/open-source-parsers/jsoncpp/wiki/Amalgamated-(Possibly-outdated)
### The Meson Build System
If you are using the [Meson Build System](http://mesonbuild.com), then you can get a wrap file by downloading it from [Meson WrapDB](https://wrapdb.mesonbuild.com/jsoncpp), or simply use `meson wrap install jsoncpp`.
### Other ways
If you have trouble, see the Wiki, or post a question as an Issue.
If you have trouble, see the [Wiki](https://github.com/open-source-parsers/jsoncpp/wiki), or post a question as an Issue.
## License

amalgamate.py (now marked executable)

@@ -1,3 +1,5 @@
#!/usr/bin/env python
"""Amalgamate json-cpp library sources into a single source and header file.
Works with python2.6+ and python3.4+.
@@ -9,6 +11,9 @@ import os
import os.path
import sys
INCLUDE_PATH = "include/json"
SRC_PATH = "src/lib_json"
class AmalgamationFile:
def __init__(self, top_dir):
self.top_dir = top_dir
@@ -66,15 +71,15 @@ def amalgamate_source(source_top_dir=None,
header.add_text("/// If defined, indicates that the source file is amalgamated")
header.add_text("/// to prevent private header inclusion.")
header.add_text("#define JSON_IS_AMALGAMATION")
header.add_file("include/json/version.h")
header.add_file("include/json/allocator.h")
header.add_file("include/json/config.h")
header.add_file("include/json/forwards.h")
header.add_file("include/json/features.h")
header.add_file("include/json/value.h")
header.add_file("include/json/reader.h")
header.add_file("include/json/writer.h")
header.add_file("include/json/assertions.h")
header.add_file(os.path.join(INCLUDE_PATH, "version.h"))
header.add_file(os.path.join(INCLUDE_PATH, "allocator.h"))
header.add_file(os.path.join(INCLUDE_PATH, "config.h"))
header.add_file(os.path.join(INCLUDE_PATH, "forwards.h"))
header.add_file(os.path.join(INCLUDE_PATH, "json_features.h"))
header.add_file(os.path.join(INCLUDE_PATH, "value.h"))
header.add_file(os.path.join(INCLUDE_PATH, "reader.h"))
header.add_file(os.path.join(INCLUDE_PATH, "writer.h"))
header.add_file(os.path.join(INCLUDE_PATH, "assertions.h"))
header.add_text("#endif //ifndef JSON_AMALGAMATED_H_INCLUDED")
target_header_path = os.path.join(os.path.dirname(target_source_path), header_include_path)
@@ -94,8 +99,8 @@ def amalgamate_source(source_top_dir=None,
header.add_text("/// If defined, indicates that the source file is amalgamated")
header.add_text("/// to prevent private header inclusion.")
header.add_text("#define JSON_IS_AMALGAMATION")
header.add_file("include/json/config.h")
header.add_file("include/json/forwards.h")
header.add_file(os.path.join(INCLUDE_PATH, "config.h"))
header.add_file(os.path.join(INCLUDE_PATH, "forwards.h"))
header.add_text("#endif //ifndef JSON_FORWARD_AMALGAMATED_H_INCLUDED")
target_forward_header_path = os.path.join(os.path.dirname(target_source_path),
@@ -116,12 +121,11 @@ def amalgamate_source(source_top_dir=None,
#endif
""")
source.add_text("")
lib_json = "src/lib_json"
source.add_file(os.path.join(lib_json, "json_tool.h"))
source.add_file(os.path.join(lib_json, "json_reader.cpp"))
source.add_file(os.path.join(lib_json, "json_valueiterator.inl"))
source.add_file(os.path.join(lib_json, "json_value.cpp"))
source.add_file(os.path.join(lib_json, "json_writer.cpp"))
source.add_file(os.path.join(SRC_PATH, "json_tool.h"))
source.add_file(os.path.join(SRC_PATH, "json_reader.cpp"))
source.add_file(os.path.join(SRC_PATH, "json_valueiterator.inl"))
source.add_file(os.path.join(SRC_PATH, "json_value.cpp"))
source.add_file(os.path.join(SRC_PATH, "json_writer.cpp"))
print("Writing amalgamated source to %r" % target_source_path)
source.write_to(target_source_path)


@@ -3,7 +3,7 @@
\section _intro Introduction
<a HREF="http://www.json.org/">JSON (JavaScript Object Notation)</a>
is a lightweight data-interchange format.
is a lightweight data-interchange format.
Here is an example of JSON data:
\verbatim
@@ -23,14 +23,14 @@ Here is an example of JSON data:
{
// Default encoding for text
"encoding" : "UTF-8",
// Plug-ins loaded at start-up
"plug-ins" : [
"python",
"c++", // trailing comment
"ruby"
],
// Tab indent size
// (multi-line comment)
"indent" : { /*embedded comment*/ "length" : 3, "use_space": true }
@@ -67,7 +67,7 @@ const Json::Value plugins = root["plug-ins"];
// Iterate over the sequence elements.
for ( int index = 0; index < plugins.size(); ++index )
loadPlugIn( plugins[index].asString() );
// Try other datatypes. Some are auto-convertible to others.
foo::setIndentLength( root["indent"].get("length", 3).asInt() );
foo::setIndentUseSpace( root["indent"].get("use_space", true).asBool() );
@@ -124,7 +124,7 @@ reader->parse(start, stop, &value2, &errs);
\endcode
\section _pbuild Build instructions
The build instructions are located in the file
The build instructions are located in the file
<a HREF="https://github.com/open-source-parsers/jsoncpp/blob/master/README.md">README.md</a> in the top-directory of the project.
The latest version of the source is available in the project's GitHub repository:
@@ -132,7 +132,7 @@ The latest version of the source is available in the project's GitHub repository
jsoncpp</a>
\section _news What's New?
The description of latest changes can be found in
The description of latest changes can be found in
<a HREF="https://github.com/open-source-parsers/jsoncpp/wiki/NEWS">
the NEWS wiki
</a>.
@@ -152,7 +152,7 @@ The description of latest changes can be found in
\section _license License
See file <a href="https://github.com/open-source-parsers/jsoncpp/blob/master/LICENSE"><code>LICENSE</code></a> in the top-directory of the project.
Basically JsonCpp is licensed under MIT license, or public domain if desired
Basically JsonCpp is licensed under MIT license, or public domain if desired
and recognized in your jurisdiction.
\author Baptiste Lepilleur <blep@users.sourceforge.net> (originator)

example/CMakeLists.txt (new file)

@@ -0,0 +1,29 @@
#vim: et ts =4 sts = 4 sw = 4 tw = 0
cmake_minimum_required(VERSION 3.1)
set(EXAMPLES
readFromString
readFromStream
stringWrite
streamWrite
)
add_definitions(-D_GLIBCXX_USE_CXX11_ABI)
set_property(DIRECTORY PROPERTY COMPILE_OPTIONS ${EXTRA_CXX_FLAGS})
if("${CMAKE_CXX_COMPILER_ID}" STREQUAL "GNU")
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wall -Wextra ")
else()
add_definitions(
-D_SCL_SECURE_NO_WARNINGS
-D_CRT_SECURE_NO_WARNINGS
-D_WIN32_WINNT=0x601
-D_WINSOCK_DEPRECATED_NO_WARNINGS)
endif()
foreach (example ${EXAMPLES})
add_executable(${example} ${example}/${example}.cpp)
target_include_directories(${example} PUBLIC ${CMAKE_SOURCE_DIR}/include)
target_link_libraries(${example} jsoncpp_lib)
endforeach()
add_custom_target(examples ALL DEPENDS ${EXAMPLES})

example/README.md (new file)

@@ -0,0 +1,13 @@
***NOTE***
If you get linker errors about undefined references to symbols that involve types in the `std::__cxx11` namespace or the tag
`[abi:cxx11]` then it probably indicates that you are trying to link together object files that were compiled with different
values for the _GLIBCXX_USE_CXX11_ABI macro. This commonly happens when linking to a third-party library that was compiled with
an older version of GCC. If the third-party library cannot be rebuilt with the new ABI, then you need to recompile your code with
the old ABI, for example:
**g++ stringWrite.cpp -ljsoncpp -std=c++11 -D_GLIBCXX_USE_CXX11_ABI=0 -o stringWrite**
Not all uses of the new ABI will cause changes in symbol names, for example a class with a `std::string` member variable will
have the same mangled name whether compiled with the older or new ABI. In order to detect such problems, the new types and functions
are annotated with the abi_tag attribute, allowing the compiler to warn about potential ABI incompatibilities in code using them.
Those warnings can be enabled with the `-Wabi-tag` option.


@@ -0,0 +1,3 @@
{
1: "value"
}


@@ -0,0 +1,30 @@
#include "json/json.h"
#include <fstream>
#include <iostream>
/** \brief Parse from stream, collect comments and capture error info.
* Example Usage:
* $g++ readFromStream.cpp -ljsoncpp -std=c++11 -o readFromStream
* $./readFromStream
* // comment head
* {
* // comment before
* "key" : "value"
* }
* // comment after
* // comment tail
*/
int main(int argc, char* argv[]) {
Json::Value root;
std::ifstream ifs;
ifs.open(argv[1]);
Json::CharReaderBuilder builder;
builder["collectComments"] = true;
JSONCPP_STRING errs;
if (!parseFromStream(builder, ifs, &root, &errs)) {
std::cout << errs << std::endl;
return EXIT_FAILURE;
}
std::cout << root << std::endl;
return EXIT_SUCCESS;
}


@@ -0,0 +1,6 @@
// comment head
{
// comment before
"key" : "value"
// comment after
}// comment tail


@@ -0,0 +1,37 @@
#include "json/json.h"
#include <iostream>
/**
* \brief Parse a raw string into Value object using the CharReaderBuilder
* class, or the legacy Reader class.
* Example Usage:
* $g++ readFromString.cpp -ljsoncpp -std=c++11 -o readFromString
* $./readFromString
* colin
* 20
*/
int main() {
const std::string rawJson = R"({"Age": 20, "Name": "colin"})";
const int rawJsonLength = static_cast<int>(rawJson.length());
constexpr bool shouldUseOldWay = false;
JSONCPP_STRING err;
Json::Value root;
if (shouldUseOldWay) {
Json::Reader reader;
reader.parse(rawJson, root);
} else {
Json::CharReaderBuilder builder;
const std::unique_ptr<Json::CharReader> reader(builder.newCharReader());
if (!reader->parse(rawJson.c_str(), rawJson.c_str() + rawJsonLength, &root,
&err)) {
std::cout << "error" << std::endl;
return EXIT_FAILURE;
}
}
const std::string name = root["Name"].asString();
const int age = root["Age"].asInt();
std::cout << name << std::endl;
std::cout << age << std::endl;
return EXIT_SUCCESS;
}


@@ -0,0 +1,22 @@
#include "json/json.h"
#include <iostream>
/** \brief Write the Value object to a stream.
* Example Usage:
* $g++ streamWrite.cpp -ljsoncpp -std=c++11 -o streamWrite
* $./streamWrite
* {
* "Age" : 20,
* "Name" : "robin"
* }
*/
int main() {
Json::Value root;
Json::StreamWriterBuilder builder;
const std::unique_ptr<Json::StreamWriter> writer(builder.newStreamWriter());
root["Name"] = "robin";
root["Age"] = 20;
writer->write(root, &std::cout);
return EXIT_SUCCESS;
}


@@ -0,0 +1,33 @@
#include "json/json.h"
#include <iostream>
/** \brief Write a Value object to a string.
* Example Usage:
* $g++ stringWrite.cpp -ljsoncpp -std=c++11 -o stringWrite
* $./stringWrite
* {
* "action" : "run",
* "data" :
* {
* "number" : 1
* }
* }
*/
int main() {
Json::Value root;
Json::Value data;
constexpr bool shouldUseOldWay = false;
root["action"] = "run";
data["number"] = 1;
root["data"] = data;
if (shouldUseOldWay) {
Json::FastWriter writer;
const std::string json_file = writer.write(root);
std::cout << json_file << std::endl;
} else {
Json::StreamWriterBuilder builder;
const std::string json_file = Json::writeString(builder, root);
std::cout << json_file << std::endl;
}
return EXIT_SUCCESS;
}


@@ -1,6 +1,5 @@
file(GLOB INCLUDE_FILES "json/*.h")
install(FILES
${INCLUDE_FILES}
${PROJECT_BINARY_DIR}/include/json/version.h
DESTINATION ${CMAKE_INSTALL_INCLUDEDIR}/json)


@@ -74,8 +74,8 @@
#if defined(_MSC_VER) && _MSC_VER < 1900
// As recommended at
// https://stackoverflow.com/questions/2915672/snprintf-and-visual-studio-2010
extern JSON_API int
msvc_pre1900_c99_snprintf(char* outBuf, size_t size, const char* format, ...);
extern JSON_API int msvc_pre1900_c99_snprintf(char* outBuf, size_t size,
const char* format, ...);
#define jsoncpp_snprintf msvc_pre1900_c99_snprintf
#else
#define jsoncpp_snprintf std::snprintf
@@ -108,7 +108,7 @@ msvc_pre1900_c99_snprintf(char* outBuf, size_t size, const char* format, ...);
#if __has_extension(attribute_deprecated_with_message)
#define JSONCPP_DEPRECATED(message) __attribute__((deprecated(message)))
#endif
#elif defined __GNUC__ // not clang (gcc comes later since clang emulates gcc)
#elif defined(__GNUC__) // not clang (gcc comes later since clang emulates gcc)
#if (__GNUC__ > 4 || (__GNUC__ == 4 && __GNUC_MINOR__ >= 5))
#define JSONCPP_DEPRECATED(message) __attribute__((deprecated(message)))
#elif (__GNUC__ > 3 || (__GNUC__ == 3 && __GNUC_MINOR__ >= 1))
@@ -123,7 +123,7 @@ msvc_pre1900_c99_snprintf(char* outBuf, size_t size, const char* format, ...);
#define JSONCPP_DEPRECATED(message)
#endif // if !defined(JSONCPP_DEPRECATED)
#if __GNUC__ >= 6
#if defined(__clang__) || (defined(__GNUC__) && (__GNUC__ >= 6))
#define JSON_USE_INT64_DOUBLE_CONVERSION 1
#endif
@@ -156,16 +156,16 @@ typedef UInt64 LargestUInt;
#endif // if defined(JSON_NO_INT64)
template <typename T>
using Allocator = typename std::conditional<JSONCPP_USING_SECURE_MEMORY,
SecureAllocator<T>,
std::allocator<T>>::type;
using Allocator =
typename std::conditional<JSONCPP_USING_SECURE_MEMORY, SecureAllocator<T>,
std::allocator<T>>::type;
using String = std::basic_string<char, std::char_traits<char>, Allocator<char>>;
using IStringStream = std::basic_istringstream<String::value_type,
String::traits_type,
String::allocator_type>;
using OStringStream = std::basic_ostringstream<String::value_type,
String::traits_type,
String::allocator_type>;
using IStringStream =
std::basic_istringstream<String::value_type, String::traits_type,
String::allocator_type>;
using OStringStream =
std::basic_ostringstream<String::value_type, String::traits_type,
String::allocator_type>;
using IStream = std::istream;
using OStream = std::ostream;
} // namespace Json


@@ -25,7 +25,7 @@ class Reader;
class CharReader;
class CharReaderBuilder;
// features.h
// json_features.h
class Features;
// value.h


@@ -7,7 +7,7 @@
#define JSON_JSON_H_INCLUDED
#include "autolink.h"
#include "features.h"
#include "json_features.h"
#include "reader.h"
#include "value.h"
#include "writer.h"


@@ -7,7 +7,7 @@
#define CPPTL_JSON_READER_H_INCLUDED
#if !defined(JSON_IS_AMALGAMATION)
#include "features.h"
#include "json_features.h"
#include "value.h"
#endif // if !defined(JSON_IS_AMALGAMATION)
#include <deque>
@@ -28,11 +28,13 @@
namespace Json {
/** \brief Unserialize a <a HREF="http://www.json.org">JSON</a> document into a
*Value.
* Value.
*
* \deprecated Use CharReader and CharReaderBuilder.
*/
class JSON_API Reader {
class JSONCPP_DEPRECATED(
"Use CharReader and CharReaderBuilder instead.") JSON_API Reader {
public:
typedef char Char;
typedef const Char* Location;
@@ -41,7 +43,6 @@ public:
*
* The offsets give the [start, limit) range of bytes within the text. Note
* that this is bytes, not codepoints.
*
*/
struct StructuredError {
ptrdiff_t offset_start;
@@ -49,56 +50,50 @@ public:
String message;
};
/** \brief Constructs a Reader allowing all features
* for parsing.
/** \brief Constructs a Reader allowing all features for parsing.
*/
JSONCPP_DEPRECATED("Use CharReader and CharReaderBuilder instead")
Reader();
/** \brief Constructs a Reader allowing the specified feature set
* for parsing.
/** \brief Constructs a Reader allowing the specified feature set for parsing.
*/
JSONCPP_DEPRECATED("Use CharReader and CharReaderBuilder instead")
Reader(const Features& features);
/** \brief Read a Value from a <a HREF="http://www.json.org">JSON</a>
* document.
* \param document UTF-8 encoded string containing the document to read.
* \param root [out] Contains the root value of the document if it was
* successfully parsed.
* \param collectComments \c true to collect comment and allow writing them
* back during
* serialization, \c false to discard comments.
* This parameter is ignored if
* Features::allowComments_
* is \c false.
*
* \param document UTF-8 encoded string containing the document
* to read.
* \param[out] root Contains the root value of the document if it
* was successfully parsed.
* \param collectComments \c true to collect comment and allow writing
* them back during serialization, \c false to
* discard comments. This parameter is ignored
* if Features::allowComments_ is \c false.
* \return \c true if the document was successfully parsed, \c false if an
* error occurred.
*/
bool
parse(const std::string& document, Value& root, bool collectComments = true);
bool parse(const std::string& document, Value& root,
bool collectComments = true);
/** \brief Read a Value from a <a HREF="http://www.json.org">JSON</a>
document.
* \param beginDoc Pointer on the beginning of the UTF-8 encoded string of the
document to read.
* \param endDoc Pointer on the end of the UTF-8 encoded string of the
document to read.
* Must be >= beginDoc.
* \param root [out] Contains the root value of the document if it was
* successfully parsed.
* \param collectComments \c true to collect comment and allow writing them
back during
* serialization, \c false to discard comments.
* This parameter is ignored if
Features::allowComments_
* is \c false.
* document.
*
* \param beginDoc Pointer on the beginning of the UTF-8 encoded
* string of the document to read.
* \param endDoc Pointer on the end of the UTF-8 encoded string
* of the document to read. Must be >= beginDoc.
* \param[out] root Contains the root value of the document if it
* was successfully parsed.
* \param collectComments \c true to collect comment and allow writing
* them back during serialization, \c false to
* discard comments. This parameter is ignored
* if Features::allowComments_ is \c false.
* \return \c true if the document was successfully parsed, \c false if an
error occurred.
* error occurred.
*/
bool parse(const char* beginDoc,
const char* endDoc,
Value& root,
bool parse(const char* beginDoc, const char* endDoc, Value& root,
bool collectComments = true);
/// \brief Parse from input stream.
@@ -107,11 +102,10 @@ public:
/** \brief Returns a user friendly string that list errors in the parsed
* document.
* \return Formatted error message with the list of errors with their location
* in
* the parsed document. An empty string is returned if no error
* occurred
* during parsing.
*
* \return Formatted error message with the list of errors with their
* location in the parsed document. An empty string is returned if no error
* occurred during parsing.
* \deprecated Use getFormattedErrorMessages() instead (typo fix).
*/
JSONCPP_DEPRECATED("Use getFormattedErrorMessages() instead.")
@@ -119,43 +113,45 @@ public:
/** \brief Returns a user friendly string that list errors in the parsed
* document.
* \return Formatted error message with the list of errors with their location
* in
* the parsed document. An empty string is returned if no error
* occurred
* during parsing.
*
* \return Formatted error message with the list of errors with their
* location in the parsed document. An empty string is returned if no error
* occurred during parsing.
*/
String getFormattedErrorMessages() const;
/** \brief Returns a vector of structured erros encounted while parsing.
/** \brief Returns a vector of structured errors encountered while parsing.
*
* \return A (possibly empty) vector of StructuredError objects. Currently
* only one error can be returned, but the caller should tolerate
* multiple
* errors. This can occur if the parser recovers from a non-fatal
* parse error and then encounters additional errors.
* only one error can be returned, but the caller should tolerate multiple
* errors. This can occur if the parser recovers from a non-fatal parse
* error and then encounters additional errors.
*/
std::vector<StructuredError> getStructuredErrors() const;
/** \brief Add a semantic error message.
* \param value JSON Value location associated with the error
*
* \param value JSON Value location associated with the error
* \param message The error message.
* \return \c true if the error was successfully added, \c false if the
* Value offset exceeds the document size.
* \return \c true if the error was successfully added, \c false if the Value
* offset exceeds the document size.
*/
bool pushError(const Value& value, const String& message);
/** \brief Add a semantic error message with extra context.
* \param value JSON Value location associated with the error
*
* \param value JSON Value location associated with the error
* \param message The error message.
* \param extra Additional JSON Value location to contextualize the error
* \param extra Additional JSON Value location to contextualize the error
* \return \c true if the error was successfully added, \c false if either
* Value offset exceeds the document size.
*/
bool pushError(const Value& value, const String& message, const Value& extra);
/** \brief Return whether there are any errors.
* \return \c true if there are no errors to report \c false if
* errors have occurred.
*
* \return \c true if there are no errors to report \c false if errors have
* occurred.
*/
bool good() const;
@@ -195,7 +191,7 @@ private:
bool readToken(Token& token);
void skipSpaces();
bool match(Location pattern, int patternLength);
bool match(const Char* pattern, int patternLength);
bool readComment();
bool readCStyleComment();
bool readCppStyleComment();
@@ -210,24 +206,19 @@ private:
bool decodeString(Token& token, String& decoded);
bool decodeDouble(Token& token);
bool decodeDouble(Token& token, Value& decoded);
bool decodeUnicodeCodePoint(Token& token,
Location& current,
Location end,
bool decodeUnicodeCodePoint(Token& token, Location& current, Location end,
unsigned int& unicode);
bool decodeUnicodeEscapeSequence(Token& token,
Location& current,
Location end,
unsigned int& unicode);
bool decodeUnicodeEscapeSequence(Token& token, Location& current,
Location end, unsigned int& unicode);
bool addError(const String& message, Token& token, Location extra = nullptr);
bool recoverFromError(TokenType skipUntilToken);
bool addErrorAndRecover(const String& message,
Token& token,
bool addErrorAndRecover(const String& message, Token& token,
TokenType skipUntilToken);
void skipUntilSpace();
Value& currentValue();
Char getNextChar();
void
getLocationLineAndColumn(Location location, int& line, int& column) const;
void getLocationLineAndColumn(Location location, int& line,
int& column) const;
String getLocationLineAndColumn(Location location) const;
void addComment(Location begin, Location end, CommentPlacement placement);
void skipCommentTokens(Token& token);
@@ -255,26 +246,22 @@ class JSON_API CharReader {
public:
virtual ~CharReader() = default;
/** \brief Read a Value from a <a HREF="http://www.json.org">JSON</a>
document.
* The document must be a UTF-8 encoded string containing the document to
read.
* document. The document must be a UTF-8 encoded string containing the
* document to read.
*
* \param beginDoc Pointer on the beginning of the UTF-8 encoded string of the
document to read.
* \param endDoc Pointer on the end of the UTF-8 encoded string of the
document to read.
* Must be >= beginDoc.
* \param root [out] Contains the root value of the document if it was
* successfully parsed.
* \param errs [out] Formatted error messages (if not NULL)
* a user friendly string that lists errors in the parsed
* document.
* \param beginDoc Pointer on the beginning of the UTF-8 encoded string
* of the document to read.
* \param endDoc Pointer on the end of the UTF-8 encoded string of the
* document to read. Must be >= beginDoc.
* \param[out] root Contains the root value of the document if it was
* successfully parsed.
* \param[out] errs Formatted error messages (if not NULL) a user
* friendly string that lists errors in the parsed
* document.
* \return \c true if the document was successfully parsed, \c false if an
error occurred.
* error occurred.
*/
virtual bool parse(char const* beginDoc,
char const* endDoc,
Value* root,
virtual bool parse(char const* beginDoc, char const* endDoc, Value* root,
String* errs) = 0;
class JSON_API Factory {
@@ -288,59 +275,58 @@ public:
}; // CharReader
/** \brief Build a CharReader implementation.
Usage:
\code
using namespace Json;
CharReaderBuilder builder;
builder["collectComments"] = false;
Value value;
String errs;
bool ok = parseFromStream(builder, std::cin, &value, &errs);
\endcode
*/
*
* Usage:
* \code
* using namespace Json;
* CharReaderBuilder builder;
* builder["collectComments"] = false;
* Value value;
* String errs;
* bool ok = parseFromStream(builder, std::cin, &value, &errs);
* \endcode
*/
class JSON_API CharReaderBuilder : public CharReader::Factory {
public:
// Note: We use a Json::Value so that we can add data-members to this class
// without a major version bump.
/** Configuration of this builder.
These are case-sensitive.
Available settings (case-sensitive):
- `"collectComments": false or true`
- true to collect comment and allow writing them
back during serialization, false to discard comments.
This parameter is ignored if allowComments is false.
- `"allowComments": false or true`
- true if comments are allowed.
- `"strictRoot": false or true`
- true if root must be either an array or an object value
- `"allowDroppedNullPlaceholders": false or true`
- true if dropped null placeholders are allowed. (See
StreamWriterBuilder.)
- `"allowNumericKeys": false or true`
- true if numeric object keys are allowed.
- `"allowSingleQuotes": false or true`
- true if '' are allowed for strings (both keys and values)
- `"stackLimit": integer`
- Exceeding stackLimit (recursive depth of `readValue()`) will
cause an exception.
- This is a security issue (seg-faults caused by deeply nested JSON),
so the default is low.
- `"failIfExtra": false or true`
- If true, `parse()` returns false when extra non-whitespace trails
the JSON value in the input string.
- `"rejectDupKeys": false or true`
- If true, `parse()` returns false when a key is duplicated within an
object.
- `"allowSpecialFloats": false or true`
- If true, special float values (NaNs and infinities) are allowed
and their values are lossfree restorable.
You can examine 'settings_` yourself
to see the defaults. You can also write and read them just like any
JSON Value.
\sa setDefaults()
*/
* These are case-sensitive.
* Available settings (case-sensitive):
* - `"collectComments": false or true`
* - true to collect comment and allow writing them back during
* serialization, false to discard comments. This parameter is ignored
* if allowComments is false.
* - `"allowComments": false or true`
* - true if comments are allowed.
* - `"strictRoot": false or true`
* - true if root must be either an array or an object value
* - `"allowDroppedNullPlaceholders": false or true`
* - true if dropped null placeholders are allowed. (See
* StreamWriterBuilder.)
* - `"allowNumericKeys": false or true`
* - true if numeric object keys are allowed.
* - `"allowSingleQuotes": false or true`
* - true if '' are allowed for strings (both keys and values)
* - `"stackLimit": integer`
* - Exceeding stackLimit (recursive depth of `readValue()`) will cause an
* exception.
* - This is a security issue (seg-faults caused by deeply nested JSON), so
* the default is low.
* - `"failIfExtra": false or true`
* - If true, `parse()` returns false when extra non-whitespace trails the
* JSON value in the input string.
* - `"rejectDupKeys": false or true`
* - If true, `parse()` returns false when a key is duplicated within an
* object.
* - `"allowSpecialFloats": false or true`
* - If true, special float values (NaNs and infinities) are allowed and
* their values are lossfree restorable.
*
 * You can examine `settings_` yourself to see the defaults. You can also
* write and read them just like any JSON Value.
* \sa setDefaults()
*/
Json::Value settings_;
CharReaderBuilder();
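A short sketch of overriding a few of the settings listed above before parsing; the helper name parseStrict and the particular limits are illustrative only:

#include <json/json.h>
#include <memory>
#include <string>

bool parseStrict(const std::string& doc, Json::Value* root, Json::String* errs) {
  Json::CharReaderBuilder builder;
  builder["stackLimit"] = 10;      // keep recursion shallow (illustrative value)
  builder["failIfExtra"] = true;   // reject trailing non-whitespace
  builder["rejectDupKeys"] = true; // duplicate keys become a parse failure
  std::unique_ptr<Json::CharReader> reader(builder.newCharReader());
  return reader->parse(doc.data(), doc.data() + doc.size(), root, errs);
}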
@@ -375,35 +361,33 @@ public:
* Someday we might have a real StreamReader, but for now this
* is convenient.
*/
bool JSON_API parseFromStream(CharReader::Factory const&,
IStream&,
Value* root,
bool JSON_API parseFromStream(CharReader::Factory const&, IStream&, Value* root,
String* errs);
/** \brief Read from 'sin' into 'root'.
Always keep comments from the input JSON.
This can be used to read a file into a particular sub-object.
For example:
\code
Json::Value root;
cin >> root["dir"]["file"];
cout << root;
\endcode
Result:
\verbatim
{
"dir": {
"file": {
// The input stream JSON would be nested here.
}
}
}
\endverbatim
\throw std::exception on parse error.
\see Json::operator<<()
*/
*
* Always keep comments from the input JSON.
*
* This can be used to read a file into a particular sub-object.
* For example:
* \code
* Json::Value root;
* cin >> root["dir"]["file"];
* cout << root;
* \endcode
* Result:
* \verbatim
* {
* "dir": {
* "file": {
* // The input stream JSON would be nested here.
* }
* }
* }
* \endverbatim
* \throw std::exception on parse error.
* \see Json::operator<<()
*/
JSON_API IStream& operator>>(IStream&, Value&);
} // namespace Json


@@ -9,6 +9,18 @@
#if !defined(JSON_IS_AMALGAMATION)
#include "forwards.h"
#endif // if !defined(JSON_IS_AMALGAMATION)
// Conditional NORETURN attribute on the throw functions would:
// a) suppress false positives from static code analysis
// b) possibly improve optimization opportunities.
#if !defined(JSONCPP_NORETURN)
#if defined(_MSC_VER) && _MSC_VER == 1800
#define JSONCPP_NORETURN __declspec(noreturn)
#else
#define JSONCPP_NORETURN [[noreturn]]
#endif
#endif
#include <array>
#include <exception>
#include <memory>
@@ -76,9 +88,9 @@ public:
#endif
/// used internally
[[noreturn]] void throwRuntimeError(String const& msg);
JSONCPP_NORETURN void throwRuntimeError(String const& msg);
/// used internally
[[noreturn]] void throwLogicError(String const& msg);
JSONCPP_NORETURN void throwLogicError(String const& msg);
/** \brief Type of the value held by a Value object.
*/
@@ -203,31 +215,34 @@ public:
static Value const& nullSingleton();
/// Minimum signed integer value that can be stored in a Json::Value.
static const LargestInt minLargestInt;
static constexpr LargestInt minLargestInt =
LargestInt(~(LargestUInt(-1) / 2));
/// Maximum signed integer value that can be stored in a Json::Value.
static const LargestInt maxLargestInt;
static constexpr LargestInt maxLargestInt = LargestInt(LargestUInt(-1) / 2);
/// Maximum unsigned integer value that can be stored in a Json::Value.
static const LargestUInt maxLargestUInt;
static constexpr LargestUInt maxLargestUInt = LargestUInt(-1);
/// Minimum signed int value that can be stored in a Json::Value.
static const Int minInt;
static constexpr Int minInt = Int(~(UInt(-1) / 2));
/// Maximum signed int value that can be stored in a Json::Value.
static const Int maxInt;
static constexpr Int maxInt = Int(UInt(-1) / 2);
/// Maximum unsigned int value that can be stored in a Json::Value.
static const UInt maxUInt;
static constexpr UInt maxUInt = UInt(-1);
#if defined(JSON_HAS_INT64)
/// Minimum signed 64 bits int value that can be stored in a Json::Value.
static const Int64 minInt64;
static constexpr Int64 minInt64 = Int64(~(UInt64(-1) / 2));
/// Maximum signed 64 bits int value that can be stored in a Json::Value.
static const Int64 maxInt64;
static constexpr Int64 maxInt64 = Int64(UInt64(-1) / 2);
/// Maximum unsigned 64 bits int value that can be stored in a Json::Value.
static const UInt64 maxUInt64;
static constexpr UInt64 maxUInt64 = UInt64(-1);
#endif // defined(JSON_HAS_INT64)
/// Default precision for real value for string representation.
static const UInt defaultRealPrecision;
static constexpr UInt defaultRealPrecision = 17;
// The constant is hard-coded because some compiler have trouble
// converting Value::maxUInt64 to a double correctly (AIX/xlC).
// Assumes that UInt64 is a 64 bits integer.
static constexpr double maxUInt64AsDouble = 18446744073709551615.0;
// Workaround for bug in the NVIDIAs CUDA 9.1 nvcc compiler
// when using gcc and clang backend compilers. CZString
// cannot be defined as private. See issue #486
@@ -280,21 +295,22 @@ public:
#endif // ifndef JSONCPP_DOC_EXCLUDE_IMPLEMENTATION
public:
/** \brief Create a default Value of the given type.
This is a very useful constructor.
To create an empty array, pass arrayValue.
To create an empty object, pass objectValue.
Another Value can then be set to this one by assignment.
This is useful since clear() and resize() will not alter types.
Examples:
\code
Json::Value null_value; // null
Json::Value arr_value(Json::arrayValue); // []
Json::Value obj_value(Json::objectValue); // {}
\endcode
*/
/**
* \brief Create a default Value of the given type.
*
* This is a very useful constructor.
* To create an empty array, pass arrayValue.
* To create an empty object, pass objectValue.
* Another Value can then be set to this one by assignment.
* This is useful since clear() and resize() will not alter types.
*
* Examples:
* \code
* Json::Value null_value; // null
* Json::Value arr_value(Json::arrayValue); // []
* Json::Value obj_value(Json::objectValue); // {}
* \endcode
*/
Value(ValueType type = nullValue);
Value(Int value);
Value(UInt value);
@@ -305,24 +321,25 @@ Json::Value obj_value(Json::objectValue); // {}
Value(double value);
Value(const char* value); ///< Copy til first 0. (NULL causes to seg-fault.)
Value(const char* begin, const char* end); ///< Copy all, incl zeroes.
/** \brief Constructs a value from a static string.
/**
* \brief Constructs a value from a static string.
*
* Like other value string constructor but do not duplicate the string for
* internal storage. The given string must remain alive after the call to this
* constructor.
* internal storage. The given string must remain alive after the call to
* this constructor.
*
* \note This works only for null-terminated strings. (We cannot change the
* size of this class, so we have nowhere to store the length,
* which might be computed later for various operations.)
* size of this class, so we have nowhere to store the length, which might be
* computed later for various operations.)
*
* Example of usage:
* \code
* static StaticString foo("some text");
* Json::Value aValue(foo);
* \endcode
* \code
* static StaticString foo("some text");
* Json::Value aValue(foo);
* \endcode
*/
Value(const StaticString& value);
Value(const String& value); ///< Copy data() til size(). Embedded
///< zeroes too.
Value(const String& value);
#ifdef JSON_USE_CPPTL
Value(const CppTL::ConstString& value);
#endif
@@ -395,6 +412,10 @@ Json::Value obj_value(Json::objectValue); // {}
bool isArray() const;
bool isObject() const;
/// The `as<T>` and `is<T>` member function templates and specializations.
template <typename T> T as() const = delete;
template <typename T> bool is() const = delete;
bool isConvertibleTo(ValueType other) const;
/// Number of values in array or object
@@ -419,35 +440,26 @@ Json::Value obj_value(Json::objectValue); // {}
/// \post type() is arrayValue
void resize(ArrayIndex newSize);
/// Access an array element (zero based index ).
/// If the array contains less than index element, then null value are
/// inserted
/// in the array so that its size is index+1.
//@{
/// Access an array element (zero based index). If the array contains less
/// than index element, then null value are inserted in the array so that
/// its size is index+1.
/// (You may need to say 'value[0u]' to get your compiler to distinguish
/// this from the operator[] which takes a string.)
/// this from the operator[] which takes a string.)
Value& operator[](ArrayIndex index);
/// Access an array element (zero based index ).
/// If the array contains less than index element, then null value are
/// inserted
/// in the array so that its size is index+1.
/// (You may need to say 'value[0u]' to get your compiler to distinguish
/// this from the operator[] which takes a string.)
Value& operator[](int index);
//@}
/// Access an array element (zero based index )
//@{
/// Access an array element (zero based index).
/// (You may need to say 'value[0u]' to get your compiler to distinguish
/// this from the operator[] which takes a string.)
/// this from the operator[] which takes a string.)
const Value& operator[](ArrayIndex index) const;
/// Access an array element (zero based index )
/// (You may need to say 'value[0u]' to get your compiler to distinguish
/// this from the operator[] which takes a string.)
const Value& operator[](int index) const;
//@}
/// If the array contains at least index+1 elements, returns the element
/// value,
/// otherwise returns defaultValue.
/// value, otherwise returns defaultValue.
Value get(ArrayIndex index, const Value& defaultValue) const;
/// Return true if index < size().
bool isValidIndex(ArrayIndex index) const;
@@ -456,10 +468,12 @@ Json::Value obj_value(Json::objectValue); // {}
/// Equivalent to jsonvalue[jsonvalue.size()] = value;
Value& append(const Value& value);
Value& append(Value&& value);
/// \brief Insert value in array at specific index
bool insert(ArrayIndex index, Value newValue);
/// Access an object value by name, create a null member if it does not exist.
/// \note Because of our implementation, keys are limited to 2^30 -1 chars.
/// Exceeding that will cause an exception.
/// Exceeding that will cause an exception.
Value& operator[](const char* key);
/// Access an object value by name, returns null if there is no member with
/// that name.
@@ -472,17 +486,16 @@ Json::Value obj_value(Json::objectValue); // {}
/// \param key may contain embedded nulls.
const Value& operator[](const String& key) const;
/** \brief Access an object value by name, create a null member if it does not
exist.
* exist.
*
* If the object has no entry for that name, then the member name used to
store
* the new entry is not duplicated.
* store the new entry is not duplicated.
* Example of use:
* \code
* Json::Value object;
* static const StaticString code("code");
* object[code] = 1234;
* \endcode
* \code
* Json::Value object;
* static const StaticString code("code");
* object[code] = 1234;
* \endcode
*/
Value& operator[](const StaticString& key);
#ifdef JSON_USE_CPPTL
@@ -498,8 +511,8 @@ Json::Value obj_value(Json::objectValue); // {}
/// Return the member named key if it exist, defaultValue otherwise.
/// \note deep copy
/// \note key may contain embedded nulls.
Value
get(const char* begin, const char* end, const Value& defaultValue) const;
Value get(const char* begin, const char* end,
const Value& defaultValue) const;
/// Return the member named key if it exist, defaultValue otherwise.
/// \note deep copy
/// \param key may contain embedded nulls.
@@ -530,20 +543,20 @@ Json::Value obj_value(Json::objectValue); // {}
/// but 'key' is null-terminated.
bool removeMember(const char* key, Value* removed);
/** \brief Remove the named map member.
Update 'removed' iff removed.
\param key may contain embedded nulls.
\return true iff removed (no exceptions)
*/
*
* Update 'removed' iff removed.
* \param key may contain embedded nulls.
* \return true iff removed (no exceptions)
*/
bool removeMember(String const& key, Value* removed);
/// Same as removeMember(String const& key, Value* removed)
bool removeMember(const char* begin, const char* end, Value* removed);
/** \brief Remove the indexed array element.
O(n) expensive operations.
Update 'removed' iff removed.
\return true if removed (no exceptions)
*/
*
* O(n) expensive operations.
* Update 'removed' iff removed.
* \return true if removed (no exceptions)
*/
bool removeIndex(ArrayIndex index, Value* removed);
/// Return true if the object has a member named key.
@@ -650,7 +663,7 @@ private:
Comments& operator=(Comments&& that);
bool has(CommentPlacement slot) const;
String get(CommentPlacement slot) const;
void set(CommentPlacement slot, String s);
void set(CommentPlacement slot, String comment);
private:
using Array = std::array<String, numberOfCommentPlacement>;
@@ -664,6 +677,41 @@ private:
ptrdiff_t limit_;
};
template <> inline bool Value::as<bool>() const { return asBool(); }
template <> inline bool Value::is<bool>() const { return isBool(); }
template <> inline Int Value::as<Int>() const { return asInt(); }
template <> inline bool Value::is<Int>() const { return isInt(); }
template <> inline UInt Value::as<UInt>() const { return asUInt(); }
template <> inline bool Value::is<UInt>() const { return isUInt(); }
#if defined(JSON_HAS_INT64)
template <> inline Int64 Value::as<Int64>() const { return asInt64(); }
template <> inline bool Value::is<Int64>() const { return isInt64(); }
template <> inline UInt64 Value::as<UInt64>() const { return asUInt64(); }
template <> inline bool Value::is<UInt64>() const { return isUInt64(); }
#endif
template <> inline double Value::as<double>() const { return asDouble(); }
template <> inline bool Value::is<double>() const { return isDouble(); }
template <> inline String Value::as<String>() const { return asString(); }
template <> inline bool Value::is<String>() const { return isString(); }
/// These `as` specializations are type conversions, and do not have a
/// corresponding `is`.
template <> inline float Value::as<float>() const { return asFloat(); }
template <> inline const char* Value::as<const char*>() const {
return asCString();
}
#ifdef JSON_USE_CPPTL
template <> inline CppTL::ConstString Value::as<CppTL::ConstString>() const {
return asConstString();
}
#endif
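A quick sketch of these specializations at a call site (assumes <json/json.h> is included; the values are illustrative):

Json::Value v = 42;
bool isInt = v.is<Json::Int>();  // same as v.isInt()
Json::Int n = v.as<Json::Int>(); // same as v.asInt()
float f = v.as<float>();         // conversion only; there is no is<float>()
Json::String s = Json::Value("text").as<Json::String>();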
/** \brief Experimental and untested: represents an element of the "path" to
* access a node.
*/
@@ -674,7 +722,7 @@ public:
PathArgument();
PathArgument(ArrayIndex index);
PathArgument(const char* key);
PathArgument(const String& key);
PathArgument(String key);
private:
enum Kind { kindNone = 0, kindIndex, kindKey };
@@ -692,12 +740,11 @@ private:
* - ".name1.name2.name3"
* - ".[0][1][2].name1[3]"
* - ".%" => member name is provided as parameter
* - ".[%]" => index is provied as parameter
* - ".[%]" => index is provided as parameter
*/
class JSON_API Path {
public:
Path(const String& path,
const PathArgument& a1 = PathArgument(),
Path(const String& path, const PathArgument& a1 = PathArgument(),
const PathArgument& a2 = PathArgument(),
const PathArgument& a3 = PathArgument(),
const PathArgument& a4 = PathArgument(),
@@ -714,10 +761,8 @@ private:
typedef std::vector<PathArgument> Args;
void makePath(const String& path, const InArgs& in);
void addPathInArg(const String& path,
const InArgs& in,
InArgs::const_iterator& itInArg,
PathArgument::Kind kind);
void addPathInArg(const String& path, const InArgs& in,
InArgs::const_iterator& itInArg, PathArgument::Kind kind);
static void invalidPath(const String& path, int location);
Args args_;
@@ -766,7 +811,14 @@ public:
char const* memberName(char const** end) const;
protected:
Value& deref() const;
/*! Internal utility functions to assist with implementing
* other iterator functions. The const and non-const versions
* of the "deref" protected methods expose the protected
* current_ member variable in a way that can often be
* optimized away by the compiler.
*/
const Value& deref() const;
Value& deref();
void increment();
@@ -889,9 +941,13 @@ public:
return *this;
}
reference operator*() const { return deref(); }
pointer operator->() const { return &deref(); }
/*! The return value of non-const iterators can be
* changed, so the these functions are not const
* because the returned references/pointers can be used
* to change state of the base class.
*/
reference operator*() { return deref(); }
pointer operator->() { return &deref(); }
};
inline void swap(Value& a, Value& b) { a.swap(b); }
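A small sketch of what the non-const deref()/operator*() path enables: writing through a ValueIterator mutates the underlying container (the array contents are illustrative):

Json::Value arr(Json::arrayValue);
arr.append(1);
arr.append(2);
for (Json::ValueIterator it = arr.begin(); it != arr.end(); ++it)
  *it = it->asInt() * 10; // non-const operator*()/operator->() expose writable references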

include/json/version.h Normal file

@@ -0,0 +1,28 @@
#ifndef JSON_VERSION_H_INCLUDED
#define JSON_VERSION_H_INCLUDED
// Note: version must be updated in three places when doing a release. This
// annoying process ensures that amalgamate, CMake, and meson all report the
// correct version.
// 1. /meson.build
// 2. /include/json/version.h
// 3. /CMakeLists.txt
// IMPORTANT: also update the SOVERSION!!
#define JSONCPP_VERSION_STRING "1.9.2"
#define JSONCPP_VERSION_MAJOR 1
#define JSONCPP_VERSION_MINOR 9
#define JSONCPP_VERSION_PATCH 2
#define JSONCPP_VERSION_QUALIFIER
#define JSONCPP_VERSION_HEXA \
((JSONCPP_VERSION_MAJOR << 24) | (JSONCPP_VERSION_MINOR << 16) | \
(JSONCPP_VERSION_PATCH << 8))
#ifdef JSONCPP_USING_SECURE_MEMORY
#undef JSONCPP_USING_SECURE_MEMORY
#endif
#define JSONCPP_USING_SECURE_MEMORY 0
// If non-zero, the library zeroes any memory that it has allocated before
// it frees its memory.
#endif // JSON_VERSION_H_INCLUDED
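As a sanity check on the packing above, 1.9.2 encodes as follows (a sketch; the static_assert is not part of the header):

// (1 << 24) | (9 << 16) | (2 << 8) == 0x01090200
static_assert(JSONCPP_VERSION_HEXA == 0x01090200,
              "JSONCPP 1.9.2 packs into 0x01090200");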


@@ -27,18 +27,17 @@ namespace Json {
class Value;
/**
Usage:
\code
using namespace Json;
void writeToStdout(StreamWriter::Factory const& factory, Value const& value) {
std::unique_ptr<StreamWriter> const writer(
factory.newStreamWriter());
writer->write(value, &std::cout);
std::cout << std::endl; // add lf and flush
}
\endcode
*/
*
* Usage:
* \code
* using namespace Json;
* void writeToStdout(StreamWriter::Factory const& factory, Value const& value)
* { std::unique_ptr<StreamWriter> const writer( factory.newStreamWriter());
* writer->write(value, &std::cout);
* std::cout << std::endl; // add lf and flush
* }
* \endcode
*/
class JSON_API StreamWriter {
protected:
OStream* sout_; // not owned; will not delete
@@ -46,10 +45,11 @@ public:
StreamWriter();
virtual ~StreamWriter();
/** Write Value into document as configured in sub-class.
Do not take ownership of sout, but maintain a reference during function.
\pre sout != NULL
\return zero on success (For now, we always return zero, so check the
stream instead.) \throw std::exception possibly, depending on configuration
* Do not take ownership of sout, but maintain a reference during function.
* \pre sout != NULL
* \return zero on success (For now, we always return zero, so check the
* stream instead.) \throw std::exception possibly, depending on
* configuration
*/
virtual int write(Value const& root, OStream* sout) = 0;
@@ -73,49 +73,49 @@ String JSON_API writeString(StreamWriter::Factory const& factory,
/** \brief Build a StreamWriter implementation.
Usage:
\code
using namespace Json;
Value value = ...;
StreamWriterBuilder builder;
builder["commentStyle"] = "None";
builder["indentation"] = " "; // or whatever you like
std::unique_ptr<Json::StreamWriter> writer(
builder.newStreamWriter());
writer->write(value, &std::cout);
std::cout << std::endl; // add lf and flush
\endcode
* Usage:
* \code
* using namespace Json;
* Value value = ...;
* StreamWriterBuilder builder;
* builder["commentStyle"] = "None";
* builder["indentation"] = " "; // or whatever you like
* std::unique_ptr<Json::StreamWriter> writer(
* builder.newStreamWriter());
* writer->write(value, &std::cout);
* std::cout << std::endl; // add lf and flush
* \endcode
*/
class JSON_API StreamWriterBuilder : public StreamWriter::Factory {
public:
// Note: We use a Json::Value so that we can add data-members to this class
// without a major version bump.
/** Configuration of this builder.
Available settings (case-sensitive):
- "commentStyle": "None" or "All"
- "indentation": "<anything>".
- Setting this to an empty string also omits newline characters.
- "enableYAMLCompatibility": false or true
- slightly change the whitespace around colons
- "dropNullPlaceholders": false or true
- Drop the "null" string from the writer's output for nullValues.
Strictly speaking, this is not valid JSON. But when the output is being
fed to a browser's JavaScript, it makes for smaller output and the
browser can handle the output just fine.
- "useSpecialFloats": false or true
- If true, outputs non-finite floating point values in the following way:
NaN values as "NaN", positive infinity as "Infinity", and negative
infinity as "-Infinity".
- "precision": int
- Number of precision digits for formatting of real values.
- "precisionType": "significant"(default) or "decimal"
- Type of precision for formatting of real values.
* Available settings (case-sensitive):
* - "commentStyle": "None" or "All"
* - "indentation": "<anything>".
* - Setting this to an empty string also omits newline characters.
* - "enableYAMLCompatibility": false or true
* - slightly change the whitespace around colons
* - "dropNullPlaceholders": false or true
* - Drop the "null" string from the writer's output for nullValues.
* Strictly speaking, this is not valid JSON. But when the output is being
* fed to a browser's JavaScript, it makes for smaller output and the
* browser can handle the output just fine.
* - "useSpecialFloats": false or true
* - If true, outputs non-finite floating point values in the following way:
* NaN values as "NaN", positive infinity as "Infinity", and negative
* infinity as "-Infinity".
* - "precision": int
* - Number of precision digits for formatting of real values.
* - "precisionType": "significant"(default) or "decimal"
* - Type of precision for formatting of real values.
You can examine 'settings_` yourself
to see the defaults. You can also write and read them just like any
JSON Value.
\sa setDefaults()
*/
 * You can examine `settings_` yourself
* to see the defaults. You can also write and read them just like any
* JSON Value.
* \sa setDefaults()
*/
Json::Value settings_;
StreamWriterBuilder();
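A short sketch exercising the precision settings listed above (the chosen values are illustrative):

Json::StreamWriterBuilder builder;
builder["precision"] = 5;
builder["precisionType"] = "decimal"; // digits after the decimal point
builder["indentation"] = "";          // single-line output
Json::String out = Json::writeString(builder, Json::Value(3.14159265));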
@@ -346,10 +346,9 @@ String JSON_API valueToString(UInt value);
#endif // if defined(JSON_HAS_INT64)
String JSON_API valueToString(LargestInt value);
String JSON_API valueToString(LargestUInt value);
String JSON_API
valueToString(double value,
unsigned int precision = Value::defaultRealPrecision,
PrecisionType precisionType = PrecisionType::significantDigits);
String JSON_API valueToString(
double value, unsigned int precision = Value::defaultRealPrecision,
PrecisionType precisionType = PrecisionType::significantDigits);
String JSON_API valueToString(bool value);
String JSON_API valueToQuotedString(const char* value);
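The same precision parameter can also be passed directly; for example (illustrative values, using the significant-digits default shown above):

Json::String s1 = Json::valueToString(2.0 / 3.0);    // default: Value::defaultRealPrecision (17) significant digits
Json::String s2 = Json::valueToString(2.0 / 3.0, 5); // 5 significant digits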


@@ -1,390 +0,0 @@
# Copyright 2010 Baptiste Lepilleur and The JsonCpp Authors
# Distributed under MIT license, or public domain if desired and
# recognized in your jurisdiction.
# See file LICENSE for detail or copy at http://jsoncpp.sourceforge.net/LICENSE
"""Tag the sandbox for release, make source and doc tarballs.
Requires Python 2.6
Example of invocation (use to test the script):
python makerelease.py --platform=msvc6,msvc71,msvc80,msvc90,mingw -ublep 0.6.0 0.7.0-dev
When testing this script:
python makerelease.py --force --retag --platform=msvc6,msvc71,msvc80,mingw -ublep test-0.6.0 test-0.6.1-dev
Example of invocation when doing a release:
python makerelease.py 0.5.0 0.6.0-dev
Note: This was for Subversion. Now that we are in GitHub, we do not
need to build versioned tarballs anymore, so makerelease.py is defunct.
"""
from __future__ import print_function
import os.path
import subprocess
import sys
import doxybuild
import subprocess
import xml.etree.ElementTree as ElementTree
import shutil
import urllib2
import tempfile
import os
import time
from devtools import antglob, fixeol, tarball
import amalgamate
SVN_ROOT = 'https://jsoncpp.svn.sourceforge.net/svnroot/jsoncpp/'
SVN_TAG_ROOT = SVN_ROOT + 'tags/jsoncpp'
SCONS_LOCAL_URL = 'http://sourceforge.net/projects/scons/files/scons-local/1.2.0/scons-local-1.2.0.tar.gz/download'
SOURCEFORGE_PROJECT = 'jsoncpp'
def set_version(version):
with open('version','wb') as f:
f.write(version.strip())
def rmdir_if_exist(dir_path):
if os.path.isdir(dir_path):
shutil.rmtree(dir_path)
class SVNError(Exception):
pass
def svn_command(command, *args):
cmd = ['svn', '--non-interactive', command] + list(args)
print('Running:', ' '.join(cmd))
process = subprocess.Popen(cmd,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT)
stdout = process.communicate()[0]
if process.returncode:
error = SVNError('SVN command failed:\n' + stdout)
error.returncode = process.returncode
raise error
return stdout
def check_no_pending_commit():
"""Checks that there is no pending commit in the sandbox."""
stdout = svn_command('status', '--xml')
etree = ElementTree.fromstring(stdout)
msg = []
for entry in etree.getiterator('entry'):
path = entry.get('path')
status = entry.find('wc-status').get('item')
if status != 'unversioned' and path != 'version':
msg.append('File "%s" has pending change (status="%s")' % (path, status))
if msg:
msg.insert(0, 'Pending change to commit found in sandbox. Commit them first!')
return '\n'.join(msg)
def svn_join_url(base_url, suffix):
if not base_url.endswith('/'):
base_url += '/'
if suffix.startswith('/'):
suffix = suffix[1:]
return base_url + suffix
def svn_check_if_tag_exist(tag_url):
"""Checks if a tag exist.
Returns: True if the tag exist, False otherwise.
"""
try:
list_stdout = svn_command('list', tag_url)
except SVNError as e:
if e.returncode != 1 or not str(e).find('tag_url'):
raise e
# otherwise ignore error, meaning tag does not exist
return False
return True
def svn_commit(message):
"""Commit the sandbox, providing the specified comment.
"""
svn_command('ci', '-m', message)
def svn_tag_sandbox(tag_url, message):
"""Makes a tag based on the sandbox revisions.
"""
svn_command('copy', '-m', message, '.', tag_url)
def svn_remove_tag(tag_url, message):
"""Removes an existing tag.
"""
svn_command('delete', '-m', message, tag_url)
def svn_export(tag_url, export_dir):
"""Exports the tag_url revision to export_dir.
Target directory, including its parent is created if it does not exist.
If the directory export_dir exist, it is deleted before export proceed.
"""
rmdir_if_exist(export_dir)
svn_command('export', tag_url, export_dir)
def fix_sources_eol(dist_dir):
"""Set file EOL for tarball distribution.
"""
print('Preparing exported source file EOL for distribution...')
prune_dirs = antglob.prune_dirs + 'scons-local* ./build* ./libs ./dist'
win_sources = antglob.glob(dist_dir,
includes = '**/*.sln **/*.vcproj',
prune_dirs = prune_dirs)
unix_sources = antglob.glob(dist_dir,
includes = '''**/*.h **/*.cpp **/*.inl **/*.txt **/*.dox **/*.py **/*.html **/*.in
sconscript *.json *.expected AUTHORS LICENSE''',
excludes = antglob.default_excludes + 'scons.py sconsign.py scons-*',
prune_dirs = prune_dirs)
for path in win_sources:
fixeol.fix_source_eol(path, is_dry_run = False, verbose = True, eol = '\r\n')
for path in unix_sources:
fixeol.fix_source_eol(path, is_dry_run = False, verbose = True, eol = '\n')
def download(url, target_path):
"""Download file represented by url to target_path.
"""
f = urllib2.urlopen(url)
try:
data = f.read()
finally:
f.close()
fout = open(target_path, 'wb')
try:
fout.write(data)
finally:
fout.close()
def check_compile(distcheck_top_dir, platform):
cmd = [sys.executable, 'scons.py', 'platform=%s' % platform, 'check']
print('Running:', ' '.join(cmd))
log_path = os.path.join(distcheck_top_dir, 'build-%s.log' % platform)
flog = open(log_path, 'wb')
try:
process = subprocess.Popen(cmd,
stdout=flog,
stderr=subprocess.STDOUT,
cwd=distcheck_top_dir)
stdout = process.communicate()[0]
status = (process.returncode == 0)
finally:
flog.close()
return (status, log_path)
def write_tempfile(content, **kwargs):
fd, path = tempfile.mkstemp(**kwargs)
f = os.fdopen(fd, 'wt')
try:
f.write(content)
finally:
f.close()
return path
class SFTPError(Exception):
pass
def run_sftp_batch(userhost, sftp, batch, retry=0):
path = write_tempfile(batch, suffix='.sftp', text=True)
# psftp -agent -C blep,jsoncpp@web.sourceforge.net -batch -b batch.sftp -bc
cmd = [sftp, '-agent', '-C', '-batch', '-b', path, '-bc', userhost]
error = None
for retry_index in range(0, max(1,retry)):
heading = retry_index == 0 and 'Running:' or 'Retrying:'
print(heading, ' '.join(cmd))
process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
stdout = process.communicate()[0]
if process.returncode != 0:
error = SFTPError('SFTP batch failed:\n' + stdout)
else:
break
if error:
raise error
return stdout
def sourceforge_web_synchro(sourceforge_project, doc_dir,
user=None, sftp='sftp'):
"""Notes: does not synchronize sub-directory of doc-dir.
"""
userhost = '%s,%s@web.sourceforge.net' % (user, sourceforge_project)
stdout = run_sftp_batch(userhost, sftp, """
cd htdocs
dir
exit
""")
existing_paths = set()
collect = 0
for line in stdout.split('\n'):
line = line.strip()
if not collect and line.endswith('> dir'):
collect = True
elif collect and line.endswith('> exit'):
break
elif collect == 1:
collect = 2
elif collect == 2:
path = line.strip().split()[-1:]
if path and path[0] not in ('.', '..'):
existing_paths.add(path[0])
upload_paths = set([os.path.basename(p) for p in antglob.glob(doc_dir)])
paths_to_remove = existing_paths - upload_paths
if paths_to_remove:
print('Removing the following file from web:')
print('\n'.join(paths_to_remove))
stdout = run_sftp_batch(userhost, sftp, """cd htdocs
rm %s
exit""" % ' '.join(paths_to_remove))
print('Uploading %d files:' % len(upload_paths))
batch_size = 10
upload_paths = list(upload_paths)
start_time = time.time()
for index in range(0,len(upload_paths),batch_size):
paths = upload_paths[index:index+batch_size]
file_per_sec = (time.time() - start_time) / (index+1)
remaining_files = len(upload_paths) - index
remaining_sec = file_per_sec * remaining_files
print('%d/%d, ETA=%.1fs' % (index+1, len(upload_paths), remaining_sec))
run_sftp_batch(userhost, sftp, """cd htdocs
lcd %s
mput %s
exit""" % (doc_dir, ' '.join(paths)), retry=3)
def sourceforge_release_tarball(sourceforge_project, paths, user=None, sftp='sftp'):
userhost = '%s,%s@frs.sourceforge.net' % (user, sourceforge_project)
run_sftp_batch(userhost, sftp, """
mput %s
exit
""" % (' '.join(paths),))
def main():
usage = """%prog release_version next_dev_version
Update 'version' file to release_version and commit.
Generates the document tarball.
Tags the sandbox revision with release_version.
Update 'version' file to next_dev_version and commit.
Performs an svn export of tag release version, and build a source tarball.
Must be started in the project top directory.
Warning: --force should only be used when developing/testing the release script.
"""
from optparse import OptionParser
parser = OptionParser(usage=usage)
parser.allow_interspersed_args = False
parser.add_option('--dot', dest="dot_path", action='store', default=doxybuild.find_program('dot'),
help="""Path to GraphViz dot tool. Must be full qualified path. [Default: %default]""")
parser.add_option('--doxygen', dest="doxygen_path", action='store', default=doxybuild.find_program('doxygen'),
help="""Path to Doxygen tool. [Default: %default]""")
parser.add_option('--force', dest="ignore_pending_commit", action='store_true', default=False,
help="""Ignore pending commit. [Default: %default]""")
parser.add_option('--retag', dest="retag_release", action='store_true', default=False,
help="""Overwrite release existing tag if it exist. [Default: %default]""")
parser.add_option('-p', '--platforms', dest="platforms", action='store', default='',
help="""Comma separated list of platform passed to scons for build check.""")
parser.add_option('--no-test', dest="no_test", action='store_true', default=False,
help="""Skips build check.""")
parser.add_option('--no-web', dest="no_web", action='store_true', default=False,
help="""Do not update web site.""")
parser.add_option('-u', '--upload-user', dest="user", action='store',
help="""Sourceforge user for SFTP documentation upload.""")
parser.add_option('--sftp', dest='sftp', action='store', default=doxybuild.find_program('psftp', 'sftp'),
help="""Path of the SFTP compatible binary used to upload the documentation.""")
parser.enable_interspersed_args()
options, args = parser.parse_args()
if len(args) != 2:
parser.error('release_version missing on command-line.')
release_version = args[0]
next_version = args[1]
if not options.platforms and not options.no_test:
parser.error('You must specify either --platform or --no-test option.')
if options.ignore_pending_commit:
msg = ''
else:
msg = check_no_pending_commit()
if not msg:
print('Setting version to', release_version)
set_version(release_version)
svn_commit('Release ' + release_version)
tag_url = svn_join_url(SVN_TAG_ROOT, release_version)
if svn_check_if_tag_exist(tag_url):
if options.retag_release:
svn_remove_tag(tag_url, 'Overwriting previous tag')
else:
print('Aborting, tag %s already exist. Use --retag to overwrite it!' % tag_url)
sys.exit(1)
svn_tag_sandbox(tag_url, 'Release ' + release_version)
print('Generated doxygen document...')
## doc_dirname = r'jsoncpp-api-html-0.5.0'
## doc_tarball_path = r'e:\prg\vc\Lib\jsoncpp-trunk\dist\jsoncpp-api-html-0.5.0.tar.gz'
doc_tarball_path, doc_dirname = doxybuild.build_doc(options, make_release=True)
doc_distcheck_dir = 'dist/doccheck'
tarball.decompress(doc_tarball_path, doc_distcheck_dir)
doc_distcheck_top_dir = os.path.join(doc_distcheck_dir, doc_dirname)
export_dir = 'dist/export'
svn_export(tag_url, export_dir)
fix_sources_eol(export_dir)
source_dir = 'jsoncpp-src-' + release_version
source_tarball_path = 'dist/%s.tar.gz' % source_dir
print('Generating source tarball to', source_tarball_path)
tarball.make_tarball(source_tarball_path, [export_dir], export_dir, prefix_dir=source_dir)
amalgamation_tarball_path = 'dist/%s-amalgamation.tar.gz' % source_dir
print('Generating amalgamation source tarball to', amalgamation_tarball_path)
amalgamation_dir = 'dist/amalgamation'
amalgamate.amalgamate_source(export_dir, '%s/jsoncpp.cpp' % amalgamation_dir, 'json/json.h')
amalgamation_source_dir = 'jsoncpp-src-amalgamation' + release_version
tarball.make_tarball(amalgamation_tarball_path, [amalgamation_dir],
amalgamation_dir, prefix_dir=amalgamation_source_dir)
# Decompress source tarball, download and install scons-local
distcheck_dir = 'dist/distcheck'
distcheck_top_dir = distcheck_dir + '/' + source_dir
print('Decompressing source tarball to', distcheck_dir)
rmdir_if_exist(distcheck_dir)
tarball.decompress(source_tarball_path, distcheck_dir)
scons_local_path = 'dist/scons-local.tar.gz'
print('Downloading scons-local to', scons_local_path)
download(SCONS_LOCAL_URL, scons_local_path)
print('Decompressing scons-local to', distcheck_top_dir)
tarball.decompress(scons_local_path, distcheck_top_dir)
# Run compilation
print('Compiling decompressed tarball')
all_build_status = True
for platform in options.platforms.split(','):
print('Testing platform:', platform)
build_status, log_path = check_compile(distcheck_top_dir, platform)
print('see build log:', log_path)
print(build_status and '=> ok' or '=> FAILED')
all_build_status = all_build_status and build_status
if not build_status:
print('Testing failed on at least one platform, aborting...')
svn_remove_tag(tag_url, 'Removing tag due to failed testing')
sys.exit(1)
if options.user:
if not options.no_web:
print('Uploading documentation using user', options.user)
sourceforge_web_synchro(SOURCEFORGE_PROJECT, doc_distcheck_top_dir, user=options.user, sftp=options.sftp)
print('Completed documentation upload')
print('Uploading source and documentation tarballs for release using user', options.user)
sourceforge_release_tarball(SOURCEFORGE_PROJECT,
[source_tarball_path, doc_tarball_path],
user=options.user, sftp=options.sftp)
print('Source and doc release tarballs uploaded')
else:
print('No upload user specified. Web site and download tarball were not uploaded.')
print('Tarball can be found at:', doc_tarball_path)
# Set next version number and commit
set_version(next_version)
svn_commit('Released ' + release_version)
else:
sys.stderr.write(msg + '\n')
if __name__ == '__main__':
main()


@@ -1,44 +1,34 @@
project(
'jsoncpp',
'cpp',
version : '1.9.0',
# Note: version must be updated in three places when doing a release. This
# annoying process ensures that amalgamate, CMake, and meson all report the
# correct version.
# 1. /meson.build
# 2. /include/json/version.h
# 3. /CMakeLists.txt
# IMPORTANT: also update the SOVERSION!!
version : '1.9.2',
default_options : [
'buildtype=release',
'cpp_std=c++11',
'warning_level=1'],
license : 'Public Domain',
meson_version : '>= 0.50.0')
meson_version : '>= 0.49.0')
jsoncpp_ver_arr = meson.project_version().split('.')
jsoncpp_major_version = jsoncpp_ver_arr[0]
jsoncpp_minor_version = jsoncpp_ver_arr[1]
jsoncpp_patch_version = jsoncpp_ver_arr[2]
jsoncpp_cdata = configuration_data()
jsoncpp_cdata.set('JSONCPP_VERSION', meson.project_version())
jsoncpp_cdata.set('JSONCPP_VERSION_MAJOR', jsoncpp_major_version)
jsoncpp_cdata.set('JSONCPP_VERSION_MINOR', jsoncpp_minor_version)
jsoncpp_cdata.set('JSONCPP_VERSION_PATCH', jsoncpp_patch_version)
jsoncpp_cdata.set('JSONCPP_USE_SECURE_MEMORY',0)
jsoncpp_gen_sources = configure_file(
input : 'src/lib_json/version.h.in',
output : 'version.h',
configuration : jsoncpp_cdata,
install : true,
install_dir : join_paths(get_option('prefix'), get_option('includedir'), 'json')
)
jsoncpp_headers = [
'include/json/allocator.h',
'include/json/assertions.h',
'include/json/autolink.h',
'include/json/config.h',
'include/json/features.h',
'include/json/json_features.h',
'include/json/forwards.h',
'include/json/json.h',
'include/json/reader.h',
'include/json/value.h',
'include/json/version.h',
'include/json/writer.h']
jsoncpp_include_directories = include_directories('include')
@@ -56,13 +46,12 @@ endif
jsoncpp_lib = library(
'jsoncpp',
[ jsoncpp_gen_sources,
jsoncpp_headers,
[ jsoncpp_headers,
'src/lib_json/json_tool.h',
'src/lib_json/json_reader.cpp',
'src/lib_json/json_value.cpp',
'src/lib_json/json_writer.cpp'],
soversion : 21,
soversion : 22,
install : true,
include_directories : jsoncpp_include_directories,
cpp_args: dll_export_flag)
@@ -79,10 +68,10 @@ jsoncpp_dep = declare_dependency(
include_directories : jsoncpp_include_directories,
link_with : jsoncpp_lib,
version : meson.project_version(),
sources : jsoncpp_gen_sources)
)
# tests
python = import('python3').find_python()
python = import('python').find_installation()
jsoncpp_test = executable(
'jsoncpp_test',


@@ -1,5 +1,7 @@
libdir=@CMAKE_INSTALL_FULL_LIBDIR@
includedir=@CMAKE_INSTALL_FULL_INCLUDEDIR@
prefix=@CMAKE_INSTALL_PREFIX@
exec_prefix=@CMAKE_INSTALL_PREFIX@
libdir=${exec_prefix}/@CMAKE_INSTALL_LIBDIR@
includedir=${prefix}/@CMAKE_INSTALL_INCLUDEDIR@
Name: jsoncpp
Description: A C++ library for interacting with JSON


@@ -1,11 +1,24 @@
find_package(PythonInterp 2.6)
if(CMAKE_VERSION VERSION_GREATER_EQUAL 3.12.0)
# The new Python3 module is much more robust than the previous PythonInterp
find_package (Python3 COMPONENTS Interpreter)
# Set variables for backwards compatibility with cmake < 3.12.0
set(PYTHONINTERP_FOUND ${Python3_Interpreter_FOUND})
set(PYTHON_EXECUTABLE ${Python3_EXECUTABLE})
else()
set(Python_ADDITIONAL_VERSIONS 3.8)
find_package(PythonInterp 3)
endif()
add_executable(jsontestrunner_exe
main.cpp
)
if(BUILD_SHARED_LIBS)
if(CMAKE_VERSION VERSION_GREATER_EQUAL 3.12.0)
add_compile_definitions( JSON_DLL )
else()
add_definitions( -DJSON_DLL )
endif()
endif()
target_link_libraries(jsontestrunner_exe jsoncpp_lib)


@@ -15,7 +15,9 @@
#include <algorithm> // sort
#include <cstdio>
#include <iostream>
#include <json/json.h>
#include <memory>
#include <sstream>
struct Options {
@@ -68,8 +70,8 @@ static Json::String readInputTestFile(const char* path) {
return text;
}
static void
printValueTree(FILE* fout, Json::Value& value, const Json::String& path = ".") {
static void printValueTree(FILE* fout, Json::Value& value,
const Json::String& path = ".") {
if (value.hasComment(Json::commentBefore)) {
fprintf(fout, "%s\n", value.getComment(Json::commentBefore).c_str());
}
@@ -125,21 +127,46 @@ printValueTree(FILE* fout, Json::Value& value, const Json::String& path = ".") {
static int parseAndSaveValueTree(const Json::String& input,
const Json::String& actual,
const Json::String& kind,
const Json::Features& features,
bool parseOnly,
Json::Value* root) {
Json::Reader reader(features);
bool parsingSuccessful =
reader.parse(input.data(), input.data() + input.size(), *root);
if (!parsingSuccessful) {
printf("Failed to parse %s file: \n%s\n", kind.c_str(),
reader.getFormattedErrorMessages().c_str());
return 1;
const Json::Features& features, bool parseOnly,
Json::Value* root, bool use_legacy) {
if (!use_legacy) {
Json::CharReaderBuilder builder;
builder.settings_["allowComments"] = features.allowComments_;
builder.settings_["strictRoot"] = features.strictRoot_;
builder.settings_["allowDroppedNullPlaceholders"] =
features.allowDroppedNullPlaceholders_;
builder.settings_["allowNumericKeys"] = features.allowNumericKeys_;
std::unique_ptr<Json::CharReader> reader(builder.newCharReader());
Json::String errors;
const bool parsingSuccessful =
reader->parse(input.data(), input.data() + input.size(), root, &errors);
if (!parsingSuccessful) {
std::cerr << "Failed to parse " << kind << " file: " << std::endl
<< errors << std::endl;
return 1;
}
// We may instead check the legacy implementation (to ensure it doesn't
// randomly get broken).
} else {
Json::Reader reader(features);
const bool parsingSuccessful =
reader.parse(input.data(), input.data() + input.size(), *root);
if (!parsingSuccessful) {
std::cerr << "Failed to parse " << kind << " file: " << std::endl
<< reader.getFormatedErrorMessages() << std::endl;
return 1;
}
}
if (!parseOnly) {
FILE* factual = fopen(actual.c_str(), "wt");
if (!factual) {
printf("Failed to create %s actual file.\n", kind.c_str());
std::cerr << "Failed to create '" << kind << "' actual file."
<< std::endl;
return 2;
}
printValueTree(factual, *root);
@@ -173,7 +200,7 @@ static int rewriteValueTree(const Json::String& rewritePath,
*rewrite = write(root);
FILE* fout = fopen(rewritePath.c_str(), "wt");
if (!fout) {
printf("Failed to create rewrite file: %s\n", rewritePath.c_str());
std::cerr << "Failed to create rewrite file: " << rewritePath << std::endl;
return 2;
}
fprintf(fout, "%s\n", rewrite->c_str());
@@ -194,14 +221,15 @@ static Json::String removeSuffix(const Json::String& path,
static void printConfig() {
// Print the configuration used to compile JsonCpp
#if defined(JSON_NO_INT64)
printf("JSON_NO_INT64=1\n");
std::cout << "JSON_NO_INT64=1" << std::endl;
#else
printf("JSON_NO_INT64=0\n");
std::cout << "JSON_NO_INT64=0" << std::endl;
#endif
}
static int printUsage(const char* argv[]) {
printf("Usage: %s [--strict] input-json-file", argv[0]);
std::cout << "Usage: " << argv[0] << " [--strict] input-json-file"
<< std::endl;
return 3;
}
@@ -231,7 +259,7 @@ static int parseCommandLine(int argc, const char* argv[], Options* opts) {
} else if (writerName == "BuiltStyledStreamWriter") {
opts->write = &useBuiltStyledStreamWriter;
} else {
printf("Unknown '--json-writer %s'\n", writerName.c_str());
std::cerr << "Unknown '--json-writer' " << writerName << std::endl;
return 4;
}
}
@@ -241,19 +269,20 @@ static int parseCommandLine(int argc, const char* argv[], Options* opts) {
opts->path = argv[index];
return 0;
}
static int runTest(Options const& opts) {
static int runTest(Options const& opts, bool use_legacy) {
int exitCode = 0;
Json::String input = readInputTestFile(opts.path.c_str());
if (input.empty()) {
printf("Failed to read input or empty input: %s\n", opts.path.c_str());
std::cerr << "Invalid input file: " << opts.path << std::endl;
return 3;
}
Json::String basePath = removeSuffix(opts.path, ".json");
if (!opts.parseOnly && basePath.empty()) {
printf("Bad input path. Path does not end with '.expected':\n%s\n",
opts.path.c_str());
std::cerr << "Bad input path '" << opts.path
<< "'. Must end with '.expected'" << std::endl;
return 3;
}
@@ -263,34 +292,47 @@ static int runTest(Options const& opts) {
Json::Value root;
exitCode = parseAndSaveValueTree(input, actualPath, "input", opts.features,
opts.parseOnly, &root);
opts.parseOnly, &root, use_legacy);
if (exitCode || opts.parseOnly) {
return exitCode;
}
Json::String rewrite;
exitCode = rewriteValueTree(rewritePath, root, opts.write, &rewrite);
if (exitCode) {
return exitCode;
}
Json::Value rewriteRoot;
exitCode = parseAndSaveValueTree(rewrite, rewriteActualPath, "rewrite",
opts.features, opts.parseOnly, &rewriteRoot);
if (exitCode) {
return exitCode;
}
return 0;
opts.features, opts.parseOnly, &rewriteRoot,
use_legacy);
return exitCode;
}
int main(int argc, const char* argv[]) {
Options opts;
try {
int exitCode = parseCommandLine(argc, argv, &opts);
if (exitCode != 0) {
printf("Failed to parse command-line.");
std::cerr << "Failed to parse command-line." << std::endl;
return exitCode;
}
return runTest(opts);
const int modern_return_code = runTest(opts, false);
if (modern_return_code) {
return modern_return_code;
}
const std::string filename =
opts.path.substr(opts.path.find_last_of("\\/") + 1);
const bool should_run_legacy = (filename.rfind("legacy_", 0) == 0);
if (should_run_legacy) {
return runTest(opts, true);
}
} catch (const std::exception& e) {
printf("Unhandled exception:\n%s\n", e.what());
std::cerr << "Unhandled exception:" << std::endl << e.what() << std::endl;
return 1;
}
}


@@ -34,7 +34,11 @@ endif()
if(NOT (HAVE_CLOCALE AND HAVE_LCONV_SIZE AND HAVE_DECIMAL_POINT AND HAVE_LOCALECONV))
message(WARNING "Locale functionality is not supported")
add_compile_definitions(JSONCPP_NO_LOCALE_SUPPORT)
if(CMAKE_VERSION VERSION_GREATER_EQUAL 3.12.0)
add_compile_definitions(JSONCPP_NO_LOCALE_SUPPORT)
else()
add_definitions(-DJSONCPP_NO_LOCALE_SUPPORT)
endif()
endif()
set( JSONCPP_INCLUDE_DIR ../../include )
@@ -42,12 +46,12 @@ set( JSONCPP_INCLUDE_DIR ../../include )
set( PUBLIC_HEADERS
${JSONCPP_INCLUDE_DIR}/json/config.h
${JSONCPP_INCLUDE_DIR}/json/forwards.h
${JSONCPP_INCLUDE_DIR}/json/features.h
${JSONCPP_INCLUDE_DIR}/json/json_features.h
${JSONCPP_INCLUDE_DIR}/json/value.h
${JSONCPP_INCLUDE_DIR}/json/reader.h
${JSONCPP_INCLUDE_DIR}/json/version.h
${JSONCPP_INCLUDE_DIR}/json/writer.h
${JSONCPP_INCLUDE_DIR}/json/assertions.h
${PROJECT_BINARY_DIR}/include/json/version.h
)
source_group( "Public API" FILES ${PUBLIC_HEADERS} )
@@ -57,8 +61,7 @@ set(jsoncpp_sources
json_reader.cpp
json_valueiterator.inl
json_value.cpp
json_writer.cpp
version.h.in)
json_writer.cpp)
# Install instructions for this target
if(JSONCPP_WITH_CMAKE_PACKAGE)
@@ -67,8 +70,13 @@ else(JSONCPP_WITH_CMAKE_PACKAGE)
set(INSTALL_EXPORT)
endif()
if(BUILD_SHARED_LIBS)
add_compile_definitions( JSON_DLL_BUILD )
if(CMAKE_VERSION VERSION_GREATER_EQUAL 3.12.0)
add_compile_definitions( JSON_DLL_BUILD )
else()
add_definitions( -DJSON_DLL_BUILD )
endif()
endif()


@@ -12,6 +12,7 @@
#endif // if !defined(JSON_IS_AMALGAMATION)
#include <cassert>
#include <cstring>
#include <iostream>
#include <istream>
#include <limits>
#include <memory>
@@ -51,7 +52,7 @@ static size_t const stackLimit_g =
namespace Json {
#if __cplusplus >= 201103L || (defined(_CPPLIB_VER) && _CPPLIB_VER >= 520)
typedef std::unique_ptr<CharReader> CharReaderPtr;
using CharReaderPtr = std::unique_ptr<CharReader>;
#else
typedef std::auto_ptr<CharReader> CharReaderPtr;
#endif
@@ -85,16 +86,11 @@ bool Reader::containsNewLine(Reader::Location begin, Reader::Location end) {
// Class Reader
// //////////////////////////////////////////////////////////////////
Reader::Reader()
: errors_(), document_(), commentsBefore_(), features_(Features::all()) {}
Reader::Reader() : features_(Features::all()) {}
Reader::Reader(const Features& features)
: errors_(), document_(), begin_(), end_(), current_(), lastValueEnd_(),
lastValue_(), commentsBefore_(), features_(features), collectComments_() {
}
Reader::Reader(const Features& features) : features_(features) {}
bool Reader::parse(const std::string& document,
Value& root,
bool Reader::parse(const std::string& document, Value& root,
bool collectComments) {
document_.assign(document.begin(), document.end());
const char* begin = document_.c_str();
@@ -111,13 +107,11 @@ bool Reader::parse(std::istream& is, Value& root, bool collectComments) {
// Since String is reference-counted, this at least does not
// create an extra copy.
String doc;
std::getline(is, doc, (char)EOF);
std::getline(is, doc, static_cast<char>(EOF));
return parse(doc.data(), doc.data() + doc.size(), root, collectComments);
}
bool Reader::parse(const char* beginDoc,
const char* endDoc,
Value& root,
bool Reader::parse(const char* beginDoc, const char* endDoc, Value& root,
bool collectComments) {
if (!features_.allowComments_) {
collectComments = false;
@@ -311,7 +305,7 @@ bool Reader::readToken(Token& token) {
if (!ok)
token.type_ = tokenError;
token.end_ = current_;
return true;
return ok;
}
void Reader::skipSpaces() {
@@ -324,7 +318,7 @@ void Reader::skipSpaces() {
}
}
bool Reader::match(Location pattern, int patternLength) {
bool Reader::match(const Char* pattern, int patternLength) {
if (end_ - current_ < patternLength)
return false;
int index = patternLength;
@@ -377,8 +371,7 @@ String Reader::normalizeEOL(Reader::Location begin, Reader::Location end) {
return normalized;
}
void Reader::addComment(Location begin,
Location end,
void Reader::addComment(Location begin, Location end,
CommentPlacement placement) {
assert(collectComments_);
const String& normalized = normalizeEOL(begin, end);
@@ -416,7 +409,7 @@ bool Reader::readCppStyleComment() {
}
void Reader::readNumber() {
const char* p = current_;
Location p = current_;
char c = '0'; // stopgap for already consumed character
// integral part
while (c >= '0' && c <= '9')
@@ -681,10 +674,8 @@ bool Reader::decodeString(Token& token, String& decoded) {
return true;
}
bool Reader::decodeUnicodeCodePoint(Token& token,
Location& current,
Location end,
unsigned int& unicode) {
bool Reader::decodeUnicodeCodePoint(Token& token, Location& current,
Location end, unsigned int& unicode) {
if (!decodeUnicodeEscapeSequence(token, current, end, unicode))
return false;
@@ -708,8 +699,7 @@ bool Reader::decodeUnicodeCodePoint(Token& token,
return true;
}
bool Reader::decodeUnicodeEscapeSequence(Token& token,
Location& current,
bool Reader::decodeUnicodeEscapeSequence(Token& token, Location& current,
Location end,
unsigned int& ret_unicode) {
if (end - current < 4)
@@ -757,8 +747,7 @@ bool Reader::recoverFromError(TokenType skipUntilToken) {
return false;
}
bool Reader::addErrorAndRecover(const String& message,
Token& token,
bool Reader::addErrorAndRecover(const String& message, Token& token,
TokenType skipUntilToken) {
addError(message, token);
return recoverFromError(skipUntilToken);
@@ -772,8 +761,7 @@ Reader::Char Reader::getNextChar() {
return *current_++;
}
void Reader::getLocationLineAndColumn(Location location,
int& line,
void Reader::getLocationLineAndColumn(Location location, int& line,
int& column) const {
Location current = begin_;
Location lastLineStart = current;
@@ -849,8 +837,7 @@ bool Reader::pushError(const Value& value, const String& message) {
return true;
}
bool Reader::pushError(const Value& value,
const String& message,
bool Reader::pushError(const Value& value, const String& message,
const Value& extra) {
ptrdiff_t const length = end_ - begin_;
if (value.getOffsetStart() > length || value.getOffsetLimit() > length ||
@@ -895,24 +882,19 @@ OurFeatures OurFeatures::all() { return {}; }
// for implementing JSON reading.
class OurReader {
public:
typedef char Char;
typedef const Char* Location;
using Char = char;
using Location = const Char*;
struct StructuredError {
ptrdiff_t offset_start;
ptrdiff_t offset_limit;
String message;
};
OurReader(OurFeatures const& features);
bool parse(const char* beginDoc,
const char* endDoc,
Value& root,
explicit OurReader(OurFeatures const& features);
bool parse(const char* beginDoc, const char* endDoc, Value& root,
bool collectComments = true);
String getFormattedErrorMessages() const;
std::vector<StructuredError> getStructuredErrors() const;
bool pushError(const Value& value, const String& message);
bool pushError(const Value& value, const String& message, const Value& extra);
bool good() const;
private:
OurReader(OurReader const&); // no impl
@@ -952,13 +934,13 @@ private:
Location extra_;
};
typedef std::deque<ErrorInfo> Errors;
using Errors = std::deque<ErrorInfo>;
bool readToken(Token& token);
void skipSpaces();
bool match(Location pattern, int patternLength);
bool match(const Char* pattern, int patternLength);
bool readComment();
bool readCStyleComment();
bool readCStyleComment(bool* containsNewLineResult);
bool readCppStyleComment();
bool readString();
bool readStringSingleQuote();
@@ -972,24 +954,19 @@ private:
bool decodeString(Token& token, String& decoded);
bool decodeDouble(Token& token);
bool decodeDouble(Token& token, Value& decoded);
bool decodeUnicodeCodePoint(Token& token,
Location& current,
Location end,
bool decodeUnicodeCodePoint(Token& token, Location& current, Location end,
unsigned int& unicode);
bool decodeUnicodeEscapeSequence(Token& token,
Location& current,
Location end,
unsigned int& unicode);
bool decodeUnicodeEscapeSequence(Token& token, Location& current,
Location end, unsigned int& unicode);
bool addError(const String& message, Token& token, Location extra = nullptr);
bool recoverFromError(TokenType skipUntilToken);
bool addErrorAndRecover(const String& message,
Token& token,
bool addErrorAndRecover(const String& message, Token& token,
TokenType skipUntilToken);
void skipUntilSpace();
Value& currentValue();
Char getNextChar();
void
getLocationLineAndColumn(Location location, int& line, int& column) const;
void getLocationLineAndColumn(Location location, int& line,
int& column) const;
String getLocationLineAndColumn(Location location) const;
void addComment(Location begin, Location end, CommentPlacement placement);
void skipCommentTokens(Token& token);
@@ -997,19 +974,21 @@ private:
static String normalizeEOL(Location begin, Location end);
static bool containsNewLine(Location begin, Location end);
typedef std::stack<Value*> Nodes;
Nodes nodes_;
Errors errors_;
String document_;
Location begin_;
Location end_;
Location current_;
Location lastValueEnd_;
Value* lastValue_;
String commentsBefore_;
using Nodes = std::stack<Value*>;
Nodes nodes_{};
Errors errors_{};
String document_{};
Location begin_ = nullptr;
Location end_ = nullptr;
Location current_ = nullptr;
Location lastValueEnd_ = nullptr;
Value* lastValue_ = nullptr;
bool lastValueHasAComment_ = false;
String commentsBefore_{};
OurFeatures const features_;
bool collectComments_;
bool collectComments_ = false;
}; // OurReader
// complete copy of Read impl, for OurReader
@@ -1022,14 +1001,9 @@ bool OurReader::containsNewLine(OurReader::Location begin,
return false;
}
OurReader::OurReader(OurFeatures const& features)
: errors_(), document_(), begin_(), end_(), current_(), lastValueEnd_(),
lastValue_(), commentsBefore_(), features_(features), collectComments_() {
}
OurReader::OurReader(OurFeatures const& features) : features_(features) {}
bool OurReader::parse(const char* beginDoc,
const char* endDoc,
Value& root,
bool OurReader::parse(const char* beginDoc, const char* endDoc, Value& root,
bool collectComments) {
if (!features_.allowComments_) {
collectComments = false;
@@ -1051,12 +1025,9 @@ bool OurReader::parse(const char* beginDoc,
nodes_.pop();
Token token;
skipCommentTokens(token);
if (features_.failIfExtra_) {
if ((features_.strictRoot_ || token.type_ != tokenError) &&
token.type_ != tokenEndOfStream) {
addError("Extra non-whitespace after JSON value.", token);
return false;
}
if (features_.failIfExtra_ && (token.type_ != tokenEndOfStream)) {
addError("Extra non-whitespace after JSON value.", token);
return false;
}
if (collectComments_ && !commentsBefore_.empty())
root.setComment(commentsBefore_, commentAfter);
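A minimal sketch (not part of this diff) of how the tightened `failIfExtra` check surfaces through the public `CharReaderBuilder` API; the sample document is made up:

```
// Sketch: with failIfExtra enabled, any token after the root value now fails.
#include <json/json.h>
#include <iostream>
#include <memory>
#include <string>

int main() {
  Json::CharReaderBuilder builder;
  builder.settings_["failIfExtra"] = true;
  std::unique_ptr<Json::CharReader> reader(builder.newCharReader());

  const std::string doc = "123 trailing";  // extra token after the root value
  Json::Value root;
  Json::String errs;
  const bool ok =
      reader->parse(doc.data(), doc.data() + doc.size(), &root, &errs);
  // With the simplified check above, any token other than end-of-stream after
  // the root value is reported, so ok is expected to be false here.
  std::cout << (ok ? "parsed" : errs) << "\n";
  return 0;
}
```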
@@ -1161,6 +1132,7 @@ bool OurReader::readValue() {
if (collectComments_) {
lastValueEnd_ = current_;
lastValueHasAComment_ = false;
lastValue_ = &currentValue();
}
@@ -1230,6 +1202,14 @@ bool OurReader::readToken(Token& token) {
ok = features_.allowSpecialFloats_ && match("nfinity", 7);
}
break;
case '+':
if (readNumber(true)) {
token.type_ = tokenNumber;
} else {
token.type_ = tokenPosInf;
ok = features_.allowSpecialFloats_ && match("nfinity", 7);
}
break;
case 't':
token.type_ = tokenTrue;
ok = match("rue", 3);
@@ -1274,7 +1254,7 @@ bool OurReader::readToken(Token& token) {
if (!ok)
token.type_ = tokenError;
token.end_ = current_;
return true;
return ok;
}
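A small usage sketch of the new `'+'` branch, assuming the existing `allowSpecialFloats` builder key; with it enabled, a leading plus sign now yields either a regular number or `+Infinity`:

```
// Sketch only: exercising +Infinity through the public API.
#include <json/json.h>
#include <cmath>
#include <iostream>
#include <memory>
#include <string>

int main() {
  Json::CharReaderBuilder builder;
  builder.settings_["allowSpecialFloats"] = true;
  std::unique_ptr<Json::CharReader> reader(builder.newCharReader());

  const std::string doc = "[+Infinity, -Infinity, NaN]";
  Json::Value root;
  Json::String errs;
  if (reader->parse(doc.data(), doc.data() + doc.size(), &root, &errs)) {
    std::cout << std::isinf(root[0].asDouble()) << "\n";  // expect 1
  } else {
    std::cout << errs << "\n";
  }
  return 0;
}
```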
void OurReader::skipSpaces() {
@@ -1287,7 +1267,7 @@ void OurReader::skipSpaces() {
}
}
bool OurReader::match(Location pattern, int patternLength) {
bool OurReader::match(const Char* pattern, int patternLength) {
if (end_ - current_ < patternLength)
return false;
int index = patternLength;
@@ -1299,21 +1279,32 @@ bool OurReader::match(Location pattern, int patternLength) {
}
bool OurReader::readComment() {
Location commentBegin = current_ - 1;
Char c = getNextChar();
const Location commentBegin = current_ - 1;
const Char c = getNextChar();
bool successful = false;
if (c == '*')
successful = readCStyleComment();
else if (c == '/')
bool cStyleWithEmbeddedNewline = false;
const bool isCStyleComment = (c == '*');
const bool isCppStyleComment = (c == '/');
if (isCStyleComment) {
successful = readCStyleComment(&cStyleWithEmbeddedNewline);
} else if (isCppStyleComment) {
successful = readCppStyleComment();
}
if (!successful)
return false;
if (collectComments_) {
CommentPlacement placement = commentBefore;
if (lastValueEnd_ && !containsNewLine(lastValueEnd_, commentBegin)) {
if (c != '*' || !containsNewLine(commentBegin, current_))
placement = commentAfterOnSameLine;
if (!lastValueHasAComment_) {
if (lastValueEnd_ && !containsNewLine(lastValueEnd_, commentBegin)) {
if (isCppStyleComment || !cStyleWithEmbeddedNewline) {
placement = commentAfterOnSameLine;
lastValueHasAComment_ = true;
}
}
}
addComment(commentBegin, current_, placement);
@@ -1341,8 +1332,7 @@ String OurReader::normalizeEOL(OurReader::Location begin,
return normalized;
}
void OurReader::addComment(Location begin,
Location end,
void OurReader::addComment(Location begin, Location end,
CommentPlacement placement) {
assert(collectComments_);
const String& normalized = normalizeEOL(begin, end);
@@ -1354,12 +1344,18 @@ void OurReader::addComment(Location begin,
}
}
bool OurReader::readCStyleComment() {
bool OurReader::readCStyleComment(bool* containsNewLineResult) {
*containsNewLineResult = false;
while ((current_ + 1) < end_) {
Char c = getNextChar();
if (c == '*' && *current_ == '/')
if (c == '*' && *current_ == '/') {
break;
} else if (c == '\n') {
*containsNewLineResult = true;
}
}
return getNextChar() == '/';
}
@@ -1380,7 +1376,7 @@ bool OurReader::readCppStyleComment() {
}
bool OurReader::readNumber(bool checkInf) {
const char* p = current_;
Location p = current_;
if (checkInf && p != end_ && *p == 'I') {
current_ = ++p;
return false;
@@ -1544,20 +1540,45 @@ bool OurReader::decodeNumber(Token& token, Value& decoded) {
// larger than the maximum supported value of an integer then
// we decode the number as a double.
Location current = token.start_;
bool isNegative = *current == '-';
if (isNegative)
const bool isNegative = *current == '-';
if (isNegative) {
++current;
}
// TODO(issue #960): Change to constexpr
static const auto positive_threshold = Value::maxLargestUInt / 10;
static const auto positive_last_digit = Value::maxLargestUInt % 10;
static const auto negative_threshold =
Value::LargestUInt(Value::minLargestInt) / 10;
static const auto negative_last_digit =
Value::LargestUInt(Value::minLargestInt) % 10;
// We assume we can represent the largest and smallest integer types as
// unsigned integers with separate sign. This is only true if they can fit
// into an unsigned integer.
static_assert(Value::maxLargestInt <= Value::maxLargestUInt,
"Int must be smaller than UInt");
const auto threshold = isNegative ? negative_threshold : positive_threshold;
const auto last_digit =
// We need to convert minLargestInt into a positive number. The easiest way
// to do this conversion is to assume our "threshold" value of minLargestInt
// divided by 10 can fit in maxLargestInt when absolute valued. This should
// be a safe assumption.
static_assert(Value::minLargestInt <= -Value::maxLargestInt,
"The absolute value of minLargestInt must be greater than or "
"equal to maxLargestInt");
static_assert(Value::minLargestInt / 10 >= -Value::maxLargestInt,
"The absolute value of minLargestInt must be only 1 magnitude "
"larger than maxLargest Int");
static constexpr Value::LargestUInt positive_threshold =
Value::maxLargestUInt / 10;
static constexpr Value::UInt positive_last_digit = Value::maxLargestUInt % 10;
// For the negative values, we have to be more careful. Since typically
// -Value::minLargestInt will cause an overflow, we first divide by 10 and
// then take the inverse. This assumes that minLargestInt is only a single
// power of 10 different in magnitude, which we check above. For the last
// digit, we take the modulus before negating for the same reason.
static constexpr Value::LargestUInt negative_threshold =
Value::LargestUInt(-(Value::minLargestInt / 10));
static constexpr Value::UInt negative_last_digit =
Value::UInt(-(Value::minLargestInt % 10));
const Value::LargestUInt threshold =
isNegative ? negative_threshold : positive_threshold;
const Value::UInt max_last_digit =
isNegative ? negative_last_digit : positive_last_digit;
Value::LargestUInt value = 0;
@@ -1566,26 +1587,30 @@ bool OurReader::decodeNumber(Token& token, Value& decoded) {
if (c < '0' || c > '9')
return decodeDouble(token, decoded);
const auto digit(static_cast<Value::UInt>(c - '0'));
const Value::UInt digit(static_cast<Value::UInt>(c - '0'));
if (value >= threshold) {
// We've hit or exceeded the max value divided by 10 (rounded down). If
// a) we've only just touched the limit, meaning value == threshold,
// b) this is the last digit, or
// c) it's small enough to fit in that rounding delta, we're okay.
// Otherwise treat this number as a double to avoid overflow.
if (value > threshold || current != token.end_ || digit > last_digit) {
if (value > threshold || current != token.end_ ||
digit > max_last_digit) {
return decodeDouble(token, decoded);
}
}
value = value * 10 + digit;
}
if (isNegative)
decoded = -Value::LargestInt(value);
else if (value <= Value::LargestUInt(Value::maxLargestInt))
if (isNegative) {
// We use the same magnitude assumption here, just in case.
const Value::UInt last_digit = static_cast<Value::UInt>(value % 10);
decoded = -Value::LargestInt(value / 10) * 10 - last_digit;
} else if (value <= Value::LargestUInt(Value::maxLargestInt)) {
decoded = Value::LargestInt(value);
else
} else {
decoded = value;
}
return true;
}
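A standalone sketch of the same overflow guard, with plain `<cstdint>` types standing in for `Value::LargestUInt` / `Value::LargestInt`, to make the threshold arithmetic concrete:

```
#include <cstdint>
#include <iostream>

int main() {
  // Same thresholds as above, spelled out with plain 64-bit types.
  constexpr uint64_t positive_threshold = UINT64_MAX / 10;  // 1844674407370955161
  constexpr uint64_t positive_last_digit = UINT64_MAX % 10; // 5
  // -minLargestInt would overflow, so divide/modulo first, then negate:
  constexpr uint64_t negative_threshold = uint64_t(-(INT64_MIN / 10)); // 922337203685477580
  constexpr uint64_t negative_last_digit = uint64_t(-(INT64_MIN % 10)); // 8

  // Appending one more digit to `value` is only safe when value < threshold,
  // or value == threshold and the digit does not exceed the last-digit limit
  // (the parser additionally requires that it really is the final digit).
  uint64_t value = positive_threshold;
  uint64_t digit = 5;
  const bool fits = value < positive_threshold ||
                    (value == positive_threshold && digit <= positive_last_digit);
  std::cout << fits << "\n";  // 1: 18446744073709551615 still fits in uint64_t

  (void)negative_threshold;
  (void)negative_last_digit;
  return 0;
}
```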
@@ -1602,37 +1627,12 @@ bool OurReader::decodeDouble(Token& token) {
bool OurReader::decodeDouble(Token& token, Value& decoded) {
double value = 0;
const int bufferSize = 32;
int count;
ptrdiff_t const length = token.end_ - token.start_;
// Sanity check to avoid buffer overflow exploits.
if (length < 0) {
return addError("Unable to parse token length", token);
}
auto const ulength = static_cast<size_t>(length);
// Avoid using a string constant for the format control string given to
// sscanf, as this can cause hard to debug crashes on OS X. See here for more
// info:
//
// http://developer.apple.com/library/mac/#DOCUMENTATION/DeveloperTools/gcc-4.0.1/gcc/Incompatibilities.html
char format[] = "%lf";
if (length <= bufferSize) {
Char buffer[bufferSize + 1];
memcpy(buffer, token.start_, ulength);
buffer[length] = 0;
fixNumericLocaleInput(buffer, buffer + length);
count = sscanf(buffer, format, &value);
} else {
String buffer(token.start_, token.end_);
count = sscanf(buffer.c_str(), format, &value);
}
if (count != 1)
const String buffer(token.start_, token.end_);
IStringStream is(buffer);
if (!(is >> value)) {
return addError(
"'" + String(token.start_, token.end_) + "' is not a number.", token);
}
decoded = value;
return true;
}
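The replacement relies on stream extraction failing cleanly instead of the old `sscanf` buffer-and-format handling; a minimal illustration of that pattern (standalone, not library code):

```
#include <iostream>
#include <sstream>
#include <string>

// Returns false when no double can be extracted from the token text.
bool toDouble(const std::string& token, double& out) {
  std::istringstream is(token);
  return static_cast<bool>(is >> out);
}

int main() {
  double v = 0;
  std::cout << toDouble("1.5e300", v) << " " << v << "\n";  // 1 1.5e+300
  std::cout << toDouble("not-a-number", v) << "\n";         // 0
  return 0;
}
```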
@@ -1654,9 +1654,9 @@ bool OurReader::decodeString(Token& token, String& decoded) {
Location end = token.end_ - 1; // do not include '"'
while (current != end) {
Char c = *current++;
if (c == '"')
if (c == '"') {
break;
else if (c == '\\') {
} else if (c == '\\') {
if (current == end)
return addError("Empty escape sequence in string", token, current);
Char escape = *current++;
@@ -1701,10 +1701,8 @@ bool OurReader::decodeString(Token& token, String& decoded) {
return true;
}
bool OurReader::decodeUnicodeCodePoint(Token& token,
Location& current,
Location end,
unsigned int& unicode) {
bool OurReader::decodeUnicodeCodePoint(Token& token, Location& current,
Location end, unsigned int& unicode) {
if (!decodeUnicodeEscapeSequence(token, current, end, unicode))
return false;
@@ -1728,8 +1726,7 @@ bool OurReader::decodeUnicodeCodePoint(Token& token,
return true;
}
bool OurReader::decodeUnicodeEscapeSequence(Token& token,
Location& current,
bool OurReader::decodeUnicodeEscapeSequence(Token& token, Location& current,
Location end,
unsigned int& ret_unicode) {
if (end - current < 4)
@@ -1777,8 +1774,7 @@ bool OurReader::recoverFromError(TokenType skipUntilToken) {
return false;
}
bool OurReader::addErrorAndRecover(const String& message,
Token& token,
bool OurReader::addErrorAndRecover(const String& message, Token& token,
TokenType skipUntilToken) {
addError(message, token);
return recoverFromError(skipUntilToken);
@@ -1792,8 +1788,7 @@ OurReader::Char OurReader::getNextChar() {
return *current_++;
}
void OurReader::getLocationLineAndColumn(Location location,
int& line,
void OurReader::getLocationLineAndColumn(Location location, int& line,
int& column) const {
Location current = begin_;
Location lastLineStart = current;
@@ -1848,43 +1843,6 @@ std::vector<OurReader::StructuredError> OurReader::getStructuredErrors() const {
return allErrors;
}
bool OurReader::pushError(const Value& value, const String& message) {
ptrdiff_t length = end_ - begin_;
if (value.getOffsetStart() > length || value.getOffsetLimit() > length)
return false;
Token token;
token.type_ = tokenError;
token.start_ = begin_ + value.getOffsetStart();
token.end_ = begin_ + value.getOffsetLimit();
ErrorInfo info;
info.token_ = token;
info.message_ = message;
info.extra_ = nullptr;
errors_.push_back(info);
return true;
}
bool OurReader::pushError(const Value& value,
const String& message,
const Value& extra) {
ptrdiff_t length = end_ - begin_;
if (value.getOffsetStart() > length || value.getOffsetLimit() > length ||
extra.getOffsetLimit() > length)
return false;
Token token;
token.type_ = tokenError;
token.start_ = begin_ + value.getOffsetStart();
token.end_ = begin_ + value.getOffsetLimit();
ErrorInfo info;
info.token_ = token;
info.message_ = message;
info.extra_ = begin_ + extra.getOffsetStart();
errors_.push_back(info);
return true;
}
bool OurReader::good() const { return errors_.empty(); }
class OurCharReader : public CharReader {
bool const collectComments_;
OurReader reader_;
@@ -1892,9 +1850,7 @@ class OurCharReader : public CharReader {
public:
OurCharReader(bool collectComments, OurFeatures const& features)
: collectComments_(collectComments), reader_(features) {}
bool parse(char const* beginDoc,
char const* endDoc,
Value* root,
bool parse(char const* beginDoc, char const* endDoc, Value* root,
String* errs) override {
bool ok = reader_.parse(beginDoc, endDoc, *root, collectComments_);
if (errs) {
@@ -1990,9 +1946,7 @@ void CharReaderBuilder::setDefaults(Json::Value* settings) {
//////////////////////////////////
// global functions
bool parseFromStream(CharReader::Factory const& fact,
IStream& sin,
Value* root,
bool parseFromStream(CharReader::Factory const& fact, IStream& sin, Value* root,
String* errs) {
OStringStream ssin;
ssin << sin.rdbuf();


@@ -22,10 +22,8 @@
// Provide implementation equivalent of std::snprintf for older _MSC compilers
#if defined(_MSC_VER) && _MSC_VER < 1900
#include <stdarg.h>
static int msvc_pre1900_c99_vsnprintf(char* outBuf,
size_t size,
const char* format,
va_list ap) {
static int msvc_pre1900_c99_vsnprintf(char* outBuf, size_t size,
const char* format, va_list ap) {
int count = -1;
if (size != 0)
count = _vsnprintf_s(outBuf, size, _TRUNCATE, format, ap);
@@ -34,10 +32,8 @@ static int msvc_pre1900_c99_vsnprintf(char* outBuf,
return count;
}
int JSON_API msvc_pre1900_c99_snprintf(char* outBuf,
size_t size,
const char* format,
...) {
int JSON_API msvc_pre1900_c99_snprintf(char* outBuf, size_t size,
const char* format, ...) {
va_list ap;
va_start(ap, format);
const int count = msvc_pre1900_c99_vsnprintf(outBuf, size, format, ap);
@@ -88,31 +84,12 @@ Value const& Value::null = Value::nullSingleton();
Value const& Value::nullRef = Value::nullSingleton();
#endif
const Int Value::minInt = Int(~(UInt(-1) / 2));
const Int Value::maxInt = Int(UInt(-1) / 2);
const UInt Value::maxUInt = UInt(-1);
#if defined(JSON_HAS_INT64)
const Int64 Value::minInt64 = Int64(~(UInt64(-1) / 2));
const Int64 Value::maxInt64 = Int64(UInt64(-1) / 2);
const UInt64 Value::maxUInt64 = UInt64(-1);
// The constant is hard-coded because some compiler have trouble
// converting Value::maxUInt64 to a double correctly (AIX/xlC).
// Assumes that UInt64 is a 64 bits integer.
static const double maxUInt64AsDouble = 18446744073709551615.0;
#endif // defined(JSON_HAS_INT64)
const LargestInt Value::minLargestInt = LargestInt(~(LargestUInt(-1) / 2));
const LargestInt Value::maxLargestInt = LargestInt(LargestUInt(-1) / 2);
const LargestUInt Value::maxLargestUInt = LargestUInt(-1);
const UInt Value::defaultRealPrecision = 17;
#if !defined(JSON_USE_INT64_DOUBLE_CONVERSION)
template <typename T, typename U>
static inline bool InRange(double d, T min, U max) {
// The casts can lose precision, but we are looking only for
// an approximate range. Might fail on edge cases though. ~cdunn
// return d >= static_cast<double>(min) && d <= static_cast<double>(max);
return d >= min && d <= max;
return d >= static_cast<double>(min) && d <= static_cast<double>(max);
}
#else // if !defined(JSON_USE_INT64_DOUBLE_CONVERSION)
static inline double integerToDouble(Json::UInt64 value) {
@@ -175,10 +152,8 @@ static inline char* duplicateAndPrefixStringValue(const char* value,
0; // to avoid buffer over-run accidents by users later
return newString;
}
inline static void decodePrefixedString(bool isPrefixed,
char const* prefixed,
unsigned* length,
char const** value) {
inline static void decodePrefixedString(bool isPrefixed, char const* prefixed,
unsigned* length, char const** value) {
if (!isPrefixed) {
*length = static_cast<unsigned>(strlen(prefixed));
*value = prefixed;
@@ -228,19 +203,19 @@ namespace Json {
#if JSON_USE_EXCEPTION
Exception::Exception(String msg) : msg_(std::move(msg)) {}
Exception::~Exception() JSONCPP_NOEXCEPT {}
Exception::~Exception() JSONCPP_NOEXCEPT = default;
char const* Exception::what() const JSONCPP_NOEXCEPT { return msg_.c_str(); }
RuntimeError::RuntimeError(String const& msg) : Exception(msg) {}
LogicError::LogicError(String const& msg) : Exception(msg) {}
[[noreturn]] void throwRuntimeError(String const& msg) {
JSONCPP_NORETURN void throwRuntimeError(String const& msg) {
throw RuntimeError(msg);
}
[[noreturn]] void throwLogicError(String const& msg) {
JSONCPP_NORETURN void throwLogicError(String const& msg) {
throw LogicError(msg);
}
#else // !JSON_USE_EXCEPTION
[[noreturn]] void throwRuntimeError(String const& msg) { abort(); }
[[noreturn]] void throwLogicError(String const& msg) { abort(); }
JSONCPP_NORETURN void throwRuntimeError(String const& msg) { abort(); }
JSONCPP_NORETURN void throwLogicError(String const& msg) { abort(); }
#endif
// //////////////////////////////////////////////////////////////////
@@ -256,8 +231,7 @@ LogicError::LogicError(String const& msg) : Exception(msg) {}
Value::CZString::CZString(ArrayIndex index) : cstr_(nullptr), index_(index) {}
Value::CZString::CZString(char const* str,
unsigned length,
Value::CZString::CZString(char const* str, unsigned length,
DuplicationPolicy allocate)
: cstr_(str) {
// allocate != duplicate
@@ -289,7 +263,7 @@ Value::CZString::CZString(CZString&& other)
Value::CZString::~CZString() {
if (cstr_ && storage_.policy_ == duplicate) {
releaseStringValue(const_cast<char*>(cstr_),
storage_.length_ + 1u); // +1 for null terminating
storage_.length_ + 1U); // +1 for null terminating
// character for sake of
// completeness but not actually
// necessary
@@ -520,7 +494,7 @@ int Value::compare(const Value& other) const {
bool Value::operator<(const Value& other) const {
int typeDelta = type() - other.type();
if (typeDelta)
return typeDelta < 0 ? true : false;
return typeDelta < 0;
switch (type()) {
case nullValue:
return false;
@@ -534,10 +508,7 @@ bool Value::operator<(const Value& other) const {
return value_.bool_ < other.value_.bool_;
case stringValue: {
if ((value_.string_ == nullptr) || (other.value_.string_ == nullptr)) {
if (other.value_.string_)
return true;
else
return false;
return other.value_.string_ != nullptr;
}
unsigned this_len;
unsigned other_len;
@@ -835,7 +806,7 @@ float Value::asFloat() const {
case nullValue:
return 0.0;
case booleanValue:
return value_.bool_ ? 1.0f : 0.0f;
return value_.bool_ ? 1.0F : 0.0F;
default:
break;
}
@@ -849,9 +820,9 @@ bool Value::asBool() const {
case nullValue:
return false;
case intValue:
return value_.int_ ? true : false;
return value_.int_ != 0;
case uintValue:
return value_.uint_ ? true : false;
return value_.uint_ != 0;
case realValue: {
// According to JavaScript language zero or NaN is regarded as false
const auto value_classification = std::fpclassify(value_.real_);
@@ -867,7 +838,7 @@ bool Value::isConvertibleTo(ValueType other) const {
switch (other) {
case nullValue:
return (isNumeric() && asDouble() == 0.0) ||
(type() == booleanValue && value_.bool_ == false) ||
(type() == booleanValue && !value_.bool_) ||
(type() == stringValue && asString().empty()) ||
(type() == arrayValue && value_.map_->empty()) ||
(type() == objectValue && value_.map_->empty()) ||
@@ -922,7 +893,7 @@ ArrayIndex Value::size() const {
bool Value::empty() const {
if (isNull() || isArray() || isObject())
return size() == 0u;
return size() == 0U;
else
return false;
}
@@ -1176,14 +1147,33 @@ Value const& Value::operator[](CppTL::ConstString const& key) const {
}
#endif
Value& Value::append(const Value& value) { return (*this)[size()] = value; }
Value& Value::append(const Value& value) { return append(Value(value)); }
Value& Value::append(Value&& value) {
return (*this)[size()] = std::move(value);
JSON_ASSERT_MESSAGE(type() == nullValue || type() == arrayValue,
"in Json::Value::append: requires arrayValue");
if (type() == nullValue) {
*this = Value(arrayValue);
}
return this->value_.map_->emplace(size(), std::move(value)).first->second;
}
Value Value::get(char const* begin,
char const* end,
bool Value::insert(ArrayIndex index, Value newValue) {
JSON_ASSERT_MESSAGE(type() == nullValue || type() == arrayValue,
"in Json::Value::insert: requires arrayValue");
ArrayIndex length = size();
if (index > length) {
return false;
} else {
for (ArrayIndex i = length; i > index; i--) {
(*this)[i] = std::move((*this)[i - 1]);
}
(*this)[index] = std::move(newValue);
return true;
}
}
Value Value::get(char const* begin, char const* end,
Value const& defaultValue) const {
Value const* found = find(begin, end);
return !found ? defaultValue : *found;
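A usage sketch of the reworked `append()` and the new bounds-checked `insert()` declared above; the writer settings shown are the standard `StreamWriterBuilder` keys:

```
#include <json/json.h>
#include <iostream>

int main() {
  Json::Value arr(Json::arrayValue);
  arr.append("a");               // lvalue-style append (copies)
  arr.append(Json::Value("c"));  // rvalue append, moved into the array
  arr.insert(1, "b");            // shifts "c" to index 2, returns true
  const bool ok = arr.insert(99, "z");  // index past the end: false, no-op

  Json::StreamWriterBuilder writer;
  writer["indentation"] = "";
  std::cout << Json::writeString(writer, arr) << " " << ok << "\n";
  // expected output, roughly: ["a","b","c"] 0
  return 0;
}
```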
@@ -1469,7 +1459,10 @@ void Value::Comments::set(CommentPlacement slot, String comment) {
if (!ptr_) {
ptr_ = std::unique_ptr<Array>(new Array());
}
(*ptr_)[slot] = std::move(comment);
// check comments array boundary.
if (slot < CommentPlacement::numberOfCommentPlacement) {
(*ptr_)[slot] = std::move(comment);
}
}
void Value::setComment(String comment, CommentPlacement placement) {
@@ -1565,25 +1558,20 @@ Value::iterator Value::end() {
// class PathArgument
// //////////////////////////////////////////////////////////////////
PathArgument::PathArgument() : key_() {}
PathArgument::PathArgument() = default;
PathArgument::PathArgument(ArrayIndex index)
: key_(), index_(index), kind_(kindIndex) {}
: index_(index), kind_(kindIndex) {}
PathArgument::PathArgument(const char* key)
: key_(key), index_(), kind_(kindKey) {}
PathArgument::PathArgument(const char* key) : key_(key), kind_(kindKey) {}
PathArgument::PathArgument(const String& key)
: key_(key.c_str()), index_(), kind_(kindKey) {}
PathArgument::PathArgument(String key) : key_(std::move(key)), kind_(kindKey) {}
// class Path
// //////////////////////////////////////////////////////////////////
Path::Path(const String& path,
const PathArgument& a1,
const PathArgument& a2,
const PathArgument& a3,
const PathArgument& a4,
Path::Path(const String& path, const PathArgument& a1, const PathArgument& a2,
const PathArgument& a3, const PathArgument& a4,
const PathArgument& a5) {
InArgs in;
in.reserve(5);
@@ -1626,8 +1614,7 @@ void Path::makePath(const String& path, const InArgs& in) {
}
}
void Path::addPathInArg(const String& /*path*/,
const InArgs& in,
void Path::addPathInArg(const String& /*path*/, const InArgs& in,
InArgs::const_iterator& itInArg,
PathArgument::Kind kind) {
if (itInArg == in.end()) {


@@ -21,7 +21,8 @@ ValueIteratorBase::ValueIteratorBase(
const Value::ObjectValues::iterator& current)
: current_(current), isNull_(false) {}
Value& ValueIteratorBase::deref() const { return current_->second; }
Value& ValueIteratorBase::deref() { return current_->second; }
const Value& ValueIteratorBase::deref() const { return current_->second; }
void ValueIteratorBase::increment() { ++current_; }
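At the public API level, the split into const and non-const `deref()` is what lets `Value::iterator` mutate elements while `const_iterator` stays read-only; a small sketch:

```
#include <json/json.h>
#include <iostream>

int main() {
  Json::Value arr(Json::arrayValue);
  arr.append(1);
  arr.append(2);

  for (Json::Value::iterator it = arr.begin(); it != arr.end(); ++it)
    *it = it->asInt() * 10;  // non-const deref(): elements can be modified

  const Json::Value& view = arr;
  for (Json::Value::const_iterator it = view.begin(); it != view.end(); ++it)
    std::cout << it->asInt() << " ";  // const deref(): read-only access
  std::cout << "\n";  // prints: 10 20
  return 0;
}
```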


@@ -84,7 +84,7 @@
namespace Json {
#if __cplusplus >= 201103L || (defined(_CPPLIB_VER) && _CPPLIB_VER >= 520)
typedef std::unique_ptr<StreamWriter> StreamWriterPtr;
using StreamWriterPtr = std::unique_ptr<StreamWriter>;
#else
typedef std::auto_ptr<StreamWriter> StreamWriterPtr;
#endif
@@ -122,10 +122,8 @@ String valueToString(UInt value) { return valueToString(LargestUInt(value)); }
#endif // # if defined(JSON_HAS_INT64)
namespace {
String valueToString(double value,
bool useSpecialFloats,
unsigned int precision,
PrecisionType precisionType) {
String valueToString(double value, bool useSpecialFloats,
unsigned int precision, PrecisionType precisionType) {
// Print into the buffer. We need not request the alternative representation
// that always has a decimal point because JSON doesn't distinguish the
// concepts of reals and integers.
@@ -168,8 +166,7 @@ String valueToString(double value,
}
} // namespace
String valueToString(double value,
unsigned int precision,
String valueToString(double value, unsigned int precision,
PrecisionType precisionType) {
return valueToString(value, false, precision, precisionType);
}
@@ -267,7 +264,8 @@ static String toHex16Bit(unsigned int x) {
return result;
}
static String valueToQuotedStringN(const char* value, unsigned length) {
static String valueToQuotedStringN(const char* value, unsigned length,
bool emitUTF8 = false) {
if (value == nullptr)
return "";
@@ -313,21 +311,31 @@ static String valueToQuotedStringN(const char* value, unsigned length) {
// Should add a flag to allow this compatibility mode and prevent this
// sequence from occurring.
default: {
unsigned int cp = utf8ToCodepoint(c, end);
// don't escape non-control characters
// (short escape sequence are applied above)
if (cp < 0x80 && cp >= 0x20)
result += static_cast<char>(cp);
else if (cp < 0x10000) { // codepoint is in Basic Multilingual Plane
result += "\\u";
result += toHex16Bit(cp);
} else { // codepoint is not in Basic Multilingual Plane
// convert to surrogate pair first
cp -= 0x10000;
result += "\\u";
result += toHex16Bit((cp >> 10) + 0xD800);
result += "\\u";
result += toHex16Bit((cp & 0x3FF) + 0xDC00);
if (emitUTF8) {
result += *c;
} else {
unsigned int codepoint = utf8ToCodepoint(c, end);
const unsigned int FIRST_NON_CONTROL_CODEPOINT = 0x20;
const unsigned int LAST_NON_CONTROL_CODEPOINT = 0x7F;
const unsigned int FIRST_SURROGATE_PAIR_CODEPOINT = 0x10000;
// don't escape non-control characters
// (short escape sequences are applied above)
if (FIRST_NON_CONTROL_CODEPOINT <= codepoint &&
codepoint <= LAST_NON_CONTROL_CODEPOINT) {
result += static_cast<char>(codepoint);
} else if (codepoint <
FIRST_SURROGATE_PAIR_CODEPOINT) { // codepoint is in Basic
// Multilingual Plane
result += "\\u";
result += toHex16Bit(codepoint);
} else { // codepoint is not in Basic Multilingual Plane
// convert to surrogate pair first
codepoint -= FIRST_SURROGATE_PAIR_CODEPOINT;
result += "\\u";
result += toHex16Bit((codepoint >> 10) + 0xD800);
result += "\\u";
result += toHex16Bit((codepoint & 0x3FF) + 0xDC00);
}
}
} break;
}
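A worked example of the surrogate-pair arithmetic above for one codepoint outside the Basic Multilingual Plane (standalone, not library code):

```
#include <cstdio>

int main() {
  unsigned int codepoint = 0x1F600;  // "grinning face" emoji
  codepoint -= 0x10000;              // 0x0F600
  const unsigned int hi = (codepoint >> 10) + 0xD800;   // 0xD83D
  const unsigned int lo = (codepoint & 0x3FF) + 0xDC00; // 0xDE00
  std::printf("\\u%04X\\u%04X\n", hi, lo);  // prints \uD83D\uDE00
  return 0;
}
```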
@@ -864,13 +872,10 @@ struct CommentStyle {
};
struct BuiltStyledStreamWriter : public StreamWriter {
BuiltStyledStreamWriter(String indentation,
CommentStyle::Enum cs,
String colonSymbol,
String nullSymbol,
String endingLineFeedSymbol,
bool useSpecialFloats,
unsigned int precision,
BuiltStyledStreamWriter(String indentation, CommentStyle::Enum cs,
String colonSymbol, String nullSymbol,
String endingLineFeedSymbol, bool useSpecialFloats,
bool emitUTF8, unsigned int precision,
PrecisionType precisionType);
int write(Value const& root, OStream* sout) override;
@@ -887,7 +892,7 @@ private:
void writeCommentAfterValueOnSameLine(Value const& root);
static bool hasCommentForValue(const Value& value);
typedef std::vector<String> ChildValues;
using ChildValues = std::vector<String>;
ChildValues childValues_;
String indentString_;
@@ -900,23 +905,20 @@ private:
bool addChildValues_ : 1;
bool indented_ : 1;
bool useSpecialFloats_ : 1;
bool emitUTF8_ : 1;
unsigned int precision_;
PrecisionType precisionType_;
};
BuiltStyledStreamWriter::BuiltStyledStreamWriter(String indentation,
CommentStyle::Enum cs,
String colonSymbol,
String nullSymbol,
String endingLineFeedSymbol,
bool useSpecialFloats,
unsigned int precision,
PrecisionType precisionType)
BuiltStyledStreamWriter::BuiltStyledStreamWriter(
String indentation, CommentStyle::Enum cs, String colonSymbol,
String nullSymbol, String endingLineFeedSymbol, bool useSpecialFloats,
bool emitUTF8, unsigned int precision, PrecisionType precisionType)
: rightMargin_(74), indentation_(std::move(indentation)), cs_(cs),
colonSymbol_(std::move(colonSymbol)), nullSymbol_(std::move(nullSymbol)),
endingLineFeedSymbol_(std::move(endingLineFeedSymbol)),
addChildValues_(false), indented_(false),
useSpecialFloats_(useSpecialFloats), precision_(precision),
precisionType_(precisionType) {}
useSpecialFloats_(useSpecialFloats), emitUTF8_(emitUTF8),
precision_(precision), precisionType_(precisionType) {}
int BuiltStyledStreamWriter::write(Value const& root, OStream* sout) {
sout_ = sout;
addChildValues_ = false;
@@ -953,7 +955,8 @@ void BuiltStyledStreamWriter::writeValue(Value const& value) {
char const* end;
bool ok = value.getString(&str, &end);
if (ok)
pushValue(valueToQuotedStringN(str, static_cast<unsigned>(end - str)));
pushValue(valueToQuotedStringN(str, static_cast<unsigned>(end - str),
emitUTF8_));
else
pushValue("");
break;
@@ -977,7 +980,7 @@ void BuiltStyledStreamWriter::writeValue(Value const& value) {
Value const& childValue = value[name];
writeCommentBeforeValue(childValue);
writeWithIndent(valueToQuotedStringN(
name.data(), static_cast<unsigned>(name.length())));
name.data(), static_cast<unsigned>(name.length()), emitUTF8_));
*sout_ << colonSymbol_;
writeValue(childValue);
if (++it == members.end()) {
@@ -1153,12 +1156,13 @@ StreamWriter::Factory::~Factory() = default;
StreamWriterBuilder::StreamWriterBuilder() { setDefaults(&settings_); }
StreamWriterBuilder::~StreamWriterBuilder() = default;
StreamWriter* StreamWriterBuilder::newStreamWriter() const {
String indentation = settings_["indentation"].asString();
String cs_str = settings_["commentStyle"].asString();
String pt_str = settings_["precisionType"].asString();
bool eyc = settings_["enableYAMLCompatibility"].asBool();
bool dnp = settings_["dropNullPlaceholders"].asBool();
bool usf = settings_["useSpecialFloats"].asBool();
const String indentation = settings_["indentation"].asString();
const String cs_str = settings_["commentStyle"].asString();
const String pt_str = settings_["precisionType"].asString();
const bool eyc = settings_["enableYAMLCompatibility"].asBool();
const bool dnp = settings_["dropNullPlaceholders"].asBool();
const bool usf = settings_["useSpecialFloats"].asBool();
const bool emitUTF8 = settings_["emitUTF8"].asBool();
unsigned int pre = settings_["precision"].asUInt();
CommentStyle::Enum cs = CommentStyle::All;
if (cs_str == "All") {
@@ -1190,7 +1194,7 @@ StreamWriter* StreamWriterBuilder::newStreamWriter() const {
pre = 17;
String endingLineFeedSymbol;
return new BuiltStyledStreamWriter(indentation, cs, colonSymbol, nullSymbol,
endingLineFeedSymbol, usf, pre,
endingLineFeedSymbol, usf, emitUTF8, pre,
precisionType);
}
static void getValidWriterKeys(std::set<String>* valid_keys) {
@@ -1200,6 +1204,7 @@ static void getValidWriterKeys(std::set<String>* valid_keys) {
valid_keys->insert("enableYAMLCompatibility");
valid_keys->insert("dropNullPlaceholders");
valid_keys->insert("useSpecialFloats");
valid_keys->insert("emitUTF8");
valid_keys->insert("precision");
valid_keys->insert("precisionType");
}
@@ -1231,6 +1236,7 @@ void StreamWriterBuilder::setDefaults(Json::Value* settings) {
(*settings)["enableYAMLCompatibility"] = false;
(*settings)["dropNullPlaceholders"] = false;
(*settings)["useSpecialFloats"] = false;
(*settings)["emitUTF8"] = false;
(*settings)["precision"] = 17;
(*settings)["precisionType"] = "significant";
//! [StreamWriterBuilderDefaults]
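A usage sketch of the new `emitUTF8` writer setting introduced in this diff; the string literal assumes a UTF-8 encoded source file:

```
#include <json/json.h>
#include <iostream>

int main() {
  Json::Value v;
  v["greeting"] = "grüße";

  Json::StreamWriterBuilder escaped;  // default: emitUTF8 == false
  Json::StreamWriterBuilder raw;
  raw["emitUTF8"] = true;

  std::cout << Json::writeString(escaped, v) << "\n";  // ü/ß escaped as \uXXXX
  std::cout << Json::writeString(raw, v) << "\n";      // raw UTF-8 bytes kept
  return 0;
}
```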


@@ -1,22 +0,0 @@
// DO NOT EDIT. This file (and "version") is a template used by the build system
// (either CMake or Meson) to generate a "version.h" header file.
#ifndef JSON_VERSION_H_INCLUDED
#define JSON_VERSION_H_INCLUDED
#define JSONCPP_VERSION_STRING "@JSONCPP_VERSION@"
#define JSONCPP_VERSION_MAJOR @JSONCPP_VERSION_MAJOR@
#define JSONCPP_VERSION_MINOR @JSONCPP_VERSION_MINOR@
#define JSONCPP_VERSION_PATCH @JSONCPP_VERSION_PATCH@
#define JSONCPP_VERSION_QUALIFIER
#define JSONCPP_VERSION_HEXA ((JSONCPP_VERSION_MAJOR << 24) \
| (JSONCPP_VERSION_MINOR << 16) \
| (JSONCPP_VERSION_PATCH << 8))
#ifdef JSONCPP_USING_SECURE_MEMORY
#undef JSONCPP_USING_SECURE_MEMORY
#endif
#define JSONCPP_USING_SECURE_MEMORY @JSONCPP_USE_SECURE_MEMORY@
// If non-zero, the library zeroes any memory that it has allocated before
// it frees its memory.
#endif // JSON_VERSION_H_INCLUDED


@@ -10,7 +10,11 @@ add_executable( jsoncpp_test
if(BUILD_SHARED_LIBS)
if(CMAKE_VERSION VERSION_GREATER_EQUAL 3.12.0)
add_compile_definitions( JSON_DLL )
else()
add_definitions( -DJSON_DLL )
endif()
endif()
target_link_libraries(jsoncpp_test jsoncpp_lib)


@@ -23,8 +23,12 @@ extern "C" int LLVMFuzzerTestOneInput(const uint8_t* data, size_t size) {
return 0;
}
uint32_t hash_settings = *(const uint32_t*)data;
const uint32_t hash_settings = static_cast<uint32_t>(data[0]) |
(static_cast<uint32_t>(data[1]) << 8) |
(static_cast<uint32_t>(data[2]) << 16) |
(static_cast<uint32_t>(data[3]) << 24);
data += sizeof(uint32_t);
size -= sizeof(uint32_t);
builder.settings_["failIfExtra"] = hash_settings & (1 << 0);
builder.settings_["allowComments_"] = hash_settings & (1 << 1);
@@ -35,6 +39,7 @@ extern "C" int LLVMFuzzerTestOneInput(const uint8_t* data, size_t size) {
builder.settings_["failIfExtra_"] = hash_settings & (1 << 6);
builder.settings_["rejectDupKeys_"] = hash_settings & (1 << 7);
builder.settings_["allowSpecialFloats_"] = hash_settings & (1 << 8);
builder.settings_["collectComments"] = hash_settings & (1 << 9);
std::unique_ptr<Json::CharReader> reader(builder.newCharReader());


@@ -0,0 +1,54 @@
#
# AFL dictionary for JSON
# -----------------------
#
# Just the very basics.
#
# Inspired by a dictionary by Jakub Wilk <jwilk@jwilk.net>
#
# https://github.com/rc0r/afl-fuzz/blob/master/dictionaries/json.dict
#
"0"
",0"
":0"
"0:"
"-1.2e+3"
"true"
"false"
"null"
"\"\""
",\"\""
":\"\""
"\"\":"
"{}"
",{}"
":{}"
"{\"\":0}"
"{{}}"
"[]"
",[]"
":[]"
"[0]"
"[[]]"
"''"
"\\"
"\\b"
"\\f"
"\\n"
"\\r"
"\\t"
"\\u0000"
"\\x00"
"\\0"
"\\uD800\\uDC00"
"\\uDBFF\\uDFFF"
"\"\":0"
"//"
"/**/"


@@ -82,8 +82,8 @@ TestResult::TestResult() {
void TestResult::setTestName(const Json::String& name) { name_ = name; }
TestResult&
TestResult::addFailure(const char* file, unsigned int line, const char* expr) {
TestResult& TestResult::addFailure(const char* file, unsigned int line,
const char* expr) {
/// Walks the PredicateContext stack adding them to failures_ if not already
/// added.
unsigned int nestingLevel = 0;
@@ -107,10 +107,8 @@ TestResult::addFailure(const char* file, unsigned int line, const char* expr) {
return *this;
}
void TestResult::addFailureInfo(const char* file,
unsigned int line,
const char* expr,
unsigned int nestingLevel) {
void TestResult::addFailureInfo(const char* file, unsigned int line,
const char* expr, unsigned int nestingLevel) {
Failure failure;
failure.file_ = file;
failure.line_ = line;
@@ -342,8 +340,8 @@ int Runner::runCommandLine(int argc, const char* argv[]) const {
#if defined(_MSC_VER) && defined(_DEBUG)
// Hook MSVCRT assertions to prevent dialog from appearing
static int
msvcrtSilentReportHook(int reportType, char* message, int* /*returnValue*/) {
static int msvcrtSilentReportHook(int reportType, char* message,
int* /*returnValue*/) {
// The default CRT handling of error and assertion is to display
// an error dialog to the user.
// Instead, when an error or an assertion occurs, we force the
@@ -418,12 +416,9 @@ Json::String ToJsonString(std::string in) {
}
#endif
TestResult& checkStringEqual(TestResult& result,
const Json::String& expected,
const Json::String& actual,
const char* file,
unsigned int line,
const char* expr) {
TestResult& checkStringEqual(TestResult& result, const Json::String& expected,
const Json::String& actual, const char* file,
unsigned int line, const char* expr) {
if (expected != actual) {
result.addFailure(file, line, expr);
result << "Expected: '" << expected << "'\n";


@@ -8,6 +8,7 @@
#include <cstdio>
#include <deque>
#include <iomanip>
#include <json/config.h>
#include <json/value.h>
#include <json/writer.h>
@@ -68,8 +69,8 @@ public:
void setTestName(const Json::String& name);
/// Adds an assertion failure.
TestResult&
addFailure(const char* file, unsigned int line, const char* expr = nullptr);
TestResult& addFailure(const char* file, unsigned int line,
const char* expr = nullptr);
/// Removes the last PredicateContext added to the predicate stack
/// chained list.
@@ -83,9 +84,7 @@ public:
// Generic operator that will work with anything ostream can deal with.
template <typename T> TestResult& operator<<(const T& value) {
Json::OStringStream oss;
oss.precision(16);
oss.setf(std::ios_base::floatfield);
oss << value;
oss << std::setprecision(16) << std::hexfloat << value;
return addToLastFailure(oss.str());
}
@@ -98,9 +97,7 @@ public:
private:
TestResult& addToLastFailure(const Json::String& message);
/// Adds a failure or a predicate context
void addFailureInfo(const char* file,
unsigned int line,
const char* expr,
void addFailureInfo(const char* file, unsigned int line, const char* expr,
unsigned int nestingLevel);
static Json::String indentText(const Json::String& text,
const Json::String& indent);
@@ -176,12 +173,8 @@ private:
};
template <typename T, typename U>
TestResult& checkEqual(TestResult& result,
T expected,
U actual,
const char* file,
unsigned int line,
const char* expr) {
TestResult& checkEqual(TestResult& result, T expected, U actual,
const char* file, unsigned int line, const char* expr) {
if (static_cast<U>(expected) != actual) {
result.addFailure(file, line, expr);
result << "Expected: " << static_cast<U>(expected) << "\n";
@@ -196,12 +189,9 @@ Json::String ToJsonString(Json::String in);
Json::String ToJsonString(std::string in);
#endif
TestResult& checkStringEqual(TestResult& result,
const Json::String& expected,
const Json::String& actual,
const char* file,
unsigned int line,
const char* expr);
TestResult& checkStringEqual(TestResult& result, const Json::String& expected,
const Json::String& actual, const char* file,
unsigned int line, const char* expr);
} // namespace JsonTest
@@ -273,4 +263,26 @@ TestResult& checkStringEqual(TestResult& result,
#define JSONTEST_REGISTER_FIXTURE(runner, FixtureType, name) \
(runner).add(JSONTEST_FIXTURE_FACTORY(FixtureType, name))
/// \brief Begin a fixture test case.
#define JSONTEST_FIXTURE_V2(FixtureType, name, collections) \
class Test##FixtureType##name : public FixtureType { \
public: \
static JsonTest::TestCase* factory() { \
return new Test##FixtureType##name(); \
} \
static bool collect() { \
(collections).push_back(JSONTEST_FIXTURE_FACTORY(FixtureType, name)); \
return true; \
} \
\
public: /* overridden from TestCase */ \
const char* testName() const override { return #FixtureType "/" #name; } \
void runTestCase() override; \
}; \
\
static bool test##FixtureType##name##collect = \
Test##FixtureType##name::collect(); \
\
void Test##FixtureType##name::runTestCase()
#endif // ifndef JSONTEST_H_INCLUDED
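A usage sketch of the new macro; the fixture type and the factory collection (`localTests`) are illustrative names rather than part of the library, and the registration flow simply follows the macro body above:

```
#include "jsontest.h"
#include <vector>

struct ValueTest : JsonTest::TestCase {};

// Collection the macro pushes the test factory into at static init time.
static std::vector<JsonTest::TestCaseFactory> localTests;

JSONTEST_FIXTURE_V2(ValueTest, simpleAppend, localTests) {
  Json::Value arr;
  arr.append(1);
  JSONTEST_ASSERT_EQUAL(1u, arr.size());
}

int main(int argc, const char* argv[]) {
  JsonTest::Runner runner;
  for (JsonTest::TestCaseFactory factory : localTests)
    runner.add(factory);
  return runner.runCommandLine(argc, argv);
}
```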

File diff suppressed because it is too large.

Some files were not shown because too many files have changed in this diff.