Mirror of https://github.com/open-source-parsers/jsoncpp.git (synced 2024-12-13 10:22:55 +01:00)
- Documentation generation is no longer handled by SCons. The script doxybuild.py is used to generate the documentation on demand.
- Added file 'version' that contains the jsoncpp version number. It is used by both SConstruct and doxybuild.py.
- Updated README.txt with documentation build instructions, and instructions to add a test case.
This commit is contained in:
  parent 8d3790d217
  commit 57ee0e3b37

README.txt (74 lines changed)
@@ -1,10 +1,11 @@
 * Introduction:
+  =============
 
 JSON (JavaScript Object Notation) is a lightweight data-interchange format.
 It can represent integer, real number, string, an ordered sequence of
 value, and a collection of name/value pairs.
 
-JsonCpp is a simple API to manipulate JSON value, and handle serialization
+JsonCpp is a simple API to manipulate JSON value, handle serialization
 and unserialization to string.
 
 It can also preserve existing comment in unserialization/serialization steps,
@@ -12,7 +13,9 @@ making it a convenient format to store user input files.
 
 Unserialization parsing is user friendly and provides precise error reports.
 
 
 * Building/Testing:
+  =================
 
 JsonCpp uses Scons (http://www.scons.org) as a build system. Scons requires
 python to be installed (http://www.python.org).
@@ -39,15 +42,76 @@ to do so.
 
 and TARGET may be:
 check: build library and run unit tests.
-doc: build documentation
-doc-dist: build documentation tarball
 
-To run the test manually:
+* Running the test manually:
+  ==========================
 
 cd test
 # This will run the Reader/Writer tests
 python runjsontests.py "path to jsontest.exe"
 
+# This will run the Reader/Writer tests, using the JSONChecker test suite
+# (http://www.json.org/JSON_checker/).
+# Notes: not all tests pass: JsonCpp is too lenient (for example,
+# it allows an integer to start with '0'). The goal is to improve
+# strict mode parsing to get all tests to pass.
+python runjsontests.py --with-json-checker "path to jsontest.exe"
+
 # This will run the unit tests (mostly Value)
 python rununittests.py "path to test_lib_json.exe"
 
-You can run the tests using valgrind using:
+You can run the tests using valgrind:
 python rununittests.py --valgrind "path to test_lib_json.exe"
 
 
+* Building the documentation:
+  ===========================
+
+Run the python script doxybuild.py from the top directory:
+
+python doxybuild.py --open --with-dot
+
+See doxybuild.py --help for options.
+
+
+* Adding a reader/writer test:
+  ============================
+
+To add a test, you need to create two files in test/data:
+- a TESTNAME.json file, that contains the input document in JSON format.
+- a TESTNAME.expected file, that contains a flattened representation of
+  the input document.
+
+TESTNAME.expected file format:
+- each line represents a JSON element of the element tree represented
+  by the input document.
+- each line has two parts: the path to access the element, separated from
+  the element value by '='. Array and object values are always empty
+  (i.e. represented by either [] or {}).
+- element path: '.' represents the root element, and is used to separate
+  object members. [N] is used to specify the value of an array element
+  at index N.
+See test_complex_01.json and test_complex_01.expected to better understand
+element paths.
+
+
+* Understanding reader/writer test output:
+  ========================================
+
+When a test is run, output files are generated alongside the input test files.
+Below is a short description of the content of each file:
+
+- test_complex_01.json: input JSON document.
+- test_complex_01.expected: flattened JSON element tree used to check if
+  parsing was correct.
+
+- test_complex_01.actual: flattened JSON element tree produced by
+  jsontest.exe from reading test_complex_01.json.
+- test_complex_01.rewrite: JSON document written by jsontest.exe using the
+  Json::Value parsed from test_complex_01.json and serialized using
+  Json::StyledWriter.
+- test_complex_01.actual-rewrite: flattened JSON element tree produced by
+  jsontest.exe from reading test_complex_01.rewrite.
+- test_complex_01.process-output: jsontest.exe output, typically useful to
+  understand parsing errors.
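The flattened TESTNAME.expected format described in the README above can be illustrated with a small Python sketch. This helper is hypothetical (not part of the repository), and the exact scalar formatting produced by jsontest.exe may differ; it only shows the path/value convention: '.' for the root, '.name' for object members, '[N]' for array elements, and empty values for containers.

```python
import json

def flatten(value, path='.'):
    # Arrays and objects get an empty value ([] or {}); scalars are
    # written verbatim after '='.
    if isinstance(value, dict):
        lines = ['%s={}' % path]
        for name in sorted(value):
            member_path = ('' if path == '.' else path) + '.' + name
            lines += flatten(value[name], member_path)
    elif isinstance(value, list):
        lines = ['%s=[]' % path]
        for index, item in enumerate(value):
            lines += flatten(item, '%s[%d]' % (path, index))
    else:
        lines = ['%s=%s' % (path, json.dumps(value))]
    return lines

print('\n'.join(flatten(json.loads('{"count": 2, "items": ["a", "b"]}'))))
```

For the document above this prints one line per element, e.g. `.={}` for the root followed by `.count`, `.items`, `.items[0]` and `.items[1]` entries.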
SConstruct (77 lines changed)
@@ -1,79 +1,17 @@
 """
-Build system can be clean-up by sticking to a few core production factory, with automatic dependencies resolution.
-4 basic project productions:
-- library
-- binary
-- documentation
-- tests
-
-* Library:
-Input:
-- dependencies (other libraries)
-- headers: include path & files
-- sources
-- generated sources
-- resources
-- generated resources
-Production:
-- Static library
-- Dynamic library
-- Naming rule
-Life-cycle:
-- Library compilation
-- Compilation as a dependencies
-- Run-time
-- Packaging
-Identity:
-- Name
-- Version
-* Binary:
-Input:
-- dependencies (other libraries)
-- headers: include path & files (usually empty)
-- sources
-- generated sources
-- resources
-- generated resources
-- supported variant (optimized/debug, dll/static...)
-Production:
-- Binary executable
-- Manifest [on some platforms]
-- Debug symbol [on some platforms]
-Life-cycle:
-- Compilation
-- Run-time
-- Packaging
-Identity:
-- Name
-- Version
-* Documentation:
-Input:
-- dependencies (libraries, binaries)
-- additional sources
-- generated sources
-- resources
-- generated resources
-- supported variant (public/internal)
-Production:
-- HTML documentation
-- PDF documentation
-- CHM documentation
-Life-cycle:
-- Documentation
-- Packaging
-- Test
-Identity:
-- Name
-- Version
+Notes:
+- shared library support is buggy: it assumes that a static and dynamic
+  library can be build from the same object files. This is not true on
+  many platforms. For this reason it is only enabled on linux-gcc at the
+  current time.
+
+To add a platform:
+- add its name in options allowed_values below
+- add tool initialization for this platform. Search for
+  "if platform == 'suncc'" as an example.
 """
 
 import os
 import os.path
 import sys
 
-JSONCPP_VERSION = '0.2'
+JSONCPP_VERSION = open(File('#version').abspath,'rt').read().strip()
 DIST_DIR = '#dist'
 
 options = Variables()
@@ -174,8 +112,6 @@ else:
     print "UNSUPPORTED PLATFORM."
     env.Exit(1)
 
-env.Tool('doxygen')
-env.Tool('substinfile')
 env.Tool('targz')
 env.Tool('srcdist')
 env.Tool('globtool')
@@ -295,6 +231,5 @@ env.Alias( 'src-dist', srcdist_cmd )
 buildProjectInDirectory( 'src/jsontestrunner' )
 buildProjectInDirectory( 'src/lib_json' )
 buildProjectInDirectory( 'src/test_lib_json' )
-buildProjectInDirectory( 'doc' )
 #print env.Dump()
 
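The new JSONCPP_VERSION line in SConstruct above reads the shared 'version' file added by this commit; doxybuild.py uses the same `open(...).read().strip()` pattern. A minimal standalone sketch of that convention (the helper name is illustrative, not from the repository):

```python
def read_version(version_path):
    # Read the version string from the shared 'version' file, stripping
    # the trailing newline, as both SConstruct and doxybuild.py do.
    with open(version_path, 'rt') as f:
        return f.read().strip()
```

Keeping the version in one plain-text file lets both build scripts agree on it without importing each other.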
doc/doxyfile.in (1364 lines changed; file diff suppressed because it is too large)
@@ -1,61 +0,0 @@
-Import( 'env' )
-import os.path
-
-if 'doxygen' in env['TOOLS']:
-    doc_topdir = str(env['ROOTBUILD_DIR'])
-    html_dir = 'jsoncpp-api-doc'
-
-    doxygen_inputs = env.Glob( includes = '*.dox', dir = '#doc' ) \
-        + env.Glob( includes = '*.h', dir = '#include/json/' ) \
-        + env.Glob( includes = ('*.dox','*.h','*.inl','*.cpp'),
-                    dir = '#src/lib_json' )
-##    for p in doxygen_inputs:
-##        print p.abspath
-
-    top_dir = env.Dir('#').abspath
-    include_top_dir = env.Dir('#include').abspath
-    env['DOXYFILE_DICT'] = { 'PROJECT_NAME': 'JsonCpp',
-                             'PROJECT_NUMBER': env['JSONCPP_VERSION'],
-                             'STRIP_FROM_PATH': top_dir,
-                             'STRIP_FROM_INC_PATH': include_top_dir,
-                             'HTML_OUTPUT': html_dir,
-                             'HTML_HEADER': env.File('#doc/header.html').abspath,
-                             'HTML_FOOTER': env.File('#doc/footer.html').abspath,
-                             'INCLUDE_PATH': include_top_dir,
-                             'PREDEFINED': 'JSONCPP_DOC_EXCLUDE_IMPLEMENTATION JSON_VALUE_USE_INTERNAL_MAP'
-                             }
-    env['DOXYFILE_FILE'] = 'doxyfile.in'
-    doxfile_nodes = env.Doxyfile( os.path.join( doc_topdir, 'doxyfile' ), doxygen_inputs )
-    html_doc_path = os.path.join( doc_topdir, html_dir )
-    doc_nodes = env.Doxygen( source = doxfile_nodes,
-                             target = os.path.join( html_doc_path, 'index.html' ) )
-    alias_doc_cmd = env.Alias('doc', doc_nodes )
-    env.Alias('doc', env.Install( html_doc_path, '#README.txt' ) )
-    if 'TarGz' in env['BUILDERS']:
-        targz_path = os.path.join( env['DIST_DIR'], '%s.tar.gz' % html_dir )
-        zip_doc_cmd = env.TarGz( targz_path, [env.Dir(html_doc_path)],
-                                 TARGZ_BASEDIR = env['ROOTBUILD_DIR'] )
-        env.Depends( zip_doc_cmd, alias_doc_cmd )
-        env.Alias( 'doc-dist', zip_doc_cmd )
-##
-## doxyfile = env.SubstInFile( '#doc/doxyfile', 'doxyfile.in',
-##                             SUBST_DICT = {
-##                             '%JSONCPP_VERSION%' : env['JSONCPP_VERSION'],
-##                             '%TOPDIR%' : env.Dir('#').abspath,
-##                             '%DOC_TOPDIR%' : str(doc_topdir) } )
-## doc_cmd = env.Doxygen( doxyfile )
-## alias_doc_cmd = env.Alias('doc', doc_cmd )
-## env.AlwaysBuild(alias_doc_cmd)
-##
-## for dir in doc_cmd:
-##     env.Alias('doc', env.Install( '#' + dir.path, '#README.txt' ) )
-##     filename = os.path.split(dir.path)[1]
-##     targz_path = os.path.join( env['DIST_DIR'], '%s.tar.gz' % filename )
-##     zip_doc_cmd = env.TarGz( targz_path, [env.Dir(dir)],
-##                              TARGZ_BASEDIR = doc_topdir )
-##     env.Depends( zip_doc_cmd, alias_doc_cmd )
-##     env.Alias( 'doc-dist', zip_doc_cmd )
-##
-## # When doxyfile gets updated, I get errors on the first pass.
-## # I have to run scons twice. Something is wrong with the dependencies
-## # here, but I avoid it by running "scons doc/doxyfile" first.
doxybuild.py (new file, 191 lines)
@@ -0,0 +1,191 @@
+"""Script to generate doxygen documentation.
+"""
+
+import re
+import os
+import os.path
+import sys
+import shutil
+import gzip
+import tarfile
+
+TARGZ_DEFAULT_COMPRESSION_LEVEL = 9
+
+def make_tarball(tarball_path, sources, base_dir, prefix_dir=''):
+    """Parameters:
+    tarball_path: output path of the .tar.gz file
+    sources: list of sources to include in the tarball, relative to the current directory
+    base_dir: if a source file is in a sub-directory of base_dir, then base_dir is stripped
+        from path in the tarball.
+    prefix_dir: all files stored in the tarball be sub-directory of prefix_dir. Set to ''
+        to make them child of root.
+    """
+    base_dir = os.path.normpath( os.path.abspath( base_dir ) )
+    def archive_name( path ):
+        """Makes path relative to base_dir."""
+        path = os.path.normpath( os.path.abspath( path ) )
+        common_path = os.path.commonprefix( (base_dir, path) )
+        archive_name = path[len(common_path):]
+        if os.path.isabs( archive_name ):
+            archive_name = archive_name[1:]
+        return os.path.join( prefix_dir, archive_name )
+    def visit(tar, dirname, names):
+        for name in names:
+            path = os.path.join(dirname, name)
+            if os.path.isfile(path):
+                path_in_tar = archive_name(path)
+                tar.add(path, path_in_tar )
+    compression = TARGZ_DEFAULT_COMPRESSION_LEVEL
+    fileobj = gzip.GzipFile( tarball_path, 'wb', compression )
+    tar = tarfile.TarFile(os.path.splitext(tarball_path)[0], 'w', fileobj)
+    for source in sources:
+        source_path = source
+        if os.path.isdir( source ):
+            os.path.walk(source_path, visit, tar)
+        else:
+            path_in_tar = archive_name(source_path)
+            tar.add(source_path, path_in_tar )      # filename, arcname
+    tar.close()
+
+def find_program(filename):
+    """find a program in folders path_lst, and sets env[var]
+    @param env: environment
+    @param filename: name of the program to search for
+    @param path_list: list of directories to search for filename
+    @param var: environment value to be checked for in env or os.environ
+    @return: either the value that is referenced with [var] in env or os.environ
+        or the first occurrence filename or '' if filename could not be found
+    """
+    paths = os.environ.get('PATH', '').split(os.pathsep)
+    suffixes = ('win32' in sys.platform ) and '.exe .com .bat .cmd' or ''
+    for name in [filename+ext for ext in suffixes.split()]:
+        for directory in paths:
+            full_path = os.path.join(directory, name)
+            if os.path.isfile(full_path):
+                return full_path
+    return ''
+
+def do_subst_in_file(targetfile, sourcefile, dict):
+    """Replace all instances of the keys of dict with their values.
+    For example, if dict is {'%VERSION%': '1.2345', '%BASE%': 'MyProg'},
+    then all instances of %VERSION% in the file will be replaced with 1.2345 etc.
+    """
+    try:
+        f = open(sourcefile, 'rb')
+        contents = f.read()
+        f.close()
+    except:
+        print "Can't read source file %s"%sourcefile
+        raise
+    for (k,v) in dict.items():
+        v = v.replace('\\','\\\\')
+        contents = re.sub(k, v, contents)
+    try:
+        f = open(targetfile, 'wb')
+        f.write(contents)
+        f.close()
+    except:
+        print "Can't write target file %s"%targetfile
+        raise
+
+def run_doxygen(doxygen_path, config_file, working_dir):
+    config_file = os.path.abspath( config_file )
+    doxygen_path = doxygen_path
+    old_cwd = os.getcwd()
+    try:
+        os.chdir( working_dir )
+        cmd = [doxygen_path, config_file]
+        print ' '.join( cmd )
+        try:
+            import subprocess
+        except:
+            if os.system( ' '.join( cmd ) ) != 0:
+                print 'Documentation generation failed'
+                return False
+        else:
+            try:
+                subprocess.check_call( cmd )
+            except subprocess.CalledProcessError:
+                return False
+        return True
+    finally:
+        os.chdir( old_cwd )
+
+def main():
+    usage = """%prog
+    Generates doxygen documentation in build/doxygen.
+    Optionaly makes a tarball of the documentation to dist/.
+
+    Must be started in the project top directory.
+    """
+    from optparse import OptionParser
+    parser = OptionParser(usage=usage)
+    parser.allow_interspersed_args = False
+    parser.add_option('--with-dot', dest="with_dot", action='store_true', default=False,
+        help="""Enable usage of DOT to generate collaboration diagram""")
+    parser.add_option('--dot', dest="dot_path", action='store', default=find_program('dot'),
+        help="""Path to GraphViz dot tool. Must be full qualified path. [Default: %default]""")
+    parser.add_option('--doxygen', dest="doxygen_path", action='store', default=find_program('doxygen'),
+        help="""Path to Doxygen tool. [Default: %default]""")
+    parser.add_option('--with-html-help', dest="with_html_help", action='store_true', default=False,
+        help="""Enable generation of Microsoft HTML HELP""")
+    parser.add_option('--no-uml-look', dest="with_uml_look", action='store_false', default=True,
+        help="""Generates DOT graph without UML look [Default: False]""")
+    parser.add_option('--open', dest="open", action='store_true', default=False,
+        help="""Open the HTML index in the web browser after generation""")
+    parser.add_option('--tarball', dest="make_tarball", action='store_true', default=False,
+        help="""Generates a tarball of the documentation in dist/ directory""")
+    parser.enable_interspersed_args()
+    options, args = parser.parse_args()
+
+    version = open('version','rt').read().strip()
+    output_dir = '../build/doxygen' # relative to doc/doxyfile location.
+    top_dir = os.path.abspath( '.' )
+    html_output_dirname = 'jsoncpp-api-html-' + version
+    tarball_path = os.path.join( 'dist', html_output_dirname + '.tar.gz' )
+    warning_log_path = os.path.join( output_dir, '../jsoncpp-doxygen-warning.log' )
+    def yesno( bool ):
+        return bool and 'YES' or 'NO'
+    subst_keys = {
+        '%JSONCPP_VERSION%': version,
+        '%DOC_TOPDIR%': '',
+        '%TOPDIR%': top_dir,
+        '%HTML_OUTPUT%': os.path.join( output_dir, html_output_dirname ),
+        '%HAVE_DOT%': yesno(options.with_dot),
+        '%DOT_PATH%': os.path.split(options.dot_path)[0],
+        '%HTML_HELP%': yesno(options.with_html_help),
+        '%UML_LOOK%': yesno(options.with_uml_look),
+        '%WARNING_LOG_PATH%': warning_log_path
+        }
+
+    full_output_dir = os.path.join( 'doc', output_dir )
+    if os.path.isdir( full_output_dir ):
+        print 'Deleting directory:', full_output_dir
+        shutil.rmtree( full_output_dir )
+    if not os.path.isdir( full_output_dir ):
+        os.makedirs( full_output_dir )
+
+    do_subst_in_file( 'doc/doxyfile', 'doc/doxyfile.in', subst_keys )
+    ok = run_doxygen( options.doxygen_path, 'doc/doxyfile', 'doc' )
+    print open(os.path.join('doc', warning_log_path), 'rb').read()
+    if not ok:
+        print 'Doxygen generation failed'
+    index_path = os.path.abspath(os.path.join(subst_keys['%HTML_OUTPUT%'], 'index.html'))
+    print 'Generated documentation can be found in:'
+    print index_path
+    if options.open:
+        import webbrowser
+        webbrowser.open( 'file://' + index_path )
+    if options.make_tarball:
+        print 'Generating doc tarball to', tarball_path
+        tarball_sources = [
+            full_output_dir,
+            'README.txt',
+            'version'
+            ]
+        tarball_basedir = os.path.join( full_output_dir, html_output_dirname )
+        make_tarball( tarball_path, tarball_sources, tarball_basedir, html_output_dirname )
+
+if __name__ == '__main__':
+    main()
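The make_tarball helper above relies on Python 2 idioms (os.path.walk and a manual GzipFile/TarFile pairing). For comparison, the same technique can be sketched with today's tarfile API; this is an equivalent sketch, not part of the commit, and it assumes every source lies under base_dir (the original tolerates paths outside it via commonprefix):

```python
import os
import tarfile

def make_tarball_py3(tarball_path, sources, base_dir, prefix_dir=''):
    # Same idea as make_tarball above: store each source under prefix_dir
    # with base_dir stripped from its path. tarfile.add() recurses into
    # directories by itself, replacing the os.path.walk visitor.
    base_dir = os.path.normpath(os.path.abspath(base_dir))
    with tarfile.open(tarball_path, 'w:gz', compresslevel=9) as tar:
        for source in sources:
            rel = os.path.relpath(os.path.abspath(source), base_dir)
            tar.add(source, arcname=os.path.join(prefix_dir, rel))
```

The context manager closes both the gzip stream and the tar archive, so there is no separate fileobj to manage.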
@@ -1,116 +0,0 @@
-# Big issue:
-# emitter depends on doxyfile which is generated from doxyfile.in.
-# build fails after cleaning and relaunching the build.
-
-# Todo:
-# Add helper function to environment like for glob
-# Easier passage of header/footer
-# Automatic deduction of index.html path based on custom parameters passed to doxyfile
-
-import os
-import os.path
-from fnmatch import fnmatch
-import SCons
-
-def Doxyfile_emitter(target, source, env):
-    """
-    Modify the target and source lists to use the defaults if nothing
-    else has been specified.
-
-    Dependencies on external HTML documentation references are also
-    appended to the source list.
-    """
-    doxyfile_template = env.File(env['DOXYFILE_FILE'])
-    source.insert(0, doxyfile_template)
-
-    return target, source
-
-def Doxyfile_Builder(target, source, env):
-    """Input:
-    DOXYFILE_FILE
-        Path of the template file for the output doxyfile
-
-    DOXYFILE_DICT
-        A dictionnary of parameter to append to the generated doxyfile
-    """
-    subdir = os.path.split(source[0].abspath)[0]
-    doc_top_dir = os.path.split(target[0].abspath)[0]
-    doxyfile_path = source[0].abspath
-    doxy_file = file( target[0].abspath, 'wt' )
-    try:
-        # First, output the template file
-        try:
-            f = file(doxyfile_path, 'rt')
-            doxy_file.write( f.read() )
-            f.close()
-            doxy_file.write( '\n' )
-            doxy_file.write( '# Generated content:\n' )
-        except:
-            raise SCons.Errors.UserError, "Can't read doxygen template file '%s'" % doxyfile_path
-        # Then, the input files
-        doxy_file.write( 'INPUT = \\\n' )
-        for source in source:
-            if source.abspath != doxyfile_path: # skip doxyfile path, which is the first source
-                doxy_file.write( '"%s" \\\n' % source.abspath )
-        doxy_file.write( '\n' )
-        # Dot...
-        values_dict = { 'HAVE_DOT': env.get('DOT') and 'YES' or 'NO',
-                        'DOT_PATH': env.get('DOT') and os.path.split(env['DOT'])[0] or '',
-                        'OUTPUT_DIRECTORY': doc_top_dir,
-                        'WARN_LOGFILE': target[0].abspath + '-warning.log'}
-        values_dict.update( env['DOXYFILE_DICT'] )
-        # Finally, output user dictionary values which override any of the previously set parameters.
-        for key, value in values_dict.iteritems():
-            doxy_file.write ('%s = "%s"\n' % (key, str(value)))
-    finally:
-        doxy_file.close()
-
-def generate(env):
-    """
-    Add builders and construction variables for the
-    Doxygen tool.
-    """
-    ## Doxyfile builder
-    def doxyfile_message (target, source, env):
-        return "creating Doxygen config file '%s'" % target[0]
-
-    doxyfile_variables = [
-        'DOXYFILE_DICT',
-        'DOXYFILE_FILE'
-        ]
-
-    #doxyfile_action = SCons.Action.Action( Doxyfile_Builder, doxyfile_message,
-    #                                       doxyfile_variables )
-    doxyfile_action = SCons.Action.Action( Doxyfile_Builder, doxyfile_message)
-
-    doxyfile_builder = SCons.Builder.Builder( action = doxyfile_action,
-                                              emitter = Doxyfile_emitter )
-
-    env['BUILDERS']['Doxyfile'] = doxyfile_builder
-    env['DOXYFILE_DICT'] = {}
-    env['DOXYFILE_FILE'] = 'doxyfile.in'
-
-    ## Doxygen builder
-    def Doxygen_emitter(target, source, env):
-        output_dir = str( source[0].dir )
-        if str(target[0]) == str(source[0]):
-            target = env.File( os.path.join( output_dir, 'html', 'index.html' ) )
-        return target, source
-
-    doxygen_action = SCons.Action.Action( [ '$DOXYGEN_COM'] )
-    doxygen_builder = SCons.Builder.Builder( action = doxygen_action,
-                                             emitter = Doxygen_emitter )
-    env['BUILDERS']['Doxygen'] = doxygen_builder
-    env['DOXYGEN_COM'] = '$DOXYGEN $DOXYGEN_FLAGS $SOURCE'
-    env['DOXYGEN_FLAGS'] = ''
-    env['DOXYGEN'] = 'doxygen'
-
-    dot_path = env.WhereIs("dot")
-    if dot_path:
-        env['DOT'] = dot_path
-
-def exists(env):
-    """
-    Make sure doxygen exists.
-    """
-    return env.Detect("doxygen")
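The Doxyfile_Builder in the deleted tool above works by concatenating a template with generated settings: the template text first, then an INPUT file list, then key = "value" overrides (later assignments win in Doxygen). The core technique can be sketched outside SCons; the helper name and parameters here are illustrative, not from the commit:

```python
def render_doxyfile(template_text, input_paths, overrides):
    # Copy the template, then append the generated INPUT list (with
    # backslash line continuations) and key = "value" overrides,
    # mirroring what Doxyfile_Builder writes above.
    lines = [template_text, '', '# Generated content:']
    lines.append('INPUT = \\')
    for path in input_paths:
        lines.append('"%s" \\' % path)
    lines.append('')
    for key, value in sorted(overrides.items()):
        lines.append('%s = "%s"' % (key, value))
    return '\n'.join(lines) + '\n'
```

Appending overrides after the template is what lets the build inject OUTPUT_DIRECTORY, HAVE_DOT, and the DOXYFILE_DICT entries without editing doxyfile.in itself.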