some more doc cleanup

Vadim Pisarevsky
2011-03-03 07:29:55 +00:00
parent 4e6572acd9
commit f025e4739a
39 changed files with 1531 additions and 889 deletions

@@ -98,7 +98,7 @@ Boosting training parameters. ::
CvBoostParams( int boost_type, int weak_count, double weight_trim_rate,
int max_depth, bool use_surrogates, const float* priors );
};
..
The structure is derived from :ref:`CvDTreeParams`, but not all of the decision tree parameters are supported. In particular, cross-validation is not supported.
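For example, a minimal sketch of filling the structure via the constructor shown above (the values are illustrative assumptions, not the library defaults), building a Real AdaBoost ensemble of 100 depth-1 trees: ::

    CvBoostParams params( CvBoost::REAL,  // boost_type
                          100,            // weak_count
                          0.95,           // weight_trim_rate
                          1,              // max_depth (decision stumps)
                          false,          // use_surrogates
                          0 );            // priors (none)

..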
@@ -130,7 +130,7 @@ Weak tree classifier. ::
...
CvBoost* ensemble;
};
..
The weak classifier, a component of the boosted tree classifier :ref:`CvBoost`, is a derivative of
@@ -204,7 +204,7 @@ Boosted tree classifier. ::
CvSeq* weak;
...
};
..
.. index:: CvBoost::train

@@ -89,7 +89,7 @@ Decision tree node split. ::
ord;
};
};
..
.. index:: CvDTreeNode
@@ -117,7 +117,7 @@ Decision tree node. ::
int depth;
...
};
..
Numerous other fields of ``CvDTreeNode`` are used internally at the training stage.
@@ -154,7 +154,7 @@ Decision tree training parameters. ::
bool _use_1se_rule, bool _truncate_pruned_tree,
const float* _priors );
};
..
The structure contains all the decision tree training parameters. There is a default constructor that initializes all the parameters with the default values tuned for a standalone classification tree. Any of the parameters can then be overridden, or the structure can be fully initialized using the advanced variant of the constructor.
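For example, a minimal sketch of the override pattern described above (the values are illustrative assumptions): ::

    CvDTreeParams params;          // defaults for a standalone tree
    params.max_depth = 8;          // limit the tree depth
    params.min_sample_count = 10;  // do not split very small nodes
    params.cv_folds = 0;           // disable built-in cross-validation

..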
@@ -260,7 +260,7 @@ Decision tree training data and shared data for tree ensembles. ::
CvRNG rng;
};
..
This structure is mostly used internally for storing both standalone trees and tree ensembles efficiently. Basically, it contains three types of information:
@@ -368,7 +368,7 @@ Decision tree. ::
CvDTreeTrainData* data;
};
..
.. index:: CvDTree::train

@@ -124,7 +124,7 @@ Parameters of the EM algorithm. ::
const CvMat** covs;
CvTermCriteria term_crit;
};
..
The structure has two constructors: the default one represents a rough rule of thumb, while the other makes it possible to override a variety of parameters, from the number of mixtures (the only essential problem-dependent parameter) to the initial values for the mixture parameters.
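For example, a minimal sketch of the second pattern, starting from the defaults and overriding only a couple of fields (the values are illustrative assumptions): ::

    CvEMParams params;                             // rule-of-thumb defaults
    params.nclusters = 4;                          // number of mixtures
    params.cov_mat_type = CvEM::COV_MAT_DIAGONAL;  // diagonal covariances

..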
@@ -186,7 +186,7 @@ EM model. ::
CvMat* inv_eigen_values;
CvMat** cov_rotate_mats;
};
..
.. index:: CvEM::train
@@ -311,5 +311,5 @@ Example: Clustering random samples of multi-Gaussian distribution using EM ::
cvReleaseMat( &labels );
return 0;
}
..

@@ -43,7 +43,7 @@ K Nearest Neighbors model. ::
protected:
...
};
..
.. index:: CvKNearest::train
@@ -164,5 +164,5 @@ If only a single input vector is passed, all output matrices are optional and th
cvReleaseMat( &trainData );
return 0;
}
..

@@ -121,7 +121,7 @@ Parameters of the MLP training algorithm. ::
double rp_dw0, rp_dw_plus, rp_dw_minus, rp_dw_min, rp_dw_max;
};
..
The structure has a default constructor that initializes the parameters for the ``RPROP`` algorithm. There is also a more advanced constructor that can be used to customize the parameters and/or choose the backpropagation algorithm. Finally, the individual parameters can be adjusted after the structure is created.
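For example, a minimal sketch of switching from the default ``RPROP`` setup to backpropagation (the rates are illustrative assumptions): ::

    CvANN_MLP_TrainParams params;  // RPROP with default parameters
    params.train_method = CvANN_MLP_TrainParams::BACKPROP;
    params.bp_dw_scale = 0.1;      // gradient step scale
    params.bp_moment_scale = 0.1;  // momentum term

..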
@@ -212,7 +212,7 @@ MLP model. ::
CvRNG rng;
};
..
Unlike many other models in ML that are constructed and trained at once, the MLP model separates these steps. First, a network with the specified topology is created using the non-default constructor or the method ``create``. All the weights are set to zeros. Then the network is trained using a set of input and output vectors. The training procedure can be repeated more than once, that is, the weights can be adjusted based on the new training data.
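A minimal sketch of this two-step workflow (``trainData`` and ``responses`` are assumed to be pre-filled ``CvMat`` matrices; the topology is illustrative): ::

    int layer_sz[] = { 2, 5, 1 };   // 2 inputs, 5 hidden, 1 output
    CvMat layer_sizes = cvMat( 1, 3, CV_32SC1, layer_sz );
    CvANN_MLP mlp;
    mlp.create( &layer_sizes );            // step 1: topology, zero weights
    mlp.train( trainData, responses, 0 );  // step 2: training
    // later: adjust the existing weights using new data
    mlp.train( trainData, responses, 0, 0, CvANN_MLP_TrainParams(),
               CvANN_MLP::UPDATE_WEIGHTS );

..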

@@ -40,7 +40,7 @@ Bayes classifier for normally distributed data. ::
protected:
...
};
..
.. index:: CvNormalBayesClassifier::train

@@ -76,7 +76,7 @@ Training Parameters of Random Trees. ::
int _nactive_vars, int max_tree_count,
float forest_accuracy, int termcrit_type );
};
..
The set of training parameters for the forest is a superset of the training parameters for a single tree. However, random trees do not need all the functionality/features of decision trees. Most notably, the trees are not pruned, so the cross-validation parameters are not used.
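For example, a minimal sketch of tuning the ensemble-level knobs on top of the tree-level defaults (the values are illustrative assumptions): ::

    CvRTParams params;                  // tree-level defaults
    params.calc_var_importance = true;  // estimate variable importance
    params.nactive_vars = 4;            // variables tried at each split
    params.term_crit = cvTermCriteria( CV_TERMCRIT_ITER, 100, 0 );  // 100 trees

..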
@@ -128,7 +128,7 @@ Random Trees. ::
int nclasses;
...
};
..
.. index:: CvRTrees::train
@@ -295,5 +295,5 @@ Example: Prediction of mushroom goodness using random trees classifier ::
cvReleaseFileStorage(&storage);
return 0;
}
..

@@ -38,7 +38,7 @@ Base class for the statistical models in ML. ::
virtual void write( CvFileStorage* storage, const char* name )=0;
virtual void read( CvFileStorage* storage, CvFileNode* node )=0;
};
..
In this declaration, some methods are commented out. These are the methods for which there is no unified API (with the exception of the default constructor). However, there are many similarities in their syntax and semantics, which are briefly described below in this section as if they were a part of the base class.
@@ -85,7 +85,7 @@ The destructor of the base class is declared as virtual, so it is safe to write
model = new CvDTree(... /* Decision tree params */);
...
delete model;
..
Normally, the destructor of each derived class does nothing, but in this instance it calls the overridden method ``clear()`` that deallocates all the memory.

@@ -82,7 +82,7 @@ Support Vector Machines. ::
protected:
...
};
..
.. index:: CvSVMParams
@@ -114,7 +114,7 @@ SVM training parameters. ::
CvMat* class_weights; // for CV_SVM_C_SVC
CvTermCriteria term_crit; // termination criteria
};
..
The structure must be initialized and passed to the training method of :ref:`CvSVM`.
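For example, a minimal sketch of initializing the structure and passing it to the training method (``trainData`` and ``responses`` are assumed to be pre-filled matrices; the kernel values are illustrative assumptions): ::

    CvSVMParams params;
    params.svm_type = CvSVM::C_SVC;
    params.kernel_type = CvSVM::RBF;
    params.gamma = 0.5;
    params.C = 10;
    params.term_crit = cvTermCriteria( CV_TERMCRIT_ITER, 1000, 1e-6 );

    CvSVM svm;
    svm.train( trainData, responses, 0, 0, params );

..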
@@ -199,7 +199,6 @@ CvSVM::get_default_grid
* **CvSVM::COEF**
* **CvSVM::DEGREE**
.
The grid will be generated for the parameter with this ID.