Restore severity precondition to logging.h.
I mistakenly omitted the checks when logging.h was ported from libjingle to webrtc. This caused a significant CPU cost for logs which were later filtered out anyway.

Verified with LS_VERBOSE logging in neteq4, running:

$ out/Release/modules_unittests \
    --gtest_filter=NetEqDecodingTest.TestBitExactness \
    --gtest_repeat=50 > time.txt
$ grep "case ran" time.txt | grep "[0-9]* ms" -o | sort

Results on a MacBook Retina, averaged over 5 runs:

Verbose logs disabled:                          666 ms
Existing implementation, verbose logs enabled:  944 ms (1.42x)
New implementation, verbose logs enabled:       673 ms (1.01x)

BUG=2314
R=henrik.lundin@webrtc.org, henrike@webrtc.org, kjellander@webrtc.org, turaj@webrtc.org

Review URL: https://webrtc-codereview.appspot.com/2160005

git-svn-id: http://webrtc.googlecode.com/svn/trunk@4682 4adac7df-926f-26a2-2b94-8c16560cd09d
@@ -145,7 +145,7 @@ int main(int argc, char* argv[]) {
   webrtc::Trace::CreateTrace();
   webrtc::Trace::SetTraceFile((webrtc::test::OutputPath() +
                                "neteq_trace.txt").c_str());
-  webrtc::Trace::SetLevelFilter(webrtc::kTraceAll);
+  webrtc::Trace::set_level_filter(webrtc::kTraceAll);

   // Initialize NetEq instance.
   int sample_rate_hz = 16000;