The existing cast didn't make much sense: the expression is already
int, so there's little point in casting it to int, especially when
the result is assigned to a float.
This fixes warnings with MSVC.
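As a sketch of the pattern (the function and variable names here are
hypothetical, not the actual OpenH264 expression):

    void SetTarget (int iWidth, int iHeight) {
      // Before: the (int) cast is a no-op on an int expression, and
      // MSVC still warns about the implicit int-to-float conversion:
      //   float fTarget = (int) (iWidth * iHeight);
      // After: cast directly to the destination type instead.
      float fTarget = (float) (iWidth * iHeight);
      (void) fTarget;
    }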
The ForceIntraFrame test will fail (giving up after 100 skipped
frames) if the bitrate is not set high enough.
Set the minimum bitrate to w*h/50, which is a very low value, but
one that should still allow getting a non-skipped frame within a few
attempts.
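A minimal sketch of the clamping, with a hypothetical helper name;
only the w*h/50 floor comes from the change itself:

    static int ClampTargetBitrate (int iWidth, int iHeight,
                                   int iRequestedBitrate) {
      // w*h/50 is a very low floor, but high enough for the encoder
      // to produce a non-skipped frame within a few attempts.
      const int iMinBitrate = iWidth * iHeight / 50;
      return iRequestedBitrate < iMinBitrate ? iMinBitrate
                                             : iRequestedBitrate;
    }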
Both the pNalLengthInByte[] values that are accumulated and the
sFbi.iFrameSizeInBytes value that the sum is compared to are plain
'int', not 'uint32_t'.
This fixes warnings about comparisons between signed and unsigned
values.
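Sketched below with a hypothetical wrapper function; the identifiers
pNalLengthInByte and iFrameSizeInBytes are the real ones:

    // The accumulator used to be uint32_t, making the comparison at
    // the end signed vs. unsigned; plain int matches both operands.
    bool CheckFrameSize (const int* pNalLengthInByte, int iNalCount,
                         int iFrameSizeInBytes) {
      int iSum = 0;
      for (int i = 0; i < iNalCount; ++i)
        iSum += pNalLengthInByte[i];
      return iSum == iFrameSizeInBytes;
    }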
Make sure that pOptions is initialized to the parameters that
the codec actually uses, not the ones that we initially tried
to set.
Calling SetOption to update the codec parameters may reset the whole
codec if, e.g., the number of threads differs from what is set
within the codec itself.
If the codec changed the number of threads internally during
initialization (e.g. from 0 to 1), WelsEncoderParamAdjust may
conclude that the whole codec needs to be reset.
This fixes running EncoderInterfaceTest.TemporalLayerSettingTest
on machines where the detected number of cores is 1.
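A sketch of the idea using the public ISVCEncoder API; the wrapper
function is hypothetical, but GetOption with
ENCODER_OPTION_SVC_ENCODE_PARAM_EXT is how the applied parameters
can be read back:

    #include "codec_api.h"

    void ReadBackActualParams (ISVCEncoder* pPtrEnc,
                               SEncParamExt* pOptions) {
      SEncParamExt sParam;
      pPtrEnc->GetDefaultParams (&sParam);
      sParam.iMultipleThreadIdc = 0;  // 0 = auto; codec picks a count
      pPtrEnc->InitializeExt (&sParam);

      // Fetch the parameters the codec actually applied (e.g.
      // iMultipleThreadIdc may now be 1 instead of 0), so a later
      // SetOption with *pOptions doesn't look like a thread-count
      // change that forces a full codec reset.
      pPtrEnc->GetOption (ENCODER_OPTION_SVC_ENCODE_PARAM_EXT,
                          pOptions);
    }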
Previously, this test used whatever size had been set in m_iWidth
and m_iHeight before, which depended on the order in which the tests
were executed. When this test was the first one executed in
EncoderInterfaceTest, the width and height were set to the maximum.
Instead of having the test behaviour depend on the test order,
set a specific size, just as InitializeParamExt and MemoryCheckTest
do.
This reduces the runtime of TemporalLayerSettingTest from 86 seconds
to 26 seconds, when run in valgrind.
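A sketch of the shape of the fix; the fixture here is a minimal
stand-in (the real EncoderInterfaceTest carries much more state) and
the exact dimensions are illustrative:

    #include <gtest/gtest.h>

    // Minimal stand-in fixture for illustration only.
    class EncoderInterfaceTest : public ::testing::Test {
     protected:
      int m_iWidth;
      int m_iHeight;
    };

    TEST_F (EncoderInterfaceTest, TemporalLayerSettingTest) {
      // Pin the size up front instead of inheriting whatever a
      // previously run test left in m_iWidth/m_iHeight.
      m_iWidth  = 320;
      m_iHeight = 192;
      EXPECT_GT (m_iWidth * m_iHeight, 0);
    }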
There's little point in running the same test over and over for a
huge number of frames if the extra frames don't exercise anything
new.
This reduces the runtime of EncoderInterfaceTest.* from 322 seconds
to 140 seconds, when run in valgrind.