From 8f7ba03ed29284d86c68831996ec826692ba7bd6 Mon Sep 17 00:00:00 2001 From: Alexander Smorkalov Date: Fri, 14 Jun 2013 11:53:54 +0400 Subject: [PATCH 01/75] Some fixes for incorrectly documented parameters identified by rst_parser.py (Bug #1205) --- modules/core/doc/basic_structures.rst | 31 +++++++++++++++++-- modules/core/doc/clustering.rst | 8 +++++ modules/core/doc/drawing_functions.rst | 8 +++++ modules/core/doc/operations_on_arrays.rst | 2 ++ ...tility_and_system_functions_and_macros.rst | 9 ++++++ modules/core/doc/xml_yaml_persistence.rst | 11 +++++++ modules/core/include/opencv2/core/core.hpp | 2 -- 7 files changed, 67 insertions(+), 4 deletions(-) diff --git a/modules/core/doc/basic_structures.rst b/modules/core/doc/basic_structures.rst index acfbb911d..370587922 100644 --- a/modules/core/doc/basic_structures.rst +++ b/modules/core/doc/basic_structures.rst @@ -489,6 +489,9 @@ Various Ptr constructors. .. ocv:function:: Ptr::Ptr(_Tp* _obj) .. ocv:function:: Ptr::Ptr(const Ptr& ptr) + :param _obj: Object for copy. + :param ptr: Object for copy. + Ptr::~Ptr --------- The Ptr destructor. @@ -501,6 +504,8 @@ Assignment operator. .. ocv:function:: Ptr& Ptr::operator = (const Ptr& ptr) + :param ptr: Object for assignment. + Decrements own reference counter (with ``release()``) and increments ptr's reference counter. Ptr::addref @@ -1465,6 +1470,7 @@ Adds elements to the bottom of the matrix. .. ocv:function:: void Mat::push_back( const Mat& m ) :param elem: Added element(s). + :param m: Added line(s). The methods add one or more elements to the bottom of the matrix. They emulate the corresponding method of the STL vector class. When ``elem`` is ``Mat`` , its type and the number of columns must be the same as in the container matrix. @@ -2160,7 +2166,6 @@ Various SparseMat constructors. :param dims: Array dimensionality. :param _sizes: Sparce matrix size on all dementions. :param _type: Sparse matrix data type. - :param try1d: if try1d is true and matrix is a single-column matrix (Nx1), then the sparse matrix will be 1-dimensional. SparseMat::~SparseMat --------------------- @@ -2175,6 +2180,8 @@ Provides sparse matrix assignment operators. .. ocv:function:: SparseMat& SparseMat::operator = (const SparseMat& m) .. ocv:function:: SparseMat& SparseMat::operator = (const Mat& m) + :param m: Matrix for assignment. + The last variant is equivalent to the corresponding constructor with try1d=false. @@ -2202,6 +2209,10 @@ Convert sparse matrix with possible type change and scaling. .. ocv:function:: void SparseMat::convertTo( SparseMat& m, int rtype, double alpha=1 ) const .. ocv:function:: void SparseMat::convertTo( Mat& m, int rtype, double alpha=1, double beta=0 ) const + :param m: Destination matrix. + :param rtype: Destination matrix type. + :param alpha: Conversion multiplier. + The first version converts arbitrary sparse matrix to dense matrix and multiplies all the matrix elements by the specified scalar. The second versiob converts sparse matrix to dense matrix with optional type conversion and scaling. When rtype=-1, the destination element type will be the same as the sparse matrix element type. @@ -2294,7 +2305,7 @@ The method returns the number of matrix channels. SparseMat::size --------------- -Returns the array of sizes or matrix size by i dimention and 0 if the matrix is not allocated. +Returns the array of sizes or matrix size by i dimension and 0 if the matrix is not allocated. .. ocv:function:: const int* SparseMat::size() const .. 
ocv:function:: int SparseMat::size(int i) const @@ -2322,6 +2333,11 @@ Compute element hash value from the element indices. .. ocv:function:: size_t SparseMat::hash(int i0, int i1, int i2) const .. ocv:function:: size_t SparseMat::hash(const int* idx) const + :param i0: The first dimension index. + :param i1: The second dimension index. + :param i2: The third dimension index. + :param idx: Array of element indices for multidimensional matices. + SparseMat::ptr -------------- Low-level element-access functions, special variants for 1D, 2D, 3D cases, and the generic one for n-D case. @@ -2331,6 +2347,12 @@ Low-level element-access functions, special variants for 1D, 2D, 3D cases, and t .. ocv:function:: uchar* SparseMat::ptr(int i0, int i1, int i2, bool createMissing, size_t* hashval=0) .. ocv:function:: uchar* SparseMat::ptr(const int* idx, bool createMissing, size_t* hashval=0) + :param i0: The first dimension index. + :param i1: The second dimension index. + :param i2: The third dimension index. + :param idx: Array of element indices for multidimensional matices. + :param createMissing: Create new element with 0 value if it does not exist in SparseMat. + Return pointer to the matrix element. If the element is there (it is non-zero), the pointer to it is returned. If it is not there and ``createMissing=false``, NULL pointer is returned. If it is not there and ``createMissing=true``, the new elementis created and initialized with 0. Pointer to it is returned. If the optional hashval pointer is not ``NULL``, @@ -2344,6 +2366,11 @@ Erase the specified matrix element. When there is no such an element, the method .. ocv:function:: void SparseMat::erase(int i0, int i1, int i2, size_t* hashval=0) .. ocv:function:: void SparseMat::erase(const int* idx, size_t* hashval=0) + :param i0: The first dimension index. + :param i1: The second dimension index. + :param i2: The third dimension index. + :param idx: Array of element indices for multidimensional matices. + SparseMat\_ ----------- .. ocv:class:: SparseMat_ diff --git a/modules/core/doc/clustering.rst b/modules/core/doc/clustering.rst index 46130bc8f..f58e99ce2 100644 --- a/modules/core/doc/clustering.rst +++ b/modules/core/doc/clustering.rst @@ -17,12 +17,18 @@ Finds centers of clusters and groups input samples around the clusters. :param samples: Floating-point matrix of input samples, one row per sample. + :param data: Data for clustering. + :param cluster_count: Number of clusters to split the set by. + :param K: Number of clusters to split the set by. + :param labels: Input/output integer array that stores the cluster indices for every sample. :param criteria: The algorithm termination criteria, that is, the maximum number of iterations and/or the desired accuracy. The accuracy is specified as ``criteria.epsilon``. As soon as each of the cluster centers moves by less than ``criteria.epsilon`` on some iteration, the algorithm stops. + :param termcrit: The algorithm termination criteria, that is, the maximum number of iterations and/or the desired accuracy. + :param attempts: Flag to specify the number of times the algorithm is executed using different initial labellings. The algorithm returns the labels that yield the best compactness (see the last function parameter). :param rng: CvRNG state initialized by RNG(). @@ -37,6 +43,8 @@ Finds centers of clusters and groups input samples around the clusters. :param centers: Output matrix of the cluster centers, one row per each cluster center. 
+ :param _centers: Output matrix of the cluster centers, one row per each cluster center. + :param compactness: The returned value that is described below. The function ``kmeans`` implements a k-means algorithm that finds the diff --git a/modules/core/doc/drawing_functions.rst b/modules/core/doc/drawing_functions.rst index 24328f9a5..342301db9 100644 --- a/modules/core/doc/drawing_functions.rst +++ b/modules/core/doc/drawing_functions.rst @@ -234,6 +234,8 @@ Calculates the width and height of a text string. :param text: Input text string. + :param text_string: Input text string in C format. + :param fontFace: Font to use. See the :ocv:func:`putText` for details. :param fontScale: Font scale. See the :ocv:func:`putText` for details. @@ -242,6 +244,12 @@ Calculates the width and height of a text string. :param baseLine: Output parameter - y-coordinate of the baseline relative to the bottom-most text point. + :param baseline: Output parameter - y-coordinate of the baseline relative to the bottom-most text point. + + :param font: Font description in terms of old C API. + + :param text_size: Output parameter - The size of a box that contains the specified text. + The function ``getTextSize`` calculates and returns the size of a box that contains the specified text. That is, the following code renders some text, the tight box surrounding it, and the baseline: :: diff --git a/modules/core/doc/operations_on_arrays.rst b/modules/core/doc/operations_on_arrays.rst index d33844476..bd55993af 100644 --- a/modules/core/doc/operations_on_arrays.rst +++ b/modules/core/doc/operations_on_arrays.rst @@ -1062,6 +1062,8 @@ Returns the determinant of a square floating-point matrix. :param mtx: input matrix that must have ``CV_32FC1`` or ``CV_64FC1`` type and square size. + :param mat: input matrix that must have ``CV_32FC1`` or ``CV_64FC1`` type and square size. + The function ``determinant`` calculates and returns the determinant of the specified matrix. For small matrices ( ``mtx.cols=mtx.rows<=3`` ), the direct method is used. For larger matrices, the function uses LU factorization with partial pivoting. diff --git a/modules/core/doc/utility_and_system_functions_and_macros.rst b/modules/core/doc/utility_and_system_functions_and_macros.rst index 54198b058..41cf7e1b7 100644 --- a/modules/core/doc/utility_and_system_functions_and_macros.rst +++ b/modules/core/doc/utility_and_system_functions_and_macros.rst @@ -173,6 +173,8 @@ Checks a condition at runtime and throws exception if it fails .. ocv:function:: CV_Assert(expr) + :param expr: Expression for check. + The macros ``CV_Assert`` (and ``CV_DbgAssert``) evaluate the specified expression. If it is 0, the macros raise an error (see :ocv:func:`error` ). The macro ``CV_Assert`` checks the condition in both Debug and Release configurations while ``CV_DbgAssert`` is only retained in the Debug configuration. @@ -188,8 +190,14 @@ Signals an error and raises an exception. :param status: Error code. Normally, it is a negative value. The list of pre-defined error codes can be found in ``cxerror.h`` . + :param func_name: The function name where error occurs. + :param err_msg: Text of the error message. + :param file_name: The file name where error occurs. + + :param line: The line number where error occurs. + :param args: ``printf`` -like formatted error message in parentheses. The function and the helper macros ``CV_Error`` and ``CV_Error_``: :: @@ -249,6 +257,7 @@ Allocates an aligned memory buffer. .. 
ocv:cfunction:: void* cvAlloc( size_t size ) :param size: Allocated buffer size. + :param bufSize: Allocated buffer size. The function allocates the buffer of the specified size and returns it. When the buffer size is 16 bytes or more, the returned buffer is aligned to 16 bytes. diff --git a/modules/core/doc/xml_yaml_persistence.rst b/modules/core/doc/xml_yaml_persistence.rst index c7d55d01f..28bae2450 100644 --- a/modules/core/doc/xml_yaml_persistence.rst +++ b/modules/core/doc/xml_yaml_persistence.rst @@ -181,6 +181,17 @@ Opens a file. .. ocv:function:: bool FileStorage::open(const string& filename, int flags, const string& encoding=string()) + :param filename: Name of the file to open or the text string to read the data from. + Extension of the file (``.xml`` or ``.yml``/``.yaml``) determines its format (XML or YAML respectively). + Also you can append ``.gz`` to work with compressed files, for example ``myHugeMatrix.xml.gz``. + If both ``FileStorage::WRITE`` and ``FileStorage::MEMORY`` flags are specified, ``source`` + is used just to specify the output file format (e.g. ``mydata.xml``, ``.yml`` etc.). + + :param flags: Mode of operation. See FileStorage constructor for more details. + + :param encoding: Encoding of the file. Note that UTF-16 XML encoding is not supported currently and you should use 8-bit encoding instead of it. + + See description of parameters in :ocv:func:`FileStorage::FileStorage`. The method calls :ocv:func:`FileStorage::release` before opening the file. diff --git a/modules/core/include/opencv2/core/core.hpp b/modules/core/include/opencv2/core/core.hpp index 2b7791958..10210c511 100644 --- a/modules/core/include/opencv2/core/core.hpp +++ b/modules/core/include/opencv2/core/core.hpp @@ -3409,8 +3409,6 @@ public: //! converts dense 2d matrix to the sparse form /*! \param m the input matrix - \param try1d if true and m is a single-column matrix (Nx1), - then the sparse matrix will be 1-dimensional. */ explicit SparseMat(const Mat& m); //! converts old-style sparse matrix to the new-style. 
All the data is copied From 371a9cd8338e15f89b1c0ab6e46fe34fe377a553 Mon Sep 17 00:00:00 2001 From: Vladislav Vinogradov Date: Tue, 18 Jun 2013 17:46:57 +0400 Subject: [PATCH 02/75] fixed build with CUDA 5.5 on arm platforms --- cmake/OpenCVDetectCUDA.cmake | 40 ++++++++++++++++++++++++++++++++---- modules/gpu/CMakeLists.txt | 16 +++++++-------- 2 files changed, 44 insertions(+), 12 deletions(-) diff --git a/cmake/OpenCVDetectCUDA.cmake b/cmake/OpenCVDetectCUDA.cmake index 8db667762..3b93f2932 100644 --- a/cmake/OpenCVDetectCUDA.cmake +++ b/cmake/OpenCVDetectCUDA.cmake @@ -29,10 +29,42 @@ if(CUDA_FOUND) if(${CUDA_VERSION} VERSION_LESS "5.5") find_cuda_helper_libs(npp) else() - find_cuda_helper_libs(nppc) - find_cuda_helper_libs(nppi) - find_cuda_helper_libs(npps) - set(CUDA_npp_LIBRARY ${CUDA_nppc_LIBRARY} ${CUDA_nppi_LIBRARY} ${CUDA_npps_LIBRARY}) + # hack for CUDA 5.5 + if(${CMAKE_SYSTEM_PROCESSOR} STREQUAL "arm") + unset(CUDA_TOOLKIT_INCLUDE CACHE) + unset(CUDA_CUDART_LIBRARY CACHE) + unset(CUDA_cublas_LIBRARY CACHE) + unset(CUDA_cufft_LIBRARY CACHE) + unset(CUDA_npp_LIBRARY CACHE) + + if(SOFTFP) + set(cuda_arm_path "${CUDA_TOOLKIT_ROOT_DIR}/targets/armv7-linux-gnueabi") + else() + set(cuda_arm_path "${CUDA_TOOLKIT_ROOT_DIR}/targets/armv7-linux-gnueabihf") + endif() + + set(CUDA_TOOLKIT_INCLUDE "${cuda_arm_path}/include" CACHE PATH "include path") + set(CUDA_INCLUDE_DIRS ${CUDA_TOOLKIT_INCLUDE}) + + set(cuda_arm_library_path "${cuda_arm_path}/lib") + + set(CUDA_CUDART_LIBRARY "${cuda_arm_library_path}/libcudart.so" CACHE FILEPATH "cudart library") + set(CUDA_LIBRARIES ${CUDA_CUDART_LIBRARY}) + set(CUDA_cublas_LIBRARY "${cuda_arm_library_path}/libcublas.so" CACHE FILEPATH "cublas library") + set(CUDA_cufft_LIBRARY "${cuda_arm_library_path}/libcufft.so" CACHE FILEPATH "cufft library") + set(CUDA_nppc_LIBRARY "${cuda_arm_library_path}/libnppc.so" CACHE FILEPATH "nppc library") + set(CUDA_nppi_LIBRARY "${cuda_arm_library_path}/libnppi.so" CACHE FILEPATH "nppi library") + set(CUDA_npps_LIBRARY "${cuda_arm_library_path}/libnpps.so" CACHE FILEPATH "npps library") + set(CUDA_npp_LIBRARY "${CUDA_nppc_LIBRARY};${CUDA_nppi_LIBRARY};${CUDA_npps_LIBRARY}" CACHE STRING "npp library") + else() + unset(CUDA_npp_LIBRARY CACHE) + + find_cuda_helper_libs(nppc) + find_cuda_helper_libs(nppi) + find_cuda_helper_libs(npps) + + set(CUDA_npp_LIBRARY "${CUDA_nppc_LIBRARY};${CUDA_nppi_LIBRARY};${CUDA_npps_LIBRARY}" CACHE STRING "npp library") + endif() endif() if(WITH_NVCUVID) diff --git a/modules/gpu/CMakeLists.txt b/modules/gpu/CMakeLists.txt index 0062944ba..44b507268 100644 --- a/modules/gpu/CMakeLists.txt +++ b/modules/gpu/CMakeLists.txt @@ -43,6 +43,14 @@ if(HAVE_CUDA) ocv_cuda_compile(cuda_objs ${lib_cuda} ${ncv_cuda}) set(cuda_link_libs ${CUDA_LIBRARIES} ${CUDA_npp_LIBRARY}) + + if(HAVE_CUFFT) + set(cuda_link_libs ${cuda_link_libs} ${CUDA_cufft_LIBRARY}) + endif() + + if(HAVE_CUBLAS) + set(cuda_link_libs ${cuda_link_libs} ${CUDA_cublas_LIBRARY}) + endif() if(WITH_NVCUVID) set(cuda_link_libs ${cuda_link_libs} ${CUDA_CUDA_LIBRARY} ${CUDA_nvcuvid_LIBRARY}) @@ -71,14 +79,6 @@ ocv_set_module_sources( ocv_create_module(${cuda_link_libs}) if(HAVE_CUDA) - if(HAVE_CUFFT) - CUDA_ADD_CUFFT_TO_TARGET(${the_module}) - endif() - - if(HAVE_CUBLAS) - CUDA_ADD_CUBLAS_TO_TARGET(${the_module}) - endif() - install(FILES src/nvidia/NPP_staging/NPP_staging.hpp src/nvidia/core/NCV.hpp DESTINATION ${OPENCV_INCLUDE_INSTALL_PATH}/opencv2/${name} COMPONENT main) From 68741bf8a010913991fc43252c052d2519bdf301 Mon Sep 17 00:00:00 2001 
From: Alexander Shishkov Date: Wed, 19 Jun 2013 00:20:21 +0400 Subject: [PATCH 03/75] moved iOS part to platforms folder --- .../introduction/ios_install/ios_install.rst | 2 +- ios/configure-device_xcode.sh | 1 - ios/configure-simulator_xcode.sh | 1 - ios/readme.txt | 15 --------------- {ios => platforms/ios}/Info.plist.in | 2 +- {ios => platforms/ios}/build_framework.py | 9 +++------ .../ios}/cmake/Modules/Platform/iOS.cmake | 0 .../Toolchains/Toolchain-iPhoneOS_Xcode.cmake | 8 ++++---- .../Toolchain-iPhoneSimulator_Xcode.cmake | 8 ++++---- platforms/ios/readme.txt | 7 +++++++ 10 files changed, 20 insertions(+), 33 deletions(-) delete mode 100755 ios/configure-device_xcode.sh delete mode 100755 ios/configure-simulator_xcode.sh delete mode 100644 ios/readme.txt rename {ios => platforms/ios}/Info.plist.in (93%) rename {ios => platforms/ios}/build_framework.py (95%) rename {ios => platforms/ios}/cmake/Modules/Platform/iOS.cmake (100%) rename {ios => platforms/ios}/cmake/Toolchains/Toolchain-iPhoneOS_Xcode.cmake (84%) rename {ios => platforms/ios}/cmake/Toolchains/Toolchain-iPhoneSimulator_Xcode.cmake (85%) create mode 100644 platforms/ios/readme.txt diff --git a/doc/tutorials/introduction/ios_install/ios_install.rst b/doc/tutorials/introduction/ios_install/ios_install.rst index ace657b21..8d117a0b4 100644 --- a/doc/tutorials/introduction/ios_install/ios_install.rst +++ b/doc/tutorials/introduction/ios_install/ios_install.rst @@ -37,7 +37,7 @@ Building OpenCV from Source, using CMake and Command Line .. code-block:: bash cd ~/ - python opencv/ios/build_framework.py ios + python opencv/platforms/ios/build_framework.py ios If everything's fine, a few minutes later you will get ~//ios/opencv2.framework. You can add this framework to your Xcode projects. diff --git a/ios/configure-device_xcode.sh b/ios/configure-device_xcode.sh deleted file mode 100755 index 8c28a3e90..000000000 --- a/ios/configure-device_xcode.sh +++ /dev/null @@ -1 +0,0 @@ -cmake -GXcode -DCMAKE_TOOLCHAIN_FILE=../opencv/ios/cmake/Toolchains/Toolchain-iPhoneOS_Xcode.cmake -DCMAKE_INSTALL_PREFIX=../OpenCV_iPhoneOS ../opencv diff --git a/ios/configure-simulator_xcode.sh b/ios/configure-simulator_xcode.sh deleted file mode 100755 index 50e00261d..000000000 --- a/ios/configure-simulator_xcode.sh +++ /dev/null @@ -1 +0,0 @@ -cmake -GXcode -DCMAKE_TOOLCHAIN_FILE=../opencv/ios/cmake/Toolchains/Toolchain-iPhoneSimulator_Xcode.cmake -DCMAKE_INSTALL_PREFIX=../OpenCV_iPhoneSimulator ../opencv diff --git a/ios/readme.txt b/ios/readme.txt deleted file mode 100644 index 1441b241b..000000000 --- a/ios/readme.txt +++ /dev/null @@ -1,15 +0,0 @@ -Assuming that your build directory is on the same level that opencv source, -From the build directory run - ../opencv/ios/configure-device_xcode.sh -or - ../opencv/ios/configure-simulator_xcode.sh - -Then from the same folder invoke - -xcodebuild -sdk iphoneos -configuration Release -target ALL_BUILD -xcodebuild -sdk iphoneos -configuration Release -target install install - -or - -xcodebuild -sdk iphonesimulator -configuration Release -target ALL_BUILD -xcodebuild -sdk iphonesimulator -configuration Release -target install install \ No newline at end of file diff --git a/ios/Info.plist.in b/platforms/ios/Info.plist.in similarity index 93% rename from ios/Info.plist.in rename to platforms/ios/Info.plist.in index 89ef38625..012de8856 100644 --- a/ios/Info.plist.in +++ b/platforms/ios/Info.plist.in @@ -5,7 +5,7 @@ CFBundleName OpenCV CFBundleIdentifier - com.itseez.opencv + opencv.org CFBundleVersion 
${VERSION} CFBundleShortVersionString diff --git a/ios/build_framework.py b/platforms/ios/build_framework.py similarity index 95% rename from ios/build_framework.py rename to platforms/ios/build_framework.py index ceef4b71d..bc385bb1b 100755 --- a/ios/build_framework.py +++ b/platforms/ios/build_framework.py @@ -38,7 +38,7 @@ def build_opencv(srcroot, buildroot, target, arch): # for some reason, if you do not specify CMAKE_BUILD_TYPE, it puts libs to "RELEASE" rather than "Release" cmakeargs = ("-GXcode " + "-DCMAKE_BUILD_TYPE=Release " + - "-DCMAKE_TOOLCHAIN_FILE=%s/ios/cmake/Toolchains/Toolchain-%s_Xcode.cmake " + + "-DCMAKE_TOOLCHAIN_FILE=%s/platforms/ios/cmake/Toolchains/Toolchain-%s_Xcode.cmake " + "-DBUILD_opencv_world=ON " + "-DCMAKE_INSTALL_PREFIX=install") % (srcroot, target) # if cmake cache exists, just rerun cmake to update OpenCV.xproj if necessary @@ -92,16 +92,13 @@ def put_framework_together(srcroot, dstroot): os.system("lipo -create " + wlist + " -o " + dstdir + "/opencv2") # form Info.plist - srcfile = open(srcroot + "/ios/Info.plist.in", "rt") + srcfile = open(srcroot + "/platforms/ios/Info.plist.in", "rt") dstfile = open(dstdir + "/Resources/Info.plist", "wt") for l in srcfile.readlines(): dstfile.write(l.replace("${VERSION}", opencv_version)) srcfile.close() dstfile.close() - # copy cascades - # TODO ... - # make symbolic links os.symlink("A", "Versions/Current") os.symlink("Versions/Current/Headers", "Headers") @@ -125,4 +122,4 @@ if __name__ == "__main__": print "Usage:\n\t./build_framework.py \n\n" sys.exit(0) - build_framework(os.path.abspath(os.path.join(os.path.dirname(sys.argv[0]), "..")), os.path.abspath(sys.argv[1])) \ No newline at end of file + build_framework(os.path.abspath(os.path.join(os.path.dirname(sys.argv[0]), "../..")), os.path.abspath(sys.argv[1])) \ No newline at end of file diff --git a/ios/cmake/Modules/Platform/iOS.cmake b/platforms/ios/cmake/Modules/Platform/iOS.cmake similarity index 100% rename from ios/cmake/Modules/Platform/iOS.cmake rename to platforms/ios/cmake/Modules/Platform/iOS.cmake diff --git a/ios/cmake/Toolchains/Toolchain-iPhoneOS_Xcode.cmake b/platforms/ios/cmake/Toolchains/Toolchain-iPhoneOS_Xcode.cmake similarity index 84% rename from ios/cmake/Toolchains/Toolchain-iPhoneOS_Xcode.cmake rename to platforms/ios/cmake/Toolchains/Toolchain-iPhoneOS_Xcode.cmake index 67343253b..6493deb45 100644 --- a/ios/cmake/Toolchains/Toolchain-iPhoneOS_Xcode.cmake +++ b/platforms/ios/cmake/Toolchains/Toolchain-iPhoneOS_Xcode.cmake @@ -4,12 +4,12 @@ set (IPHONEOS TRUE) # Standard settings set (CMAKE_SYSTEM_NAME iOS) # Include extra modules for the iOS platform files -set (CMAKE_MODULE_PATH ${CMAKE_MODULE_PATH} "${CMAKE_CURRENT_SOURCE_DIR}/ios/cmake/Modules") +set (CMAKE_MODULE_PATH ${CMAKE_MODULE_PATH} "${CMAKE_CURRENT_SOURCE_DIR}/platforms/ios/cmake/Modules") -# Force the compilers to gcc for iOS +# Force the compilers to clang for iOS include (CMakeForceCompiler) -#CMAKE_FORCE_C_COMPILER (gcc gcc) -#CMAKE_FORCE_CXX_COMPILER (g++ g++) +#CMAKE_FORCE_C_COMPILER (clang GNU) +#CMAKE_FORCE_CXX_COMPILER (clang++ GNU) set (CMAKE_C_SIZEOF_DATA_PTR 4) set (CMAKE_C_HAS_ISYSROOT 1) diff --git a/ios/cmake/Toolchains/Toolchain-iPhoneSimulator_Xcode.cmake b/platforms/ios/cmake/Toolchains/Toolchain-iPhoneSimulator_Xcode.cmake similarity index 85% rename from ios/cmake/Toolchains/Toolchain-iPhoneSimulator_Xcode.cmake rename to platforms/ios/cmake/Toolchains/Toolchain-iPhoneSimulator_Xcode.cmake index 7ef8113ed..0056c8dbd 100644 --- 
a/ios/cmake/Toolchains/Toolchain-iPhoneSimulator_Xcode.cmake +++ b/platforms/ios/cmake/Toolchains/Toolchain-iPhoneSimulator_Xcode.cmake @@ -4,12 +4,12 @@ set (IPHONESIMULATOR TRUE) # Standard settings set (CMAKE_SYSTEM_NAME iOS) # Include extra modules for the iOS platform files -set (CMAKE_MODULE_PATH ${CMAKE_MODULE_PATH} "${CMAKE_CURRENT_SOURCE_DIR}/ios/cmake/Modules") +set (CMAKE_MODULE_PATH ${CMAKE_MODULE_PATH} "${CMAKE_CURRENT_SOURCE_DIR}/platforms/ios/cmake/Modules") -# Force the compilers to gcc for iOS +# Force the compilers to clang for iOS include (CMakeForceCompiler) -#CMAKE_FORCE_C_COMPILER (gcc gcc) -#CMAKE_FORCE_CXX_COMPILER (g++ g++) +#CMAKE_FORCE_C_COMPILER (clang GNU) +#CMAKE_FORCE_CXX_COMPILER (clang++ GNU) set (CMAKE_C_SIZEOF_DATA_PTR 4) set (CMAKE_C_HAS_ISYSROOT 1) diff --git a/platforms/ios/readme.txt b/platforms/ios/readme.txt new file mode 100644 index 000000000..8f1f206b0 --- /dev/null +++ b/platforms/ios/readme.txt @@ -0,0 +1,7 @@ +Building OpenCV from Source, using CMake and Command Line +========================================================= + +cd ~/ +python opencv/platforms/ios/build_framework.py ios + +If everything's fine, a few minutes later you will get ~//ios/opencv2.framework. You can add this framework to your Xcode projects. \ No newline at end of file From 37d19b9c46ebfc4be922237b11913cf838959b49 Mon Sep 17 00:00:00 2001 From: Roman Donchenko Date: Wed, 19 Jun 2013 17:44:12 +0400 Subject: [PATCH 04/75] Pass the HAVE_QT* flags through the config header, like all others. I don't know why it didn't work for the original author, but it definitely works now. --- cmake/OpenCVFindLibsGUI.cmake | 3 --- cmake/templates/cvconfig.h.cmake | 6 ++++++ modules/highgui/src/window_QT.cpp | 1 + 3 files changed, 7 insertions(+), 3 deletions(-) diff --git a/cmake/OpenCVFindLibsGUI.cmake b/cmake/OpenCVFindLibsGUI.cmake index 59ce1cd05..d685d23fe 100644 --- a/cmake/OpenCVFindLibsGUI.cmake +++ b/cmake/OpenCVFindLibsGUI.cmake @@ -24,7 +24,6 @@ if(WITH_QT) if(Qt5Core_FOUND AND Qt5Gui_FOUND AND Qt5Widgets_FOUND AND Qt5Test_FOUND AND Qt5Concurrent_FOUND) set(HAVE_QT5 ON) set(HAVE_QT ON) - add_definitions(-DHAVE_QT) find_package(Qt5OpenGL) if(Qt5OpenGL_FOUND) set(QT_QTOPENGL_FOUND ON) @@ -36,7 +35,6 @@ if(WITH_QT) find_package(Qt4 REQUIRED QtCore QtGui QtTest) if(QT4_FOUND) set(HAVE_QT TRUE) - add_definitions(-DHAVE_QT) # We need to define the macro this way, using cvconfig.h does not work endif() endif() endif() @@ -61,7 +59,6 @@ if(WITH_OPENGL) list(APPEND OPENCV_LINKER_LIBS ${OPENGL_LIBRARIES}) if(QT_QTOPENGL_FOUND) set(HAVE_QT_OPENGL TRUE) - add_definitions(-DHAVE_QT_OPENGL) else() ocv_include_directories(${OPENGL_INCLUDE_DIR}) endif() diff --git a/cmake/templates/cvconfig.h.cmake b/cmake/templates/cvconfig.h.cmake index db46af4b6..f12730988 100644 --- a/cmake/templates/cvconfig.h.cmake +++ b/cmake/templates/cvconfig.h.cmake @@ -228,3 +228,9 @@ /* Clp support */ #cmakedefine HAVE_CLP + +/* Qt support */ +#cmakedefine HAVE_QT + +/* Qt OpenGL support */ +#cmakedefine HAVE_QT_OPENGL diff --git a/modules/highgui/src/window_QT.cpp b/modules/highgui/src/window_QT.cpp index 0c50c7070..64d57ab26 100644 --- a/modules/highgui/src/window_QT.cpp +++ b/modules/highgui/src/window_QT.cpp @@ -38,6 +38,7 @@ //--------------------Google Code 2010 -- Yannick Verdie--------------------// +#include "precomp.hpp" #if defined(HAVE_QT) From 936236e4b1b190d7bc33a33df982fac8ab6cfc76 Mon Sep 17 00:00:00 2001 From: Roman Donchenko Date: Tue, 11 Jun 2013 16:06:51 +0400 Subject: [PATCH 05/75] Extended 
the CPU/GPU selection mechanism in performance tests. Now it allows choosing between arbitrary implementation variants. --- modules/gpu/perf/perf_main.cpp | 2 +- modules/nonfree/perf/perf_main.cpp | 2 +- modules/superres/perf/perf_main.cpp | 2 +- modules/ts/include/opencv2/ts/ts_perf.hpp | 20 +++-- modules/ts/src/ts_perf.cpp | 101 ++++++++++++++-------- 5 files changed, 81 insertions(+), 46 deletions(-) diff --git a/modules/gpu/perf/perf_main.cpp b/modules/gpu/perf/perf_main.cpp index a7ac1ccce..f9f3a6854 100644 --- a/modules/gpu/perf/perf_main.cpp +++ b/modules/gpu/perf/perf_main.cpp @@ -44,4 +44,4 @@ using namespace perf; -CV_PERF_TEST_MAIN(gpu, printCudaInfo()) +CV_PERF_TEST_MAIN_WITH_IMPLS(gpu, ("cuda", "plain"), printCudaInfo()) diff --git a/modules/nonfree/perf/perf_main.cpp b/modules/nonfree/perf/perf_main.cpp index de1242149..373e08aed 100644 --- a/modules/nonfree/perf/perf_main.cpp +++ b/modules/nonfree/perf/perf_main.cpp @@ -1,4 +1,4 @@ #include "perf_precomp.hpp" #include "opencv2/ts/gpu_perf.hpp" -CV_PERF_TEST_MAIN(nonfree, perf::printCudaInfo()) +CV_PERF_TEST_MAIN_WITH_IMPLS(nonfree, ("cuda", "plain"), perf::printCudaInfo()) diff --git a/modules/superres/perf/perf_main.cpp b/modules/superres/perf/perf_main.cpp index adc69e6e8..90a7f5125 100644 --- a/modules/superres/perf/perf_main.cpp +++ b/modules/superres/perf/perf_main.cpp @@ -44,4 +44,4 @@ using namespace perf; -CV_PERF_TEST_MAIN(superres, printCudaInfo()) +CV_PERF_TEST_MAIN_WITH_IMPLS(superres, ("cuda", "plain"), printCudaInfo()) diff --git a/modules/ts/include/opencv2/ts/ts_perf.hpp b/modules/ts/include/opencv2/ts/ts_perf.hpp index fe5765515..eb5e3e554 100644 --- a/modules/ts/include/opencv2/ts/ts_perf.hpp +++ b/modules/ts/include/opencv2/ts/ts_perf.hpp @@ -210,18 +210,13 @@ private: #define SANITY_CHECK_KEYPOINTS(array, ...) ::perf::Regression::addKeypoints(this, #array, array , ## __VA_ARGS__) #define SANITY_CHECK_MATCHES(array, ...) ::perf::Regression::addMatches(this, #array, array , ## __VA_ARGS__) -#ifdef HAVE_CUDA class CV_EXPORTS GpuPerf { public: static bool targetDevice(); }; -# define PERF_RUN_GPU() ::perf::GpuPerf::targetDevice() -#else -# define PERF_RUN_GPU() false -#endif - +#define PERF_RUN_GPU() ::perf::GpuPerf::targetDevice() /*****************************************************************************************\ * Container for performance metrics * @@ -263,7 +258,10 @@ public: TestBase(); static void Init(int argc, const char* const argv[]); + static void Init(const std::vector & availableImpls, + int argc, const char* const argv[]); static std::string getDataPath(const std::string& relativePath); + static std::string getSelectedImpl(); protected: virtual void PerfTestBody() = 0; @@ -476,18 +474,24 @@ CV_EXPORTS void PrintTo(const Size& sz, ::std::ostream* os); INSTANTIATE_TEST_CASE_P(/*none*/, fixture##_##name, params);\ void fixture##_##name::PerfTestBody() +#define CV_PERF_UNWRAP_IMPLS(...) __VA_ARGS__ -#define CV_PERF_TEST_MAIN(testsuitname, ...) \ +// "plain" should always be one of the implementations +#define CV_PERF_TEST_MAIN_WITH_IMPLS(testsuitname, impls, ...) 
\ int main(int argc, char **argv)\ {\ while (++argc >= (--argc,-1)) {__VA_ARGS__; break;} /*this ugly construction is needed for VS 2005*/\ + std::string impls_[] = { CV_PERF_UNWRAP_IMPLS impls };\ ::perf::Regression::Init(#testsuitname);\ - ::perf::TestBase::Init(argc, argv);\ + ::perf::TestBase::Init(std::vector(impls_, impls_ + sizeof impls_ / sizeof *impls_),\ + argc, argv);\ ::testing::InitGoogleTest(&argc, argv);\ cvtest::printVersionInfo();\ return RUN_ALL_TESTS();\ } +#define CV_PERF_TEST_MAIN(testsuitname, ...) CV_PERF_TEST_MAIN_WITH_IMPLS(testsuitname, ("plain"), __VA_ARGS__) + #define TEST_CYCLE_N(n) for(declare.iterations(n); startTimer(), next(); stopTimer()) #define TEST_CYCLE() for(; startTimer(), next(); stopTimer()) #define TEST_CYCLE_MULTIRUN(runsNum) for(declare.runs(runsNum); startTimer(), next(); stopTimer()) for(int r = 0; r < runsNum; ++r) diff --git a/modules/ts/src/ts_perf.cpp b/modules/ts/src/ts_perf.cpp index c375e7c38..3b73ddcf7 100644 --- a/modules/ts/src/ts_perf.cpp +++ b/modules/ts/src/ts_perf.cpp @@ -14,30 +14,10 @@ int64 TestBase::timeLimitDefault = 0; unsigned int TestBase::iterationsLimitDefault = (unsigned int)(-1); int64 TestBase::_timeadjustment = 0; -const std::string command_line_keys = - "{ |perf_max_outliers |8 |percent of allowed outliers}" - "{ |perf_min_samples |10 |minimal required numer of samples}" - "{ |perf_force_samples |100 |force set maximum number of samples for all tests}" - "{ |perf_seed |809564 |seed for random numbers generator}" - "{ |perf_threads |-1 |the number of worker threads, if parallel execution is enabled}" - "{ |perf_write_sanity |false |create new records for sanity checks}" - "{ |perf_verify_sanity |false |fail tests having no regression data for sanity checks}" -#ifdef ANDROID - "{ |perf_time_limit |6.0 |default time limit for a single test (in seconds)}" - "{ |perf_affinity_mask |0 |set affinity mask for the main thread}" - "{ |perf_log_power_checkpoints | |additional xml logging for power measurement}" -#else - "{ |perf_time_limit |3.0 |default time limit for a single test (in seconds)}" -#endif - "{ |perf_max_deviation |1.0 |}" - "{h |help |false |print help info}" -#ifdef HAVE_CUDA - "{ |perf_run_cpu |false |run GPU performance tests for analogical CPU functions}" - "{ |perf_cuda_device |0 |run GPU test suite onto specific CUDA capable device}" - "{ |perf_cuda_info_only |false |print an information about system and an available CUDA devices and then exit.}" -#endif -; +// Item [0] will be considered the default implementation. +static std::vector available_impls; +static std::string param_impl; static double param_max_outliers; static double param_max_deviation; static unsigned int param_min_samples; @@ -48,7 +28,6 @@ static int param_threads; static bool param_write_sanity; static bool param_verify_sanity; #ifdef HAVE_CUDA -static bool param_run_cpu; static int param_cuda_device; #endif @@ -577,11 +556,12 @@ Regression& Regression::operator() (const std::string& name, cv::InputArray arra std::string nodename = getCurrentTestNodeName(); -#ifdef HAVE_CUDA - static const std::string prefix = (param_run_cpu)? "CPU_" : "GPU_"; + // This is a hack for compatibility and it should eventually get removed. + // gpu's tests don't even have CPU sanity data anymore. if(suiteName == "gpu") - nodename = prefix + nodename; -#endif + { + nodename = (PERF_RUN_GPU() ? 
"GPU_" : "CPU_") + nodename; + } cv::FileNode n = rootIn[nodename]; if(n.isNone()) @@ -646,6 +626,42 @@ performance_metrics::performance_metrics() void TestBase::Init(int argc, const char* const argv[]) { + std::vector plain_only; + plain_only.push_back("plain"); + TestBase::Init(plain_only, argc, argv); +} + +void TestBase::Init(const std::vector & availableImpls, + int argc, const char* const argv[]) +{ + available_impls = availableImpls; + + const std::string command_line_keys = + "{ |perf_max_outliers |8 |percent of allowed outliers}" + "{ |perf_min_samples |10 |minimal required numer of samples}" + "{ |perf_force_samples |100 |force set maximum number of samples for all tests}" + "{ |perf_seed |809564 |seed for random numbers generator}" + "{ |perf_threads |-1 |the number of worker threads, if parallel execution is enabled}" + "{ |perf_write_sanity |false |create new records for sanity checks}" + "{ |perf_verify_sanity |false |fail tests having no regression data for sanity checks}" + "{ |perf_impl |" + available_impls[0] + + "|the implementation variant of functions under test}" + "{ |perf_run_cpu |false |deprecated, equivalent to --perf_impl=plain}" +#ifdef ANDROID + "{ |perf_time_limit |6.0 |default time limit for a single test (in seconds)}" + "{ |perf_affinity_mask |0 |set affinity mask for the main thread}" + "{ |perf_log_power_checkpoints | |additional xml logging for power measurement}" +#else + "{ |perf_time_limit |3.0 |default time limit for a single test (in seconds)}" +#endif + "{ |perf_max_deviation |1.0 |}" + "{h |help |false |print help info}" +#ifdef HAVE_CUDA + "{ |perf_cuda_device |0 |run GPU test suite onto specific CUDA capable device}" + "{ |perf_cuda_info_only |false |print an information about system and an available CUDA devices and then exit.}" +#endif + ; + cv::CommandLineParser args(argc, argv, command_line_keys.c_str()); if (args.get("help")) { @@ -656,6 +672,7 @@ void TestBase::Init(int argc, const char* const argv[]) ::testing::AddGlobalTestEnvironment(new PerfEnvironment); + param_impl = args.get("perf_run_cpu") ? 
"plain" : args.get("perf_impl"); param_max_outliers = std::min(100., std::max(0., args.get("perf_max_outliers"))); param_min_samples = std::max(1u, args.get("perf_min_samples")); param_max_deviation = std::max(0., args.get("perf_max_deviation")); @@ -670,19 +687,28 @@ void TestBase::Init(int argc, const char* const argv[]) log_power_checkpoints = args.get("perf_log_power_checkpoints"); #endif + if (std::find(available_impls.begin(), available_impls.end(), param_impl) == available_impls.end()) + { + printf("No such implementation: %s\n", param_impl.c_str()); + exit(1); + } + #ifdef HAVE_CUDA bool printOnly = args.get("perf_cuda_info_only"); if (printOnly) exit(0); +#endif + + if (available_impls.size() > 1) + printf("[----------]\n[ INFO ] \tImplementation variant: %s.\n[----------]\n", param_impl.c_str()), fflush(stdout); + +#ifdef HAVE_CUDA - param_run_cpu = args.get("perf_run_cpu"); param_cuda_device = std::max(0, std::min(cv::gpu::getCudaEnabledDeviceCount(), args.get("perf_cuda_device"))); - if (param_run_cpu) - printf("[----------]\n[ GPU INFO ] \tRun test suite on CPU.\n[----------]\n"), fflush(stdout); - else + if (param_impl == "cuda") { cv::gpu::DeviceInfo info(param_cuda_device); if (!info.isCompatible()) @@ -708,6 +734,13 @@ void TestBase::Init(int argc, const char* const argv[]) _timeadjustment = _calibrate(); } + +std::string TestBase::getSelectedImpl() +{ + return param_impl; +} + + int64 TestBase::_calibrate() { class _helper : public ::perf::TestBase @@ -1325,12 +1358,10 @@ void perf::sort(std::vector& pts, cv::InputOutputArray descriptors /*****************************************************************************************\ * ::perf::GpuPerf \*****************************************************************************************/ -#ifdef HAVE_CUDA bool perf::GpuPerf::targetDevice() { - return !param_run_cpu; + return param_impl == "cuda"; } -#endif /*****************************************************************************************\ * ::perf::PrintTo From b581f27249250fa7454be0e56e1dfe0bbf264ab6 Mon Sep 17 00:00:00 2001 From: Roman Donchenko Date: Tue, 18 Jun 2013 18:40:55 +0400 Subject: [PATCH 06/75] Made perf tests record module name, selected implementation and number of threads. --- modules/ts/include/opencv2/ts/ts_perf.hpp | 9 ++++++--- modules/ts/src/ts_perf.cpp | 5 +++++ 2 files changed, 11 insertions(+), 3 deletions(-) diff --git a/modules/ts/include/opencv2/ts/ts_perf.hpp b/modules/ts/include/opencv2/ts/ts_perf.hpp index eb5e3e554..ba0996403 100644 --- a/modules/ts/include/opencv2/ts/ts_perf.hpp +++ b/modules/ts/include/opencv2/ts/ts_perf.hpp @@ -260,6 +260,7 @@ public: static void Init(int argc, const char* const argv[]); static void Init(const std::vector & availableImpls, int argc, const char* const argv[]); + static void RecordRunParameters(); static std::string getDataPath(const std::string& relativePath); static std::string getSelectedImpl(); @@ -477,20 +478,22 @@ CV_EXPORTS void PrintTo(const Size& sz, ::std::ostream* os); #define CV_PERF_UNWRAP_IMPLS(...) __VA_ARGS__ // "plain" should always be one of the implementations -#define CV_PERF_TEST_MAIN_WITH_IMPLS(testsuitname, impls, ...) \ +#define CV_PERF_TEST_MAIN_WITH_IMPLS(modulename, impls, ...) 
\ int main(int argc, char **argv)\ {\ while (++argc >= (--argc,-1)) {__VA_ARGS__; break;} /*this ugly construction is needed for VS 2005*/\ std::string impls_[] = { CV_PERF_UNWRAP_IMPLS impls };\ - ::perf::Regression::Init(#testsuitname);\ + ::perf::Regression::Init(#modulename);\ ::perf::TestBase::Init(std::vector(impls_, impls_ + sizeof impls_ / sizeof *impls_),\ argc, argv);\ ::testing::InitGoogleTest(&argc, argv);\ cvtest::printVersionInfo();\ + ::testing::Test::RecordProperty("cv_module_name", #modulename);\ + ::perf::TestBase::RecordRunParameters();\ return RUN_ALL_TESTS();\ } -#define CV_PERF_TEST_MAIN(testsuitname, ...) CV_PERF_TEST_MAIN_WITH_IMPLS(testsuitname, ("plain"), __VA_ARGS__) +#define CV_PERF_TEST_MAIN(modulename, ...) CV_PERF_TEST_MAIN_WITH_IMPLS(modulename, ("plain"), __VA_ARGS__) #define TEST_CYCLE_N(n) for(declare.iterations(n); startTimer(), next(); stopTimer()) #define TEST_CYCLE() for(; startTimer(), next(); stopTimer()) diff --git a/modules/ts/src/ts_perf.cpp b/modules/ts/src/ts_perf.cpp index 3b73ddcf7..e61878e19 100644 --- a/modules/ts/src/ts_perf.cpp +++ b/modules/ts/src/ts_perf.cpp @@ -734,6 +734,11 @@ void TestBase::Init(const std::vector & availableImpls, _timeadjustment = _calibrate(); } +void TestBase::RecordRunParameters() +{ + ::testing::Test::RecordProperty("cv_implementation", param_impl); + ::testing::Test::RecordProperty("cv_num_threads", param_threads); +} std::string TestBase::getSelectedImpl() { From 7a104d2793ed0fde70b2ce3185823912d2455075 Mon Sep 17 00:00:00 2001 From: Roman Donchenko Date: Wed, 19 Jun 2013 18:47:15 +0400 Subject: [PATCH 07/75] Added an option to print available implementation variants. --- modules/ts/src/ts_perf.cpp | 14 ++++++++++++++ 1 file changed, 14 insertions(+) diff --git a/modules/ts/src/ts_perf.cpp b/modules/ts/src/ts_perf.cpp index e61878e19..c2c1ee6bd 100644 --- a/modules/ts/src/ts_perf.cpp +++ b/modules/ts/src/ts_perf.cpp @@ -646,6 +646,7 @@ void TestBase::Init(const std::vector & availableImpls, "{ |perf_verify_sanity |false |fail tests having no regression data for sanity checks}" "{ |perf_impl |" + available_impls[0] + "|the implementation variant of functions under test}" + "{ |perf_list_impls |false |list available implementation variants and exit}" "{ |perf_run_cpu |false |deprecated, equivalent to --perf_impl=plain}" #ifdef ANDROID "{ |perf_time_limit |6.0 |default time limit for a single test (in seconds)}" @@ -687,6 +688,19 @@ void TestBase::Init(const std::vector & availableImpls, log_power_checkpoints = args.get("perf_log_power_checkpoints"); #endif + bool param_list_impls = args.get("perf_list_impls"); + + if (param_list_impls) + { + fputs("Available implementation variants:", stdout); + for (size_t i = 0; i < available_impls.size(); ++i) { + putchar(' '); + fputs(available_impls[i].c_str(), stdout); + } + putchar('\n'); + exit(0); + } + if (std::find(available_impls.begin(), available_impls.end(), param_impl) == available_impls.end()) { printf("No such implementation: %s\n", param_impl.c_str()); From 51a672ec40d7637888fa6aae07247e4e737c64ce Mon Sep 17 00:00:00 2001 From: Roman Donchenko Date: Wed, 19 Jun 2013 19:16:18 +0400 Subject: [PATCH 08/75] Disabled the cuda variant when CUDA is not available. 
--- modules/gpu/perf/perf_main.cpp | 6 +++++- modules/nonfree/perf/perf_main.cpp | 6 +++++- modules/superres/perf/perf_main.cpp | 6 +++++- 3 files changed, 15 insertions(+), 3 deletions(-) diff --git a/modules/gpu/perf/perf_main.cpp b/modules/gpu/perf/perf_main.cpp index f9f3a6854..db362af8f 100644 --- a/modules/gpu/perf/perf_main.cpp +++ b/modules/gpu/perf/perf_main.cpp @@ -44,4 +44,8 @@ using namespace perf; -CV_PERF_TEST_MAIN_WITH_IMPLS(gpu, ("cuda", "plain"), printCudaInfo()) +CV_PERF_TEST_MAIN_WITH_IMPLS(gpu, ( +#ifdef HAVE_CUDA + "cuda", +#endif + "plain"), printCudaInfo()) diff --git a/modules/nonfree/perf/perf_main.cpp b/modules/nonfree/perf/perf_main.cpp index 373e08aed..a3245186a 100644 --- a/modules/nonfree/perf/perf_main.cpp +++ b/modules/nonfree/perf/perf_main.cpp @@ -1,4 +1,8 @@ #include "perf_precomp.hpp" #include "opencv2/ts/gpu_perf.hpp" -CV_PERF_TEST_MAIN_WITH_IMPLS(nonfree, ("cuda", "plain"), perf::printCudaInfo()) +CV_PERF_TEST_MAIN_WITH_IMPLS(nonfree, ( +#ifdef HAVE_CUDA + "cuda", +#endif + "plain"), perf::printCudaInfo()) diff --git a/modules/superres/perf/perf_main.cpp b/modules/superres/perf/perf_main.cpp index 90a7f5125..8bf217e30 100644 --- a/modules/superres/perf/perf_main.cpp +++ b/modules/superres/perf/perf_main.cpp @@ -44,4 +44,8 @@ using namespace perf; -CV_PERF_TEST_MAIN_WITH_IMPLS(superres, ("cuda", "plain"), printCudaInfo()) +CV_PERF_TEST_MAIN_WITH_IMPLS(superres, ( +#ifdef HAVE_CUDA + "cuda", +#endif + "plain"), printCudaInfo()) From 3ea4836a0a7e20455e6199d5bd32f0a462d286c6 Mon Sep 17 00:00:00 2001 From: Roman Donchenko Date: Thu, 20 Jun 2013 15:16:22 +0400 Subject: [PATCH 09/75] Changed the impls argument to be an array name. Turns out, you can't use preprocessor directives inside macro arguments. Who'd have thought? 
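For reference, the C and C++ standards leave the behavior undefined when a preprocessing directive such as #ifdef appears inside the argument list of a macro invocation, which is why the previous patch's form was unreliable across compilers. The sketch below illustrates the pitfall and the array-based fix this patch switches to; the PERF_MAIN macro in it is a hypothetical stand-in for CV_PERF_TEST_MAIN_WITH_IMPLS, used only for illustration and not part of the patch.

    #include <cstddef>
    #include <cstdio>

    // Hypothetical stand-in for CV_PERF_TEST_MAIN_WITH_IMPLS, used only to
    // illustrate the macro-argument issue.
    #define PERF_MAIN(module, impls)                                       \
        int main()                                                         \
        {                                                                  \
            for (std::size_t i = 0; i < sizeof impls / sizeof *impls; ++i) \
                std::printf("%s: %s\n", module, impls[i]);                 \
            return 0;                                                      \
        }

    // Undefined behavior -- a directive placed between the parentheses of a
    // macro invocation, which is roughly what the previous patch attempted:
    //
    //   SOME_MACRO(
    //   #ifdef HAVE_CUDA
    //       "cuda",
    //   #endif
    //       "plain")
    //
    // Well-defined -- let the preprocessor resolve the conditional first,
    // then pass the name of the resulting array to the macro:
    static const char* impls[] = {
    #ifdef HAVE_CUDA
        "cuda",
    #endif
        "plain"
    };

    PERF_MAIN("gpu", impls)
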
--- modules/gpu/perf/perf_main.cpp | 9 ++++++--- modules/nonfree/perf/perf_main.cpp | 9 ++++++--- modules/superres/perf/perf_main.cpp | 9 ++++++--- modules/ts/include/opencv2/ts/ts_perf.hpp | 24 ++++++++++++++--------- 4 files changed, 33 insertions(+), 18 deletions(-) diff --git a/modules/gpu/perf/perf_main.cpp b/modules/gpu/perf/perf_main.cpp index db362af8f..53a19ca41 100644 --- a/modules/gpu/perf/perf_main.cpp +++ b/modules/gpu/perf/perf_main.cpp @@ -44,8 +44,11 @@ using namespace perf; -CV_PERF_TEST_MAIN_WITH_IMPLS(gpu, ( +static const char * impls[] = { #ifdef HAVE_CUDA - "cuda", + "cuda", #endif - "plain"), printCudaInfo()) + "plain" +}; + +CV_PERF_TEST_MAIN_WITH_IMPLS(gpu, impls, printCudaInfo()) diff --git a/modules/nonfree/perf/perf_main.cpp b/modules/nonfree/perf/perf_main.cpp index a3245186a..d5f4a1a51 100644 --- a/modules/nonfree/perf/perf_main.cpp +++ b/modules/nonfree/perf/perf_main.cpp @@ -1,8 +1,11 @@ #include "perf_precomp.hpp" #include "opencv2/ts/gpu_perf.hpp" -CV_PERF_TEST_MAIN_WITH_IMPLS(nonfree, ( +static const char * impls[] = { #ifdef HAVE_CUDA - "cuda", + "cuda", #endif - "plain"), perf::printCudaInfo()) + "plain" +}; + +CV_PERF_TEST_MAIN_WITH_IMPLS(nonfree, impls, perf::printCudaInfo()) diff --git a/modules/superres/perf/perf_main.cpp b/modules/superres/perf/perf_main.cpp index 8bf217e30..0a8ab5dea 100644 --- a/modules/superres/perf/perf_main.cpp +++ b/modules/superres/perf/perf_main.cpp @@ -44,8 +44,11 @@ using namespace perf; -CV_PERF_TEST_MAIN_WITH_IMPLS(superres, ( +static const char * impls[] = { #ifdef HAVE_CUDA - "cuda", + "cuda", #endif - "plain"), printCudaInfo()) + "plain" +}; + +CV_PERF_TEST_MAIN_WITH_IMPLS(superres, impls, printCudaInfo()) diff --git a/modules/ts/include/opencv2/ts/ts_perf.hpp b/modules/ts/include/opencv2/ts/ts_perf.hpp index ba0996403..1e68cd49b 100644 --- a/modules/ts/include/opencv2/ts/ts_perf.hpp +++ b/modules/ts/include/opencv2/ts/ts_perf.hpp @@ -475,25 +475,31 @@ CV_EXPORTS void PrintTo(const Size& sz, ::std::ostream* os); INSTANTIATE_TEST_CASE_P(/*none*/, fixture##_##name, params);\ void fixture##_##name::PerfTestBody() -#define CV_PERF_UNWRAP_IMPLS(...) __VA_ARGS__ -// "plain" should always be one of the implementations -#define CV_PERF_TEST_MAIN_WITH_IMPLS(modulename, impls, ...) \ -int main(int argc, char **argv)\ -{\ +#define CV_PERF_TEST_MAIN_INTERNALS(modulename, impls, ...) \ while (++argc >= (--argc,-1)) {__VA_ARGS__; break;} /*this ugly construction is needed for VS 2005*/\ - std::string impls_[] = { CV_PERF_UNWRAP_IMPLS impls };\ ::perf::Regression::Init(#modulename);\ - ::perf::TestBase::Init(std::vector(impls_, impls_ + sizeof impls_ / sizeof *impls_),\ + ::perf::TestBase::Init(std::vector(impls, impls + sizeof impls / sizeof *impls),\ argc, argv);\ ::testing::InitGoogleTest(&argc, argv);\ cvtest::printVersionInfo();\ ::testing::Test::RecordProperty("cv_module_name", #modulename);\ ::perf::TestBase::RecordRunParameters();\ - return RUN_ALL_TESTS();\ + return RUN_ALL_TESTS(); + +// impls must be an array, not a pointer; "plain" should always be one of the implementations +#define CV_PERF_TEST_MAIN_WITH_IMPLS(modulename, impls, ...) \ +int main(int argc, char **argv)\ +{\ + CV_PERF_TEST_MAIN_INTERNALS(modulename, impls, __VA_ARGS__)\ } -#define CV_PERF_TEST_MAIN(modulename, ...) CV_PERF_TEST_MAIN_WITH_IMPLS(modulename, ("plain"), __VA_ARGS__) +#define CV_PERF_TEST_MAIN(modulename, ...) 
\ +int main(int argc, char **argv)\ +{\ + const char * plain_only[] = { "plain" };\ + CV_PERF_TEST_MAIN_INTERNALS(modulename, plain_only, __VA_ARGS__)\ +} #define TEST_CYCLE_N(n) for(declare.iterations(n); startTimer(), next(); stopTimer()) #define TEST_CYCLE() for(; startTimer(), next(); stopTimer()) From 2688e22cb595c6b652538c36dacf2bb35cc58ac3 Mon Sep 17 00:00:00 2001 From: Roman Donchenko Date: Thu, 20 Jun 2013 19:34:32 +0400 Subject: [PATCH 10/75] Made xls-report.py use global properties in XML files. Now it can determine, without looking at the file name, both the module name and the configuration name (the latter with a little help from the configuration file). --- modules/ts/misc/testlog_parser.py | 48 +++++++++++++++---- modules/ts/misc/xls-report.py | 79 ++++++++++++++++++++++--------- 2 files changed, 96 insertions(+), 31 deletions(-) diff --git a/modules/ts/misc/testlog_parser.py b/modules/ts/misc/testlog_parser.py index 8ab21417c..5d478645b 100755 --- a/modules/ts/misc/testlog_parser.py +++ b/modules/ts/misc/testlog_parser.py @@ -1,6 +1,9 @@ #!/usr/bin/env python -import sys, re, os.path +import collections +import re +import os.path +import sys from xml.dom.minidom import parse class TestInfo(object): @@ -159,12 +162,31 @@ class TestInfo(object): return 1 return 0 +# This is a Sequence for compatibility with old scripts, +# which treat parseLogFile's return value as a list. +class TestRunInfo(collections.Sequence): + def __init__(self, properties, tests): + self.properties = properties + self.tests = tests + + def __len__(self): + return len(self.tests) + + def __getitem__(self, key): + return self.tests[key] + def parseLogFile(filename): - tests = [] log = parse(filename) - for case in log.getElementsByTagName("testcase"): - tests.append(TestInfo(case)) - return tests + + properties = { + attr_name[3:]: attr_value + for (attr_name, attr_value) in log.documentElement.attributes.items() + if attr_name.startswith('cv_') + } + + tests = map(TestInfo, log.getElementsByTagName("testcase")) + + return TestRunInfo(properties, tests) if __name__ == "__main__": @@ -173,8 +195,18 @@ if __name__ == "__main__": exit(0) for arg in sys.argv[1:]: - print "Tests found in", arg - tests = parseLogFile(arg) - for t in sorted(tests): + print "Processing {}...".format(arg) + + run = parseLogFile(arg) + + print "Properties:" + + for (prop_name, prop_value) in run.properties.items(): + print "\t{} = {}".format(prop_name, prop_value) + + print "Tests:" + + for t in sorted(run.tests): t.dump() + print diff --git a/modules/ts/misc/xls-report.py b/modules/ts/misc/xls-report.py index e79bb123d..a3cf8daca 100755 --- a/modules/ts/misc/xls-report.py +++ b/modules/ts/misc/xls-report.py @@ -3,6 +3,7 @@ from __future__ import division import ast +import fnmatch import logging import numbers import os, os.path @@ -45,15 +46,55 @@ no_speedup_style = no_time_style error_speedup_style = xlwt.easyxf('pattern: pattern solid, fore_color orange') header_style = xlwt.easyxf('font: bold true; alignment: horizontal centre, vertical top, wrap True') -def collect_xml(collection, configuration, xml_fullname): - xml_fname = os.path.split(xml_fullname)[1] - module = xml_fname[:xml_fname.index('_')] +class Collector(object): + def __init__(self, config_match_func): + self.__config_cache = {} + self.config_match_func = config_match_func + self.tests = {} - module_tests = collection.setdefault(module, OrderedDict()) + def collect_from(self, xml_path): + run = parseLogFile(xml_path) - for test in 
sorted(parseLogFile(xml_fullname)): - test_results = module_tests.setdefault((test.shortName(), test.param()), {}) - test_results[configuration] = test.get("gmean") if test.status == 'run' else test.status + module = run.properties['module_name'] + + properties = run.properties.copy() + del properties['module_name'] + + props_key = tuple(sorted(properties.iteritems())) # dicts can't be keys + + if props_key in self.__config_cache: + configuration = self.__config_cache[props_key] + else: + configuration = self.config_match_func(properties) + + if configuration is None: + logging.warning('failed to match properties to a configuration: %r', props_key) + else: + same_config_props = [it[0] for it in self.__config_cache.iteritems() if it[1] == configuration] + if len(same_config_props) > 0: + logging.warning('property set %r matches the same configuration %r as property set %r', + props_key, configuration, same_config_props[0]) + + self.__config_cache[props_key] = configuration + + if configuration is None: return + + module_tests = self.tests.setdefault(module, OrderedDict()) + + for test in run.tests: + test_results = module_tests.setdefault((test.shortName(), test.param()), {}) + test_results[configuration] = test.get("gmean") if test.status == 'run' else test.status + +def make_match_func(matchers): + def match_func(properties): + for matcher in matchers: + if all(properties.get(name) == value + for (name, value) in matcher['properties'].iteritems()): + return matcher['name'] + + return None + + return match_func def main(): arg_parser = ArgumentParser(description='Build an XLS performance report.') @@ -83,23 +124,15 @@ def main(): sheet_conf = dict(global_conf.items() + sheet_conf.items()) - if 'configurations' in sheet_conf: - config_names = sheet_conf['configurations'] - else: - try: - config_names = [p for p in os.listdir(sheet_path) - if os.path.isdir(os.path.join(sheet_path, p))] - except Exception as e: - logging.warning('error while determining configuration names for %s: %s', sheet_path, e) - continue + config_names = sheet_conf.get('configurations', []) + config_matchers = sheet_conf.get('configuration_matchers', []) - collection = {} + collector = Collector(make_match_func(config_matchers)) - for configuration, configuration_path in \ - [(c, os.path.join(sheet_path, c)) for c in config_names]: - logging.info('processing %s', configuration_path) - for xml_fullname in glob(os.path.join(configuration_path, '*.xml')): - collect_xml(collection, configuration, xml_fullname) + for root, _, filenames in os.walk(sheet_path): + logging.info('looking in %s', root) + for filename in fnmatch.filter(filenames, '*.xml'): + collector.collect_from(os.path.join(root, filename)) sheet = wb.add_sheet(sheet_conf.get('sheet_name', os.path.basename(os.path.abspath(sheet_path)))) @@ -126,7 +159,7 @@ def main(): module_styles = {module: xlwt.easyxf('pattern: pattern solid, fore_color {}'.format(color)) for module, color in module_colors.iteritems()} - for module, tests in sorted(collection.iteritems()): + for module, tests in sorted(collector.tests.iteritems()): for ((test, param), configs) in tests.iteritems(): sheet.write(row, 0, module, module_styles.get(module, xlwt.Style.default_style)) sheet.write(row, 1, test) From 0e3a9eaf980b484a9d5f56c0f38c92164e9c5910 Mon Sep 17 00:00:00 2001 From: Roman Donchenko Date: Fri, 21 Jun 2013 13:43:16 +0400 Subject: [PATCH 11/75] Made Collector render property sets as dicts instead of tuples of pairs. 
--- modules/ts/misc/xls-report.py | 15 ++++++++++++--- 1 file changed, 12 insertions(+), 3 deletions(-) diff --git a/modules/ts/misc/xls-report.py b/modules/ts/misc/xls-report.py index a3cf8daca..2dcbf89cf 100755 --- a/modules/ts/misc/xls-report.py +++ b/modules/ts/misc/xls-report.py @@ -52,6 +52,12 @@ class Collector(object): self.config_match_func = config_match_func self.tests = {} + # Format a sorted sequence of pairs as if it was a dictionary. + # We can't just use a dictionary instead, since we want to preserve the sorted order of the keys. + @staticmethod + def __format_config_cache_key(pairs): + return '{' + ', '.join(repr(k) + ': ' + repr(v) for (k, v) in pairs) + '}' + def collect_from(self, xml_path): run = parseLogFile(xml_path) @@ -68,12 +74,15 @@ class Collector(object): configuration = self.config_match_func(properties) if configuration is None: - logging.warning('failed to match properties to a configuration: %r', props_key) + logging.warning('failed to match properties to a configuration: %s', + Collector.__format_config_cache_key(props_key)) else: same_config_props = [it[0] for it in self.__config_cache.iteritems() if it[1] == configuration] if len(same_config_props) > 0: - logging.warning('property set %r matches the same configuration %r as property set %r', - props_key, configuration, same_config_props[0]) + logging.warning('property set %s matches the same configuration %r as property set %s', + Collector.__format_config_cache_key(props_key), + configuration, + Collector.__format_config_cache_key(same_config_props[0])) self.__config_cache[props_key] = configuration From d4a8b87645f6df2ee5a61c8b0c52a4248d2c600a Mon Sep 17 00:00:00 2001 From: Roman Donchenko Date: Fri, 21 Jun 2013 16:45:17 +0400 Subject: [PATCH 12/75] Wrote relevant docs. --- modules/ts/misc/xls-report.py | 79 ++++++++++++++++++++++++++++------- 1 file changed, 64 insertions(+), 15 deletions(-) diff --git a/modules/ts/misc/xls-report.py b/modules/ts/misc/xls-report.py index 2dcbf89cf..e911314e9 100755 --- a/modules/ts/misc/xls-report.py +++ b/modules/ts/misc/xls-report.py @@ -1,5 +1,69 @@ #!/usr/bin/env python +""" + This script can generate XLS reports from OpenCV tests' XML output files. + + To use it, first, create a directory for each machine you ran tests on. + Each such directory will become a sheet in the report. Put each XML file + into the corresponding directory. + + Then, create your configuration file(s). You can have a global configuration + file (specified with the -c option), and per-sheet configuration files, which + must be called sheet.conf and placed in the directory corresponding to the sheet. + The settings in the per-sheet configuration file will override those in the + global configuration file, if both are present. + + A configuration file must consist of a Python dictionary. The following keys + will be recognized: + + * 'comparisons': [{'from': string, 'to': string}] + List of configurations to compare performance between. For each item, + the sheet will have a column showing speedup from configuration named + 'from' to configuration named "to". + + * 'configuration_matchers': [{'properties': {string: object}, 'name': string}] + Instructions for matching test run property sets to configuration names. + + For each found XML file: + + 1) All attributes of the root element starting with the prefix 'cv_' are + placed in a dictionary, with the cv_ prefix stripped and the cv_module_name + element deleted. 
+ + 2) The first matcher for which the XML's file property set contains the same + keys with equal values as its 'properties' dictionary is searched for. + A missing property can be matched by using None as the value. + + Corollary 1: you should place more specific matchers before less specific + ones. + + Corollary 2: an empty 'properties' dictionary matches every property set. + + 3) If a matching matcher is found, its 'name' string is presumed to be the name + of the configuration the XML file corresponds to. Otherwise, a warning is + printed. A warning is also printed if two different property sets match to the + same configuration name. + + * 'configurations': [string] + List of names for compile-time and runtime configurations of OpenCV. + Each item will correspond to a column of the sheet. + + * 'module_colors': {string: string} + Mapping from module name to color name. In the sheet, cells containing module + names from this mapping will be colored with the corresponding color. You can + find the list of available colors here: + . + + * 'sheet_name': string + Name for the sheet. If this parameter is missing, the name of sheet's directory + will be used. + + Note that all keys are optional, although to get useful results, you'll want to + specify at least 'configurations' and 'configuration_matchers'. + + Finally, run the script. Use the --help option for usage information. +""" + from __future__ import division import ast @@ -18,21 +82,6 @@ import xlwt from testlog_parser import parseLogFile -# To build XLS report you neet to put your xmls (OpenCV tests output) in the -# following way: -# -# "root" --- folder, representing the whole XLS document. It contains several -# subfolders --- sheet-paths of the XLS document. Each sheet-path contains it's -# subfolders --- config-paths. Config-paths are columns of the sheet and -# they contains xmls files --- output of OpenCV modules testing. -# Config-path means OpenCV build configuration, including different -# options such as NEON, TBB, GPU enabling/disabling. -# -# root -# root\sheet_path -# root\sheet_path\configuration1 (column 1) -# root\sheet_path\configuration2 (column 2) - re_image_size = re.compile(r'^ \d+ x \d+$', re.VERBOSE) re_data_type = re.compile(r'^ (?: 8 | 16 | 32 | 64 ) [USF] C [1234] $', re.VERBOSE) From c16316c4b49fe73b768210d43db46405f177e9fe Mon Sep 17 00:00:00 2001 From: Roman Donchenko Date: Fri, 21 Jun 2013 12:43:16 +0400 Subject: [PATCH 13/75] Replaced the semi-public CV_PARALLEL_FRAMEWORK macro with a function. That way, core/internal.hpp doesn't have to depend on cvconfig.h, which we don't ship. 
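The expected call pattern for the new function is straightforward. A minimal sketch follows; note that core/internal.hpp is a private header, so outside of OpenCV's own modules and test framework this is illustrative only:

    #include "opencv2/core/internal.hpp"
    #include <cstdio>

    static void reportParallelFramework()
    {
        // Returns a static string such as "tbb" or "openmp" if OpenCV was built
        // with a parallel backend, and NULL otherwise.
        const char* fw = cv::currentParallelFramework();
        if (fw)
            std::printf("Parallel framework: %s\n", fw);
        else
            std::printf("OpenCV was built without a parallel framework\n");
    }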
--- .../core/include/opencv2/core/internal.hpp | 31 ++++--------------- modules/core/src/parallel.cpp | 31 +++++++++++++++++++ modules/ts/src/precomp.hpp | 1 - modules/ts/src/ts_func.cpp | 11 +++---- 4 files changed, 42 insertions(+), 32 deletions(-) diff --git a/modules/core/include/opencv2/core/internal.hpp b/modules/core/include/opencv2/core/internal.hpp index 10cd2caf9..606c62f8f 100644 --- a/modules/core/include/opencv2/core/internal.hpp +++ b/modules/core/include/opencv2/core/internal.hpp @@ -50,7 +50,8 @@ #include -#include "cvconfig.h" +#include "opencv2/core/core.hpp" +#include "opencv2/core/types_c.h" #if defined WIN32 || defined _WIN32 # ifndef WIN32 @@ -186,30 +187,6 @@ CV_INLINE IppiSize ippiSize(int width, int height) # include "opencv2/core/eigen.hpp" #endif -#ifdef _OPENMP -# define HAVE_OPENMP -#endif - -#ifdef __APPLE__ -# define HAVE_GCD -#endif - -#if defined _MSC_VER && _MSC_VER >= 1600 -# define HAVE_CONCURRENCY -#endif - -#if defined HAVE_TBB && TBB_VERSION_MAJOR*100 + TBB_VERSION_MINOR >= 202 -# define CV_PARALLEL_FRAMEWORK "tbb" -#elif defined HAVE_CSTRIPES -# define CV_PARALLEL_FRAMEWORK "cstripes" -#elif defined HAVE_OPENMP -# define CV_PARALLEL_FRAMEWORK "openmp" -#elif defined HAVE_GCD -# define CV_PARALLEL_FRAMEWORK "gcd" -#elif defined HAVE_CONCURRENCY -# define CV_PARALLEL_FRAMEWORK "ms-concurrency" -#endif - #ifdef __cplusplus namespace cv @@ -277,6 +254,10 @@ namespace cv body(range); } #endif + + // Returns a static string if there is a parallel framework, + // NULL otherwise. + CV_EXPORTS const char* currentParallelFramework(); } //namespace cv #define CV_INIT_ALGORITHM(classname, algname, memberinit) \ diff --git a/modules/core/src/parallel.cpp b/modules/core/src/parallel.cpp index 51b165275..0a9ed0987 100644 --- a/modules/core/src/parallel.cpp +++ b/modules/core/src/parallel.cpp @@ -61,6 +61,17 @@ #endif #endif +#ifdef _OPENMP + #define HAVE_OPENMP +#endif + +#ifdef __APPLE__ + #define HAVE_GCD +#endif + +#if defined _MSC_VER && _MSC_VER >= 1600 + #define HAVE_CONCURRENCY +#endif /* IMPORTANT: always use the same order of defines 1. 
HAVE_TBB - 3rdparty library, should be explicitly enabled @@ -99,6 +110,18 @@ #endif #endif +#if defined HAVE_TBB && TBB_VERSION_MAJOR*100 + TBB_VERSION_MINOR >= 202 +# define CV_PARALLEL_FRAMEWORK "tbb" +#elif defined HAVE_CSTRIPES +# define CV_PARALLEL_FRAMEWORK "cstripes" +#elif defined HAVE_OPENMP +# define CV_PARALLEL_FRAMEWORK "openmp" +#elif defined HAVE_GCD +# define CV_PARALLEL_FRAMEWORK "gcd" +#elif defined HAVE_CONCURRENCY +# define CV_PARALLEL_FRAMEWORK "ms-concurrency" +#endif + namespace cv { ParallelLoopBody::~ParallelLoopBody() {} @@ -465,6 +488,14 @@ int cv::getNumberOfCPUs(void) #endif } +const char* cv::currentParallelFramework() { +#ifdef CV_PARALLEL_FRAMEWORK + return CV_PARALLEL_FRAMEWORK; +#else + return NULL; +#endif +} + CV_IMPL void cvSetNumThreads(int nt) { cv::setNumThreads(nt); diff --git a/modules/ts/src/precomp.hpp b/modules/ts/src/precomp.hpp index 0b2adacc4..a74417da4 100644 --- a/modules/ts/src/precomp.hpp +++ b/modules/ts/src/precomp.hpp @@ -1,4 +1,3 @@ -#include "opencv2/core/core.hpp" #include "opencv2/core/core_c.h" #include "opencv2/core/internal.hpp" #include "opencv2/ts/ts.hpp" diff --git a/modules/ts/src/ts_func.cpp b/modules/ts/src/ts_func.cpp index e2998149d..0186e9c8f 100644 --- a/modules/ts/src/ts_func.cpp +++ b/modules/ts/src/ts_func.cpp @@ -2963,13 +2963,12 @@ void printVersionInfo(bool useStdOut) if(useStdOut) std::cout << ver << std::endl; } -#ifdef CV_PARALLEL_FRAMEWORK - ::testing::Test::RecordProperty("cv_parallel_framework", CV_PARALLEL_FRAMEWORK); - if (useStdOut) - { - std::cout << "Parallel framework: " << CV_PARALLEL_FRAMEWORK << std::endl; + const char* parallel_framework = currentParallelFramework(); + + if (parallel_framework) { + ::testing::Test::RecordProperty("cv_parallel_framework", parallel_framework); + if (useStdOut) std::cout << "Parallel framework: " << parallel_framework << std::endl; } -#endif std::string cpu_features; From e3577c2f586aaa83c858139d67c1317259416454 Mon Sep 17 00:00:00 2001 From: Alexander Smorkalov Date: Mon, 8 Apr 2013 18:13:49 -0700 Subject: [PATCH 14/75] Build with dev release of TBB enabled. 
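Nothing changes for users configuring OpenCV; this only bumps the TBB sources that are fetched when TBB is built as part of the OpenCV build. Assuming the usual CMake switches (an assumption, since they are not shown in this patch), the file changed below is exercised with a configuration such as:

    cmake -DWITH_TBB=ON -DBUILD_TBB=ON <path-to-opencv-source>

where BUILD_TBB=ON makes the build download and compile the TBB release pinned in the CMakeLists.txt change below instead of linking against a system copy.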
--- 3rdparty/tbb/CMakeLists.txt | 47 ++++++++++++++++++++++++++++--------- 1 file changed, 36 insertions(+), 11 deletions(-) diff --git a/3rdparty/tbb/CMakeLists.txt b/3rdparty/tbb/CMakeLists.txt index af1581349..9dcb63b7f 100644 --- a/3rdparty/tbb/CMakeLists.txt +++ b/3rdparty/tbb/CMakeLists.txt @@ -1,13 +1,20 @@ #Cross compile TBB from source project(tbb) -# 4.1 update 2 - works fine -set(tbb_ver "tbb41_20130116oss") -set(tbb_url "http://threadingbuildingblocks.org/sites/default/files/software_releases/source/tbb41_20130116oss_src.tgz") -set(tbb_md5 "3809790e1001a1b32d59c9fee590ee85") +# 4.1 update 3 dev - works fine +set(tbb_ver "tbb41_20130401oss") +set(tbb_url "http://threadingbuildingblocks.org/sites/default/files/software_releases/source/tbb41_20130401oss_src.tgz") +set(tbb_md5 "f2f591a0d2ca8f801e221ce7d9ea84bb") set(tbb_version_file "version_string.ver") ocv_warnings_disable(CMAKE_CXX_FLAGS -Wshadow) +# 4.1 update 2 - works fine +#set(tbb_ver "tbb41_20130116oss") +#set(tbb_url "http://threadingbuildingblocks.org/sites/default/files/software_releases/source/tbb41_20130116oss_src.tgz") +#set(tbb_md5 "3809790e1001a1b32d59c9fee590ee85") +#set(tbb_version_file "version_string.ver") +#ocv_warnings_disable(CMAKE_CXX_FLAGS -Wshadow) + # 4.1 update 1 - works fine #set(tbb_ver "tbb41_20121003oss") #set(tbb_url "http://threadingbuildingblocks.org/sites/default/files/software_releases/source/tbb41_20121003oss_src.tgz") @@ -107,7 +114,8 @@ if(NOT EXISTS "${tbb_src_dir}") RESULT_VARIABLE tbb_untar_RESULT) if(NOT tbb_untar_RESULT EQUAL 0 OR NOT EXISTS "${tbb_src_dir}") - message(FATAL_ERROR "Failed to unpack TBB sources") + message(FATAL_ERROR "Failed to unpack TBB sources (${tbb_untar_RESULT} ${tbb_src_dir})") + endif() endif() @@ -124,11 +132,12 @@ list(APPEND lib_srcs "${tbb_src_dir}/src/rml/client/rml_tbb.cpp") if (WIN32) add_definitions(-D__TBB_DYNAMIC_LOAD_ENABLED=0 - -D__TBB_BUILD=1 - -D_UNICODE - -DUNICODE - -DWINAPI_FAMILY=WINAPI_FAMILY_APP - -DDO_ITT_NOTIFY=0 + /D__TBB_BUILD=1 + /D_UNICODE + /DUNICODE + /DWINAPI_FAMILY=WINAPI_FAMILY_APP + /DDO_ITT_NOTIFY=0 + /DUSE_WINTHREAD ) # defines were copied from windows.cl.inc set(CMAKE_LINKER_FLAGS "${CMAKE_LINKER_FLAGS} /APPCONTAINER") else() @@ -173,7 +182,23 @@ endif() set(TBB_SOURCE_FILES ${TBB_SOURCE_FILES} "${CMAKE_CURRENT_SOURCE_DIR}/${tbb_version_file}") add_library(tbb ${TBB_SOURCE_FILES}) -target_link_libraries(tbb c m dl) + +if (WIN32) + add_custom_command(TARGET tbb + PRE_BUILD + COMMAND ${CMAKE_C_COMPILER} /nologo /TC /EP ${tbb_src_dir}\\src\\tbb\\win32-tbb-export.def /DTBB_NO_LEGACY /DUSE_WINTHREAD /D_CRT_SECURE_NO_DEPRECATE /D_WIN32_WINNT=0x0400 /D__TBB_BUILD=1 /I${tbb_src_dir}\\src /I${tbb_src_dir}\\include > "${tbb_src_dir}\\src\\tbb\\tbb.def" + WORKING_DIRECTORY ${tbb_src_dir}\\src\\tbb + COMMENT "Generating tbb.def file" VERBATIM + ) +endif() + +if (WIN32) + set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} /DEF:${tbb_src_dir}/src/tbb/tbb.def /DLL /MAP /fixed:no /INCREMENTAL:NO") +endif() + +if (NOT WIN32) + target_link_libraries(tbb c m dl) +endif() ocv_warnings_disable(CMAKE_CXX_FLAGS -Wundef -Wmissing-declarations) string(REPLACE "-Werror=non-virtual-dtor" "" CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS}") From 033e3092a3297d8cb3e49eeff825cb706dd01f95 Mon Sep 17 00:00:00 2001 From: Alexander Smorkalov Date: Tue, 16 Apr 2013 06:25:10 -0700 Subject: [PATCH 15/75] Media Foundation based VideoCapture improved Code formating fixed; GrabFrame method implemented correclty. 
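The behavioural change below is that grabFrame() now blocks until the device reports a fresh frame instead of returning true unconditionally. For reference, the consumer-side pattern it has to support is the ordinary C-API capture loop; a sketch with an illustrative camera index and frame count:

    #include "opencv2/highgui/highgui_c.h"

    int main(void)
    {
        CvCapture* cap = cvCreateCameraCapture(0);     // may resolve to the MSMF backend on Windows
        if (!cap)
            return 1;
        for (int i = 0; i < 100; ++i)
        {
            if (!cvGrabFrame(cap))                     // now waits until a frame is ready
                break;
            IplImage* frame = cvRetrieveFrame(cap, 0); // owned by the capture; do not release
            (void)frame;                               // process the frame here
        }
        cvReleaseCapture(&cap);
        return 0;
    }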
--- modules/highgui/src/cap.cpp | 5 ++ modules/highgui/src/cap_msmf.cpp | 146 +++++++++++++++++++++++++++++-- 2 files changed, 144 insertions(+), 7 deletions(-) diff --git a/modules/highgui/src/cap.cpp b/modules/highgui/src/cap.cpp index 2c3b3a94c..3750d1d66 100644 --- a/modules/highgui/src/cap.cpp +++ b/modules/highgui/src/cap.cpp @@ -117,6 +117,9 @@ CV_IMPL CvCapture * cvCreateCameraCapture (int index) #ifdef HAVE_DSHOW CV_CAP_DSHOW, #endif +#ifdef HAVE_MSMF + CV_CAP_MSMF, +#endif #if 1 CV_CAP_IEEE1394, // identical to CV_CAP_DC1394 #endif @@ -198,7 +201,9 @@ CV_IMPL CvCapture * cvCreateCameraCapture (int index) { #ifdef HAVE_MSMF case CV_CAP_MSMF: + printf("Creating Media foundation capture\n"); capture = cvCreateCameraCapture_MSMF (index); + printf("Capture address %p\n", capture); if (capture) return capture; break; diff --git a/modules/highgui/src/cap_msmf.cpp b/modules/highgui/src/cap_msmf.cpp index 52b780463..28d92c361 100644 --- a/modules/highgui/src/cap_msmf.cpp +++ b/modules/highgui/src/cap_msmf.cpp @@ -61,18 +61,22 @@ #include #include #include + #pragma warning(disable:4503) #pragma comment(lib, "mfplat") #pragma comment(lib, "mf") #pragma comment(lib, "mfuuid") #pragma comment(lib, "Strmiids") #pragma comment(lib, "MinCore_Downlevel") + struct IMFMediaType; struct IMFActivate; struct IMFMediaSource; struct IMFAttributes; + namespace { + template void SafeRelease(T **ppT) { if (*ppT) @@ -81,7 +85,8 @@ template void SafeRelease(T **ppT) *ppT = NULL; } } - /// Class for printing info into consol + +/// Class for printing info into consol class DebugPrintOut { public: @@ -93,6 +98,7 @@ public: private: DebugPrintOut(void); }; + // Structure for collecting info about types of video, which are supported by current video device struct MediaType { @@ -127,6 +133,7 @@ struct MediaType ~MediaType(); void Clear(); }; + /// Class for parsing info from IMFMediaType into the local MediaType class FormatReader { @@ -136,9 +143,11 @@ public: private: FormatReader(void); }; + DWORD WINAPI MainThreadFunction( LPVOID lpParam ); typedef void(*emergensyStopEventCallback)(int, void *); typedef unsigned char BYTE; + class RawImage { public: @@ -156,6 +165,7 @@ private: unsigned char *ri_pixels; RawImage(unsigned int size); }; + // Class for grabbing image from video stream class ImageGrabber : public IMFSampleGrabberSinkCallback { @@ -836,10 +846,13 @@ MediaType FormatReader::Read(IMFMediaType *pType) FormatReader::~FormatReader(void) { } -#define CHECK_HR(x) if (FAILED(x)) { goto done; } + +#define CHECK_HR(x) if (FAILED(x)) { printf("Checking failed !!!\n"); goto done; } + ImageGrabber::ImageGrabber(unsigned int deviceID): m_cRef(1), ig_DeviceID(deviceID), ig_pSource(NULL), ig_pSession(NULL), ig_pTopology(NULL), ig_RIE(true), ig_Close(false) { } + ImageGrabber::~ImageGrabber(void) { if (ig_pSession) @@ -851,6 +864,7 @@ ImageGrabber::~ImageGrabber(void) DebugPrintOut *DPO = &DebugPrintOut::getInstance(); DPO->printOut(L"IMAGEGRABBER VIDEODEVICE %i: Destroing instance of the ImageGrabber class \n", ig_DeviceID); } + HRESULT ImageGrabber::initImageGrabber(IMFMediaSource *pSource, GUID VideoFormat) { IMFActivate *pSinkActivate = NULL; @@ -871,23 +885,34 @@ HRESULT ImageGrabber::initImageGrabber(IMFMediaSource *pSource, GUID VideoFormat ig_pSource = pSource; hr = pSource->CreatePresentationDescriptor(&pPD); if (FAILED(hr)) + { + printf("Error creating CreatePresentationDescriptor()\n"); goto err; + } BOOL fSelected; hr = pPD->GetStreamDescriptorByIndex(0, &fSelected, &pSD); - if (FAILED(hr)) + if 
(FAILED(hr)) { + printf("Error GetStreamDescriptorByIndex()\n"); goto err; + } hr = pSD->GetMediaTypeHandler(&pHandler); - if (FAILED(hr)) + if (FAILED(hr)) { + printf("Error GetMediaTypeHandler()\n"); goto err; + } DWORD cTypes = 0; hr = pHandler->GetMediaTypeCount(&cTypes); - if (FAILED(hr)) + if (FAILED(hr)) { + printf("Error GetMediaTypeCount()\n"); goto err; + } if(cTypes > 0) { hr = pHandler->GetCurrentMediaType(&pCurrentType); - if (FAILED(hr)) + if (FAILED(hr)) { + printf("Error GetCurrentMediaType()\n"); goto err; + } MT = FormatReader::Read(pCurrentType); } err: @@ -904,6 +929,10 @@ err: { sizeRawImage = MT.MF_MT_FRAME_SIZE * 4; } + else + { + printf("Video format is not RBG 24/32!\n"); + } CHECK_HR(hr = RawImage::CreateInstance(&ig_RIFirst, sizeRawImage)); CHECK_HR(hr = RawImage::CreateInstance(&ig_RISecond, sizeRawImage)); ig_RIOut = ig_RISecond; @@ -936,6 +965,7 @@ done: SafeRelease(&pType); return hr; } + void ImageGrabber::stopGrabbing() { if(ig_pSession) @@ -943,6 +973,7 @@ void ImageGrabber::stopGrabbing() DebugPrintOut *DPO = &DebugPrintOut::getInstance(); DPO->printOut(L"IMAGEGRABBER VIDEODEVICE %i: Stopping of of grabbing of images\n", ig_DeviceID); } + HRESULT ImageGrabber::startGrabbing(void) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -995,12 +1026,15 @@ HRESULT ImageGrabber::startGrabbing(void) SafeRelease(&pEvent); } DPO->printOut(L"IMAGEGRABBER VIDEODEVICE %i: Finish startGrabbing \n", ig_DeviceID); + done: SafeRelease(&pEvent); SafeRelease(&ig_pSession); SafeRelease(&ig_pTopology); + return hr; } + HRESULT ImageGrabber::CreateTopology(IMFMediaSource *pSource, IMFActivate *pSinkActivate, IMFTopology **ppTopo) { IMFTopology *pTopology = NULL; @@ -1038,6 +1072,7 @@ HRESULT ImageGrabber::CreateTopology(IMFMediaSource *pSource, IMFActivate *pSink } *ppTopo = pTopology; (*ppTopo)->AddRef(); + done: SafeRelease(&pTopology); SafeRelease(&pNode1); @@ -1045,8 +1080,10 @@ done: SafeRelease(&pPD); SafeRelease(&pSD); SafeRelease(&pHandler); + return hr; } + HRESULT ImageGrabber::AddSourceNode( IMFTopology *pTopology, // Topology. IMFMediaSource *pSource, // Media source. @@ -1064,10 +1101,13 @@ HRESULT ImageGrabber::AddSourceNode( // Return the pointer to the caller. *ppNode = pNode; (*ppNode)->AddRef(); + done: SafeRelease(&pNode); + return hr; } + HRESULT ImageGrabber::AddOutputNode( IMFTopology *pTopology, // Topology. IMFActivate *pActivate, // Media sink activation object. @@ -1084,10 +1124,13 @@ HRESULT ImageGrabber::AddOutputNode( // Return the pointer to the caller. 
*ppNode = pNode; (*ppNode)->AddRef(); + done: SafeRelease(&pNode); + return hr; } + HRESULT ImageGrabber::CreateInstance(ImageGrabber **ppIG, unsigned int deviceID) { *ppIG = new (std::nothrow) ImageGrabber(deviceID); @@ -1099,6 +1142,7 @@ HRESULT ImageGrabber::CreateInstance(ImageGrabber **ppIG, unsigned int deviceID) DPO->printOut(L"IMAGEGRABBER VIDEODEVICE %i: Creating instance of ImageGrabber\n", deviceID); return S_OK; } + STDMETHODIMP ImageGrabber::QueryInterface(REFIID riid, void** ppv) { HRESULT hr = E_NOINTERFACE; @@ -1119,10 +1163,12 @@ STDMETHODIMP ImageGrabber::QueryInterface(REFIID riid, void** ppv) } return hr; } + STDMETHODIMP_(ULONG) ImageGrabber::AddRef() { return InterlockedIncrement(&m_cRef); } + STDMETHODIMP_(ULONG) ImageGrabber::Release() { ULONG cRef = InterlockedDecrement(&m_cRef); @@ -1132,38 +1178,45 @@ STDMETHODIMP_(ULONG) ImageGrabber::Release() } return cRef; } + STDMETHODIMP ImageGrabber::OnClockStart(MFTIME hnsSystemTime, LONGLONG llClockStartOffset) { (void)hnsSystemTime; (void)llClockStartOffset; return S_OK; } + STDMETHODIMP ImageGrabber::OnClockStop(MFTIME hnsSystemTime) { (void)hnsSystemTime; return S_OK; } + STDMETHODIMP ImageGrabber::OnClockPause(MFTIME hnsSystemTime) { (void)hnsSystemTime; return S_OK; } + STDMETHODIMP ImageGrabber::OnClockRestart(MFTIME hnsSystemTime) { (void)hnsSystemTime; return S_OK; } + STDMETHODIMP ImageGrabber::OnClockSetRate(MFTIME hnsSystemTime, float flRate) { (void)flRate; (void)hnsSystemTime; return S_OK; } + STDMETHODIMP ImageGrabber::OnSetPresentationClock(IMFPresentationClock* pClock) { (void)pClock; return S_OK; } + STDMETHODIMP ImageGrabber::OnProcessSample(REFGUID guidMajorMediaType, DWORD dwSampleFlags, LONGLONG llSampleTime, LONGLONG llSampleDuration, const BYTE * pSampleBuffer, DWORD dwSampleSize) @@ -1173,6 +1226,9 @@ STDMETHODIMP ImageGrabber::OnProcessSample(REFGUID guidMajorMediaType, DWORD dwS (void)dwSampleFlags; (void)llSampleDuration; (void)dwSampleSize; + + printf("ImageGrabber::OnProcessSample() -- begin\n"); + if(ig_RIE) { ig_RIFirst->fastCopy(pSampleBuffer); @@ -1184,22 +1240,29 @@ STDMETHODIMP ImageGrabber::OnProcessSample(REFGUID guidMajorMediaType, DWORD dwS ig_RIOut = ig_RISecond; } ig_RIE = !ig_RIE; + + printf("ImageGrabber::OnProcessSample() -- end\n"); + return S_OK; } + STDMETHODIMP ImageGrabber::OnShutdown() { return S_OK; } + RawImage *ImageGrabber::getRawImage() { return ig_RIOut; } + DWORD WINAPI MainThreadFunction( LPVOID lpParam ) { ImageGrabberThread *pIGT = (ImageGrabberThread *)lpParam; pIGT->run(); return 0; } + HRESULT ImageGrabberThread::CreateInstance(ImageGrabberThread **ppIGT, IMFMediaSource *pSource, unsigned int deviceID) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -1213,6 +1276,7 @@ HRESULT ImageGrabberThread::CreateInstance(ImageGrabberThread **ppIGT, IMFMediaS DPO->printOut(L"IMAGEGRABBERTHREAD VIDEODEVICE %i: Creating of the instance of ImageGrabberThread\n", deviceID); return S_OK; } + ImageGrabberThread::ImageGrabberThread(IMFMediaSource *pSource, unsigned int deviceID): igt_Handle(NULL), igt_stop(false) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -1235,6 +1299,7 @@ ImageGrabberThread::ImageGrabberThread(IMFMediaSource *pSource, unsigned int dev DPO->printOut(L"IMAGEGRABBERTHREAD VIDEODEVICE %i There is a problem with creation of the instance of the ImageGrabber class\n", deviceID); } } + void ImageGrabberThread::setEmergencyStopEvent(void *userData, void(*func)(int, void *)) { if(func) @@ -1243,12 +1308,14 @@ void 
ImageGrabberThread::setEmergencyStopEvent(void *userData, void(*func)(int, igt_userData = userData; } } + ImageGrabberThread::~ImageGrabberThread(void) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); DPO->printOut(L"IMAGEGRABBERTHREAD VIDEODEVICE %i: Destroing ImageGrabberThread\n", igt_DeviceID); delete igt_pImageGrabber; } + void ImageGrabberThread::stop() { igt_stop = true; @@ -1257,6 +1324,7 @@ void ImageGrabberThread::stop() igt_pImageGrabber->stopGrabbing(); } } + void ImageGrabberThread::start() { igt_Handle = CreateThread( @@ -1267,6 +1335,7 @@ void ImageGrabberThread::start() 0, // use default creation flags &igt_ThreadIdArray); // returns the thread identifier } + void ImageGrabberThread::run() { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -1294,10 +1363,12 @@ void ImageGrabberThread::run() else DPO->printOut(L"IMAGEGRABBERTHREAD VIDEODEVICE %i: Finish thread\n", igt_DeviceID); } + ImageGrabber *ImageGrabberThread::getImageGrabber() { return igt_pImageGrabber; } + Media_Foundation::Media_Foundation(void) { HRESULT hr = MFStartup(MF_VERSION); @@ -1307,6 +1378,7 @@ Media_Foundation::Media_Foundation(void) DPO->printOut(L"MEDIA FOUNDATION: It cannot be created!!!\n"); } } + Media_Foundation::~Media_Foundation(void) { HRESULT hr = MFShutdown(); @@ -1316,6 +1388,7 @@ Media_Foundation::~Media_Foundation(void) DPO->printOut(L"MEDIA FOUNDATION: Resources cannot be released\n"); } } + bool Media_Foundation::buildListOfDevices() { HRESULT hr = S_OK; @@ -1342,30 +1415,36 @@ bool Media_Foundation::buildListOfDevices() SafeRelease(&pAttributes); return (SUCCEEDED(hr)); } + Media_Foundation& Media_Foundation::getInstance() { static Media_Foundation instance; return instance; } + RawImage::RawImage(unsigned int size): ri_new(false), ri_pixels(NULL) { ri_size = size; ri_pixels = new unsigned char[size]; memset((void *)ri_pixels,0,ri_size); } + bool RawImage::isNew() { return ri_new; } + unsigned int RawImage::getSize() { return ri_size; } + RawImage::~RawImage(void) { delete []ri_pixels; ri_pixels = NULL; } + long RawImage::CreateInstance(RawImage **ppRImage,unsigned int size) { *ppRImage = new (std::nothrow) RawImage(size); @@ -1375,25 +1454,30 @@ long RawImage::CreateInstance(RawImage **ppRImage,unsigned int size) } return S_OK; } + void RawImage::setCopy(const BYTE * pSampleBuffer) { memcpy(ri_pixels, pSampleBuffer, ri_size); ri_new = true; } + void RawImage::fastCopy(const BYTE * pSampleBuffer) { memcpy(ri_pixels, pSampleBuffer, ri_size); ri_new = true; } + unsigned char * RawImage::getpPixels() { ri_new = false; return ri_pixels; } + videoDevice::videoDevice(void): vd_IsSetuped(false), vd_LockOut(OpenLock), vd_pFriendlyName(NULL), vd_Width(0), vd_Height(0), vd_pSource(NULL), vd_func(NULL), vd_userData(NULL) { } + void videoDevice::setParametrs(CamParametrs parametrs) { if(vd_IsSetuped) @@ -1428,6 +1512,7 @@ void videoDevice::setParametrs(CamParametrs parametrs) } } } + CamParametrs videoDevice::getParametrs() { CamParametrs out; @@ -1472,6 +1557,7 @@ CamParametrs videoDevice::getParametrs() } return out; } + long videoDevice::resetDevice(IMFActivate *pActivate) { HRESULT hr = -1; @@ -1503,6 +1589,7 @@ long videoDevice::resetDevice(IMFActivate *pActivate) } return hr; } + long videoDevice::readInfoOfDevice(IMFActivate *pActivate, unsigned int Num) { HRESULT hr = -1; @@ -1510,6 +1597,7 @@ long videoDevice::readInfoOfDevice(IMFActivate *pActivate, unsigned int Num) hr = resetDevice(pActivate); return hr; } + long videoDevice::checkDevice(IMFAttributes *pAttributes, IMFActivate 
**pDevice) { HRESULT hr = S_OK; @@ -1911,8 +1999,10 @@ done: SafeRelease(&pType); return hr; } + videoDevices::videoDevices(void): count(0) {} + void videoDevices::clearDevices() { std::vector::iterator i = vds_Devices.begin(); @@ -1920,10 +2010,12 @@ void videoDevices::clearDevices() delete (*i); vds_Devices.clear(); } + videoDevices::~videoDevices(void) { clearDevices(); } + videoDevice * videoDevices::getDevice(unsigned int i) { if(i >= vds_Devices.size()) @@ -1936,6 +2028,7 @@ videoDevice * videoDevices::getDevice(unsigned int i) } return vds_Devices[i]; } + long videoDevices::initDevices(IMFAttributes *pAttributes) { HRESULT hr = S_OK; @@ -1965,15 +2058,18 @@ long videoDevices::initDevices(IMFAttributes *pAttributes) } return hr; } + size_t videoDevices::getCount() { return vds_Devices.size(); } + videoDevices& videoDevices::getInstance() { static videoDevices instance; return instance; } + Parametr::Parametr() { CurrentValue = 0; @@ -1983,6 +2079,7 @@ Parametr::Parametr() Default = 0; Flag = 0; } + MediaType::MediaType() { pMF_MT_AM_FORMAT_TYPEName = NULL; @@ -1990,10 +2087,12 @@ MediaType::MediaType() pMF_MT_SUBTYPEName = NULL; Clear(); } + MediaType::~MediaType() { Clear(); } + void MediaType::Clear() { MF_MT_FRAME_SIZE = 0; @@ -2021,6 +2120,7 @@ void MediaType::Clear() memset(&MF_MT_AM_FORMAT_TYPE, 0, sizeof(GUID)); memset(&MF_MT_SUBTYPE, 0, sizeof(GUID)); } + videoInput::videoInput(void): accessToDevices(false) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2029,6 +2129,7 @@ videoInput::videoInput(void): accessToDevices(false) if(!accessToDevices) DPO->printOut(L"INITIALIZATION: Ther is not any suitable video device\n"); } + void videoInput::updateListOfDevices() { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2037,11 +2138,13 @@ void videoInput::updateListOfDevices() if(!accessToDevices) DPO->printOut(L"UPDATING: Ther is not any suitable video device\n"); } + videoInput::~videoInput(void) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); DPO->printOut(L"\n***** CLOSE VIDEOINPUT LIBRARY - 2013 *****\n\n"); } + IMFMediaSource *videoInput::getMediaSource(int deviceID) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2063,6 +2166,7 @@ IMFMediaSource *videoInput::getMediaSource(int deviceID) } return NULL; } + bool videoInput::setupDevice(int deviceID, unsigned int id) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2089,6 +2193,7 @@ bool videoInput::setupDevice(int deviceID, unsigned int id) } return false; } + bool videoInput::setupDevice(int deviceID, unsigned int w, unsigned int h, unsigned int idealFramerate) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2115,6 +2220,7 @@ bool videoInput::setupDevice(int deviceID, unsigned int w, unsigned int h, unsig } return false; } + MediaType videoInput::getFormat(int deviceID, unsigned int id) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2136,6 +2242,7 @@ MediaType videoInput::getFormat(int deviceID, unsigned int id) } return MediaType(); } + bool videoInput::isDeviceSetup(int deviceID) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2157,6 +2264,7 @@ bool videoInput::isDeviceSetup(int deviceID) } return false; } + bool videoInput::isDeviceMediaSource(int deviceID) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2178,6 +2286,7 @@ bool videoInput::isDeviceMediaSource(int deviceID) } return false; } + bool videoInput::isDeviceRawDataSource(int deviceID) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2202,6 +2311,7 @@ bool 
videoInput::isDeviceRawDataSource(int deviceID) } return false; } + bool videoInput::isFrameNew(int deviceID) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2230,6 +2340,7 @@ bool videoInput::isFrameNew(int deviceID) } return false; } + unsigned int videoInput::getCountFormats(int deviceID) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2251,12 +2362,14 @@ unsigned int videoInput::getCountFormats(int deviceID) } return 0; } + void videoInput::closeAllDevices() { videoDevices *VDS = &videoDevices::getInstance(); for(unsigned int i = 0; i < VDS->getCount(); i++) closeDevice(i); } + void videoInput::setParametrs(int deviceID, CamParametrs parametrs) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2277,6 +2390,7 @@ void videoInput::setParametrs(int deviceID, CamParametrs parametrs) DPO->printOut(L"VIDEODEVICE(s): There is not any suitable video device\n"); } } + CamParametrs videoInput::getParametrs(int deviceID) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2299,6 +2413,7 @@ CamParametrs videoInput::getParametrs(int deviceID) } return out; } + void videoInput::closeDevice(int deviceID) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2319,6 +2434,7 @@ void videoInput::closeDevice(int deviceID) DPO->printOut(L"VIDEODEVICE(s): There is not any suitable video device\n"); } } + unsigned int videoInput::getWidth(int deviceID) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2340,6 +2456,7 @@ unsigned int videoInput::getWidth(int deviceID) } return 0; } + unsigned int videoInput::getHeight(int deviceID) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2361,6 +2478,7 @@ unsigned int videoInput::getHeight(int deviceID) } return 0; } + wchar_t *videoInput::getNameVideoDevice(int deviceID) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2382,6 +2500,7 @@ wchar_t *videoInput::getNameVideoDevice(int deviceID) } return L"Empty"; } + unsigned int videoInput::listDevices(bool silent) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2405,20 +2524,24 @@ unsigned int videoInput::listDevices(bool silent) } return out; } + videoInput& videoInput::getInstance() { static videoInput instance; return instance; } + bool videoInput::isDevicesAcceable() { return accessToDevices; } + void videoInput::setVerbose(bool state) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); DPO->setVerbose(state); } + void videoInput::setEmergencyStopEvent(int deviceID, void *userData, void(*func)(int, void *)) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2442,6 +2565,7 @@ void videoInput::setEmergencyStopEvent(int deviceID, void *userData, void(*func) DPO->printOut(L"VIDEODEVICE(s): There is not any suitable video device\n"); } } + bool videoInput::getPixels(int deviceID, unsigned char * dstBuffer, bool flipRedAndBlue, bool flipImage) { bool success = false; @@ -2491,6 +2615,7 @@ bool videoInput::getPixels(int deviceID, unsigned char * dstBuffer, bool flipRed } return success; } + void videoInput::processPixels(unsigned char * src, unsigned char * dst, unsigned int width, unsigned int height, unsigned int bpp, bool bRGB, bool bFlip) { @@ -2553,6 +2678,7 @@ void videoInput::processPixels(unsigned char * src, unsigned char * dst, unsigne } } } + /******* Capturing video from camera via Microsoft Media Foundation **********/ class CvCaptureCAM_MSMF : public CvCapture { @@ -2605,6 +2731,7 @@ void CvCaptureCAM_MSMF::close() } widthSet = heightSet = width = height = -1; } + // Initialize camera input bool CvCaptureCAM_MSMF::open( int 
_index ) { @@ -2621,10 +2748,14 @@ bool CvCaptureCAM_MSMF::open( int _index ) index = try_index; return true; } + bool CvCaptureCAM_MSMF::grabFrame() { - return true; + while (VI.isDeviceSetup(index) && !VI.isFrameNew(index)) + Sleep(1); + return VI.isDeviceSetup(index); } + IplImage* CvCaptureCAM_MSMF::retrieveFrame(int) { if( !frame || (int)VI.getWidth(index) != frame->width || (int)VI.getHeight(index) != frame->height ) @@ -2637,6 +2768,7 @@ IplImage* CvCaptureCAM_MSMF::retrieveFrame(int) VI.getPixels( index, (uchar*)frame->imageData, false, true ); return frame; } + double CvCaptureCAM_MSMF::getProperty( int property_id ) { // image format proprrties From ccb8292e8ec245944d71fe38c32a70ecb0428feb Mon Sep 17 00:00:00 2001 From: Alexander Smorkalov Date: Mon, 6 May 2013 03:36:51 -0700 Subject: [PATCH 16/75] Media Foundation-based VideoWriter added --- modules/highgui/src/cap.cpp | 12 + modules/highgui/src/cap_ffmpeg.cpp | 4 - modules/highgui/src/cap_msmf.cpp | 356 ++++++++++++++++++++++++++++- modules/highgui/src/precomp.hpp | 2 + 4 files changed, 367 insertions(+), 7 deletions(-) diff --git a/modules/highgui/src/cap.cpp b/modules/highgui/src/cap.cpp index 3750d1d66..afab8d4b5 100644 --- a/modules/highgui/src/cap.cpp +++ b/modules/highgui/src/cap.cpp @@ -408,8 +408,20 @@ CV_IMPL CvVideoWriter* cvCreateVideoWriter( const char* filename, int fourcc, if(!fourcc || !fps) result = cvCreateVideoWriter_Images(filename); +#ifdef HAVE_FFMPEG if(!result) result = cvCreateVideoWriter_FFMPEG_proxy (filename, fourcc, fps, frameSize, is_color); +#endif + +#ifdef HAVE_VFW + if(!result) + return cvCreateVideoWriter_VFW(filename, fourcc, fps, frameSize, isColor); +#endif + +#ifdef HAVE_MSMF + if (!result) + result = cvCreateVideoWriter_MSMF(filename, fourcc, fps, frameSize, is_color); +#endif /* #ifdef HAVE_XINE if(!result) diff --git a/modules/highgui/src/cap_ffmpeg.cpp b/modules/highgui/src/cap_ffmpeg.cpp index 669ebda12..7d4d6af38 100644 --- a/modules/highgui/src/cap_ffmpeg.cpp +++ b/modules/highgui/src/cap_ffmpeg.cpp @@ -263,9 +263,5 @@ CvVideoWriter* cvCreateVideoWriter_FFMPEG_proxy( const char* filename, int fourc if( result->open( filename, fourcc, fps, frameSize, isColor != 0 )) return result; delete result; -#ifdef HAVE_VFW - return cvCreateVideoWriter_VFW(filename, fourcc, fps, frameSize, isColor); - #else return 0; -#endif } diff --git a/modules/highgui/src/cap_msmf.cpp b/modules/highgui/src/cap_msmf.cpp index 28d92c361..1d6bb597b 100644 --- a/modules/highgui/src/cap_msmf.cpp +++ b/modules/highgui/src/cap_msmf.cpp @@ -54,6 +54,8 @@ #include #include #include "Strsafe.h" +#include +#include #include #include #include @@ -67,6 +69,7 @@ #pragma comment(lib, "mf") #pragma comment(lib, "mfuuid") #pragma comment(lib, "Strmiids") +#pragma comment(lib, "Mfreadwrite") #pragma comment(lib, "MinCore_Downlevel") struct IMFMediaType; @@ -146,7 +149,6 @@ private: DWORD WINAPI MainThreadFunction( LPVOID lpParam ); typedef void(*emergensyStopEventCallback)(int, void *); -typedef unsigned char BYTE; class RawImage { @@ -1031,7 +1033,7 @@ done: SafeRelease(&pEvent); SafeRelease(&ig_pSession); SafeRelease(&ig_pTopology); - + return hr; } @@ -1080,7 +1082,7 @@ done: SafeRelease(&pPD); SafeRelease(&pSD); SafeRelease(&pHandler); - + return hr; } @@ -2939,4 +2941,352 @@ CvCapture* cvCreateCameraCapture_MSMF( int index ) delete capture; return 0; } + + +// +// +// Media Foundation-based Video Writer +// +// + +using namespace Microsoft::WRL; + +class CvVideoWriter_MSMF : public CvVideoWriter +{ +public: + 
CvVideoWriter_MSMF(); + virtual ~CvVideoWriter_MSMF(); + virtual bool open( const char* filename, int fourcc, + double fps, CvSize frameSize, bool isColor ); + virtual void close(); + virtual bool writeFrame( const IplImage* img); + +private: + UINT32 videoWidth; + UINT32 videoHeight; + double fps; + UINT32 bitRate; + UINT32 frameSize; + GUID encodingFormat; + GUID inputFormat; + + DWORD streamIndex; + ComPtr sinkWriter; + + bool initiated; + + LONGLONG rtStart; + UINT64 rtDuration; + + HRESULT InitializeSinkWriter(const char* filename); + HRESULT WriteFrame(DWORD *videoFrameBuffer, const LONGLONG& rtStart, const LONGLONG& rtDuration); +}; + +CvVideoWriter_MSMF::CvVideoWriter_MSMF() +{ +} + +CvVideoWriter_MSMF::~CvVideoWriter_MSMF() +{ + close(); +} + +bool CvVideoWriter_MSMF::open( const char* filename, int fourcc, + double _fps, CvSize frameSize, bool isColor ) +{ + videoWidth = frameSize.width; + videoHeight = frameSize.height; + fps = _fps; + bitRate = 4*800000; + encodingFormat = MFVideoFormat_WMV3; + inputFormat = MFVideoFormat_RGB32; + + HRESULT hr = CoInitializeEx(NULL, COINIT_APARTMENTTHREADED); + if (SUCCEEDED(hr)) + { + printf("CoInitializeEx is successfull\n"); + hr = MFStartup(MF_VERSION); + if (SUCCEEDED(hr)) + { + printf("MFStartup is successfull\n"); + hr = InitializeSinkWriter(filename); + if (SUCCEEDED(hr)) + { + printf("InitializeSinkWriter is successfull\n"); + initiated = true; + rtStart = 0; + MFFrameRateToAverageTimePerFrame(fps, 1, &rtDuration); + printf("duration: %d\n", rtDuration); + } + } + } + + return SUCCEEDED(hr); +} + +void CvVideoWriter_MSMF::close() +{ + printf("VideoWriter::close()\n"); + if (!initiated) + { + printf("VideoWriter was not Initialized\n"); + return; + } + + initiated = false; + HRESULT hr = sinkWriter->Finalize(); + printf("sinkWriter Finalize status %u\n", hr); + MFShutdown(); +} + +bool CvVideoWriter_MSMF::writeFrame(const IplImage* img) +{ + if (!img) + return false; + + printf("Writing not empty IplImage\n"); + + auto length = img->width * img->height * 4; + printf("Image: %dx%d, %d\n", img->width, img->height, length); + DWORD* target = new DWORD[length]; + + printf("Before for loop\n"); + for (int rowIdx = 0; rowIdx < img->height; rowIdx++) + { + char* rowStart = img->imageData + rowIdx*img->widthStep; + for (int colIdx = 0; colIdx < img->width; colIdx++) + { + BYTE b = rowStart[colIdx * img->nChannels + 0]; + BYTE g = rowStart[colIdx * img->nChannels + 1]; + BYTE r = rowStart[colIdx * img->nChannels + 2]; + + // On ARM devices data is stored starting from the last line + // (and not the first line) so you have to revert them on the Y axis +#if _M_ARM + auto row = index / videoWidth; + auto targetRow = videoHeight - row - 1; + auto column = index - (row * videoWidth); + target[(targetRow * videoWidth) + column] = (r << 16) + (g << 8) + b; +#else + target[rowIdx*img->width+colIdx] = (r << 16) + (g << 8) + b; +#endif + } + } + + // Send frame to the sink writer. 
+ printf("Before private WriteFrame call\n"); + HRESULT hr = WriteFrame(target, rtStart, rtDuration); + printf("After private WriteFrame call\n"); + if (FAILED(hr)) + { + printf("Private WriteFrame failed\n"); + delete[] target; + return false; + } + rtStart += rtDuration; + + printf("End of writing IplImage\n"); + + delete[] target; + + return true; +} + +HRESULT CvVideoWriter_MSMF::InitializeSinkWriter(const char* filename) +{ + ComPtr spAttr; + ComPtr mediaTypeOut; + ComPtr mediaTypeIn; + ComPtr spByteStream; + + MFCreateAttributes(&spAttr, 10); + spAttr->SetUINT32(MF_READWRITE_ENABLE_HARDWARE_TRANSFORMS, true); + + wchar_t* unicodeFileName = new wchar_t[strlen(filename)+1]; + MultiByteToWideChar(CP_ACP, 0, filename, -1, unicodeFileName, strlen(filename)+1); + + HRESULT hr = MFCreateSinkWriterFromURL(unicodeFileName, NULL, spAttr.Get(), &sinkWriter); + + delete[] unicodeFileName; + + // Set the output media type. + if (SUCCEEDED(hr)) + { + printf("MFCreateSinkWriterFromURL is successfull\n"); + hr = MFCreateMediaType(&mediaTypeOut); + } + if (SUCCEEDED(hr)) + { + hr = mediaTypeOut->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video); + } + if (SUCCEEDED(hr)) + { + hr = mediaTypeOut->SetGUID(MF_MT_SUBTYPE, encodingFormat); + } + if (SUCCEEDED(hr)) + { + hr = mediaTypeOut->SetUINT32(MF_MT_AVG_BITRATE, bitRate); + } + if (SUCCEEDED(hr)) + { + hr = mediaTypeOut->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive); + } + if (SUCCEEDED(hr)) + { + hr = MFSetAttributeSize(mediaTypeOut.Get(), MF_MT_FRAME_SIZE, videoWidth, videoHeight); + } + if (SUCCEEDED(hr)) + { + hr = MFSetAttributeRatio(mediaTypeOut.Get(), MF_MT_FRAME_RATE, fps, 1); + } + if (SUCCEEDED(hr)) + { + hr = MFSetAttributeRatio(mediaTypeOut.Get(), MF_MT_PIXEL_ASPECT_RATIO, 1, 1); + } + if (SUCCEEDED(hr)) + { + hr = sinkWriter->AddStream(mediaTypeOut.Get(), &streamIndex); + } + + // Set the input media type. + if (SUCCEEDED(hr)) + { + hr = MFCreateMediaType(&mediaTypeIn); + } + if (SUCCEEDED(hr)) + { + hr = mediaTypeIn->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video); + } + if (SUCCEEDED(hr)) + { + hr = mediaTypeIn->SetGUID(MF_MT_SUBTYPE, inputFormat); + } + if (SUCCEEDED(hr)) + { + hr = mediaTypeIn->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive); + } + if (SUCCEEDED(hr)) + { + hr = MFSetAttributeSize(mediaTypeIn.Get(), MF_MT_FRAME_SIZE, videoWidth, videoHeight); + } + if (SUCCEEDED(hr)) + { + hr = MFSetAttributeRatio(mediaTypeIn.Get(), MF_MT_FRAME_RATE, fps, 1); + } + if (SUCCEEDED(hr)) + { + hr = MFSetAttributeRatio(mediaTypeIn.Get(), MF_MT_PIXEL_ASPECT_RATIO, 1, 1); + } + if (SUCCEEDED(hr)) + { + hr = sinkWriter->SetInputMediaType(streamIndex, mediaTypeIn.Get(), NULL); + } + + // Tell the sink writer to start accepting data. + if (SUCCEEDED(hr)) + { + hr = sinkWriter->BeginWriting(); + } + + return hr; +} + +HRESULT CvVideoWriter_MSMF::WriteFrame(DWORD *videoFrameBuffer, const LONGLONG& Start, const LONGLONG& Duration) +{ + printf("Private WriteFrame(%p, %llu, %llu)\n", videoFrameBuffer, Start, Duration); + IMFSample* sample; + IMFMediaBuffer* buffer; + + const LONG cbWidth = 4 * videoWidth; + const DWORD cbBuffer = cbWidth * videoHeight; + + BYTE *pData = NULL; + + // Create a new memory buffer. + HRESULT hr = MFCreateMemoryBuffer(cbBuffer, &buffer); + + // Lock the buffer and copy the video frame to the buffer. 
+ if (SUCCEEDED(hr)) + { + printf("MFCreateMemoryBuffer successfull\n"); + hr = buffer->Lock(&pData, NULL, NULL); + } + + if (SUCCEEDED(hr)) + { + printf("Before MFCopyImage(%p, %d, %p, %d, %d %d)\n", pData, cbWidth, videoFrameBuffer, cbWidth, cbWidth, videoHeight); + hr = MFCopyImage( + pData, // Destination buffer. + cbWidth, // Destination stride. + (BYTE*)videoFrameBuffer, // First row in source image. + cbWidth, // Source stride. + cbWidth, // Image width in bytes. + videoHeight // Image height in pixels. + ); + printf("After MFCopyImage()\n"); + } + + printf("Before buffer.Get()\n"); + if (buffer) + { + printf("Before buffer->Unlock\n"); + buffer->Unlock(); + printf("After buffer->Unlock\n"); + } + + // Set the data length of the buffer. + if (SUCCEEDED(hr)) + { + printf("MFCopyImage successfull\n"); + hr = buffer->SetCurrentLength(cbBuffer); + } + + // Create a media sample and add the buffer to the sample. + if (SUCCEEDED(hr)) + { + hr = MFCreateSample(&sample); + } + if (SUCCEEDED(hr)) + { + hr = sample->AddBuffer(buffer); + } + + // Set the time stamp and the duration. + if (SUCCEEDED(hr)) + { + printf("Sample time: %d\n", Start); + hr = sample->SetSampleTime(Start); + } + if (SUCCEEDED(hr)) + { + printf("Duration: %d\n", Duration); + hr = sample->SetSampleDuration(Duration); + } + + // Send the sample to the Sink Writer. + if (SUCCEEDED(hr)) + { + printf("Setting writer params successfull\n"); + hr = sinkWriter->WriteSample(streamIndex, sample); + } + + printf("Private WriteFrame(%d, %p) end with status %u\n", streamIndex, sample, hr); + + SafeRelease(&sample); + SafeRelease(&buffer); + + return hr; +} + +CvVideoWriter* cvCreateVideoWriter_MSMF( const char* filename, int fourcc, + double fps, CvSize frameSize, int isColor ) +{ + printf("Creating Media Foundation VideoWriter\n"); + CvVideoWriter_MSMF* writer = new CvVideoWriter_MSMF; + if( writer->open( filename, fourcc, fps, frameSize, isColor != 0 )) + return writer; + delete writer; + return NULL; +} + #endif \ No newline at end of file diff --git a/modules/highgui/src/precomp.hpp b/modules/highgui/src/precomp.hpp index aa327d6d7..b9896955c 100644 --- a/modules/highgui/src/precomp.hpp +++ b/modules/highgui/src/precomp.hpp @@ -119,6 +119,8 @@ CvVideoWriter* cvCreateVideoWriter_VFW( const char* filename, int fourcc, double fps, CvSize frameSize, int is_color ); CvCapture* cvCreateCameraCapture_DShow( int index ); CvCapture* cvCreateCameraCapture_MSMF( int index ); +CvVideoWriter* cvCreateVideoWriter_MSMF( const char* filename, int fourcc, + double fps, CvSize frameSize, int is_color ); CvCapture* cvCreateCameraCapture_OpenNI( int index ); CvCapture* cvCreateFileCapture_OpenNI( const char* filename ); CvCapture* cvCreateCameraCapture_Android( int index ); From 22b0cfbaa215878809d4685d5bf14dce75ee401f Mon Sep 17 00:00:00 2001 From: Alexander Smorkalov Date: Mon, 6 May 2013 07:17:53 -0700 Subject: [PATCH 17/75] Media Foundation-based VideoWriter improvements. FourCC parameter handlig added; Smart pointers instead SafeRelease call; Windows RT support (vertical mirroring). 
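With FourCC handling in place, the writer is reached through the standard C API. A minimal sketch follows; the file name, codec and frame size are illustrative, and whether a given codec actually encodes depends on the Media Foundation transforms installed on the system:

    #include "opencv2/highgui/highgui_c.h"

    static void writeBlankClip(void)
    {
        // CV_FOURCC packs the four characters into an int, least significant byte first.
        CvVideoWriter* writer = cvCreateVideoWriter("out.wmv", CV_FOURCC('W','M','V','3'),
                                                    30.0, cvSize(640, 480), 1);
        if (!writer)
            return;
        IplImage* frame = cvCreateImage(cvSize(640, 480), IPL_DEPTH_8U, 3);
        cvZero(frame);
        for (int i = 0; i < 90; ++i)   // about three seconds at 30 fps
            cvWriteFrame(writer, frame);
        cvReleaseImage(&frame);
        cvReleaseVideoWriter(&writer);
    }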
--- modules/highgui/src/cap_msmf.cpp | 87 ++++++++++++++++++++++++-------- 1 file changed, 67 insertions(+), 20 deletions(-) diff --git a/modules/highgui/src/cap_msmf.cpp b/modules/highgui/src/cap_msmf.cpp index 1d6bb597b..d3c365f54 100644 --- a/modules/highgui/src/cap_msmf.cpp +++ b/modules/highgui/src/cap_msmf.cpp @@ -2979,6 +2979,7 @@ private: UINT64 rtDuration; HRESULT InitializeSinkWriter(const char* filename); + static const GUID FourCC2GUID(int fourcc); HRESULT WriteFrame(DWORD *videoFrameBuffer, const LONGLONG& rtStart, const LONGLONG& rtDuration); }; @@ -2991,14 +2992,65 @@ CvVideoWriter_MSMF::~CvVideoWriter_MSMF() close(); } +const GUID CvVideoWriter_MSMF::FourCC2GUID(int fourcc) +{ + switch(fourcc) + { + case 'dv25': + return MFVideoFormat_DV25; break; + case 'dv50': + return MFVideoFormat_DV50; break; + case 'dvc ': + return MFVideoFormat_DVC; break; + case 'dvh1': + return MFVideoFormat_DVH1; break; + case 'dvhd': + return MFVideoFormat_DVHD; break; + case 'dvsd': + return MFVideoFormat_DVSD; break; + case 'dvsl': + return MFVideoFormat_DVSL; break; + case 'H263': + return MFVideoFormat_H263; break; + case 'H264': + return MFVideoFormat_H264; break; + case 'M4S2': + return MFVideoFormat_M4S2; break; + case 'MJPG': + return MFVideoFormat_MJPG; break; + case 'MP43': + return MFVideoFormat_MP43; break; + case 'MP4S': + return MFVideoFormat_MP4S; break; + case 'MP4V': + return MFVideoFormat_MP4V; break; + case 'MPG1': + return MFVideoFormat_MPG1; break; + case 'MSS1': + return MFVideoFormat_MSS1; break; + case 'MSS2': + return MFVideoFormat_MSS2; break; + case 'WMV1': + return MFVideoFormat_WMV1; break; + case 'WMV2': + return MFVideoFormat_WMV2; break; + case 'WMV3': + return MFVideoFormat_WMV3; break; + case 'WVC1': + return MFVideoFormat_WVC1; break; + default: + return MFVideoFormat_H264; + } +} + bool CvVideoWriter_MSMF::open( const char* filename, int fourcc, double _fps, CvSize frameSize, bool isColor ) { videoWidth = frameSize.width; videoHeight = frameSize.height; fps = _fps; - bitRate = 4*800000; - encodingFormat = MFVideoFormat_WMV3; + bitRate = videoWidth*videoHeight; // 1-bit per pixel + encodingFormat = FourCC2GUID(fourcc); inputFormat = MFVideoFormat_RGB32; HRESULT hr = CoInitializeEx(NULL, COINIT_APARTMENTTHREADED); @@ -3015,7 +3067,7 @@ bool CvVideoWriter_MSMF::open( const char* filename, int fourcc, printf("InitializeSinkWriter is successfull\n"); initiated = true; rtStart = 0; - MFFrameRateToAverageTimePerFrame(fps, 1, &rtDuration); + MFFrameRateToAverageTimePerFrame((UINT32)fps, 1, &rtDuration); printf("duration: %d\n", rtDuration); } } @@ -3046,7 +3098,7 @@ bool CvVideoWriter_MSMF::writeFrame(const IplImage* img) printf("Writing not empty IplImage\n"); - auto length = img->width * img->height * 4; + int length = img->width * img->height * 4; printf("Image: %dx%d, %d\n", img->width, img->height, length); DWORD* target = new DWORD[length]; @@ -3060,13 +3112,11 @@ bool CvVideoWriter_MSMF::writeFrame(const IplImage* img) BYTE g = rowStart[colIdx * img->nChannels + 1]; BYTE r = rowStart[colIdx * img->nChannels + 2]; - // On ARM devices data is stored starting from the last line - // (and not the first line) so you have to revert them on the Y axis + // On ARM devices data is stored starting from the last line + // (and not the first line) so you have to revert them on the Y axis #if _M_ARM - auto row = index / videoWidth; - auto targetRow = videoHeight - row - 1; - auto column = index - (row * videoWidth); - target[(targetRow * videoWidth) + column] = (r << 16) + 
(g << 8) + b; + int targetRow = videoHeight - rowIdx - 1; + target[(targetRow * videoWidth) + colIdx] = (r << 16) + (g << 8) + b; #else target[rowIdx*img->width+colIdx] = (r << 16) + (g << 8) + b; #endif @@ -3137,7 +3187,7 @@ HRESULT CvVideoWriter_MSMF::InitializeSinkWriter(const char* filename) } if (SUCCEEDED(hr)) { - hr = MFSetAttributeRatio(mediaTypeOut.Get(), MF_MT_FRAME_RATE, fps, 1); + hr = MFSetAttributeRatio(mediaTypeOut.Get(), MF_MT_FRAME_RATE, (UINT32)fps, 1); } if (SUCCEEDED(hr)) { @@ -3171,7 +3221,7 @@ HRESULT CvVideoWriter_MSMF::InitializeSinkWriter(const char* filename) } if (SUCCEEDED(hr)) { - hr = MFSetAttributeRatio(mediaTypeIn.Get(), MF_MT_FRAME_RATE, fps, 1); + hr = MFSetAttributeRatio(mediaTypeIn.Get(), MF_MT_FRAME_RATE, (UINT32)fps, 1); } if (SUCCEEDED(hr)) { @@ -3194,8 +3244,8 @@ HRESULT CvVideoWriter_MSMF::InitializeSinkWriter(const char* filename) HRESULT CvVideoWriter_MSMF::WriteFrame(DWORD *videoFrameBuffer, const LONGLONG& Start, const LONGLONG& Duration) { printf("Private WriteFrame(%p, %llu, %llu)\n", videoFrameBuffer, Start, Duration); - IMFSample* sample; - IMFMediaBuffer* buffer; + ComPtr sample; + ComPtr buffer; const LONG cbWidth = 4 * videoWidth; const DWORD cbBuffer = cbWidth * videoHeight; @@ -3248,7 +3298,7 @@ HRESULT CvVideoWriter_MSMF::WriteFrame(DWORD *videoFrameBuffer, const LONGLONG& } if (SUCCEEDED(hr)) { - hr = sample->AddBuffer(buffer); + hr = sample->AddBuffer(buffer.Get()); } // Set the time stamp and the duration. @@ -3267,13 +3317,10 @@ HRESULT CvVideoWriter_MSMF::WriteFrame(DWORD *videoFrameBuffer, const LONGLONG& if (SUCCEEDED(hr)) { printf("Setting writer params successfull\n"); - hr = sinkWriter->WriteSample(streamIndex, sample); + hr = sinkWriter->WriteSample(streamIndex, sample.Get()); } - printf("Private WriteFrame(%d, %p) end with status %u\n", streamIndex, sample, hr); - - SafeRelease(&sample); - SafeRelease(&buffer); + printf("Private WriteFrame(%d, %p) end with status %u\n", streamIndex, sample.Get(), hr); return hr; } From 9fb762ccecbfc5768e44db7753047dde8f2b8383 Mon Sep 17 00:00:00 2001 From: Alexander Smorkalov Date: Tue, 14 May 2013 05:17:34 -0700 Subject: [PATCH 18/75] VideoCapture for video files implemented. Set and Get methods are not implemented; Camera based video capture is broken due to modifications. --- modules/highgui/src/cap.cpp | 16 +- modules/highgui/src/cap_ffmpeg.cpp | 4 - modules/highgui/src/cap_msmf.cpp | 525 +++++++++++++++++++++++++++-- modules/highgui/src/precomp.hpp | 1 + 4 files changed, 521 insertions(+), 25 deletions(-) diff --git a/modules/highgui/src/cap.cpp b/modules/highgui/src/cap.cpp index afab8d4b5..8db873102 100644 --- a/modules/highgui/src/cap.cpp +++ b/modules/highgui/src/cap.cpp @@ -363,6 +363,20 @@ CV_IMPL CvCapture * cvCreateFileCapture (const char * filename) if (! result) result = cvCreateFileCapture_FFMPEG_proxy (filename); +#ifdef HAVE_MSMF + if (! result) + { + printf("Creating Media foundation based reader\n"); + result = cvCreateFileCapture_MSMF (filename); + printf("Construction result %p\n", result); + } +#endif + +#ifdef HAVE_VFW + if (! result) + result = cvCreateFileCapture_VFW (filename); +#endif + #ifdef HAVE_XINE if (! 
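The new code path is exercised through the ordinary file-capture API. A short sketch (the file name argument is a placeholder):

    #include "opencv2/highgui/highgui_c.h"

    static int countFrames(const char* path)
    {
        CvCapture* cap = cvCreateFileCapture(path); // may resolve to the MSMF reader on Windows
        if (!cap)
            return -1;
        int frames = 0;
        while (cvQueryFrame(cap))                   // grab + retrieve in a single call
            ++frames;
        cvReleaseCapture(&cap);
        return frames;
    }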
result) result = cvCreateFileCapture_XINE (filename); @@ -415,7 +429,7 @@ CV_IMPL CvVideoWriter* cvCreateVideoWriter( const char* filename, int fourcc, #ifdef HAVE_VFW if(!result) - return cvCreateVideoWriter_VFW(filename, fourcc, fps, frameSize, isColor); + return cvCreateVideoWriter_VFW(filename, fourcc, fps, frameSize, is_color); #endif #ifdef HAVE_MSMF diff --git a/modules/highgui/src/cap_ffmpeg.cpp b/modules/highgui/src/cap_ffmpeg.cpp index 7d4d6af38..bf73c0810 100644 --- a/modules/highgui/src/cap_ffmpeg.cpp +++ b/modules/highgui/src/cap_ffmpeg.cpp @@ -209,11 +209,7 @@ CvCapture* cvCreateFileCapture_FFMPEG_proxy(const char * filename) if( result->open( filename )) return result; delete result; -#ifdef HAVE_VFW - return cvCreateFileCapture_VFW(filename); -#else return 0; -#endif } class CvVideoWriter_FFMPEG_proxy : diff --git a/modules/highgui/src/cap_msmf.cpp b/modules/highgui/src/cap_msmf.cpp index d3c365f54..c037d2f98 100644 --- a/modules/highgui/src/cap_msmf.cpp +++ b/modules/highgui/src/cap_msmf.cpp @@ -175,10 +175,17 @@ public: ~ImageGrabber(void); HRESULT initImageGrabber(IMFMediaSource *pSource, GUID VideoFormat); HRESULT startGrabbing(void); + void pauseGrabbing(); + void resumeGrabbing(); void stopGrabbing(); RawImage *getRawImage(); // Function of creation of the instance of the class static HRESULT CreateInstance(ImageGrabber **ppIG,unsigned int deviceID); + + HANDLE ig_hFrameReady; + HANDLE ig_hFrameGrabbed; + HANDLE ig_hFinish; + private: bool ig_RIE; bool ig_Close; @@ -198,11 +205,7 @@ private: IMFPresentationDescriptor *pPD, IMFStreamDescriptor *pSD, IMFTopologyNode **ppNode); - HRESULT AddOutputNode( - IMFTopology *pTopology, - IMFActivate *pActivate, - DWORD dwId, - IMFTopologyNode **ppNode); + HRESULT AddOutputNode(IMFTopology *pTopology, IMFActivate *pActivate, DWORD dwId, IMFTopologyNode **ppNode); // IUnknown methods STDMETHODIMP QueryInterface(REFIID iid, void** ppv); STDMETHODIMP_(ULONG) AddRef(); @@ -851,20 +854,37 @@ FormatReader::~FormatReader(void) #define CHECK_HR(x) if (FAILED(x)) { printf("Checking failed !!!\n"); goto done; } -ImageGrabber::ImageGrabber(unsigned int deviceID): m_cRef(1), ig_DeviceID(deviceID), ig_pSource(NULL), ig_pSession(NULL), ig_pTopology(NULL), ig_RIE(true), ig_Close(false) -{ -} +ImageGrabber::ImageGrabber(unsigned int deviceID): + m_cRef(1), + ig_DeviceID(deviceID), + ig_pSource(NULL), + ig_pSession(NULL), + ig_pTopology(NULL), + ig_RIE(true), + ig_Close(false), + ig_hFrameReady(CreateEvent(NULL, FALSE, FALSE, "ig_hFrameReady")), + ig_hFrameGrabbed(CreateEvent(NULL, FALSE, TRUE, "ig_hFrameGrabbed")), + ig_hFinish(CreateEvent(NULL, FALSE, FALSE, "ig_hFinish")) +{} ImageGrabber::~ImageGrabber(void) { + printf("ImageGrabber::~ImageGrabber()\n"); if (ig_pSession) { + printf("ig_pSession->Shutdown()"); ig_pSession->Shutdown(); } - //SafeRelease(&ig_pSession); - //SafeRelease(&ig_pTopology); + + CloseHandle(ig_hFrameReady); + CloseHandle(ig_hFrameGrabbed); + CloseHandle(ig_hFinish); + + SafeRelease(&ig_pSession); + SafeRelease(&ig_pTopology); DebugPrintOut *DPO = &DebugPrintOut::getInstance(); - DPO->printOut(L"IMAGEGRABBER VIDEODEVICE %i: Destroing instance of the ImageGrabber class \n", ig_DeviceID); + + DPO->printOut(L"IMAGEGRABBER VIDEODEVICE %i: Destroing instance of the ImageGrabber class\n", ig_DeviceID); } HRESULT ImageGrabber::initImageGrabber(IMFMediaSource *pSource, GUID VideoFormat) @@ -983,9 +1003,17 @@ HRESULT ImageGrabber::startGrabbing(void) PROPVARIANT var; PropVariantInit(&var); HRESULT hr = S_OK; - CHECK_HR(hr = 
ig_pSession->SetTopology(0, ig_pTopology)); - CHECK_HR(hr = ig_pSession->Start(&GUID_NULL, &var)); + hr = ig_pSession->SetTopology(0, ig_pTopology); + if (FAILED(hr)) + { + printf("Error: cannot set topology (status %u)\n", hr); + } DPO->printOut(L"IMAGEGRABBER VIDEODEVICE %i: Start Grabbing of the images\n", ig_DeviceID); + hr = ig_pSession->Start(&GUID_NULL, &var); + if (FAILED(hr)) + { + printf("Error: cannot start session (status %u)\n", hr); + } for(;;) { HRESULT hrStatus = S_OK; @@ -1027,6 +1055,9 @@ HRESULT ImageGrabber::startGrabbing(void) } SafeRelease(&pEvent); } + + SetEvent(ig_hFinish); + DPO->printOut(L"IMAGEGRABBER VIDEODEVICE %i: Finish startGrabbing \n", ig_DeviceID); done: @@ -1037,6 +1068,14 @@ done: return hr; } +void ImageGrabber::pauseGrabbing() +{ +} + +void ImageGrabber::resumeGrabbing() +{ +} + HRESULT ImageGrabber::CreateTopology(IMFMediaSource *pSource, IMFActivate *pSinkActivate, IMFTopology **ppTopo) { IMFTopology *pTopology = NULL; @@ -1229,6 +1268,8 @@ STDMETHODIMP ImageGrabber::OnProcessSample(REFGUID guidMajorMediaType, DWORD dwS (void)llSampleDuration; (void)dwSampleSize; + WaitForSingleObject(ig_hFrameGrabbed, INFINITE); + printf("ImageGrabber::OnProcessSample() -- begin\n"); if(ig_RIE) @@ -1245,11 +1286,14 @@ STDMETHODIMP ImageGrabber::OnProcessSample(REFGUID guidMajorMediaType, DWORD dwS printf("ImageGrabber::OnProcessSample() -- end\n"); + SetEvent(ig_hFrameReady); + return S_OK; } STDMETHODIMP ImageGrabber::OnShutdown() { + SetEvent(ig_hFrameGrabbed); return S_OK; } @@ -1279,7 +1323,7 @@ HRESULT ImageGrabberThread::CreateInstance(ImageGrabberThread **ppIGT, IMFMediaS return S_OK; } -ImageGrabberThread::ImageGrabberThread(IMFMediaSource *pSource, unsigned int deviceID): igt_Handle(NULL), igt_stop(false) +ImageGrabberThread::ImageGrabberThread(IMFMediaSource *pSource, unsigned int deviceID): igt_func(NULL), igt_Handle(NULL), igt_stop(false) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); HRESULT hr = ImageGrabber::CreateInstance(&igt_pImageGrabber, deviceID); @@ -1330,11 +1374,11 @@ void ImageGrabberThread::stop() void ImageGrabberThread::start() { igt_Handle = CreateThread( - NULL, // default security attributes - 0, // use default stack size - MainThreadFunction, // thread function name - this, // argument to thread function - 0, // use default creation flags + NULL, // default security attributes + 0, // use default stack size + MainThreadFunction, // thread function name + this, // argument to thread function + 0, // use default creation flags &igt_ThreadIdArray); // returns the thread identifier } @@ -1359,6 +1403,7 @@ void ImageGrabberThread::run() DPO->printOut(L"IMAGEGRABBERTHREAD VIDEODEVICE %i: Emergency Stop thread\n", igt_DeviceID); if(igt_func) { + printf("Calling Emergency stop even handler\n"); igt_func(igt_DeviceID, igt_userData); } } @@ -2701,11 +2746,14 @@ protected: IplImage* frame; videoInput VI; }; + struct SuppressVideoInputMessages { SuppressVideoInputMessages() { videoInput::setVerbose(true); } }; + static SuppressVideoInputMessages do_it; + CvCaptureCAM_MSMF::CvCaptureCAM_MSMF(): index(-1), width(-1), @@ -2713,7 +2761,7 @@ CvCaptureCAM_MSMF::CvCaptureCAM_MSMF(): fourcc(-1), widthSet(-1), heightSet(-1), - frame(0), + frame(NULL), VI(videoInput::getInstance()) { CoInitialize(0); @@ -2925,6 +2973,435 @@ bool CvCaptureCAM_MSMF::setProperty( int property_id, double value ) } return false; } + +class CvCaptureFile_MSMF : public CvCapture +{ +public: + CvCaptureFile_MSMF(); + virtual ~CvCaptureFile_MSMF(); + + virtual bool open( 
const char* filename ); + virtual void close(); + + virtual double getProperty(int); + virtual bool setProperty(int, double); + virtual bool grabFrame(); + virtual IplImage* retrieveFrame(int); + virtual int getCaptureDomain() { return CV_CAP_MSMF; } +protected: + ImageGrabberThread* grabberThread; + IMFMediaSource* videoFileSource; + std::vector captureFormats; + int captureFormatIndex; + IplImage* frame; + bool isOpened; + + long enumerateCaptureFormats(IMFMediaSource *pSource); + void processPixels(unsigned char * src, unsigned char * dst, unsigned int width, + unsigned int height, unsigned int bpp, bool bRGB, bool bFlip); +}; + +CvCaptureFile_MSMF::CvCaptureFile_MSMF(): + grabberThread(NULL), + videoFileSource(NULL), + captureFormatIndex(0), + frame(NULL), + isOpened(false) +{ + MFStartup(MF_VERSION); +} + +CvCaptureFile_MSMF::~CvCaptureFile_MSMF() +{ + MFShutdown(); +} + +bool CvCaptureFile_MSMF::open(const char* filename) +{ + if (!filename) + return false; + + wchar_t* unicodeFileName = new wchar_t[strlen(filename)+1]; + MultiByteToWideChar(CP_ACP, 0, filename, -1, unicodeFileName, strlen(filename)+1); + + HRESULT hr = S_OK; + + MF_OBJECT_TYPE ObjectType = MF_OBJECT_INVALID; + + IMFSourceResolver* pSourceResolver = NULL; + IUnknown* pUnkSource = NULL; + + hr = MFCreateSourceResolver(&pSourceResolver); + + if (SUCCEEDED(hr)) + { + hr = pSourceResolver->CreateObjectFromURL( + unicodeFileName, + MF_RESOLUTION_MEDIASOURCE, + NULL, // Optional property store. + &ObjectType, + &pUnkSource + ); + } + + // Get the IMFMediaSource from the IUnknown pointer. + if (SUCCEEDED(hr)) + { + hr = pUnkSource->QueryInterface(IID_PPV_ARGS(&videoFileSource)); + } + + SafeRelease(&pSourceResolver); + SafeRelease(&pUnkSource); + + enumerateCaptureFormats(videoFileSource); + + if (SUCCEEDED(hr)) + { + hr = ImageGrabberThread::CreateInstance(&grabberThread, videoFileSource, -2); + } + + if (SUCCEEDED(hr)) + { + grabberThread->start(); + } + + isOpened = true; + + return true; +} + +void CvCaptureFile_MSMF::close() +{ + if (grabberThread) + { + isOpened = false; + SetEvent(grabberThread->getImageGrabber()->ig_hFrameReady); + grabberThread->stop(); + delete grabberThread; + } + + if (videoFileSource) + { + HRESULT hr = videoFileSource->Shutdown(); + if (FAILED(hr)) + { + printf("VideoCapture Closing failed!\n"); + } + } +} + +bool CvCaptureFile_MSMF::setProperty(int property_id, double value) +{ + // image capture properties + bool handled = false; + int width, height, fourcc; + switch( property_id ) + { + case CV_CAP_PROP_FRAME_WIDTH: + // width = cvRound(value); + // handled = true; + break; + case CV_CAP_PROP_FRAME_HEIGHT: + // height = cvRound(value); + // handled = true; + break; + case CV_CAP_PROP_FOURCC: + fourcc = (int)(unsigned long)(value); + if ( fourcc == -1 ) { + // following cvCreateVideo usage will pop up caprturepindialog here if fourcc=-1 + // TODO - how to create a capture pin dialog + } + handled = true; + break; + case CV_CAP_PROP_FPS: + // FIXME: implement method in VideoInput back end + // int fps = cvRound(value); + // if (fps != VI.getFPS(index)) + // { + // VI.stopDevice(index); + // VI.setIdealFramerate(index,fps); + // if (widthSet > 0 && heightSet > 0) + // VI.setupDevice(index, widthSet, heightSet); + // else + // VI.setupDevice(index); + // } + // return VI.isDeviceSetup(index); + ; + } + if ( handled ) { + // a stream setting + if( width > 0 && height > 0 ) + { + if( width != captureFormats[captureFormatIndex].width || + height != captureFormats[captureFormatIndex].height )//|| 
fourcc != VI.getFourcc(index) ) + { + // FIXME: implement method in VideoInput back end + // int fps = static_cast(VI.getFPS(index)); + // VI.stopDevice(index); + // VI.setIdealFramerate(index, fps); + // VI.setupDeviceFourcc(index, width, height, fourcc); + } + if (isOpened) + { + // widthSet = width; + // heightSet = height; + // width = height = fourcc = -1; + } + return isOpened; + } + return true; + } + // show video/camera filter dialog + // FIXME: implement method in VideoInput back end + // if ( property_id == CV_CAP_PROP_SETTINGS ) { + // VI.showSettingsWindow(index); + // return true; + // } + //video Filter properties + switch( property_id ) + { + case CV_CAP_PROP_BRIGHTNESS: + case CV_CAP_PROP_CONTRAST: + case CV_CAP_PROP_HUE: + case CV_CAP_PROP_SATURATION: + case CV_CAP_PROP_SHARPNESS: + case CV_CAP_PROP_GAMMA: + case CV_CAP_PROP_MONOCROME: + case CV_CAP_PROP_WHITE_BALANCE_BLUE_U: + case CV_CAP_PROP_BACKLIGHT: + case CV_CAP_PROP_GAIN: + // FIXME: implement method in VideoInput back end + //return VI.setVideoSettingFilter(index,VI.getVideoPropertyFromCV(property_id),(long)value); + ; + } + //camera properties + switch( property_id ) + { + case CV_CAP_PROP_PAN: + case CV_CAP_PROP_TILT: + case CV_CAP_PROP_ROLL: + case CV_CAP_PROP_ZOOM: + case CV_CAP_PROP_EXPOSURE: + case CV_CAP_PROP_IRIS: + case CV_CAP_PROP_FOCUS: + // FIXME: implement method in VideoInput back end + //return VI.setVideoSettingCamera(index,VI.getCameraPropertyFromCV(property_id),(long)value); + ; + } + + return false; +} + +double CvCaptureFile_MSMF::getProperty(int property_id) +{ + // image format proprrties + switch( property_id ) + { + case CV_CAP_PROP_FRAME_WIDTH: + return captureFormats[captureFormatIndex].width; + case CV_CAP_PROP_FRAME_HEIGHT: + return captureFormats[captureFormatIndex].height; + case CV_CAP_PROP_FOURCC: + // FIXME: implement method in VideoInput back end + //return VI.getFourcc(index); + ; + case CV_CAP_PROP_FPS: + // FIXME: implement method in VideoInput back end + //return VI.getFPS(index); + ; + } + // video filter properties + switch( property_id ) + { + case CV_CAP_PROP_BRIGHTNESS: + case CV_CAP_PROP_CONTRAST: + case CV_CAP_PROP_HUE: + case CV_CAP_PROP_SATURATION: + case CV_CAP_PROP_SHARPNESS: + case CV_CAP_PROP_GAMMA: + case CV_CAP_PROP_MONOCROME: + case CV_CAP_PROP_WHITE_BALANCE_BLUE_U: + case CV_CAP_PROP_BACKLIGHT: + case CV_CAP_PROP_GAIN: + // FIXME: implement method in VideoInput back end + // if ( VI.getVideoSettingFilter(index, VI.getVideoPropertyFromCV(property_id), min_value, + // max_value, stepping_delta, current_value, flags,defaultValue) ) + // return (double)current_value; + return 0.; + } + // camera properties + switch( property_id ) + { + case CV_CAP_PROP_PAN: + case CV_CAP_PROP_TILT: + case CV_CAP_PROP_ROLL: + case CV_CAP_PROP_ZOOM: + case CV_CAP_PROP_EXPOSURE: + case CV_CAP_PROP_IRIS: + case CV_CAP_PROP_FOCUS: + // FIXME: implement method in VideoInput back end + // if (VI.getVideoSettingCamera(index,VI.getCameraPropertyFromCV(property_id),min_value, + // max_value,stepping_delta,current_value,flags,defaultValue) ) return (double)current_value; + return 0.; + } + // unknown parameter or value not available + return -1; +} + +bool CvCaptureFile_MSMF::grabFrame() +{ + DWORD waitResult; + if (isOpened) + { + HANDLE tmp[] = {grabberThread->getImageGrabber()->ig_hFrameReady, grabberThread->getImageGrabber()->ig_hFinish, 0}; + waitResult = WaitForMultipleObjects(2, tmp, FALSE, INFINITE); + SetEvent(grabberThread->getImageGrabber()->ig_hFrameGrabbed); + } + + return 
isOpened && (waitResult == WAIT_OBJECT_0); +} + +IplImage* CvCaptureFile_MSMF::retrieveFrame(int) +{ + unsigned int width = captureFormats[captureFormatIndex].width; + unsigned int height = captureFormats[captureFormatIndex].height; + unsigned int bytes = 3; + if( !frame || width != frame->width || height != frame->height ) + { + if (frame) + cvReleaseImage( &frame ); + frame = cvCreateImage( cvSize(width,height), 8, 3 ); + } + + RawImage *RIOut = grabberThread->getImageGrabber()->getRawImage(); + unsigned int size = bytes * width * height; + + if(RIOut && size == RIOut->getSize()) + { + processPixels(RIOut->getpPixels(), (unsigned char*)frame->imageData, width, height, bytes, true, false); + } + + return frame; +} + +void CvCaptureFile_MSMF::processPixels(unsigned char * src, unsigned char * dst, unsigned int width, + unsigned int height, unsigned int bpp, bool bRGB, bool bFlip) +{ + unsigned int widthInBytes = width * bpp; + unsigned int numBytes = widthInBytes * height; + int *dstInt, *srcInt; + if(!bRGB) + { + if(bFlip) + { + for(unsigned int y = 0; y < height; y++) + { + dstInt = (int *)(dst + (y * widthInBytes)); + srcInt = (int *)(src + ( (height -y -1) * widthInBytes)); + memcpy(dstInt, srcInt, widthInBytes); + } + } + else + { + memcpy(dst, src, numBytes); + } + } + else + { + if(bFlip) + { + unsigned int x = 0; + unsigned int y = (height - 1) * widthInBytes; + src += y; + for(unsigned int i = 0; i < numBytes; i+=3) + { + if(x >= width) + { + x = 0; + src -= widthInBytes*2; + } + *dst = *(src+2); + dst++; + *dst = *(src+1); + dst++; + *dst = *src; + dst++; + src+=3; + x++; + } + } + else + { + for(unsigned int i = 0; i < numBytes; i+=3) + { + *dst = *(src+2); + dst++; + *dst = *(src+1); + dst++; + *dst = *src; + dst++; + src+=3; + } + } + } +} + +long CvCaptureFile_MSMF::enumerateCaptureFormats(IMFMediaSource *pSource) +{ + IMFPresentationDescriptor *pPD = NULL; + IMFStreamDescriptor *pSD = NULL; + IMFMediaTypeHandler *pHandler = NULL; + IMFMediaType *pType = NULL; + HRESULT hr = pSource->CreatePresentationDescriptor(&pPD); + if (FAILED(hr)) + { + goto done; + } + + DWORD cnt; + + pPD->GetStreamDescriptorCount(&cnt); + + printf("Stream count: %d\n", cnt); + + BOOL fSelected; + hr = pPD->GetStreamDescriptorByIndex(0, &fSelected, &pSD); + if (FAILED(hr)) + { + goto done; + } + hr = pSD->GetMediaTypeHandler(&pHandler); + if (FAILED(hr)) + { + goto done; + } + DWORD cTypes = 0; + hr = pHandler->GetMediaTypeCount(&cTypes); + if (FAILED(hr)) + { + goto done; + } + for (DWORD i = 0; i < cTypes; i++) + { + hr = pHandler->GetMediaTypeByIndex(i, &pType); + if (FAILED(hr)) + { + goto done; + } + MediaType MT = FormatReader::Read(pType); + captureFormats.push_back(MT); + SafeRelease(&pType); + } + +done: + SafeRelease(&pPD); + SafeRelease(&pSD); + SafeRelease(&pHandler); + SafeRelease(&pType); + return hr; +} + + CvCapture* cvCreateCameraCapture_MSMF( int index ) { CvCaptureCAM_MSMF* capture = new CvCaptureCAM_MSMF; @@ -2942,6 +3419,14 @@ CvCapture* cvCreateCameraCapture_MSMF( int index ) return 0; } +CvCapture* cvCreateFileCapture_MSMF (const char* filename) +{ + CvCaptureFile_MSMF* capture = new CvCaptureFile_MSMF; + if( capture->open(filename) ) + return capture; + delete capture; + return 0; +} // // diff --git a/modules/highgui/src/precomp.hpp b/modules/highgui/src/precomp.hpp index b9896955c..dcd4afdc0 100644 --- a/modules/highgui/src/precomp.hpp +++ b/modules/highgui/src/precomp.hpp @@ -119,6 +119,7 @@ CvVideoWriter* cvCreateVideoWriter_VFW( const char* filename, int fourcc, double 
fps, CvSize frameSize, int is_color ); CvCapture* cvCreateCameraCapture_DShow( int index ); CvCapture* cvCreateCameraCapture_MSMF( int index ); +CvCapture* cvCreateFileCapture_MSMF (const char* filename); CvVideoWriter* cvCreateVideoWriter_MSMF( const char* filename, int fourcc, double fps, CvSize frameSize, int is_color ); CvCapture* cvCreateCameraCapture_OpenNI( int index ); From e94cc0b5ee4f1d4d4207ef73aeb65a59b41a0424 Mon Sep 17 00:00:00 2001 From: Alexander Smorkalov Date: Wed, 15 May 2013 05:12:25 -0700 Subject: [PATCH 19/75] Media Foundation camera capture fixed. Camera-based VideoCapture updated to fit changes in ImageGrabber from prev commit --- modules/highgui/src/cap_msmf.cpp | 68 ++++++++++++++++++++------------ 1 file changed, 43 insertions(+), 25 deletions(-) diff --git a/modules/highgui/src/cap_msmf.cpp b/modules/highgui/src/cap_msmf.cpp index c037d2f98..f5ea616bf 100644 --- a/modules/highgui/src/cap_msmf.cpp +++ b/modules/highgui/src/cap_msmf.cpp @@ -180,15 +180,16 @@ public: void stopGrabbing(); RawImage *getRawImage(); // Function of creation of the instance of the class - static HRESULT CreateInstance(ImageGrabber **ppIG,unsigned int deviceID); + static HRESULT CreateInstance(ImageGrabber **ppIG, unsigned int deviceID, bool synchronous = false); - HANDLE ig_hFrameReady; - HANDLE ig_hFrameGrabbed; - HANDLE ig_hFinish; + const HANDLE ig_hFrameReady; + const HANDLE ig_hFrameGrabbed; + const HANDLE ig_hFinish; private: bool ig_RIE; bool ig_Close; + bool ig_Synchronous; long m_cRef; unsigned int ig_DeviceID; IMFMediaSource *ig_pSource; @@ -197,7 +198,7 @@ private: RawImage *ig_RIFirst; RawImage *ig_RISecond; RawImage *ig_RIOut; - ImageGrabber(unsigned int deviceID); + ImageGrabber(unsigned int deviceID, bool synchronous); HRESULT CreateTopology(IMFMediaSource *pSource, IMFActivate *pSinkActivate, IMFTopology **ppTopo); HRESULT AddSourceNode( IMFTopology *pTopology, @@ -229,7 +230,7 @@ class ImageGrabberThread friend DWORD WINAPI MainThreadFunction( LPVOID lpParam ); public: ~ImageGrabberThread(void); - static HRESULT CreateInstance(ImageGrabberThread **ppIGT, IMFMediaSource *pSource, unsigned int deviceID); + static HRESULT CreateInstance(ImageGrabberThread **ppIGT, IMFMediaSource *pSource, unsigned int deviceID, bool synchronious = false); void start(); void stop(); void setEmergencyStopEvent(void *userData, void(*func)(int, void *)); @@ -237,7 +238,7 @@ public: protected: virtual void run(); private: - ImageGrabberThread(IMFMediaSource *pSource, unsigned int deviceID); + ImageGrabberThread(IMFMediaSource *pSource, unsigned int deviceID, bool synchronious); HANDLE igt_Handle; DWORD igt_ThreadIdArray; ImageGrabber *igt_pImageGrabber; @@ -854,7 +855,7 @@ FormatReader::~FormatReader(void) #define CHECK_HR(x) if (FAILED(x)) { printf("Checking failed !!!\n"); goto done; } -ImageGrabber::ImageGrabber(unsigned int deviceID): +ImageGrabber::ImageGrabber(unsigned int deviceID, bool synchronous): m_cRef(1), ig_DeviceID(deviceID), ig_pSource(NULL), @@ -862,9 +863,10 @@ ImageGrabber::ImageGrabber(unsigned int deviceID): ig_pTopology(NULL), ig_RIE(true), ig_Close(false), - ig_hFrameReady(CreateEvent(NULL, FALSE, FALSE, "ig_hFrameReady")), - ig_hFrameGrabbed(CreateEvent(NULL, FALSE, TRUE, "ig_hFrameGrabbed")), - ig_hFinish(CreateEvent(NULL, FALSE, FALSE, "ig_hFinish")) + ig_Synchronous(synchronous), + ig_hFrameReady(synchronous ? CreateEvent(NULL, FALSE, FALSE, "ig_hFrameReady"): 0), + ig_hFrameGrabbed(synchronous ? 
CreateEvent(NULL, FALSE, TRUE, "ig_hFrameGrabbed"): 0), + ig_hFinish(synchronous ? CreateEvent(NULL, FALSE, FALSE, "ig_hFinish") : 0) {} ImageGrabber::~ImageGrabber(void) @@ -876,9 +878,12 @@ ImageGrabber::~ImageGrabber(void) ig_pSession->Shutdown(); } - CloseHandle(ig_hFrameReady); - CloseHandle(ig_hFrameGrabbed); - CloseHandle(ig_hFinish); + if (ig_Synchronous) + { + CloseHandle(ig_hFrameReady); + CloseHandle(ig_hFrameGrabbed); + CloseHandle(ig_hFinish); + } SafeRelease(&ig_pSession); SafeRelease(&ig_pTopology); @@ -1056,7 +1061,10 @@ HRESULT ImageGrabber::startGrabbing(void) SafeRelease(&pEvent); } - SetEvent(ig_hFinish); + if (ig_Synchronous) + { + SetEvent(ig_hFinish); + } DPO->printOut(L"IMAGEGRABBER VIDEODEVICE %i: Finish startGrabbing \n", ig_DeviceID); @@ -1172,9 +1180,9 @@ done: return hr; } -HRESULT ImageGrabber::CreateInstance(ImageGrabber **ppIG, unsigned int deviceID) +HRESULT ImageGrabber::CreateInstance(ImageGrabber **ppIG, unsigned int deviceID, bool synchronious) { - *ppIG = new (std::nothrow) ImageGrabber(deviceID); + *ppIG = new (std::nothrow) ImageGrabber(deviceID, synchronious); if (ppIG == NULL) { return E_OUTOFMEMORY; @@ -1286,14 +1294,21 @@ STDMETHODIMP ImageGrabber::OnProcessSample(REFGUID guidMajorMediaType, DWORD dwS printf("ImageGrabber::OnProcessSample() -- end\n"); - SetEvent(ig_hFrameReady); + if (ig_Synchronous) + { + SetEvent(ig_hFrameReady); + } return S_OK; } STDMETHODIMP ImageGrabber::OnShutdown() { - SetEvent(ig_hFrameGrabbed); + if (ig_Synchronous) + { + SetEvent(ig_hFrameGrabbed); + } + return S_OK; } @@ -1309,10 +1324,10 @@ DWORD WINAPI MainThreadFunction( LPVOID lpParam ) return 0; } -HRESULT ImageGrabberThread::CreateInstance(ImageGrabberThread **ppIGT, IMFMediaSource *pSource, unsigned int deviceID) +HRESULT ImageGrabberThread::CreateInstance(ImageGrabberThread **ppIGT, IMFMediaSource *pSource, unsigned int deviceID, bool synchronious) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); - *ppIGT = new (std::nothrow) ImageGrabberThread(pSource, deviceID); + *ppIGT = new (std::nothrow) ImageGrabberThread(pSource, deviceID, synchronious); if (ppIGT == NULL) { DPO->printOut(L"IMAGEGRABBERTHREAD VIDEODEVICE %i: Memory cannot be allocated\n", deviceID); @@ -1323,10 +1338,13 @@ HRESULT ImageGrabberThread::CreateInstance(ImageGrabberThread **ppIGT, IMFMediaS return S_OK; } -ImageGrabberThread::ImageGrabberThread(IMFMediaSource *pSource, unsigned int deviceID): igt_func(NULL), igt_Handle(NULL), igt_stop(false) +ImageGrabberThread::ImageGrabberThread(IMFMediaSource *pSource, unsigned int deviceID, bool synchronious): + igt_func(NULL), + igt_Handle(NULL), + igt_stop(false) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); - HRESULT hr = ImageGrabber::CreateInstance(&igt_pImageGrabber, deviceID); + HRESULT hr = ImageGrabber::CreateInstance(&igt_pImageGrabber, deviceID, synchronious); igt_DeviceID = deviceID; if(SUCCEEDED(hr)) { @@ -3057,7 +3075,7 @@ bool CvCaptureFile_MSMF::open(const char* filename) if (SUCCEEDED(hr)) { - hr = ImageGrabberThread::CreateInstance(&grabberThread, videoFileSource, -2); + hr = ImageGrabberThread::CreateInstance(&grabberThread, videoFileSource, -2, true); } if (SUCCEEDED(hr)) @@ -3278,7 +3296,7 @@ IplImage* CvCaptureFile_MSMF::retrieveFrame(int) if(RIOut && size == RIOut->getSize()) { - processPixels(RIOut->getpPixels(), (unsigned char*)frame->imageData, width, height, bytes, true, false); + processPixels(RIOut->getpPixels(), (unsigned char*)frame->imageData, width, height, bytes, false, false); } return frame; From 
0c9d776083db4579404e5f1b5cb9b1bee7582881 Mon Sep 17 00:00:00 2001 From: Alexander Smorkalov Date: Wed, 15 May 2013 06:09:03 -0700 Subject: [PATCH 20/75] Media Foundation-based code refactoring. I* + SafeRelease -> ComPtr. --- modules/highgui/src/cap_msmf.cpp | 191 ++++++++++++++----------------- 1 file changed, 86 insertions(+), 105 deletions(-) diff --git a/modules/highgui/src/cap_msmf.cpp b/modules/highgui/src/cap_msmf.cpp index f5ea616bf..15f2bdcab 100644 --- a/modules/highgui/src/cap_msmf.cpp +++ b/modules/highgui/src/cap_msmf.cpp @@ -53,7 +53,7 @@ #include #include #include -#include "Strsafe.h" +#include #include #include #include @@ -72,6 +72,8 @@ #pragma comment(lib, "Mfreadwrite") #pragma comment(lib, "MinCore_Downlevel") +using namespace Microsoft::WRL; + struct IMFMediaType; struct IMFActivate; struct IMFMediaSource; @@ -894,12 +896,12 @@ ImageGrabber::~ImageGrabber(void) HRESULT ImageGrabber::initImageGrabber(IMFMediaSource *pSource, GUID VideoFormat) { - IMFActivate *pSinkActivate = NULL; - IMFMediaType *pType = NULL; - IMFPresentationDescriptor *pPD = NULL; - IMFStreamDescriptor *pSD = NULL; - IMFMediaTypeHandler *pHandler = NULL; - IMFMediaType *pCurrentType = NULL; + ComPtr pSinkActivate = NULL; + ComPtr pType = NULL; + ComPtr pPD = NULL; + ComPtr pSD = NULL; + ComPtr pHandler = NULL; + ComPtr pCurrentType = NULL; HRESULT hr = S_OK; MediaType MT; // Clean up. @@ -940,13 +942,9 @@ HRESULT ImageGrabber::initImageGrabber(IMFMediaSource *pSource, GUID VideoFormat printf("Error GetCurrentMediaType()\n"); goto err; } - MT = FormatReader::Read(pCurrentType); + MT = FormatReader::Read(pCurrentType.Get()); } err: - SafeRelease(&pPD); - SafeRelease(&pSD); - SafeRelease(&pHandler); - SafeRelease(&pCurrentType); unsigned int sizeRawImage = 0; if(VideoFormat == MFVideoFormat_RGB24) { @@ -966,17 +964,17 @@ err: // Configure the media type that the Sample Grabber will receive. // Setting the major and subtype is usually enough for the topology loader // to resolve the topology. - CHECK_HR(hr = MFCreateMediaType(&pType)); + CHECK_HR(hr = MFCreateMediaType(pType.GetAddressOf())); CHECK_HR(hr = pType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video)); CHECK_HR(hr = pType->SetGUID(MF_MT_SUBTYPE, VideoFormat)); // Create the sample grabber sink. - CHECK_HR(hr = MFCreateSampleGrabberSinkActivate(pType, this, &pSinkActivate)); + CHECK_HR(hr = MFCreateSampleGrabberSinkActivate(pType.Get(), this, pSinkActivate.GetAddressOf())); // To run as fast as possible, set this attribute (requires Windows 7): CHECK_HR(hr = pSinkActivate->SetUINT32(MF_SAMPLEGRABBERSINK_IGNORE_CLOCK, TRUE)); // Create the Media Session. CHECK_HR(hr = MFCreateMediaSession(NULL, &ig_pSession)); // Create the topology. - CHECK_HR(hr = CreateTopology(pSource, pSinkActivate, &ig_pTopology)); + CHECK_HR(hr = CreateTopology(pSource, pSinkActivate.Get(), &ig_pTopology)); done: // Clean up. 
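// The hunk above is representative of the whole "I* + SafeRelease -> ComPtr" commit:
// raw Media Foundation interface pointers that had to be released manually on every
// exit path are wrapped in Microsoft::WRL::ComPtr, whose destructor calls Release().
// A minimal, self-contained sketch of the two styles follows; the helper functions
// createVideoTypeRaw/createVideoTypeComPtr are hypothetical illustrations, not code
// taken from the patch (assumes <mfapi.h>, <mfidl.h>, <wrl/client.h> and the usual
// mfplat/mfuuid link libraries).

#include <mfapi.h>
#include <mfidl.h>
#include <wrl/client.h>

template <class T> static void SafeReleaseSketch(T **ppT)   // mirrors the SafeRelease helper used in cap_msmf.cpp
{
    if (*ppT) { (*ppT)->Release(); *ppT = NULL; }
}

// Before: every failure path must remember to release the interface.
static HRESULT createVideoTypeRaw(IMFMediaType **ppType)
{
    IMFMediaType *pType = NULL;
    HRESULT hr = MFCreateMediaType(&pType);                 // pType arrives holding one reference
    if (SUCCEEDED(hr))
        hr = pType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
    if (SUCCEEDED(hr))
    {
        *ppType = pType;                                    // hand that reference to the caller
        return hr;
    }
    SafeReleaseSketch(&pType);                              // easy to forget when new early returns appear
    return hr;
}

// After: ComPtr owns the reference; GetAddressOf() fills out-parameters, Get() borrows
// the raw pointer, Detach() transfers ownership, and the destructor releases whatever
// is still held.
static HRESULT createVideoTypeComPtr(IMFMediaType **ppType)
{
    Microsoft::WRL::ComPtr<IMFMediaType> pType;
    HRESULT hr = MFCreateMediaType(pType.GetAddressOf());
    if (SUCCEEDED(hr))
        hr = pType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
    if (SUCCEEDED(hr))
        *ppType = pType.Detach();                           // caller now owns the reference
    return hr;                                              // otherwise Release() runs automatically here
}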
if (FAILED(hr)) @@ -988,8 +986,7 @@ done: SafeRelease(&ig_pSession); SafeRelease(&ig_pTopology); } - SafeRelease(&pSinkActivate); - SafeRelease(&pType); + return hr; } @@ -1004,7 +1001,7 @@ void ImageGrabber::stopGrabbing() HRESULT ImageGrabber::startGrabbing(void) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); - IMFMediaEvent *pEvent = NULL; + ComPtr pEvent = NULL; PROPVARIANT var; PropVariantInit(&var); HRESULT hr = S_OK; @@ -1058,7 +1055,6 @@ HRESULT ImageGrabber::startGrabbing(void) DPO->printOut(L"IMAGEGRABBER VIDEODEVICE %i: MEVideoCaptureDeviceRemoved \n", ig_DeviceID); break; } - SafeRelease(&pEvent); } if (ig_Synchronous) @@ -1069,7 +1065,6 @@ HRESULT ImageGrabber::startGrabbing(void) DPO->printOut(L"IMAGEGRABBER VIDEODEVICE %i: Finish startGrabbing \n", ig_DeviceID); done: - SafeRelease(&pEvent); SafeRelease(&ig_pSession); SafeRelease(&ig_pTopology); @@ -1086,16 +1081,16 @@ void ImageGrabber::resumeGrabbing() HRESULT ImageGrabber::CreateTopology(IMFMediaSource *pSource, IMFActivate *pSinkActivate, IMFTopology **ppTopo) { - IMFTopology *pTopology = NULL; - IMFPresentationDescriptor *pPD = NULL; - IMFStreamDescriptor *pSD = NULL; - IMFMediaTypeHandler *pHandler = NULL; - IMFTopologyNode *pNode1 = NULL; - IMFTopologyNode *pNode2 = NULL; + ComPtr pTopology = NULL; + ComPtr pPD = NULL; + ComPtr pSD = NULL; + ComPtr pHandler = NULL; + ComPtr pNode1 = NULL; + ComPtr pNode2 = NULL; HRESULT hr = S_OK; DWORD cStreams = 0; - CHECK_HR(hr = MFCreateTopology(&pTopology)); - CHECK_HR(hr = pSource->CreatePresentationDescriptor(&pPD)); + CHECK_HR(hr = MFCreateTopology(pTopology.GetAddressOf())); + CHECK_HR(hr = pSource->CreatePresentationDescriptor(pPD.GetAddressOf())); CHECK_HR(hr = pPD->GetStreamDescriptorCount(&cStreams)); for (DWORD i = 0; i < cStreams; i++) { @@ -1107,29 +1102,20 @@ HRESULT ImageGrabber::CreateTopology(IMFMediaSource *pSource, IMFActivate *pSink CHECK_HR(hr = pHandler->GetMajorType(&majorType)); if (majorType == MFMediaType_Video && fSelected) { - CHECK_HR(hr = AddSourceNode(pTopology, pSource, pPD, pSD, &pNode1)); - CHECK_HR(hr = AddOutputNode(pTopology, pSinkActivate, 0, &pNode2)); - CHECK_HR(hr = pNode1->ConnectOutput(0, pNode2, 0)); + CHECK_HR(hr = AddSourceNode(pTopology.Get(), pSource, pPD.Get(), pSD.Get(), pNode1.GetAddressOf())); + CHECK_HR(hr = AddOutputNode(pTopology.Get(), pSinkActivate, 0, pNode2.GetAddressOf())); + CHECK_HR(hr = pNode1->ConnectOutput(0, pNode2.Get(), 0)); break; } else { CHECK_HR(hr = pPD->DeselectStream(i)); } - SafeRelease(&pSD); - SafeRelease(&pHandler); } - *ppTopo = pTopology; + *ppTopo = pTopology.Get(); (*ppTopo)->AddRef(); done: - SafeRelease(&pTopology); - SafeRelease(&pNode1); - SafeRelease(&pNode2); - SafeRelease(&pPD); - SafeRelease(&pSD); - SafeRelease(&pHandler); - return hr; } @@ -1140,20 +1126,18 @@ HRESULT ImageGrabber::AddSourceNode( IMFStreamDescriptor *pSD, // Stream descriptor. IMFTopologyNode **ppNode) // Receives the node pointer. { - IMFTopologyNode *pNode = NULL; + ComPtr pNode = NULL; HRESULT hr = S_OK; - CHECK_HR(hr = MFCreateTopologyNode(MF_TOPOLOGY_SOURCESTREAM_NODE, &pNode)); + CHECK_HR(hr = MFCreateTopologyNode(MF_TOPOLOGY_SOURCESTREAM_NODE, pNode.GetAddressOf())); CHECK_HR(hr = pNode->SetUnknown(MF_TOPONODE_SOURCE, pSource)); CHECK_HR(hr = pNode->SetUnknown(MF_TOPONODE_PRESENTATION_DESCRIPTOR, pPD)); CHECK_HR(hr = pNode->SetUnknown(MF_TOPONODE_STREAM_DESCRIPTOR, pSD)); - CHECK_HR(hr = pTopology->AddNode(pNode)); + CHECK_HR(hr = pTopology->AddNode(pNode.Get())); // Return the pointer to the caller. 
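// Note on the out-parameter hand-off changed just below (a hedged reading, not an
// authoritative statement of the patch's intent): the local topology node is now a
// ComPtr that releases its reference when AddSourceNode()/AddOutputNode() return, so
// the pointer returned through ppNode must carry a reference of its own. The patch writes
//
//     *ppNode = pNode.Get();     // borrow the raw pointer from the ComPtr...
//     (*ppNode)->AddRef();       // ...and add the reference the caller will own
//
// which has the same net effect as the single call *ppNode = pNode.Detach(); either
// way the trailing SafeRelease(&pNode) cleanup becomes unnecessary and is dropped.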
- *ppNode = pNode; + *ppNode = pNode.Get(); (*ppNode)->AddRef(); done: - SafeRelease(&pNode); - return hr; } @@ -1163,20 +1147,18 @@ HRESULT ImageGrabber::AddOutputNode( DWORD dwId, // Identifier of the stream sink. IMFTopologyNode **ppNode) // Receives the node pointer. { - IMFTopologyNode *pNode = NULL; + ComPtr pNode = NULL; HRESULT hr = S_OK; - CHECK_HR(hr = MFCreateTopologyNode(MF_TOPOLOGY_OUTPUT_NODE, &pNode)); + CHECK_HR(hr = MFCreateTopologyNode(MF_TOPOLOGY_OUTPUT_NODE, pNode.GetAddressOf())); CHECK_HR(hr = pNode->SetObject(pActivate)); CHECK_HR(hr = pNode->SetUINT32(MF_TOPONODE_STREAMID, dwId)); CHECK_HR(hr = pNode->SetUINT32(MF_TOPONODE_NOSHUTDOWN_ON_REMOVE, FALSE)); - CHECK_HR(hr = pTopology->AddNode(pNode)); + CHECK_HR(hr = pTopology->AddNode(pNode.Get())); // Return the pointer to the caller. - *ppNode = pNode; + *ppNode = pNode.Get(); (*ppNode)->AddRef(); done: - SafeRelease(&pNode); - return hr; } @@ -1457,9 +1439,9 @@ Media_Foundation::~Media_Foundation(void) bool Media_Foundation::buildListOfDevices() { HRESULT hr = S_OK; - IMFAttributes *pAttributes = NULL; + ComPtr pAttributes = NULL; CoInitialize(NULL); - hr = MFCreateAttributes(&pAttributes, 1); + hr = MFCreateAttributes(pAttributes.GetAddressOf(), 1); if (SUCCEEDED(hr)) { hr = pAttributes->SetGUID( @@ -1470,14 +1452,14 @@ bool Media_Foundation::buildListOfDevices() if (SUCCEEDED(hr)) { videoDevices *vDs = &videoDevices::getInstance(); - hr = vDs->initDevices(pAttributes); + hr = vDs->initDevices(pAttributes.Get()); } else { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); DPO->printOut(L"MEDIA FOUNDATION: The access to the video cameras denied\n"); } - SafeRelease(&pAttributes); + return (SUCCEEDED(hr)); } @@ -1721,14 +1703,15 @@ long videoDevice::checkDevice(IMFAttributes *pAttributes, IMFActivate **pDevice) } return hr; } + long videoDevice::initDevice() { HRESULT hr = -1; - IMFAttributes *pAttributes = NULL; - IMFActivate * vd_pActivate= NULL; + ComPtr pAttributes = NULL; + IMFActivate *vd_pActivate = NULL; DebugPrintOut *DPO = &DebugPrintOut::getInstance(); CoInitialize(NULL); - hr = MFCreateAttributes(&pAttributes, 1); + hr = MFCreateAttributes(pAttributes.GetAddressOf(), 1); if (SUCCEEDED(hr)) { hr = pAttributes->SetGUID( @@ -1738,7 +1721,7 @@ long videoDevice::initDevice() } if (SUCCEEDED(hr)) { - hr = checkDevice(pAttributes, &vd_pActivate); + hr = checkDevice(pAttributes.Get(), &vd_pActivate); if (SUCCEEDED(hr) && vd_pActivate) { SafeRelease(&vd_pSource); @@ -1760,9 +1743,10 @@ long videoDevice::initDevice() { DPO->printOut(L"VIDEODEVICE %i: The attribute of video cameras cannot be getting \n", vd_CurrentNumber); } - SafeRelease(&pAttributes); + return hr; } + MediaType videoDevice::getFormat(unsigned int id) { if(id < vd_CurrentFormats.size()) @@ -1887,45 +1871,45 @@ void videoDevice::buildLibraryofTypes() count++; } } + long videoDevice::setDeviceFormat(IMFMediaSource *pSource, unsigned long dwFormatIndex) { - IMFPresentationDescriptor *pPD = NULL; - IMFStreamDescriptor *pSD = NULL; - IMFMediaTypeHandler *pHandler = NULL; - IMFMediaType *pType = NULL; - HRESULT hr = pSource->CreatePresentationDescriptor(&pPD); + ComPtr pPD = NULL; + ComPtr pSD = NULL; + ComPtr pHandler = NULL; + ComPtr pType = NULL; + HRESULT hr = pSource->CreatePresentationDescriptor(pPD.GetAddressOf()); if (FAILED(hr)) { goto done; } BOOL fSelected; - hr = pPD->GetStreamDescriptorByIndex(0, &fSelected, &pSD); + hr = pPD->GetStreamDescriptorByIndex(0, &fSelected, pSD.GetAddressOf()); if (FAILED(hr)) { goto done; } - hr = 
pSD->GetMediaTypeHandler(&pHandler); + hr = pSD->GetMediaTypeHandler(pHandler.GetAddressOf()); if (FAILED(hr)) { goto done; } - hr = pHandler->GetMediaTypeByIndex((DWORD)dwFormatIndex, &pType); + hr = pHandler->GetMediaTypeByIndex((DWORD)dwFormatIndex, pType.GetAddressOf()); if (FAILED(hr)) { goto done; } - hr = pHandler->SetCurrentMediaType(pType); + hr = pHandler->SetCurrentMediaType(pType.Get()); + done: - SafeRelease(&pPD); - SafeRelease(&pSD); - SafeRelease(&pHandler); - SafeRelease(&pType); return hr; } + bool videoDevice::isDeviceSetup() { return vd_IsSetuped; } + RawImage * videoDevice::getRawImageOut() { if(!vd_IsSetuped) return NULL; @@ -1938,6 +1922,7 @@ RawImage * videoDevice::getRawImageOut() } return NULL; } + bool videoDevice::isFrameNew() { if(!vd_IsSetuped) return false; @@ -1962,16 +1947,19 @@ bool videoDevice::isFrameNew() } return false; } + bool videoDevice::isDeviceMediaSource() { if(vd_LockOut == MediaSourceLock) return true; return false; } + bool videoDevice::isDeviceRawDataSource() { if(vd_LockOut == RawDataLock) return true; return false; } + bool videoDevice::setupDevice(unsigned int id) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); @@ -2002,15 +1990,18 @@ bool videoDevice::setupDevice(unsigned int id) return false; } } + bool videoDevice::setupDevice(unsigned int w, unsigned int h, unsigned int idealFramerate) { unsigned int id = findType(w * h, idealFramerate); return setupDevice(id); } + wchar_t *videoDevice::getName() { return vd_pFriendlyName; } + videoDevice::~videoDevice(void) { closeDevice(); @@ -2018,24 +2009,25 @@ videoDevice::~videoDevice(void) if(vd_pFriendlyName) CoTaskMemFree(vd_pFriendlyName); } + long videoDevice::enumerateCaptureFormats(IMFMediaSource *pSource) { - IMFPresentationDescriptor *pPD = NULL; - IMFStreamDescriptor *pSD = NULL; - IMFMediaTypeHandler *pHandler = NULL; - IMFMediaType *pType = NULL; - HRESULT hr = pSource->CreatePresentationDescriptor(&pPD); + ComPtr pPD = NULL; + ComPtr pSD = NULL; + ComPtr pHandler = NULL; + ComPtr pType = NULL; + HRESULT hr = pSource->CreatePresentationDescriptor(pPD.GetAddressOf()); if (FAILED(hr)) { goto done; } BOOL fSelected; - hr = pPD->GetStreamDescriptorByIndex(0, &fSelected, &pSD); + hr = pPD->GetStreamDescriptorByIndex(0, &fSelected, pSD.GetAddressOf()); if (FAILED(hr)) { goto done; } - hr = pSD->GetMediaTypeHandler(&pHandler); + hr = pSD->GetMediaTypeHandler(pHandler.GetAddressOf()); if (FAILED(hr)) { goto done; @@ -2048,20 +2040,16 @@ long videoDevice::enumerateCaptureFormats(IMFMediaSource *pSource) } for (DWORD i = 0; i < cTypes; i++) { - hr = pHandler->GetMediaTypeByIndex(i, &pType); + hr = pHandler->GetMediaTypeByIndex(i, pType.GetAddressOf()); if (FAILED(hr)) { goto done; } - MediaType MT = FormatReader::Read(pType); + MediaType MT = FormatReader::Read(pType.Get()); vd_CurrentFormats.push_back(MT); - SafeRelease(&pType); } + done: - SafeRelease(&pPD); - SafeRelease(&pSD); - SafeRelease(&pHandler); - SafeRelease(&pType); return hr; } @@ -3366,11 +3354,11 @@ void CvCaptureFile_MSMF::processPixels(unsigned char * src, unsigned char * dst, long CvCaptureFile_MSMF::enumerateCaptureFormats(IMFMediaSource *pSource) { - IMFPresentationDescriptor *pPD = NULL; - IMFStreamDescriptor *pSD = NULL; - IMFMediaTypeHandler *pHandler = NULL; - IMFMediaType *pType = NULL; - HRESULT hr = pSource->CreatePresentationDescriptor(&pPD); + ComPtr pPD = NULL; + ComPtr pSD = NULL; + ComPtr pHandler = NULL; + ComPtr pType = NULL; + HRESULT hr = pSource->CreatePresentationDescriptor(pPD.GetAddressOf()); if 
(FAILED(hr)) { goto done; @@ -3383,12 +3371,12 @@ long CvCaptureFile_MSMF::enumerateCaptureFormats(IMFMediaSource *pSource) printf("Stream count: %d\n", cnt); BOOL fSelected; - hr = pPD->GetStreamDescriptorByIndex(0, &fSelected, &pSD); + hr = pPD->GetStreamDescriptorByIndex(0, &fSelected, pSD.GetAddressOf()); if (FAILED(hr)) { goto done; } - hr = pSD->GetMediaTypeHandler(&pHandler); + hr = pSD->GetMediaTypeHandler(pHandler.GetAddressOf()); if (FAILED(hr)) { goto done; @@ -3401,21 +3389,16 @@ long CvCaptureFile_MSMF::enumerateCaptureFormats(IMFMediaSource *pSource) } for (DWORD i = 0; i < cTypes; i++) { - hr = pHandler->GetMediaTypeByIndex(i, &pType); + hr = pHandler->GetMediaTypeByIndex(i, pType.GetAddressOf()); if (FAILED(hr)) { goto done; } - MediaType MT = FormatReader::Read(pType); + MediaType MT = FormatReader::Read(pType.Get()); captureFormats.push_back(MT); - SafeRelease(&pType); } done: - SafeRelease(&pPD); - SafeRelease(&pSD); - SafeRelease(&pHandler); - SafeRelease(&pType); return hr; } @@ -3452,8 +3435,6 @@ CvCapture* cvCreateFileCapture_MSMF (const char* filename) // // -using namespace Microsoft::WRL; - class CvVideoWriter_MSMF : public CvVideoWriter { public: From 996f02a531318f5aa3004d876fb1b3f2af429e3b Mon Sep 17 00:00:00 2001 From: Alexander Smorkalov Date: Sat, 18 May 2013 13:04:31 -0700 Subject: [PATCH 21/75] Multiple Media Foundation video i/o fixes. Video i/o tests enabled for media foundation; Negative stride support added to VideoCapture; Error handling improved, dead lock in case of playback error fixed; Some code refacotring done. --- modules/highgui/src/cap_msmf.cpp | 178 ++++++++++--------------- modules/highgui/test/test_precomp.hpp | 6 +- modules/highgui/test/test_video_io.cpp | 38 +++++- 3 files changed, 112 insertions(+), 110 deletions(-) diff --git a/modules/highgui/src/cap_msmf.cpp b/modules/highgui/src/cap_msmf.cpp index 15f2bdcab..814fb75be 100644 --- a/modules/highgui/src/cap_msmf.cpp +++ b/modules/highgui/src/cap_msmf.cpp @@ -54,7 +54,6 @@ #include #include #include -#include #include #include #include @@ -72,6 +71,8 @@ #pragma comment(lib, "Mfreadwrite") #pragma comment(lib, "MinCore_Downlevel") +// for ComPtr usage +#include using namespace Microsoft::WRL; struct IMFMediaType; @@ -112,7 +113,7 @@ struct MediaType unsigned int width; unsigned int MF_MT_YUV_MATRIX; unsigned int MF_MT_VIDEO_LIGHTING; - unsigned int MF_MT_DEFAULT_STRIDE; + int MF_MT_DEFAULT_STRIDE; // stride is negative if image is bottom-up unsigned int MF_MT_VIDEO_CHROMA_SITING; GUID MF_MT_AM_FORMAT_TYPE; wchar_t *pMF_MT_AM_FORMAT_TYPEName; @@ -226,6 +227,7 @@ private: DWORD dwSampleSize); STDMETHODIMP OnShutdown(); }; + /// Class for controlling of thread of the grabbing raw data from video device class ImageGrabberThread { @@ -249,6 +251,7 @@ private: bool igt_stop; unsigned int igt_DeviceID; }; + // Structure for collecting info about one parametr of current video device struct Parametr { @@ -260,6 +263,7 @@ struct Parametr long Flag; Parametr(); }; + // Structure for collecting info about 17 parametrs of current video device struct CamParametrs { @@ -281,11 +285,13 @@ struct CamParametrs Parametr Iris; Parametr Focus; }; + typedef std::wstring String; typedef std::vector vectorNum; typedef std::map SUBTYPEMap; typedef std::map FrameRateMap; typedef void(*emergensyStopEventCallback)(int, void *); + /// Class for controlling of video device class videoDevice { @@ -329,7 +335,7 @@ private: IMFMediaSource *vd_pSource; emergensyStopEventCallback vd_func; void *vd_userData; - long 
enumerateCaptureFormats(IMFMediaSource *pSource); + HRESULT enumerateCaptureFormats(IMFMediaSource *pSource); long setDeviceFormat(IMFMediaSource *pSource, unsigned long dwFormatIndex); void buildLibraryofTypes(); int findType(unsigned int size, unsigned int frameRate = 0); @@ -337,6 +343,7 @@ private: long initDevice(); long checkDevice(IMFAttributes *pAttributes, IMFActivate **pDevice); }; + /// Class for managing of list of video devices class videoDevices { @@ -352,6 +359,7 @@ private: std::vector vds_Devices; videoDevices(void); }; + // Class for creating of Media Foundation context class Media_Foundation { @@ -362,6 +370,7 @@ public: private: Media_Foundation(void); }; + /// The only visiable class for controlling of video devices in format singelton class videoInput { @@ -411,23 +420,27 @@ public: bool isFrameNew(int deviceID); // Writing of Raw Data pixels from video device with deviceID with correction of RedAndBlue flipping flipRedAndBlue and vertical flipping flipImage bool getPixels(int deviceID, unsigned char * pixels, bool flipRedAndBlue = false, bool flipImage = false); + static void processPixels(unsigned char * src, unsigned char * dst, unsigned int width, unsigned int height, unsigned int bpp, bool bRGB, bool bFlip); private: bool accessToDevices; videoInput(void); - void processPixels(unsigned char * src, unsigned char * dst, unsigned int width, unsigned int height, unsigned int bpp, bool bRGB, bool bFlip); void updateListOfDevices(); }; + DebugPrintOut::DebugPrintOut(void):verbose(true) { } + DebugPrintOut::~DebugPrintOut(void) { } + DebugPrintOut& DebugPrintOut::getInstance() { static DebugPrintOut instance; return instance; } + void DebugPrintOut::printOut(const wchar_t *format, ...) { if(verbose) @@ -448,14 +461,17 @@ void DebugPrintOut::printOut(const wchar_t *format, ...) 
va_end (args); } } + void DebugPrintOut::setVerbose(bool state) { verbose = state; } + LPCWSTR GetGUIDNameConstNew(const GUID& guid); HRESULT GetGUIDNameNew(const GUID& guid, WCHAR **ppwsz); HRESULT LogAttributeValueByIndexNew(IMFAttributes *pAttr, DWORD index); HRESULT SpecialCaseAttributeValueNew(GUID guid, const PROPVARIANT& var, MediaType &out); + unsigned int *GetParametr(GUID guid, MediaType &out) { if(guid == MF_MT_YUV_MATRIX) @@ -463,7 +479,7 @@ unsigned int *GetParametr(GUID guid, MediaType &out) if(guid == MF_MT_VIDEO_LIGHTING) return &(out.MF_MT_VIDEO_LIGHTING); if(guid == MF_MT_DEFAULT_STRIDE) - return &(out.MF_MT_DEFAULT_STRIDE); + return (unsigned int*)&(out.MF_MT_DEFAULT_STRIDE); if(guid == MF_MT_VIDEO_CHROMA_SITING) return &(out.MF_MT_VIDEO_CHROMA_SITING); if(guid == MF_MT_VIDEO_NOMINAL_RANGE) @@ -480,6 +496,7 @@ unsigned int *GetParametr(GUID guid, MediaType &out) return &(out.MF_MT_INTERLACE_MODE); return NULL; } + HRESULT LogAttributeValueByIndexNew(IMFAttributes *pAttr, DWORD index, MediaType &out) { WCHAR *pGuidName = NULL; @@ -566,6 +583,7 @@ done: PropVariantClear(&var); return hr; } + HRESULT GetGUIDNameNew(const GUID& guid, WCHAR **ppwsz) { HRESULT hr = S_OK; @@ -625,6 +643,10 @@ HRESULT LogVideoAreaNew(const PROPVARIANT& var) } HRESULT SpecialCaseAttributeValueNew(GUID guid, const PROPVARIANT& var, MediaType &out) { + if (guid == MF_MT_DEFAULT_STRIDE) + { + out.MF_MT_DEFAULT_STRIDE = var.intVal; + } else if (guid == MF_MT_FRAME_SIZE) { UINT32 uHigh = 0, uLow = 0; @@ -1039,6 +1061,7 @@ HRESULT ImageGrabber::startGrabbing(void) hr = S_OK; goto done; } + printf("media foundation event: %d\n", met); if (met == MESessionEnded) { DPO->printOut(L"IMAGEGRABBER VIDEODEVICE %i: MESessionEnded \n", ig_DeviceID); @@ -1055,16 +1078,21 @@ HRESULT ImageGrabber::startGrabbing(void) DPO->printOut(L"IMAGEGRABBER VIDEODEVICE %i: MEVideoCaptureDeviceRemoved \n", ig_DeviceID); break; } + if ((met == MEError) || (met == MENonFatalError)) + { + pEvent->GetStatus(&hrStatus); + DPO->printOut(L"IMAGEGRABBER VIDEODEVICE %i: MEError | MENonFatalError: %u\n", ig_DeviceID, hrStatus); + break; + } } + DPO->printOut(L"IMAGEGRABBER VIDEODEVICE %i: Finish startGrabbing \n", ig_DeviceID); +done: if (ig_Synchronous) { SetEvent(ig_hFinish); } - DPO->printOut(L"IMAGEGRABBER VIDEODEVICE %i: Finish startGrabbing \n", ig_DeviceID); - -done: SafeRelease(&ig_pSession); SafeRelease(&ig_pTopology); @@ -2010,7 +2038,7 @@ videoDevice::~videoDevice(void) CoTaskMemFree(vd_pFriendlyName); } -long videoDevice::enumerateCaptureFormats(IMFMediaSource *pSource) +HRESULT videoDevice::enumerateCaptureFormats(IMFMediaSource *pSource) { ComPtr pPD = NULL; ComPtr pSD = NULL; @@ -3002,9 +3030,7 @@ protected: IplImage* frame; bool isOpened; - long enumerateCaptureFormats(IMFMediaSource *pSource); - void processPixels(unsigned char * src, unsigned char * dst, unsigned int width, - unsigned int height, unsigned int bpp, bool bRGB, bool bFlip); + HRESULT enumerateCaptureFormats(IMFMediaSource *pSource); }; CvCaptureFile_MSMF::CvCaptureFile_MSMF(): @@ -3034,10 +3060,10 @@ bool CvCaptureFile_MSMF::open(const char* filename) MF_OBJECT_TYPE ObjectType = MF_OBJECT_INVALID; - IMFSourceResolver* pSourceResolver = NULL; + ComPtr pSourceResolver = NULL; IUnknown* pUnkSource = NULL; - hr = MFCreateSourceResolver(&pSourceResolver); + hr = MFCreateSourceResolver(pSourceResolver.GetAddressOf()); if (SUCCEEDED(hr)) { @@ -3056,10 +3082,12 @@ bool CvCaptureFile_MSMF::open(const char* filename) hr = 
pUnkSource->QueryInterface(IID_PPV_ARGS(&videoFileSource)); } - SafeRelease(&pSourceResolver); SafeRelease(&pUnkSource); - enumerateCaptureFormats(videoFileSource); + if (SUCCEEDED(hr)) + { + hr = enumerateCaptureFormats(videoFileSource); + } if (SUCCEEDED(hr)) { @@ -3071,9 +3099,9 @@ bool CvCaptureFile_MSMF::open(const char* filename) grabberThread->start(); } - isOpened = true; + isOpened = SUCCEEDED(hr); - return true; + return isOpened; } void CvCaptureFile_MSMF::close() @@ -3207,6 +3235,8 @@ double CvCaptureFile_MSMF::getProperty(int property_id) return captureFormats[captureFormatIndex].width; case CV_CAP_PROP_FRAME_HEIGHT: return captureFormats[captureFormatIndex].height; + case CV_CAP_PROP_FRAME_COUNT: + return 30; case CV_CAP_PROP_FOURCC: // FIXME: implement method in VideoInput back end //return VI.getFourcc(index); @@ -3282,77 +3312,18 @@ IplImage* CvCaptureFile_MSMF::retrieveFrame(int) RawImage *RIOut = grabberThread->getImageGrabber()->getRawImage(); unsigned int size = bytes * width * height; + bool verticalFlip = captureFormats[captureFormatIndex].MF_MT_DEFAULT_STRIDE < 0; + if(RIOut && size == RIOut->getSize()) { - processPixels(RIOut->getpPixels(), (unsigned char*)frame->imageData, width, height, bytes, false, false); + videoInput::processPixels(RIOut->getpPixels(), (unsigned char*)frame->imageData, width, + height, bytes, false, verticalFlip); } return frame; } -void CvCaptureFile_MSMF::processPixels(unsigned char * src, unsigned char * dst, unsigned int width, - unsigned int height, unsigned int bpp, bool bRGB, bool bFlip) -{ - unsigned int widthInBytes = width * bpp; - unsigned int numBytes = widthInBytes * height; - int *dstInt, *srcInt; - if(!bRGB) - { - if(bFlip) - { - for(unsigned int y = 0; y < height; y++) - { - dstInt = (int *)(dst + (y * widthInBytes)); - srcInt = (int *)(src + ( (height -y -1) * widthInBytes)); - memcpy(dstInt, srcInt, widthInBytes); - } - } - else - { - memcpy(dst, src, numBytes); - } - } - else - { - if(bFlip) - { - unsigned int x = 0; - unsigned int y = (height - 1) * widthInBytes; - src += y; - for(unsigned int i = 0; i < numBytes; i+=3) - { - if(x >= width) - { - x = 0; - src -= widthInBytes*2; - } - *dst = *(src+2); - dst++; - *dst = *(src+1); - dst++; - *dst = *src; - dst++; - src+=3; - x++; - } - } - else - { - for(unsigned int i = 0; i < numBytes; i+=3) - { - *dst = *(src+2); - dst++; - *dst = *(src+1); - dst++; - *dst = *src; - dst++; - src+=3; - } - } - } -} - -long CvCaptureFile_MSMF::enumerateCaptureFormats(IMFMediaSource *pSource) +HRESULT CvCaptureFile_MSMF::enumerateCaptureFormats(IMFMediaSource *pSource) { ComPtr pPD = NULL; ComPtr pSD = NULL; @@ -3364,12 +3335,6 @@ long CvCaptureFile_MSMF::enumerateCaptureFormats(IMFMediaSource *pSource) goto done; } - DWORD cnt; - - pPD->GetStreamDescriptorCount(&cnt); - - printf("Stream count: %d\n", cnt); - BOOL fSelected; hr = pPD->GetStreamDescriptorByIndex(0, &fSelected, pSD.GetAddressOf()); if (FAILED(hr)) @@ -3423,10 +3388,21 @@ CvCapture* cvCreateCameraCapture_MSMF( int index ) CvCapture* cvCreateFileCapture_MSMF (const char* filename) { CvCaptureFile_MSMF* capture = new CvCaptureFile_MSMF; - if( capture->open(filename) ) - return capture; - delete capture; - return 0; + try + { + if( capture->open(filename) ) + return capture; + else + { + delete capture; + return NULL; + } + } + catch(...) 
+ { + delete capture; + throw; + } } // @@ -3440,10 +3416,10 @@ class CvVideoWriter_MSMF : public CvVideoWriter public: CvVideoWriter_MSMF(); virtual ~CvVideoWriter_MSMF(); - virtual bool open( const char* filename, int fourcc, - double fps, CvSize frameSize, bool isColor ); + virtual bool open(const char* filename, int fourcc, + double fps, CvSize frameSize, bool isColor); virtual void close(); - virtual bool writeFrame( const IplImage* img); + virtual bool writeFrame(const IplImage* img); private: UINT32 videoWidth; @@ -3533,7 +3509,7 @@ bool CvVideoWriter_MSMF::open( const char* filename, int fourcc, videoWidth = frameSize.width; videoHeight = frameSize.height; fps = _fps; - bitRate = videoWidth*videoHeight; // 1-bit per pixel + bitRate = fps*videoWidth*videoHeight; // 1-bit per pixel encodingFormat = FourCC2GUID(fourcc); inputFormat = MFVideoFormat_RGB32; @@ -3596,14 +3572,7 @@ bool CvVideoWriter_MSMF::writeFrame(const IplImage* img) BYTE g = rowStart[colIdx * img->nChannels + 1]; BYTE r = rowStart[colIdx * img->nChannels + 2]; - // On ARM devices data is stored starting from the last line - // (and not the first line) so you have to revert them on the Y axis -#if _M_ARM - int targetRow = videoHeight - rowIdx - 1; - target[(targetRow * videoWidth) + colIdx] = (r << 16) + (g << 8) + b; -#else target[rowIdx*img->width+colIdx] = (r << 16) + (g << 8) + b; -#endif } } @@ -3677,6 +3646,7 @@ HRESULT CvVideoWriter_MSMF::InitializeSinkWriter(const char* filename) { hr = MFSetAttributeRatio(mediaTypeOut.Get(), MF_MT_PIXEL_ASPECT_RATIO, 1, 1); } + if (SUCCEEDED(hr)) { hr = sinkWriter->AddStream(mediaTypeOut.Get(), &streamIndex); @@ -3711,7 +3681,7 @@ HRESULT CvVideoWriter_MSMF::InitializeSinkWriter(const char* filename) { hr = MFSetAttributeRatio(mediaTypeIn.Get(), MF_MT_PIXEL_ASPECT_RATIO, 1, 1); } - if (SUCCEEDED(hr)) + if (SUCCEEDED(hr)) { hr = sinkWriter->SetInputMediaType(streamIndex, mediaTypeIn.Get(), NULL); } diff --git a/modules/highgui/test/test_precomp.hpp b/modules/highgui/test/test_precomp.hpp index 0d0bd8022..be06c0643 100644 --- a/modules/highgui/test/test_precomp.hpp +++ b/modules/highgui/test/test_precomp.hpp @@ -47,7 +47,8 @@ defined(HAVE_QUICKTIME) || \ defined(HAVE_AVFOUNDATION) || \ /*defined(HAVE_OPENNI) || too specialized */ \ - defined(HAVE_FFMPEG) + defined(HAVE_FFMPEG) || \ + defined(HAVE_MSMF) # define BUILD_WITH_VIDEO_INPUT_SUPPORT 1 #else # define BUILD_WITH_VIDEO_INPUT_SUPPORT 0 @@ -57,7 +58,8 @@ defined(HAVE_GSTREAMER) || \ defined(HAVE_QUICKTIME) || \ defined(HAVE_AVFOUNDATION) || \ - defined(HAVE_FFMPEG) + defined(HAVE_FFMPEG) || \ + defined(HAVE_MSMF) # define BUILD_WITH_VIDEO_OUTPUT_SUPPORT 1 #else # define BUILD_WITH_VIDEO_OUTPUT_SUPPORT 0 diff --git a/modules/highgui/test/test_video_io.cpp b/modules/highgui/test/test_video_io.cpp index b0c2e53ba..34ec0bdd8 100644 --- a/modules/highgui/test/test_video_io.cpp +++ b/modules/highgui/test/test_video_io.cpp @@ -54,6 +54,33 @@ string fourccToString(int fourcc) return format("%c%c%c%c", fourcc & 255, (fourcc >> 8) & 255, (fourcc >> 16) & 255, (fourcc >> 24) & 255); } +#ifdef HAVE_MSMF +const VideoFormat g_specific_fmt_list[] = +{ +/* VideoFormat("avi", 'dv25'), + VideoFormat("avi", 'dv50'), + VideoFormat("avi", 'dvc '), + VideoFormat("avi", 'dvh1'), + VideoFormat("avi", 'dvhd'), + VideoFormat("avi", 'dvsd'), + VideoFormat("avi", 'dvsl'), + VideoFormat("avi", 'M4S2'), */ + VideoFormat("wmv", 'WMV3'), + // VideoFormat("avi", 'H264'), + // VideoFormat("avi", 'MJPG'), + // VideoFormat("avi", 'MP43'), + // 
VideoFormat("avi", 'MP4S'), + // VideoFormat("avi", 'MP4V'), +/* VideoFormat("avi", 'MPG1'), + VideoFormat("avi", 'MSS1'), + VideoFormat("avi", 'MSS2'), + VideoFormat("avi", 'WMV1'), + VideoFormat("avi", 'WMV2'), + VideoFormat("avi", 'WMV3'), + VideoFormat("avi", 'WVC1'), */ + VideoFormat() +}; +#else const VideoFormat g_specific_fmt_list[] = { VideoFormat("avi", CV_FOURCC('X', 'V', 'I', 'D')), @@ -63,17 +90,17 @@ const VideoFormat g_specific_fmt_list[] = VideoFormat("mkv", CV_FOURCC('X', 'V', 'I', 'D')), VideoFormat("mkv", CV_FOURCC('M', 'P', 'E', 'G')), VideoFormat("mkv", CV_FOURCC('M', 'J', 'P', 'G')), - VideoFormat("mov", CV_FOURCC('m', 'p', '4', 'v')), VideoFormat() }; +#endif } class CV_HighGuiTest : public cvtest::BaseTest { protected: - void ImageTest(const string& dir); + void ImageTest (const string& dir); void VideoTest (const string& dir, const cvtest::VideoFormat& fmt); void SpecificImageTest (const string& dir); void SpecificVideoTest (const string& dir, const cvtest::VideoFormat& fmt); @@ -291,8 +318,11 @@ void CV_HighGuiTest::VideoTest(const string& dir, const cvtest::VideoFormat& fmt if (psnr < thresDbell) { printf("Too low psnr = %gdb\n", psnr); - // imwrite("img.png", img); - // imwrite("img1.png", img1); + //imwrite("original.png", img); + //imwrite("after_test.png", img1); + //Mat diff; + //absdiff(img, img1, diff); + //imwrite("diff.png", diff); ts->set_failed_test_info(ts->FAIL_MISMATCH); break; } From bb9a0b725341d939ba08808e495b836d3ba41834 Mon Sep 17 00:00:00 2001 From: Alexander Shishkov Date: Mon, 24 Jun 2013 14:52:17 +0400 Subject: [PATCH 22/75] Update Info.plist.in --- platforms/ios/Info.plist.in | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/platforms/ios/Info.plist.in b/platforms/ios/Info.plist.in index 012de8856..6bcfe862d 100644 --- a/platforms/ios/Info.plist.in +++ b/platforms/ios/Info.plist.in @@ -5,7 +5,7 @@ CFBundleName OpenCV CFBundleIdentifier - opencv.org + org.opencv CFBundleVersion ${VERSION} CFBundleShortVersionString From 6db776f957253d1e484bba8b05afd5a9a8f415a1 Mon Sep 17 00:00:00 2001 From: yao Date: Tue, 25 Jun 2013 14:11:28 +0800 Subject: [PATCH 23/75] add "-c" for cpu ocl mode in perf tests --- modules/ocl/perf/main.cpp | 61 ++++++++++++++++++++------------------- 1 file changed, 32 insertions(+), 29 deletions(-) diff --git a/modules/ocl/perf/main.cpp b/modules/ocl/perf/main.cpp index dfcac20bc..bd2a4ec4b 100644 --- a/modules/ocl/perf/main.cpp +++ b/modules/ocl/perf/main.cpp @@ -44,43 +44,21 @@ int main(int argc, const char *argv[]) { - vector oclinfo; - int num_devices = getDevice(oclinfo); - - if (num_devices < 1) - { - cerr << "no device found\n"; - return -1; - } - // set this to overwrite binary cache every time the test starts - ocl::setBinaryDiskCache(ocl::CACHE_UPDATE); - - int devidx = 0; - - for (size_t i = 0; i < oclinfo.size(); i++) - { - for (size_t j = 0; j < oclinfo[i].DeviceName.size(); j++) - { - printf("device %d: %s\n", devidx++, oclinfo[i].DeviceName[j].c_str()); - } - } - - redirectError(cvErrorCallback); - const char *keys = "{ h | help | false | print help message }" "{ f | filter | | filter for test }" "{ w | workdir | | set working directory }" "{ l | list | false | show all tests }" "{ d | device | 0 | device id }" + "{ c | cpu_ocl | false | use cpu as ocl device}" "{ i | iters | 10 | iteration count }" "{ m | warmup | 1 | gpu warm up iteration count}" - "{ t | xtop | 1.1 | xfactor top boundary}" - "{ b | xbottom | 0.9 | xfactor bottom boundary}" + "{ t | xtop | 1.1 | xfactor top boundary}" + "{ 
b | xbottom | 0.9 | xfactor bottom boundary}" "{ v | verify | false | only run gpu once to verify if problems occur}"; + redirectError(cvErrorCallback); CommandLineParser cmd(argc, argv, keys); - if (cmd.get("help")) { cout << "Avaible options:" << endl; @@ -88,14 +66,40 @@ int main(int argc, const char *argv[]) return 0; } - int device = cmd.get("device"); + // get ocl devices + bool use_cpu = cmd.get("c"); + vector oclinfo; + int num_devices = 0; + if(use_cpu) + num_devices = getDevice(oclinfo, ocl::CVCL_DEVICE_TYPE_CPU); + else + num_devices = getDevice(oclinfo); + if (num_devices < 1) + { + cerr << "no device found\n"; + return -1; + } + // show device info + int devidx = 0; + for (size_t i = 0; i < oclinfo.size(); i++) + { + for (size_t j = 0; j < oclinfo[i].DeviceName.size(); j++) + { + cout << "device " << devidx++ << ": " << oclinfo[i].DeviceName[j] << endl; + } + } + + int device = cmd.get("device"); if (device < 0 || device >= num_devices) { cerr << "Invalid device ID" << endl; return -1; } + // set this to overwrite binary cache every time the test starts + ocl::setBinaryDiskCache(ocl::CACHE_UPDATE); + if (cmd.get("verify")) { TestSystem::instance().setNumIters(1); @@ -104,7 +108,6 @@ int main(int argc, const char *argv[]) } devidx = 0; - for (size_t i = 0; i < oclinfo.size(); i++) { for (size_t j = 0; j < oclinfo[i].DeviceName.size(); j++, devidx++) @@ -113,7 +116,7 @@ int main(int argc, const char *argv[]) { ocl::setDevice(oclinfo[i], (int)j); TestSystem::instance().setRecordName(oclinfo[i].DeviceName[j]); - printf("\nuse %d: %s\n", devidx, oclinfo[i].DeviceName[j].c_str()); + cout << "use " << devidx << ": " < Date: Tue, 25 Jun 2013 14:12:02 +0800 Subject: [PATCH 24/75] fix stereobm crash on some cpu ocl --- modules/ocl/src/opencl/stereobm.cl | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/modules/ocl/src/opencl/stereobm.cl b/modules/ocl/src/opencl/stereobm.cl index bd86a7f3f..552874d42 100644 --- a/modules/ocl/src/opencl/stereobm.cl +++ b/modules/ocl/src/opencl/stereobm.cl @@ -162,8 +162,8 @@ __kernel void stereoKernel(__global unsigned char *left, __global unsigned char int y_tex; int x_tex = X - radius; - if (x_tex >= cwidth) - return; + //if (x_tex >= cwidth) + // return; for(int d = STEREO_MIND; d < maxdisp; d += STEREO_DISP_STEP) { From 1227e00f3d03daed6a96ff52c32e3051b5114782 Mon Sep 17 00:00:00 2001 From: yao Date: Tue, 25 Jun 2013 16:26:33 +0800 Subject: [PATCH 25/75] fix moments --- modules/ocl/src/moments.cpp | 43 ++- modules/ocl/src/opencl/moments.cl | 536 +++++++++++++++--------------- modules/ocl/test/test_moments.cpp | 8 +- 3 files changed, 290 insertions(+), 297 deletions(-) diff --git a/modules/ocl/src/moments.cpp b/modules/ocl/src/moments.cpp index d6baba207..cb16fb136 100644 --- a/modules/ocl/src/moments.cpp +++ b/modules/ocl/src/moments.cpp @@ -16,7 +16,7 @@ // Third party copyrights are property of their respective owners. 
// // @Authors -// Sen Liu, sen@multicorewareinc.com +// Sen Liu, swjtuls1987@126.com // // Redistribution and use in source and binary forms, with or without modification, // are permitted provided that the following conditions are met: @@ -277,8 +277,8 @@ static void ocl_cvMoments( const void* array, CvMoments* mom, int binary ) blocky = size.height/TILE_SIZE; else blocky = size.height/TILE_SIZE + 1; - cv::ocl::oclMat dst_m(blocky * 10, blockx, CV_64FC1); - cl_mem sum = openCLCreateBuffer(src.clCxt,CL_MEM_READ_WRITE,10*sizeof(double)); + oclMat dst_m(blocky * 10, blockx, CV_64FC1); + oclMat sum(1, 10, CV_64FC1); int tile_width = std::min(size.width,TILE_SIZE); int tile_height = std::min(size.height,TILE_SIZE); size_t localThreads[3] = { tile_height, 1, 1}; @@ -288,19 +288,16 @@ static void ocl_cvMoments( const void* array, CvMoments* mom, int binary ) args.push_back( make_pair( sizeof(cl_int) , (void *)&src.rows )); args.push_back( make_pair( sizeof(cl_int) , (void *)&src.cols )); args.push_back( make_pair( sizeof(cl_int) , (void *)&src.step )); - args.push_back( make_pair( sizeof(cl_int) , (void *)&tileSize.width )); - args.push_back( make_pair( sizeof(cl_int) , (void *)&tileSize.height )); args.push_back( make_pair( sizeof(cl_mem) , (void *)&dst_m.data )); args.push_back( make_pair( sizeof(cl_int) , (void *)&dst_m.cols )); args.push_back( make_pair( sizeof(cl_int) , (void *)&dst_m.step )); args.push_back( make_pair( sizeof(cl_int) , (void *)&blocky )); - args.push_back( make_pair( sizeof(cl_int) , (void *)&type )); args.push_back( make_pair( sizeof(cl_int) , (void *)&depth )); args.push_back( make_pair( sizeof(cl_int) , (void *)&cn )); args.push_back( make_pair( sizeof(cl_int) , (void *)&coi )); args.push_back( make_pair( sizeof(cl_int) , (void *)&binary )); args.push_back( make_pair( sizeof(cl_int) , (void *)&TILE_SIZE )); - openCLExecuteKernel(dst_m.clCxt, &moments, "CvMoments", globalThreads, localThreads, args, -1, depth); + openCLExecuteKernel(Context::getContext(), &moments, "CvMoments", globalThreads, localThreads, args, -1, depth); size_t localThreadss[3] = { 128, 1, 1}; size_t globalThreadss[3] = { 128, 1, 1}; @@ -309,25 +306,23 @@ static void ocl_cvMoments( const void* array, CvMoments* mom, int binary ) args_sum.push_back( make_pair( sizeof(cl_int) , (void *)&tile_height )); args_sum.push_back( make_pair( sizeof(cl_int) , (void *)&tile_width )); args_sum.push_back( make_pair( sizeof(cl_int) , (void *)&TILE_SIZE )); - args_sum.push_back( make_pair( sizeof(cl_mem) , (void *)&sum )); + args_sum.push_back( make_pair( sizeof(cl_mem) , (void *)&sum.data )); args_sum.push_back( make_pair( sizeof(cl_mem) , (void *)&dst_m.data )); args_sum.push_back( make_pair( sizeof(cl_int) , (void *)&dst_m.step )); - openCLExecuteKernel(dst_m.clCxt, &moments, "dst_sum", globalThreadss, localThreadss, args_sum, -1, -1); - double* dstsum = new double[10]; - memset(dstsum,0,10*sizeof(double)); - openCLReadBuffer(dst_m.clCxt,sum,(void *)dstsum,10*sizeof(double)); - mom->m00 = dstsum[0]; - mom->m10 = dstsum[1]; - mom->m01 = dstsum[2]; - mom->m20 = dstsum[3]; - mom->m11 = dstsum[4]; - mom->m02 = dstsum[5]; - mom->m30 = dstsum[6]; - mom->m21 = dstsum[7]; - mom->m12 = dstsum[8]; - mom->m03 = dstsum[9]; - delete [] dstsum; - openCLSafeCall(clReleaseMemObject(sum)); + openCLExecuteKernel(Context::getContext(), &moments, "dst_sum", globalThreadss, localThreadss, args_sum, -1, -1); + + Mat dstsum(sum); + mom->m00 = dstsum.at(0, 0); + mom->m10 = dstsum.at(0, 1); + mom->m01 = dstsum.at(0, 2); + mom->m20 = 
dstsum.at(0, 3); + mom->m11 = dstsum.at(0, 4); + mom->m02 = dstsum.at(0, 5); + mom->m30 = dstsum.at(0, 6); + mom->m21 = dstsum.at(0, 7); + mom->m12 = dstsum.at(0, 8); + mom->m03 = dstsum.at(0, 9); + icvCompleteMomentState( mom ); } diff --git a/modules/ocl/src/opencl/moments.cl b/modules/ocl/src/opencl/moments.cl index 2378f4f84..71313017a 100644 --- a/modules/ocl/src/opencl/moments.cl +++ b/modules/ocl/src/opencl/moments.cl @@ -173,10 +173,10 @@ __kernel void dst_sum(int src_rows, int src_cols, int tile_height, int tile_widt sum[i] = dst_sum[i][0]; } -__kernel void CvMoments_D0(__global uchar16* src_data, int src_rows, int src_cols, int src_step, int tileSize_width, int tileSize_height, +__kernel void CvMoments_D0(__global uchar16* src_data, int src_rows, int src_cols, int src_step, __global F* dst_m, int dst_cols, int dst_step, int blocky, - int type, int depth, int cn, int coi, int binary, int TILE_SIZE) + int depth, int cn, int coi, int binary, int TILE_SIZE) { uchar tmp_coi[16]; // get the coi data uchar16 tmp[16]; @@ -192,35 +192,43 @@ __kernel void CvMoments_D0(__global uchar16* src_data, int src_rows, int src_col int x = wgidx*TILE_SIZE; // vector length of uchar int kcn = (cn==2)?2:4; int rstep = min(src_step, TILE_SIZE); - tileSize_height = min(TILE_SIZE, src_rows - y); - tileSize_width = min(TILE_SIZE, src_cols - x); + int tileSize_height = min(TILE_SIZE, src_rows - y); + int tileSize_width = min(TILE_SIZE, src_cols - x); + + if ( y+lidy < src_rows ) + { + if( tileSize_width < TILE_SIZE ) + for(int i = tileSize_width; i < rstep && (x+i) < src_cols; i++ ) + *((__global uchar*)src_data+(y+lidy)*src_step+x+i) = 0; + + if( coi > 0 ) //channel of interest + for(int i = 0; i < tileSize_width; i += VLEN_C) + { + for(int j=0; j 0 ) //channel of interest - for(int i = 0; i < tileSize_width; i += VLEN_C) - { - for(int j=0; j TILE_SIZE && tileSize_width < TILE_SIZE) - for(int i=tileSize_width; i < rstep; i++ ) - *((__global ushort*)src_data+(y+lidy)*src_step/2+x+i) = 0; - if( coi > 0 ) - for(int i=0; i < tileSize_width; i+=VLEN_US) - { - for(int j=0; j TILE_SIZE && tileSize_width < TILE_SIZE) + for(int i=tileSize_width; i < rstep && (x+i) < src_cols; i++ ) + *((__global ushort*)src_data+(y+lidy)*src_step/2+x+i) = 0; + if( coi > 0 ) + for(int i=0; i < tileSize_width; i+=VLEN_US) + { + for(int j=0; j= 1; j = j/2 ) { if(lidy < j) for( int i = 0; i < 10; i++ ) lm[i] = lm[i] + m[i][lidy]; - barrier(CLK_LOCAL_MEM_FENCE); + } + barrier(CLK_LOCAL_MEM_FENCE); + for( int j = TILE_SIZE/2; j >= 1; j = j/2 ) + { if(lidy >= j/2&&lidy < j) for( int i = 0; i < 10; i++ ) m[i][lidy-j/2] = lm[i]; - barrier(CLK_LOCAL_MEM_FENCE); } + barrier(CLK_LOCAL_MEM_FENCE); + if(lidy == 0&&lidx == 0) { for(int mt = 0; mt < 10; mt++ ) @@ -482,10 +501,10 @@ __kernel void CvMoments_D2(__global ushort8* src_data, int src_rows, int src_col } } -__kernel void CvMoments_D3(__global short8* src_data, int src_rows, int src_cols, int src_step, int tileSize_width, int tileSize_height, +__kernel void CvMoments_D3(__global short8* src_data, int src_rows, int src_cols, int src_step, __global F* dst_m, int dst_cols, int dst_step, int blocky, - int type, int depth, int cn, int coi, int binary, const int TILE_SIZE) + int depth, int cn, int coi, int binary, const int TILE_SIZE) { short tmp_coi[8]; // get the coi data short8 tmp[32]; @@ -500,21 +519,26 @@ __kernel void CvMoments_D3(__global short8* src_data, int src_rows, int src_cols int x = wgidx*TILE_SIZE; // real X index of pixel int kcn = (cn==2)?2:4; int rstep = min(src_step/2, 
TILE_SIZE); - tileSize_height = min(TILE_SIZE, src_rows - y); - tileSize_width = min(TILE_SIZE, src_cols -x); - if(tileSize_width < TILE_SIZE) - for(int i = tileSize_width; i < rstep; i++ ) - *((__global short*)src_data+(y+lidy)*src_step/2+x+i) = 0; - if( coi > 0 ) - for(int i=0; i < tileSize_width; i+=VLEN_S) - { - for(int j=0; j 0 ) + for(int i=0; i < tileSize_width; i+=VLEN_S) + { + for(int j=0; j 0 ) - for(int i=0; i < tileSize_width; i+=VLEN_F) - { -#pragma unroll - for(int j=0; j<4; j++) + + if ( y+lidy < src_rows ) + { + if(tileSize_width < TILE_SIZE) + for(int i = tileSize_width; i < rstep && (x+i) < src_cols; i++ ) + *((__global float*)src_data+(y+lidy)*src_step/4+x+i) = 0; + if( coi > 0 ) + for(int i=0; i < tileSize_width; i+=VLEN_F) { - index = yOff+(x+i+j)*kcn+coi-1; - if (index < maxIdx) - tmp_coi[j] = *(src_data+index); - else - tmp_coi[j] = 0; + for(int j=0; j<4; j++) + tmp_coi[j] = *(src_data+(y+lidy)*src_step/4+(x+i+j)*kcn+coi-1); + tmp[i/VLEN_F] = (float4)(tmp_coi[0],tmp_coi[1],tmp_coi[2],tmp_coi[3]); } - tmp[i/VLEN_F] = (float4)(tmp_coi[0],tmp_coi[1],tmp_coi[2],tmp_coi[3]); - } - else - for(int i=0; i < tileSize_width && (yOff+x+i) < maxIdx; i+=VLEN_F) - tmp[i/VLEN_F] = (*(__global float4 *)(src_data+yOff+x+i)); + else + for(int i=0; i < tileSize_width; i+=VLEN_F) + tmp[i/VLEN_F] = (float4)(*(src_data+(y+lidy)*src_step/4+x+i),*(src_data+(y+lidy)*src_step/4+x+i+1),*(src_data+(y+lidy)*src_step/4+x+i+2),*(src_data+(y+lidy)*src_step/4+x+i+3)); + } + float4 zero = (float4)(0); float4 full = (float4)(255); if( binary ) @@ -688,10 +708,9 @@ __kernel void CvMoments_D5( __global float* src_data, int src_rows, int src_cols tmp[i/VLEN_F] = (tmp[i/VLEN_F]!=zero)?full:zero; F mom[10]; __local F m[10][128]; - if(lidy == 0) + if(lidy < 128) for(int i = 0; i < 10; i ++) - for(int j = 0; j < 128; j ++) - m[i][j] = 0; + m[i][lidy] = 0; barrier(CLK_LOCAL_MEM_FENCE); F lm[10] = {0}; F4 x0 = (F4)(0); @@ -729,185 +748,6 @@ __kernel void CvMoments_D5( __global float* src_data, int src_rows, int src_cols m[0][lidy-bheight] = x0.s0; // m00 } - else if(lidy < bheight) - { - lm[9] = ((F)py) * sy; // m03 - lm[8] = ((F)x1.s0) * sy; // m12 - lm[7] = ((F)x2.s0) * lidy; // m21 - lm[6] = x3.s0; // m30 - lm[5] = x0.s0 * sy; // m02 - lm[4] = x1.s0 * lidy; // m11 - lm[3] = x2.s0; // m20 - lm[2] = py; // m01 - lm[1] = x1.s0; // m10 - lm[0] = x0.s0; // m00 - } - barrier(CLK_LOCAL_MEM_FENCE); - for( int j = TILE_SIZE/2; j >= 1; j = j/2 ) - { - if(lidy < j) - for( int i = 0; i < 10; i++ ) - lm[i] = lm[i] + m[i][lidy]; - barrier(CLK_LOCAL_MEM_FENCE); - if(lidy >= j/2&&lidy < j) - for( int i = 0; i < 10; i++ ) - m[i][lidy-j/2] = lm[i]; - barrier(CLK_LOCAL_MEM_FENCE); - } - if(lidy == 0&&lidx == 0) - { - for( int mt = 0; mt < 10; mt++ ) - mom[mt] = (F)lm[mt]; - if(binary) - { - F s = 1./255; - for( int mt = 0; mt < 10; mt++ ) - mom[mt] *= s; - } - - F xm = x * mom[0], ym = y * mom[0]; - - // accumulate moments computed in each tile - dst_step /= sizeof(F); - - int dst_x_off = mad24(wgidy, dst_cols, wgidx); - int dst_off = 0; - int max_dst_index = 10 * blocky * get_global_size(1); - - // + m00 ( = m00' ) - dst_off = mad24(DST_ROW_00 * blocky, dst_step, dst_x_off); - if (dst_off < max_dst_index) - *(dst_m + dst_off) = mom[0]; - - // + m10 ( = m10' + x*m00' ) - dst_off = mad24(DST_ROW_10 * blocky, dst_step, dst_x_off); - if (dst_off < max_dst_index) - *(dst_m + dst_off) = mom[1] + xm; - - // + m01 ( = m01' + y*m00' ) - dst_off = mad24(DST_ROW_01 * blocky, dst_step, dst_x_off); - if (dst_off < max_dst_index) - 
*(dst_m + dst_off) = mom[2] + ym; - - // + m20 ( = m20' + 2*x*m10' + x*x*m00' ) - dst_off = mad24(DST_ROW_20 * blocky, dst_step, dst_x_off); - if (dst_off < max_dst_index) - *(dst_m + dst_off) = mom[3] + x * (mom[1] * 2 + xm); - - // + m11 ( = m11' + x*m01' + y*m10' + x*y*m00' ) - dst_off = mad24(DST_ROW_11 * blocky, dst_step, dst_x_off); - if (dst_off < max_dst_index) - *(dst_m + dst_off) = mom[4] + x * (mom[2] + ym) + y * mom[1]; - - // + m02 ( = m02' + 2*y*m01' + y*y*m00' ) - dst_off = mad24(DST_ROW_02 * blocky, dst_step, dst_x_off); - if (dst_off < max_dst_index) - *(dst_m + dst_off) = mom[5] + y * (mom[2] * 2 + ym); - - // + m30 ( = m30' + 3*x*m20' + 3*x*x*m10' + x*x*x*m00' ) - dst_off = mad24(DST_ROW_30 * blocky, dst_step, dst_x_off); - if (dst_off < max_dst_index) - *(dst_m + dst_off) = mom[6] + x * (3. * mom[3] + x * (3. * mom[1] + xm)); - - // + m21 ( = m21' + x*(2*m11' + 2*y*m10' + x*m01' + x*y*m00') + y*m20') - dst_off = mad24(DST_ROW_21 * blocky, dst_step, dst_x_off); - if (dst_off < max_dst_index) - *(dst_m + dst_off) = mom[7] + x * (2 * (mom[4] + y * mom[1]) + x * (mom[2] + ym)) + y * mom[3]; - - // + m12 ( = m12' + y*(2*m11' + 2*x*m01' + y*m10' + x*y*m00') + x*m02') - dst_off = mad24(DST_ROW_12 * blocky, dst_step, dst_x_off); - if (dst_off < max_dst_index) - *(dst_m + dst_off) = mom[8] + y * (2 * (mom[4] + x * mom[2]) + y * (mom[1] + xm)) + x * mom[5]; - - // + m03 ( = m03' + 3*y*m02' + 3*y*y*m01' + y*y*y*m00' ) - dst_off = mad24(DST_ROW_03 * blocky, dst_step, dst_x_off); - if (dst_off < max_dst_index) - *(dst_m + dst_off) = mom[9] + y * (3. * mom[5] + y * (3. * mom[2] + ym)); - } -} - -__kernel void CvMoments_D6(__global F* src_data, int src_rows, int src_cols, int src_step, int tileSize_width, int tileSize_height, - __global F* dst_m, - int dst_cols, int dst_step, int blocky, - int type, int depth, int cn, int coi, int binary, const int TILE_SIZE) -{ - F tmp_coi[4]; // get the coi data - F4 tmp[64]; - int VLEN_D = 4; // length of vetor - int gidy = get_global_id(0); - int gidx = get_global_id(1); - int wgidy = get_group_id(0); - int wgidx = get_group_id(1); - int lidy = get_local_id(0); - int lidx = get_local_id(1); - int y = wgidy*TILE_SIZE; // real Y index of pixel - int x = wgidx*TILE_SIZE; // real X index of pixel - int kcn = (cn==2)?2:4; - int rstep = min(src_step/8, TILE_SIZE); - tileSize_height = min(TILE_SIZE, src_rows - y); - tileSize_width = min(TILE_SIZE, src_cols - x); - - if(tileSize_width < TILE_SIZE) - for(int i = tileSize_width; i < rstep; i++ ) - *((__global F*)src_data+(y+lidy)*src_step/8+x+i) = 0; - if( coi > 0 ) - for(int i=0; i < tileSize_width; i+=VLEN_D) - { - for(int j=0; j<4; j++) - tmp_coi[j] = *(src_data+(y+lidy)*src_step/8+(x+i+j)*kcn+coi-1); - tmp[i/VLEN_D] = (F4)(tmp_coi[0],tmp_coi[1],tmp_coi[2],tmp_coi[3]); - } - else - for(int i=0; i < tileSize_width; i+=VLEN_D) - tmp[i/VLEN_D] = (F4)(*(src_data+(y+lidy)*src_step/8+x+i),*(src_data+(y+lidy)*src_step/8+x+i+1),*(src_data+(y+lidy)*src_step/8+x+i+2),*(src_data+(y+lidy)*src_step/8+x+i+3)); - F4 zero = (F4)(0); - F4 full = (F4)(255); - if( binary ) - for(int i=0; i < tileSize_width; i+=VLEN_D) - tmp[i/VLEN_D] = (tmp[i/VLEN_D]!=zero)?full:zero; - F mom[10]; - __local F m[10][128]; - if(lidy == 0) - for(int i=0; i<10; i++) - for(int j=0; j<128; j++) - m[i][j]=0; - barrier(CLK_LOCAL_MEM_FENCE); - F lm[10] = {0}; - F4 x0 = (F4)(0); - F4 x1 = (F4)(0); - F4 x2 = (F4)(0); - F4 x3 = (F4)(0); - for( int xt = 0 ; xt < tileSize_width; xt+=VLEN_D ) - { - F4 v_xt = (F4)(xt, xt+1, xt+2, xt+3); - F4 p = 
tmp[xt/VLEN_D]; - F4 xp = v_xt * p, xxp = xp * v_xt; - x0 += p; - x1 += xp; - x2 += xxp; - x3 += xxp *v_xt; - } - x0.s0 += x0.s1 + x0.s2 + x0.s3; - x1.s0 += x1.s1 + x1.s2 + x1.s3; - x2.s0 += x2.s1 + x2.s2 + x2.s3; - x3.s0 += x3.s1 + x3.s2 + x3.s3; - - F py = lidy * x0.s0, sy = lidy*lidy; - int bheight = min(tileSize_height, TILE_SIZE/2); - if(bheight >= TILE_SIZE/2&&lidy > bheight-1&&lidy < tileSize_height) - { - m[9][lidy-bheight] = ((F)py) * sy; // m03 - m[8][lidy-bheight] = ((F)x1.s0) * sy; // m12 - m[7][lidy-bheight] = ((F)x2.s0) * lidy; // m21 - m[6][lidy-bheight] = x3.s0; // m30 - m[5][lidy-bheight] = x0.s0 * sy; // m02 - m[4][lidy-bheight] = x1.s0 * lidy; // m11 - m[3][lidy-bheight] = x2.s0; // m20 - m[2][lidy-bheight] = py; // m01 - m[1][lidy-bheight] = x1.s0; // m10 - m[0][lidy-bheight] = x0.s0; // m00 - } - else if(lidy < bheight) { lm[9] = ((F)py) * sy; // m03 @@ -922,6 +762,164 @@ __kernel void CvMoments_D6(__global F* src_data, int src_rows, int src_cols, in lm[0] = x0.s0; // m00 } barrier(CLK_LOCAL_MEM_FENCE); + for( int j = TILE_SIZE/2; j >= 1; j = j/2 ) + { + if(lidy < j) + for( int i = 0; i < 10; i++ ) + lm[i] = lm[i] + m[i][lidy]; + barrier(CLK_LOCAL_MEM_FENCE); + if(lidy >= j/2&&lidy < j) + for( int i = 0; i < 10; i++ ) + m[i][lidy-j/2] = lm[i]; + barrier(CLK_LOCAL_MEM_FENCE); + } + if(lidy == 0&&lidx == 0) + { + for( int mt = 0; mt < 10; mt++ ) + mom[mt] = (F)lm[mt]; + if(binary) + { + F s = 1./255; + for( int mt = 0; mt < 10; mt++ ) + mom[mt] *= s; + } + + F xm = x * mom[0], ym = y * mom[0]; + + // accumulate moments computed in each tile + dst_step /= sizeof(F); + + // + m00 ( = m00' ) + *(dst_m + mad24(DST_ROW_00 * blocky, dst_step, mad24(wgidy, dst_cols, wgidx))) = mom[0]; + + // + m10 ( = m10' + x*m00' ) + *(dst_m + mad24(DST_ROW_10 * blocky, dst_step, mad24(wgidy, dst_cols, wgidx))) = mom[1] + xm; + + // + m01 ( = m01' + y*m00' ) + *(dst_m + mad24(DST_ROW_01 * blocky, dst_step, mad24(wgidy, dst_cols, wgidx))) = mom[2] + ym; + + // + m20 ( = m20' + 2*x*m10' + x*x*m00' ) + *(dst_m + mad24(DST_ROW_20 * blocky, dst_step, mad24(wgidy, dst_cols, wgidx))) = mom[3] + x * (mom[1] * 2 + xm); + + // + m11 ( = m11' + x*m01' + y*m10' + x*y*m00' ) + *(dst_m + mad24(DST_ROW_11 * blocky, dst_step, mad24(wgidy, dst_cols, wgidx))) = mom[4] + x * (mom[2] + ym) + y * mom[1]; + + // + m02 ( = m02' + 2*y*m01' + y*y*m00' ) + *(dst_m + mad24(DST_ROW_02 * blocky, dst_step, mad24(wgidy, dst_cols, wgidx))) = mom[5] + y * (mom[2] * 2 + ym); + + // + m30 ( = m30' + 3*x*m20' + 3*x*x*m10' + x*x*x*m00' ) + *(dst_m + mad24(DST_ROW_30 * blocky, dst_step, mad24(wgidy, dst_cols, wgidx))) = mom[6] + x * (3. * mom[3] + x * (3. * mom[1] + xm)); + + // + m21 ( = m21' + x*(2*m11' + 2*y*m10' + x*m01' + x*y*m00') + y*m20') + *(dst_m + mad24(DST_ROW_21 * blocky, dst_step, mad24(wgidy, dst_cols, wgidx))) = mom[7] + x * (2 * (mom[4] + y * mom[1]) + x * (mom[2] + ym)) + y * mom[3]; + + // + m12 ( = m12' + y*(2*m11' + 2*x*m01' + y*m10' + x*y*m00') + x*m02') + *(dst_m + mad24(DST_ROW_12 * blocky, dst_step, mad24(wgidy, dst_cols, wgidx))) = mom[8] + y * (2 * (mom[4] + x * mom[2]) + y * (mom[1] + xm)) + x * mom[5]; + + // + m03 ( = m03' + 3*y*m02' + 3*y*y*m01' + y*y*y*m00' ) + *(dst_m + mad24(DST_ROW_03 * blocky, dst_step, mad24(wgidy, dst_cols, wgidx))) = mom[9] + y * (3. * mom[5] + y * (3. 
* mom[2] + ym)); + } +} + +__kernel void CvMoments_D6(__global F* src_data, int src_rows, int src_cols, int src_step, + __global F* dst_m, + int dst_cols, int dst_step, int blocky, + int depth, int cn, int coi, int binary, const int TILE_SIZE) +{ + F tmp_coi[4]; // get the coi data + F4 tmp[64]; + int VLEN_D = 4; // length of vetor + int gidy = get_global_id(0); + int gidx = get_global_id(1); + int wgidy = get_group_id(0); + int wgidx = get_group_id(1); + int lidy = get_local_id(0); + int lidx = get_local_id(1); + int y = wgidy*TILE_SIZE; // real Y index of pixel + int x = wgidx*TILE_SIZE; // real X index of pixel + int kcn = (cn==2)?2:4; + int rstep = min(src_step/8, TILE_SIZE); + int tileSize_height = min(TILE_SIZE, src_rows - y); + int tileSize_width = min(TILE_SIZE, src_cols - x); + + if ( y+lidy < src_rows ) + { + if(tileSize_width < TILE_SIZE) + for(int i = tileSize_width; i < rstep && (x+i) < src_cols; i++ ) + *((__global F*)src_data+(y+lidy)*src_step/8+x+i) = 0; + if( coi > 0 ) + for(int i=0; i < tileSize_width; i+=VLEN_D) + { + for(int j=0; j<4 && ((x+i+j)*kcn+coi-1)= TILE_SIZE/2&&lidy > bheight-1&&lidy < tileSize_height) + { + m[9][lidy-bheight] = ((F)py) * sy; // m03 + m[8][lidy-bheight] = ((F)x1.s0) * sy; // m12 + m[7][lidy-bheight] = ((F)x2.s0) * lidy; // m21 + m[6][lidy-bheight] = x3.s0; // m30 + m[5][lidy-bheight] = x0.s0 * sy; // m02 + m[4][lidy-bheight] = x1.s0 * lidy; // m11 + m[3][lidy-bheight] = x2.s0; // m20 + m[2][lidy-bheight] = py; // m01 + m[1][lidy-bheight] = x1.s0; // m10 + m[0][lidy-bheight] = x0.s0; // m00 + } + else if(lidy < bheight) + { + lm[9] = ((F)py) * sy; // m03 + lm[8] = ((F)x1.s0) * sy; // m12 + lm[7] = ((F)x2.s0) * lidy; // m21 + lm[6] = x3.s0; // m30 + lm[5] = x0.s0 * sy; // m02 + lm[4] = x1.s0 * lidy; // m11 + lm[3] = x2.s0; // m20 + lm[2] = py; // m01 + lm[1] = x1.s0; // m10 + lm[0] = x0.s0; // m00 + } + barrier(CLK_LOCAL_MEM_FENCE); + for( int j = TILE_SIZE/2; j >= 1; j = j/2 ) { if(lidy < j) diff --git a/modules/ocl/test/test_moments.cpp b/modules/ocl/test/test_moments.cpp index 98c66def3..86f4779d6 100644 --- a/modules/ocl/test/test_moments.cpp +++ b/modules/ocl/test/test_moments.cpp @@ -45,12 +45,12 @@ TEST_P(MomentsTest, Mat) { if(test_contours) { - Mat src = imread( workdir + "../cpp/pic3.png", 1 ); - Mat src_gray, canny_output; - cvtColor( src, src_gray, CV_BGR2GRAY ); + Mat src = imread( workdir + "../cpp/pic3.png", IMREAD_GRAYSCALE ); + ASSERT_FALSE(src.empty()); + Mat canny_output; vector > contours; vector hierarchy; - Canny( src_gray, canny_output, 100, 200, 3 ); + Canny( src, canny_output, 100, 200, 3 ); findContours( canny_output, contours, hierarchy, CV_RETR_TREE, CV_CHAIN_APPROX_SIMPLE, Point(0, 0) ); for( size_t i = 0; i < contours.size(); i++ ) { From 43122939cbe17e534442bd9ba9bb299e752a13a5 Mon Sep 17 00:00:00 2001 From: Alexander Smorkalov Date: Wed, 22 May 2013 04:21:23 -0700 Subject: [PATCH 26/75] Media foundation video i/o fixes. Bug in Video for Windows capture init fixed; Media Foundation based capture finalization fixed; Highgui tests for video i/o updated. 
--- .../include/opencv2/highgui/highgui_c.h | 4 +- modules/highgui/src/cap.cpp | 39 ++--- modules/highgui/src/cap_msmf.cpp | 138 ++++++++---------- modules/highgui/src/cap_vfw.cpp | 4 +- modules/highgui/test/test_video_io.cpp | 58 ++++---- 5 files changed, 110 insertions(+), 133 deletions(-) diff --git a/modules/highgui/include/opencv2/highgui/highgui_c.h b/modules/highgui/include/opencv2/highgui/highgui_c.h index 12be9867a..fbcdba24f 100644 --- a/modules/highgui/include/opencv2/highgui/highgui_c.h +++ b/modules/highgui/include/opencv2/highgui/highgui_c.h @@ -558,9 +558,11 @@ CVAPI(int) cvGetCaptureDomain( CvCapture* capture); /* "black box" video file writer structure */ typedef struct CvVideoWriter CvVideoWriter; +#define CV_FOURCC_MACRO(c1, c2, c3, c4) ((c1 & 255) + ((c2 & 255) << 8) + ((c3 & 255) << 16) + ((c4 & 255) << 24)) + CV_INLINE int CV_FOURCC(char c1, char c2, char c3, char c4) { - return (c1 & 255) + ((c2 & 255) << 8) + ((c3 & 255) << 16) + ((c4 & 255) << 24); + return CV_FOURCC_MACRO(c1, c2, c3, c4); } #define CV_FOURCC_PROMPT -1 /* Open Codec Selection Dialog (Windows only) */ diff --git a/modules/highgui/src/cap.cpp b/modules/highgui/src/cap.cpp index 8db873102..cc92da3d0 100644 --- a/modules/highgui/src/cap.cpp +++ b/modules/highgui/src/cap.cpp @@ -199,15 +199,6 @@ CV_IMPL CvCapture * cvCreateCameraCapture (int index) switch (domains[i]) { -#ifdef HAVE_MSMF - case CV_CAP_MSMF: - printf("Creating Media foundation capture\n"); - capture = cvCreateCameraCapture_MSMF (index); - printf("Capture address %p\n", capture); - if (capture) - return capture; - break; -#endif #ifdef HAVE_DSHOW case CV_CAP_DSHOW: capture = cvCreateCameraCapture_DShow (index); @@ -215,7 +206,13 @@ CV_IMPL CvCapture * cvCreateCameraCapture (int index) return capture; break; #endif - +#ifdef HAVE_MSMF + case CV_CAP_MSMF: + capture = cvCreateCameraCapture_MSMF (index); + if (capture) + return capture; + break; +#endif #ifdef HAVE_TYZX case CV_CAP_STEREO: capture = cvCreateCameraCapture_TYZX (index); @@ -223,14 +220,12 @@ CV_IMPL CvCapture * cvCreateCameraCapture (int index) return capture; break; #endif - - case CV_CAP_VFW: #ifdef HAVE_VFW + case CV_CAP_VFW: capture = cvCreateCameraCapture_VFW (index); if (capture) return capture; #endif - #if defined HAVE_LIBV4L || defined HAVE_CAMV4L || defined HAVE_CAMV4L2 || defined HAVE_VIDEOIO capture = cvCreateCameraCapture_V4L (index); if (capture) @@ -363,20 +358,16 @@ CV_IMPL CvCapture * cvCreateFileCapture (const char * filename) if (! result) result = cvCreateFileCapture_FFMPEG_proxy (filename); -#ifdef HAVE_MSMF - if (! result) - { - printf("Creating Media foundation based reader\n"); - result = cvCreateFileCapture_MSMF (filename); - printf("Construction result %p\n", result); - } -#endif - #ifdef HAVE_VFW if (! result) result = cvCreateFileCapture_VFW (filename); #endif +#ifdef HAVE_MSMF + if (! result) + result = cvCreateFileCapture_MSMF (filename); +#endif + #ifdef HAVE_XINE if (! 
result) result = cvCreateFileCapture_XINE (filename); @@ -422,14 +413,12 @@ CV_IMPL CvVideoWriter* cvCreateVideoWriter( const char* filename, int fourcc, if(!fourcc || !fps) result = cvCreateVideoWriter_Images(filename); -#ifdef HAVE_FFMPEG if(!result) result = cvCreateVideoWriter_FFMPEG_proxy (filename, fourcc, fps, frameSize, is_color); -#endif #ifdef HAVE_VFW if(!result) - return cvCreateVideoWriter_VFW(filename, fourcc, fps, frameSize, is_color); + result = cvCreateVideoWriter_VFW(filename, fourcc, fps, frameSize, is_color); #endif #ifdef HAVE_MSMF diff --git a/modules/highgui/src/cap_msmf.cpp b/modules/highgui/src/cap_msmf.cpp index 814fb75be..7af85382b 100644 --- a/modules/highgui/src/cap_msmf.cpp +++ b/modules/highgui/src/cap_msmf.cpp @@ -840,9 +840,11 @@ LPCWSTR GetGUIDNameConstNew(const GUID& guid) IF_EQUAL_RETURN(guid, MFAudioFormat_ADTS); // WAVE_FORMAT_MPEG_ADTS_AAC return NULL; } + FormatReader::FormatReader(void) { } + MediaType FormatReader::Read(IMFMediaType *pType) { UINT32 count = 0; @@ -888,9 +890,9 @@ ImageGrabber::ImageGrabber(unsigned int deviceID, bool synchronous): ig_RIE(true), ig_Close(false), ig_Synchronous(synchronous), - ig_hFrameReady(synchronous ? CreateEvent(NULL, FALSE, FALSE, "ig_hFrameReady"): 0), - ig_hFrameGrabbed(synchronous ? CreateEvent(NULL, FALSE, TRUE, "ig_hFrameGrabbed"): 0), - ig_hFinish(synchronous ? CreateEvent(NULL, FALSE, FALSE, "ig_hFinish") : 0) + ig_hFrameReady(synchronous ? CreateEvent(NULL, FALSE, FALSE, NULL): 0), + ig_hFrameGrabbed(synchronous ? CreateEvent(NULL, FALSE, TRUE, NULL): 0), + ig_hFinish(CreateEvent(NULL, TRUE, FALSE, NULL)) {} ImageGrabber::~ImageGrabber(void) @@ -898,15 +900,16 @@ ImageGrabber::~ImageGrabber(void) printf("ImageGrabber::~ImageGrabber()\n"); if (ig_pSession) { - printf("ig_pSession->Shutdown()"); + printf("ig_pSession->Shutdown()\n"); ig_pSession->Shutdown(); } + CloseHandle(ig_hFinish); + if (ig_Synchronous) { CloseHandle(ig_hFrameReady); CloseHandle(ig_hFrameGrabbed); - CloseHandle(ig_hFinish); } SafeRelease(&ig_pSession); @@ -1061,7 +1064,6 @@ HRESULT ImageGrabber::startGrabbing(void) hr = S_OK; goto done; } - printf("media foundation event: %d\n", met); if (met == MESessionEnded) { DPO->printOut(L"IMAGEGRABBER VIDEODEVICE %i: MESessionEnded \n", ig_DeviceID); @@ -1088,13 +1090,7 @@ HRESULT ImageGrabber::startGrabbing(void) DPO->printOut(L"IMAGEGRABBER VIDEODEVICE %i: Finish startGrabbing \n", ig_DeviceID); done: - if (ig_Synchronous) - { - SetEvent(ig_hFinish); - } - - SafeRelease(&ig_pSession); - SafeRelease(&ig_pTopology); + SetEvent(ig_hFinish); return hr; } @@ -1109,7 +1105,7 @@ void ImageGrabber::resumeGrabbing() HRESULT ImageGrabber::CreateTopology(IMFMediaSource *pSource, IMFActivate *pSinkActivate, IMFTopology **ppTopo) { - ComPtr pTopology = NULL; + IMFTopology* pTopology = NULL; ComPtr pPD = NULL; ComPtr pSD = NULL; ComPtr pHandler = NULL; @@ -1117,7 +1113,7 @@ HRESULT ImageGrabber::CreateTopology(IMFMediaSource *pSource, IMFActivate *pSink ComPtr pNode2 = NULL; HRESULT hr = S_OK; DWORD cStreams = 0; - CHECK_HR(hr = MFCreateTopology(pTopology.GetAddressOf())); + CHECK_HR(hr = MFCreateTopology(&pTopology)); CHECK_HR(hr = pSource->CreatePresentationDescriptor(pPD.GetAddressOf())); CHECK_HR(hr = pPD->GetStreamDescriptorCount(&cStreams)); for (DWORD i = 0; i < cStreams; i++) @@ -1130,8 +1126,8 @@ HRESULT ImageGrabber::CreateTopology(IMFMediaSource *pSource, IMFActivate *pSink CHECK_HR(hr = pHandler->GetMajorType(&majorType)); if (majorType == MFMediaType_Video && fSelected) { - CHECK_HR(hr = 
AddSourceNode(pTopology.Get(), pSource, pPD.Get(), pSD.Get(), pNode1.GetAddressOf())); - CHECK_HR(hr = AddOutputNode(pTopology.Get(), pSinkActivate, 0, pNode2.GetAddressOf())); + CHECK_HR(hr = AddSourceNode(pTopology, pSource, pPD.Get(), pSD.Get(), pNode1.GetAddressOf())); + CHECK_HR(hr = AddOutputNode(pTopology, pSinkActivate, 0, pNode2.GetAddressOf())); CHECK_HR(hr = pNode1->ConnectOutput(0, pNode2.Get(), 0)); break; } @@ -1140,7 +1136,7 @@ HRESULT ImageGrabber::CreateTopology(IMFMediaSource *pSource, IMFActivate *pSink CHECK_HR(hr = pPD->DeselectStream(i)); } } - *ppTopo = pTopology.Get(); + *ppTopo = pTopology; (*ppTopo)->AddRef(); done: @@ -1286,9 +1282,15 @@ STDMETHODIMP ImageGrabber::OnProcessSample(REFGUID guidMajorMediaType, DWORD dwS (void)llSampleDuration; (void)dwSampleSize; - WaitForSingleObject(ig_hFrameGrabbed, INFINITE); + //printf("ImageGrabber::OnProcessSample() -- begin\n"); + HANDLE tmp[] = {ig_hFinish, ig_hFrameGrabbed, NULL}; - printf("ImageGrabber::OnProcessSample() -- begin\n"); + DWORD status = WaitForMultipleObjects(2, tmp, FALSE, INFINITE); + if (status == WAIT_OBJECT_0) + { + printf("OnProcessFrame called after ig_hFinish event\n"); + return S_OK; + } if(ig_RIE) { @@ -1300,25 +1302,24 @@ STDMETHODIMP ImageGrabber::OnProcessSample(REFGUID guidMajorMediaType, DWORD dwS ig_RISecond->fastCopy(pSampleBuffer); ig_RIOut = ig_RISecond; } - ig_RIE = !ig_RIE; - printf("ImageGrabber::OnProcessSample() -- end\n"); + //printf("ImageGrabber::OnProcessSample() -- end\n"); if (ig_Synchronous) { SetEvent(ig_hFrameReady); } + else + { + ig_RIE = !ig_RIE; + } return S_OK; } STDMETHODIMP ImageGrabber::OnShutdown() { - if (ig_Synchronous) - { - SetEvent(ig_hFrameGrabbed); - } - + SetEvent(ig_hFinish); return S_OK; } @@ -1387,6 +1388,8 @@ ImageGrabberThread::~ImageGrabberThread(void) { DebugPrintOut *DPO = &DebugPrintOut::getInstance(); DPO->printOut(L"IMAGEGRABBERTHREAD VIDEODEVICE %i: Destroing ImageGrabberThread\n", igt_DeviceID); + if (igt_Handle) + WaitForSingleObject(igt_Handle, INFINITE); delete igt_pImageGrabber; } @@ -1431,7 +1434,6 @@ void ImageGrabberThread::run() DPO->printOut(L"IMAGEGRABBERTHREAD VIDEODEVICE %i: Emergency Stop thread\n", igt_DeviceID); if(igt_func) { - printf("Calling Emergency stop even handler\n"); igt_func(igt_DeviceID, igt_userData); } } @@ -3045,6 +3047,7 @@ CvCaptureFile_MSMF::CvCaptureFile_MSMF(): CvCaptureFile_MSMF::~CvCaptureFile_MSMF() { + close(); MFShutdown(); } @@ -3109,7 +3112,7 @@ void CvCaptureFile_MSMF::close() if (grabberThread) { isOpened = false; - SetEvent(grabberThread->getImageGrabber()->ig_hFrameReady); + SetEvent(grabberThread->getImageGrabber()->ig_hFinish); grabberThread->stop(); delete grabberThread; } @@ -3289,12 +3292,12 @@ bool CvCaptureFile_MSMF::grabFrame() DWORD waitResult; if (isOpened) { + SetEvent(grabberThread->getImageGrabber()->ig_hFrameGrabbed); HANDLE tmp[] = {grabberThread->getImageGrabber()->ig_hFrameReady, grabberThread->getImageGrabber()->ig_hFinish, 0}; waitResult = WaitForMultipleObjects(2, tmp, FALSE, INFINITE); - SetEvent(grabberThread->getImageGrabber()->ig_hFrameGrabbed); } - return isOpened && (waitResult == WAIT_OBJECT_0); + return isOpened && grabberThread->getImageGrabber()->getRawImage()->isNew() && (waitResult == WAIT_OBJECT_0); } IplImage* CvCaptureFile_MSMF::retrieveFrame(int) @@ -3443,7 +3446,8 @@ private: HRESULT WriteFrame(DWORD *videoFrameBuffer, const LONGLONG& rtStart, const LONGLONG& rtDuration); }; -CvVideoWriter_MSMF::CvVideoWriter_MSMF() +CvVideoWriter_MSMF::CvVideoWriter_MSMF(): + 
initiated(false) { } @@ -3456,47 +3460,47 @@ const GUID CvVideoWriter_MSMF::FourCC2GUID(int fourcc) { switch(fourcc) { - case 'dv25': + case CV_FOURCC_MACRO('d', 'v', '2', '5'): return MFVideoFormat_DV25; break; - case 'dv50': + case CV_FOURCC_MACRO('d', 'v', '5', '0'): return MFVideoFormat_DV50; break; - case 'dvc ': + case CV_FOURCC_MACRO('d', 'v', 'c', ' '): return MFVideoFormat_DVC; break; - case 'dvh1': + case CV_FOURCC_MACRO('d', 'v', 'h', '1'): return MFVideoFormat_DVH1; break; - case 'dvhd': + case CV_FOURCC_MACRO('d', 'v', 'h', 'd'): return MFVideoFormat_DVHD; break; - case 'dvsd': + case CV_FOURCC_MACRO('d', 'v', 's', 'd'): return MFVideoFormat_DVSD; break; - case 'dvsl': + case CV_FOURCC_MACRO('d', 'v', 's', 'l'): return MFVideoFormat_DVSL; break; - case 'H263': + case CV_FOURCC_MACRO('H', '2', '6', '3'): return MFVideoFormat_H263; break; - case 'H264': + case CV_FOURCC_MACRO('H', '2', '6', '4'): return MFVideoFormat_H264; break; - case 'M4S2': + case CV_FOURCC_MACRO('M', '4', 'S', '2'): return MFVideoFormat_M4S2; break; - case 'MJPG': + case CV_FOURCC_MACRO('M', 'J', 'P', 'G'): return MFVideoFormat_MJPG; break; - case 'MP43': + case CV_FOURCC_MACRO('M', 'P', '4', '3'): return MFVideoFormat_MP43; break; - case 'MP4S': + case CV_FOURCC_MACRO('M', 'P', '4', 'S'): return MFVideoFormat_MP4S; break; - case 'MP4V': + case CV_FOURCC_MACRO('M', 'P', '4', 'V'): return MFVideoFormat_MP4V; break; - case 'MPG1': + case CV_FOURCC_MACRO('M', 'P', 'G', '1'): return MFVideoFormat_MPG1; break; - case 'MSS1': + case CV_FOURCC_MACRO('M', 'S', 'S', '1'): return MFVideoFormat_MSS1; break; - case 'MSS2': + case CV_FOURCC_MACRO('M', 'S', 'S', '2'): return MFVideoFormat_MSS2; break; - case 'WMV1': + case CV_FOURCC_MACRO('W', 'M', 'V', '1'): return MFVideoFormat_WMV1; break; - case 'WMV2': + case CV_FOURCC_MACRO('W', 'M', 'V', '2'): return MFVideoFormat_WMV2; break; - case 'WMV3': + case CV_FOURCC_MACRO('W', 'M', 'V', '3'): return MFVideoFormat_WMV3; break; - case 'WVC1': + case CV_FOURCC_MACRO('W', 'V', 'C', '1'): return MFVideoFormat_WVC1; break; default: return MFVideoFormat_H264; @@ -3516,19 +3520,15 @@ bool CvVideoWriter_MSMF::open( const char* filename, int fourcc, HRESULT hr = CoInitializeEx(NULL, COINIT_APARTMENTTHREADED); if (SUCCEEDED(hr)) { - printf("CoInitializeEx is successfull\n"); hr = MFStartup(MF_VERSION); if (SUCCEEDED(hr)) { - printf("MFStartup is successfull\n"); hr = InitializeSinkWriter(filename); if (SUCCEEDED(hr)) { - printf("InitializeSinkWriter is successfull\n"); initiated = true; rtStart = 0; MFFrameRateToAverageTimePerFrame((UINT32)fps, 1, &rtDuration); - printf("duration: %d\n", rtDuration); } } } @@ -3556,13 +3556,9 @@ bool CvVideoWriter_MSMF::writeFrame(const IplImage* img) if (!img) return false; - printf("Writing not empty IplImage\n"); - int length = img->width * img->height * 4; - printf("Image: %dx%d, %d\n", img->width, img->height, length); DWORD* target = new DWORD[length]; - printf("Before for loop\n"); for (int rowIdx = 0; rowIdx < img->height; rowIdx++) { char* rowStart = img->imageData + rowIdx*img->widthStep; @@ -3577,9 +3573,7 @@ bool CvVideoWriter_MSMF::writeFrame(const IplImage* img) } // Send frame to the sink writer. 
- printf("Before private WriteFrame call\n"); HRESULT hr = WriteFrame(target, rtStart, rtDuration); - printf("After private WriteFrame call\n"); if (FAILED(hr)) { printf("Private WriteFrame failed\n"); @@ -3588,8 +3582,6 @@ bool CvVideoWriter_MSMF::writeFrame(const IplImage* img) } rtStart += rtDuration; - printf("End of writing IplImage\n"); - delete[] target; return true; @@ -3681,7 +3673,8 @@ HRESULT CvVideoWriter_MSMF::InitializeSinkWriter(const char* filename) { hr = MFSetAttributeRatio(mediaTypeIn.Get(), MF_MT_PIXEL_ASPECT_RATIO, 1, 1); } - if (SUCCEEDED(hr)) + + if (SUCCEEDED(hr)) { hr = sinkWriter->SetInputMediaType(streamIndex, mediaTypeIn.Get(), NULL); } @@ -3697,7 +3690,6 @@ HRESULT CvVideoWriter_MSMF::InitializeSinkWriter(const char* filename) HRESULT CvVideoWriter_MSMF::WriteFrame(DWORD *videoFrameBuffer, const LONGLONG& Start, const LONGLONG& Duration) { - printf("Private WriteFrame(%p, %llu, %llu)\n", videoFrameBuffer, Start, Duration); ComPtr sample; ComPtr buffer; @@ -3712,13 +3704,11 @@ HRESULT CvVideoWriter_MSMF::WriteFrame(DWORD *videoFrameBuffer, const LONGLONG& // Lock the buffer and copy the video frame to the buffer. if (SUCCEEDED(hr)) { - printf("MFCreateMemoryBuffer successfull\n"); hr = buffer->Lock(&pData, NULL, NULL); } if (SUCCEEDED(hr)) { - printf("Before MFCopyImage(%p, %d, %p, %d, %d %d)\n", pData, cbWidth, videoFrameBuffer, cbWidth, cbWidth, videoHeight); hr = MFCopyImage( pData, // Destination buffer. cbWidth, // Destination stride. @@ -3727,21 +3717,16 @@ HRESULT CvVideoWriter_MSMF::WriteFrame(DWORD *videoFrameBuffer, const LONGLONG& cbWidth, // Image width in bytes. videoHeight // Image height in pixels. ); - printf("After MFCopyImage()\n"); } - printf("Before buffer.Get()\n"); if (buffer) { - printf("Before buffer->Unlock\n"); buffer->Unlock(); - printf("After buffer->Unlock\n"); } // Set the data length of the buffer. if (SUCCEEDED(hr)) { - printf("MFCopyImage successfull\n"); hr = buffer->SetCurrentLength(cbBuffer); } @@ -3758,24 +3743,19 @@ HRESULT CvVideoWriter_MSMF::WriteFrame(DWORD *videoFrameBuffer, const LONGLONG& // Set the time stamp and the duration. if (SUCCEEDED(hr)) { - printf("Sample time: %d\n", Start); hr = sample->SetSampleTime(Start); } if (SUCCEEDED(hr)) { - printf("Duration: %d\n", Duration); hr = sample->SetSampleDuration(Duration); } // Send the sample to the Sink Writer. 
if (SUCCEEDED(hr)) { - printf("Setting writer params successfull\n"); hr = sinkWriter->WriteSample(streamIndex, sample.Get()); } - printf("Private WriteFrame(%d, %p) end with status %u\n", streamIndex, sample.Get(), hr); - return hr; } diff --git a/modules/highgui/src/cap_vfw.cpp b/modules/highgui/src/cap_vfw.cpp index d419a4891..d845953f8 100644 --- a/modules/highgui/src/cap_vfw.cpp +++ b/modules/highgui/src/cap_vfw.cpp @@ -613,8 +613,10 @@ bool CvVideoWriter_VFW::open( const char* filename, int _fourcc, double _fps, Cv close(); return false; } + return true; } - return true; + else + return false; } diff --git a/modules/highgui/test/test_video_io.cpp b/modules/highgui/test/test_video_io.cpp index 34ec0bdd8..f46235b3e 100644 --- a/modules/highgui/test/test_video_io.cpp +++ b/modules/highgui/test/test_video_io.cpp @@ -57,27 +57,27 @@ string fourccToString(int fourcc) #ifdef HAVE_MSMF const VideoFormat g_specific_fmt_list[] = { -/* VideoFormat("avi", 'dv25'), - VideoFormat("avi", 'dv50'), - VideoFormat("avi", 'dvc '), - VideoFormat("avi", 'dvh1'), - VideoFormat("avi", 'dvhd'), - VideoFormat("avi", 'dvsd'), - VideoFormat("avi", 'dvsl'), - VideoFormat("avi", 'M4S2'), */ - VideoFormat("wmv", 'WMV3'), - // VideoFormat("avi", 'H264'), - // VideoFormat("avi", 'MJPG'), - // VideoFormat("avi", 'MP43'), - // VideoFormat("avi", 'MP4S'), - // VideoFormat("avi", 'MP4V'), -/* VideoFormat("avi", 'MPG1'), - VideoFormat("avi", 'MSS1'), - VideoFormat("avi", 'MSS2'), - VideoFormat("avi", 'WMV1'), - VideoFormat("avi", 'WMV2'), - VideoFormat("avi", 'WMV3'), - VideoFormat("avi", 'WVC1'), */ + /*VideoFormat("wmv", CV_FOURCC_MACRO('d', 'v', '2', '5')), + VideoFormat("wmv", CV_FOURCC_MACRO('d', 'v', '5', '0')), + VideoFormat("wmv", CV_FOURCC_MACRO('d', 'v', 'c', ' ')), + VideoFormat("wmv", CV_FOURCC_MACRO('d', 'v', 'h', '1')), + VideoFormat("wmv", CV_FOURCC_MACRO('d', 'v', 'h', 'd')), + VideoFormat("wmv", CV_FOURCC_MACRO('d', 'v', 's', 'd')), + VideoFormat("wmv", CV_FOURCC_MACRO('d', 'v', 's', 'l')), + VideoFormat("wmv", CV_FOURCC_MACRO('H', '2', '6', '3')), + VideoFormat("wmv", CV_FOURCC_MACRO('M', '4', 'S', '2')), + VideoFormat("avi", CV_FOURCC_MACRO('M', 'J', 'P', 'G')), + VideoFormat("mp4", CV_FOURCC_MACRO('M', 'P', '4', 'S')), + VideoFormat("mp4", CV_FOURCC_MACRO('M', 'P', '4', 'V')), + VideoFormat("wmv", CV_FOURCC_MACRO('M', 'P', '4', '3')), + VideoFormat("wmv", CV_FOURCC_MACRO('M', 'P', 'G', '1')), + VideoFormat("wmv", CV_FOURCC_MACRO('M', 'S', 'S', '1')), + VideoFormat("wmv", CV_FOURCC_MACRO('M', 'S', 'S', '2')),*/ + //VideoFormat("avi", CV_FOURCC_MACRO('H', '2', '6', '4')), + VideoFormat("wmv", CV_FOURCC_MACRO('W', 'M', 'V', '1')), + VideoFormat("wmv", CV_FOURCC_MACRO('W', 'M', 'V', '2')), + VideoFormat("wmv", CV_FOURCC_MACRO('W', 'M', 'V', '3')), + //VideoFormat("wmv", CV_FOURCC_MACRO('W', 'V', 'C', '1')), VideoFormat() }; #else @@ -269,19 +269,19 @@ void CV_HighGuiTest::VideoTest(const string& dir, const cvtest::VideoFormat& fmt for(;;) { - IplImage * img = cvQueryFrame( cap ); + IplImage* img = cvQueryFrame( cap ); if (!img) break; frames.push_back(Mat(img).clone()); - if (writer == 0) + if (writer == NULL) { writer = cvCreateVideoWriter(tmp_name.c_str(), fmt.fourcc, 24, cvGetSize(img)); - if (writer == 0) + if (writer == NULL) { - ts->printf(ts->LOG, "can't create writer (with fourcc : %d)\n", + ts->printf(ts->LOG, "can't create writer (with fourcc : %s)\n", cvtest::fourccToString(fmt.fourcc).c_str()); cvReleaseCapture( &cap ); ts->set_failed_test_info(ts->FAIL_MISMATCH); @@ -317,18 +317,22 @@ void 
CV_HighGuiTest::VideoTest(const string& dir, const cvtest::VideoFormat& fmt double psnr = PSNR(img1, img); if (psnr < thresDbell) { - printf("Too low psnr = %gdb\n", psnr); + ts->printf(ts->LOG, "Too low frame %d psnr = %gdb\n", i, psnr); + ts->set_failed_test_info(ts->FAIL_MISMATCH); + //imwrite("original.png", img); //imwrite("after_test.png", img1); //Mat diff; //absdiff(img, img1, diff); //imwrite("diff.png", diff); - ts->set_failed_test_info(ts->FAIL_MISMATCH); + break; } } + printf("Before saved release for %s\n", tmp_name.c_str()); cvReleaseCapture( &saved ); + printf("After release\n"); ts->printf(ts->LOG, "end test function : ImagesVideo \n"); } From 08a0e1c91b121cea67d08000c5339a6d7960e43d Mon Sep 17 00:00:00 2001 From: Alexander Smorkalov Date: Wed, 22 May 2013 07:26:43 -0700 Subject: [PATCH 27/75] TBB support for WinRT fixed. Development release of TBB with WinRT support added; TBB.dll is placed in bin folder now. --- 3rdparty/tbb/CMakeLists.txt | 31 ++++++++++++++++++++----------- 1 file changed, 20 insertions(+), 11 deletions(-) diff --git a/3rdparty/tbb/CMakeLists.txt b/3rdparty/tbb/CMakeLists.txt index 9dcb63b7f..942404133 100644 --- a/3rdparty/tbb/CMakeLists.txt +++ b/3rdparty/tbb/CMakeLists.txt @@ -1,12 +1,19 @@ #Cross compile TBB from source project(tbb) -# 4.1 update 3 dev - works fine -set(tbb_ver "tbb41_20130401oss") -set(tbb_url "http://threadingbuildingblocks.org/sites/default/files/software_releases/source/tbb41_20130401oss_src.tgz") -set(tbb_md5 "f2f591a0d2ca8f801e221ce7d9ea84bb") +# Development release +set(tbb_ver "tbb41_20130516oss") +set(tbb_url "http://threadingbuildingblocks.org/sites/default/files/software_releases/source/tbb41_20130516oss_src.tgz") +set(tbb_md5 "08c14d1c196bc9595d8c52bedc239937") set(tbb_version_file "version_string.ver") -ocv_warnings_disable(CMAKE_CXX_FLAGS -Wshadow) +ocv_warnings_disable(CMAKE_CXX_FLAGS -Wshadow -Wunused-parameter) + +# 4.1 update 3 dev - works fine +#set(tbb_ver "tbb41_20130401oss") +#set(tbb_url "http://threadingbuildingblocks.org/sites/default/files/software_releases/source/tbb41_20130401oss_src.tgz") +#set(tbb_md5 "f2f591a0d2ca8f801e221ce7d9ea84bb") +#set(tbb_version_file "version_string.ver") +#ocv_warnings_disable(CMAKE_CXX_FLAGS -Wshadow) # 4.1 update 2 - works fine #set(tbb_ver "tbb41_20130116oss") @@ -115,7 +122,6 @@ if(NOT EXISTS "${tbb_src_dir}") if(NOT tbb_untar_RESULT EQUAL 0 OR NOT EXISTS "${tbb_src_dir}") message(FATAL_ERROR "Failed to unpack TBB sources (${tbb_untar_RESULT} ${tbb_src_dir})") - endif() endif() @@ -133,12 +139,16 @@ list(APPEND lib_srcs "${tbb_src_dir}/src/rml/client/rml_tbb.cpp") if (WIN32) add_definitions(-D__TBB_DYNAMIC_LOAD_ENABLED=0 /D__TBB_BUILD=1 + /DTBB_NO_LEGACY=1 /D_UNICODE /DUNICODE /DWINAPI_FAMILY=WINAPI_FAMILY_APP /DDO_ITT_NOTIFY=0 /DUSE_WINTHREAD + /D_WIN32_WINNT=0x0602 + /D__TBB_WIN32_USE_CL_BUILTINS ) # defines were copied from windows.cl.inc + set(CMAKE_LINKER_FLAGS "${CMAKE_LINKER_FLAGS} /APPCONTAINER") else() add_definitions(-D__TBB_DYNAMIC_LOAD_ENABLED=0 #required @@ -183,10 +193,10 @@ set(TBB_SOURCE_FILES ${TBB_SOURCE_FILES} "${CMAKE_CURRENT_SOURCE_DIR}/${tbb_vers add_library(tbb ${TBB_SOURCE_FILES}) -if (WIN32) +if (ARM AND WIN32) add_custom_command(TARGET tbb PRE_BUILD - COMMAND ${CMAKE_C_COMPILER} /nologo /TC /EP ${tbb_src_dir}\\src\\tbb\\win32-tbb-export.def /DTBB_NO_LEGACY /DUSE_WINTHREAD /D_CRT_SECURE_NO_DEPRECATE /D_WIN32_WINNT=0x0400 /D__TBB_BUILD=1 /I${tbb_src_dir}\\src /I${tbb_src_dir}\\include > "${tbb_src_dir}\\src\\tbb\\tbb.def" + COMMAND 
${CMAKE_C_COMPILER} /nologo /TC /EP ${tbb_src_dir}\\src\\tbb\\win32-tbb-export.def /DTBB_NO_LEGACY=1 /D_CRT_SECURE_NO_DEPRECATE /D__TBB_BUILD=1 /D_M_ARM=1 /I${tbb_src_dir}\\src /I${tbb_src_dir}\\include > "${tbb_src_dir}\\src\\tbb\\tbb.def" WORKING_DIRECTORY ${tbb_src_dir}\\src\\tbb COMMENT "Generating tbb.def file" VERBATIM ) @@ -194,9 +204,7 @@ endif() if (WIN32) set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} /DEF:${tbb_src_dir}/src/tbb/tbb.def /DLL /MAP /fixed:no /INCREMENTAL:NO") -endif() - -if (NOT WIN32) +else() target_link_libraries(tbb c m dl) endif() @@ -207,6 +215,7 @@ set_target_properties(tbb PROPERTIES OUTPUT_NAME tbb DEBUG_POSTFIX "${OPENCV_DEBUG_POSTFIX}" ARCHIVE_OUTPUT_DIRECTORY ${3P_LIBRARY_OUTPUT_PATH} + RUNTIME_OUTPUT_DIRECTORY ${EXECUTABLE_OUTPUT_PATH} ) if(ENABLE_SOLUTION_FOLDERS) From 2bc1d3709c40dfa326b1f2b6a0725c1ffa42f506 Mon Sep 17 00:00:00 2001 From: Alexander Smorkalov Date: Wed, 22 May 2013 09:50:54 -0700 Subject: [PATCH 28/75] GetProperty method for MSMF VideoCapture implemented. --- modules/highgui/src/cap_msmf.cpp | 103 ++++++++++++++----------------- 1 file changed, 48 insertions(+), 55 deletions(-) diff --git a/modules/highgui/src/cap_msmf.cpp b/modules/highgui/src/cap_msmf.cpp index 7af85382b..ae82b2c67 100644 --- a/modules/highgui/src/cap_msmf.cpp +++ b/modules/highgui/src/cap_msmf.cpp @@ -119,8 +119,8 @@ struct MediaType wchar_t *pMF_MT_AM_FORMAT_TYPEName; unsigned int MF_MT_FIXED_SIZE_SAMPLES; unsigned int MF_MT_VIDEO_NOMINAL_RANGE; - unsigned int MF_MT_FRAME_RATE; - unsigned int MF_MT_FRAME_RATE_low; + unsigned int MF_MT_FRAME_RATE_NUMERATOR; + unsigned int MF_MT_FRAME_RATE_DENOMINATOR; unsigned int MF_MT_PIXEL_ASPECT_RATIO; unsigned int MF_MT_PIXEL_ASPECT_RATIO_low; unsigned int MF_MT_ALL_SAMPLES_INDEPENDENT; @@ -625,14 +625,17 @@ done: } return hr; } + void LogUINT32AsUINT64New(const PROPVARIANT& var, UINT32 &uHigh, UINT32 &uLow) { Unpack2UINT32AsUINT64(var.uhVal.QuadPart, &uHigh, &uLow); } + float OffsetToFloatNew(const MFOffset& offset) { return offset.value + (static_cast(offset.fract) / 65536.0f); } + HRESULT LogVideoAreaNew(const PROPVARIANT& var) { if (var.caub.cElems < sizeof(MFVideoArea)) @@ -641,6 +644,7 @@ HRESULT LogVideoAreaNew(const PROPVARIANT& var) } return S_OK; } + HRESULT SpecialCaseAttributeValueNew(GUID guid, const PROPVARIANT& var, MediaType &out) { if (guid == MF_MT_DEFAULT_STRIDE) @@ -660,8 +664,8 @@ HRESULT SpecialCaseAttributeValueNew(GUID guid, const PROPVARIANT& var, MediaTyp { UINT32 uHigh = 0, uLow = 0; LogUINT32AsUINT64New(var, uHigh, uLow); - out.MF_MT_FRAME_RATE = uHigh; - out.MF_MT_FRAME_RATE_low = uLow; + out.MF_MT_FRAME_RATE_NUMERATOR = uHigh; + out.MF_MT_FRAME_RATE_DENOMINATOR = uLow; } else if (guid == MF_MT_FRAME_RATE_RANGE_MAX) @@ -693,9 +697,11 @@ HRESULT SpecialCaseAttributeValueNew(GUID guid, const PROPVARIANT& var, MediaTyp } return S_OK; } + #ifndef IF_EQUAL_RETURN #define IF_EQUAL_RETURN(param, val) if(val == param) return L#val #endif + LPCWSTR GetGUIDNameConstNew(const GUID& guid) { IF_EQUAL_RETURN(guid, MF_MT_MAJOR_TYPE); @@ -875,6 +881,7 @@ MediaType FormatReader::Read(IMFMediaType *pType) } return out; } + FormatReader::~FormatReader(void) { } @@ -1880,6 +1887,7 @@ int videoDevice::findType(unsigned int size, unsigned int frameRate) return 0; return VN[0]; } + void videoDevice::buildLibraryofTypes() { unsigned int size; @@ -1889,7 +1897,7 @@ void videoDevice::buildLibraryofTypes() for(; i != vd_CurrentFormats.end(); i++) { size = (*i).MF_MT_FRAME_SIZE; - framerate = 
(*i).MF_MT_FRAME_RATE; + framerate = (*i).MF_MT_FRAME_RATE_NUMERATOR; FrameRateMap FRM = vd_CaptureFormats[size]; SUBTYPEMap STM = FRM[framerate]; String subType((*i).pMF_MT_SUBTYPEName); @@ -2187,8 +2195,8 @@ void MediaType::Clear() MF_MT_VIDEO_CHROMA_SITING = 0; MF_MT_FIXED_SIZE_SAMPLES = 0; MF_MT_VIDEO_NOMINAL_RANGE = 0; - MF_MT_FRAME_RATE = 0; - MF_MT_FRAME_RATE_low = 0; + MF_MT_FRAME_RATE_NUMERATOR = 0; + MF_MT_FRAME_RATE_DENOMINATOR = 0; MF_MT_PIXEL_ASPECT_RATIO = 0; MF_MT_PIXEL_ASPECT_RATIO_low = 0; MF_MT_ALL_SAMPLES_INDEPENDENT = 0; @@ -3033,6 +3041,7 @@ protected: bool isOpened; HRESULT enumerateCaptureFormats(IMFMediaSource *pSource); + HRESULT getSourceDuration(IMFMediaSource *pSource, MFTIME *pDuration); }; CvCaptureFile_MSMF::CvCaptureFile_MSMF(): @@ -3094,7 +3103,7 @@ bool CvCaptureFile_MSMF::open(const char* filename) if (SUCCEEDED(hr)) { - hr = ImageGrabberThread::CreateInstance(&grabberThread, videoFileSource, -2, true); + hr = ImageGrabberThread::CreateInstance(&grabberThread, videoFileSource, (unsigned int)-2, true); } if (SUCCEEDED(hr)) @@ -3131,7 +3140,8 @@ bool CvCaptureFile_MSMF::setProperty(int property_id, double value) { // image capture properties bool handled = false; - int width, height, fourcc; + unsigned int width, height; + int fourcc; switch( property_id ) { case CV_CAP_PROP_FRAME_WIDTH: @@ -3239,57 +3249,26 @@ double CvCaptureFile_MSMF::getProperty(int property_id) case CV_CAP_PROP_FRAME_HEIGHT: return captureFormats[captureFormatIndex].height; case CV_CAP_PROP_FRAME_COUNT: - return 30; + { + MFTIME duration; + getSourceDuration(this->videoFileSource, &duration); + double fps = ((double)captureFormats[captureFormatIndex].MF_MT_FRAME_RATE_NUMERATOR) / + ((double)captureFormats[captureFormatIndex].MF_MT_FRAME_RATE_DENOMINATOR); + return (double)floor(((double)duration/1e7)*fps+0.5); + } case CV_CAP_PROP_FOURCC: - // FIXME: implement method in VideoInput back end - //return VI.getFourcc(index); - ; + return captureFormats[captureFormatIndex].MF_MT_SUBTYPE.Data1; case CV_CAP_PROP_FPS: - // FIXME: implement method in VideoInput back end - //return VI.getFPS(index); - ; + return ((double)captureFormats[captureFormatIndex].MF_MT_FRAME_RATE_NUMERATOR) / + ((double)captureFormats[captureFormatIndex].MF_MT_FRAME_RATE_DENOMINATOR); } - // video filter properties - switch( property_id ) - { - case CV_CAP_PROP_BRIGHTNESS: - case CV_CAP_PROP_CONTRAST: - case CV_CAP_PROP_HUE: - case CV_CAP_PROP_SATURATION: - case CV_CAP_PROP_SHARPNESS: - case CV_CAP_PROP_GAMMA: - case CV_CAP_PROP_MONOCROME: - case CV_CAP_PROP_WHITE_BALANCE_BLUE_U: - case CV_CAP_PROP_BACKLIGHT: - case CV_CAP_PROP_GAIN: - // FIXME: implement method in VideoInput back end - // if ( VI.getVideoSettingFilter(index, VI.getVideoPropertyFromCV(property_id), min_value, - // max_value, stepping_delta, current_value, flags,defaultValue) ) - // return (double)current_value; - return 0.; - } - // camera properties - switch( property_id ) - { - case CV_CAP_PROP_PAN: - case CV_CAP_PROP_TILT: - case CV_CAP_PROP_ROLL: - case CV_CAP_PROP_ZOOM: - case CV_CAP_PROP_EXPOSURE: - case CV_CAP_PROP_IRIS: - case CV_CAP_PROP_FOCUS: - // FIXME: implement method in VideoInput back end - // if (VI.getVideoSettingCamera(index,VI.getCameraPropertyFromCV(property_id),min_value, - // max_value,stepping_delta,current_value,flags,defaultValue) ) return (double)current_value; - return 0.; - } - // unknown parameter or value not available + return -1; } bool CvCaptureFile_MSMF::grabFrame() { - DWORD waitResult; + DWORD waitResult = -1; if 
(isOpened) { SetEvent(grabberThread->getImageGrabber()->ig_hFrameGrabbed); @@ -3305,7 +3284,7 @@ IplImage* CvCaptureFile_MSMF::retrieveFrame(int) unsigned int width = captureFormats[captureFormatIndex].width; unsigned int height = captureFormats[captureFormatIndex].height; unsigned int bytes = 3; - if( !frame || width != frame->width || height != frame->height ) + if( !frame || (int)width != frame->width || (int)height != frame->height ) { if (frame) cvReleaseImage( &frame ); @@ -3370,6 +3349,20 @@ done: return hr; } +HRESULT CvCaptureFile_MSMF::getSourceDuration(IMFMediaSource *pSource, MFTIME *pDuration) +{ + *pDuration = 0; + + IMFPresentationDescriptor *pPD = NULL; + + HRESULT hr = pSource->CreatePresentationDescriptor(&pPD); + if (SUCCEEDED(hr)) + { + hr = pPD->GetUINT64(MF_PD_DURATION, (UINT64*)pDuration); + pPD->Release(); + } + return hr; +} CvCapture* cvCreateCameraCapture_MSMF( int index ) { @@ -3508,12 +3501,12 @@ const GUID CvVideoWriter_MSMF::FourCC2GUID(int fourcc) } bool CvVideoWriter_MSMF::open( const char* filename, int fourcc, - double _fps, CvSize frameSize, bool isColor ) + double _fps, CvSize frameSize, bool /*isColor*/ ) { videoWidth = frameSize.width; videoHeight = frameSize.height; fps = _fps; - bitRate = fps*videoWidth*videoHeight; // 1-bit per pixel + bitRate = (UINT32)fps*videoWidth*videoHeight; // 1-bit per pixel encodingFormat = FourCC2GUID(fourcc); inputFormat = MFVideoFormat_RGB32; From 34c659875293058f3db7c082d6c2917dc8ea190e Mon Sep 17 00:00:00 2001 From: Alexander Smorkalov Date: Fri, 24 May 2013 06:34:42 -0700 Subject: [PATCH 29/75] Perf test failure fixes for Media Foundation. --- modules/highgui/perf/perf_input.cpp | 10 ++++++++++ modules/highgui/perf/perf_output.cpp | 12 +++++++++--- modules/highgui/perf/perf_precomp.hpp | 2 ++ 3 files changed, 21 insertions(+), 3 deletions(-) diff --git a/modules/highgui/perf/perf_input.cpp b/modules/highgui/perf/perf_input.cpp index 0c1e8e0a7..414c85365 100644 --- a/modules/highgui/perf/perf_input.cpp +++ b/modules/highgui/perf/perf_input.cpp @@ -11,11 +11,21 @@ using std::tr1::get; typedef perf::TestBaseWithParam VideoCapture_Reading; +#if defined(HAVE_MSMF) +// MPEG2 is not supported by Media Foundation yet +// http://social.msdn.microsoft.com/Forums/en-US/mediafoundationdevelopment/thread/39a36231-8c01-40af-9af5-3c105d684429 +PERF_TEST_P(VideoCapture_Reading, ReadFile, testing::Values( "highgui/video/big_buck_bunny.avi", + "highgui/video/big_buck_bunny.mov", + "highgui/video/big_buck_bunny.mp4", + "highgui/video/big_buck_bunny.wmv" ) ) + +#else PERF_TEST_P(VideoCapture_Reading, ReadFile, testing::Values( "highgui/video/big_buck_bunny.avi", "highgui/video/big_buck_bunny.mov", "highgui/video/big_buck_bunny.mp4", "highgui/video/big_buck_bunny.mpg", "highgui/video/big_buck_bunny.wmv" ) ) +#endif { string filename = getDataPath(GetParam()); diff --git a/modules/highgui/perf/perf_output.cpp b/modules/highgui/perf/perf_output.cpp index 6428bb4f0..2adfe8965 100644 --- a/modules/highgui/perf/perf_output.cpp +++ b/modules/highgui/perf/perf_output.cpp @@ -22,10 +22,16 @@ PERF_TEST_P(VideoWriter_Writing, WriteFrame, { string filename = getDataPath(get<0>(GetParam())); bool isColor = get<1>(GetParam()); + Mat image = imread(filename, 1); +#if defined(HAVE_MSMF) && !defined(HAVE_VFW) && !defined(HAVE_FFMPEG) // VFW has greater priority + VideoWriter writer(cv::tempfile(".wmv"), CV_FOURCC('W', 'M', 'V', '3'), + 25, cv::Size(image.cols, image.rows), isColor); +#else + VideoWriter writer(cv::tempfile(".avi"), CV_FOURCC('X', 'V', 
'I', 'D'), + 25, cv::Size(image.cols, image.rows), isColor); +#endif - VideoWriter writer(cv::tempfile(".avi"), CV_FOURCC('X', 'V', 'I', 'D'), 25, cv::Size(640, 480), isColor); - - TEST_CYCLE() { Mat image = imread(filename, 1); writer << image; } + TEST_CYCLE() { image = imread(filename, 1); writer << image; } bool dummy = writer.isOpened(); SANITY_CHECK(dummy); diff --git a/modules/highgui/perf/perf_precomp.hpp b/modules/highgui/perf/perf_precomp.hpp index 529187d3b..d6b28b6d2 100644 --- a/modules/highgui/perf/perf_precomp.hpp +++ b/modules/highgui/perf/perf_precomp.hpp @@ -21,6 +21,7 @@ defined(HAVE_QUICKTIME) || \ defined(HAVE_AVFOUNDATION) || \ defined(HAVE_FFMPEG) || \ + defined(HAVE_MSMF) || \ defined(HAVE_VFW) /*defined(HAVE_OPENNI) too specialized */ \ @@ -34,6 +35,7 @@ defined(HAVE_QUICKTIME) || \ defined(HAVE_AVFOUNDATION) || \ defined(HAVE_FFMPEG) || \ + defined(HAVE_MSMF) || \ defined(HAVE_VFW) # define BUILD_WITH_VIDEO_OUTPUT_SUPPORT 1 #else From ee591efb9fbd6b1bc274fd56a39cca734900e7cb Mon Sep 17 00:00:00 2001 From: Alexander Smorkalov Date: Wed, 29 May 2013 01:14:01 -0700 Subject: [PATCH 30/75] Build fix for Windows RT. --- modules/contrib/src/inputoutput.cpp | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/modules/contrib/src/inputoutput.cpp b/modules/contrib/src/inputoutput.cpp index d10d884c8..a711f242a 100644 --- a/modules/contrib/src/inputoutput.cpp +++ b/modules/contrib/src/inputoutput.cpp @@ -1,7 +1,7 @@ #include "opencv2/contrib/contrib.hpp" -#ifdef WIN32 +#if defined(WIN32) || defined(_WIN32) #include #include #else From 49903367608af0a89adb5ff83f61e2d7b9b7c514 Mon Sep 17 00:00:00 2001 From: Alexander Smorkalov Date: Thu, 6 Jun 2013 01:34:57 -0700 Subject: [PATCH 31/75] Base camera access sample for Windows RT added. Microsoft Media Foundation Camera Sample for Windows RT added. 
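Background for the sample added below: it is built around the WinRT Windows::Media::Capture::MediaCapture API. AdvancedCapture.xaml.cpp creates a capture manager, drives it with concurrency::create_task continuations, routes preview into XAML elements such as previewElement2, and can insert the bundled Grayscale media-extension transform. A minimal, hypothetical C++/CX sketch of the basic preview flow (not code from the sample; StartCameraPreview and previewElement are invented names, and a real app also needs webcam/microphone capabilities declared in Package.appxmanifest plus error handling):

    // Illustrative only; assumes a XAML page containing <CaptureElement x:Name="previewElement"/>.
    #include <ppltasks.h>   // concurrency::create_task

    using namespace Windows::Media::Capture;
    using namespace concurrency;

    void StartCameraPreview(Windows::UI::Xaml::Controls::CaptureElement^ previewElement)
    {
        auto capture = ref new MediaCapture();
        create_task(capture->InitializeAsync()).then([capture, previewElement]()
        {
            previewElement->Source = capture;      // route the preview stream into the XAML control
            return capture->StartPreviewAsync();   // start streaming frames from the default camera
        });
    }

Teardown follows the same pattern in the sample: the SoundLevelChanged handler wraps StopRecordAsync/StopPreviewAsync in create_task continuations, and ScenarioInit resets previewElement2->Source to nullptr.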
--- .../C++/AdvancedCapture.xaml | 81 + .../C++/AdvancedCapture.xaml.cpp | 1034 +++++ .../C++/AdvancedCapture.xaml.h | 103 + samples/winrt/ImageManipulations/C++/App.xaml | 30 + .../winrt/ImageManipulations/C++/App.xaml.cpp | 114 + .../winrt/ImageManipulations/C++/App.xaml.h | 35 + .../ImageManipulations/C++/AudioCapture.xaml | 62 + .../C++/AudioCapture.xaml.cpp | 366 ++ .../C++/AudioCapture.xaml.h | 70 + .../ImageManipulations/C++/BasicCapture.xaml | 87 + .../C++/BasicCapture.xaml.cpp | 535 +++ .../C++/BasicCapture.xaml.h | 88 + .../ImageManipulations/C++/Constants.cpp | 24 + .../winrt/ImageManipulations/C++/Constants.h | 45 + .../ImageManipulations/C++/MainPage.xaml | 166 + .../ImageManipulations/C++/MainPage.xaml.cpp | 315 ++ .../ImageManipulations/C++/MainPage.xaml.h | 105 + .../ImageManipulations/C++/MediaCapture.sln | 52 + .../C++/MediaCapture.vcxproj | 200 + .../C++/MediaCapture.vcxproj.filters | 88 + .../C++/MediaExtensions/Common/AsyncCB.h | 81 + .../C++/MediaExtensions/Common/BufferLock.h | 102 + .../C++/MediaExtensions/Common/CritSec.h | 62 + .../C++/MediaExtensions/Common/LinkList.h | 516 +++ .../C++/MediaExtensions/Common/OpQueue.h | 222 + .../MediaExtensions/Grayscale/Grayscale.cpp | 1783 ++++++++ .../MediaExtensions/Grayscale/Grayscale.def | 4 + .../C++/MediaExtensions/Grayscale/Grayscale.h | 266 ++ .../Grayscale/Grayscale.vcxproj | 313 ++ .../Grayscale/Grayscale.vcxproj.filters | 22 + .../Grayscale/GrayscaleTransform.idl | 11 + .../C++/MediaExtensions/Grayscale/dllmain.cpp | 58 + .../C++/Package.appxmanifest | 39 + .../C++/assets/microsoft-sdk.png | Bin 0 -> 1583 bytes .../C++/assets/placeholder-sdk.png | Bin 0 -> 8991 bytes .../C++/assets/smallTile-sdk.png | Bin 0 -> 1248 bytes .../C++/assets/splash-sdk.png | Bin 0 -> 5068 bytes .../C++/assets/squareTile-sdk.png | Bin 0 -> 2482 bytes .../C++/assets/storeLogo-sdk.png | Bin 0 -> 1550 bytes .../C++/assets/tile-sdk.png | Bin 0 -> 2665 bytes .../C++/assets/windows-sdk.png | Bin 0 -> 2997 bytes .../C++/common/LayoutAwarePage.cpp | 452 ++ .../C++/common/LayoutAwarePage.h | 88 + .../C++/common/StandardStyles.xaml | 978 +++++ .../C++/common/suspensionmanager.cpp | 481 +++ .../C++/common/suspensionmanager.h | 50 + samples/winrt/ImageManipulations/C++/pch.cpp | 16 + samples/winrt/ImageManipulations/C++/pch.h | 23 + .../sample-utils/SampleTemplateStyles.xaml | 51 + .../winrt/ImageManipulations/description.html | 238 ++ ...a0-3e7e-46df-b80b-1692acc1c812Combined.css | 0 .../ImageManipulations/description/Brand.css | 3629 +++++++++++++++++ .../description/Combined.css | 0 .../description/Galleries.css | 418 ++ .../ImageManipulations/description/Layout.css | 147 + ...69f54-1c43-4037-b90b-5f775f1d945fBrand.css | 303 ++ .../description/iframedescription.css | 179 + .../ImageManipulations/description/offline.js | 52 + samples/winrt/ImageManipulations/license.rtf | 25 + 59 files changed, 14209 insertions(+) create mode 100644 samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml create mode 100644 samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml.cpp create mode 100644 samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml.h create mode 100644 samples/winrt/ImageManipulations/C++/App.xaml create mode 100644 samples/winrt/ImageManipulations/C++/App.xaml.cpp create mode 100644 samples/winrt/ImageManipulations/C++/App.xaml.h create mode 100644 samples/winrt/ImageManipulations/C++/AudioCapture.xaml create mode 100644 samples/winrt/ImageManipulations/C++/AudioCapture.xaml.cpp create mode 100644 
samples/winrt/ImageManipulations/C++/AudioCapture.xaml.h create mode 100644 samples/winrt/ImageManipulations/C++/BasicCapture.xaml create mode 100644 samples/winrt/ImageManipulations/C++/BasicCapture.xaml.cpp create mode 100644 samples/winrt/ImageManipulations/C++/BasicCapture.xaml.h create mode 100644 samples/winrt/ImageManipulations/C++/Constants.cpp create mode 100644 samples/winrt/ImageManipulations/C++/Constants.h create mode 100644 samples/winrt/ImageManipulations/C++/MainPage.xaml create mode 100644 samples/winrt/ImageManipulations/C++/MainPage.xaml.cpp create mode 100644 samples/winrt/ImageManipulations/C++/MainPage.xaml.h create mode 100644 samples/winrt/ImageManipulations/C++/MediaCapture.sln create mode 100644 samples/winrt/ImageManipulations/C++/MediaCapture.vcxproj create mode 100644 samples/winrt/ImageManipulations/C++/MediaCapture.vcxproj.filters create mode 100644 samples/winrt/ImageManipulations/C++/MediaExtensions/Common/AsyncCB.h create mode 100644 samples/winrt/ImageManipulations/C++/MediaExtensions/Common/BufferLock.h create mode 100644 samples/winrt/ImageManipulations/C++/MediaExtensions/Common/CritSec.h create mode 100644 samples/winrt/ImageManipulations/C++/MediaExtensions/Common/LinkList.h create mode 100644 samples/winrt/ImageManipulations/C++/MediaExtensions/Common/OpQueue.h create mode 100644 samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.cpp create mode 100644 samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.def create mode 100644 samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.h create mode 100644 samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.vcxproj create mode 100644 samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.vcxproj.filters create mode 100644 samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/GrayscaleTransform.idl create mode 100644 samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/dllmain.cpp create mode 100644 samples/winrt/ImageManipulations/C++/Package.appxmanifest create mode 100644 samples/winrt/ImageManipulations/C++/assets/microsoft-sdk.png create mode 100644 samples/winrt/ImageManipulations/C++/assets/placeholder-sdk.png create mode 100644 samples/winrt/ImageManipulations/C++/assets/smallTile-sdk.png create mode 100644 samples/winrt/ImageManipulations/C++/assets/splash-sdk.png create mode 100644 samples/winrt/ImageManipulations/C++/assets/squareTile-sdk.png create mode 100644 samples/winrt/ImageManipulations/C++/assets/storeLogo-sdk.png create mode 100644 samples/winrt/ImageManipulations/C++/assets/tile-sdk.png create mode 100644 samples/winrt/ImageManipulations/C++/assets/windows-sdk.png create mode 100644 samples/winrt/ImageManipulations/C++/common/LayoutAwarePage.cpp create mode 100644 samples/winrt/ImageManipulations/C++/common/LayoutAwarePage.h create mode 100644 samples/winrt/ImageManipulations/C++/common/StandardStyles.xaml create mode 100644 samples/winrt/ImageManipulations/C++/common/suspensionmanager.cpp create mode 100644 samples/winrt/ImageManipulations/C++/common/suspensionmanager.h create mode 100644 samples/winrt/ImageManipulations/C++/pch.cpp create mode 100644 samples/winrt/ImageManipulations/C++/pch.h create mode 100644 samples/winrt/ImageManipulations/C++/sample-utils/SampleTemplateStyles.xaml create mode 100644 samples/winrt/ImageManipulations/description.html create mode 100644 samples/winrt/ImageManipulations/description/4ee0dda0-3e7e-46df-b80b-1692acc1c812Combined.css 
create mode 100644 samples/winrt/ImageManipulations/description/Brand.css create mode 100644 samples/winrt/ImageManipulations/description/Combined.css create mode 100644 samples/winrt/ImageManipulations/description/Galleries.css create mode 100644 samples/winrt/ImageManipulations/description/Layout.css create mode 100644 samples/winrt/ImageManipulations/description/c2e69f54-1c43-4037-b90b-5f775f1d945fBrand.css create mode 100644 samples/winrt/ImageManipulations/description/iframedescription.css create mode 100644 samples/winrt/ImageManipulations/description/offline.js create mode 100644 samples/winrt/ImageManipulations/license.rtf diff --git a/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml b/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml new file mode 100644 index 000000000..4e6ebfd30 --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml @@ -0,0 +1,81 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml.cpp b/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml.cpp new file mode 100644 index 000000000..dc59acc2e --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml.cpp @@ -0,0 +1,1034 @@ +//********************************************************* +// +// Copyright (c) Microsoft. All rights reserved. +// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY +// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR +// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. +// +//********************************************************* + +// +// AdvancedCapture.xaml.cpp +// Implementation of the AdvancedCapture class +// + +#include "pch.h" +#include "AdvancedCapture.xaml.h" + +using namespace SDKSample::MediaCapture; + +using namespace Windows::UI::Xaml; +using namespace Windows::UI::Xaml::Navigation; +using namespace Windows::UI::Xaml::Data; +using namespace Windows::System; +using namespace Windows::Foundation; +using namespace Platform; +using namespace Windows::UI; +using namespace Windows::UI::Core; +using namespace Windows::UI::Xaml; +using namespace Windows::UI::Xaml::Controls; +using namespace Windows::UI::Xaml::Data; +using namespace Windows::UI::Xaml::Media; +using namespace Windows::Storage; +using namespace Windows::Media::MediaProperties; +using namespace Windows::Storage::Streams; +using namespace Windows::System; +using namespace Windows::UI::Xaml::Media::Imaging; +using namespace Windows::Devices::Enumeration; + +ref class ReencodeState sealed +{ +public: + ReencodeState() + { + } + + virtual ~ReencodeState() + { + if (InputStream != nullptr) + { + delete InputStream; + } + if (OutputStream != nullptr) + { + delete OutputStream; + } + } + +internal: + Windows::Storage::Streams::IRandomAccessStream ^InputStream; + Windows::Storage::Streams::IRandomAccessStream ^OutputStream; + Windows::Storage::StorageFile ^PhotoStorage; + Windows::Graphics::Imaging::BitmapDecoder ^Decoder; + Windows::Graphics::Imaging::BitmapEncoder ^Encoder; +}; + +AdvancedCapture::AdvancedCapture() +{ + InitializeComponent(); + ScenarioInit(); +} + +/// +/// Invoked when this page is about to be displayed in a Frame. +/// +/// Event data that describes how this page was reached. The Parameter +/// property is typically used to configure the page. +void AdvancedCapture::OnNavigatedTo(NavigationEventArgs^ e) +{ + // A pointer back to the main page. 
This is needed if you want to call methods in MainPage such + // as NotifyUser() + rootPage = MainPage::Current; + m_eventRegistrationToken = Windows::Media::MediaControl::SoundLevelChanged += ref new EventHandler(this, &AdvancedCapture::SoundLevelChanged); + + m_orientationChangedEventToken = Windows::Graphics::Display::DisplayProperties::OrientationChanged += ref new Windows::Graphics::Display::DisplayPropertiesEventHandler(this, &AdvancedCapture::DisplayProperties_OrientationChanged); +} + +void AdvancedCapture::OnNavigatedFrom(NavigationEventArgs^ e) +{ + Windows::Media::MediaControl::SoundLevelChanged -= m_eventRegistrationToken; + Windows::Graphics::Display::DisplayProperties::OrientationChanged -= m_orientationChangedEventToken; +} + +void AdvancedCapture::ScenarioInit() +{ + rootPage = MainPage::Current; + btnStartDevice2->IsEnabled = true; + btnStartPreview2->IsEnabled = false; + btnStartStopRecord2->IsEnabled = false; + m_bRecording = false; + m_bPreviewing = false; + m_bEffectAdded = false; + btnStartStopRecord2->Content = "StartRecord"; + btnTakePhoto2->IsEnabled = false; + previewElement2->Source = nullptr; + playbackElement2->Source = nullptr; + imageElement2->Source= nullptr; + ShowStatusMessage(""); + chkAddRemoveEffect->IsChecked = false; + chkAddRemoveEffect->IsEnabled = false; + previewCanvas2->Visibility = Windows::UI::Xaml::Visibility::Collapsed; + EnumerateWebcamsAsync(); + m_bSuspended = false; +} + +void AdvancedCapture::ScenarioReset() +{ + previewCanvas2->Visibility = Windows::UI::Xaml::Visibility::Collapsed; + ScenarioInit(); +} + +void AdvancedCapture::SoundLevelChanged(Object^ sender, Object^ e) +{ + create_task(Dispatcher->RunAsync(Windows::UI::Core::CoreDispatcherPriority::High, ref new Windows::UI::Core::DispatchedHandler([this]() + { + if(Windows::Media::MediaControl::SoundLevel != Windows::Media::SoundLevel::Muted) + { + ScenarioReset(); + } + else + { + if (m_bRecording) + { + ShowStatusMessage("Stopping Record on invisibility"); + + create_task(m_mediaCaptureMgr->StopRecordAsync()).then([this](task recordTask) + { + try + { + recordTask.get(); + m_bRecording = false; + } + catch (Exception ^e) + { + ShowExceptionMessage(e); + } + }); + } + if (m_bPreviewing) + { + ShowStatusMessage("Stopping Preview on invisibility"); + + create_task(m_mediaCaptureMgr->StopPreviewAsync()).then([this](task previewTask) + { + try + { + previewTask.get(); + m_bPreviewing = false; + + }catch (Exception ^e) + { + ShowExceptionMessage(e); + } + }); + } + } + }))); +} + +void AdvancedCapture::RecordLimitationExceeded(Windows::Media::Capture::MediaCapture ^currentCaptureObject) +{ + try + { + if (m_bRecording) + { + create_task(Dispatcher->RunAsync(Windows::UI::Core::CoreDispatcherPriority::High, ref new Windows::UI::Core::DispatchedHandler([this]() + { + try + { + ShowStatusMessage("Stopping Record on exceeding max record duration"); + EnableButton(false, "StartStopRecord"); + create_task(m_mediaCaptureMgr->StopRecordAsync()).then([this](task recordTask) + { + try + { + recordTask.get(); + m_bRecording = false; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + ShowStatusMessage("Stopped record on exceeding max record duration:" + m_recordStorageFile->Path); + } + catch (Exception ^e) + { + ShowExceptionMessage(e); + m_bRecording = false; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + } + }); + } + catch (Exception ^e) + { + m_bRecording = false; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + 
ShowExceptionMessage(e); + } + }))); + } + } + catch (Exception ^e) + { + m_bRecording = false; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + ShowExceptionMessage(e); + } +} + +void AdvancedCapture::Failed(Windows::Media::Capture::MediaCapture ^currentCaptureObject, Windows::Media::Capture::MediaCaptureFailedEventArgs^ currentFailure) +{ + String ^message = "Fatal error" + currentFailure->Message; + create_task(Dispatcher->RunAsync(Windows::UI::Core::CoreDispatcherPriority::High, + ref new Windows::UI::Core::DispatchedHandler([this, message]() + { + ShowStatusMessage(message); + }))); +} + +void AdvancedCapture::btnStartDevice_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e) +{ + try + { + EnableButton(false, "StartDevice"); + ShowStatusMessage("Starting device"); + auto mediaCapture = ref new Windows::Media::Capture::MediaCapture(); + m_mediaCaptureMgr = mediaCapture; + auto settings = ref new Windows::Media::Capture::MediaCaptureInitializationSettings(); + auto chosenDevInfo = m_devInfoCollection->GetAt(EnumedDeviceList2->SelectedIndex); + settings->VideoDeviceId = chosenDevInfo->Id; + if (chosenDevInfo->EnclosureLocation != nullptr && chosenDevInfo->EnclosureLocation->Panel == Windows::Devices::Enumeration::Panel::Back) + { + m_bRotateVideoOnOrientationChange = true; + m_bReversePreviewRotation = false; + } + else if (chosenDevInfo->EnclosureLocation != nullptr && chosenDevInfo->EnclosureLocation->Panel == Windows::Devices::Enumeration::Panel::Front) + { + m_bRotateVideoOnOrientationChange = true; + m_bReversePreviewRotation = true; + } + else + { + m_bRotateVideoOnOrientationChange = false; + } + + create_task(mediaCapture->InitializeAsync(settings)).then([this](task initTask) + { + try + { + initTask.get(); + + auto mediaCapture = m_mediaCaptureMgr.Get(); + + DisplayProperties_OrientationChanged(nullptr); + + EnableButton(true, "StartPreview"); + EnableButton(true, "StartStopRecord"); + EnableButton(true, "TakePhoto"); + ShowStatusMessage("Device initialized successful"); + chkAddRemoveEffect->IsEnabled = true; + mediaCapture->RecordLimitationExceeded += ref new Windows::Media::Capture::RecordLimitationExceededEventHandler(this, &AdvancedCapture::RecordLimitationExceeded); + mediaCapture->Failed += ref new Windows::Media::Capture::MediaCaptureFailedEventHandler(this, &AdvancedCapture::Failed); + } + catch (Exception ^ e) + { + ShowExceptionMessage(e); + } + }); + } + catch (Platform::Exception^ e) + { + ShowExceptionMessage(e); + } +} + +void AdvancedCapture::btnStartPreview_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e) +{ + m_bPreviewing = false; + try + { + ShowStatusMessage("Starting preview"); + EnableButton(false, "StartPreview"); + + auto mediaCapture = m_mediaCaptureMgr.Get(); + previewCanvas2->Visibility = Windows::UI::Xaml::Visibility::Visible; + previewElement2->Source = mediaCapture; + create_task(mediaCapture->StartPreviewAsync()).then([this](task previewTask) + { + try + { + previewTask.get(); + m_bPreviewing = true; + ShowStatusMessage("Start preview successful"); + } + catch (Exception ^e) + { + ShowExceptionMessage(e); + } + }); + } + catch (Platform::Exception^ e) + { + m_bPreviewing = false; + previewElement2->Source = nullptr; + EnableButton(true, "StartPreview"); + ShowExceptionMessage(e); + } +} + +void AdvancedCapture::btnTakePhoto_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e) +{ + try + { + ShowStatusMessage("Taking photo"); + EnableButton(false, "TakePhoto"); + 
auto currentRotation = GetCurrentPhotoRotation(); + + task(KnownFolders::PicturesLibrary->CreateFileAsync(TEMP_PHOTO_FILE_NAME, Windows::Storage::CreationCollisionOption::GenerateUniqueName)).then([this, currentRotation](task getFileTask) + { + try + { + auto tempPhotoStorageFile = getFileTask.get(); + ShowStatusMessage("Create photo file successful"); + ImageEncodingProperties^ imageProperties = ImageEncodingProperties::CreateJpeg(); + + create_task(m_mediaCaptureMgr->CapturePhotoToStorageFileAsync(imageProperties, tempPhotoStorageFile)).then([this,tempPhotoStorageFile,currentRotation](task photoTask) + { + try + { + photoTask.get(); + + ReencodePhotoAsync(tempPhotoStorageFile, currentRotation).then([this] (task reencodeImageTask) + { + try + { + auto photoStorageFile = reencodeImageTask.get(); + + EnableButton(true, "TakePhoto"); + ShowStatusMessage("Photo taken"); + + task(photoStorageFile->OpenAsync(FileAccessMode::Read)).then([this](task getStreamTask) + { + try + { + auto photoStream = getStreamTask.get(); + ShowStatusMessage("File open successful"); + auto bmpimg = ref new BitmapImage(); + + bmpimg->SetSource(photoStream); + imageElement2->Source = bmpimg; + } + catch (Exception^ e) + { + ShowExceptionMessage(e); + EnableButton(true, "TakePhoto"); + } + }); + } + catch (Platform::Exception ^ e) + { + ShowExceptionMessage(e); + EnableButton(true, "TakePhoto"); + } + }); + } + catch (Platform::Exception ^ e) + { + ShowExceptionMessage(e); + EnableButton(true, "TakePhoto"); + } + }); + } + catch (Exception^ e) + { + ShowExceptionMessage(e); + EnableButton(true, "TakePhoto"); + + } + }); + } + catch (Platform::Exception^ e) + { + ShowExceptionMessage(e); + EnableButton(true, "TakePhoto"); + } +} + +void AdvancedCapture::btnStartStopRecord_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e) +{ + try + { + String ^fileName; + EnableButton(false, "StartStopRecord"); + + if (!m_bRecording) + { + ShowStatusMessage("Starting Record"); + + fileName = VIDEO_FILE_NAME; + + PrepareForVideoRecording(); + + task(KnownFolders::VideosLibrary->CreateFileAsync(fileName, Windows::Storage::CreationCollisionOption::GenerateUniqueName)).then([this](task fileTask) + { + try + { + this->m_recordStorageFile = fileTask.get(); + ShowStatusMessage("Create record file successful"); + + MediaEncodingProfile^ recordProfile= nullptr; + recordProfile = MediaEncodingProfile::CreateMp4(Windows::Media::MediaProperties::VideoEncodingQuality::Auto); + + create_task(m_mediaCaptureMgr->StartRecordToStorageFileAsync(recordProfile, this->m_recordStorageFile)).then([this](task recordTask) + { + try + { + recordTask.get(); + m_bRecording = true; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + + ShowStatusMessage("Start Record successful"); + } + catch (Exception ^e) + { + m_bRecording = true; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + ShowExceptionMessage(e); + } + }); + } + catch (Exception ^e) + { + m_bRecording = false; + EnableButton(true, "StartStopRecord"); + ShowExceptionMessage(e); + } + }); + } + else + { + ShowStatusMessage("Stopping Record"); + + create_task(m_mediaCaptureMgr->StopRecordAsync()).then([this](task recordTask) + { + try + { + recordTask.get(); + m_bRecording = false; + EnableButton(true, "StartStopRecord"); + SwitchRecordButtonContent(); + + ShowStatusMessage("Stop record successful"); + if (!m_bSuspended) + { + task(this->m_recordStorageFile->OpenAsync(FileAccessMode::Read)).then([this](task streamTask) + { + try + { + auto 
stream = streamTask.get(); + ShowStatusMessage("Record file opened"); + ShowStatusMessage(this->m_recordStorageFile->Path); + playbackElement2->AutoPlay = true; + playbackElement2->SetSource(stream, this->m_recordStorageFile->FileType); + playbackElement2->Play(); + } + catch (Exception ^e) + { + ShowExceptionMessage(e); + m_bRecording = false; + EnableButton(true, "StartStopRecord"); + SwitchRecordButtonContent(); + } + }); + } + } + catch (Exception ^e) + { + m_bRecording = false; + EnableButton(true, "StartStopRecord"); + SwitchRecordButtonContent(); + ShowExceptionMessage(e); + } + }); + } + } + catch (Platform::Exception^ e) + { + EnableButton(true, "StartStopRecord"); + ShowExceptionMessage(e); + m_bRecording = false; + SwitchRecordButtonContent(); + } +} +void AdvancedCapture::lstEnumedDevices_SelectionChanged(Platform::Object^ sender, Windows::UI::Xaml::Controls::SelectionChangedEventArgs^ e) +{ + if ( m_bPreviewing ) + { + create_task(m_mediaCaptureMgr->StopPreviewAsync()).then([this](task previewTask) + { + try + { + previewTask.get(); + m_bPreviewing = false; + } + catch (Exception ^e) + { + ShowExceptionMessage(e); + } + }); + } + + btnStartDevice2->IsEnabled = true; + btnStartPreview2->IsEnabled = false; + btnStartStopRecord2->IsEnabled = false; + m_bRecording = false; + btnStartStopRecord2->Content = "StartRecord"; + btnTakePhoto2->IsEnabled = false; + previewElement2->Source = nullptr; + playbackElement2->Source = nullptr; + imageElement2->Source= nullptr; + chkAddRemoveEffect->IsEnabled = false; + chkAddRemoveEffect->IsChecked = false; + m_bEffectAdded = false; + m_bEffectAddedToRecord = false; + m_bEffectAddedToPhoto = false; + ShowStatusMessage(""); +} + +void AdvancedCapture::EnumerateWebcamsAsync() +{ + try + { + ShowStatusMessage("Enumerating Webcams..."); + m_devInfoCollection = nullptr; + + EnumedDeviceList2->Items->Clear(); + + task(DeviceInformation::FindAllAsync(DeviceClass::VideoCapture)).then([this](task findTask) + { + try + { + m_devInfoCollection = findTask.get(); + if (m_devInfoCollection == nullptr || m_devInfoCollection->Size == 0) + { + ShowStatusMessage("No WebCams found."); + } + else + { + for(unsigned int i = 0; i < m_devInfoCollection->Size; i++) + { + auto devInfo = m_devInfoCollection->GetAt(i); + EnumedDeviceList2->Items->Append(devInfo->Name); + } + EnumedDeviceList2->SelectedIndex = 0; + ShowStatusMessage("Enumerating Webcams completed successfully."); + btnStartDevice2->IsEnabled = true; + } + } + catch (Exception ^e) + { + ShowExceptionMessage(e); + } + }); + } + catch (Platform::Exception^ e) + { + ShowExceptionMessage(e); + } +} + +void AdvancedCapture::AddEffectToImageStream() +{ + auto mediaCapture = m_mediaCaptureMgr.Get(); + Windows::Media::Capture::VideoDeviceCharacteristic charecteristic = mediaCapture->MediaCaptureSettings->VideoDeviceCharacteristic; + + if((charecteristic != Windows::Media::Capture::VideoDeviceCharacteristic::AllStreamsIdentical) && + (charecteristic != Windows::Media::Capture::VideoDeviceCharacteristic::PreviewPhotoStreamsIdentical) && + (charecteristic != Windows::Media::Capture::VideoDeviceCharacteristic::RecordPhotoStreamsIdentical)) + { + Windows::Media::MediaProperties::IMediaEncodingProperties ^props = mediaCapture->VideoDeviceController->GetMediaStreamProperties(Windows::Media::Capture::MediaStreamType::Photo); + if(props->Type->Equals("Image")) + { + //Switch to a video media type instead since we cant add an effect to a image media type + Windows::Foundation::Collections::IVectorView^ supportedPropsList = 
mediaCapture->VideoDeviceController->GetAvailableMediaStreamProperties(Windows::Media::Capture::MediaStreamType::Photo); + { + unsigned int i = 0; + while (i< supportedPropsList->Size) + { + Windows::Media::MediaProperties::IMediaEncodingProperties^ props = supportedPropsList->GetAt(i); + + String^ s = props->Type; + if(props->Type->Equals("Video")) + { + task(mediaCapture->VideoDeviceController->SetMediaStreamPropertiesAsync(Windows::Media::Capture::MediaStreamType::Photo,props)).then([this](task changeTypeTask) + { + try + { + changeTypeTask.get(); + ShowStatusMessage("Change type on photo stream successful"); + //Now add the effect on the image pin + task(m_mediaCaptureMgr->AddEffectAsync(Windows::Media::Capture::MediaStreamType::Photo,"GrayscaleTransform.GrayscaleEffect", nullptr)).then([this](task effectTask3) + { + try + { + effectTask3.get(); + m_bEffectAddedToPhoto = true; + ShowStatusMessage("Adding effect to photo stream successful"); + chkAddRemoveEffect->IsEnabled = true; + + } + catch(Exception ^e) + { + ShowExceptionMessage(e); + chkAddRemoveEffect->IsEnabled = true; + chkAddRemoveEffect->IsChecked = false; + } + }); + + } + catch(Exception ^e) + { + ShowExceptionMessage(e); + chkAddRemoveEffect->IsEnabled = true; + chkAddRemoveEffect->IsChecked = false; + + } + + }); + break; + + } + i++; + } + } + } + else + { + //Add the effect to the image pin if the type is already "Video" + task(mediaCapture->AddEffectAsync(Windows::Media::Capture::MediaStreamType::Photo,"GrayscaleTransform.GrayscaleEffect", nullptr)).then([this](task effectTask3) + { + try + { + effectTask3.get(); + m_bEffectAddedToPhoto = true; + ShowStatusMessage("Adding effect to photo stream successful"); + chkAddRemoveEffect->IsEnabled = true; + + } + catch(Exception ^e) + { + ShowExceptionMessage(e); + chkAddRemoveEffect->IsEnabled = true; + chkAddRemoveEffect->IsChecked = false; + } + }); + } + } +} + + + +void AdvancedCapture::chkAddRemoveEffect_Checked(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e) +{ + try + { + chkAddRemoveEffect->IsEnabled = false; + m_bEffectAdded = true; + create_task(m_mediaCaptureMgr->AddEffectAsync(Windows::Media::Capture::MediaStreamType::VideoPreview,"GrayscaleTransform.GrayscaleEffect", nullptr)).then([this](task effectTask) + { + try + { + effectTask.get(); + + auto mediaCapture = m_mediaCaptureMgr.Get(); + Windows::Media::Capture::VideoDeviceCharacteristic charecteristic = mediaCapture->MediaCaptureSettings->VideoDeviceCharacteristic; + + ShowStatusMessage("Add effect successful to preview stream successful"); + if((charecteristic != Windows::Media::Capture::VideoDeviceCharacteristic::AllStreamsIdentical) && + (charecteristic != Windows::Media::Capture::VideoDeviceCharacteristic::PreviewRecordStreamsIdentical)) + { + Windows::Media::MediaProperties::IMediaEncodingProperties ^props = mediaCapture->VideoDeviceController->GetMediaStreamProperties(Windows::Media::Capture::MediaStreamType::VideoRecord); + Windows::Media::MediaProperties::VideoEncodingProperties ^videoEncodingProperties = static_cast(props); + if(!videoEncodingProperties->Subtype->Equals("H264")) //Cant add an effect to an H264 stream + { + task(mediaCapture->AddEffectAsync(Windows::Media::Capture::MediaStreamType::VideoRecord,"GrayscaleTransform.GrayscaleEffect", nullptr)).then([this](task effectTask2) + { + try + { + effectTask2.get(); + ShowStatusMessage("Add effect successful to record stream successful"); + m_bEffectAddedToRecord = true; + AddEffectToImageStream(); + chkAddRemoveEffect->IsEnabled 
= true; + } + catch(Exception ^e) + { + ShowExceptionMessage(e); + chkAddRemoveEffect->IsEnabled = true; + chkAddRemoveEffect->IsChecked = false; + } + }); + } + else + { + AddEffectToImageStream(); + chkAddRemoveEffect->IsEnabled = true; + } + + } + else + { + AddEffectToImageStream(); + chkAddRemoveEffect->IsEnabled = true; + } + } + catch (Exception ^e) + { + ShowExceptionMessage(e); + chkAddRemoveEffect->IsEnabled = true; + chkAddRemoveEffect->IsChecked = false; + } + }); + } + catch (Platform::Exception ^e) + { + ShowExceptionMessage(e); + chkAddRemoveEffect->IsEnabled = true; + chkAddRemoveEffect->IsChecked = false; + } +} + +void AdvancedCapture::chkAddRemoveEffect_Unchecked(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e) +{ + try + { + chkAddRemoveEffect->IsEnabled = false; + m_bEffectAdded = false; + create_task(m_mediaCaptureMgr->ClearEffectsAsync(Windows::Media::Capture::MediaStreamType::VideoPreview)).then([this](task effectTask) + { + try + { + effectTask.get(); + ShowStatusMessage("Remove effect from video preview stream successful"); + if(m_bEffectAddedToRecord) + { + task(m_mediaCaptureMgr->ClearEffectsAsync(Windows::Media::Capture::MediaStreamType::VideoRecord)).then([this](task effectTask) + { + try + { + effectTask.get(); + ShowStatusMessage("Remove effect from video record stream successful"); + m_bEffectAddedToRecord = false; + if(m_bEffectAddedToPhoto) + { + task(m_mediaCaptureMgr->ClearEffectsAsync(Windows::Media::Capture::MediaStreamType::Photo)).then([this](task effectTask) + { + try + { + effectTask.get(); + ShowStatusMessage("Remove effect from Photo stream successful"); + m_bEffectAddedToPhoto = false; + + } + catch(Exception ^e) + { + ShowExceptionMessage(e); + chkAddRemoveEffect->IsEnabled = true; + chkAddRemoveEffect->IsChecked = true; + } + + }); + } + else + { + } + chkAddRemoveEffect->IsEnabled = true; + } + catch(Exception ^e) + { + ShowExceptionMessage(e); + chkAddRemoveEffect->IsEnabled = true; + chkAddRemoveEffect->IsChecked = true; + + } + + }); + + } + else if(m_bEffectAddedToPhoto) + { + task(m_mediaCaptureMgr->ClearEffectsAsync(Windows::Media::Capture::MediaStreamType::Photo)).then([this](task effectTask) + { + try + { + effectTask.get(); + ShowStatusMessage("Remove effect from Photo stream successful"); + m_bEffectAddedToPhoto = false; + + } + catch(Exception ^e) + { + ShowExceptionMessage(e); + chkAddRemoveEffect->IsEnabled = true; + chkAddRemoveEffect->IsChecked = true; + } + + }); + } + else + { + chkAddRemoveEffect->IsEnabled = true; + chkAddRemoveEffect->IsChecked = true; + } + } + catch (Exception ^e) + { + ShowExceptionMessage(e); + chkAddRemoveEffect->IsEnabled = true; + chkAddRemoveEffect->IsChecked = true; + } + }); + } + catch (Platform::Exception ^e) + { + ShowExceptionMessage(e); + chkAddRemoveEffect->IsEnabled = true; + chkAddRemoveEffect->IsChecked = true; + } +} + +void AdvancedCapture::ShowStatusMessage(Platform::String^ text) +{ + rootPage->NotifyUser(text, NotifyType::StatusMessage); +} + +void AdvancedCapture::ShowExceptionMessage(Platform::Exception^ ex) +{ + rootPage->NotifyUser(ex->Message, NotifyType::ErrorMessage); +} + +void AdvancedCapture::SwitchRecordButtonContent() +{ + if (m_bRecording) + { + btnStartStopRecord2->Content="StopRecord"; + } + else + { + btnStartStopRecord2->Content="StartRecord"; + } +} + +void AdvancedCapture::EnableButton(bool enabled, String^ name) +{ + if (name->Equals("StartDevice")) + { + btnStartDevice2->IsEnabled = enabled; + } + else if (name->Equals("StartPreview")) + { + 
btnStartPreview2->IsEnabled = enabled; + } + else if (name->Equals("StartStopRecord")) + { + btnStartStopRecord2->IsEnabled = enabled; + } + else if (name->Equals("TakePhoto")) + { + btnTakePhoto2->IsEnabled = enabled; + } +} + +task AdvancedCapture::ReencodePhotoAsync( + Windows::Storage::StorageFile ^tempStorageFile, + Windows::Storage::FileProperties::PhotoOrientation photoRotation) +{ + ReencodeState ^state = ref new ReencodeState(); + + return create_task(tempStorageFile->OpenAsync(Windows::Storage::FileAccessMode::Read)).then([state](Windows::Storage::Streams::IRandomAccessStream ^stream) + { + state->InputStream = stream; + return Windows::Graphics::Imaging::BitmapDecoder::CreateAsync(state->InputStream); + }).then([state](Windows::Graphics::Imaging::BitmapDecoder ^decoder) + { + state->Decoder = decoder; + return Windows::Storage::KnownFolders::PicturesLibrary->CreateFileAsync(PHOTO_FILE_NAME, Windows::Storage::CreationCollisionOption::GenerateUniqueName); + }).then([state](Windows::Storage::StorageFile ^storageFile) + { + state->PhotoStorage = storageFile; + return state->PhotoStorage->OpenAsync(Windows::Storage::FileAccessMode::ReadWrite); + }).then([state](Windows::Storage::Streams::IRandomAccessStream ^stream) + { + state->OutputStream = stream; + state->OutputStream->Size = 0; + return Windows::Graphics::Imaging::BitmapEncoder::CreateForTranscodingAsync(state->OutputStream, state->Decoder); + }).then([state, photoRotation](Windows::Graphics::Imaging::BitmapEncoder ^encoder) + { + state->Encoder = encoder; + auto properties = ref new Windows::Graphics::Imaging::BitmapPropertySet(); + properties->Insert("System.Photo.Orientation", + ref new Windows::Graphics::Imaging::BitmapTypedValue((unsigned short)photoRotation, Windows::Foundation::PropertyType::UInt16)); + return create_task(state->Encoder->BitmapProperties->SetPropertiesAsync(properties)); + }).then([state]() + { + return state->Encoder->FlushAsync(); + }).then([tempStorageFile, state](task previousTask) + { + auto result = state->PhotoStorage; + delete state; + + tempStorageFile->DeleteAsync(Windows::Storage::StorageDeleteOption::PermanentDelete); + + previousTask.get(); + + return result; + }); +} + +Windows::Storage::FileProperties::PhotoOrientation AdvancedCapture::GetCurrentPhotoRotation() +{ + bool counterclockwiseRotation = m_bReversePreviewRotation; + + if (m_bRotateVideoOnOrientationChange) + { + return PhotoRotationLookup(Windows::Graphics::Display::DisplayProperties::CurrentOrientation, counterclockwiseRotation); + } + else + { + return Windows::Storage::FileProperties::PhotoOrientation::Normal; + } +} + +void AdvancedCapture::PrepareForVideoRecording() +{ + Windows::Media::Capture::MediaCapture ^mediaCapture = m_mediaCaptureMgr.Get(); + if (mediaCapture == nullptr) + { + return; + } + + bool counterclockwiseRotation = m_bReversePreviewRotation; + + if (m_bRotateVideoOnOrientationChange) + { + mediaCapture->SetRecordRotation(VideoRotationLookup(Windows::Graphics::Display::DisplayProperties::CurrentOrientation, counterclockwiseRotation)); + } + else + { + mediaCapture->SetRecordRotation(Windows::Media::Capture::VideoRotation::None); + } +} + +void AdvancedCapture::DisplayProperties_OrientationChanged(Platform::Object^ sender) +{ + Windows::Media::Capture::MediaCapture ^mediaCapture = m_mediaCaptureMgr.Get(); + if (mediaCapture == nullptr) + { + return; + } + + bool previewMirroring = mediaCapture->GetPreviewMirroring(); + bool counterclockwiseRotation = (previewMirroring && !m_bReversePreviewRotation) || + 
(!previewMirroring && m_bReversePreviewRotation); + + if (m_bRotateVideoOnOrientationChange) + { + mediaCapture->SetPreviewRotation(VideoRotationLookup(Windows::Graphics::Display::DisplayProperties::CurrentOrientation, counterclockwiseRotation)); + } + else + { + mediaCapture->SetPreviewRotation(Windows::Media::Capture::VideoRotation::None); + } +} + +Windows::Storage::FileProperties::PhotoOrientation AdvancedCapture::PhotoRotationLookup( + Windows::Graphics::Display::DisplayOrientations displayOrientation, bool counterclockwise) +{ + switch (displayOrientation) + { + case Windows::Graphics::Display::DisplayOrientations::Landscape: + return Windows::Storage::FileProperties::PhotoOrientation::Normal; + + case Windows::Graphics::Display::DisplayOrientations::Portrait: + return (counterclockwise) ? Windows::Storage::FileProperties::PhotoOrientation::Rotate270: + Windows::Storage::FileProperties::PhotoOrientation::Rotate90; + + case Windows::Graphics::Display::DisplayOrientations::LandscapeFlipped: + return Windows::Storage::FileProperties::PhotoOrientation::Rotate180; + + case Windows::Graphics::Display::DisplayOrientations::PortraitFlipped: + return (counterclockwise) ? Windows::Storage::FileProperties::PhotoOrientation::Rotate90 : + Windows::Storage::FileProperties::PhotoOrientation::Rotate270; + + default: + return Windows::Storage::FileProperties::PhotoOrientation::Unspecified; + } +} + +Windows::Media::Capture::VideoRotation AdvancedCapture::VideoRotationLookup( + Windows::Graphics::Display::DisplayOrientations displayOrientation, bool counterclockwise) +{ + switch (displayOrientation) + { + case Windows::Graphics::Display::DisplayOrientations::Landscape: + return Windows::Media::Capture::VideoRotation::None; + + case Windows::Graphics::Display::DisplayOrientations::Portrait: + return (counterclockwise) ? Windows::Media::Capture::VideoRotation::Clockwise270Degrees : + Windows::Media::Capture::VideoRotation::Clockwise90Degrees; + + case Windows::Graphics::Display::DisplayOrientations::LandscapeFlipped: + return Windows::Media::Capture::VideoRotation::Clockwise180Degrees; + + case Windows::Graphics::Display::DisplayOrientations::PortraitFlipped: + return (counterclockwise) ? Windows::Media::Capture::VideoRotation::Clockwise90Degrees: + Windows::Media::Capture::VideoRotation::Clockwise270Degrees ; + + default: + return Windows::Media::Capture::VideoRotation::None; + } +} + diff --git a/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml.h b/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml.h new file mode 100644 index 000000000..83556b95e --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml.h @@ -0,0 +1,103 @@ +//********************************************************* +// +// Copyright (c) Microsoft. All rights reserved. +// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY +// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR +// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. 
+//
+//*********************************************************
+
+//
+// AdvancedCapture.xaml.h
+// Declaration of the AdvancedCapture class
+//
+
+#pragma once
+
+#include "pch.h"
+#include "AdvancedCapture.g.h"
+#include "MainPage.xaml.h"
+#include <ppltasks.h>
+
+#define VIDEO_FILE_NAME "video.mp4"
+#define PHOTO_FILE_NAME "photo.jpg"
+#define TEMP_PHOTO_FILE_NAME "photoTmp.jpg"
+
+using namespace concurrency;
+using namespace Windows::Devices::Enumeration;
+
+namespace SDKSample
+{
+    namespace MediaCapture
+    {
+        /// <summary>
+        /// An empty page that can be used on its own or navigated to within a Frame.
+        /// </summary>
+        [Windows::Foundation::Metadata::WebHostHidden]
+        public ref class AdvancedCapture sealed
+        {
+        public:
+            AdvancedCapture();
+
+        protected:
+            virtual void OnNavigatedTo(Windows::UI::Xaml::Navigation::NavigationEventArgs^ e) override;
+            virtual void OnNavigatedFrom(Windows::UI::Xaml::Navigation::NavigationEventArgs^ e) override;
+
+        private:
+            MainPage^ rootPage;
+            void ScenarioInit();
+            void ScenarioReset();
+
+            void SoundLevelChanged(Object^ sender, Object^ e);
+            void RecordLimitationExceeded(Windows::Media::Capture::MediaCapture ^ mediaCapture);
+            void Failed(Windows::Media::Capture::MediaCapture ^ mediaCapture, Windows::Media::Capture::MediaCaptureFailedEventArgs ^ args);
+
+            void btnStartDevice_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e);
+
+            void btnStartPreview_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e);
+
+            void btnStartStopRecord_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e);
+
+            void btnTakePhoto_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e);
+
+            void lstEnumedDevices_SelectionChanged(Platform::Object^ sender, Windows::UI::Xaml::Controls::SelectionChangedEventArgs^ e);
+            void EnumerateWebcamsAsync();
+
+            void chkAddRemoveEffect_Checked(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e);
+            void chkAddRemoveEffect_Unchecked(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e);
+            void AddEffectToImageStream();
+
+            void ShowStatusMessage(Platform::String^ text);
+            void ShowExceptionMessage(Platform::Exception^ ex);
+
+            void EnableButton(bool enabled, Platform::String ^name);
+            void SwitchRecordButtonContent();
+
+            task<Windows::Storage::StorageFile^> ReencodePhotoAsync(
+                Windows::Storage::StorageFile ^tempStorageFile,
+                Windows::Storage::FileProperties::PhotoOrientation photoRotation);
+            Windows::Storage::FileProperties::PhotoOrientation GetCurrentPhotoRotation();
+            void PrepareForVideoRecording();
+            void DisplayProperties_OrientationChanged(Platform::Object^ sender);
+            Windows::Storage::FileProperties::PhotoOrientation PhotoRotationLookup(
+                Windows::Graphics::Display::DisplayOrientations displayOrientation, bool counterclockwise);
+            Windows::Media::Capture::VideoRotation VideoRotationLookup(
+                Windows::Graphics::Display::DisplayOrientations displayOrientation, bool counterclockwise);
+
+            Platform::Agile<Windows::Media::Capture::MediaCapture> m_mediaCaptureMgr;
+            Windows::Storage::StorageFile^ m_recordStorageFile;
+            bool m_bRecording;
+            bool m_bEffectAdded;
+            bool m_bEffectAddedToRecord;
+            bool m_bEffectAddedToPhoto;
+            bool m_bSuspended;
+            bool m_bPreviewing;
+            DeviceInformationCollection^ m_devInfoCollection;
+            Windows::Foundation::EventRegistrationToken m_eventRegistrationToken;
+            bool m_bRotateVideoOnOrientationChange;
+            bool m_bReversePreviewRotation;
+            Windows::Foundation::EventRegistrationToken m_orientationChangedEventToken;
+        };
+    }
+}
diff --git a/samples/winrt/ImageManipulations/C++/App.xaml
b/samples/winrt/ImageManipulations/C++/App.xaml new file mode 100644 index 000000000..2edfd7790 --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/App.xaml @@ -0,0 +1,30 @@ + + + + + + + + + + + + + + diff --git a/samples/winrt/ImageManipulations/C++/App.xaml.cpp b/samples/winrt/ImageManipulations/C++/App.xaml.cpp new file mode 100644 index 000000000..ef733a1cc --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/App.xaml.cpp @@ -0,0 +1,114 @@ +//********************************************************* +// +// Copyright (c) Microsoft. All rights reserved. +// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY +// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR +// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. +// +//********************************************************* + +// +// App.xaml.cpp +// Implementation of the App.xaml class. +// + +#include "pch.h" +#include "MainPage.xaml.h" +#include "Common\SuspensionManager.h" + +using namespace SDKSample; +using namespace SDKSample::Common; + +using namespace Concurrency; +using namespace Platform; +using namespace Windows::ApplicationModel; +using namespace Windows::ApplicationModel::Activation; +using namespace Windows::Foundation; +using namespace Windows::Foundation::Collections; +using namespace Windows::UI::Core; +using namespace Windows::UI::Xaml; +using namespace Windows::UI::Xaml::Controls; +using namespace Windows::UI::Xaml::Controls::Primitives; +using namespace Windows::UI::Xaml::Data; +using namespace Windows::UI::Xaml::Input; +using namespace Windows::UI::Xaml::Interop; +using namespace Windows::UI::Xaml::Media; +using namespace Windows::UI::Xaml::Navigation; + +/// +/// Initializes the singleton application object. This is the first line of authored code +/// executed, and as such is the logical equivalent of main() or WinMain(). +/// +App::App() +{ + InitializeComponent(); + this->Suspending += ref new SuspendingEventHandler(this, &SDKSample::App::OnSuspending); +} + +/// +/// Invoked when the application is launched normally by the end user. Other entry points will +/// be used when the application is launched to open a specific file, to display search results, +/// and so forth. +/// +/// Details about the launch request and process. 
+void App::OnLaunched(LaunchActivatedEventArgs^ pArgs) +{ + this->LaunchArgs = pArgs; + + // Do not repeat app initialization when already running, just ensure that + // the window is active + if (pArgs->PreviousExecutionState == ApplicationExecutionState::Running) + { + Window::Current->Activate(); + return; + } + + // Create a Frame to act as the navigation context and associate it with + // a SuspensionManager key + auto rootFrame = ref new Frame(); + SuspensionManager::RegisterFrame(rootFrame, "AppFrame"); + + auto prerequisite = task([](){}); + if (pArgs->PreviousExecutionState == ApplicationExecutionState::Terminated) + { + // Restore the saved session state only when appropriate, scheduling the + // final launch steps after the restore is complete + prerequisite = SuspensionManager::RestoreAsync(); + } + prerequisite.then([=]() + { + // When the navigation stack isn't restored navigate to the first page, + // configuring the new page by passing required information as a navigation + // parameter + if (rootFrame->Content == nullptr) + { + if (!rootFrame->Navigate(TypeName(MainPage::typeid))) + { + throw ref new FailureException("Failed to create initial page"); + } + } + + // Place the frame in the current Window and ensure that it is active + Window::Current->Content = rootFrame; + Window::Current->Activate(); + }, task_continuation_context::use_current()); +} + +/// +/// Invoked when application execution is being suspended. Application state is saved +/// without knowing whether the application will be terminated or resumed with the contents +/// of memory still intact. +/// +/// The source of the suspend request. +/// Details about the suspend request. +void App::OnSuspending(Object^ sender, SuspendingEventArgs^ e) +{ + (void) sender; // Unused parameter + + auto deferral = e->SuspendingOperation->GetDeferral(); + SuspensionManager::SaveAsync().then([=]() + { + deferral->Complete(); + }); +} diff --git a/samples/winrt/ImageManipulations/C++/App.xaml.h b/samples/winrt/ImageManipulations/C++/App.xaml.h new file mode 100644 index 000000000..a8b606424 --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/App.xaml.h @@ -0,0 +1,35 @@ +//********************************************************* +// +// Copyright (c) Microsoft. All rights reserved. +// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY +// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR +// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. +// +//********************************************************* + +// +// App.xaml.h +// Declaration of the App.xaml class. +// + +#pragma once + +#include "pch.h" +#include "App.g.h" +#include "MainPage.g.h" + +namespace SDKSample +{ + ref class App + { + internal: + App(); + virtual void OnSuspending(Platform::Object^ sender, Windows::ApplicationModel::SuspendingEventArgs^ pArgs); + Windows::ApplicationModel::Activation::LaunchActivatedEventArgs^ LaunchArgs; + protected: + virtual void OnLaunched(Windows::ApplicationModel::Activation::LaunchActivatedEventArgs^ pArgs) override; + private: + Windows::UI::Xaml::Controls::Frame^ rootFrame; + }; +} diff --git a/samples/winrt/ImageManipulations/C++/AudioCapture.xaml b/samples/winrt/ImageManipulations/C++/AudioCapture.xaml new file mode 100644 index 000000000..be65bcd8c --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/AudioCapture.xaml @@ -0,0 +1,62 @@ + + + + + + + + + + + + + + + + This scenario shows how to do an audio only capture using the default microphone. 
Click on StartRecord to start recording. + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/samples/winrt/ImageManipulations/C++/AudioCapture.xaml.cpp b/samples/winrt/ImageManipulations/C++/AudioCapture.xaml.cpp new file mode 100644 index 000000000..37fc379d3 --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/AudioCapture.xaml.cpp @@ -0,0 +1,366 @@ +//********************************************************* +// +// Copyright (c) Microsoft. All rights reserved. +// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY +// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR +// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. +// +//********************************************************* + +// +// AudioCapture.xaml.cpp +// Implementation of the AudioCapture class +// + +#include "pch.h" +#include "AudioCapture.xaml.h" +#include +using namespace concurrency; + +using namespace SDKSample::MediaCapture; + +using namespace Windows::UI::Xaml; +using namespace Windows::UI::Xaml::Navigation; +using namespace Windows::UI::Xaml::Data; +using namespace Windows::System; +using namespace Windows::Foundation; +using namespace Platform; +using namespace Windows::UI; +using namespace Windows::UI::Core; +using namespace Windows::UI::Xaml; +using namespace Windows::UI::Xaml::Controls; +using namespace Windows::UI::Xaml::Data; +using namespace Windows::UI::Xaml::Media; +using namespace Windows::Storage; +using namespace Windows::Media::MediaProperties; +using namespace Windows::Storage::Streams; +using namespace Windows::System; +using namespace Windows::UI::Xaml::Media::Imaging; + + +AudioCapture::AudioCapture() +{ + InitializeComponent(); + ScenarioInit(); +} + +/// +/// Invoked when this page is about to be displayed in a Frame. +/// +/// Event data that describes how this page was reached. The Parameter +/// property is typically used to configure the page. +void AudioCapture::OnNavigatedTo(NavigationEventArgs^ e) +{ + // A pointer back to the main page. This is needed if you want to call methods in MainPage such + // as NotifyUser() + rootPage = MainPage::Current; + m_eventRegistrationToken = Windows::Media::MediaControl::SoundLevelChanged += ref new EventHandler(this, &AudioCapture::SoundLevelChanged); +} + +void AudioCapture::OnNavigatedFrom(NavigationEventArgs^ e) +{ + // A pointer back to the main page. 
This is needed if you want to call methods in MainPage such + // as NotifyUser() + Windows::Media::MediaControl::SoundLevelChanged -= m_eventRegistrationToken; +} + +void AudioCapture::ScenarioInit() +{ + try + { + rootPage = MainPage::Current; + btnStartDevice3->IsEnabled = true; + btnStartStopRecord3->IsEnabled = false; + m_bRecording = false; + playbackElement3->Source = nullptr; + m_bSuspended = false; + ShowStatusMessage(""); + } + catch (Exception ^e) + { + ShowExceptionMessage(e); + } + +} + +void AudioCapture::ScenarioReset() +{ + ScenarioInit(); +} + + +void AudioCapture::SoundLevelChanged(Object^ sender, Object^ e) +{ + create_task(Dispatcher->RunAsync(Windows::UI::Core::CoreDispatcherPriority::High, ref new Windows::UI::Core::DispatchedHandler([this]() + { + if(Windows::Media::MediaControl::SoundLevel != Windows::Media::SoundLevel::Muted) + { + ScenarioReset(); + } + else + { + if (m_bRecording) + { + ShowStatusMessage("Stopping Record on invisibility"); + + create_task(m_mediaCaptureMgr->StopRecordAsync()).then([this](task recordTask) + { + try + { + recordTask.get(); + m_bRecording = false; + }catch (Exception ^e) + { + ShowExceptionMessage(e); + } + }); + } + } + }))); +} + +void AudioCapture::RecordLimitationExceeded(Windows::Media::Capture::MediaCapture ^currentCaptureObject) +{ + try + { + if (m_bRecording) + { + create_task(Dispatcher->RunAsync(Windows::UI::Core::CoreDispatcherPriority::High, ref new Windows::UI::Core::DispatchedHandler([this](){ + try + { + ShowStatusMessage("Stopping Record on exceeding max record duration"); + EnableButton(false, "StartStopRecord"); + create_task(m_mediaCaptureMgr->StopRecordAsync()).then([this](task recordTask) + { + try + { + recordTask.get(); + m_bRecording = false; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + ShowStatusMessage("Stopped record on exceeding max record duration:" + m_recordStorageFile->Path); + } + catch (Exception ^e) + { + ShowExceptionMessage(e); + m_bRecording = false; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + } + }); + + } + catch (Exception ^e) + { + m_bRecording = false; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + ShowExceptionMessage(e); + } + + }))); + } + } + catch (Exception ^e) + { + m_bRecording = false; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + ShowExceptionMessage(e); + } +} + +void AudioCapture::Failed(Windows::Media::Capture::MediaCapture ^currentCaptureObject, Windows::Media::Capture::MediaCaptureFailedEventArgs^ currentFailure) +{ + String ^message = "Fatal error: " + currentFailure->Message; + create_task(Dispatcher->RunAsync(Windows::UI::Core::CoreDispatcherPriority::High, + ref new Windows::UI::Core::DispatchedHandler([this, message]() + { + ShowStatusMessage(message); + }))); +} + +void AudioCapture::btnStartDevice_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e) +{ + try + { + EnableButton(false, "StartDevice"); + ShowStatusMessage("Starting device"); + auto mediaCapture = ref new Windows::Media::Capture::MediaCapture(); + m_mediaCaptureMgr = mediaCapture; + auto settings = ref new Windows::Media::Capture::MediaCaptureInitializationSettings(); + settings->StreamingCaptureMode = Windows::Media::Capture::StreamingCaptureMode::Audio; + create_task(mediaCapture->InitializeAsync()).then([this](task initTask) + { + try + { + initTask.get(); + + auto mediaCapture = m_mediaCaptureMgr.Get(); + EnableButton(true, "StartPreview"); + EnableButton(true, 
"StartStopRecord"); + EnableButton(true, "TakePhoto"); + ShowStatusMessage("Device initialized successful"); + mediaCapture->RecordLimitationExceeded += ref new Windows::Media::Capture::RecordLimitationExceededEventHandler(this, &AudioCapture::RecordLimitationExceeded); + mediaCapture->Failed += ref new Windows::Media::Capture::MediaCaptureFailedEventHandler(this, &AudioCapture::Failed); + } + catch (Exception ^ e) + { + ShowExceptionMessage(e); + } + }); + } + catch (Platform::Exception^ e) + { + ShowExceptionMessage(e); + } +} + +void AudioCapture::btnStartStopRecord_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e) +{ + try + { + String ^fileName; + EnableButton(false, "StartStopRecord"); + + if (!m_bRecording) + { + ShowStatusMessage("Starting Record"); + + fileName = AUDIO_FILE_NAME; + + task(KnownFolders::VideosLibrary->CreateFileAsync(fileName, Windows::Storage::CreationCollisionOption::GenerateUniqueName)).then([this](task fileTask) + { + try + { + this->m_recordStorageFile = fileTask.get(); + ShowStatusMessage("Create record file successful"); + + MediaEncodingProfile^ recordProfile= nullptr; + recordProfile = MediaEncodingProfile::CreateM4a(Windows::Media::MediaProperties::AudioEncodingQuality::Auto); + + create_task(m_mediaCaptureMgr->StartRecordToStorageFileAsync(recordProfile, this->m_recordStorageFile)).then([this](task recordTask) + { + try + { + recordTask.get(); + m_bRecording = true; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + + ShowStatusMessage("Start Record successful"); + + + }catch (Exception ^e) + { + ShowExceptionMessage(e); + m_bRecording = false; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + } + }); + } + catch (Exception ^e) + { + m_bRecording = false; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + ShowExceptionMessage(e); + } + } + ); + } + else + { + ShowStatusMessage("Stopping Record"); + + create_task(m_mediaCaptureMgr->StopRecordAsync()).then([this](task) + { + try + { + m_bRecording = false; + EnableButton(true, "StartStopRecord"); + SwitchRecordButtonContent(); + + ShowStatusMessage("Stop record successful"); + if (!m_bSuspended) + { + task(this->m_recordStorageFile->OpenAsync(FileAccessMode::Read)).then([this](task streamTask) + { + try + { + ShowStatusMessage("Record file opened"); + auto stream = streamTask.get(); + ShowStatusMessage(this->m_recordStorageFile->Path); + playbackElement3->AutoPlay = true; + playbackElement3->SetSource(stream, this->m_recordStorageFile->FileType); + playbackElement3->Play(); + } + catch (Exception ^e) + { + ShowExceptionMessage(e); + m_bRecording = false; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + } + }); + } + } + catch (Exception ^e) + { + m_bRecording = false; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + ShowExceptionMessage(e); + } + }); + } + } + catch (Platform::Exception^ e) + { + EnableButton(true, "StartStopRecord"); + ShowExceptionMessage(e); + m_bRecording = false; + SwitchRecordButtonContent(); + } +} + + +void AudioCapture::ShowStatusMessage(Platform::String^ text) +{ + rootPage->NotifyUser(text, NotifyType::StatusMessage); +} + +void AudioCapture::ShowExceptionMessage(Platform::Exception^ ex) +{ + rootPage->NotifyUser(ex->Message, NotifyType::ErrorMessage); +} + +void AudioCapture::SwitchRecordButtonContent() +{ + { + if (m_bRecording) + { + btnStartStopRecord3->Content="StopRecord"; + } + else + { + btnStartStopRecord3->Content="StartRecord"; + 
} + } +} +void AudioCapture::EnableButton(bool enabled, String^ name) +{ + if (name->Equals("StartDevice")) + { + btnStartDevice3->IsEnabled = enabled; + } + + else if (name->Equals("StartStopRecord")) + { + btnStartStopRecord3->IsEnabled = enabled; + } + +} + diff --git a/samples/winrt/ImageManipulations/C++/AudioCapture.xaml.h b/samples/winrt/ImageManipulations/C++/AudioCapture.xaml.h new file mode 100644 index 000000000..b75efdc72 --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/AudioCapture.xaml.h @@ -0,0 +1,70 @@ +//********************************************************* +// +// Copyright (c) Microsoft. All rights reserved. +// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY +// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR +// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. +// +//********************************************************* + +// +// AudioCapture.xaml.h +// Declaration of the AudioCapture class +// + +#pragma once + +#include "pch.h" +#include "AudioCapture.g.h" +#include "MainPage.xaml.h" + +#define AUDIO_FILE_NAME "audio.mp4" + +namespace SDKSample +{ + namespace MediaCapture + { + /// + /// An empty page that can be used on its own or navigated to within a Frame. + /// + [Windows::Foundation::Metadata::WebHostHidden] + public ref class AudioCapture sealed + { + public: + AudioCapture(); + + protected: + virtual void OnNavigatedTo(Windows::UI::Xaml::Navigation::NavigationEventArgs^ e) override; + virtual void OnNavigatedFrom(Windows::UI::Xaml::Navigation::NavigationEventArgs^ e) override; + private: + MainPage^ rootPage; + + void ScenarioInit(); + void ScenarioReset(); + + void SoundLevelChanged(Object^ sender, Object^ e); + void RecordLimitationExceeded(Windows::Media::Capture::MediaCapture ^ mediaCapture); + void Failed(Windows::Media::Capture::MediaCapture ^ mediaCapture, Windows::Media::Capture::MediaCaptureFailedEventArgs ^ args); + + void btnStartDevice_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); + + void btnStartPreview_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); + + void btnStartStopRecord_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); + + void ShowStatusMessage(Platform::String^ text); + void ShowExceptionMessage(Platform::Exception^ ex); + + void EnableButton(bool enabled, Platform::String ^name); + void SwitchRecordButtonContent(); + + Platform::Agile m_mediaCaptureMgr; + Windows::Storage::StorageFile^ m_photoStorageFile; + Windows::Storage::StorageFile^ m_recordStorageFile; + bool m_bRecording; + bool m_bSuspended; + Windows::Foundation::EventRegistrationToken m_eventRegistrationToken; + }; + } +} diff --git a/samples/winrt/ImageManipulations/C++/BasicCapture.xaml b/samples/winrt/ImageManipulations/C++/BasicCapture.xaml new file mode 100644 index 000000000..2cc0b0a5f --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/BasicCapture.xaml @@ -0,0 +1,87 @@ + + + + + + + + + + + + + + + + + This scenario demonstrates how to use the MediaCapture API to preview the camera stream, record a video, and take a picture using default initialization settings. + You can also adjust the brightness and contrast. 
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/samples/winrt/ImageManipulations/C++/BasicCapture.xaml.cpp b/samples/winrt/ImageManipulations/C++/BasicCapture.xaml.cpp new file mode 100644 index 000000000..f385fa9a7 --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/BasicCapture.xaml.cpp @@ -0,0 +1,535 @@ +//********************************************************* +// +// Copyright (c) Microsoft. All rights reserved. +// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY +// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR +// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. +// +//********************************************************* + +// +// BasicCapture.xaml.cpp +// Implementation of the BasicCapture class +// + +#include "pch.h" +#include "BasicCapture.xaml.h" +#include "ppl.h" + +using namespace Windows::System; +using namespace Windows::Foundation; +using namespace Platform; +using namespace Windows::UI; +using namespace Windows::UI::Core; +using namespace Windows::UI::Xaml; +using namespace Windows::UI::Xaml::Controls; +using namespace Windows::UI::Xaml::Navigation; +using namespace Windows::UI::Xaml::Data; +using namespace Windows::UI::Xaml::Media; +using namespace Windows::Storage; +using namespace Windows::Media::MediaProperties; +using namespace Windows::Storage::Streams; +using namespace Windows::System; +using namespace Windows::UI::Xaml::Media::Imaging; + +using namespace SDKSample::MediaCapture; +using namespace concurrency; + + +BasicCapture::BasicCapture() +{ + InitializeComponent(); + ScenarioInit(); +} + +/// +/// Invoked when this page is about to be displayed in a Frame. +/// +/// Event data that describes how this page was reached. The Parameter +/// property is typically used to configure the page. +void BasicCapture::OnNavigatedTo(NavigationEventArgs^ e) +{ + // A pointer back to the main page. This is needed if you want to call methods in MainPage such + // as NotifyUser() + rootPage = MainPage::Current; + m_eventRegistrationToken = Windows::Media::MediaControl::SoundLevelChanged += ref new EventHandler(this, &BasicCapture::SoundLevelChanged); +} + +void BasicCapture::OnNavigatedFrom(NavigationEventArgs^ e) +{ + // A pointer back to the main page. 
This is needed if you want to call methods in MainPage such + // as NotifyUser() + + Windows::Media::MediaControl::SoundLevelChanged -= m_eventRegistrationToken; + m_currentScenarioLoaded = false; +} + + +void BasicCapture::ScenarioInit() +{ + try + { + btnStartDevice1->IsEnabled = true; + btnStartPreview1->IsEnabled = false; + btnStartStopRecord1->IsEnabled = false; + m_bRecording = false; + m_bPreviewing = false; + btnStartStopRecord1->Content = "StartRecord"; + btnTakePhoto1->IsEnabled = false; + previewElement1->Source = nullptr; + playbackElement1->Source = nullptr; + imageElement1->Source= nullptr; + sldBrightness->IsEnabled = false; + sldContrast->IsEnabled = false; + m_bSuspended = false; + previewCanvas1->Visibility = Windows::UI::Xaml::Visibility::Collapsed; + + } + catch (Exception ^e) + { + ShowExceptionMessage(e); + } + +} + +void BasicCapture::ScenarioReset() +{ + previewCanvas1->Visibility = Windows::UI::Xaml::Visibility::Collapsed; + ScenarioInit(); +} + +void BasicCapture::SoundLevelChanged(Object^ sender, Object^ e) +{ + create_task(Dispatcher->RunAsync(Windows::UI::Core::CoreDispatcherPriority::High, ref new Windows::UI::Core::DispatchedHandler([this]() + { + if(Windows::Media::MediaControl::SoundLevel != Windows::Media::SoundLevel::Muted) + { + ScenarioReset(); + } + else + { + if (m_bRecording) + { + ShowStatusMessage("Stopping Record on invisibility"); + + create_task(m_mediaCaptureMgr->StopRecordAsync()).then([this](task recordTask) + { + m_bRecording = false; + }); + } + if (m_bPreviewing) + { + ShowStatusMessage("Stopping Preview on invisibility"); + + create_task(m_mediaCaptureMgr->StopPreviewAsync()).then([this](task previewTask) + { + try + { + previewTask.get(); + m_bPreviewing = false; + } + catch (Exception ^e) + { + ShowExceptionMessage(e); + } + }); + } + } + }))); +} + +void BasicCapture::RecordLimitationExceeded(Windows::Media::Capture::MediaCapture ^currentCaptureObject) +{ + try + { + if (m_bRecording) + { + create_task(Dispatcher->RunAsync(Windows::UI::Core::CoreDispatcherPriority::High, ref new Windows::UI::Core::DispatchedHandler([this](){ + try + { + ShowStatusMessage("Stopping Record on exceeding max record duration"); + EnableButton(false, "StartStopRecord"); + create_task(m_mediaCaptureMgr->StopRecordAsync()).then([this](task recordTask) + { + try + { + recordTask.get(); + m_bRecording = false; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + ShowStatusMessage("Stopped record on exceeding max record duration:" + m_recordStorageFile->Path); + } + catch (Exception ^e) + { + ShowExceptionMessage(e); + m_bRecording = false; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + } + }); + + } + catch (Exception ^e) + { + m_bRecording = false; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + ShowExceptionMessage(e); + } + + }))); + } + } + catch (Exception ^e) + { + m_bRecording = false; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + ShowExceptionMessage(e); + } +} + +void BasicCapture::Failed(Windows::Media::Capture::MediaCapture ^currentCaptureObject, Windows::Media::Capture::MediaCaptureFailedEventArgs^ currentFailure) +{ + String ^message = "Fatal error: " + currentFailure->Message; + create_task(Dispatcher->RunAsync(Windows::UI::Core::CoreDispatcherPriority::High, + ref new Windows::UI::Core::DispatchedHandler([this, message]() + { + ShowStatusMessage(message); + }))); +} + +void BasicCapture::btnStartDevice_Click(Platform::Object^ sender, 
Windows::UI::Xaml::RoutedEventArgs^ e) +{ + try + { + EnableButton(false, "StartDevice"); + ShowStatusMessage("Starting device"); + auto mediaCapture = ref new Windows::Media::Capture::MediaCapture(); + m_mediaCaptureMgr = mediaCapture; + create_task(mediaCapture->InitializeAsync()).then([this](task initTask) + { + try + { + initTask.get(); + + auto mediaCapture = m_mediaCaptureMgr.Get(); + EnableButton(true, "StartPreview"); + EnableButton(true, "StartStopRecord"); + EnableButton(true, "TakePhoto"); + ShowStatusMessage("Device initialized successful"); + mediaCapture->RecordLimitationExceeded += ref new Windows::Media::Capture::RecordLimitationExceededEventHandler(this, &BasicCapture::RecordLimitationExceeded); + mediaCapture->Failed += ref new Windows::Media::Capture::MediaCaptureFailedEventHandler(this, &BasicCapture::Failed); + } + catch (Exception ^ e) + { + ShowExceptionMessage(e); + } + } + ); + } + catch (Platform::Exception^ e) + { + ShowExceptionMessage(e); + } +} + +void BasicCapture::btnStartPreview_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e) +{ + m_bPreviewing = false; + try + { + ShowStatusMessage("Starting preview"); + EnableButton(false, "StartPreview"); + auto mediaCapture = m_mediaCaptureMgr.Get(); + + previewCanvas1->Visibility = Windows::UI::Xaml::Visibility::Visible; + previewElement1->Source = mediaCapture; + create_task(mediaCapture->StartPreviewAsync()).then([this](task previewTask) + { + try + { + previewTask.get(); + auto mediaCapture = m_mediaCaptureMgr.Get(); + m_bPreviewing = true; + ShowStatusMessage("Start preview successful"); + if(mediaCapture->VideoDeviceController->Brightness) + { + SetupVideoDeviceControl(mediaCapture->VideoDeviceController->Brightness, sldBrightness); + } + if(mediaCapture->VideoDeviceController->Contrast) + { + SetupVideoDeviceControl(mediaCapture->VideoDeviceController->Contrast, sldContrast); + } + + }catch (Exception ^e) + { + ShowExceptionMessage(e); + } + }); + } + catch (Platform::Exception^ e) + { + m_bPreviewing = false; + previewElement1->Source = nullptr; + EnableButton(true, "StartPreview"); + ShowExceptionMessage(e); + } +} + +void BasicCapture::btnTakePhoto_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e) +{ + try + { + ShowStatusMessage("Taking photo"); + EnableButton(false, "TakePhoto"); + + task(KnownFolders::PicturesLibrary->CreateFileAsync(PHOTO_FILE_NAME, Windows::Storage::CreationCollisionOption::GenerateUniqueName)).then([this](task getFileTask) + { + try + { + this->m_photoStorageFile = getFileTask.get(); + ShowStatusMessage("Create photo file successful"); + ImageEncodingProperties^ imageProperties = ImageEncodingProperties::CreateJpeg(); + + create_task(m_mediaCaptureMgr->CapturePhotoToStorageFileAsync(imageProperties, this->m_photoStorageFile)).then([this](task photoTask) + { + try + { + photoTask.get(); + EnableButton(true, "TakePhoto"); + ShowStatusMessage("Photo taken"); + + task(this->m_photoStorageFile->OpenAsync(FileAccessMode::Read)).then([this](task getStreamTask) + { + try + { + auto photoStream = getStreamTask.get(); + ShowStatusMessage("File open successful"); + auto bmpimg = ref new BitmapImage(); + + bmpimg->SetSource(photoStream); + imageElement1->Source = bmpimg; + } + catch (Exception^ e) + { + ShowExceptionMessage(e); + EnableButton(true, "TakePhoto"); + } + }); + } + catch (Platform::Exception ^ e) + { + ShowExceptionMessage(e); + EnableButton(true, "TakePhoto"); + } + }); + } + catch (Exception^ e) + { + ShowExceptionMessage(e); + 
EnableButton(true, "TakePhoto"); + } + }); + } + catch (Platform::Exception^ e) + { + ShowExceptionMessage(e); + EnableButton(true, "TakePhoto"); + } +} + +void BasicCapture::btnStartStopRecord_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e) +{ + try + { + String ^fileName; + EnableButton(false, "StartStopRecord"); + + if (!m_bRecording) + { + ShowStatusMessage("Starting Record"); + + fileName = VIDEO_FILE_NAME; + + task(KnownFolders::VideosLibrary->CreateFileAsync(fileName,Windows::Storage::CreationCollisionOption::GenerateUniqueName )).then([this](task fileTask) + { + try + { + this->m_recordStorageFile = fileTask.get(); + ShowStatusMessage("Create record file successful"); + + MediaEncodingProfile^ recordProfile= nullptr; + recordProfile = MediaEncodingProfile::CreateMp4(Windows::Media::MediaProperties::VideoEncodingQuality::Auto); + + create_task(m_mediaCaptureMgr->StartRecordToStorageFileAsync(recordProfile, this->m_recordStorageFile)).then([this](task recordTask) + { + try + { + recordTask.get(); + m_bRecording = true; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + + ShowStatusMessage("Start Record successful"); + } + catch (Exception ^e) + { + ShowExceptionMessage(e); + } + }); + } + catch (Exception ^e) + { + m_bRecording = false; + SwitchRecordButtonContent(); + EnableButton(true, "StartStopRecord"); + ShowExceptionMessage(e); + } + } + ); + } + else + { + ShowStatusMessage("Stopping Record"); + + create_task(m_mediaCaptureMgr->StopRecordAsync()).then([this](task recordTask) + { + try + { + recordTask.get(); + m_bRecording = false; + EnableButton(true, "StartStopRecord"); + SwitchRecordButtonContent(); + + ShowStatusMessage("Stop record successful"); + if (!m_bSuspended) + { + task(this->m_recordStorageFile->OpenAsync(FileAccessMode::Read)).then([this](task streamTask) + { + try + { + auto stream = streamTask.get(); + ShowStatusMessage("Record file opened"); + ShowStatusMessage(this->m_recordStorageFile->Path); + playbackElement1->AutoPlay = true; + playbackElement1->SetSource(stream, this->m_recordStorageFile->FileType); + playbackElement1->Play(); + } + catch (Exception ^e) + { + ShowExceptionMessage(e); + m_bRecording = false; + EnableButton(true, "StartStopRecord"); + SwitchRecordButtonContent(); + } + }); + } + } + catch (Exception ^e) + { + m_bRecording = false; + EnableButton(true, "StartStopRecord"); + SwitchRecordButtonContent(); + ShowExceptionMessage(e); + } + }); + } + } + catch (Platform::Exception^ e) + { + EnableButton(true, "StartStopRecord"); + ShowExceptionMessage(e); + SwitchRecordButtonContent(); + m_bRecording = false; + } +} + +void BasicCapture::SetupVideoDeviceControl(Windows::Media::Devices::MediaDeviceControl^ videoDeviceControl, Slider^ slider) +{ + try + { + if ((videoDeviceControl->Capabilities)->Supported) + { + slider->IsEnabled = true; + slider->Maximum = videoDeviceControl->Capabilities->Max; + slider->Minimum = videoDeviceControl->Capabilities->Min; + slider->StepFrequency = videoDeviceControl->Capabilities->Step; + double controlValue = 0; + if (videoDeviceControl->TryGetValue(&controlValue)) + { + slider->Value = controlValue; + } + } + else + { + slider->IsEnabled = false; + } + } + catch (Platform::Exception^ e) + { + ShowExceptionMessage(e); + } +} + +// VideoDeviceControllers +void BasicCapture::sldBrightness_ValueChanged(Platform::Object^ sender, Windows::UI::Xaml::Controls::Primitives::RangeBaseValueChangedEventArgs^ e) +{ + bool succeeded = 
m_mediaCaptureMgr->VideoDeviceController->Brightness->TrySetValue(sldBrightness->Value); + if (!succeeded) + { + ShowStatusMessage("Set Brightness failed"); + } +} + +void BasicCapture::sldContrast_ValueChanged(Platform::Object^ sender, Windows::UI::Xaml::Controls::Primitives::RangeBaseValueChangedEventArgs ^e) +{ + bool succeeded = m_mediaCaptureMgr->VideoDeviceController->Contrast->TrySetValue(sldContrast->Value); + if (!succeeded) + { + ShowStatusMessage("Set Contrast failed"); + } +} + +void BasicCapture::ShowStatusMessage(Platform::String^ text) +{ + rootPage->NotifyUser(text, NotifyType::StatusMessage); +} + +void BasicCapture::ShowExceptionMessage(Platform::Exception^ ex) +{ + rootPage->NotifyUser(ex->Message, NotifyType::ErrorMessage); +} + +void BasicCapture::SwitchRecordButtonContent() +{ + if (m_bRecording) + { + btnStartStopRecord1->Content="StopRecord"; + } + else + { + btnStartStopRecord1->Content="StartRecord"; + } +} +void BasicCapture::EnableButton(bool enabled, String^ name) +{ + if (name->Equals("StartDevice")) + { + btnStartDevice1->IsEnabled = enabled; + } + else if (name->Equals("StartPreview")) + { + btnStartPreview1->IsEnabled = enabled; + } + else if (name->Equals("StartStopRecord")) + { + btnStartStopRecord1->IsEnabled = enabled; + } + else if (name->Equals("TakePhoto")) + { + btnTakePhoto1->IsEnabled = enabled; + } +} + diff --git a/samples/winrt/ImageManipulations/C++/BasicCapture.xaml.h b/samples/winrt/ImageManipulations/C++/BasicCapture.xaml.h new file mode 100644 index 000000000..28129efb7 --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/BasicCapture.xaml.h @@ -0,0 +1,88 @@ +//********************************************************* +// +// Copyright (c) Microsoft. All rights reserved. +// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY +// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR +// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. +// +//********************************************************* + +// +// BasicCapture.xaml.h +// Declaration of the BasicCapture class +// + +#pragma once + +#include "pch.h" +#include "BasicCapture.g.h" +#include "MainPage.xaml.h" + +using namespace Windows::UI::Xaml; +using namespace Windows::UI::Xaml::Controls; +using namespace Windows::Graphics::Display; +using namespace Windows::UI::ViewManagement; +using namespace Windows::Devices::Enumeration; +#define VIDEO_FILE_NAME "video.mp4" +#define PHOTO_FILE_NAME "photo.jpg" +namespace SDKSample +{ + namespace MediaCapture + { + /// + /// An empty page that can be used on its own or navigated to within a Frame. 
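// For illustration only -- a minimal sketch of the pattern used by
// SetupVideoDeviceControl and the slider handlers above. A camera property is
// exposed as a Windows::Media::Devices::MediaDeviceControl: its Capabilities
// describe the supported range, TryGetValue reads the current setting and
// TrySetValue writes a new one. The helper name BindControlToSlider is
// hypothetical and not part of this sample.
//
//     void BindControlToSlider(Windows::Media::Devices::MediaDeviceControl^ control,
//                              Windows::UI::Xaml::Controls::Slider^ slider)
//     {
//         auto caps = control->Capabilities;
//         slider->IsEnabled = caps->Supported;
//         if (!caps->Supported)
//         {
//             return;
//         }
//         slider->Minimum = caps->Min;
//         slider->Maximum = caps->Max;
//         slider->StepFrequency = caps->Step;
//
//         double current = 0;
//         if (control->TryGetValue(&current))       // read the current device value
//         {
//             slider->Value = current;
//         }
//
//         // In the slider's ValueChanged handler, write the value back:
//         //     if (!control->TrySetValue(slider->Value)) { /* report failure */ }
//     }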
+ /// + [Windows::Foundation::Metadata::WebHostHidden] + public ref class BasicCapture sealed + { + public: + BasicCapture(); + + protected: + virtual void OnNavigatedTo(Windows::UI::Xaml::Navigation::NavigationEventArgs^ e) override; + virtual void OnNavigatedFrom(Windows::UI::Xaml::Navigation::NavigationEventArgs^ e) override; + + private: + MainPage^ rootPage; + void ScenarioInit(); + void ScenarioReset(); + + void Suspending(Object^ sender, Windows::ApplicationModel::SuspendingEventArgs^ e); + void Resuming(Object^ sender, Object^ e); + + void btnStartDevice_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); + void SoundLevelChanged(Object^ sender, Object^ e); + void RecordLimitationExceeded(Windows::Media::Capture::MediaCapture ^ mediaCapture); + void Failed(Windows::Media::Capture::MediaCapture ^ mediaCapture, Windows::Media::Capture::MediaCaptureFailedEventArgs ^ args); + + + void btnStartPreview_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); + + void btnStartStopRecord_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); + + void btnTakePhoto_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); + + void SetupVideoDeviceControl(Windows::Media::Devices::MediaDeviceControl^ videoDeviceControl, Slider^ slider); + void sldBrightness_ValueChanged(Platform::Object^ sender, Windows::UI::Xaml::Controls::Primitives::RangeBaseValueChangedEventArgs^ e); + void sldContrast_ValueChanged(Platform::Object^ sender, Windows::UI::Xaml::Controls::Primitives::RangeBaseValueChangedEventArgs^ e); + + void ShowStatusMessage(Platform::String^ text); + void ShowExceptionMessage(Platform::Exception^ ex); + + void EnableButton(bool enabled, Platform::String ^name); + void SwitchRecordButtonContent(); + + Platform::Agile m_mediaCaptureMgr; + Windows::Storage::StorageFile^ m_photoStorageFile; + Windows::Storage::StorageFile^ m_recordStorageFile; + bool m_bRecording; + bool m_bEffectAdded; + bool m_bSuspended; + bool m_bPreviewing; + Windows::UI::Xaml::WindowVisibilityChangedEventHandler ^m_visbilityHandler; + Windows::Foundation::EventRegistrationToken m_eventRegistrationToken; + bool m_currentScenarioLoaded; + }; + } +} diff --git a/samples/winrt/ImageManipulations/C++/Constants.cpp b/samples/winrt/ImageManipulations/C++/Constants.cpp new file mode 100644 index 000000000..873b98381 --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/Constants.cpp @@ -0,0 +1,24 @@ +//********************************************************* +// +// Copyright (c) Microsoft. All rights reserved. +// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY +// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR +// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. 
+// +//********************************************************* + +#include "pch.h" +#include "MainPage.xaml.h" +#include "Constants.h" + +using namespace SDKSample; + +Platform::Array^ MainPage::scenariosInner = ref new Platform::Array +{ + // The format here is the following: + // { "Description for the sample", "Fully quaified name for the class that implements the scenario" } + { "Video preview, record and take pictures", "SDKSample.MediaCapture.BasicCapture" }, + { "Enumerate cameras and add a video effect", "SDKSample.MediaCapture.AdvancedCapture" }, + { "Audio Capture", "SDKSample.MediaCapture.AudioCapture" } +}; diff --git a/samples/winrt/ImageManipulations/C++/Constants.h b/samples/winrt/ImageManipulations/C++/Constants.h new file mode 100644 index 000000000..917f66487 --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/Constants.h @@ -0,0 +1,45 @@ +//********************************************************* +// +// Copyright (c) Microsoft. All rights reserved. +// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY +// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR +// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. +// +//********************************************************* + +#pragma once + +#include +namespace SDKSample +{ + public value struct Scenario + { + Platform::String^ Title; + Platform::String^ ClassName; + }; + + partial ref class MainPage + { + public: + static property Platform::String^ FEATURE_NAME + { + Platform::String^ get() + { + return ref new Platform::String(L"MediaCapture CPP sample"); + } + } + + static property Platform::Array^ scenarios + { + Platform::Array^ get() + { + return scenariosInner; + } + } + private: + static Platform::Array^ scenariosInner; + }; + + +} diff --git a/samples/winrt/ImageManipulations/C++/MainPage.xaml b/samples/winrt/ImageManipulations/C++/MainPage.xaml new file mode 100644 index 000000000..d830e3cf0 --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/MainPage.xaml @@ -0,0 +1,166 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + 20,20,20,20 + + + + + + + + + diff --git a/samples/winrt/ImageManipulations/C++/MainPage.xaml.cpp b/samples/winrt/ImageManipulations/C++/MainPage.xaml.cpp new file mode 100644 index 000000000..070278191 --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/MainPage.xaml.cpp @@ -0,0 +1,315 @@ +//********************************************************* +// +// Copyright (c) Microsoft. All rights reserved. +// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY +// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR +// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. +// +//********************************************************* + +// +// MainPage.xaml.cpp +// Implementation of the MainPage.xaml class. 
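// For reference, the scenario table consumed below by PopulateScenarios and
// LoadScenario is the static array defined in Constants.h/Constants.cpp.
// Presumably declared as a Platform::Array<Scenario>^, it pairs each display
// title with the fully qualified name of the page class that LoadScenario
// navigates the hidden Frame to (sketch, not verbatim):
//
//     // Constants.h
//     static Platform::Array<Scenario>^ scenariosInner;
//
//     // Constants.cpp
//     Platform::Array<Scenario>^ MainPage::scenariosInner = ref new Platform::Array<Scenario>
//     {
//         { "Video preview, record and take pictures",  "SDKSample.MediaCapture.BasicCapture"    },
//         { "Enumerate cameras and add a video effect", "SDKSample.MediaCapture.AdvancedCapture" },
//         { "Audio Capture",                            "SDKSample.MediaCapture.AudioCapture"    }
//     };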
+// + +#include "pch.h" +#include "MainPage.xaml.h" +#include "App.xaml.h" + +#include + +using namespace Windows::UI::Xaml; +using namespace Windows::UI::Xaml::Controls; +using namespace Windows::Foundation; +using namespace Windows::Foundation::Collections; +using namespace Platform; +using namespace SDKSample; +using namespace Windows::UI::Xaml::Navigation; +using namespace Windows::UI::Xaml::Interop; +using namespace Windows::Graphics::Display; +using namespace Windows::UI::ViewManagement; + +MainPage^ MainPage::Current = nullptr; + +MainPage::MainPage() +{ + InitializeComponent(); + + // This frame is hidden, meaning it is never shown. It is simply used to load + // each scenario page and then pluck out the input and output sections and + // place them into the UserControls on the main page. + HiddenFrame = ref new Windows::UI::Xaml::Controls::Frame(); + HiddenFrame->Visibility = Windows::UI::Xaml::Visibility::Collapsed; + ContentRoot->Children->Append(HiddenFrame); + + FeatureName->Text = FEATURE_NAME; + + this->SizeChanged += ref new SizeChangedEventHandler(this, &MainPage::MainPage_SizeChanged); + Scenarios->SelectionChanged += ref new SelectionChangedEventHandler(this, &MainPage::Scenarios_SelectionChanged); + + MainPage::Current = this; + autoSizeInputSectionWhenSnapped = true; +} + +/// +/// We need to handle SizeChanged so that we can make the sample layout property +/// in the various layouts. +/// +/// +/// +void MainPage::MainPage_SizeChanged(Object^ sender, SizeChangedEventArgs^ e) +{ + InvalidateSize(); + MainPageSizeChangedEventArgs^ args = ref new MainPageSizeChangedEventArgs(); + args->ViewState = ApplicationView::Value; + MainPageResized(this, args); + +} + +void MainPage::InvalidateSize() +{ + // Get the window width + double windowWidth = this->ActualWidth; + + if (windowWidth != 0.0) + { + // Get the width of the ListBox. + double listBoxWidth = Scenarios->ActualWidth; + + // Is the ListBox using any margins that we need to consider? + double listBoxMarginLeft = Scenarios->Margin.Left; + double listBoxMarginRight = Scenarios->Margin.Right; + + // Figure out how much room is left after considering the list box width + double availableWidth = windowWidth - listBoxWidth; + + // Is the top most child using margins? + double layoutRootMarginLeft = ContentRoot->Margin.Left; + double layoutRootMarginRight = ContentRoot->Margin.Right; + + // We have different widths to use depending on the view state + if (ApplicationView::Value != ApplicationViewState::Snapped) + { + // Make us as big as the the left over space, factoring in the ListBox width, the ListBox margins. + // and the LayoutRoot's margins + InputSection->Width = ((availableWidth) - + (layoutRootMarginLeft + layoutRootMarginRight + listBoxMarginLeft + listBoxMarginRight)); + } + else + { + // Make us as big as the left over space, factoring in just the LayoutRoot's margins. + if (autoSizeInputSectionWhenSnapped) + { + InputSection->Width = (windowWidth - (layoutRootMarginLeft + layoutRootMarginRight)); + } + } + } + InvalidateViewState(); +} + +void MainPage::InvalidateViewState() +{ + // Are we going to snapped mode? 
+ if (ApplicationView::Value == ApplicationViewState::Snapped) + { + Grid::SetRow(DescriptionText, 3); + Grid::SetColumn(DescriptionText, 0); + + Grid::SetRow(InputSection, 4); + Grid::SetColumn(InputSection, 0); + + Grid::SetRow(FooterPanel, 2); + Grid::SetColumn(FooterPanel, 0); + } + else + { + Grid::SetRow(DescriptionText, 1); + Grid::SetColumn(DescriptionText, 1); + + Grid::SetRow(InputSection, 2); + Grid::SetColumn(InputSection, 1); + + Grid::SetRow(FooterPanel, 1); + Grid::SetColumn(FooterPanel, 1); + } + + // Since we don't load the scenario page in the traditional manner (we just pluck out the + // input and output sections from the page) we need to ensure that any VSM code used + // by the scenario's input and output sections is fired. + VisualStateManager::GoToState(InputSection, "Input" + LayoutAwarePage::DetermineVisualState(ApplicationView::Value), false); + VisualStateManager::GoToState(OutputSection, "Output" + LayoutAwarePage::DetermineVisualState(ApplicationView::Value), false); +} + +void MainPage::PopulateScenarios() +{ + ScenarioList = ref new Platform::Collections::Vector(); + + // Populate the ListBox with the list of scenarios as defined in Constants.cpp. + for (unsigned int i = 0; i < scenarios->Length; ++i) + { + Scenario s = scenarios[i]; + ListBoxItem^ item = ref new ListBoxItem(); + item->Name = s.ClassName; + item->Content = (i + 1).ToString() + ") " + s.Title; + ScenarioList->Append(item); + } + + // Bind the ListBox to the scenario list. + Scenarios->ItemsSource = ScenarioList; + Scenarios->ScrollIntoView(Scenarios->SelectedItem); +} + +/// +/// This method is responsible for loading the individual input and output sections for each scenario. This +/// is based on navigating a hidden Frame to the ScenarioX.xaml page and then extracting out the input +/// and output sections into the respective UserControl on the main page. +/// +/// +void MainPage::LoadScenario(String^ scenarioName) +{ + autoSizeInputSectionWhenSnapped = true; + + // Load the ScenarioX.xaml file into the Frame. + TypeName scenarioType = {scenarioName, TypeKind::Custom}; + HiddenFrame->Navigate(scenarioType, this); + + // Get the top element, the Page, so we can look up the elements + // that represent the input and output sections of the ScenarioX file. + Page^ hiddenPage = safe_cast(HiddenFrame->Content); + + // Get each element. + UIElement^ input = safe_cast(hiddenPage->FindName("Input")); + UIElement^ output = safe_cast(hiddenPage->FindName("Output")); + + if (input == nullptr) + { + // Malformed input section. + NotifyUser("Cannot load scenario input section for " + scenarioName + + " Make sure root of input section markup has x:Name of 'Input'", NotifyType::ErrorMessage); + return; + } + + if (output == nullptr) + { + // Malformed output section. + NotifyUser("Cannot load scenario output section for " + scenarioName + + " Make sure root of output section markup has x:Name of 'Output'", NotifyType::ErrorMessage); + return; + } + + // Find the LayoutRoot which parents the input and output sections in the main page. + Panel^ panel = safe_cast(hiddenPage->FindName("LayoutRoot")); + + if (panel != nullptr) + { + unsigned int index = 0; + UIElementCollection^ collection = panel->Children; + + // Get rid of the content that is currently in the intput and output sections. + collection->IndexOf(input, &index); + collection->RemoveAt(index); + + collection->IndexOf(output, &index); + collection->RemoveAt(index); + + // Populate the input and output sections with the newly loaded content. 
+ InputSection->Content = input; + OutputSection->Content = output; + + ScenarioLoaded(this, nullptr); + } + else + { + // Malformed Scenario file. + NotifyUser("Cannot load scenario: " + scenarioName + ". Make sure root tag in the '" + + scenarioName + "' file has an x:Name of 'LayoutRoot'", NotifyType::ErrorMessage); + } +} + +void MainPage::Scenarios_SelectionChanged(Object^ sender, SelectionChangedEventArgs^ e) +{ + if (Scenarios->SelectedItem != nullptr) + { + NotifyUser("", NotifyType::StatusMessage); + + LoadScenario((safe_cast(Scenarios->SelectedItem))->Name); + InvalidateSize(); + } +} + +void MainPage::NotifyUser(String^ strMessage, NotifyType type) +{ + switch (type) + { + case NotifyType::StatusMessage: + // Use the status message style. + StatusBlock->Style = safe_cast(this->Resources->Lookup("StatusStyle")); + break; + case NotifyType::ErrorMessage: + // Use the error message style. + StatusBlock->Style = safe_cast(this->Resources->Lookup("ErrorStyle")); + break; + default: + break; + } + StatusBlock->Text = strMessage; + + // Collapsed the StatusBlock if it has no text to conserve real estate. + if (StatusBlock->Text != "") + { + StatusBlock->Visibility = Windows::UI::Xaml::Visibility::Visible; + } + else + { + StatusBlock->Visibility = Windows::UI::Xaml::Visibility::Collapsed; + } +} + +void MainPage::Footer_Click(Object^ sender, RoutedEventArgs^ e) +{ + auto uri = ref new Uri((String^)((HyperlinkButton^)sender)->Tag); + Windows::System::Launcher::LaunchUriAsync(uri); +} + + +/// +/// Populates the page with content passed during navigation. Any saved state is also +/// provided when recreating a page from a prior session. +/// +/// The parameter value passed to +/// when this page was initially requested. +/// +/// A map of state preserved by this page during an earlier +/// session. This will be null the first time a page is visited. +void MainPage::LoadState(Object^ navigationParameter, IMap^ pageState) +{ + (void) navigationParameter; // Unused parameter + + PopulateScenarios(); + + // Starting scenario is the first or based upon a previous state. + ListBoxItem^ startingScenario = nullptr; + int startingScenarioIndex = -1; + + if (pageState != nullptr && pageState->HasKey("SelectedScenarioIndex")) + { + startingScenarioIndex = safe_cast(pageState->Lookup("SelectedScenarioIndex")); + } + + Scenarios->SelectedIndex = startingScenarioIndex != -1 ? startingScenarioIndex : 0; + + InvalidateViewState(); +} + +/// +/// Preserves state associated with this page in case the application is suspended or the +/// page is discarded from the navigation cache. Values must conform to the serialization +/// requirements of . +/// +/// An empty map to be populated with serializable state. +void MainPage::SaveState(IMap^ pageState) +{ + int selectedListBoxItemIndex = Scenarios->SelectedIndex; + pageState->Insert("SelectedScenarioIndex", selectedListBoxItemIndex); +} diff --git a/samples/winrt/ImageManipulations/C++/MainPage.xaml.h b/samples/winrt/ImageManipulations/C++/MainPage.xaml.h new file mode 100644 index 000000000..36fb7796a --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/MainPage.xaml.h @@ -0,0 +1,105 @@ +//********************************************************* +// +// Copyright (c) Microsoft. All rights reserved. +// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY +// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR +// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. 
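// For illustration only -- the LoadState/SaveState pair above round-trips the
// selected scenario through the page-state map (presumably an
// IMap<Platform::String^, Platform::Object^>^) that LayoutAwarePage persists
// across suspension:
//
//     // SaveState: remember the selection.
//     pageState->Insert("SelectedScenarioIndex", Scenarios->SelectedIndex);
//
//     // LoadState: restore it, defaulting to the first scenario.
//     int index = 0;
//     if (pageState != nullptr && pageState->HasKey("SelectedScenarioIndex"))
//     {
//         index = safe_cast<int>(pageState->Lookup("SelectedScenarioIndex"));
//     }
//     Scenarios->SelectedIndex = index;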
+// +//********************************************************* + +// +// MainPage.xaml.h +// Declaration of the MainPage.xaml class. +// + +#pragma once + +#include "pch.h" +#include "MainPage.g.h" +#include "Common\LayoutAwarePage.h" // Required by generated header +#include "Constants.h" + +namespace SDKSample +{ + public enum class NotifyType + { + StatusMessage, + ErrorMessage + }; + + public ref class MainPageSizeChangedEventArgs sealed + { + public: + property Windows::UI::ViewManagement::ApplicationViewState ViewState + { + Windows::UI::ViewManagement::ApplicationViewState get() + { + return viewState; + } + + void set(Windows::UI::ViewManagement::ApplicationViewState value) + { + viewState = value; + } + } + + private: + Windows::UI::ViewManagement::ApplicationViewState viewState; + }; + + public ref class MainPage sealed + { + public: + MainPage(); + + protected: + virtual void LoadState(Platform::Object^ navigationParameter, + Windows::Foundation::Collections::IMap^ pageState) override; + virtual void SaveState(Windows::Foundation::Collections::IMap^ pageState) override; + + internal: + property bool AutoSizeInputSectionWhenSnapped + { + bool get() + { + return autoSizeInputSectionWhenSnapped; + } + + void set(bool value) + { + autoSizeInputSectionWhenSnapped = value; + } + } + + property Windows::ApplicationModel::Activation::LaunchActivatedEventArgs^ LaunchArgs + { + Windows::ApplicationModel::Activation::LaunchActivatedEventArgs^ get() + { + return safe_cast(App::Current)->LaunchArgs; + } + } + + void NotifyUser(Platform::String^ strMessage, NotifyType type); + void LoadScenario(Platform::String^ scenarioName); + event Windows::Foundation::EventHandler^ ScenarioLoaded; + event Windows::Foundation::EventHandler^ MainPageResized; + + private: + void PopulateScenarios(); + void InvalidateSize(); + void InvalidateViewState(); + + Platform::Collections::Vector^ ScenarioList; + Windows::UI::Xaml::Controls::Frame^ HiddenFrame; + void Footer_Click(Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); + bool autoSizeInputSectionWhenSnapped; + + void MainPage_SizeChanged(Object^ sender, Windows::UI::Xaml::SizeChangedEventArgs^ e); + void Scenarios_SelectionChanged(Object^ sender, Windows::UI::Xaml::Controls::SelectionChangedEventArgs^ e); + + internal: + static MainPage^ Current; + + }; +} diff --git a/samples/winrt/ImageManipulations/C++/MediaCapture.sln b/samples/winrt/ImageManipulations/C++/MediaCapture.sln new file mode 100644 index 000000000..7b99bce31 --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/MediaCapture.sln @@ -0,0 +1,52 @@ + +Microsoft Visual Studio Solution File, Format Version 12.00 +# Visual Studio 11 Express for Windows 8 +Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "MediaCapture", "MediaCapture.vcxproj", "{C5B886A7-8300-46FF-B533-9613DE2AF637}" +EndProject +Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "GrayscaleTransform", "MediaExtensions\Grayscale\Grayscale.vcxproj", "{BA69218F-DA5C-4D14-A78D-21A9E4DEC669}" +EndProject +Global + GlobalSection(SolutionConfigurationPlatforms) = preSolution + Debug|ARM = Debug|ARM + Debug|Win32 = Debug|Win32 + Debug|x64 = Debug|x64 + Release|ARM = Release|ARM + Release|Win32 = Release|Win32 + Release|x64 = Release|x64 + EndGlobalSection + GlobalSection(ProjectConfigurationPlatforms) = postSolution + {BA69218F-DA5C-4D14-A78D-21A9E4DEC669}.Debug|ARM.ActiveCfg = Debug|ARM + {BA69218F-DA5C-4D14-A78D-21A9E4DEC669}.Debug|ARM.Build.0 = Debug|ARM + {BA69218F-DA5C-4D14-A78D-21A9E4DEC669}.Debug|Win32.ActiveCfg = 
Debug|Win32 + {BA69218F-DA5C-4D14-A78D-21A9E4DEC669}.Debug|Win32.Build.0 = Debug|Win32 + {BA69218F-DA5C-4D14-A78D-21A9E4DEC669}.Debug|x64.ActiveCfg = Debug|x64 + {BA69218F-DA5C-4D14-A78D-21A9E4DEC669}.Debug|x64.Build.0 = Debug|x64 + {BA69218F-DA5C-4D14-A78D-21A9E4DEC669}.Release|ARM.ActiveCfg = Release|ARM + {BA69218F-DA5C-4D14-A78D-21A9E4DEC669}.Release|ARM.Build.0 = Release|ARM + {BA69218F-DA5C-4D14-A78D-21A9E4DEC669}.Release|Win32.ActiveCfg = Release|Win32 + {BA69218F-DA5C-4D14-A78D-21A9E4DEC669}.Release|Win32.Build.0 = Release|Win32 + {BA69218F-DA5C-4D14-A78D-21A9E4DEC669}.Release|x64.ActiveCfg = Release|x64 + {BA69218F-DA5C-4D14-A78D-21A9E4DEC669}.Release|x64.Build.0 = Release|x64 + {C5B886A7-8300-46FF-B533-9613DE2AF637}.Debug|ARM.ActiveCfg = Debug|ARM + {C5B886A7-8300-46FF-B533-9613DE2AF637}.Debug|ARM.Build.0 = Debug|ARM + {C5B886A7-8300-46FF-B533-9613DE2AF637}.Debug|ARM.Deploy.0 = Debug|ARM + {C5B886A7-8300-46FF-B533-9613DE2AF637}.Debug|Win32.ActiveCfg = Debug|Win32 + {C5B886A7-8300-46FF-B533-9613DE2AF637}.Debug|Win32.Build.0 = Debug|Win32 + {C5B886A7-8300-46FF-B533-9613DE2AF637}.Debug|Win32.Deploy.0 = Debug|Win32 + {C5B886A7-8300-46FF-B533-9613DE2AF637}.Debug|x64.ActiveCfg = Debug|x64 + {C5B886A7-8300-46FF-B533-9613DE2AF637}.Debug|x64.Build.0 = Debug|x64 + {C5B886A7-8300-46FF-B533-9613DE2AF637}.Debug|x64.Deploy.0 = Debug|x64 + {C5B886A7-8300-46FF-B533-9613DE2AF637}.Release|ARM.ActiveCfg = Release|ARM + {C5B886A7-8300-46FF-B533-9613DE2AF637}.Release|ARM.Build.0 = Release|ARM + {C5B886A7-8300-46FF-B533-9613DE2AF637}.Release|ARM.Deploy.0 = Release|ARM + {C5B886A7-8300-46FF-B533-9613DE2AF637}.Release|Win32.ActiveCfg = Release|Win32 + {C5B886A7-8300-46FF-B533-9613DE2AF637}.Release|Win32.Build.0 = Release|Win32 + {C5B886A7-8300-46FF-B533-9613DE2AF637}.Release|Win32.Deploy.0 = Release|Win32 + {C5B886A7-8300-46FF-B533-9613DE2AF637}.Release|x64.ActiveCfg = Release|x64 + {C5B886A7-8300-46FF-B533-9613DE2AF637}.Release|x64.Build.0 = Release|x64 + {C5B886A7-8300-46FF-B533-9613DE2AF637}.Release|x64.Deploy.0 = Release|x64 + EndGlobalSection + GlobalSection(SolutionProperties) = preSolution + HideSolutionNode = FALSE + EndGlobalSection +EndGlobal diff --git a/samples/winrt/ImageManipulations/C++/MediaCapture.vcxproj b/samples/winrt/ImageManipulations/C++/MediaCapture.vcxproj new file mode 100644 index 000000000..d2f255d1b --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/MediaCapture.vcxproj @@ -0,0 +1,200 @@ + + + + + Debug + ARM + + + Debug + Win32 + + + Debug + x64 + + + Release + ARM + + + Release + Win32 + + + Release + x64 + + + + {C5B886A7-8300-46FF-B533-9613DE2AF637} + SDKSample + en-US + $(VCTargetsPath11) + 11.0 + true + MediaCapture + + + + Application + true + v110 + + + Application + true + v110 + + + Application + true + v110 + + + Application + false + true + v110 + + + Application + false + true + v110 + + + Application + false + true + v110 + + + + + + + + + + + + + + + + + + + + + + + + + + pch.h + + + + + AdvancedCapture.xaml + Code + + + AudioCapture.xaml + Code + + + BasicCapture.xaml + Code + + + + MainPage.xaml + + + + + + App.xaml + + + + + Designer + + + Designer + + + Designer + + + Designer + + + Designer + + + Designer + + + Designer + + + + + Designer + + + + + AdvancedCapture.xaml + Code + + + App.xaml + + + AudioCapture.xaml + Code + + + BasicCapture.xaml + Code + + + + + + MainPage.xaml + + + Create + Create + Create + Create + Create + Create + + + + + + + + + + + + + + + {ba69218f-da5c-4d14-a78d-21a9e4dec669} + + + + + + \ No newline at end of file diff 
--git a/samples/winrt/ImageManipulations/C++/MediaCapture.vcxproj.filters b/samples/winrt/ImageManipulations/C++/MediaCapture.vcxproj.filters new file mode 100644 index 000000000..5f6124c2b --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/MediaCapture.vcxproj.filters @@ -0,0 +1,88 @@ + + + + + Assets + + + Assets + + + Assets + + + Assets + + + Assets + + + Assets + + + Assets + + + Assets + + + + + + + + + + + + Common + + + Sample-Utils + + + + + + + + + + + Common + + + Common + + + + + + + + + + + + Common + + + Common + + + + + + + + + {132eec18-b164-4b15-a746-643880e9c5d9} + + + {476b4177-f316-4458-8e13-cab3dc2381c5} + + + {54f287f8-e4cb-4f47-97d0-4c469de6992e} + + + \ No newline at end of file diff --git a/samples/winrt/ImageManipulations/C++/MediaExtensions/Common/AsyncCB.h b/samples/winrt/ImageManipulations/C++/MediaExtensions/Common/AsyncCB.h new file mode 100644 index 000000000..04ff69ed8 --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/MediaExtensions/Common/AsyncCB.h @@ -0,0 +1,81 @@ +#pragma once + +////////////////////////////////////////////////////////////////////////// +// AsyncCallback [template] +// +// Description: +// Helper class that routes IMFAsyncCallback::Invoke calls to a class +// method on the parent class. +// +// Usage: +// Add this class as a member variable. In the parent class constructor, +// initialize the AsyncCallback class like this: +// m_cb(this, &CYourClass::OnInvoke) +// where +// m_cb = AsyncCallback object +// CYourClass = parent class +// OnInvoke = Method in the parent class to receive Invoke calls. +// +// The parent's OnInvoke method (you can name it anything you like) must +// have a signature that matches the InvokeFn typedef below. +////////////////////////////////////////////////////////////////////////// + +// T: Type of the parent object +template +class AsyncCallback : public IMFAsyncCallback +{ +public: + typedef HRESULT (T::*InvokeFn)(IMFAsyncResult *pAsyncResult); + + AsyncCallback(T *pParent, InvokeFn fn) : m_pParent(pParent), m_pInvokeFn(fn) + { + } + + // IUnknown + STDMETHODIMP_(ULONG) AddRef() { + // Delegate to parent class. + return m_pParent->AddRef(); + } + STDMETHODIMP_(ULONG) Release() { + // Delegate to parent class. + return m_pParent->Release(); + } + STDMETHODIMP QueryInterface(REFIID iid, void** ppv) + { + if (!ppv) + { + return E_POINTER; + } + if (iid == __uuidof(IUnknown)) + { + *ppv = static_cast(static_cast(this)); + } + else if (iid == __uuidof(IMFAsyncCallback)) + { + *ppv = static_cast(this); + } + else + { + *ppv = NULL; + return E_NOINTERFACE; + } + AddRef(); + return S_OK; + } + + + // IMFAsyncCallback methods + STDMETHODIMP GetParameters(DWORD*, DWORD*) + { + // Implementation of this method is optional. + return E_NOTIMPL; + } + + STDMETHODIMP Invoke(IMFAsyncResult* pAsyncResult) + { + return (m_pParent->*m_pInvokeFn)(pAsyncResult); + } + + T *m_pParent; + InvokeFn m_pInvokeFn; +}; diff --git a/samples/winrt/ImageManipulations/C++/MediaExtensions/Common/BufferLock.h b/samples/winrt/ImageManipulations/C++/MediaExtensions/Common/BufferLock.h new file mode 100644 index 000000000..92de15eac --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/MediaExtensions/Common/BufferLock.h @@ -0,0 +1,102 @@ +// THIS CODE AND INFORMATION IS PROVIDED "AS IS" WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO +// THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A +// PARTICULAR PURPOSE. +// +// Copyright (c) Microsoft Corporation. 
All rights reserved + + +#pragma once + + +////////////////////////////////////////////////////////////////////////// +// VideoBufferLock +// +// Description: +// Locks a video buffer that might or might not support IMF2DBuffer. +// +////////////////////////////////////////////////////////////////////////// + +class VideoBufferLock +{ +public: + VideoBufferLock(IMFMediaBuffer *pBuffer) : m_p2DBuffer(NULL) + { + m_pBuffer = pBuffer; + m_pBuffer->AddRef(); + + // Query for the 2-D buffer interface. OK if this fails. + m_pBuffer->QueryInterface(IID_PPV_ARGS(&m_p2DBuffer)); + } + + ~VideoBufferLock() + { + UnlockBuffer(); + SafeRelease(&m_pBuffer); + SafeRelease(&m_p2DBuffer); + } + + // LockBuffer: + // Locks the buffer. Returns a pointer to scan line 0 and returns the stride. + + // The caller must provide the default stride as an input parameter, in case + // the buffer does not expose IMF2DBuffer. You can calculate the default stride + // from the media type. + + HRESULT LockBuffer( + LONG lDefaultStride, // Minimum stride (with no padding). + DWORD dwHeightInPixels, // Height of the image, in pixels. + BYTE **ppbScanLine0, // Receives a pointer to the start of scan line 0. + LONG *plStride // Receives the actual stride. + ) + { + HRESULT hr = S_OK; + + // Use the 2-D version if available. + if (m_p2DBuffer) + { + hr = m_p2DBuffer->Lock2D(ppbScanLine0, plStride); + } + else + { + // Use non-2D version. + BYTE *pData = NULL; + + hr = m_pBuffer->Lock(&pData, NULL, NULL); + if (SUCCEEDED(hr)) + { + *plStride = lDefaultStride; + if (lDefaultStride < 0) + { + // Bottom-up orientation. Return a pointer to the start of the + // last row *in memory* which is the top row of the image. + *ppbScanLine0 = pData + abs(lDefaultStride) * (dwHeightInPixels - 1); + } + else + { + // Top-down orientation. Return a pointer to the start of the + // buffer. + *ppbScanLine0 = pData; + } + } + } + return hr; + } + + HRESULT UnlockBuffer() + { + if (m_p2DBuffer) + { + return m_p2DBuffer->Unlock2D(); + } + else + { + return m_pBuffer->Unlock(); + } + } + +private: + IMFMediaBuffer *m_pBuffer; + IMF2DBuffer *m_p2DBuffer; +}; + diff --git a/samples/winrt/ImageManipulations/C++/MediaExtensions/Common/CritSec.h b/samples/winrt/ImageManipulations/C++/MediaExtensions/Common/CritSec.h new file mode 100644 index 000000000..d5ea05bfd --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/MediaExtensions/Common/CritSec.h @@ -0,0 +1,62 @@ +#pragma once + +////////////////////////////////////////////////////////////////////////// +// CritSec +// Description: Wraps a critical section. +////////////////////////////////////////////////////////////////////////// + +class CritSec +{ +public: + CRITICAL_SECTION m_criticalSection; +public: + CritSec() + { + InitializeCriticalSectionEx(&m_criticalSection, 100, 0); + } + + ~CritSec() + { + DeleteCriticalSection(&m_criticalSection); + } + + _Acquires_lock_(m_criticalSection) + void Lock() + { + EnterCriticalSection(&m_criticalSection); + } + + _Releases_lock_(m_criticalSection) + void Unlock() + { + LeaveCriticalSection(&m_criticalSection); + } +}; + + +////////////////////////////////////////////////////////////////////////// +// AutoLock +// Description: Provides automatic locking and unlocking of a +// of a critical section. +// +// Note: The AutoLock object must go out of scope before the CritSec. 
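// Typical usage (illustrative sketch; CMyObject and m_critSec are hypothetical):
//
//     class CMyObject
//     {
//         CritSec m_critSec;
//
//         void DoWork()
//         {
//             AutoLock lock(m_critSec);   // EnterCriticalSection in the constructor
//             // ... code protected by m_critSec ...
//         }                               // LeaveCriticalSection when 'lock' goes out of scope
//     };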
+////////////////////////////////////////////////////////////////////////// + +class AutoLock +{ +private: + CritSec *m_pCriticalSection; +public: + _Acquires_lock_(m_pCriticalSection) + AutoLock(CritSec& crit) + { + m_pCriticalSection = &crit; + m_pCriticalSection->Lock(); + } + + _Releases_lock_(m_pCriticalSection) + ~AutoLock() + { + m_pCriticalSection->Unlock(); + } +}; diff --git a/samples/winrt/ImageManipulations/C++/MediaExtensions/Common/LinkList.h b/samples/winrt/ImageManipulations/C++/MediaExtensions/Common/LinkList.h new file mode 100644 index 000000000..c67c0f2ca --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/MediaExtensions/Common/LinkList.h @@ -0,0 +1,516 @@ +//----------------------------------------------------------------------------- +// File: Linklist.h +// Desc: Linked list class. +// +// THIS CODE AND INFORMATION IS PROVIDED "AS IS" WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO +// THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A +// PARTICULAR PURPOSE. +// +// Copyright (C) Microsoft Corporation. All rights reserved. +//----------------------------------------------------------------------------- + +#pragma once + +// Notes: +// +// The List class template implements a simple double-linked list. +// It uses STL's copy semantics. + +// There are two versions of the Clear() method: +// Clear(void) clears the list w/out cleaning up the object. +// Clear(FN fn) takes a functor object that releases the objects, if they need cleanup. + +// The List class supports enumeration. Example of usage: +// +// List::POSIITON pos = list.GetFrontPosition(); +// while (pos != list.GetEndPosition()) +// { +// T item; +// hr = list.GetItemPos(&item); +// pos = list.Next(pos); +// } + +// The ComPtrList class template derives from List<> and implements a list of COM pointers. + +template +struct NoOp +{ + void operator()(T& t) + { + } +}; + +template +class List +{ +protected: + + // Nodes in the linked list + struct Node + { + Node *prev; + Node *next; + T item; + + Node() : prev(nullptr), next(nullptr) + { + } + + Node(T item) : prev(nullptr), next(nullptr) + { + this->item = item; + } + + T Item() const { return item; } + }; + +public: + + // Object for enumerating the list. + class POSITION + { + friend class List; + + public: + POSITION() : pNode(nullptr) + { + } + + bool operator==(const POSITION &p) const + { + return pNode == p.pNode; + } + + bool operator!=(const POSITION &p) const + { + return pNode != p.pNode; + } + + private: + const Node *pNode; + + POSITION(Node *p) : pNode(p) + { + } + }; + +protected: + Node m_anchor; // Anchor node for the linked list. + DWORD m_count; // Number of items in the list. + + Node* Front() const + { + return m_anchor.next; + } + + Node* Back() const + { + return m_anchor.prev; + } + + virtual HRESULT InsertAfter(T item, Node *pBefore) + { + if (pBefore == nullptr) + { + return E_POINTER; + } + + Node *pNode = new Node(item); + if (pNode == nullptr) + { + return E_OUTOFMEMORY; + } + + Node *pAfter = pBefore->next; + + pBefore->next = pNode; + pAfter->prev = pNode; + + pNode->prev = pBefore; + pNode->next = pAfter; + + m_count++; + + return S_OK; + } + + virtual HRESULT GetItem(const Node *pNode, T* ppItem) + { + if (pNode == nullptr || ppItem == nullptr) + { + return E_POINTER; + } + + *ppItem = pNode->item; + return S_OK; + } + + // RemoveItem: + // Removes a node and optionally returns the item. + // ppItem can be nullptr. 
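// Enumeration sketch using the member names declared in this class
// (FrontPosition, EndPosition, GetItemPos, Next):
//
//     List<int> list;
//     HRESULT hr = S_OK;
//     List<int>::POSITION pos = list.FrontPosition();
//     while (pos != list.EndPosition())
//     {
//         int item;
//         hr = list.GetItemPos(pos, &item);
//         pos = list.Next(pos);
//     }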
+ virtual HRESULT RemoveItem(Node *pNode, T *ppItem) + { + if (pNode == nullptr) + { + return E_POINTER; + } + + assert(pNode != &m_anchor); // We should never try to remove the anchor node. + if (pNode == &m_anchor) + { + return E_INVALIDARG; + } + + + T item; + + // The next node's previous is this node's previous. + pNode->next->prev = pNode->prev; + + // The previous node's next is this node's next. + pNode->prev->next = pNode->next; + + item = pNode->item; + delete pNode; + + m_count--; + + if (ppItem) + { + *ppItem = item; + } + + return S_OK; + } + +public: + + List() + { + m_anchor.next = &m_anchor; + m_anchor.prev = &m_anchor; + + m_count = 0; + } + + virtual ~List() + { + Clear(); + } + + // Insertion functions + HRESULT InsertBack(T item) + { + return InsertAfter(item, m_anchor.prev); + } + + + HRESULT InsertFront(T item) + { + return InsertAfter(item, &m_anchor); + } + + HRESULT InsertPos(POSITION pos, T item) + { + if (pos.pNode == nullptr) + { + return InsertBack(item); + } + + return InsertAfter(item, pos.pNode->prev); + } + + // RemoveBack: Removes the tail of the list and returns the value. + // ppItem can be nullptr if you don't want the item back. (But the method does not release the item.) + HRESULT RemoveBack(T *ppItem) + { + if (IsEmpty()) + { + return E_FAIL; + } + else + { + return RemoveItem(Back(), ppItem); + } + } + + // RemoveFront: Removes the head of the list and returns the value. + // ppItem can be nullptr if you don't want the item back. (But the method does not release the item.) + HRESULT RemoveFront(T *ppItem) + { + if (IsEmpty()) + { + return E_FAIL; + } + else + { + return RemoveItem(Front(), ppItem); + } + } + + // GetBack: Gets the tail item. + HRESULT GetBack(T *ppItem) + { + if (IsEmpty()) + { + return E_FAIL; + } + else + { + return GetItem(Back(), ppItem); + } + } + + // GetFront: Gets the front item. + HRESULT GetFront(T *ppItem) + { + if (IsEmpty()) + { + return E_FAIL; + } + else + { + return GetItem(Front(), ppItem); + } + } + + + // GetCount: Returns the number of items in the list. + DWORD GetCount() const { return m_count; } + + bool IsEmpty() const + { + return (GetCount() == 0); + } + + // Clear: Takes a functor object whose operator() + // frees the object on the list. + template + void Clear(FN& clear_fn) + { + Node *n = m_anchor.next; + + // Delete the nodes + while (n != &m_anchor) + { + clear_fn(n->item); + + Node *tmp = n->next; + delete n; + n = tmp; + } + + // Reset the anchor to point at itself + m_anchor.next = &m_anchor; + m_anchor.prev = &m_anchor; + + m_count = 0; + } + + // Clear: Clears the list. (Does not delete or release the list items.) + virtual void Clear() + { + NoOp clearOp; + Clear<>(clearOp); + } + + + // Enumerator functions + + POSITION FrontPosition() + { + if (IsEmpty()) + { + return POSITION(nullptr); + } + else + { + return POSITION(Front()); + } + } + + POSITION EndPosition() const + { + return POSITION(); + } + + HRESULT GetItemPos(POSITION pos, T *ppItem) + { + if (pos.pNode) + { + return GetItem(pos.pNode, ppItem); + } + else + { + return E_FAIL; + } + } + + POSITION Next(const POSITION pos) + { + if (pos.pNode && (pos.pNode->next != &m_anchor)) + { + return POSITION(pos.pNode->next); + } + else + { + return POSITION(nullptr); + } + } + + // Remove an item at a position. + // The item is returns in ppItem, unless ppItem is nullptr. + // NOTE: This method invalidates the POSITION object. + HRESULT Remove(POSITION& pos, T *ppItem) + { + if (pos.pNode) + { + // Remove const-ness temporarily... 
+ Node *pNode = const_cast(pos.pNode); + + pos = POSITION(); + + return RemoveItem(pNode, ppItem); + } + else + { + return E_INVALIDARG; + } + } + +}; + + + +// Typical functors for Clear method. + +// ComAutoRelease: Releases COM pointers. +// MemDelete: Deletes pointers to new'd memory. + +class ComAutoRelease +{ +public: + void operator()(IUnknown *p) + { + if (p) + { + p->Release(); + } + } +}; + +class MemDelete +{ +public: + void operator()(void *p) + { + if (p) + { + delete p; + } + } +}; + + +// ComPtrList class +// Derived class that makes it safer to store COM pointers in the List<> class. +// It automatically AddRef's the pointers that are inserted onto the list +// (unless the insertion method fails). +// +// T must be a COM interface type. +// example: ComPtrList +// +// NULLABLE: If true, client can insert nullptr pointers. This means GetItem can +// succeed but return a nullptr pointer. By default, the list does not allow nullptr +// pointers. + +template +class ComPtrList : public List +{ +public: + + typedef T* Ptr; + + void Clear() + { + ComAutoRelease car; + List::Clear(car); + } + + ~ComPtrList() + { + Clear(); + } + +protected: + HRESULT InsertAfter(Ptr item, Node *pBefore) + { + // Do not allow nullptr item pointers unless NULLABLE is true. + if (item == nullptr && !NULLABLE) + { + return E_POINTER; + } + + if (item) + { + item->AddRef(); + } + + HRESULT hr = List::InsertAfter(item, pBefore); + if (FAILED(hr) && item != nullptr) + { + item->Release(); + } + return hr; + } + + HRESULT GetItem(const Node *pNode, Ptr* ppItem) + { + Ptr pItem = nullptr; + + // The base class gives us the pointer without AddRef'ing it. + // If we return the pointer to the caller, we must AddRef(). + HRESULT hr = List::GetItem(pNode, &pItem); + if (SUCCEEDED(hr)) + { + assert(pItem || NULLABLE); + if (pItem) + { + *ppItem = pItem; + (*ppItem)->AddRef(); + } + } + return hr; + } + + HRESULT RemoveItem(Node *pNode, Ptr *ppItem) + { + // ppItem can be nullptr, but we need to get the + // item so that we can release it. + + // If ppItem is not nullptr, we will AddRef it on the way out. + + Ptr pItem = nullptr; + + HRESULT hr = List::RemoveItem(pNode, &pItem); + + if (SUCCEEDED(hr)) + { + assert(pItem || NULLABLE); + if (ppItem && pItem) + { + *ppItem = pItem; + (*ppItem)->AddRef(); + } + + if (pItem) + { + pItem->Release(); + pItem = nullptr; + } + } + + return hr; + } +}; diff --git a/samples/winrt/ImageManipulations/C++/MediaExtensions/Common/OpQueue.h b/samples/winrt/ImageManipulations/C++/MediaExtensions/Common/OpQueue.h new file mode 100644 index 000000000..dd0813be3 --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/MediaExtensions/Common/OpQueue.h @@ -0,0 +1,222 @@ +////////////////////////////////////////////////////////////////////////// +// +// OpQueue.h +// Async operation queue. +// +// THIS CODE AND INFORMATION IS PROVIDED "AS IS" WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO +// THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A +// PARTICULAR PURPOSE. +// +// Copyright (c) Microsoft Corporation. All rights reserved. +// +////////////////////////////////////////////////////////////////////////// + +#pragma once + +#pragma warning( push ) +#pragma warning( disable : 4355 ) // 'this' used in base member initializer list + +/* + This header file defines an object to help queue and serialize + asynchronous operations. 
+ + Background: + + To perform an operation asynchronously in Media Foundation, an object + does one of the following: + + 1. Calls MFPutWorkItem(Ex), using either a standard work queue + identifier or a caller-allocated work queue. The work-queue + thread invokes the object's callback. + + 2. Creates an async result object (IMFAsyncResult) and calls + MFInvokeCallback to invoke the object's callback. + + Ultimately, either of these cause the object's callback to be invoked + from a work-queue thread. The object can then complete the operation + inside the callback. + + However, the Media Foundation platform may dispatch async callbacks in + parallel on several threads. Putting an item on a work queue does NOT + guarantee that one operation will complete before the next one starts, + or even that work items will be dispatched in the same order they were + called. + + To serialize async operations that should not overlap, an object should + use a queue. While one operation is pending, subsequent operations are + put on the queue, and only dispatched after the previous operation is + complete. + + The granularity of a single "operation" depends on the requirements of + that particular object. A single operation might involve several + asynchronous calls before the object dispatches the next operation on + the queue. + + +*/ + + + +//------------------------------------------------------------------- +// OpQueue class template +// +// Base class for an async operation queue. +// +// TOperation: The class used to describe operations. This class must +// implement IUnknown. +// +// The OpQueue class is an abstract class. The derived class must +// implement the following pure-virtual methods: +// +// - IUnknown methods (AddRef, Release, QI) +// +// - DispatchOperation: +// +// Performs the asynchronous operation specified by pOp. +// +// At the end of each operation, the derived class must call +// ProcessQueue to process the next operation in the queue. +// +// NOTE: An operation is not required to complete inside the +// DispatchOperation method. A single operation might consist +// of several asynchronous method calls. +// +// - ValidateOperation: +// +// Checks whether the object can perform the operation specified +// by pOp at this time. +// +// If the object cannot perform the operation now (e.g., because +// another operation is still in progress) the method should +// return MF_E_NOTACCEPTING. +// +//------------------------------------------------------------------- +#include "linklist.h" +#include "AsyncCB.h" + +template +class OpQueue //: public IUnknown +{ +public: + + typedef ComPtrList OpList; + + HRESULT QueueOperation(TOperation *pOp); + +protected: + + HRESULT ProcessQueue(); + HRESULT ProcessQueueAsync(IMFAsyncResult *pResult); + + virtual HRESULT DispatchOperation(TOperation *pOp) = 0; + virtual HRESULT ValidateOperation(TOperation *pOp) = 0; + + OpQueue(CRITICAL_SECTION& critsec) + : m_OnProcessQueue(static_cast(this), &OpQueue::ProcessQueueAsync), + m_critsec(critsec) + { + } + + virtual ~OpQueue() + { + } + +protected: + OpList m_OpQueue; // Queue of operations. + CRITICAL_SECTION& m_critsec; // Protects the queue state. + AsyncCallback m_OnProcessQueue; // ProcessQueueAsync callback. +}; + + + +//------------------------------------------------------------------- +// Place an operation on the queue. +// Public method. 
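// A minimal sketch (hypothetical names) of a class derived from OpQueue, as
// described in the header comment above. The derived class supplies the
// IUnknown methods, decides in ValidateOperation whether an operation may run
// now (returning MF_E_NOTACCEPTING if not), and performs it in
// DispatchOperation, calling ProcessQueue() again when the operation
// completes. CMyOperation is assumed to implement IUnknown; CritSec is the
// wrapper from Common/CritSec.h.
//
//     class CMyMFT : public OpQueue<CMyMFT, CMyOperation>
//     {
//     public:
//         CMyMFT() : OpQueue(m_critSec.m_criticalSection), m_busy(false) { }
//
//         // IUnknown methods (AddRef, Release, QueryInterface) go here...
//
//     protected:
//         HRESULT ValidateOperation(CMyOperation *pOp)
//         {
//             return m_busy ? MF_E_NOTACCEPTING : S_OK;
//         }
//
//         HRESULT DispatchOperation(CMyOperation *pOp)
//         {
//             // Start the async work described by pOp. When it finishes,
//             // call ProcessQueue() to dispatch the next queued operation.
//             return S_OK;
//         }
//
//     private:
//         CritSec m_critSec;   // protects the operation queue
//         bool    m_busy;
//     };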
+//------------------------------------------------------------------- + +template +HRESULT OpQueue::QueueOperation(TOperation *pOp) +{ + HRESULT hr = S_OK; + + EnterCriticalSection(&m_critsec); + + hr = m_OpQueue.InsertBack(pOp); + if (SUCCEEDED(hr)) + { + hr = ProcessQueue(); + } + + LeaveCriticalSection(&m_critsec); + return hr; +} + + +//------------------------------------------------------------------- +// Process the next operation on the queue. +// Protected method. +// +// Note: This method dispatches the operation to a work queue. +//------------------------------------------------------------------- + +template +HRESULT OpQueue::ProcessQueue() +{ + HRESULT hr = S_OK; + if (m_OpQueue.GetCount() > 0) + { + hr = MFPutWorkItem2( + MFASYNC_CALLBACK_QUEUE_STANDARD, // Use the standard work queue. + 0, // Default priority + &m_OnProcessQueue, // Callback method. + nullptr // State object. + ); + } + return hr; +} + + +//------------------------------------------------------------------- +// Process the next operation on the queue. +// Protected method. +// +// Note: This method is called from a work-queue thread. +//------------------------------------------------------------------- + +template +HRESULT OpQueue::ProcessQueueAsync(IMFAsyncResult *pResult) +{ + HRESULT hr = S_OK; + TOperation *pOp = nullptr; + + EnterCriticalSection(&m_critsec); + + if (m_OpQueue.GetCount() > 0) + { + hr = m_OpQueue.GetFront(&pOp); + + if (SUCCEEDED(hr)) + { + hr = ValidateOperation(pOp); + } + if (SUCCEEDED(hr)) + { + hr = m_OpQueue.RemoveFront(nullptr); + } + if (SUCCEEDED(hr)) + { + (void)DispatchOperation(pOp); + } + } + + if (pOp != nullptr) + { + pOp->Release(); + } + + LeaveCriticalSection(&m_critsec); + return hr; +} + +#pragma warning( pop ) \ No newline at end of file diff --git a/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.cpp b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.cpp new file mode 100644 index 000000000..687386ece --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.cpp @@ -0,0 +1,1783 @@ +// THIS CODE AND INFORMATION IS PROVIDED "AS IS" WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO +// THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A +// PARTICULAR PURPOSE. +// +// Copyright (c) Microsoft Corporation. All rights reserved. + +#include "Grayscale.h" +#include "bufferlock.h" + +#pragma comment(lib, "d2d1") + +using namespace Microsoft::WRL; + +/* + +This sample implements a video effect as a Media Foundation transform (MFT). + +The video effect manipulates chroma values in a YUV image. In the default setting, +the entire image is converted to grayscale. Optionally, the application may set any +of the following attributes: + +MFT_GRAYSCALE_DESTINATION_RECT (type = blob, UINT32[4] array) + + Sets the destination rectangle for the effect. Pixels outside the destination + rectangle are not altered. + +MFT_GRAYSCALE_SATURATION (type = double) + + Sets the saturation level. The nominal range is [0...1]. Values beyond 1.0f + result in supersaturated colors. Values below 0.0f create inverted colors. + +MFT_GRAYSCALE_CHROMA_ROTATION (type = double) + + Rotates the chroma values of each pixel. The attribue value is the angle of + rotation in degrees. The result is a shift in hue. + +The effect is implemented by treating the chroma value of each pixel as a vector [u,v], +and applying a transformation matrix to the vector. 
+as a scaling transform.
+
+
+NOTES ON THE MFT IMPLEMENTATION
+
+1. The MFT has fixed streams: One input stream and one output stream.
+
+2. The MFT supports the following formats: UYVY, YUY2, NV12.
+
+3. If the MFT is holding an input sample, SetInputType and SetOutputType both fail.
+
+4. The input and output types must be identical.
+
+5. If both types are set, no type can be set until the current type is cleared.
+
+6. Preferred input types:
+
+     (a) If the output type is set, that's the preferred type.
+     (b) Otherwise, the preferred types are partial types, constructed from the
+         list of supported subtypes.
+
+7. Preferred output types: As above.
+
+8. Streaming:
+
+    The private BeginStreaming() method is called in response to the
+    MFT_MESSAGE_NOTIFY_BEGIN_STREAMING message.
+
+    If the client does not send MFT_MESSAGE_NOTIFY_BEGIN_STREAMING, the MFT calls
+    BeginStreaming inside the first call to ProcessInput or ProcessOutput.
+
+    This is a good approach for allocating resources that your MFT requires for
+    streaming.
+
+9. The configuration attributes are applied in the BeginStreaming method. If the
+   client changes the attributes during streaming, the change is ignored until
+   streaming is stopped (either by changing the media types or by sending the
+   MFT_MESSAGE_NOTIFY_END_STREAMING message) and then restarted.
+
+*/
+
+
+// Video FOURCC codes.
+const DWORD FOURCC_YUY2 = '2YUY';
+const DWORD FOURCC_UYVY = 'YVYU';
+const DWORD FOURCC_NV12 = '21VN';
+
+// Static array of media types (preferred and accepted).
+const GUID g_MediaSubtypes[] =
+{
+    MFVideoFormat_NV12,
+    MFVideoFormat_YUY2,
+    MFVideoFormat_UYVY
+};
+
+HRESULT GetImageSize(DWORD fcc, UINT32 width, UINT32 height, DWORD* pcbImage);
+HRESULT GetDefaultStride(IMFMediaType *pType, LONG *plStride);
+bool ValidateRect(const RECT& rc);
+
+template <class T>
+inline T clamp(const T& val, const T& minVal, const T& maxVal)
+{
+    return (val < minVal ? minVal : (val > maxVal ? maxVal : val));
+}
+
+
+// TransformChroma:
+// Apply the transforms to calculate the output chroma values.
+
+void TransformChroma(const D2D1::Matrix3x2F& mat, BYTE *pu, BYTE *pv)
+{
+    // Normalize the chroma values to [-112, 112] range
+
+    D2D1_POINT_2F pt = { static_cast<float>(*pu) - 128, static_cast<float>(*pv) - 128 };
+
+    pt = mat.TransformPoint(pt);
+
+    // Clamp to valid range. (clamp returns the clamped value, so the result
+    // must be assigned back.)
+    pt.x = clamp(pt.x, -112.0f, 112.0f);
+    pt.y = clamp(pt.y, -112.0f, 112.0f);
+
+    // Map back to [16...240] range.
+    *pu = static_cast<BYTE>(pt.x + 128.0f);
+    *pv = static_cast<BYTE>(pt.y + 128.0f);
+}
+
+//-------------------------------------------------------------------
+// Functions to convert a YUV image to grayscale.
+//
+// In all cases, the same transformation is applied to the 8-bit
+// chroma values, but the pixel layout in memory differs.
+//
+// The image conversion functions take the following parameters:
+//
+// mat               Transformation matrix for chroma values.
+// rcDest            Destination rectangle.
+// pDest             Pointer to the destination buffer.
+// lDestStride       Stride of the destination buffer, in bytes.
+// pSrc              Pointer to the source buffer.
+// lSrcStride        Stride of the source buffer, in bytes.
+// dwWidthInPixels   Frame width in pixels.
+// dwHeightInPixels  Frame height, in pixels.
+//-------------------------------------------------------------------
+
+// Convert UYVY image.
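+//
+// (Worked example of the chroma transform, which applies to all three
+// conversion functions below: with the default attribute values
+// (saturation 0.0, chroma rotation 0.0), BeginStreaming builds the matrix
+// Scale(0, 0) * Rotation(0), so every normalized chroma vector
+// (u - 128, v - 128) maps to (0, 0) and both chroma bytes are written back
+// as 128 -- a neutral, grayscale pixel. With saturation 1.0 and rotation 0.0
+// the matrix is the identity and the chroma values pass through unchanged.)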
+ +void TransformImage_UYVY( + const D2D1::Matrix3x2F& mat, + const D2D_RECT_U& rcDest, + _Inout_updates_(_Inexpressible_(lDestStride * dwHeightInPixels)) BYTE *pDest, + _In_ LONG lDestStride, + _In_reads_(_Inexpressible_(lSrcStride * dwHeightInPixels)) const BYTE* pSrc, + _In_ LONG lSrcStride, + _In_ DWORD dwWidthInPixels, + _In_ DWORD dwHeightInPixels) +{ + DWORD y = 0; + const DWORD y0 = min(rcDest.bottom, dwHeightInPixels); + + // Lines above the destination rectangle. + for ( ; y < rcDest.top; y++) + { + memcpy(pDest, pSrc, dwWidthInPixels * 2); + pSrc += lSrcStride; + pDest += lDestStride; + } + + // Lines within the destination rectangle. + for ( ; y < y0; y++) + { + WORD *pSrc_Pixel = (WORD*)pSrc; + WORD *pDest_Pixel = (WORD*)pDest; + + for (DWORD x = 0; (x + 1) < dwWidthInPixels; x += 2) + { + // Byte order is U0 Y0 V0 Y1 + // Each WORD is a byte pair (U/V, Y) + // Windows is little-endian so the order appears reversed. + + if (x >= rcDest.left && x < rcDest.right) + { + BYTE u = pSrc_Pixel[x] & 0x00FF; + BYTE v = pSrc_Pixel[x+1] & 0x00FF; + + TransformChroma(mat, &u, &v); + + pDest_Pixel[x] = (pSrc_Pixel[x] & 0xFF00) | u; + pDest_Pixel[x+1] = (pSrc_Pixel[x+1] & 0xFF00) | v; + } + else + { +#pragma warning(push) +#pragma warning(disable: 6385) +#pragma warning(disable: 6386) + pDest_Pixel[x] = pSrc_Pixel[x]; + pDest_Pixel[x+1] = pSrc_Pixel[x+1]; +#pragma warning(pop) + } + } + + pDest += lDestStride; + pSrc += lSrcStride; + } + + // Lines below the destination rectangle. + for ( ; y < dwHeightInPixels; y++) + { + memcpy(pDest, pSrc, dwWidthInPixels * 2); + pSrc += lSrcStride; + pDest += lDestStride; + } +} + + +// Convert YUY2 image. + +void TransformImage_YUY2( + const D2D1::Matrix3x2F& mat, + const D2D_RECT_U& rcDest, + _Inout_updates_(_Inexpressible_(lDestStride * dwHeightInPixels)) BYTE *pDest, + _In_ LONG lDestStride, + _In_reads_(_Inexpressible_(lSrcStride * dwHeightInPixels)) const BYTE* pSrc, + _In_ LONG lSrcStride, + _In_ DWORD dwWidthInPixels, + _In_ DWORD dwHeightInPixels) +{ + DWORD y = 0; + const DWORD y0 = min(rcDest.bottom, dwHeightInPixels); + + // Lines above the destination rectangle. + for ( ; y < rcDest.top; y++) + { + memcpy(pDest, pSrc, dwWidthInPixels * 2); + pSrc += lSrcStride; + pDest += lDestStride; + } + + // Lines within the destination rectangle. + for ( ; y < y0; y++) + { + WORD *pSrc_Pixel = (WORD*)pSrc; + WORD *pDest_Pixel = (WORD*)pDest; + + for (DWORD x = 0; (x + 1) < dwWidthInPixels; x += 2) + { + // Byte order is Y0 U0 Y1 V0 + // Each WORD is a byte pair (Y, U/V) + // Windows is little-endian so the order appears reversed. + + if (x >= rcDest.left && x < rcDest.right) + { + BYTE u = pSrc_Pixel[x] >> 8; + BYTE v = pSrc_Pixel[x+1] >> 8; + + TransformChroma(mat, &u, &v); + + pDest_Pixel[x] = (pSrc_Pixel[x] & 0x00FF) | (u<<8); + pDest_Pixel[x+1] = (pSrc_Pixel[x+1] & 0x00FF) | (v<<8); + } + else + { +#pragma warning(push) +#pragma warning(disable: 6385) +#pragma warning(disable: 6386) + pDest_Pixel[x] = pSrc_Pixel[x]; + pDest_Pixel[x+1] = pSrc_Pixel[x+1]; +#pragma warning(pop) + } + } + pDest += lDestStride; + pSrc += lSrcStride; + } + + // Lines below the destination rectangle. 
+ for ( ; y < dwHeightInPixels; y++) + { + memcpy(pDest, pSrc, dwWidthInPixels * 2); + pSrc += lSrcStride; + pDest += lDestStride; + } +} + +// Convert NV12 image + +void TransformImage_NV12( + const D2D1::Matrix3x2F& mat, + const D2D_RECT_U& rcDest, + _Inout_updates_(_Inexpressible_(2 * lDestStride * dwHeightInPixels)) BYTE *pDest, + _In_ LONG lDestStride, + _In_reads_(_Inexpressible_(2 * lSrcStride * dwHeightInPixels)) const BYTE* pSrc, + _In_ LONG lSrcStride, + _In_ DWORD dwWidthInPixels, + _In_ DWORD dwHeightInPixels) +{ + // NV12 is planar: Y plane, followed by packed U-V plane. + + // Y plane + for (DWORD y = 0; y < dwHeightInPixels; y++) + { + CopyMemory(pDest, pSrc, dwWidthInPixels); + pDest += lDestStride; + pSrc += lSrcStride; + } + + // U-V plane + + // NOTE: The U-V plane has 1/2 the number of lines as the Y plane. + + // Lines above the destination rectangle. + DWORD y = 0; + const DWORD y0 = min(rcDest.bottom, dwHeightInPixels); + + for ( ; y < rcDest.top/2; y++) + { + memcpy(pDest, pSrc, dwWidthInPixels); + pSrc += lSrcStride; + pDest += lDestStride; + } + + // Lines within the destination rectangle. + for ( ; y < y0/2; y++) + { + for (DWORD x = 0; (x + 1) < dwWidthInPixels; x += 2) + { + if (x >= rcDest.left && x < rcDest.right) + { + BYTE u = pSrc[x]; + BYTE v = pSrc[x+1]; + + TransformChroma(mat, &u, &v); + + pDest[x] = u; + pDest[x+1] = v; + } + else + { + pDest[x] = pSrc[x]; + pDest[x+1] = pSrc[x+1]; + } + } + pDest += lDestStride; + pSrc += lSrcStride; + } + + // Lines below the destination rectangle. + for ( ; y < dwHeightInPixels/2; y++) + { + memcpy(pDest, pSrc, dwWidthInPixels); + pSrc += lSrcStride; + pDest += lDestStride; + } +} + +CGrayscale::CGrayscale() : + m_pSample(NULL), m_pInputType(NULL), m_pOutputType(NULL), m_pTransformFn(NULL), + m_imageWidthInPixels(0), m_imageHeightInPixels(0), m_cbImageSize(0), + m_transform(D2D1::Matrix3x2F::Identity()), m_rcDest(D2D1::RectU()), m_bStreamingInitialized(false), + m_pAttributes(NULL) +{ + InitializeCriticalSectionEx(&m_critSec, 3000, 0); +} + +CGrayscale::~CGrayscale() +{ + SafeRelease(&m_pInputType); + SafeRelease(&m_pOutputType); + SafeRelease(&m_pSample); + SafeRelease(&m_pAttributes); + DeleteCriticalSection(&m_critSec); +} + +// Initialize the instance. +STDMETHODIMP CGrayscale::RuntimeClassInitialize() +{ + // Create the attribute store. + return MFCreateAttributes(&m_pAttributes, 3); +} + +// IMediaExtension methods + +//------------------------------------------------------------------- +// SetProperties +// Sets the configuration of the effect +//------------------------------------------------------------------- +HRESULT CGrayscale::SetProperties(ABI::Windows::Foundation::Collections::IPropertySet *pConfiguration) +{ + return S_OK; +} + +// IMFTransform methods. Refer to the Media Foundation SDK documentation for details. + +//------------------------------------------------------------------- +// GetStreamLimits +// Returns the minimum and maximum number of streams. +//------------------------------------------------------------------- + +HRESULT CGrayscale::GetStreamLimits( + DWORD *pdwInputMinimum, + DWORD *pdwInputMaximum, + DWORD *pdwOutputMinimum, + DWORD *pdwOutputMaximum +) +{ + if ((pdwInputMinimum == NULL) || + (pdwInputMaximum == NULL) || + (pdwOutputMinimum == NULL) || + (pdwOutputMaximum == NULL)) + { + return E_POINTER; + } + + // This MFT has a fixed number of streams. 
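+    // (One video input and one video output. Because the count is fixed,
+    // DeleteInputStream and AddInputStreams below simply return E_NOTIMPL.)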
+ *pdwInputMinimum = 1; + *pdwInputMaximum = 1; + *pdwOutputMinimum = 1; + *pdwOutputMaximum = 1; + return S_OK; +} + + +//------------------------------------------------------------------- +// GetStreamCount +// Returns the actual number of streams. +//------------------------------------------------------------------- + +HRESULT CGrayscale::GetStreamCount( + DWORD *pcInputStreams, + DWORD *pcOutputStreams +) +{ + if ((pcInputStreams == NULL) || (pcOutputStreams == NULL)) + + { + return E_POINTER; + } + + // This MFT has a fixed number of streams. + *pcInputStreams = 1; + *pcOutputStreams = 1; + return S_OK; +} + + + +//------------------------------------------------------------------- +// GetStreamIDs +// Returns stream IDs for the input and output streams. +//------------------------------------------------------------------- + +HRESULT CGrayscale::GetStreamIDs( + DWORD dwInputIDArraySize, + DWORD *pdwInputIDs, + DWORD dwOutputIDArraySize, + DWORD *pdwOutputIDs +) +{ + // It is not required to implement this method if the MFT has a fixed number of + // streams AND the stream IDs are numbered sequentially from zero (that is, the + // stream IDs match the stream indexes). + + // In that case, it is OK to return E_NOTIMPL. + return E_NOTIMPL; +} + + +//------------------------------------------------------------------- +// GetInputStreamInfo +// Returns information about an input stream. +//------------------------------------------------------------------- + +HRESULT CGrayscale::GetInputStreamInfo( + DWORD dwInputStreamID, + MFT_INPUT_STREAM_INFO * pStreamInfo +) +{ + if (pStreamInfo == NULL) + { + return E_POINTER; + } + + EnterCriticalSection(&m_critSec); + + if (!IsValidInputStream(dwInputStreamID)) + { + LeaveCriticalSection(&m_critSec); + return MF_E_INVALIDSTREAMNUMBER; + } + + // NOTE: This method should succeed even when there is no media type on the + // stream. If there is no media type, we only need to fill in the dwFlags + // member of MFT_INPUT_STREAM_INFO. The other members depend on having a + // a valid media type. + + pStreamInfo->hnsMaxLatency = 0; + pStreamInfo->dwFlags = MFT_INPUT_STREAM_WHOLE_SAMPLES | MFT_INPUT_STREAM_SINGLE_SAMPLE_PER_BUFFER; + + if (m_pInputType == NULL) + { + pStreamInfo->cbSize = 0; + } + else + { + pStreamInfo->cbSize = m_cbImageSize; + } + + pStreamInfo->cbMaxLookahead = 0; + pStreamInfo->cbAlignment = 0; + + LeaveCriticalSection(&m_critSec); + return S_OK; +} + +//------------------------------------------------------------------- +// GetOutputStreamInfo +// Returns information about an output stream. +//------------------------------------------------------------------- + +HRESULT CGrayscale::GetOutputStreamInfo( + DWORD dwOutputStreamID, + MFT_OUTPUT_STREAM_INFO * pStreamInfo +) +{ + if (pStreamInfo == NULL) + { + return E_POINTER; + } + + EnterCriticalSection(&m_critSec); + + if (!IsValidOutputStream(dwOutputStreamID)) + { + LeaveCriticalSection(&m_critSec); + return MF_E_INVALIDSTREAMNUMBER; + } + + // NOTE: This method should succeed even when there is no media type on the + // stream. If there is no media type, we only need to fill in the dwFlags + // member of MFT_OUTPUT_STREAM_INFO. The other members depend on having a + // a valid media type. 
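+    // The flags below advertise that output samples carry whole video frames
+    // rather than fragments and that all output samples have the same size;
+    // see the MFT_OUTPUT_STREAM_* flag documentation for the precise
+    // guarantees each flag makes.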
+ + pStreamInfo->dwFlags = + MFT_OUTPUT_STREAM_WHOLE_SAMPLES | + MFT_OUTPUT_STREAM_SINGLE_SAMPLE_PER_BUFFER | + MFT_OUTPUT_STREAM_FIXED_SAMPLE_SIZE ; + + if (m_pOutputType == NULL) + { + pStreamInfo->cbSize = 0; + } + else + { + pStreamInfo->cbSize = m_cbImageSize; + } + + pStreamInfo->cbAlignment = 0; + + LeaveCriticalSection(&m_critSec); + return S_OK; +} + + +//------------------------------------------------------------------- +// GetAttributes +// Returns the attributes for the MFT. +//------------------------------------------------------------------- + +HRESULT CGrayscale::GetAttributes(IMFAttributes** ppAttributes) +{ + if (ppAttributes == NULL) + { + return E_POINTER; + } + + EnterCriticalSection(&m_critSec); + + *ppAttributes = m_pAttributes; + (*ppAttributes)->AddRef(); + + LeaveCriticalSection(&m_critSec); + return S_OK; +} + + +//------------------------------------------------------------------- +// GetInputStreamAttributes +// Returns stream-level attributes for an input stream. +//------------------------------------------------------------------- + +HRESULT CGrayscale::GetInputStreamAttributes( + DWORD dwInputStreamID, + IMFAttributes **ppAttributes +) +{ + // This MFT does not support any stream-level attributes, so the method is not implemented. + return E_NOTIMPL; +} + + +//------------------------------------------------------------------- +// GetOutputStreamAttributes +// Returns stream-level attributes for an output stream. +//------------------------------------------------------------------- + +HRESULT CGrayscale::GetOutputStreamAttributes( + DWORD dwOutputStreamID, + IMFAttributes **ppAttributes +) +{ + // This MFT does not support any stream-level attributes, so the method is not implemented. + return E_NOTIMPL; +} + + +//------------------------------------------------------------------- +// DeleteInputStream +//------------------------------------------------------------------- + +HRESULT CGrayscale::DeleteInputStream(DWORD dwStreamID) +{ + // This MFT has a fixed number of input streams, so the method is not supported. + return E_NOTIMPL; +} + + +//------------------------------------------------------------------- +// AddInputStreams +//------------------------------------------------------------------- + +HRESULT CGrayscale::AddInputStreams( + DWORD cStreams, + DWORD *adwStreamIDs +) +{ + // This MFT has a fixed number of output streams, so the method is not supported. + return E_NOTIMPL; +} + + +//------------------------------------------------------------------- +// GetInputAvailableType +// Returns a preferred input type. +//------------------------------------------------------------------- + +HRESULT CGrayscale::GetInputAvailableType( + DWORD dwInputStreamID, + DWORD dwTypeIndex, // 0-based + IMFMediaType **ppType +) +{ + if (ppType == NULL) + { + return E_INVALIDARG; + } + + EnterCriticalSection(&m_critSec); + + if (!IsValidInputStream(dwInputStreamID)) + { + LeaveCriticalSection(&m_critSec); + return MF_E_INVALIDSTREAMNUMBER; + } + + HRESULT hr = S_OK; + + // If the output type is set, return that type as our preferred input type. + if (m_pOutputType == NULL) + { + // The output type is not set. Create a partial media type. 
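+        // (A partial type specifies only the major type and one of the
+        // subtypes in g_MediaSubtypes; the caller proposes the remaining
+        // details such as frame size and frame rate. See OnGetPartialType
+        // below.)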
+ hr = OnGetPartialType(dwTypeIndex, ppType); + } + else if (dwTypeIndex > 0) + { + hr = MF_E_NO_MORE_TYPES; + } + else + { + *ppType = m_pOutputType; + (*ppType)->AddRef(); + } + + LeaveCriticalSection(&m_critSec); + return hr; +} + + + +//------------------------------------------------------------------- +// GetOutputAvailableType +// Returns a preferred output type. +//------------------------------------------------------------------- + +HRESULT CGrayscale::GetOutputAvailableType( + DWORD dwOutputStreamID, + DWORD dwTypeIndex, // 0-based + IMFMediaType **ppType +) +{ + if (ppType == NULL) + { + return E_INVALIDARG; + } + + EnterCriticalSection(&m_critSec); + + if (!IsValidOutputStream(dwOutputStreamID)) + { + LeaveCriticalSection(&m_critSec); + return MF_E_INVALIDSTREAMNUMBER; + } + + HRESULT hr = S_OK; + + if (m_pInputType == NULL) + { + // The input type is not set. Create a partial media type. + hr = OnGetPartialType(dwTypeIndex, ppType); + } + else if (dwTypeIndex > 0) + { + hr = MF_E_NO_MORE_TYPES; + } + else + { + *ppType = m_pInputType; + (*ppType)->AddRef(); + } + + LeaveCriticalSection(&m_critSec); + return hr; +} + + +//------------------------------------------------------------------- +// SetInputType +//------------------------------------------------------------------- + +HRESULT CGrayscale::SetInputType( + DWORD dwInputStreamID, + IMFMediaType *pType, // Can be NULL to clear the input type. + DWORD dwFlags +) +{ + // Validate flags. + if (dwFlags & ~MFT_SET_TYPE_TEST_ONLY) + { + return E_INVALIDARG; + } + + EnterCriticalSection(&m_critSec); + + if (!IsValidInputStream(dwInputStreamID)) + { + LeaveCriticalSection(&m_critSec); + return MF_E_INVALIDSTREAMNUMBER; + } + + HRESULT hr = S_OK; + + // Does the caller want us to set the type, or just test it? + BOOL bReallySet = ((dwFlags & MFT_SET_TYPE_TEST_ONLY) == 0); + + // If we have an input sample, the client cannot change the type now. + if (HasPendingOutput()) + { + hr = MF_E_TRANSFORM_CANNOT_CHANGE_MEDIATYPE_WHILE_PROCESSING; + goto done; + } + + // Validate the type, if non-NULL. + if (pType) + { + hr = OnCheckInputType(pType); + if (FAILED(hr)) + { + goto done; + } + } + + // The type is OK. Set the type, unless the caller was just testing. + if (bReallySet) + { + OnSetInputType(pType); + + // When the type changes, end streaming. + hr = EndStreaming(); + } + +done: + LeaveCriticalSection(&m_critSec); + return hr; +} + + + +//------------------------------------------------------------------- +// SetOutputType +//------------------------------------------------------------------- + +HRESULT CGrayscale::SetOutputType( + DWORD dwOutputStreamID, + IMFMediaType *pType, // Can be NULL to clear the output type. + DWORD dwFlags +) +{ + // Validate flags. + if (dwFlags & ~MFT_SET_TYPE_TEST_ONLY) + { + return E_INVALIDARG; + } + + EnterCriticalSection(&m_critSec); + + if (!IsValidOutputStream(dwOutputStreamID)) + { + LeaveCriticalSection(&m_critSec); + return MF_E_INVALIDSTREAMNUMBER; + } + + HRESULT hr = S_OK; + + // Does the caller want us to set the type, or just test it? + BOOL bReallySet = ((dwFlags & MFT_SET_TYPE_TEST_ONLY) == 0); + + // If we have an input sample, the client cannot change the type now. + if (HasPendingOutput()) + { + hr = MF_E_TRANSFORM_CANNOT_CHANGE_MEDIATYPE_WHILE_PROCESSING; + goto done; + } + + // Validate the type, if non-NULL. + if (pType) + { + hr = OnCheckOutputType(pType); + if (FAILED(hr)) + { + goto done; + } + } + + // The type is OK. Set the type, unless the caller was just testing. 
+ if (bReallySet) + { + OnSetOutputType(pType); + + // When the type changes, end streaming. + hr = EndStreaming(); + } + +done: + LeaveCriticalSection(&m_critSec); + return hr; +} + + +//------------------------------------------------------------------- +// GetInputCurrentType +// Returns the current input type. +//------------------------------------------------------------------- + +HRESULT CGrayscale::GetInputCurrentType( + DWORD dwInputStreamID, + IMFMediaType **ppType +) +{ + if (ppType == NULL) + { + return E_POINTER; + } + + HRESULT hr = S_OK; + + EnterCriticalSection(&m_critSec); + + if (!IsValidInputStream(dwInputStreamID)) + { + hr = MF_E_INVALIDSTREAMNUMBER; + } + else if (!m_pInputType) + { + hr = MF_E_TRANSFORM_TYPE_NOT_SET; + } + else + { + *ppType = m_pInputType; + (*ppType)->AddRef(); + } + LeaveCriticalSection(&m_critSec); + return hr; +} + + +//------------------------------------------------------------------- +// GetOutputCurrentType +// Returns the current output type. +//------------------------------------------------------------------- + +HRESULT CGrayscale::GetOutputCurrentType( + DWORD dwOutputStreamID, + IMFMediaType **ppType +) +{ + if (ppType == NULL) + { + return E_POINTER; + } + + HRESULT hr = S_OK; + + EnterCriticalSection(&m_critSec); + + if (!IsValidOutputStream(dwOutputStreamID)) + { + hr = MF_E_INVALIDSTREAMNUMBER; + } + else if (!m_pOutputType) + { + hr = MF_E_TRANSFORM_TYPE_NOT_SET; + } + else + { + *ppType = m_pOutputType; + (*ppType)->AddRef(); + } + + LeaveCriticalSection(&m_critSec); + return hr; +} + + +//------------------------------------------------------------------- +// GetInputStatus +// Query if the MFT is accepting more input. +//------------------------------------------------------------------- + +HRESULT CGrayscale::GetInputStatus( + DWORD dwInputStreamID, + DWORD *pdwFlags +) +{ + if (pdwFlags == NULL) + { + return E_POINTER; + } + + EnterCriticalSection(&m_critSec); + + if (!IsValidInputStream(dwInputStreamID)) + { + LeaveCriticalSection(&m_critSec); + return MF_E_INVALIDSTREAMNUMBER; + } + + // If an input sample is already queued, do not accept another sample until the + // client calls ProcessOutput or Flush. + + // NOTE: It is possible for an MFT to accept more than one input sample. For + // example, this might be required in a video decoder if the frames do not + // arrive in temporal order. In the case, the decoder must hold a queue of + // samples. For the video effect, each sample is transformed independently, so + // there is no reason to queue multiple input samples. + + if (m_pSample == NULL) + { + *pdwFlags = MFT_INPUT_STATUS_ACCEPT_DATA; + } + else + { + *pdwFlags = 0; + } + + LeaveCriticalSection(&m_critSec); + return S_OK; +} + + + +//------------------------------------------------------------------- +// GetOutputStatus +// Query if the MFT can produce output. +//------------------------------------------------------------------- + +HRESULT CGrayscale::GetOutputStatus(DWORD *pdwFlags) +{ + if (pdwFlags == NULL) + { + return E_POINTER; + } + + EnterCriticalSection(&m_critSec); + + // The MFT can produce an output sample if (and only if) there an input sample. + if (m_pSample != NULL) + { + *pdwFlags = MFT_OUTPUT_STATUS_SAMPLE_READY; + } + else + { + *pdwFlags = 0; + } + + LeaveCriticalSection(&m_critSec); + return S_OK; +} + + +//------------------------------------------------------------------- +// SetOutputBounds +// Sets the range of time stamps that the MFT will output. 
+//------------------------------------------------------------------- + +HRESULT CGrayscale::SetOutputBounds( + LONGLONG hnsLowerBound, + LONGLONG hnsUpperBound +) +{ + // Implementation of this method is optional. + return E_NOTIMPL; +} + + +//------------------------------------------------------------------- +// ProcessEvent +// Sends an event to an input stream. +//------------------------------------------------------------------- + +HRESULT CGrayscale::ProcessEvent( + DWORD dwInputStreamID, + IMFMediaEvent *pEvent +) +{ + // This MFT does not handle any stream events, so the method can + // return E_NOTIMPL. This tells the pipeline that it can stop + // sending any more events to this MFT. + return E_NOTIMPL; +} + + +//------------------------------------------------------------------- +// ProcessMessage +//------------------------------------------------------------------- + +HRESULT CGrayscale::ProcessMessage( + MFT_MESSAGE_TYPE eMessage, + ULONG_PTR ulParam +) +{ + EnterCriticalSection(&m_critSec); + + HRESULT hr = S_OK; + + switch (eMessage) + { + case MFT_MESSAGE_COMMAND_FLUSH: + // Flush the MFT. + hr = OnFlush(); + break; + + case MFT_MESSAGE_COMMAND_DRAIN: + // Drain: Tells the MFT to reject further input until all pending samples are + // processed. That is our default behavior already, so there is nothing to do. + // + // For a decoder that accepts a queue of samples, the MFT might need to drain + // the queue in response to this command. + break; + + case MFT_MESSAGE_SET_D3D_MANAGER: + // Sets a pointer to the IDirect3DDeviceManager9 interface. + + // The pipeline should never send this message unless the MFT sets the MF_SA_D3D_AWARE + // attribute set to TRUE. Because this MFT does not set MF_SA_D3D_AWARE, it is an error + // to send the MFT_MESSAGE_SET_D3D_MANAGER message to the MFT. Return an error code in + // this case. + + // NOTE: If this MFT were D3D-enabled, it would cache the IDirect3DDeviceManager9 + // pointer for use during streaming. + + hr = E_NOTIMPL; + break; + + case MFT_MESSAGE_NOTIFY_BEGIN_STREAMING: + hr = BeginStreaming(); + break; + + case MFT_MESSAGE_NOTIFY_END_STREAMING: + hr = EndStreaming(); + break; + + // The next two messages do not require any action from this MFT. + + case MFT_MESSAGE_NOTIFY_END_OF_STREAM: + break; + + case MFT_MESSAGE_NOTIFY_START_OF_STREAM: + break; + } + + LeaveCriticalSection(&m_critSec); + return hr; +} + + +//------------------------------------------------------------------- +// ProcessInput +// Process an input sample. +//------------------------------------------------------------------- + +HRESULT CGrayscale::ProcessInput( + DWORD dwInputStreamID, + IMFSample *pSample, + DWORD dwFlags +) +{ + // Check input parameters. + if (pSample == NULL) + { + return E_POINTER; + } + + if (dwFlags != 0) + { + return E_INVALIDARG; // dwFlags is reserved and must be zero. + } + + HRESULT hr = S_OK; + + EnterCriticalSection(&m_critSec); + + // Validate the input stream number. + if (!IsValidInputStream(dwInputStreamID)) + { + hr = MF_E_INVALIDSTREAMNUMBER; + goto done; + } + + // Check for valid media types. + // The client must set input and output types before calling ProcessInput. + if (!m_pInputType || !m_pOutputType) + { + hr = MF_E_NOTACCEPTING; + goto done; + } + + // Check if an input sample is already queued. + if (m_pSample != NULL) + { + hr = MF_E_NOTACCEPTING; // We already have an input sample. + goto done; + } + + // Initialize streaming. 
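+    // (BeginStreaming only does real work on the first call: it checks
+    // m_bStreamingInitialized before reading the configuration attributes,
+    // so calling it for every input sample is inexpensive.)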
+ hr = BeginStreaming(); + if (FAILED(hr)) + { + goto done; + } + + // Cache the sample. We do the actual work in ProcessOutput. + m_pSample = pSample; + pSample->AddRef(); // Hold a reference count on the sample. + +done: + LeaveCriticalSection(&m_critSec); + return hr; +} + + +//------------------------------------------------------------------- +// ProcessOutput +// Process an output sample. +//------------------------------------------------------------------- + +HRESULT CGrayscale::ProcessOutput( + DWORD dwFlags, + DWORD cOutputBufferCount, + MFT_OUTPUT_DATA_BUFFER *pOutputSamples, // one per stream + DWORD *pdwStatus +) +{ + // Check input parameters... + + // This MFT does not accept any flags for the dwFlags parameter. + + // The only defined flag is MFT_PROCESS_OUTPUT_DISCARD_WHEN_NO_BUFFER. This flag + // applies only when the MFT marks an output stream as lazy or optional. But this + // MFT has no lazy or optional streams, so the flag is not valid. + + if (dwFlags != 0) + { + return E_INVALIDARG; + } + + if (pOutputSamples == NULL || pdwStatus == NULL) + { + return E_POINTER; + } + + // There must be exactly one output buffer. + if (cOutputBufferCount != 1) + { + return E_INVALIDARG; + } + + // It must contain a sample. + if (pOutputSamples[0].pSample == NULL) + { + return E_INVALIDARG; + } + + HRESULT hr = S_OK; + + IMFMediaBuffer *pInput = NULL; + IMFMediaBuffer *pOutput = NULL; + + EnterCriticalSection(&m_critSec); + + // There must be an input sample available for processing. + if (m_pSample == NULL) + { + hr = MF_E_TRANSFORM_NEED_MORE_INPUT; + goto done; + } + + // Initialize streaming. + + hr = BeginStreaming(); + if (FAILED(hr)) + { + goto done; + } + + // Get the input buffer. + hr = m_pSample->ConvertToContiguousBuffer(&pInput); + if (FAILED(hr)) + { + goto done; + } + + // Get the output buffer. + hr = pOutputSamples[0].pSample->ConvertToContiguousBuffer(&pOutput); + if (FAILED(hr)) + { + goto done; + } + + hr = OnProcessOutput(pInput, pOutput); + if (FAILED(hr)) + { + goto done; + } + + // Set status flags. + pOutputSamples[0].dwStatus = 0; + *pdwStatus = 0; + + + // Copy the duration and time stamp from the input sample, if present. + + LONGLONG hnsDuration = 0; + LONGLONG hnsTime = 0; + + if (SUCCEEDED(m_pSample->GetSampleDuration(&hnsDuration))) + { + hr = pOutputSamples[0].pSample->SetSampleDuration(hnsDuration); + if (FAILED(hr)) + { + goto done; + } + } + + if (SUCCEEDED(m_pSample->GetSampleTime(&hnsTime))) + { + hr = pOutputSamples[0].pSample->SetSampleTime(hnsTime); + } + +done: + SafeRelease(&m_pSample); // Release our input sample. + SafeRelease(&pInput); + SafeRelease(&pOutput); + LeaveCriticalSection(&m_critSec); + return hr; +} + +// PRIVATE METHODS + +// All methods that follow are private to this MFT and are not part of the IMFTransform interface. + +// Create a partial media type from our list. +// +// dwTypeIndex: Index into the list of peferred media types. +// ppmt: Receives a pointer to the media type. 
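+//
+// For example, dwTypeIndex 0 yields an MFVideoFormat_NV12 partial type,
+// 1 yields MFVideoFormat_YUY2, and 2 yields MFVideoFormat_UYVY, following
+// the order of g_MediaSubtypes.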
+ +HRESULT CGrayscale::OnGetPartialType(DWORD dwTypeIndex, IMFMediaType **ppmt) +{ + if (dwTypeIndex >= ARRAYSIZE(g_MediaSubtypes)) + { + return MF_E_NO_MORE_TYPES; + } + + IMFMediaType *pmt = NULL; + + HRESULT hr = MFCreateMediaType(&pmt); + if (FAILED(hr)) + { + goto done; + } + + hr = pmt->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video); + if (FAILED(hr)) + { + goto done; + } + + hr = pmt->SetGUID(MF_MT_SUBTYPE, g_MediaSubtypes[dwTypeIndex]); + if (FAILED(hr)) + { + goto done; + } + + *ppmt = pmt; + (*ppmt)->AddRef(); + +done: + SafeRelease(&pmt); + return hr; +} + + +// Validate an input media type. + +HRESULT CGrayscale::OnCheckInputType(IMFMediaType *pmt) +{ + assert(pmt != NULL); + + HRESULT hr = S_OK; + + // If the output type is set, see if they match. + if (m_pOutputType != NULL) + { + DWORD flags = 0; + hr = pmt->IsEqual(m_pOutputType, &flags); + + // IsEqual can return S_FALSE. Treat this as failure. + if (hr != S_OK) + { + hr = MF_E_INVALIDMEDIATYPE; + } + } + else + { + // Output type is not set. Just check this type. + hr = OnCheckMediaType(pmt); + } + return hr; +} + + +// Validate an output media type. + +HRESULT CGrayscale::OnCheckOutputType(IMFMediaType *pmt) +{ + assert(pmt != NULL); + + HRESULT hr = S_OK; + + // If the input type is set, see if they match. + if (m_pInputType != NULL) + { + DWORD flags = 0; + hr = pmt->IsEqual(m_pInputType, &flags); + + // IsEqual can return S_FALSE. Treat this as failure. + if (hr != S_OK) + { + hr = MF_E_INVALIDMEDIATYPE; + } + + } + else + { + // Input type is not set. Just check this type. + hr = OnCheckMediaType(pmt); + } + return hr; +} + + +// Validate a media type (input or output) + +HRESULT CGrayscale::OnCheckMediaType(IMFMediaType *pmt) +{ + BOOL bFoundMatchingSubtype = FALSE; + + // Major type must be video. + GUID major_type; + HRESULT hr = pmt->GetGUID(MF_MT_MAJOR_TYPE, &major_type); + if (FAILED(hr)) + { + goto done; + } + + if (major_type != MFMediaType_Video) + { + hr = MF_E_INVALIDMEDIATYPE; + goto done; + } + + // Subtype must be one of the subtypes in our global list. + + // Get the subtype GUID. + GUID subtype; + hr = pmt->GetGUID(MF_MT_SUBTYPE, &subtype); + if (FAILED(hr)) + { + goto done; + } + + // Look for the subtype in our list of accepted types. + for (DWORD i = 0; i < ARRAYSIZE(g_MediaSubtypes); i++) + { + if (subtype == g_MediaSubtypes[i]) + { + bFoundMatchingSubtype = TRUE; + break; + } + } + + if (!bFoundMatchingSubtype) + { + hr = MF_E_INVALIDMEDIATYPE; // The MFT does not support this subtype. + goto done; + } + + // Reject single-field media types. + UINT32 interlace = MFGetAttributeUINT32(pmt, MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive); + if (interlace == MFVideoInterlace_FieldSingleUpper || interlace == MFVideoInterlace_FieldSingleLower) + { + hr = MF_E_INVALIDMEDIATYPE; + } + +done: + return hr; +} + + +// Set or clear the input media type. +// +// Prerequisite: The input type was already validated. + +void CGrayscale::OnSetInputType(IMFMediaType *pmt) +{ + // if pmt is NULL, clear the type. + // if pmt is non-NULL, set the type. + + SafeRelease(&m_pInputType); + m_pInputType = pmt; + if (m_pInputType) + { + m_pInputType->AddRef(); + } + + // Update the format information. + UpdateFormatInfo(); +} + + +// Set or clears the output media type. +// +// Prerequisite: The output type was already validated. + +void CGrayscale::OnSetOutputType(IMFMediaType *pmt) +{ + // If pmt is NULL, clear the type. Otherwise, set the type. 
+ + SafeRelease(&m_pOutputType); + m_pOutputType = pmt; + if (m_pOutputType) + { + m_pOutputType->AddRef(); + } +} + + +// Initialize streaming parameters. +// +// This method is called if the client sends the MFT_MESSAGE_NOTIFY_BEGIN_STREAMING +// message, or when the client processes a sample, whichever happens first. + +HRESULT CGrayscale::BeginStreaming() +{ + HRESULT hr = S_OK; + + if (!m_bStreamingInitialized) + { + // Get the configuration attributes. + + // Get the destination rectangle. + + RECT rcDest; + hr = m_pAttributes->GetBlob(MFT_GRAYSCALE_DESTINATION_RECT, (UINT8*)&rcDest, sizeof(rcDest), NULL); + if (hr == MF_E_ATTRIBUTENOTFOUND || !ValidateRect(rcDest)) + { + // The client did not set this attribute, or the client provided an invalid rectangle. + // Default to the entire image. + + m_rcDest = D2D1::RectU(0, 0, m_imageWidthInPixels, m_imageHeightInPixels); + hr = S_OK; + } + else if (SUCCEEDED(hr)) + { + m_rcDest = D2D1::RectU(rcDest.left, rcDest.top, rcDest.right, rcDest.bottom); + } + else + { + goto done; + } + + // Get the chroma transformations. + + float scale = (float)MFGetAttributeDouble(m_pAttributes, MFT_GRAYSCALE_SATURATION, 0.0f); + float angle = (float)MFGetAttributeDouble(m_pAttributes, MFT_GRAYSCALE_CHROMA_ROTATION, 0.0f); + + m_transform = D2D1::Matrix3x2F::Scale(scale, scale) * D2D1::Matrix3x2F::Rotation(angle); + + m_bStreamingInitialized = true; + } + +done: + return hr; +} + + +// End streaming. + +// This method is called if the client sends an MFT_MESSAGE_NOTIFY_END_STREAMING +// message, or when the media type changes. In general, it should be called whenever +// the streaming parameters need to be reset. + +HRESULT CGrayscale::EndStreaming() +{ + m_bStreamingInitialized = false; + return S_OK; +} + + + +// Generate output data. + +HRESULT CGrayscale::OnProcessOutput(IMFMediaBuffer *pIn, IMFMediaBuffer *pOut) +{ + BYTE *pDest = NULL; // Destination buffer. + LONG lDestStride = 0; // Destination stride. + + BYTE *pSrc = NULL; // Source buffer. + LONG lSrcStride = 0; // Source stride. + + // Helper objects to lock the buffers. + VideoBufferLock inputLock(pIn); + VideoBufferLock outputLock(pOut); + + // Stride if the buffer does not support IMF2DBuffer + LONG lDefaultStride = 0; + + HRESULT hr = GetDefaultStride(m_pInputType, &lDefaultStride); + if (FAILED(hr)) + { + goto done; + } + + // Lock the input buffer. + hr = inputLock.LockBuffer(lDefaultStride, m_imageHeightInPixels, &pSrc, &lSrcStride); + if (FAILED(hr)) + { + goto done; + } + + // Lock the output buffer. + hr = outputLock.LockBuffer(lDefaultStride, m_imageHeightInPixels, &pDest, &lDestStride); + if (FAILED(hr)) + { + goto done; + } + + // Invoke the image transform function. + assert (m_pTransformFn != NULL); + if (m_pTransformFn) + { + (*m_pTransformFn)(m_transform, m_rcDest, pDest, lDestStride, pSrc, lSrcStride, + m_imageWidthInPixels, m_imageHeightInPixels); + } + else + { + hr = E_UNEXPECTED; + goto done; + } + + + // Set the data size on the output buffer. + hr = pOut->SetCurrentLength(m_cbImageSize); + + // The VideoBufferLock class automatically unlocks the buffers. +done: + return hr; +} + + +// Flush the MFT. + +HRESULT CGrayscale::OnFlush() +{ + // For this MFT, flushing just means releasing the input sample. + SafeRelease(&m_pSample); + return S_OK; +} + + +// Update the format information. This method is called whenever the +// input type is set. 
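+//
+// (Worked example of the size calculation performed below via GetImageSize:
+// a 640x480 NV12 frame needs 640 * (480 + 240) = 460800 bytes, while a
+// 640x480 YUY2 or UYVY frame needs 640 * 480 * 2 = 614400 bytes.)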
+ +HRESULT CGrayscale::UpdateFormatInfo() +{ + HRESULT hr = S_OK; + + GUID subtype = GUID_NULL; + + m_imageWidthInPixels = 0; + m_imageHeightInPixels = 0; + m_cbImageSize = 0; + + m_pTransformFn = NULL; + + if (m_pInputType != NULL) + { + hr = m_pInputType->GetGUID(MF_MT_SUBTYPE, &subtype); + if (FAILED(hr)) + { + goto done; + } + if (subtype == MFVideoFormat_YUY2) + { + m_pTransformFn = TransformImage_YUY2; + } + else if (subtype == MFVideoFormat_UYVY) + { + m_pTransformFn = TransformImage_UYVY; + } + else if (subtype == MFVideoFormat_NV12) + { + m_pTransformFn = TransformImage_NV12; + } + else + { + hr = E_UNEXPECTED; + goto done; + } + + hr = MFGetAttributeSize(m_pInputType, MF_MT_FRAME_SIZE, &m_imageWidthInPixels, &m_imageHeightInPixels); + if (FAILED(hr)) + { + goto done; + } + + // Calculate the image size (not including padding) + hr = GetImageSize(subtype.Data1, m_imageWidthInPixels, m_imageHeightInPixels, &m_cbImageSize); + } + +done: + return hr; +} + + +// Calculate the size of the buffer needed to store the image. + +// fcc: The FOURCC code of the video format. + +HRESULT GetImageSize(DWORD fcc, UINT32 width, UINT32 height, DWORD* pcbImage) +{ + HRESULT hr = S_OK; + + switch (fcc) + { + case FOURCC_YUY2: + case FOURCC_UYVY: + // check overflow + if ((width > MAXDWORD / 2) || (width * 2 > MAXDWORD / height)) + { + hr = E_INVALIDARG; + } + else + { + // 16 bpp + *pcbImage = width * height * 2; + } + break; + + case FOURCC_NV12: + // check overflow + if ((height/2 > MAXDWORD - height) || ((height + height/2) > MAXDWORD / width)) + { + hr = E_INVALIDARG; + } + else + { + // 12 bpp + *pcbImage = width * (height + (height/2)); + } + break; + + default: + hr = E_FAIL; // Unsupported type. + } + return hr; +} + +// Get the default stride for a video format. +HRESULT GetDefaultStride(IMFMediaType *pType, LONG *plStride) +{ + LONG lStride = 0; + + // Try to get the default stride from the media type. + HRESULT hr = pType->GetUINT32(MF_MT_DEFAULT_STRIDE, (UINT32*)&lStride); + if (FAILED(hr)) + { + // Attribute not set. Try to calculate the default stride. + GUID subtype = GUID_NULL; + + UINT32 width = 0; + UINT32 height = 0; + + // Get the subtype and the image size. + hr = pType->GetGUID(MF_MT_SUBTYPE, &subtype); + if (SUCCEEDED(hr)) + { + hr = MFGetAttributeSize(pType, MF_MT_FRAME_SIZE, &width, &height); + } + if (SUCCEEDED(hr)) + { + if (subtype == MFVideoFormat_NV12) + { + lStride = width; + } + else if (subtype == MFVideoFormat_YUY2 || subtype == MFVideoFormat_UYVY) + { + lStride = ((width * 2) + 3) & ~3; + } + else + { + hr = E_INVALIDARG; + } + } + + // Set the attribute for later reference. + if (SUCCEEDED(hr)) + { + (void)pType->SetUINT32(MF_MT_DEFAULT_STRIDE, UINT32(lStride)); + } + } + if (SUCCEEDED(hr)) + { + *plStride = lStride; + } + return hr; +} + + +// Validate that a rectangle meets the following criteria: +// +// - All coordinates are non-negative. +// - The rectangle is not flipped (top > bottom, left > right) +// +// These are the requirements for the destination rectangle. 
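+//
+// For example, {0, 0, width, height} is accepted, while a rectangle with
+// right < left, bottom < top, or a negative left/top coordinate is rejected;
+// in that case BeginStreaming falls back to the full frame.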
+ +bool ValidateRect(const RECT& rc) +{ + if (rc.left < 0 || rc.top < 0) + { + return false; + } + if (rc.left > rc.right || rc.top > rc.bottom) + { + return false; + } + return true; +} diff --git a/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.def b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.def new file mode 100644 index 000000000..0b801908c --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.def @@ -0,0 +1,4 @@ +EXPORTS + DllCanUnloadNow PRIVATE + DllGetActivationFactory PRIVATE + DllGetClassObject PRIVATE \ No newline at end of file diff --git a/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.h b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.h new file mode 100644 index 000000000..b83223bce --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.h @@ -0,0 +1,266 @@ +// Defines the transform class. +// +// THIS CODE AND INFORMATION IS PROVIDED "AS IS" WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO +// THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A +// PARTICULAR PURPOSE. +// +// Copyright (c) Microsoft Corporation. All rights reserved. + +#ifndef GRAYSCALE_H +#define GRAYSCALE_H + +#include +#include +#include +#include +#include +#include +#include + +// Note: The Direct2D helper library is included for its 2D matrix operations. +#include + +#include +#include +#include + +#include "GrayscaleTransform.h" + +// CLSID of the MFT. +DEFINE_GUID(CLSID_GrayscaleMFT, +0x2f3dbc05, 0xc011, 0x4a8f, 0xb2, 0x64, 0xe4, 0x2e, 0x35, 0xc6, 0x7b, 0xf4); + +// +// * IMPORTANT: If you implement your own MFT, create a new GUID for the CLSID. * +// + + +// Configuration attributes + +// {7BBBB051-133B-41F5-B6AA-5AFF9B33A2CB} +DEFINE_GUID(MFT_GRAYSCALE_DESTINATION_RECT, +0x7bbbb051, 0x133b, 0x41f5, 0xb6, 0xaa, 0x5a, 0xff, 0x9b, 0x33, 0xa2, 0xcb); + + +// {14782342-93E8-4565-872C-D9A2973D5CBF} +DEFINE_GUID(MFT_GRAYSCALE_SATURATION, +0x14782342, 0x93e8, 0x4565, 0x87, 0x2c, 0xd9, 0xa2, 0x97, 0x3d, 0x5c, 0xbf); + +// {E0BADE5D-E4B9-4689-9DBA-E2F00D9CED0E} +DEFINE_GUID(MFT_GRAYSCALE_CHROMA_ROTATION, +0xe0bade5d, 0xe4b9, 0x4689, 0x9d, 0xba, 0xe2, 0xf0, 0xd, 0x9c, 0xed, 0xe); + + +template void SafeRelease(T **ppT) +{ + if (*ppT) + { + (*ppT)->Release(); + *ppT = NULL; + } +} + +// Function pointer for the function that transforms the image. +typedef void (*IMAGE_TRANSFORM_FN)( + const D2D1::Matrix3x2F& mat, // Chroma transform matrix. + const D2D_RECT_U& rcDest, // Destination rectangle for the transformation. + BYTE* pDest, // Destination buffer. + LONG lDestStride, // Destination stride. + const BYTE* pSrc, // Source buffer. + LONG lSrcStride, // Source stride. + DWORD dwWidthInPixels, // Image width in pixels. + DWORD dwHeightInPixels // Image height in pixels. + ); + +// CGrayscale class: +// Implements a grayscale video effect. 
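+//
+// Illustrative sketch (not part of the sample): a host that holds the raw
+// IMFTransform pointer could configure the effect through its attribute
+// store, which BeginStreaming reads when streaming starts. pMFT and pAttrs
+// are hypothetical local variables.
+//
+//   Microsoft::WRL::ComPtr<IMFAttributes> pAttrs;
+//   if (SUCCEEDED(pMFT->GetAttributes(&pAttrs)))
+//   {
+//       pAttrs->SetDouble(MFT_GRAYSCALE_SATURATION, 1.0);       // keep full color
+//       pAttrs->SetDouble(MFT_GRAYSCALE_CHROMA_ROTATION, 90.0); // shift the hue
+//   }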
+ +class CGrayscale + : public Microsoft::WRL::RuntimeClass< + Microsoft::WRL::RuntimeClassFlags< Microsoft::WRL::RuntimeClassType::WinRtClassicComMix >, + ABI::Windows::Media::IMediaExtension, + IMFTransform > +{ + InspectableClass(RuntimeClass_GrayscaleTransform_GrayscaleEffect, BaseTrust) + +public: + CGrayscale(); + + ~CGrayscale(); + + STDMETHOD(RuntimeClassInitialize)(); + + // IMediaExtension + STDMETHODIMP SetProperties(ABI::Windows::Foundation::Collections::IPropertySet *pConfiguration); + + // IMFTransform + STDMETHODIMP GetStreamLimits( + DWORD *pdwInputMinimum, + DWORD *pdwInputMaximum, + DWORD *pdwOutputMinimum, + DWORD *pdwOutputMaximum + ); + + STDMETHODIMP GetStreamCount( + DWORD *pcInputStreams, + DWORD *pcOutputStreams + ); + + STDMETHODIMP GetStreamIDs( + DWORD dwInputIDArraySize, + DWORD *pdwInputIDs, + DWORD dwOutputIDArraySize, + DWORD *pdwOutputIDs + ); + + STDMETHODIMP GetInputStreamInfo( + DWORD dwInputStreamID, + MFT_INPUT_STREAM_INFO * pStreamInfo + ); + + STDMETHODIMP GetOutputStreamInfo( + DWORD dwOutputStreamID, + MFT_OUTPUT_STREAM_INFO * pStreamInfo + ); + + STDMETHODIMP GetAttributes(IMFAttributes** pAttributes); + + STDMETHODIMP GetInputStreamAttributes( + DWORD dwInputStreamID, + IMFAttributes **ppAttributes + ); + + STDMETHODIMP GetOutputStreamAttributes( + DWORD dwOutputStreamID, + IMFAttributes **ppAttributes + ); + + STDMETHODIMP DeleteInputStream(DWORD dwStreamID); + + STDMETHODIMP AddInputStreams( + DWORD cStreams, + DWORD *adwStreamIDs + ); + + STDMETHODIMP GetInputAvailableType( + DWORD dwInputStreamID, + DWORD dwTypeIndex, // 0-based + IMFMediaType **ppType + ); + + STDMETHODIMP GetOutputAvailableType( + DWORD dwOutputStreamID, + DWORD dwTypeIndex, // 0-based + IMFMediaType **ppType + ); + + STDMETHODIMP SetInputType( + DWORD dwInputStreamID, + IMFMediaType *pType, + DWORD dwFlags + ); + + STDMETHODIMP SetOutputType( + DWORD dwOutputStreamID, + IMFMediaType *pType, + DWORD dwFlags + ); + + STDMETHODIMP GetInputCurrentType( + DWORD dwInputStreamID, + IMFMediaType **ppType + ); + + STDMETHODIMP GetOutputCurrentType( + DWORD dwOutputStreamID, + IMFMediaType **ppType + ); + + STDMETHODIMP GetInputStatus( + DWORD dwInputStreamID, + DWORD *pdwFlags + ); + + STDMETHODIMP GetOutputStatus(DWORD *pdwFlags); + + STDMETHODIMP SetOutputBounds( + LONGLONG hnsLowerBound, + LONGLONG hnsUpperBound + ); + + STDMETHODIMP ProcessEvent( + DWORD dwInputStreamID, + IMFMediaEvent *pEvent + ); + + STDMETHODIMP ProcessMessage( + MFT_MESSAGE_TYPE eMessage, + ULONG_PTR ulParam + ); + + STDMETHODIMP ProcessInput( + DWORD dwInputStreamID, + IMFSample *pSample, + DWORD dwFlags + ); + + STDMETHODIMP ProcessOutput( + DWORD dwFlags, + DWORD cOutputBufferCount, + MFT_OUTPUT_DATA_BUFFER *pOutputSamples, // one per stream + DWORD *pdwStatus + ); + + +private: + // HasPendingOutput: Returns TRUE if the MFT is holding an input sample. + BOOL HasPendingOutput() const { return m_pSample != NULL; } + + // IsValidInputStream: Returns TRUE if dwInputStreamID is a valid input stream identifier. + BOOL IsValidInputStream(DWORD dwInputStreamID) const + { + return dwInputStreamID == 0; + } + + // IsValidOutputStream: Returns TRUE if dwOutputStreamID is a valid output stream identifier. 
+ BOOL IsValidOutputStream(DWORD dwOutputStreamID) const + { + return dwOutputStreamID == 0; + } + + HRESULT OnGetPartialType(DWORD dwTypeIndex, IMFMediaType **ppmt); + HRESULT OnCheckInputType(IMFMediaType *pmt); + HRESULT OnCheckOutputType(IMFMediaType *pmt); + HRESULT OnCheckMediaType(IMFMediaType *pmt); + void OnSetInputType(IMFMediaType *pmt); + void OnSetOutputType(IMFMediaType *pmt); + HRESULT BeginStreaming(); + HRESULT EndStreaming(); + HRESULT OnProcessOutput(IMFMediaBuffer *pIn, IMFMediaBuffer *pOut); + HRESULT OnFlush(); + HRESULT UpdateFormatInfo(); + + CRITICAL_SECTION m_critSec; + + // Transformation parameters + D2D1::Matrix3x2F m_transform; // Chroma transform matrix. + D2D_RECT_U m_rcDest; // Destination rectangle for the effect. + + // Streaming + bool m_bStreamingInitialized; + IMFSample *m_pSample; // Input sample. + IMFMediaType *m_pInputType; // Input media type. + IMFMediaType *m_pOutputType; // Output media type. + + // Fomat information + UINT32 m_imageWidthInPixels; + UINT32 m_imageHeightInPixels; + DWORD m_cbImageSize; // Image size, in bytes. + + IMFAttributes *m_pAttributes; + + // Image transform function. (Changes based on the media type.) + IMAGE_TRANSFORM_FN m_pTransformFn; +}; +#endif \ No newline at end of file diff --git a/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.vcxproj b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.vcxproj new file mode 100644 index 000000000..8af8f2c7d --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.vcxproj @@ -0,0 +1,313 @@ + + + + + Debug + ARM + + + Debug + Win32 + + + Debug + x64 + + + Release + ARM + + + Release + Win32 + + + Release + x64 + + + + $(VCTargetsPath11) + {BA69218F-DA5C-4D14-A78D-21A9E4DEC669} + Win32Proj + GrayscaleTransform + GrayscaleTransform + 11.0 + true + + + + DynamicLibrary + true + v110 + + + DynamicLibrary + true + v110 + + + DynamicLibrary + true + v110 + + + DynamicLibrary + false + true + v110 + + + DynamicLibrary + false + true + v110 + + + DynamicLibrary + false + true + v110 + + + + en-US + + + + + + + + + + + + + + + + + + + + + + + false + $(Configuration)\$(MSBuildProjectName)\ + + + false + + + false + + + false + $(Configuration)\$(MSBuildProjectName)\ + + + false + + + false + + + + NotUsing + _WINRT_DLL;_DEBUG;_WINDOWS;%(PreprocessorDefinitions) + + + + + $(WindowsSDK_WindowsMetadata);$(AdditionalUsingDirectories) + false + $(ProjectDir);$(IntermediateOutputPath);%(AdditionalIncludeDirectories);$(ProjectDir)\..\Common; + + + Console + runtimeobject.lib;%(AdditionalDependencies);mf.lib;mfuuid.lib;mfplat.lib + false + Grayscale.def + + + mdmerge -metadata_dir "$(WindowsSDK_MetadataPath)" -o "$(ProjectDir)$(Configuration)\$(MSBuildProjectName)" -i "$(MSBuildProjectDirectory)" -v -partial + $(ProjectDir)$(Configuration)\$(MSBuildProjectName)\$(ProjectName).winmd + + + + + NotUsing + _WINRT_DLL;_DEBUG;_WINDOWS;%(PreprocessorDefinitions) + + + + + $(WindowsSDK_WindowsMetadata);$(AdditionalUsingDirectories) + false + $(ProjectDir);$(IntermediateOutputPath);%(AdditionalIncludeDirectories);$(ProjectDir)\..\Common; + + + Console + runtimeobject.lib;%(AdditionalDependencies);mf.lib;mfuuid.lib;mfplat.lib + false + Grayscale.def + + + mdmerge -metadata_dir "$(WindowsSDK_MetadataPath)" -o "$(SolutionDir)$(Platform)\$(Configuration)\$(MSBuildProjectName)" -i "$(MSBuildProjectDirectory)" -v -partial + $(SolutionDir)$(Platform)\$(Configuration)\$(MSBuildProjectName)\$(ProjectName).winmd + + + + + NotUsing 
+ _WINRT_DLL;_DEBUG;_WINDOWS;%(PreprocessorDefinitions) + + + + + $(WindowsSDK_WindowsMetadata);$(AdditionalUsingDirectories) + false + $(ProjectDir);$(IntermediateOutputPath);%(AdditionalIncludeDirectories);$(ProjectDir)\..\Common; + + + Console + runtimeobject.lib;%(AdditionalDependencies);mf.lib;mfuuid.lib;mfplat.lib + false + Grayscale.def + + + mdmerge -metadata_dir "$(WindowsSDK_MetadataPath)" -o "$(SolutionDir)$(Platform)\$(Configuration)\$(MSBuildProjectName)" -i "$(MSBuildProjectDirectory)" -v -partial + $(SolutionDir)$(Platform)\$(Configuration)\$(MSBuildProjectName)\$(ProjectName).winmd + + + + + NotUsing + _WINRT_DLL;NDEBUG;_WINDOWS;%(PreprocessorDefinitions) + + + + + $(WindowsSDK_WindowsMetadata);$(AdditionalUsingDirectories) + false + $(ProjectDir);$(IntermediateOutputPath);%(AdditionalIncludeDirectories);$(ProjectDir)\..\Common; + + + Console + runtimeobject.lib;%(AdditionalDependencies);mf.lib;mfuuid.lib;mfplat.lib + false + Grayscale.def + + + mdmerge -metadata_dir "$(WindowsSDK_MetadataPath)" -o "$(ProjectDir)$(Configuration)\$(MSBuildProjectName)" -i "$(MSBuildProjectDirectory)" -v -partial + $(ProjectDir)$(Configuration)\$(MSBuildProjectName)\$(ProjectName).winmd + + + + + NotUsing + _WINRT_DLL;NDEBUG;_WINDOWS;%(PreprocessorDefinitions) + + + + + $(WindowsSDK_WindowsMetadata);$(AdditionalUsingDirectories) + false + $(ProjectDir);$(IntermediateOutputPath);%(AdditionalIncludeDirectories);$(ProjectDir)\..\Common; + + + Console + runtimeobject.lib;%(AdditionalDependencies);mf.lib;mfuuid.lib;mfplat.lib + false + Grayscale.def + + + mdmerge -metadata_dir "$(WindowsSDK_MetadataPath)" -o "$(SolutionDir)$(Platform)\$(Configuration)\$(MSBuildProjectName)" -i "$(MSBuildProjectDirectory)" -v -partial + $(SolutionDir)$(Platform)\$(Configuration)\$(MSBuildProjectName)\$(ProjectName).winmd + + + + + NotUsing + _WINRT_DLL;NDEBUG;_WINDOWS;%(PreprocessorDefinitions) + + + + + $(WindowsSDK_WindowsMetadata);$(AdditionalUsingDirectories) + false + $(ProjectDir);$(IntermediateOutputPath);%(AdditionalIncludeDirectories);$(ProjectDir)\..\Common; + + + Console + runtimeobject.lib;%(AdditionalDependencies);mf.lib;mfuuid.lib;mfplat.lib + false + Grayscale.def + + + mdmerge -metadata_dir "$(WindowsSDK_MetadataPath)" -o "$(SolutionDir)$(Platform)\$(Configuration)\$(MSBuildProjectName)" -i "$(MSBuildProjectDirectory)" -v -partial + $(SolutionDir)$(Platform)\$(Configuration)\$(MSBuildProjectName)\$(ProjectName).winmd + + + + + + + + + + + + + + + + + + + + + + + + + + + $(WindowsSDK_MetadataPath) + $(WindowsSDK_MetadataPath) + $(WindowsSDK_MetadataPath) + $(WindowsSDK_MetadataPath) + $(WindowsSDK_MetadataPath) + $(WindowsSDK_MetadataPath) + true + true + true + true + true + true + %(Filename).h + %(Filename).h + %(Filename).h + %(Filename).h + %(Filename).h + %(Filename).h + + + + + <_MdMergeOutput Condition="'$(Platform)' == 'Win32'" Include="$(ProjectDir)$(Configuration)\$(MSBuildProjectName)\$(ProjectName).winmd" /> + <_MdMergeOutput Condition="'$(Platform)' != 'Win32'" Include="$(SolutionDir)$(Platform)\$(Configuration)\$(MSBuildProjectName)\$(ProjectName).winmd" /> + + + + + $(ProjectName).winmd + $(TargetName)$(TargetExt) + + + + + + \ No newline at end of file diff --git a/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.vcxproj.filters b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.vcxproj.filters new file mode 100644 index 000000000..92c2c9cfe --- /dev/null +++ 
b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.vcxproj.filters @@ -0,0 +1,22 @@ + + + + + bdc52ff6-58cb-464b-bf4f-0c1804b135ff + rc;ico;cur;bmp;dlg;rc2;rct;bin;rgs;gif;jpg;jpeg;jpe;resx;tiff;tif;png;wav;mfcribbon-ms + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/GrayscaleTransform.idl b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/GrayscaleTransform.idl new file mode 100644 index 000000000..de81380ec --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/GrayscaleTransform.idl @@ -0,0 +1,11 @@ +import "Windows.Media.idl"; + +#include + +namespace GrayscaleTransform +{ + [version(NTDDI_WIN8)] + runtimeclass GrayscaleEffect + { + } +} \ No newline at end of file diff --git a/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/dllmain.cpp b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/dllmain.cpp new file mode 100644 index 000000000..ad6767011 --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/dllmain.cpp @@ -0,0 +1,58 @@ +////////////////////////////////////////////////////////////////////////// +// +// dllmain.cpp +// +// THIS CODE AND INFORMATION IS PROVIDED "AS IS" WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO +// THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A +// PARTICULAR PURPOSE. +// +// Copyright (c) Microsoft Corporation. All rights reserved. +// +////////////////////////////////////////////////////////////////////////// + +#include +#include "Grayscale.h" + +using namespace Microsoft::WRL; + +namespace Microsoft { namespace Samples { + ActivatableClass(CGrayscale); +}} + +BOOL WINAPI DllMain( _In_ HINSTANCE hInstance, _In_ DWORD dwReason, _In_opt_ LPVOID lpReserved ) +{ + if( DLL_PROCESS_ATTACH == dwReason ) + { + // + // Don't need per-thread callbacks + // + DisableThreadLibraryCalls( hInstance ); + + Module::GetModule().Create(); + } + else if( DLL_PROCESS_DETACH == dwReason ) + { + Module::GetModule().Terminate(); + } + + return TRUE; +} + +HRESULT WINAPI DllGetActivationFactory( _In_ HSTRING activatibleClassId, _Outptr_ IActivationFactory** factory ) +{ + auto &module = Microsoft::WRL::Module< Microsoft::WRL::InProc >::GetModule(); + return module.GetActivationFactory( activatibleClassId, factory ); +} + +HRESULT WINAPI DllCanUnloadNow() +{ + auto &module = Microsoft::WRL::Module::GetModule(); + return (module.Terminate()) ? 
S_OK : S_FALSE; +} + +STDAPI DllGetClassObject( _In_ REFCLSID rclsid, _In_ REFIID riid, _Outptr_ LPVOID FAR* ppv ) +{ + auto &module = Microsoft::WRL::Module::GetModule(); + return module.GetClassObject( rclsid, riid, ppv ); +} diff --git a/samples/winrt/ImageManipulations/C++/Package.appxmanifest b/samples/winrt/ImageManipulations/C++/Package.appxmanifest new file mode 100644 index 000000000..c72258b0c --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/Package.appxmanifest @@ -0,0 +1,39 @@ + + + + + MediaCapture CPP sample + Microsoft Corporation + Assets\storeLogo-sdk.png + + + 6.2.1 + 6.2.1 + + + + + + + + + + + + + + + + + + + + + + + GrayscaleTransform.dll + + + + + \ No newline at end of file diff --git a/samples/winrt/ImageManipulations/C++/assets/microsoft-sdk.png b/samples/winrt/ImageManipulations/C++/assets/microsoft-sdk.png new file mode 100644 index 0000000000000000000000000000000000000000..1a1aec25b67aa1dc55663bdc58da2083107eac0d GIT binary patch literal 1583 zcmV+~2GIG5P)Px#1ZP1_K>z@;j|==^1poj532;bRa{vGmbN~PnbOGLGA9w%&02y>eSaefwW^{L9 za%BKeVQFr3E>1;MAa*k@H7+-&gq_R)00o~(L_t(oN9~visFhU^$9--EX8GC_#TFqg zGsMa^)1tx*B!f1x#mt4MtcW-#S%BUan9FL&TqXX@G&Z||Ep!@D z+1O23l5Bln^t!^c!t;TdO&cXe2h?ylf!crt^xI$fp@!Vy;*S+hGz@8aWB)@1vTyg~ zg9k;A3j6(FPi%VoQhH%YUEk&vBGx6E9r@~hvn7zcm zKq$ItKZo&^0P6b#)BmWx9|;=%U;$py`oD{Rsn9L-4AS=tCHDNPvE$9*eMR4%_V4T5 zsIi_xM~Tj&eZhNN*g=>sOl>GIn!stK=+{!dmBwb5jNhxV{nPl>qSq0gsl%8llA)>a z^*zWhX^i0e_;;cStn+3?V4gBQdhQgs0lyd@ds~G80!@$W;ShbFc;}1$qz+rM56Lq4 zZ<4T2qk*BD$_m{vyJ`IMl1-6Law?oE-mPg4H(sJx zY@GFWO6#KYHG%w3Iw2+Y5V{K!l5Y0c%TgSjrtfK_XimQK=7voK&G}>N4nQqIDZC{q zT_l>mytaQf)?T3v{;*Hu4qnguZuS$@n05AtLU<{ zk6wHU?4~6LqqoL?G9AnHO;<(1^`-%X*Rm5$V?kPCZ4Ssi0ylwqrwOFCY(tf%GkhbB z4>jFgHHN)@IH^Z38jm3Ni@-R4MN>fug+&ibYRb|AqqwfLsctOSowtkg+LBQi;Ez;t z?_@)+FV*ybyO}l!OkPqMe;=Pyl-Pd%FPh;HZwFPfD z-V)t{ZHeFhZv-ZHQ=Cv7vN&yw)5U3Ak@y7bj>H=$+H*fat8^s#Jwou*NTGPMzd~Vl$Y0Tv~?T3WHpPF3dVVdelQCh9)!ilHE znj`f6Mai1X;V_&P^3;)=OXQKAC-e;Yg-Lc~@cA!CKCsLFaZo}jZ7Ox5iV?v-#(3j{ z$0>7JYRmA%n{G~}2p=cVO5=QrF-~q1`iD;546buZHb~=iP`rKk5BAprL8a-Swyg5+ z717j~a|LPww}cyt(^d+c{Lv}!1m{qJw8mSFhb0G*zfnncPTFfFU}48-jq#YI2J>Dz hxT(Obp8Vf?;4d8suiatp%?AJg002ovPDHLkV1jTD8!rF= literal 0 HcmV?d00001 diff --git a/samples/winrt/ImageManipulations/C++/assets/placeholder-sdk.png b/samples/winrt/ImageManipulations/C++/assets/placeholder-sdk.png new file mode 100644 index 0000000000000000000000000000000000000000..01b3138cf8c040c54f3a200b2052f94a10240ded GIT binary patch literal 8991 zcmd6Nc|6qJ-#(KTZA4N@+E597$&zgpB@wcXeYqo*J;vCFN|9_KA%yH($TovpDVowAA+PKDwKQg=L@m4OLwh z7FK`ob8^=X@QEkv^Dy{_)lFCJDobJeu^I3mHoGgguCTBaN9>_nZwLQpce!Eg#=^3{ z9{OR871__n!gA_^y6P2uPm2Y-qk}syl2;iuI5B$Y9`!J zKd5~?lB?{2NoclbkFqD;r^+#H?|?Pz0D9~(xP8_p`JGHXA2^k_Z?gVAxavZ|xfXn5Qz zbz!igproYh&;=u_Qo9ZhT%S*oMa_f9j~_c(4(K}(hpL>=JUl!^l#KJRd($y~e$n$; z)V35j&ZTv(uV`@MbDTNF9e+^8(a|w==*yQ#c_&>Izgus-ndm&QJnXr}YeCphNIk#%+5^#xS^e&4xIhE>y!!sdT3R8^ zN#xlXiY{-Qh+)3V<-##LIISN$SmBInE-o(aG#{Ni^Zs7vDRTNFs$|W%`04t$A{v6P zgZA)Z;IW)v8S*+325FtN*dHmLW9&Z5&B6saIXTlU1|fPOQkM%`a>V2=zbne+dBXM9 z5{}w|^)V<=Eu@|c$$K-Kc}FDt%x#fFJ-37--Vv#K69!sMH>Az|^7hL;9n-<_YOkqBmR{06=BM{qSD#L!r3OwRQQ3nk3{h(p7V!Bl z?U9c`dkR0x9l5}rGj1sZRpb$i52{aCV7pF?5cfR@2;K4K-sc7cJ!tezGpF(!PG#`5 zk39c0q$?5kezRk4a&`}^6YQ{$;-=s3VjvK+-j&qQXWP8@OUT3usZ$dU`|ep$ymU@-@~^UEQY`O$2G+AGyA<8 zb|rLW&y(B-z#%%rFq466>C(}HE~glpI>QR85y0Rl!Xw(<$aL=*`jMU^7I&B{PXyXD zQ*5G|%!X!C;4UKpruW_O806WL0bo6w&|5*T_9Xc!%xW2z2zMpS-s??*LAUi z^>e#U1Sng<&TQ5ljd20S2I|6QXq8c~Y?QoLvXd^m?GOdtW(j23zVg~k@w#Eeo1Q~6 zyxz5QwSF|%-ndBS^;SGh3))@gzZJ`eSdPoDYM4_>nMXenj?^4EuMm6<_vLY=aVWgH 
zQRo!dO&QigGRe3CCqB32-)#Oma`Ng{kdzhWPoF;Ru|Zu0``A6o%{|{c%U)CvvI~UJ z1vcB$IgX4IGoRf|3hjVVqJ_0h932a!q@@04hLQ7ifZ;@USLuyWW+q5Z`POjoJq;c&784`J}E<0KC)NaTu zYxYXpc<15j`zH0zex+7-&p1-t9P|B=TZ9FDV{_xziPBD!#2-K2^UJ!_eYhma z2lfk4_Jm=^7U$;NgjrAxMMumPy#G`Elo_-W2M)`)A;j+mb8Ibq4cn|43XaEGOJBMT z1)Izryi$#qvhV$-r>z*o#oR0aMw)CZUG!V@>Fw$oOZ(jLNtWbI#T6`bQ%5Uel(-jsCr}P;SLJy07 z=H|$hmrU+6+xSoR_fA29?}E)J2;?N%jW>psd#^r-VWu|h;17}lC>jTwtI(R`XX@C* zlmxbHj9mYm|2I1?Fg>`?64x&4ue1~TMIFpP8Pzf5=-tQ1P2K7pf0CqFutJUC4 z&<#KhUm39SByE4Z6nie~uB`_nxQjV;umGJG8f%#YZyV~4|3dy>bGk-v`|-fwUn z`prLFKSv}nP#0if?MshtCslbSP138}rW^pIDfe9ZYpY?xxOF15{_i`CaN(X(-*3fHB(j}E z>*Y-hpU%&H~qEHpeGq?RWnB(SImoq;T<-VT6>?TkW4D}H%X0m{snR2G!`=l7lL zM_AcfOzZGN{Kt;DYe@d#r0idf=TC3m#8p)EwZFN{{cB`-yLWjhC0FoNm95)#B=6(+ zS#AEwcF1-XzIbL=wtRaJsfDE4_Y%K-JFE*bh6TpD8x%}#VPW|`K4~L2H@ARME@q*E z>$)w)pC~H5QWDIul||O<*@?nQQ}KVcnpXU;R-=l^$@>o-bmQG35^2yh0Sy|QJ}D-! z7fLvnKvzi=QfPW^PO_%%w@<-t7JOTy?aW1|$-<|UdX78)5$*2yz2ry__s=CilmpGG z51}`|q^k6O`C?_?n;*CboTGf=Vqj|ONsTElA(!pFbx$6HAlR^f>52YdM%jhHdjMu* zu-Jv}96e)OFDL_)t#&|Oz<4YqI1*~{0CnHBlGO(hf$9V#M^qzf`NXRW(#&eOTL@fp zy5LjM)IxiLl>OxOYi$Q0Xg~zT>{Ylb%4W&G84mS8~1)-QKVpfct(Cl8HEi$}Gw1iEsV_PxLZAORZvzcUfu;pRp@`N1_cW%ZwAAH1@u z+k`m&lS9ZI&bn^BdtPv!6rvWGdro3zJbR@_>H-0`lgUW+)t&bd@j`P zTBCk7c2q|zl@Q=d9WX&$d0l_09m%xZp_wttfexlI`ZHBV>WORgJS=Sw_94T+NRjb9 zMT;=FfxuG`3`#m3z3Ll-Uale1Lq3XRk>HG}H9F2eHaDatlaRRB!I<-5L@~UyG3DL_ z^g(_Zbj6$Sq(W^abe2O7nU~T2^)Mo09m)F)C*HGYu~$&hH>De=Q5HDZl%n)J!rcvkKUva~Hp1$As_-f-)twdY& z-y9x^8)Hd^wENz;63P{as9bVic#*2)E03@}g%o*t+O|@GEawq;nn!}>AYn0@+Eg^V zfUD^kEkf%!n|kn~8-=Jd+68uRa1{S5Q!}mbB=*GsjtHB;Hvj z^Cc~5TY<^Jju;EVJ#n8V^Rn`DTJIW~z0GDwS{V#sy&z{gskrw&K++QPQBXK6eXUYrCO~w;Wyc^ zI-WnI)R$j@+P#{5)7O8W!1vs7=>ZOGsP+>MT$0e&HL`P{Zi+SuekXc;=mmk`UjA7h zu0BEbhj%pjZK9lrhK7c(QX21YE=k_paTy=RcQ_wdR7{41B<=PZ$_lE&nr`BWYlT zp&+DNob#&6bn#pxLO)cq&zmNoann<=)uSbUyxy!&tmR&jIwpLOaE1UcHqm>K1{+_d z7noKoFX^QCR?FGNlONr(TsQU`2s+)cQ9n)%Uri$*j>VZCBHxL?szna*o$Vnf@6O6r zI$~SOGv+_=y)uW2^*2X#T-l&C_z{eckJEaDyU596(U`1@4eLU72EO7d8=D+8Ovl_G z@^K^w_{c@Oh*sg$tKl@s8zIfV2Ekx?7`qv_B2sDPq2)L6F&YcbXNvXCC~t57!1q~r zo`3W?(}$Kz28H-4jJ9Qeo=>hC`Iwl$HQM-6E$w`siJp%lQro&^DdRz!pW=HFVI^YF zI@*Mo{ORLSJ9w$< z@pY?}{<7{t=5M;Cl&I(?8G}XqkdzOF;;Gt>rn<3=<$V!d(s{kwKt6CYE^HjPo>Qh} z70-@BGOVi!X?&HKr9o%(NvWHtr{yP^+b=Iewv0x*=J3h zkn8ogEPQ;YeRl{TFZm3e=Xk&4Q!(Lo>xf}+EbwKUi1nx)7u z6*Z}`t_q)(R-SDiR&SqKFWoes%?=`;C9XHFrDB7KmQzK%mNVt&i;J+1re=LP6<^mo znpMXu(SvdH`Hb_0uba?HM2lSq{r0bB&Qpg_XyWVs#gyf^Nd6bxl`f-?GDyB!+h}Oo zAYq`Qc2$(t8R5@n8~{t`>|b(35BUcKLI5o0pe%~`CIbxXx}L7CzeEs)PcGaDlpn9g zu+Hh} zY<@-WZh(`wvHt=}fQKyrozRcAw*9c!Nv8e4oab@2va zOo|EkwYF|^Xu5CyCk2N{=wjh|UT&^Yo^W+JlNY~rX-EMqCMqh5-W)A_OA-qxfYp|c zps7rD`RbJ%j}4T%I12gH7^XSW!Zn73AmBw-F0|FYn@o-3mGZm)(0$hCi<6U+^GldK z?Zm}x5QEvbckeeM9$SFL`s{eB63YaiCBO=8g!Ze$MW-S3$zJJ7FTLAV^8ESpx~r!l zzQ@7}K;3Xb4|l`_sub1FjLk=GXmS}F-`99`p_oFN`o(H0%f10F0g`-5+@gB=&;!%= zR;>RL$roW^#{UlqyRG<(NRGU+?S0u=;e_9P;Sy6(VggKjIxl>EWZNJP4Zr%EbR0c? 
zynNfv{gs09f0%R78Il(~w%m9Egc;?AD8+o+vX|!u1dpE%wyQ$$SQa7_tH&CD>I>VM z6j(U`WMT&-DwyP{?J2vzCblIi7{Qu(&@caQ?y@B<@Etq0_fPT?0kZ!Akjs1f``LeQ zAF;ynszZs^DrQ#6X;t}TOECn(fm;OzlW;)MR~lFwsQgP^cuM=pN`=97{=ZNcPHlka zgZ*&tQ>uWswCFH%8Zg4wiS-S?S=VXB0ED(}PImS#aTBH%!g&$iG4=H6(`B`Y<7F6( zoV}CNQ%Ht`*w^T&^%iyc#l9{Eif9D)f#d{@>IOYf>g>;KkD~goEfSoYA_O|le};rc z7MLVtL1$fEz`{_~*Zs#8WVd}vZDIb#cuP$1;?y|A$INU=h^sptL)+SB@O}P+BA!;X zN4dLquN?IFZ<3sv`ZrMNdH%50|(u;ElIxbkhsO(t;kxzW;@lH^y`1UgV|W0 zj=Y3>nQw&ilPvg6D^Yb6>G=<0hXHP0rjH3Wi)Lsk> z-C6u(a2i@UAqo`Go6FT2bnoNo{y$S2lp^3|@)8n%EqQMKqFf}B?o>6?sFmVh73~bDWfYx!}{9w zOU&$mnsMvU#N$BF4A{gV@t~c}&(AND_qR`>-R$o$)Au=BBM{aurmS0E$PA+XZaE6+ z-XDK1Ies=Wv~R3KK+gRi@cQP^mLTQA91mt8lBLe)^$$^%^U4!OQsjW-4@d?-8nGBmmB{6N$@c;EtuAg=-~1wl9Ylv-Zi zJK3JZ=Q+J4FEgieL1%h6x^4cip8_-RQ52QT$H(W`nRe}$%u7^q0$hF!nLt7mSs=x` zdwQdsBWaQb!GIPs z5EbwWLgHW>1%*jrzZjTDA9xcCuuSWAAoOgqp6~@SBs}m&$DrcYlS?K&P53Vm`nP)_ zypdcr*x6}KTU!u${Db`fFy<^A)ERbX@!oy=;@fxqA>*`6Jd_3=KZ8{52n_Lx^C$9%T;FD|^Ybz3RM;q?)GlImWyL0T0zx<}fkz;1bO>m>%QzZ-F~6#+ zKN-F0CG$Z~+g`nQDn-UQxu>r$VvkhqIR{8(;&4%Kv45hYqVm`Pf#7Z7*t^#VNJF#= zi$2vyEiE?zFu7uVCMrC`?&|*i`!Rm^^)4$O(WK>NXS?9OM(XkF$(-++&~F0RhBw@6 zgNh`dRgcx~yX879d8OL5GmJJYAEUzLw|LEZBePv^M;{rQ)qy$mmk(8bp(R^$aE_ z=>vQteC?eg#b1j-Cf)WeBF2fJ88iY`fw8fOCOqfDas3l#=|962`ZDunc<(HFG>CV1 zh!91?*|`YFUf6f(XL3|CZPsV=icB;#G*anJ0%!-U$HGH~;md>dV-2ATZ8C%p;5{Ep znERe;dm0FqL(R$p%CV;4F{EaxkuS(5N#WvpeF%49cy3CvV3q6mb)Sh?#O~%*QU|=E zyj(htZXgre**u;RXHhbh>{F;C!|x2n84<~k5I&fsY=PcDnwC8r^4W-f%5V<~7I^6t}-gWzNT4R3@@PaM;G2|k)3Cskjxe@Iep7Rj3(HcU2GcCI8 zcSk=45w!*DR~Vcznn>jZ6?V%eFYwZh|I)b2|g*>AW z3#6W=I{pK^A;&XX#4$-HwjPO0=@xZ0q}XWqZeW@bjxi44&ms(cLy87&H{IIoOT(Po zA?Qay*dle`Dc;wvO+k6{WRhrh271{RT{kE3<0M`#?6n~7&Q2!78dnu LTB?Ottseaktfbe2 literal 0 HcmV?d00001 diff --git a/samples/winrt/ImageManipulations/C++/assets/smallTile-sdk.png b/samples/winrt/ImageManipulations/C++/assets/smallTile-sdk.png new file mode 100644 index 0000000000000000000000000000000000000000..5546e8b24a69738114bc17b2b2d493a6872d67ad GIT binary patch literal 1248 zcmeAS@N?(olHy`uVBq!ia0vp^av;pX1|+Qw)-3{3k|nMYCBgY=CFO}lsSJ)O`AMk? 
zp1FzXsX?iUDV2pMQ*9U+m{T%CB1$5BeXNr6bM+EIYV;~{3xK*A7;Nk-3KEmEQ%e+* zQqwc@Y?a>c-mj#PnPRIHZt82`Ti~3Uk?B!Ylp0*+7m{3+ootz+WN)WnQ(*-(AUCxn zQK2F?C$HG5!d3}vt`(3C64qBz04piUwpD^SD#ABF!8yMuRl!uxKsVXI%s|1+P|wiV z#N6CmN5ROz&_Lh7NZ-&%*U;R`*vQJjKmiJrfVLH-q*(>IxIyg#@@$ndN=gc>^!3Zj z%k|2Q_413-^$jg8EkR}&8R-I5=oVMzl_XZ^<`pZ$OmImpPA){ffi_eM3D1{oGuTzrd=COM+4n&cLd=IHa;5RX-@T zIKQ+g85kdF$}r8qu)}W=NFmTQR{lkqz(`5Vami0E%}vcK@pQ3O0?O#6WTse|n3$Sa zTDmy9nj4$D8X6i}IGH%PI2*Va8n{>*8k#!7%)qAC)y34v)Xdz{*vZAn(9qS;($vJl z)xyQx%*e^f(b2>Zrq?sCxFj(zITdDaCeU7}UJJZ>t(=Qe6HD@oLh|!-U@0IVBfliS zI3vG6!8zDeAv`lLCBM8F6gd#Tx}+9mmZhe+73JqDfJ4_R6N~M}u1=N)CT<3nuErMF zeGSo@LQaV310ACeN*YK>1ttVce;_72;R8AFtdp7t%r8a2jPa$hZ3Y7aqmZYIV@SoV zH(|GVn+U)puN1)>GjWDYb(7@t?Y2iuo7SC^e(wK3>@~xg zMI8?7PRhyOSNeDN&u*(nlZ&MfRzH8rq}_NqyH&<=^8B|I4LQ+k9H%eLW)>+)vZ|V=`jsDXVcc>oTy0n3v?M%&Q3sa2+aW|74&4eCtU0$WI7BOyoPGRM@CzI)G{Td{9My9)pICP(!TIQd&%E34(}N-e_yy&h_tq=Dpr8?}sTnpr6>ge0Hke>!%qVM5&;0f2Jn_m;*}^(UWfQgfpiQ(a>V(NLXiYC;HWpw3k|fv zB7M=P(Mazw{BLLz06@|k}+8u999Rp=A%Q#28p8q026a^5EA8&CIP+BzL-E$ z5U;WZ1jKlof}9QPAa+5PXg|zx3IXj%v3EjI{87fAFNQ9A1 zK_78p;`_B_T@di23(4OU^obP0?j+C>M?eD&bRgO&h#>@M2-Sfe(l;_P0t59RP>3!> zPZz4E4K;!t(t|-Dz|S9$I2*y+2X-26^*NW=G6ngONI@`N-O$ibozO!%ID)S()Y#Z~ z4MR^)TkN4t3=1S7$=ZQLjdcV#nusD`f=C!#AaD&4>4gg>nS#VjKb3$Dva|aWIFR_c zQR0^Al955WP#uUa7Q0s0M{gqOH2S}3eAS!i6c&WmJ&h*ff(a<`JbX0PgT=l3`$TJw zVm2@b0!BO(WB?q83dW)XNj7j(kobv?H^v*LZ)BivYz((NW^~Bt7!(RUY@u%fw}il< z5V$cEYH)NN<7=@7`bK)jhI)pF;D&lo=rQO~OSpjr#K6GP$k4#T0197^wFxAWkbx-l zdR~k;@8?*9zm0`i63|Ezj^KpD1+4FYlYTf7j_8LA0$Ms60G%;`-ndZW+HgMA3P%$# zA!u(a0uBrO*j*UrE7||K>QAxW|C^5Kipl7%ZH+Is%KDOc|E=Brx$(uupGOBBC_XO) z@xhS4pqLB*>{zjZA9W&g#@{61hY*Sv+m5)T0+b}=Bvd41E{U%VaThHi>Y1|RvbI|+ zw@k=ZJvsko2$u+`1`HY(#4C0`53{LTETb$>3x&4;fIUU4nHvFs7-;~Yb`t>D0|3a1 ze+9M}el#z7O8#kXR06D-KRn&=(MxQu1+ANFLH`+J&1((ve~j^EVll1t#Qz!NOY)z2 z{5{5(2tgdVsvny!d+2N)KOgOxdVkw-BraF?;GNO za9OEniGGuBu)16 z$>uEcspvWKb+0@@sF06pX&NeVHU5rX6F5-wvIBVp@ zXDRl~j_4e_Tl^-YaYilOeAiD*|K{A7WFK{->m?vKBL~rCZb32)S%{f#^HRvUdFrq( zF0A_!TEVVQ%bto6a@0<*o(eD9Z+(q`Yx}O|?v9-S398A59#s`S^kVXz!=CBA<#efc zsh*F5mcKvl*k8a87xt2+cFx_95oNe*csmMV5oS9BAV0eeD^a=;)cksx&SxE1;xJex z3wOC8&0-3NQYL47c*lZlZ|>N#8YC}AlMrn;e9b7ds0 zO=3?fGo`f!A>THu9iij=VC8rhBir(2m&*Q)G18kB4f{i%{Q-Im3^7&J+DvUTXz<&` z$QthwWf3$J|BwodRn6LgnC8PlO<8#n^%8z&Sb8y zQU82iy-(NBbaZCNYb9~*MYZXxKbI;D#^-uf&J@SvUb9JulO3szm|X#>5Xw)KrOB$j zVE(B{%LyRk{LX3n_Y3se%3-NtmJ9(j84SmsymYJRBCNpecmXlOe>fn z-V&UdsI)Q4u0Qn{UlT7kT2cO{cVd~p>h99dE2$bdxKxv4u*!QqQC`(5x;5Yup=t|T z37oE2SurUjFOR9RY2?BTh)Pn})*}D*p~z0tfyL(V#qIpi{waKAbK?*l%qvy(Jom7F zWKfVKtgQDB0WPQ8z+!qJ)-vc1zPo#b)TZssqlgaQ8mv> zNr(42u_MpFy7LOPAHSLV!OylId_>z?Ha@2>J1_EFl8J4SiM6syQ!Z{G{5t(i4zEWu zeq=C=A7LDtROB`+=u4`2wHc9~l=z{ey0vP_o`pFwyW&KJJjvji$(Y=aY7iyS=^|vLO=7Bu6BRl!*};Rl)2P-*Ypn~=+y3I%H7uAD&Encu{-I9<0~hp z6!VcCa|I~|rA0x+-!}+gb#$Du9$_4KX`v;uZ3u6sbGVKU#!*Qt+sE;H{f*v7FSZ!RZKar(0Rb1AwH~z2Bvw?@iEVL%fBVvE5Pr(^8w6 zqDY5&m9Q{vb~?+YN;%_%y?W2v22NH#Y9|BR{lvR8O&*_`HmB(&$0lW?ORD@txg7$}W<9QLg5|L$UsO*#y?2d=r0jJkz76iRoo$<`Ze(f9^@2+< zkRs}<#-+pJ!@_fxsUx!5hni{nXODp!C9UEG)1en?!or(t4Y%=m2g)V3#z_f4 zUzU6NS2(yTyH&pb!zN=-&zZBX1)w}!kYVC)!xt}6of)aJm)u5CO9Je$>7B}fa=RhX zcOKstJ|PzkwDWemc%%)|7eLW+W)~Q)mpm^k`4etER6a0TDD3@yWkizMc=G7Ck-zXY z5SHKgvQuc>ifW|H*prLq>r9QV^A{{eSPX!Xj& zjbjL_{JknQo-Z^K62j)Qet25D|DLwWIR(SsD`!gUvIU59+vu0Mi4xPVM_N#uWwJOg z%xoqMuZ{9s&Nw`nzR^%xgMda48S@$-dh$E6xq3s&J79=-!;MO>-CcjPcYt>>hEbhN zj|`j$6<=be^{Y;+QTY(RByl0KrOHH}g8zli6&;A;-MxNyPoVI*B0EJ}!-4zM7$5cS zscT1b20hB@9yU$PswIGp+~|u3^>7*n9@B294gJyNvUq&L0UYko>g=}n^y2dE9J};u z%BWO1L0?FOY4G5Dc9PmOMJsuqnL~HSJge)0AB!EK&5;cVatYQzv1MvQ0&>dok5E@# 
zUSIHdoKGb=xXrzFC95c*m4>8tLthj2m?c9v{edPqzL*{jT20(A(zK2%9g`THSn=|F z2OL37>n!%H^i@;$iQ7`3!$)5|8K7rgnl=oyT6!R9o%D_9c|DEvp(OGV{6*$?{cHuZ zcJretn4zS$V8>wO6ZYX9fDN?`8M~9GvYgLOVr%X?TkqcMvA-u)(HNr`fxqbw`;|!F zO0S4iqPph&JUOeHQKo#Z%BlMSyM3t-T>qS`Or}KZt|}E7NySJPnt)EKC~y_e;Rcfz z@6l)s!Pu?_xz>4wxeJQB3+8pl7qjei6XkcvAe_>#P27?f^`)RO;g^K?N6SmM84UkP ziIcXMO*m!czt2Y9hg0RoKs|4phJLK!Gy5y8&MMz2Z(#NpX-@w7CfGdtSVGy=>1X5R z1lL&kunb|Q^}ncD*1DTqOdifoc+8xOA62dzAu?iHCz7{W`Kgx6VYe8Tow~kV{#>Vg z$fAYTnBA5ldagS&9(#wASn*!&rM;sdvLu^5IOFA@GK!)m(7UxyWMrND8P7r>gX zoo!?PdTi=Ccu2O@c;DLmj^X8cXrodE6iRY3siuH`pCZNM85^i+vvPN`I|2vpB~OXm z>^&=_`21otc*Kr|t>3SE!cE7z%jbK%LVnxttnPy)1I@5)eBbavbkuE`-Ps;L{zN(6 z-Md|>)!697yCYMQ8lr(mRs4^O3qx3yUWzl0xQ{$ec11uVRs|-cAvhA z714Q^UzNf3b6XKHA{c7i3Q;o?-P9?&r`p_3g7K5OhCg+3^~0vd_iW{=*t9vlWxfe9 ztfmNqs5Q+Cq$U$?K5lBvDyTP$?Mx%R{&xA4f;5+A3x47Psqt0!#Uky*9y$Z61r$hz=1>QzG{9d0Z^ zW;>GYm&C3>T=2&Vb?eonS=roU>(8h^m6LA`gpjizS!|0;ZqDalbgDVS5OZxH) z7xqdS3rAA(= z2C_=Yb1{1xy*PSgpEKpjnjB+*HmiIRWM2r7wFjS7%U2ZG-z%SWG4%z;KPaDnS73ip zK7Y|2ANAr_^v@UMzowr49b?VwbK-SH`u7mp!LQZc9$K@p>LCn)az@bsMBUpvp4neYAPeV_OG zoin*nt3zyNI?cr4a5kJ!kc+(!ns3YL*wg>BjSsQc0u?(}6$2-!G-3q8F{N-Kgy+b` z>me>AmZopK4*BD77E&26Ruvl&&XB-zlGv<6N|P%wYm6o^O(~Y7LMnVBv|gqNAPhD% z67Vu<0D<)tejG8aN&H6p?Kk>oOIjMeVX9hnueXc&=*3~P#b0|-mha!8@# zfB^*T3rQ-IGAK;84@jZ2*iPVL;(Q`NCyBP<|N0bY-zzv7By%&(~AXq z0RS6d0>M;_jYbLb@ecM{?mcPCQJ^ZZLIO?tm0^BA*wX*hmJx(NVik<=V0gpi1Vkmn zDi}?MmH40-A3R^Ckir_&+?@Bhf)FCx1WAJtSdO2lE<^SS?<|TJ)0@p^(y=fHktOG8PQkJR1KoDwA8-{F#>@2OkqY9vw)5ofiZ<81CJbt~lJZJ`Tv_rFA|n&f8e# zZ@W@WORzY%u6mJLVvFA)i~7r>AmGH>_)+3VyyBxWv!Q_^*Yg+TjCkuY*?EXJ%C&ZLc_TFUhB@tlPKZ ztuQIYLNMqid6%PJV$8LS%K7SL_)v88<3F-zz=d?{D?YBDk z(@kr0US82Vnfi$6EjP7mr9NXVqdWecyFu9bQ+-vz8~*Jvx9S@}PI#nNetuSS!i8wk z+2%t+i}JEQOZ7g#c_LBGC{o4ST9^^)NT7?J5)0N#T*h;_Wx;TqK{z%zf-s~q+YwB9u^OgKBG`gUUdml{92j!Tihu~5eK)(TydZIShZuokRiT}2N0 zcDc2}bHhff=9HgJPtLf-jEF`G_WFA6JJQ)tb$hmVhTVW;aR_-}cG$zM?sa9##irpc z<&R9Eq<CAi4DOoQ%=KG!OUUvWW#nHkpQ#kO{GHv}x)0)lUe%p1v zL%`CXz;=(464l$UN(Pd~u6Y!+_QyDFdT=0p(Y!NmdYhq;8m)2hzSOXmU)tVLhkyGc zF~;%vBmOe}1}F9bBmR~m!XbgLKWR8P+QBib-tiabEv)6`w%^YW{6}_dhW_G+^FC7K zkvB!3J8O+ct8^=>yBb0}b`Ge|l+VpAZvSFftI+ibddotU*X!`{FA_jPVWDxa^xdUq zq+T0wnf!A6Rlfo5wU>UwtMlr*`CVPU#q2q2kjo0e5r6pT543r02EiydC0Asaw!_BW zB(j9N38zY zVSnnVYg@*N=YGSlDqVZC^-JyRY6^JHUNANN_oKml)nKDbi%`78&Lwkq>3n8lOaK(M z%XIY)`(*v_R>j{Wk6lNV(-IeGdy{*z&OsHogMSnZ|3Tg#&#R$#Tys~HmZY1|0$u&~ z8{Z!cZXd0F3%?F04BZXB^SFofI5YqCEdel3^w)^1&PVosU-?Y zsp*+{wo31J?^jaDOtDo8H}y5}EpSfF$n>ZxN)4{^3rViZPPR-@vbR&PsjvbXkegbP zs8ErclUHn2VXFi-*9yo63F|8Xy|HaX=q~T zXkuhyoy6KF3~uM1wiR?bDKi6!|(A^G_^uoMuGkzbNu zoRMFk;2dnK5T2Qrl3!j7iX4bvT~doO%TiO^it=+6z@clEiN$tTLlY-cQ*#SPb7Kqa zzJ};cAt%K2fsWA!B@Lvc0uutJKM)h1@PQn7)=A9+=9eO1#>h2t<6vN5a`1F<45_&F zCgNglvw^_UQ1K63?>IoPgvE$SQc9`RFR@5oEMEQcuh2HP zp6FYXzh@K|KXK{P|2wZ_Iz!UOKY}UZOO97`J6F2O-)1bnXSiIfU8r#i)0E?L&d-t2 z)c7~uKsr%c^Nr5aQ$j_iEzTP2CUULicAD~e(?KcrHI`iRGF9`Xqr9aHn7pUHd@)Un zYj5;f1(o7sYHzMT6tnTW>+t@;Wxnk!*_3;D;aQ7@>=7RYwcvWU3HaL8rD zamEmP#-v^oIY;GJrcGr_YyWP$|7PL>Nd~R3E+*rLwl`M(?ya};313mc{QT!#{=kD~ z*_z*~*>3GM0X=d#Wzp$P!Xu`&Pv literal 0 HcmV?d00001 diff --git a/samples/winrt/ImageManipulations/C++/assets/tile-sdk.png b/samples/winrt/ImageManipulations/C++/assets/tile-sdk.png new file mode 100644 index 0000000000000000000000000000000000000000..cdec0dbdccad7ae5dd0e92e3c3015449fffb7450 GIT binary patch literal 2665 zcma)8dpwj`AD^(gN@_z1&pSR_@0Ph|CX;bDv1_Dpuf%xfF$Ocwm>HL$5^q~u7S+3& zTD-Ok5}6Q6nB-Pe79nQGbi)dHH^jK^u&Le8=e6%Xf1Kx>^ZS0kzu!6M_xYW3%GG6$ z)~d~`5D0{plcPNyfmlXV_0O-+RJHCMS4UL8O+p86p&J}1jAZa3gbfo8fB+{Bg9Xtc 
z22&LF5VAlZ)B(1;x6qsBOa@^NnlX<-M{#&6HUeQ`6~$wKA&?LVfLLrU1vz|IiUin9 z3ewArhNbcBpdhwmG#_${c5w%zLqKyT(#jIBh$5>5IFOJ5L~%m70&)}uxu8o{_2;KC zNMHdX451)@56YY73fRGX2rxrqQ6QFx1&BB_j|#1#k`To7`yryx~)G@H#N6A9)HW;k;^k$|?KuS)BlTqYX-%Ae=h|P1LEg9mWlnv{$CwipSRw5W!7=Gl{Lv|$c&&wI(fkRs zU|z&5m=`hsM?z)(;O1{jWK<1uq7yUx#%3OaYE=>Oc0k_VAnz%Ys)BE!w}l;RWk)8z z(VENsO!gwpvfivAyHe0KY8rXrP=K=c_{hkX#>B^dA*a(?jrF?TX05mq{A2T$rv-;I z$g)GLQthe-50+`?nAorF8~K&f1xcDT?^m3L8awoS^t+|B+_)o2M|S0bvx#3RHK)Q97{jhg}3G8uH@@X&CJqU`E#W{P30#``^{%(OwJD-=-tyauopL0;VGqI3c5q4 zr*$ObhOaD=K;y>j@OB(*Y&F$>fHn1Sym(M#EK#3=slAk%n1|!amZGMfQyrkraP~!> z)-W|mGF&kTS}6x41?jgc3I-VXbg1n0kWp%YDJoLz5y0_884jFCy2t_@V*^{WqK&Vj z#j#K7hFadS*9^0nR!#e8^3SKrl7nC8{Bpi&c{jLWmq)wKWPA5!c}o(GC~7L-m_Cc3yhuqVprfBOTg1$Tmq++%`0NgQG)Qb0okCcS zk9B*@D2`=mr`HE}>8<&1fBcQj`m7M-x`@t+M|s|8?Qg!+3NgcXlQ8jD%v&B1)xof_ zvtup#a{1X2R~F;h=%WFHf!+0#aD@pljC}qB!EZ!qJ$7e9@Zl7H8MCd!Rj(*Pv;3cG z%mXw7edEMLDMZOdby>zc2ah&@JVyxK?m{O#8snd zB#U9q_hK$7E?2vRqqBV^4jt&zytD1ruf;jKdWR&rv&!4_1MZjmDJ7le!AZhu7>vQR ziRvEGNi$i|(y(#Xj$-A#llvm;X70rrh34!dIf$7uw@y^y^HeXtkYnZ^naWn(qCtz- zRfkyezQXmBx(s)(K7(yCY7o(?6Gf}9*p%~{3*O1hsxMwH+d6PUivpeTLWS#Ui1`h) zOGVoG;&%DQ#y)xL)3(9rUc+2ZFQ0p;CVlP^Kc4g1q`SM{{bQW6-Q`7Hn^&lZr_L>t z#!LgqOV2XP*X;0WYFjJkQF7Q(Tf|d!5BC1jHIkC(IW;IO;F=`B;PJKz%Ko~rjCVcv zvZ7<@FI+af-v2L@hhJkH+H-cbpZ4+lUtE7MhDOPlqpA&Q6Tso6^`10k( zB-VRlVRfHXfnwSz-fc%$z3I=vkI#gxDJkFIxg}>XP@`eQuV@r|{P(a>u6% zbshAL;`_fWk*`fVne`@i^!iZ$+wJrA7$N>k51Z_K`RLmbsqYJ5Ae zpDHirz501}!eNS?kO8L1^C!)ktV)`Jj^`)VJh@*TSEK%sN(#lz}UAsoG zRciR3+%kN&g+^=xSy?_x3snMB(~e+cy^QWP;%#Cq>n)f`f3<94`odnN$+D3+?MvMI zW0MyJYI6zx>iRu%7OIN=he92%^@l>eWF=PB=s(r^zbEmR*TvEjf4XsIX4m^8zOJ*{ V^V80WL-UV6CkGe%d|Utc-vFTWTn+#L literal 0 HcmV?d00001 diff --git a/samples/winrt/ImageManipulations/C++/assets/windows-sdk.png b/samples/winrt/ImageManipulations/C++/assets/windows-sdk.png new file mode 100644 index 0000000000000000000000000000000000000000..67268021df6fbd0dfdd5220f669db5e07d83f620 GIT binary patch literal 2997 zcmV;m3rh5fP);00009a7bBm000id z000id0mpBsWB>pPPiaF#P*7-ZbZ>KLZ*U+IBfRsybQWXdwQbLP>6pAqfylh#{fb6;Z(vMMVS~$e@S=j*ftg6;Uhf59&ghTmgWD0l;*T zI709Y^p6lP1rIRMx#05C~cW=H_Aw*bJ-5DT&Z2n+x)QHX^p z00esgV8|mQcmRZ%02D^@S3L16t`O%c004NIvOKvYIYoh62rY33S640`D9%Y2D-rV&neh&#Q1i z007~1e$oCcFS8neI|hJl{-P!B1ZZ9hpmq0)X0i`JwE&>$+E?>%_LC6RbVIkUx0b+_+BaR3cnT7Zv!AJxW zizFb)h!jyGOOZ85F;a?DAXP{m@;!0_IfqH8(HlgRxt7s3}k3K`kFu>>-2Q$QMFfPW!La{h336o>X zu_CMttHv6zR;&ZNiS=X8v3CR#fknUxHUxJ0uoBa_M6WNWeqIg~6QE69c9o#eyhGvpiOA@W-aonk<7r1(?fC{oI5N*U!4 zfg=2N-7=cNnjjOr{yriy6mMFgG#l znCF=fnQv8CDz++o6_Lscl}eQ+l^ZHARH>?_s@|##Rr6KLRFA1%Q+=*RRWnoLsR`7U zt5vFIcfW3@?wFpwUVxrVZ>QdQz32KIeJ}k~{cZZE^+ya? 
z2D1z#2HOnI7(B%_ac?{wFUQ;QQA1tBKtrWrm0_3Rgps+?Jfqb{jYbcQX~taRB;#$y zZN{S}1|}gUOHJxc?wV3fxuz+mJ4`!F$IZ;mqRrNsHJd##*D~ju=bP7?-?v~|cv>vB zsJ6IeNwVZxrdjT`yl#bBIa#GxRa#xMMy;K#CDyyGyQdMSxlWT#tDe?p!?5wT$+oGt z8L;Kp2HUQ-ZMJ=3XJQv;x5ci*?vuTfeY$;({XGW_huIFR9a(?@3)XSs8O^N5RyOM=TTmp(3=8^+zpz2r)C z^>JO{deZfso3oq3?Wo(Y?l$ge?uXo;%ru`Vo>?<<(8I_>;8Eq#KMS9gFl*neeosSB zfoHYnBQIkwkyowPu(zdms`p{<7e4kra-ZWq<2*OsGTvEV%s0Td$hXT+!*8Bnh2KMe zBmZRodjHV?r+_5^X9J0WL4jKW`}lf%A-|44I@@LTvf1rHjG(ze6+w@Jt%Bvjts!X0 z?2xS?_ve_-kiKB_KiJlZ$9G`c^=E@oNG)mWWaNo-3TIW8)$Hg0Ub-~8?KhvJ>$ z3*&nim@mj(aCxE5!t{lw7O5^0EIO7zOo&c6l<+|iDySBWCGrz@C5{St!X3hAA}`T4 z(TLbXTq+(;@<=L8dXnssyft|w#WSTW<++3>sgS%(4NTpeI-VAqb|7ssJvzNHgOZVu zaYCvgO_R1~>SyL=cFU|~g|hy|Zi}}s9+d~lYqOB71z9Z$wnC=pR9Yz4DhIM>Wmjgu z&56o6maCpC&F##y%G;1PobR9i?GnNg;gYtchD%p19a!eQtZF&3JaKv33gZ<8D~47E ztUS1iwkmDaPpj=$m#%)jCVEY4fnLGNg2A-`YwHVD3gv};>)hAvT~AmqS>Lr``i7kw zJ{5_It`yrBmlc25DBO7E8;5VoznR>Ww5hAaxn$2~(q`%A-YuS64wkBy=9dm`4cXeX z4c}I@?e+FW+b@^RDBHV(wnMq2zdX3SWv9u`%{xC-q*U}&`cyXV(%rRT*Z6MH?i+i& z_B8C(+grT%{XWUQ+f@NoP1R=AW&26{v-dx)iK^-Nmiuj8txj!m?Z*Ss1N{dh4z}01 z)YTo*JycSU)+_5r4#yw9{+;i4Ee$peRgIj+;v;ZGdF1K$3E%e~4LaI(jC-u%2h$&R z9cLXcYC@Xwnns&bn)_Q~Te?roKGD|d-g^8;+aC{{G(1^(O7m37Y1-+6)01cN&y1aw zoqc{T`P^XJqPBbIW6s}d4{z_f5Om?vMgNQEJG?v2T=KYd^0M3I6IZxbny)%vZR&LD zJpPl@Psh8QyPB@KTx+@RdcC!KX7}kEo;S|j^u2lU7XQ}Oo;f|;z4Ll+_r>@1-xl3| zawq-H%e&ckC+@AhPrP6BKT#_XdT7&;F71j}Joy zkC~6lh7E@6o;W@^IpRNZ{ptLtL(gQ-CY~4mqW;US7Zxvm_|@yz&e53Bp_lTPlfP|z zrTyx_>lv@x#=^!PzR7qqF<$gm`|ZJZ+;<)Cqu&ot2z=0000WV@Og>004R=004l4008;_004mL004C`008P>0026e000+nl3&F} z0002sNkl}=C$qwI-NYOoT~TF4nB*YE@uK^dDB zWR30ww$s-POn&BXW+rPygeRt``M?BN0aGCJxdu1^r8s+Rdsm!&%k_ve0V-e%jKx_a z&Kh6@6u`k0^$aBgg{A`b{%e|-{MpA6H~>atXo8lmfjMc|h_j1WOH-MBto3b%xo=ah zD6ZCiGu#1JA7iMs;nIYI&w#sfAynGXP=aR{pby*>2XE7wE-hUGLmy+Pwc&3e3AYq~ rL3F-plJa|oO~TqubZP!GOu=ISd<8xF6%`=M00000NkvXXu0mjfiA9h! literal 0 HcmV?d00001 diff --git a/samples/winrt/ImageManipulations/C++/common/LayoutAwarePage.cpp b/samples/winrt/ImageManipulations/C++/common/LayoutAwarePage.cpp new file mode 100644 index 000000000..9449fbead --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/common/LayoutAwarePage.cpp @@ -0,0 +1,452 @@ +//********************************************************* +// +// Copyright (c) Microsoft. All rights reserved. +// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY +// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR +// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. +// +//********************************************************* + +#include "pch.h" +#include "LayoutAwarePage.h" +#include "SuspensionManager.h" + +using namespace SDKSample::Common; + +using namespace Platform; +using namespace Platform::Collections; +using namespace Windows::Foundation; +using namespace Windows::Foundation::Collections; +using namespace Windows::System; +using namespace Windows::UI::Core; +using namespace Windows::UI::ViewManagement; +using namespace Windows::UI::Xaml; +using namespace Windows::UI::Xaml::Controls; +using namespace Windows::UI::Xaml::Interop; +using namespace Windows::UI::Xaml::Navigation; + +/// +/// Initializes a new instance of the class. 
+/// +LayoutAwarePage::LayoutAwarePage() +{ + if (Windows::ApplicationModel::DesignMode::DesignModeEnabled) + { + return; + } + + // Create an empty default view model + DefaultViewModel = ref new Map(std::less()); + + // When this page is part of the visual tree make two changes: + // 1) Map application view state to visual state for the page + // 2) Handle keyboard and mouse navigation requests + Loaded += ref new RoutedEventHandler(this, &LayoutAwarePage::OnLoaded); + + // Undo the same changes when the page is no longer visible + Unloaded += ref new RoutedEventHandler(this, &LayoutAwarePage::OnUnloaded); +} + +static DependencyProperty^ _defaultViewModelProperty = + DependencyProperty::Register("DefaultViewModel", + TypeName(IObservableMap::typeid), TypeName(LayoutAwarePage::typeid), nullptr); + +/// +/// Identifies the dependency property. +/// +DependencyProperty^ LayoutAwarePage::DefaultViewModelProperty::get() +{ + return _defaultViewModelProperty; +} + +/// +/// Gets an implementation of designed to be +/// used as a trivial view model. +/// +IObservableMap^ LayoutAwarePage::DefaultViewModel::get() +{ + return safe_cast^>(GetValue(DefaultViewModelProperty)); +} + +/// +/// Sets an implementation of designed to be +/// used as a trivial view model. +/// +void LayoutAwarePage::DefaultViewModel::set(IObservableMap^ value) +{ + SetValue(DefaultViewModelProperty, value); +} + +/// +/// Invoked when the page is part of the visual tree +/// +/// Instance that triggered the event. +/// Event data describing the conditions that led to the event. +void LayoutAwarePage::OnLoaded(Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e) +{ + this->StartLayoutUpdates(sender, e); + + // Keyboard and mouse navigation only apply when occupying the entire window + if (this->ActualHeight == Window::Current->Bounds.Height && + this->ActualWidth == Window::Current->Bounds.Width) + { + // Listen to the window directly so focus isn't required + _acceleratorKeyEventToken = Window::Current->CoreWindow->Dispatcher->AcceleratorKeyActivated += + ref new TypedEventHandler(this, + &LayoutAwarePage::CoreDispatcher_AcceleratorKeyActivated); + _pointerPressedEventToken = Window::Current->CoreWindow->PointerPressed += + ref new TypedEventHandler(this, + &LayoutAwarePage::CoreWindow_PointerPressed); + _navigationShortcutsRegistered = true; + } +} + +/// +/// Invoked when the page is removed from visual tree +/// +/// Instance that triggered the event. +/// Event data describing the conditions that led to the event. +void LayoutAwarePage::OnUnloaded(Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e) +{ + if (_navigationShortcutsRegistered) + { + Window::Current->CoreWindow->Dispatcher->AcceleratorKeyActivated -= _acceleratorKeyEventToken; + Window::Current->CoreWindow->PointerPressed -= _pointerPressedEventToken; + _navigationShortcutsRegistered = false; + } + StopLayoutUpdates(sender, e); +} + +#pragma region Navigation support + +/// +/// Invoked as an event handler to navigate backward in the page's associated +/// until it reaches the top of the navigation stack. +/// +/// Instance that triggered the event. +/// Event data describing the conditions that led to the event. 
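// For illustration only: the three handlers below (GoHome, GoBack, GoForward) walk the
// XAML Frame's navigation stack. A minimal standard-C++ sketch of the same behaviour;
// "NavStack" and its members are hypothetical stand-ins, not WinRT API.
#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

struct NavStack
{
    std::vector<std::string> pages;   // visited pages, oldest first
    std::size_t current = 0;          // index of the page currently shown

    bool canGoBack() const    { return current > 0; }
    bool canGoForward() const { return current + 1 < pages.size(); }

    void goBack()    { if (canGoBack()) --current; }       // one step back
    void goForward() { if (canGoForward()) ++current; }    // one step forward
    void goHome()    { while (canGoBack()) goBack(); }     // keep going back to the topmost page
};

int main()
{
    NavStack nav{{"Scenario1", "Scenario2", "Scenario3"}, 2};
    nav.goHome();
    std::cout << nav.pages[nav.current] << "\n";           // prints "Scenario1"
}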
+void LayoutAwarePage::GoHome(Object^ sender, RoutedEventArgs^ e) +{ + (void) sender; // Unused parameter + (void) e; // Unused parameter + + // Use the navigation frame to return to the topmost page + if (Frame != nullptr) + { + while (Frame->CanGoBack) + { + Frame->GoBack(); + } + } +} + +/// +/// Invoked as an event handler to navigate backward in the navigation stack +/// associated with this page's . +/// +/// Instance that triggered the event. +/// Event data describing the conditions that led to the event. +void LayoutAwarePage::GoBack(Object^ sender, RoutedEventArgs^ e) +{ + (void) sender; // Unused parameter + (void) e; // Unused parameter + + // Use the navigation frame to return to the previous page + if (Frame != nullptr && Frame->CanGoBack) + { + Frame->GoBack(); + } +} + +/// +/// Invoked as an event handler to navigate forward in the navigation stack +/// associated with this page's . +/// +/// Instance that triggered the event. +/// Event data describing the conditions that led to the event. +void LayoutAwarePage::GoForward(Object^ sender, RoutedEventArgs^ e) +{ + (void) sender; // Unused parameter + (void) e; // Unused parameter + + // Use the navigation frame to advance to the next page + if (Frame != nullptr && Frame->CanGoForward) + { + Frame->GoForward(); + } +} + +/// +/// Invoked on every keystroke, including system keys such as Alt key combinations, when +/// this page is active and occupies the entire window. Used to detect keyboard navigation +/// between pages even when the page itself doesn't have focus. +/// +/// Instance that triggered the event. +/// Event data describing the conditions that led to the event. +void LayoutAwarePage::CoreDispatcher_AcceleratorKeyActivated(CoreDispatcher^ sender, AcceleratorKeyEventArgs^ args) +{ + auto virtualKey = args->VirtualKey; + + // Only investigate further when Left, Right, or the dedicated Previous or Next keys + // are pressed + if ((args->EventType == CoreAcceleratorKeyEventType::SystemKeyDown || + args->EventType == CoreAcceleratorKeyEventType::KeyDown) && + (virtualKey == VirtualKey::Left || virtualKey == VirtualKey::Right || + (int)virtualKey == 166 || (int)virtualKey == 167)) + { + auto coreWindow = Window::Current->CoreWindow; + auto downState = Windows::UI::Core::CoreVirtualKeyStates::Down; + bool menuKey = (coreWindow->GetKeyState(VirtualKey::Menu) & downState) == downState; + bool controlKey = (coreWindow->GetKeyState(VirtualKey::Control) & downState) == downState; + bool shiftKey = (coreWindow->GetKeyState(VirtualKey::Shift) & downState) == downState; + bool noModifiers = !menuKey && !controlKey && !shiftKey; + bool onlyAlt = menuKey && !controlKey && !shiftKey; + + if (((int)virtualKey == 166 && noModifiers) || + (virtualKey == VirtualKey::Left && onlyAlt)) + { + // When the previous key or Alt+Left are pressed navigate back + args->Handled = true; + GoBack(this, ref new RoutedEventArgs()); + } + else if (((int)virtualKey == 167 && noModifiers) || + (virtualKey == VirtualKey::Right && onlyAlt)) + { + // When the next key or Alt+Right are pressed navigate forward + args->Handled = true; + GoForward(this, ref new RoutedEventArgs()); + } + } +} + +/// +/// Invoked on every mouse click, touch screen tap, or equivalent interaction when this +/// page is active and occupies the entire window. Used to detect browser-style next and +/// previous mouse button clicks to navigate between pages. +/// +/// Instance that triggered the event. +/// Event data describing the conditions that led to the event. 
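// For illustration: the XButton handling that follows reduces to this small decision
// function -- chords with left/right/middle are ignored, and exactly one of the two
// thumb buttons (XOR) triggers navigation. The struct and names below are hypothetical
// stand-ins for the WinRT pointer-properties type, not real API.
#include <cassert>

struct PointerButtons
{
    bool left = false, right = false, middle = false;
    bool x1 = false;   // "back" thumb button
    bool x2 = false;   // "forward" thumb button
};

enum class NavAction { None, Back, Forward };

NavAction decide(const PointerButtons& b)
{
    // Ignore button chords that also involve the left, right, or middle button.
    if (b.left || b.right || b.middle) return NavAction::None;
    // Exactly one of the two X buttons must be pressed, otherwise do nothing.
    if (b.x1 ^ b.x2) return b.x1 ? NavAction::Back : NavAction::Forward;
    return NavAction::None;
}

int main()
{
    assert(decide({false, false, false, true, false}) == NavAction::Back);
    assert(decide({false, false, false, true, true})  == NavAction::None);
}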
+void LayoutAwarePage::CoreWindow_PointerPressed(CoreWindow^ sender, PointerEventArgs^ args) +{ + auto properties = args->CurrentPoint->Properties; + + // Ignore button chords with the left, right, and middle buttons + if (properties->IsLeftButtonPressed || properties->IsRightButtonPressed || + properties->IsMiddleButtonPressed) return; + + // If back or foward are pressed (but not both) navigate appropriately + bool backPressed = properties->IsXButton1Pressed; + bool forwardPressed = properties->IsXButton2Pressed; + if (backPressed ^ forwardPressed) + { + args->Handled = true; + if (backPressed) GoBack(this, ref new RoutedEventArgs()); + if (forwardPressed) GoForward(this, ref new RoutedEventArgs()); + } +} + +#pragma endregion + +#pragma region Visual state switching + +/// +/// Invoked as an event handler, typically on the event of a +/// within the page, to indicate that the sender should start receiving +/// visual state management changes that correspond to application view state changes. +/// +/// Instance of that supports visual state management +/// corresponding to view states. +/// Event data that describes how the request was made. +/// The current view state will immediately be used to set the corresponding visual state +/// when layout updates are requested. A corresponding event handler +/// connected to is strongly encouraged. Instances of +/// automatically invoke these handlers in their Loaded and Unloaded +/// events. +/// +/// +void LayoutAwarePage::StartLayoutUpdates(Object^ sender, RoutedEventArgs^ e) +{ + (void) e; // Unused parameter + + auto control = safe_cast(sender); + if (_layoutAwareControls == nullptr) + { + // Start listening to view state changes when there are controls interested in updates + _layoutAwareControls = ref new Vector(); + _windowSizeEventToken = Window::Current->SizeChanged += ref new WindowSizeChangedEventHandler(this, &LayoutAwarePage::WindowSizeChanged); + + // Page receives notifications for children. Protect the page until we stopped layout updates for all controls. + _this = this; + } + _layoutAwareControls->Append(control); + + // Set the initial visual state of the control + VisualStateManager::GoToState(control, DetermineVisualState(ApplicationView::Value), false); +} + +void LayoutAwarePage::WindowSizeChanged(Object^ sender, Windows::UI::Core::WindowSizeChangedEventArgs^ e) +{ + (void) sender; // Unused parameter + (void) e; // Unused parameter + + InvalidateVisualState(); +} + +/// +/// Invoked as an event handler, typically on the event of a +/// , to indicate that the sender should start receiving visual state +/// management changes that correspond to application view state changes. +/// +/// Instance of that supports visual state management +/// corresponding to view states. +/// Event data that describes how the request was made. +/// The current view state will immediately be used to set the corresponding visual state +/// when layout updates are requested. 
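// For illustration: the StartLayoutUpdates/StopLayoutUpdates pair maintains a simple
// registry -- the first registration hooks the window-size event, the last removal
// unhooks it. A minimal standard-C++ analogue; Control and the registry are
// hypothetical placeholders, not the WinRT types used here.
#include <algorithm>
#include <vector>

struct Control { const char* name; };

class LayoutRegistry
{
    std::vector<Control*> controls_;
    bool hooked_ = false;

public:
    void start(Control* c)
    {
        if (controls_.empty() && !hooked_) hooked_ = true;   // hook SizeChanged once
        controls_.push_back(c);
    }

    void stop(Control* c)
    {
        controls_.erase(std::remove(controls_.begin(), controls_.end(), c),
                        controls_.end());
        if (controls_.empty()) hooked_ = false;              // unhook when nobody listens
    }

    bool listening() const { return hooked_; }
};

int main()
{
    LayoutRegistry registry;
    Control page{"MainPage"};
    registry.start(&page);
    registry.stop(&page);
    return registry.listening() ? 1 : 0;   // 0: nothing left to notify
}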
+/// +void LayoutAwarePage::StopLayoutUpdates(Object^ sender, RoutedEventArgs^ e) +{ + (void) e; // Unused parameter + + auto control = safe_cast(sender); + unsigned int index; + if (_layoutAwareControls != nullptr && _layoutAwareControls->IndexOf(control, &index)) + { + _layoutAwareControls->RemoveAt(index); + if (_layoutAwareControls->Size == 0) + { + // Stop listening to view state changes when no controls are interested in updates + Window::Current->SizeChanged -= _windowSizeEventToken; + _layoutAwareControls = nullptr; + // Last control has received the Unload notification. + _this = nullptr; + } + } +} + +/// +/// Translates values into strings for visual state management +/// within the page. The default implementation uses the names of enum values. Subclasses may +/// override this method to control the mapping scheme used. +/// +/// View state for which a visual state is desired. +/// Visual state name used to drive the +/// +String^ LayoutAwarePage::DetermineVisualState(ApplicationViewState viewState) +{ + switch (viewState) + { + case ApplicationViewState::Filled: + return "Filled"; + case ApplicationViewState::Snapped: + return "Snapped"; + case ApplicationViewState::FullScreenPortrait: + return "FullScreenPortrait"; + case ApplicationViewState::FullScreenLandscape: + default: + return "FullScreenLandscape"; + } +} + +/// +/// Updates all controls that are listening for visual state changes with the correct visual +/// state. +/// +/// +/// Typically used in conjunction with overriding to +/// signal that a different value may be returned even though the view state has not changed. +/// +void LayoutAwarePage::InvalidateVisualState() +{ + if (_layoutAwareControls != nullptr) + { + String^ visualState = DetermineVisualState(ApplicationView::Value); + auto controlIterator = _layoutAwareControls->First(); + while (controlIterator->HasCurrent) + { + auto control = controlIterator->Current; + VisualStateManager::GoToState(control, visualState, false); + controlIterator->MoveNext(); + } + } +} + +#pragma endregion + +#pragma region Process lifetime management + +/// +/// Invoked when this page is about to be displayed in a Frame. +/// +/// Event data that describes how this page was reached. The Parameter +/// property provides the group to be displayed. +void LayoutAwarePage::OnNavigatedTo(NavigationEventArgs^ e) +{ + // Returning to a cached page through navigation shouldn't trigger state loading + if (_pageKey != nullptr) return; + + auto frameState = SuspensionManager::SessionStateForFrame(Frame); + _pageKey = "Page-" + Frame->BackStackDepth; + + if (e->NavigationMode == NavigationMode::New) + { + // Clear existing state for forward navigation when adding a new page to the + // navigation stack + auto nextPageKey = _pageKey; + int nextPageIndex = Frame->BackStackDepth; + while (frameState->HasKey(nextPageKey)) + { + frameState->Remove(nextPageKey); + nextPageIndex++; + nextPageKey = "Page-" + nextPageIndex; + } + + // Pass the navigation parameter to the new page + LoadState(e->Parameter, nullptr); + } + else + { + // Pass the navigation parameter and preserved page state to the page, using + // the same strategy for loading suspended state and recreating pages discarded + // from cache + LoadState(e->Parameter, safe_cast^>(frameState->Lookup(_pageKey))); + } +} + +/// +/// Invoked when this page will no longer be displayed in a Frame. +/// +/// Event data that describes how this page was reached. The Parameter +/// property provides the group to be displayed. 
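// For illustration: OnNavigatedTo above keys per-page state as "Page-<depth>" and, on a
// fresh forward navigation, discards state saved for pages that used to sit ahead in the
// back stack. A standard-C++ sketch of that bookkeeping; the names are hypothetical and
// the map stands in for the frame's session state.
#include <iostream>
#include <map>
#include <string>

using PageState = std::map<std::string, std::string>;

void clearForwardState(std::map<std::string, PageState>& frameState, int backStackDepth)
{
    // Remove "Page-<depth>", "Page-<depth+1>", ... until a key is missing.
    int index = backStackDepth;
    std::string key = "Page-" + std::to_string(index);
    while (frameState.erase(key) != 0)
    {
        ++index;
        key = "Page-" + std::to_string(index);
    }
}

int main()
{
    std::map<std::string, PageState> state{{"Page-0", {}}, {"Page-1", {}}, {"Page-2", {}}};
    clearForwardState(state, 1);           // navigating to a new page at depth 1
    std::cout << state.size() << "\n";     // prints 1 -- only "Page-0" survives
}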
+void LayoutAwarePage::OnNavigatedFrom(NavigationEventArgs^ e) +{ + auto frameState = SuspensionManager::SessionStateForFrame(Frame); + auto pageState = ref new Map(); + SaveState(pageState); + frameState->Insert(_pageKey, pageState); +} + +/// +/// Populates the page with content passed during navigation. Any saved state is also +/// provided when recreating a page from a prior session. +/// +/// The parameter value passed to +/// when this page was initially requested. +/// +/// A map of state preserved by this page during an earlier +/// session. This will be null the first time a page is visited. +void LayoutAwarePage::LoadState(Object^ navigationParameter, IMap^ pageState) +{ +} + +/// +/// Preserves state associated with this page in case the application is suspended or the +/// page is discarded from the navigation cache. Values must conform to the serialization +/// requirements of . +/// +/// An empty map to be populated with serializable state. +void LayoutAwarePage::SaveState(IMap^ pageState) +{ +} + +#pragma endregion diff --git a/samples/winrt/ImageManipulations/C++/common/LayoutAwarePage.h b/samples/winrt/ImageManipulations/C++/common/LayoutAwarePage.h new file mode 100644 index 000000000..bd71062fe --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/common/LayoutAwarePage.h @@ -0,0 +1,88 @@ +//********************************************************* +// +// Copyright (c) Microsoft. All rights reserved. +// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY +// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR +// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. +// +//********************************************************* + +#pragma once + +#include + +namespace SDKSample +{ + namespace Common + { + /// + /// Typical implementation of Page that provides several important conveniences: + /// + /// + /// Application view state to visual state mapping + /// + /// + /// GoBack, GoForward, and GoHome event handlers + /// + /// + /// Mouse and keyboard shortcuts for navigation + /// + /// + /// State management for navigation and process lifetime management + /// + /// + /// A default view model + /// + /// + /// + [Windows::Foundation::Metadata::WebHostHidden] + public ref class LayoutAwarePage : Windows::UI::Xaml::Controls::Page + { + internal: + LayoutAwarePage(); + + public: + void StartLayoutUpdates(Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); + void StopLayoutUpdates(Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); + void InvalidateVisualState(); + static property Windows::UI::Xaml::DependencyProperty^ DefaultViewModelProperty + { + Windows::UI::Xaml::DependencyProperty^ get(); + }; + property Windows::Foundation::Collections::IObservableMap^ DefaultViewModel + { + Windows::Foundation::Collections::IObservableMap^ get(); + void set(Windows::Foundation::Collections::IObservableMap^ value); + } + + protected: + virtual void GoHome(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); + virtual void GoBack(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); + virtual void GoForward(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); + virtual Platform::String^ DetermineVisualState(Windows::UI::ViewManagement::ApplicationViewState viewState); + virtual void OnNavigatedTo(Windows::UI::Xaml::Navigation::NavigationEventArgs^ e) override; + virtual void OnNavigatedFrom(Windows::UI::Xaml::Navigation::NavigationEventArgs^ e) override; + virtual void 
LoadState(Platform::Object^ navigationParameter, + Windows::Foundation::Collections::IMap^ pageState); + virtual void SaveState(Windows::Foundation::Collections::IMap^ pageState); + + private: + Platform::String^ _pageKey; + bool _navigationShortcutsRegistered; + Platform::Collections::Map^ _defaultViewModel; + Windows::Foundation::EventRegistrationToken _windowSizeEventToken, + _acceleratorKeyEventToken, _pointerPressedEventToken; + Platform::Collections::Vector^ _layoutAwareControls; + void WindowSizeChanged(Platform::Object^ sender, Windows::UI::Core::WindowSizeChangedEventArgs^ e); + void OnLoaded(Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); + void OnUnloaded(Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); + + void CoreDispatcher_AcceleratorKeyActivated(Windows::UI::Core::CoreDispatcher^ sender, + Windows::UI::Core::AcceleratorKeyEventArgs^ args); + void CoreWindow_PointerPressed(Windows::UI::Core::CoreWindow^ sender, + Windows::UI::Core::PointerEventArgs^ args); + LayoutAwarePage^ _this; // Strong reference to self, cleaned up in OnUnload + }; + } +} diff --git a/samples/winrt/ImageManipulations/C++/common/StandardStyles.xaml b/samples/winrt/ImageManipulations/C++/common/StandardStyles.xaml new file mode 100644 index 000000000..7c3d23877 --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/common/StandardStyles.xaml @@ -0,0 +1,978 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + Mouse + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/samples/winrt/ImageManipulations/C++/common/suspensionmanager.cpp b/samples/winrt/ImageManipulations/C++/common/suspensionmanager.cpp new file mode 100644 index 000000000..c1ecf11cf --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/common/suspensionmanager.cpp @@ -0,0 +1,481 @@ +//********************************************************* +// +// Copyright (c) Microsoft. All rights reserved. +// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY +// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR +// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. 
+// +//********************************************************* + +// +// SuspensionManager.cpp +// Implementation of the SuspensionManager class +// + +#include "pch.h" +#include "SuspensionManager.h" + +#include +#include + +using namespace SDKSample::Common; + +using namespace Concurrency; +using namespace Platform; +using namespace Platform::Collections; +using namespace Windows::Foundation; +using namespace Windows::Foundation::Collections; +using namespace Windows::Storage; +using namespace Windows::Storage::FileProperties; +using namespace Windows::Storage::Streams; +using namespace Windows::UI::Xaml; +using namespace Windows::UI::Xaml::Controls; +using namespace Windows::UI::Xaml::Interop; + +namespace +{ + Map^ _sessionState = ref new Map(); + String^ sessionStateFilename = "_sessionState.dat"; + + // Forward declarations for object object read / write support + void WriteObject(Windows::Storage::Streams::DataWriter^ writer, Platform::Object^ object); + Platform::Object^ ReadObject(Windows::Storage::Streams::DataReader^ reader); +} + +/// +/// Provides access to global session state for the current session. This state is serialized by +/// and restored by which require values to be +/// one of the following: boxed values including integers, floating-point singles and doubles, +/// wide characters, boolean, Strings and Guids, or Map where map values are +/// subject to the same constraints. Session state should be as compact as possible. +/// +IMap^ SuspensionManager::SessionState::get(void) +{ + return _sessionState; +} + +/// +/// Wrap a WeakReference as a reference object for use in a collection. +/// +private ref class WeakFrame sealed +{ +private: + WeakReference _frameReference; + +internal: + WeakFrame(Frame^ frame) { _frameReference = frame; } + property Frame^ ResolvedFrame + { + Frame^ get(void) { return _frameReference.Resolve(); } + }; +}; + +namespace +{ + std::vector _registeredFrames; + DependencyProperty^ FrameSessionStateKeyProperty = + DependencyProperty::RegisterAttached("_FrameSessionStateKeyProperty", + TypeName(String::typeid), TypeName(SuspensionManager::typeid), nullptr); + DependencyProperty^ FrameSessionStateProperty = + DependencyProperty::RegisterAttached("_FrameSessionStateProperty", + TypeName(IMap::typeid), TypeName(SuspensionManager::typeid), nullptr); +} + +/// +/// Registers a instance to allow its navigation history to be saved to +/// and restored from . Frames should be registered once +/// immediately after creation if they will participate in session state management. Upon +/// registration if state has already been restored for the specified key +/// the navigation history will immediately be restored. Subsequent invocations of +/// will also restore navigation history. +/// +/// An instance whose navigation history should be managed by +/// +/// A unique key into used to +/// store navigation-related information. 
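// For illustration: _registeredFrames above holds weak references so that registering a
// frame does not keep it alive. A standard-C++ analogue using std::weak_ptr, mirroring
// the remove_if cleanup in UnregisterFrame below; Frame here is a hypothetical
// placeholder type, not the XAML Frame.
#include <algorithm>
#include <cstddef>
#include <memory>
#include <vector>

struct Frame { /* navigation state would live here */ };

class FrameRegistry
{
    std::vector<std::weak_ptr<Frame>> frames_;

public:
    void registerFrame(const std::shared_ptr<Frame>& f) { frames_.push_back(f); }

    void unregisterFrame(const std::shared_ptr<Frame>& f)
    {
        // Drop the frame itself plus any entries whose target has already expired.
        frames_.erase(std::remove_if(frames_.begin(), frames_.end(),
                          [&](const std::weak_ptr<Frame>& w)
                          {
                              auto locked = w.lock();
                              return locked == nullptr || locked == f;
                          }),
                      frames_.end());
    }

    std::size_t liveCount() const
    {
        std::size_t n = 0;
        for (const auto& w : frames_) if (!w.expired()) ++n;
        return n;
    }
};

int main()
{
    FrameRegistry registry;
    auto frame = std::make_shared<Frame>();
    registry.registerFrame(frame);
    registry.unregisterFrame(frame);       // registry is empty again
    return registry.liveCount() == 0 ? 0 : 1;
}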
+void SuspensionManager::RegisterFrame(Frame^ frame, String^ sessionStateKey) +{ + if (frame->GetValue(FrameSessionStateKeyProperty) != nullptr) + { + throw ref new FailureException("Frames can only be registered to one session state key"); + } + + if (frame->GetValue(FrameSessionStateProperty) != nullptr) + { + throw ref new FailureException("Frames must be either be registered before accessing frame session state, or not registered at all"); + } + + // Use a dependency property to associate the session key with a frame, and keep a list of frames whose + // navigation state should be managed + frame->SetValue(FrameSessionStateKeyProperty, sessionStateKey); + _registeredFrames.insert(_registeredFrames.begin(), ref new WeakFrame(frame)); + + // Check to see if navigation state can be restored + RestoreFrameNavigationState(frame); +} + +/// +/// Disassociates a previously registered by +/// from . Any navigation state previously captured will be +/// removed. +/// +/// An instance whose navigation history should no longer be +/// managed. +void SuspensionManager::UnregisterFrame(Frame^ frame) +{ + // Remove session state and remove the frame from the list of frames whose navigation + // state will be saved (along with any weak references that are no longer reachable) + auto key = safe_cast(frame->GetValue(FrameSessionStateKeyProperty)); + if (SessionState->HasKey(key)) SessionState->Remove(key); + _registeredFrames.erase( + std::remove_if(_registeredFrames.begin(), _registeredFrames.end(), [=](WeakFrame^& e) + { + auto testFrame = e->ResolvedFrame; + return testFrame == nullptr || testFrame == frame; + }), + _registeredFrames.end() + ); +} + +/// +/// Provides storage for session state associated with the specified . +/// Frames that have been previously registered with have +/// their session state saved and restored automatically as a part of the global +/// . Frames that are not registered have transient state +/// that can still be useful when restoring pages that have been discarded from the +/// navigation cache. +/// +/// Apps may choose to rely on to manage +/// page-specific state instead of working with frame session state directly. +/// The instance for which session state is desired. +/// A collection of state subject to the same serialization mechanism as +/// . +IMap^ SuspensionManager::SessionStateForFrame(Frame^ frame) +{ + auto frameState = safe_cast^>(frame->GetValue(FrameSessionStateProperty)); + + if (frameState == nullptr) + { + auto frameSessionKey = safe_cast(frame->GetValue(FrameSessionStateKeyProperty)); + if (frameSessionKey != nullptr) + { + // Registered frames reflect the corresponding session state + if (!_sessionState->HasKey(frameSessionKey)) + { + _sessionState->Insert(frameSessionKey, ref new Map()); + } + frameState = safe_cast^>(_sessionState->Lookup(frameSessionKey)); + } + else + { + // Frames that aren't registered have transient state + frameState = ref new Map(); + } + frame->SetValue(FrameSessionStateProperty, frameState); + } + return frameState; +} + +void SuspensionManager::RestoreFrameNavigationState(Frame^ frame) +{ + auto frameState = SessionStateForFrame(frame); + if (frameState->HasKey("Navigation")) + { + frame->SetNavigationState(safe_cast(frameState->Lookup("Navigation"))); + } +} + +void SuspensionManager::SaveFrameNavigationState(Frame^ frame) +{ + auto frameState = SessionStateForFrame(frame); + frameState->Insert("Navigation", frame->GetNavigationState()); +} + +/// +/// Save the current . 
Any instances +/// registered with will also preserve their current +/// navigation stack, which in turn gives their active an opportunity +/// to save its state. +/// +/// An asynchronous task that reflects when session state has been saved. +task SuspensionManager::SaveAsync(void) +{ + // Save the navigation state for all registered frames + for (auto&& weakFrame : _registeredFrames) + { + auto frame = weakFrame->ResolvedFrame; + if (frame != nullptr) SaveFrameNavigationState(frame); + } + + // Serialize the session state synchronously to avoid asynchronous access to shared + // state + auto sessionData = ref new InMemoryRandomAccessStream(); + auto sessionDataWriter = ref new DataWriter(sessionData->GetOutputStreamAt(0)); + WriteObject(sessionDataWriter, _sessionState); + + // Once session state has been captured synchronously, begin the asynchronous process + // of writing the result to disk + return task(sessionDataWriter->StoreAsync()).then([=](unsigned int) + { + return sessionDataWriter->FlushAsync(); + }).then([=](bool flushSucceeded) + { + (void)flushSucceeded; // Unused parameter + return ApplicationData::Current->LocalFolder->CreateFileAsync(sessionStateFilename, + CreationCollisionOption::ReplaceExisting); + }).then([=](StorageFile^ createdFile) + { + return createdFile->OpenAsync(FileAccessMode::ReadWrite); + }).then([=](IRandomAccessStream^ newStream) + { + return RandomAccessStream::CopyAndCloseAsync( + sessionData->GetInputStreamAt(0), newStream->GetOutputStreamAt(0)); + }).then([=](UINT64 copiedBytes) + { + (void)copiedBytes; // Unused parameter + return; + }); +} + +/// +/// Restores previously saved . Any instances +/// registered with will also restore their prior navigation +/// state, which in turn gives their active an opportunity restore its +/// state. +/// +/// A version identifer compared to the session state to prevent +/// incompatible versions of session state from reaching app code. Saved state with a +/// different version will be ignored, resulting in an empty +/// dictionary. +/// An asynchronous task that reflects when session state has been read. The +/// content of should not be relied upon until this task +/// completes. 
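// For illustration: the save/restore pair frames each value as a one-byte type tag
// followed by its payload, much as WriteObject/ReadObject below dispatch on the
// StreamTypes code they write or read first. A minimal standard-C++ sketch of that
// framing (only int and string tags, hypothetical names -- the real code supports
// many more property types plus nested string-to-object maps).
#include <cstdint>
#include <cstring>
#include <iostream>
#include <string>
#include <vector>

enum Tag : std::uint8_t { Int32Tag = 1, StringTag = 2 };

void writeInt32(std::vector<std::uint8_t>& out, std::int32_t v)
{
    out.push_back(Int32Tag);
    std::uint8_t raw[4];
    std::memcpy(raw, &v, 4);
    out.insert(out.end(), raw, raw + 4);
}

void writeString(std::vector<std::uint8_t>& out, const std::string& s)
{
    out.push_back(StringTag);
    std::uint32_t len = static_cast<std::uint32_t>(s.size());
    std::uint8_t raw[4];
    std::memcpy(raw, &len, 4);
    out.insert(out.end(), raw, raw + 4);          // length prefix, then the characters
    out.insert(out.end(), s.begin(), s.end());
}

int main()
{
    std::vector<std::uint8_t> buffer;
    writeInt32(buffer, 42);
    writeString(buffer, "Page-0");

    // A reader would inspect the leading tag byte before decoding the payload.
    std::cout << "first tag = " << int(buffer.front()) << "\n";   // prints 1
}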
+task SuspensionManager::RestoreAsync(void) +{ + _sessionState->Clear(); + + task getFileTask(ApplicationData::Current->LocalFolder->GetFileAsync(sessionStateFilename)); + return getFileTask.then([=](StorageFile^ stateFile) + { + task getBasicPropertiesTask(stateFile->GetBasicPropertiesAsync()); + return getBasicPropertiesTask.then([=](BasicProperties^ stateFileProperties) + { + auto size = unsigned int(stateFileProperties->Size); + if (size != stateFileProperties->Size) throw ref new FailureException("Session state larger than 4GB"); + task openReadTask(stateFile->OpenReadAsync()); + return openReadTask.then([=](IRandomAccessStreamWithContentType^ stateFileStream) + { + auto stateReader = ref new DataReader(stateFileStream); + return task(stateReader->LoadAsync(size)).then([=](unsigned int bytesRead) + { + (void)bytesRead; // Unused parameter + // Deserialize the Session State + Object^ content = ReadObject(stateReader); + _sessionState = (Map^)content; + + // Restore any registered frames to their saved state + for (auto&& weakFrame : _registeredFrames) + { + auto frame = weakFrame->ResolvedFrame; + if (frame != nullptr) + { + frame->ClearValue(FrameSessionStateProperty); + RestoreFrameNavigationState(frame); + } + } + }, task_continuation_context::use_current()); + }); + }); + }); +} + +#pragma region Object serialization for a known set of types + +namespace +{ + // Codes used for identifying serialized types + enum StreamTypes { + NullPtrType = 0, + + // Supported IPropertyValue types + UInt8Type, UInt16Type, UInt32Type, UInt64Type, Int16Type, Int32Type, Int64Type, + SingleType, DoubleType, BooleanType, Char16Type, GuidType, StringType, + + // Additional supported types + StringToObjectMapType, + + // Marker values used to ensure stream integrity + MapEndMarker + }; + + void WriteString(DataWriter^ writer, String^ string) + { + writer->WriteByte(StringType); + writer->WriteUInt32(writer->MeasureString(string)); + writer->WriteString(string); + } + + void WriteProperty(DataWriter^ writer, IPropertyValue^ propertyValue) + { + switch (propertyValue->Type) + { + case PropertyType::UInt8: + writer->WriteByte(UInt8Type); + writer->WriteByte(propertyValue->GetUInt8()); + return; + case PropertyType::UInt16: + writer->WriteByte(UInt16Type); + writer->WriteUInt16(propertyValue->GetUInt16()); + return; + case PropertyType::UInt32: + writer->WriteByte(UInt32Type); + writer->WriteUInt32(propertyValue->GetUInt32()); + return; + case PropertyType::UInt64: + writer->WriteByte(UInt64Type); + writer->WriteUInt64(propertyValue->GetUInt64()); + return; + case PropertyType::Int16: + writer->WriteByte(Int16Type); + writer->WriteUInt16(propertyValue->GetInt16()); + return; + case PropertyType::Int32: + writer->WriteByte(Int32Type); + writer->WriteUInt32(propertyValue->GetInt32()); + return; + case PropertyType::Int64: + writer->WriteByte(Int64Type); + writer->WriteUInt64(propertyValue->GetInt64()); + return; + case PropertyType::Single: + writer->WriteByte(SingleType); + writer->WriteSingle(propertyValue->GetSingle()); + return; + case PropertyType::Double: + writer->WriteByte(DoubleType); + writer->WriteDouble(propertyValue->GetDouble()); + return; + case PropertyType::Boolean: + writer->WriteByte(BooleanType); + writer->WriteBoolean(propertyValue->GetBoolean()); + return; + case PropertyType::Char16: + writer->WriteByte(Char16Type); + writer->WriteUInt16(propertyValue->GetChar16()); + return; + case PropertyType::Guid: + writer->WriteByte(GuidType); + writer->WriteGuid(propertyValue->GetGuid()); + 
return; + case PropertyType::String: + WriteString(writer, propertyValue->GetString()); + return; + default: + throw ref new InvalidArgumentException("Unsupported property type"); + } + } + + void WriteStringToObjectMap(DataWriter^ writer, IMap^ map) + { + writer->WriteByte(StringToObjectMapType); + writer->WriteUInt32(map->Size); + for (auto&& pair : map) + { + WriteObject(writer, pair->Key); + WriteObject(writer, pair->Value); + } + writer->WriteByte(MapEndMarker); + } + + void WriteObject(DataWriter^ writer, Object^ object) + { + if (object == nullptr) + { + writer->WriteByte(NullPtrType); + return; + } + + auto propertyObject = dynamic_cast(object); + if (propertyObject != nullptr) + { + WriteProperty(writer, propertyObject); + return; + } + + auto mapObject = dynamic_cast^>(object); + if (mapObject != nullptr) + { + WriteStringToObjectMap(writer, mapObject); + return; + } + + throw ref new InvalidArgumentException("Unsupported data type"); + } + + String^ ReadString(DataReader^ reader) + { + int length = reader->ReadUInt32(); + String^ string = reader->ReadString(length); + return string; + } + + IMap^ ReadStringToObjectMap(DataReader^ reader) + { + auto map = ref new Map(); + auto size = reader->ReadUInt32(); + for (unsigned int index = 0; index < size; index++) + { + auto key = safe_cast(ReadObject(reader)); + auto value = ReadObject(reader); + map->Insert(key, value); + } + if (reader->ReadByte() != MapEndMarker) + { + throw ref new InvalidArgumentException("Invalid stream"); + } + return map; + } + + Object^ ReadObject(DataReader^ reader) + { + auto type = reader->ReadByte(); + switch (type) + { + case NullPtrType: + return nullptr; + case UInt8Type: + return reader->ReadByte(); + case UInt16Type: + return reader->ReadUInt16(); + case UInt32Type: + return reader->ReadUInt32(); + case UInt64Type: + return reader->ReadUInt64(); + case Int16Type: + return reader->ReadInt16(); + case Int32Type: + return reader->ReadInt32(); + case Int64Type: + return reader->ReadInt64(); + case SingleType: + return reader->ReadSingle(); + case DoubleType: + return reader->ReadDouble(); + case BooleanType: + return reader->ReadBoolean(); + case Char16Type: + return (char16_t)reader->ReadUInt16(); + case GuidType: + return reader->ReadGuid(); + case StringType: + return ReadString(reader); + case StringToObjectMapType: + return ReadStringToObjectMap(reader); + default: + throw ref new InvalidArgumentException("Unsupported property type"); + } + } +} + +#pragma endregion diff --git a/samples/winrt/ImageManipulations/C++/common/suspensionmanager.h b/samples/winrt/ImageManipulations/C++/common/suspensionmanager.h new file mode 100644 index 000000000..65e1180a0 --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/common/suspensionmanager.h @@ -0,0 +1,50 @@ +//********************************************************* +// +// Copyright (c) Microsoft. All rights reserved. +// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY +// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR +// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. +// +//********************************************************* + +// +// SuspensionManager.h +// Declaration of the SuspensionManager class +// + +#pragma once + +#include + +namespace SDKSample +{ + namespace Common + { + /// + /// SuspensionManager captures global session state to simplify process lifetime management + /// for an application. 
Note that session state will be automatically cleared under a variety + /// of conditions and should only be used to store information that would be convenient to + /// carry across sessions, but that should be disacarded when an application crashes or is + /// upgraded. + /// + ref class SuspensionManager sealed + { + internal: + static void RegisterFrame(Windows::UI::Xaml::Controls::Frame^ frame, Platform::String^ sessionStateKey); + static void UnregisterFrame(Windows::UI::Xaml::Controls::Frame^ frame); + static Concurrency::task SaveAsync(void); + static Concurrency::task RestoreAsync(void); + static property Windows::Foundation::Collections::IMap^ SessionState + { + Windows::Foundation::Collections::IMap^ get(void); + }; + static Windows::Foundation::Collections::IMap^ SessionStateForFrame( + Windows::UI::Xaml::Controls::Frame^ frame); + + private: + static void RestoreFrameNavigationState(Windows::UI::Xaml::Controls::Frame^ frame); + static void SaveFrameNavigationState(Windows::UI::Xaml::Controls::Frame^ frame); + }; + } +} diff --git a/samples/winrt/ImageManipulations/C++/pch.cpp b/samples/winrt/ImageManipulations/C++/pch.cpp new file mode 100644 index 000000000..97389d94c --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/pch.cpp @@ -0,0 +1,16 @@ +//********************************************************* +// +// Copyright (c) Microsoft. All rights reserved. +// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY +// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR +// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. +// +//********************************************************* + +// +// pch.cpp +// Include the standard header and generate the precompiled header. +// + +#include "pch.h" diff --git a/samples/winrt/ImageManipulations/C++/pch.h b/samples/winrt/ImageManipulations/C++/pch.h new file mode 100644 index 000000000..13f9bc34c --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/pch.h @@ -0,0 +1,23 @@ +//********************************************************* +// +// Copyright (c) Microsoft. All rights reserved. +// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF +// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY +// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR +// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. +// +//********************************************************* + +// +// pch.h +// Header for standard system include files. 
+// + +#pragma once + +#include +#include +#include +#include "Common\LayoutAwarePage.h" +#include "Common\SuspensionManager.h" +#include "App.xaml.h" diff --git a/samples/winrt/ImageManipulations/C++/sample-utils/SampleTemplateStyles.xaml b/samples/winrt/ImageManipulations/C++/sample-utils/SampleTemplateStyles.xaml new file mode 100644 index 000000000..ec2c1a713 --- /dev/null +++ b/samples/winrt/ImageManipulations/C++/sample-utils/SampleTemplateStyles.xaml @@ -0,0 +1,51 @@ + + + + + + + + + + + + diff --git a/samples/winrt/ImageManipulations/description.html b/samples/winrt/ImageManipulations/description.html new file mode 100644 index 000000000..ad1df7d56 --- /dev/null +++ b/samples/winrt/ImageManipulations/description.html @@ -0,0 +1,238 @@ + + + + + + + Media capture using capture device sample + + + + + + + + + diff --git a/samples/winrt/ImageManipulations/description/4ee0dda0-3e7e-46df-b80b-1692acc1c812Combined.css b/samples/winrt/ImageManipulations/description/4ee0dda0-3e7e-46df-b80b-1692acc1c812Combined.css new file mode 100644 index 000000000..e69de29bb diff --git a/samples/winrt/ImageManipulations/description/Brand.css b/samples/winrt/ImageManipulations/description/Brand.css new file mode 100644 index 000000000..98415561e --- /dev/null +++ b/samples/winrt/ImageManipulations/description/Brand.css @@ -0,0 +1,3629 @@ +#BodyBackground +{ + min-width:0px; +} + +#JelloExpander, .IE7 #JelloExpander, .IE8 #JelloExpander, .IE9 #JelloExpander +{ + padding-left:0px; + padding-right:0px; +} + +#JelloSizer +{ + margin: 0 auto; + max-width: 0px; + padding: 0; + width: 100%; +} + +/* Global Styles */ +body +{ + font-size:0.75em; +} + +h1 +{ + font-size: 3em; + font-family: 'Segoe UI Light','Segoe UI', 'Lucida Grande', Verdana, Arial, Helvetica, sans-serif; + color: #707070; + font-weight: normal; + padding-top:4px; + margin-bottom: 17px; + line-height: 1.3; + font-weight:100; +} + +h2, h3, h4, h5, h6 +{ + font-family: 'Segoe UI', 'Lucida Grande', Verdana, Arial, Helvetica, sans-serif; + color:#2a2a2a; + font-weight:normal; +} + +a, a:link, a:visited { + color: #00749E; +} +a:hover { + color: #0095c4; + text-decoration:none; +} + +/*---------- Masthead -----------*/ + +.NetworkLogo a { + display: none; +} + +/*-------- Start Advertisment --------*/ + +.advertisment { + padding-top:5px; +} + +/*-------- End Advertisment --------*/ + + +/*-------- Start LocalTabs Page --------*/ +#GalleriesNavigation { + float: right; +} + +.LocalNavigation +{ + display:block; + overflow: auto; + width: auto; +} + .LocalNavigation .HeaderTabs { + width: auto; + } + .LocalNavigation .TabOff a { + color:#333; + } + + .LocalNavigation .TabOff a:hover { + background-color:#E0E7EC; + margin-top: 1px; + padding: 4px 6px; + } + + .LocalNavigation #notificationLink span { + color:Red; + font-weight:bold; + } + + +/*-------- End LocalTabs Page --------*/ + +/*-------- Start SubMenu Page --------*/ + +.subMenu +{ + margin: 0 0 10px 0; + color: #FFFFFF; + overflow: hidden; +} + +.subMenu a, .subMenu span +{ + margin: 5px 5px 0 5px; +} + +.subMenu > h2 +{ + float: left; + margin: 7px 15px 0 0; + vertical-align: middle; + color: #666; + padding-bottom: 5px; +} + +.subMenu .advancedSearchLink +{ + float: left; + margin: 9px 0 0 15px; +} + + + +.subMenu .uploadLink +{ + float: right; + margin: 5px 5px 0 0; + padding: 0 0 0 30px; + height: 24px; + background: url('../Common/smalluploadicon.png') no-repeat 10px 0; + color: #0066E1; + cursor: pointer; +} + +.subMenu #searchBoxContainer +{ + float: left; + width: 400px; + padding: 
3px 50px; +} + +/*-------- End SubMenu Page --------*/ + +/*-------- Start SearchBox --------*/ + +div.searchbox +{ + overflow:auto; + width:450px; + margin:26px 0 14px 0; +} + +.srchbox +{ + width: 100%; + background: #FFFFFF; + padding: 0px 2px; + height: 25px; + border: 1px solid #ccc; + table-layout: auto; + margin-bottom:5px; +} + + + +.srchbox #searchImageCell input +{ + background: transparent url('searchButton.png') no-repeat 0 0; + width: 20px; + height: 20px; + padding-left: 1px; + margin-top: 3px; +} + + +.srchbox #searchImageCell +{ + text-align: right; + padding: 0px; + vertical-align: middle; +} + +.IE7 .srchbox #searchImageCell +{ + padding:2px 2px 0 0; +} + +.srchbox #searchImageCell input +{ +} + + +table.srchbox +{ + table-layout: fixed; +} + +.srchbox .stxtcell +{ + padding-right: 4px; + width:90%; +} + +.srchbox .stxtcell > input +{ + margin-right: 4px; + height: 26px; + line-height:26px; + width: 100%; + padding: 0px; + padding-left: 4px; + padding-top: 2px; + border: none; + font-style: normal !important; + outline: none; +} + +.IE7 .srchbox .stxtcell > input +{ + height: 20px; +} + +.srchbox .stxtcell > input.stxtinptpassive +{ + color: #AAA; + font-style: normal !important; +} + +/*-------- End SearchBox --------*/ + + +/*-------- Start Search Page --------*/ + + +#searchPage #mainContentContainer +{ + margin: 0 0 0 243px; +} + +#searchPage .browseFilterBar, #dashboardPage .browseFilterBar +{ + padding: 5px 0 6px 0; + +} + + #searchPage .browseFilterBar a, #dashboardPage .browseFilterBar a + { + font-weight:normal; + } + + #searchPage .browseFilterBar .browseSort, #dashboardPage .browseFilterBar .browseSort + { + float:right; + } + + #searchPage .browseBreadcrumb + { + padding-top:3px; + } + + #searchPage .browseFilterBar .browseSort select, #dashboardPage .browseFilterBar .browseSort select + { + height:20px; + } + + #searchPage .browseFilterBar .browseSort img, #dashboardPage .browseFilterBar .browseSort img + { + vertical-align:text-top; + } + +#searchPage h2, #searchPage h3 +{ + font-size: 1.25em; + padding: 2px 0 3px 3px; +} + +#searchPage h2 +{ + display:inline; + clear:none; +} + + #dashboardsearchbox + { + width:420px; + margin:0; + float:left; + } + +/*-------- End Search Page --------*/ + +/*-------- Begin Requests Page --------*/ + +#requestsPage { + color:#666; +} + + #requestsPage h1 { + clear:none; + } + #requestsPage h2, #requestsPage h3 + { + font-size: 1.25em; + } + + #requestsPage h2 + { + display:inline; + clear:none; + } + + #requestsPage h3 + { + padding: 2px 0 3px 3px; + } + + #requestsPage #mainContentContainer + { + margin: 0 0 0 243px; + } + + #requestsPage #mainContentWrapper { + float:left; + width:100%; + } + #requestsPage #mainContent { + position:relative; + } + + #requestsPage #mainContent .subtitle{ + margin-bottom:5px; + } + + #requestsPage #requestsFilterBar { + margin-top: 10px; + overflow: auto; + } + + #requestsPage #requestsFilterBar .requestsCrumb { + float: left; + margin:10px 0 10px 10px; + } + #requestsPage #requestsFilterBar .requestsSort { + float: right; + margin:10px 0 10px 0; + } + + #requestsPage #Pager { + text-align: center; + padding-top: .75em; + padding-bottom: .75em; + } + + #requestsPage #requestsRss { + float: right; + margin: 2px 0 0 3px; + cursor: pointer; + } + + .IE7 #requestsPage #requestsList { + padding-top: 10px; + margin-top:5px; + } + + #requestsPage #requestsList .noResults { + padding: 10px 0; + } + + +/*-------- End Requests Page --------*/ + +/*-------- Begin Request List Item --------*/ 
+.requestlistitem td +{ + padding: .9em 0 .9em .5em; +} + .requestlistitem .votebox + { + + text-align:center; + } + + .requestlistitem .voteboxcontainer + { + vertical-align:top; + width: 75px; + } + + .requestlistitem #votelabel + { + border-width:1px; + border-bottom-style:solid; + } + + #myRequestsTab #votenumber, + #requestsPage #votenumber + { + background-color: #777; + padding: 6px 10px; + color: #fff; + margin-bottom: 5px; + font-size: 1.6em; + } + + #myRequestsTab .votelink, + #requestsPage .votelink + { + font-weight:bold; + background-color: #eee; + padding: 5px 0; + width: 75px; + float: left; + } + + #myRequestsTab .needvote, + #requestsPage .needvote + { + font-weight:bold; + } + + #myRequestsTab .alreadyvoted, #myRequestsTab .resolved + #requestsPage .alreadyvoted, #requestsPage .resolved + { + font-weight:bold; + color: #666; + } + + .requestlistitem .linkform + { + width:100%; + float:right; + margin-top:15px; + } + + .requestlistitem .requesttitle + { + font-weight:600; + margin-bottom:3px; + } + + .requestlistitem .requester + { + margin-bottom:3px; + } + + .requestlistitem .hidden + { + display:none; + } + + .requestlistitem .requestSummary div + { + margin-bottom:0.25em; + } + + .requestlistitem .requestSummary + { + line-height: 1.45em; + width: 80%; + } + .requestlistitem .requestInfo + { + padding:1.2em 0 0 2.5em; + vertical-align:top; + width:15%; + line-height: 1.45em; + } + + .requestlinks > td + { + padding-bottom: 16px; + } + + .requestlinks div + { + float:right; + } + + .requestlistitem .textbox + { + width:95%; + } + + + +/*-------- End Request List Item --------*/ + + +/*-------- Begin New Request Form --------*/ +#newrequest { + margin-top:5px; + background:#F8F8F8; + border-width:1px; + border-style:solid; + border-color:#E8E8E8; + padding:5px; +} + +#newrequest > div:first-child { + font-weight:bold; +} + +#newrequest .itemheading { + margin-top:5px; +} + +#newrequest textarea { + width:95%; +} + +#newrequest .field-validation-error { + display:block; + color:Red; + font-weight:bold; +} +#newrequest #submit { + margin-top:5px; +} + +/*-------- End New Request Form --------*/ + +/*-------- Request Description Page ------*/ + +.reqDescPage #mainContent { + overflow: auto; +} +.reqDescPage #header { + float: left; + width: 100%; +} + +.reqDescPage { + color: #000 !important; +} + +.reqDescPage #sideNav { + background-color: #fff; +} + +.reqDescPage #header td { + vertical-align: bottom; +} + +.reqDescPage #header #votenumber { + width: 50px; + padding: 6px 12px; + text-align: center; + float: left; +} + +.reqDescPage #header .requestTitle { + margin: 0 0 5px 10px; + width: 85%; +} + +.reqDescPage #header .requestTitle h1 { + margin-bottom: 0px; + font-size: 2em; +} + +.reqDescPage #headerLinks { + +} + +.reqDescPage .votelink +{ + text-align: center; + float: left; +} + +#requestsPage #headerLinks .assignlink +{ + font-weight:bold; + background-color: #eee; + padding: 5px 10px; + float: left; + margin-left: 10px; +} + +.reqDescPage #projectInfo { + clear: both; +} + +.reqDescPage #solutions { + clear: both; + padding-top: 30px; +} + +.reqDescPage .solutionItems { + clear: both; +} + +.reqDescPage .solutionHeader { + border-bottom: 1px solid #ccc; +} + +.reqDescPage .solutionAddContainer { + padding-top: 15px; + float: left; +} + +.reqDescPage #SubmittedProjectLinkUrl { + width: 400px; +} + +.reqDescPage .solutionItem { + margin-top: 25px; + padding: 0 5px 5px 0; + float: left; +} +.reqDescPage .solutionItemDetails { + float: left; + width: 535px; +} + 
+.reqDescPage .solutionItemLinks { + margin-top: 10px; + clear: both; + float: left; + width: 100%; +} + +.reqDescPage .solutionItemLinks a { + float: left; + margin-right: 10px; + font-weight:bold; + background-color: #eee; + padding: 5px 10px; +} + +.reqDescPage .solutionItem .itemTitle { + font-size: 1.2em; + padding-bottom: 5px; +} + +.reqDescPage .tagsContainer label { + display: none; +} + +.reqDescPage .solutionItem .summaryBox { + padding: .25em 0 .25em 0; + clear: both; + line-height: 1.45; +} + +.reqDescPage .solutionsection { + float: left; + margin-left: 20px; +} +.reqDescPage .completedSolution { + font-size: 1.25em; + padding-top: 4px; +} + +.reqDescPage .requestDescriptionCont, .reqDescPage .requestDicussions { + padding-top: 30px; + clear: both; + overflow: auto; +} + +.reqDescPage .requestDescription { + padding-top: 10px; + line-height: 1.45; +} + +.reqDescPage .requestDicussionsAsk { + padding: 10px 0; +} + +.reqDescPage .watermark { + color:Gray; + } + + +/*-------- Begin Extra Actions Section --------*/ +#extraActions +{ + float: right; + width: 300px; + vertical-align: top; +} + + #extraActions .section + { + padding: 0 0 10px 0; + overflow:auto; + } + + #extraActions .section a + { + font-weight:bold; + } +/*-------- End Extra Actions Section --------*/ + +/*-------- Begin Contribute --------*/ + + +#contributeSection a +{ + font-size:1.1em; +} + +#contributeSection, #sideNav #contributeSection h3, .sidebar #contributeSection h3, #contributeSection h3 +{ + background-color:#fff; + margin: 0 0 9px 0; + padding-left: 0px; +} + +#sideNav #contributeSection h3, .sidebar #contributeSection h3, #contributeSection h3 +{ + font-size:1.65em; + margin-top:42px; +} + +#sideNav #contributeSection, #contributeSection +{ + padding:0 0 41px 0; +} + +#projectPage .sidebar #contributeSection +{ + margin: 10px 0 10px 0; + padding:0 0 5px 0; + +} + +#sideNav #contributeSection > div, .sidebar #contributeSection > div, #contributeSection > div +{ + padding: 0 0 2px 0; + overflow:auto; +} + +#sideNav #contributeSection img, #contributeSection img +{ + float: left; + width: 25px; +} + +#sideNav #contributeSection .contributeAction, .sidebar #contributeSection .contributeAction, #contributeSection .contributeAction +{ + padding:17px 0 0 0; +} + + #contributeSection .contributeAction img + { + background-color:#00749e; + margin-right:9px; + } + + #contributeSection .contributeAction img:hover + { + background-color:#0095C4; + } + + #contributeSection .contributeAction a + { + display:block; + line-height: 1.8; + } + +#uploadLink, #exclamationLink, #myContributionsLink +{ + display:block; + line-height: 1.8; +} + +#myContributionsLink span +{ + color: red; +} + +#feedbackLink +{ + background: url("FeedbackIcon.png") no-repeat; + width: 40px; + height: 40px; +} + +.itemRow .affiliationLink, #editorPicksSection .affiliationLink +{ + background: url("../common/MicrosoftLogo.png") no-repeat; + width: 82px; + height: 18px; +} + + +#contributeSection a+div +{ + +} + +.IE7 #contributeSection a+div +{ + float:left; +} +/*-------- End Contribute --------*/ + + +/*-------- Begin Directory List Footer --------*/ + +#directoryListFooter { + padding:5px; + margin-top:10px; + margin-bottom:10px; + border:1px solid #E6E6E6; + background-color:#F8F8F8; + width:438px; +} + +#directoryListFooter h4 +{ + font-size:1em; +} + +/*-------- End Directory List Footer --------*/ + +/*-------- Begin Editors Picks --------*/ +#editorPicksSection ul +{ + padding-left:0px; + line-height:135%; +} + + #editorPicksSection 
ul li + { + clear: left; + padding-top: 10px; + list-style:none; + } + #editorPicksSection ul li:first-child { + padding-top: 0; + } + + #editorPicksSection a { + font-weight: normal !important; + } + + #editorPicksSection ul li .thumbnail + { + float: left; + padding: 3px; + max-width: 60px; + } + + #editorPicksSection ul li .thumbnail img { + max-width: 60px; + } +/*-------- End Editors Picks --------*/ + +/*-------- Begin Side Nav Section --------*/ + +#sideNav +{ + width: 215px; + margin-left: 0; + vertical-align: top; + float: left; +} + + #sideNav #sideNavHeader + { + margin-bottom: 10px; + padding-left: 12px; + } + + #sideNav #sideNavHeader a + { + margin-right:3px; + } + + #sideNav .section + { + padding: 0 10px 18px 10px; + } + #sideNav .section h3 + { + + padding: 2px 0 3px 3px; + } + + #sideNav .section ul + { + } + + #sideNav .section ul li + { + padding: 4px 0 2px; + margin-left: 3px; + margin-right: 3px; + } + + + #sideNav .section form > div + { + padding: 4px 0 0 0; + border-bottom: none; + margin: 0 3px 0 3px; + color: #3A3E43; + } + .IE8 #sideNav .section ul li > div.itemText + { + word-wrap: break-word; + width: 225px; + } + #sideNav .section ul li > div.itemCount + { + color: #3A3E43; + } + #sideNav .section a + { + font-weight: normal; + } + + #sideNav .section a:hover + { + text-decoration: underline; + } + +/*-------- End Side Nav Section --------*/ + + + +/*-------- Start Dashboard Page --------*/ + +#dashboardPage { + padding-top:5px; + clear:both; +} +#dashboardPage .contributions .tabContents { + padding: 5px 5px 10px 0; + clear:both; +} + #dashboardPage .contributions .noContributions { + clear:both; + } +#dashboardPage #mainContentWrapper { + float:left; + width:100%; +} + +#dashboardPage #detailsSection { + float:left; + margin-left:-100%; + width:240px; + padding: 0 0 18px 10px; +} + +.IE7 #dashboardPage #detailsSection { + width:auto; + min-width: 300px; + max-width: 300px; +} +.IE7 #dashboardPage #detailsSection .itemBar { + width: 270px; +} + +#dashboardPage #detailsSection h3 { + word-wrap:break-word; +} + +#dashboardPage #mainContent { + margin:0 0 0 250px; + position:relative; +} + +#dashboardPage .dashboardEvenRow +{ + background-color: #ececec; +} +#dashboardPage .dashboardPadding +{ + padding-bottom: 4px; +} +#dashboardPage .dashboardCell +{ + width: 100%; + vertical-align: top; + padding: 1em 0.5em 0.5em 0.5em; + word-wrap: break-word; +} +#dashboardPage .projectManagement +{ + width: 24em; + padding: 0em 1em 0em 1em; + vertical-align: top; + text-align: right; + float: right; +} + +#dashboardPage #subscriptionsLink +{ + position:absolute; + right:5px; +} + +#dashboardPage .itemDelete +{ + vertical-align:top; + padding-top:.8em; + width:30px; +} + +#dashboardPage .itemDeleteText +{ + border-style:solid; + border-width:thin; + border-collapse:collapse; + padding-bottom:2px; + padding-left:4px; + padding-right:4px; + color:Gray +} + +#dashboardPage #myRequestsTab .requestTabHeaders +{ + font-weight:normal; + border-bottom: solid 1px #CCC; + padding-bottom: 10px; + padding-top: 10px; + float: left; + width: 100%; +} + +#dashboardPage #myRequestsTab .requestTabHeaders div:first-child +{ + border: 0; +} + +#dashboardPage #myRequestsTab .requestTabHeaders div +{ + padding: 2px 9px 3px 9px; + font-size: 0.9em; + line-height: 125%; + color:#00749E; + border-left: solid 1px #555; + cursor: pointer; + float: left; + text-align: center; +} + +#dashboardPage #myRequestsTab .requestTabHeaders div.currentRequest +{ + background-color:#fff; + cursor: default; + 
border-bottom:none; + margin-bottom:-2px; + color:#000; +} + +#dashboardPage #myRequestsTab .requestTabHeaders div.currentRequest a { + color: #000; +} + + +#dashboardPage #myRequestsTab .requestTabHeaders div a +{ + text-decoration:none; +} + +#dashboardPage #myRequestsTab .requestTabContents +{ + clear: both; +} + +#dashboardPage #myRequestsTab .requestTabContents .noResults { + padding-top: 20px; +} + +/*-------- End Dashboard Page --------*/ + +/*-------- Start Upload Page --------*/ + +#UploadPage +{ + margin-left:10px; + max-width:925px; +} + + #UploadPage .watermark { + color:Gray; + } + + #UploadPage .projectTypeChoice { + } + + #UploadPage .projectTypeChoice > div { + padding: 20px 10px 20px 10px; + border: 1px solid darkgrey; + cursor: pointer; + height: 200px; + float: left; + } + + #UploadPage .projectTypeChoice > div + div + { + margin: 0 0 0 5px; + } + + #UploadPage .projectTypeChoice div.current + { + background-color: #E9E9E9; + cursor: default; + } + + #UploadPage .projectTypeChoice div.choice { + font-size: large; + font-weight: bold; + padding: 0 0 10px 0; + } + + #UploadPage .projectTypeChoice div.description { + padding: 0 0 0 17px; + } + + #UploadPage .projectTypeChoice #genericSampleUploadDescription { + padding-top: 5px; + } + + #UploadPage .projectTypeChoice #genericSampleUploadDescription .logos { + overflow: auto; + } + + #UploadPage .projectTypeChoice #genericSampleUploadDescription .vslogo { + background: url(../samples/vslogo.png) no-repeat; + width: 125px; + height: 18px; + } + + #UploadPage .projectTypeChoice #genericSampleUploadDescription .javalogo { + background: url(../samples/javalogo.png) no-repeat; + width: 33px; + height: 60px; + float: left; + } + + #UploadPage .projectTypeChoice #genericSampleUploadDescription .phplogo { + background: url(../samples/phplogo.png) no-repeat; + width: 65px; + height: 35px; + float: left; + margin: 15px 0 0 10px; + } + + #UploadPage .projectTypeChoice #genericSampleUploadDescription .nodejslogo { + background: url(../samples/nodejslogo.png) no-repeat; + width: 90px; + height: 25px; + float: left; + margin: 18px 0 0 10px; + } + + #UploadPage .projectTypeChoice #genericSampleUploadDescription > div+div { + margin-top: 10px; + } + + #UploadPage .projectTypeContents { + clear: left; + padding: 10px 0 0 0; + } + + #UploadPage .projectTypeContents .instruction > div+div { + padding-top: 5px; + } + #UploadPage #libraryContainer { + margin: 5px 0 0 0; + display: none; + } + #UploadPage fieldset + { + margin: 10px 0 30px 5px; + } + + #UploadPage fieldset > * + { + margin-left:10px; + } + + #UploadPage .fieldsetStyleContainer { + margin: 10px 0 0 5px; + } + + #UploadPage fieldset h2, + #UploadPage .fieldsetStyleContainer h2 + { + font-family: 'Segoe UI Semibold','Segoe UI','Lucida Grande',Verdana,Arial,Helvetica,sans-serif; + font-size: 17px; + font-weight: bold; + color: #3A3E43; + border-bottom: 2px solid #EFEFEF; + width: 100%; + padding-bottom: 3px; + margin-left:0px; + margin-bottom:8px; + } + + .IE7 #UploadPage fieldset h2, + .IE7 #UploadPage .fieldsetStyleContainer h2 + { + margin-left:-10px; + } + + #UploadPage fieldset .field-validation-error + { + clear:left; + display:block; + margin-top:4px; + } + + #UploadPage fieldset .required + { + margin-left: 3px; + } + + #UploadPage fieldset .instruction, + #UploadPage .fieldsetStyleContainer .description, + #UploadPage .fieldsetStyleContainer .instruction + { + color: #3A3E43; + margin:0 0 10px 0; + } + + #UploadPage fieldset .faqLink + { + margin: 0 0 10px 0; + } + + 
#UploadPage fieldset label + { + display:block; + } + + #UploadPage fieldset input[type=text] + { + width:60%; + } + + #UploadPage fieldset input[type=checkbox] + { + float:left; + clear:left; + margin-right:5px; + } + + #UploadPage fieldset input[type=radio] + { + float:left; + clear:left; + margin-right:5px; + } + + #UploadPage fieldset#richDescription textarea + { + width:70%; + height:600px; + } + + #UploadPage fieldset#summary textarea + { + width:60%; + height:100px; + margin-top: 10px; + margin-left: -30px; + } + + .IE #UploadPage fieldset#summary textarea, .IE9 #UploadPage fieldset#summary textarea + { + margin-left: -30px; + overflow: auto; + } + + .FF #UploadPage fieldset#summary textarea + { + margin-left: -30px; + } + + #UploadPage fieldset#summary #SummaryReadOnly + { + width:60%; + margin-top: 10px; + padding-top: 5px; + color: #909082; + } + + #UploadPage fieldset#summary #SummaryCharCount + { + width:60%; + text-align: right; + } + + #UploadPage fieldset#options label + { + margin-bottom:10px; + } + + #UploadPage fieldset#license label + { + margin-bottom:10px; + } + + #UploadPage input[type="text"].tagInput, #UploadPage input[type="text"].listInput + { + width:40%; + float:left; + } + + #UploadPage .addedTags, #UploadPage .addedProjects + { + margin-bottom:15px; + width: 500px; + + } + #UploadPage .addedTags .tag, #UploadPage .addedProjects .projectTitle + { + position:relative; + overflow:hidden; + } + + #UploadPage .addedTags .tag label, #UploadPage .addedProjects .projectTitle label + { + float:left; + width: 450px; + } + + #UploadPage .addedTags .tag a, #UploadPage .addedProjects .projectTitle a + { + position:absolute; + text-align:right; + right:0px; + } + + .fileManager .fileUploadProgressIndicator + { + width: 500px; + } + + .fileManager .uploadProcessingWarning { + margin-top: 5px; + } + + .fileManager .fileUploadProgressIndicator .throbber + { + font-weight: bold; + background: url('./progressIndicatorWhite.gif') no-repeat 10px 0; + padding: 7px 10px 5px 60px; + height: 25px; + margin-left: -10px; + } + + .fileManager #uploadFrame, .fileManager .uploadFrame + { + width:100%; + } + + .fileManager .addLabel + a + { + margin-left:25px; + } + + .fileManager fieldset label { + display: block; + } + + .fileManager .unlocalizedFiles { + color:#808080; + } + + .fileManager .filesContainer + { + margin-bottom:15px; + width: 500px; + + } + + .fileManager .filesContainer .file + { + position:relative; + overflow:hidden; + } + + .fileManager .filesContainer .file .title { + border-bottom: 1px solid #000000; + padding-bottom: 3px; + margin-top: 10px; + margin-bottom: 10px; + } + + .fileManager .filesContainer .file .title .manageLinks + { + float: right; + } + + .fileManager .filesContainer .file .version { + padding: 0 0 20px 10px; + } + + .fileManager .filesContainer .file .version label { + float: left; + font-weight: bold; + } + + .fileManager .filesContainer .file .version span { + float: left; + margin-left: 5px; + } + + .fileManager .filesContainer .file .language { + clear: left; + padding: 0 0 5px 10px; + } + + .fileManager .filesContainer .file .language label { + font-weight: bold; + } + + .fileManager .filesContainer .file .language label + label { + font-weight: normal; + padding-left: 10px; + } + + .fileManager .filesContainer .file .language div { + padding-left: 20px; + } + + .file .requirements { + clear: left; + padding: 0 0 0 10px; + } + + .file .requirements label { + font-weight: bold; + } + + .file .requirements > div { + padding-left: 20px; + } + + 
.file .requirements .requirementsContent { + padding-top: 5px; + padding-bottom: 10px; + } + + .file .requirements .requirementsContent label { + font-style: italic; + font-weight: normal; + } + + .file .requirements .requirementsContent > div + { + margin-bottom:4px; + position:relative; + } + + .requirementsContent .requirementsRemove { + float:right; + position:absolute; + right:0px; + } + + .requirements .requirementsAddError { + color:#ff0000; + } + + .requirements .reqBrowseButton { + margin-left : 10px; + } + + #UploadPage fieldset .requirements input[type="text"] { + margin-right: 10px; + width: 70%; + } + + .reqBrowsefade + { + position: absolute; + background-color: #aaaaaa; + } + .reqBrowse + { + background-color: #f4f4f4; + border:1px solid #000000; + -border-radius: 5px; + -moz-border-radius: 5px; + -webkit-border-radius: 5px; + padding:15px; + position: absolute; + display: block; + } + + .reqBrowsebuttons + { + text-align:right; + position:absolute; + right:10px; + bottom:5px; + } + .reqBrowsebuttons > button + { + color: Blue; + border: none; + background: none; + font-weight: bold; + cursor:pointer; + } + + .reqBrowseclose + { + display:none; + } + + .reqBrowseDialog + { + width: 700px; + clear:both; + overflow:hidden; + padding-bottom: 20px; + } + + .reqBrowseDialog > h2 + { + border-bottom: 1px solid #000000; + padding-bottom: 5px; + } + .reqBrowseDialog > p + { + margin: 15px 0 15px 0; + } + + .reqBrowseSearchCont + { + width: 100%; + background: white; + padding: 0px 2px; + -moz-border-radius: 2px; + border-radius: 2px; + border: 1px solid #888; + margin-bottom: 5px; + height: 29px; + } + .reqBrowseSearchCont .rbboxcont + { + width:90%; + float:left; + margin-top: 2px; + } + + .reqBrowseSearchCont input + { + border:none; + outline:none; + border-color: transparent; + } + .reqBrowseSearchBox + { + margin-right: 4px; + height: 20px; + line-height: 20px; + width: 100%; + padding-left: 4px; + padding-top: 2px; + } + + .reqBrowseSearchBoxDefault + { + color: #AAA; + } + .reqBrowseSearchCont .rbbtncont + { + float: right; + margin-top: 4px; + } + .reqBrowseSearchBtn + { + background: transparent url('searchButton.png') no-repeat 0 0; + width: 22px; + height: 22px; + float:left; + cursor:pointer; + } + + .reqBrowseTabs + { + border-bottom: 5px solid #E8E8E8; + margin:3px 0; + overflow:auto; + } + + .reqBrowseTabs .reqBrowseTabsR + { + color:#fff !important; + } + + .reqBrowseTabs .reqBrowseTabsHighlight + { + color:#000 !important; + background-color:#E8E8E8; + } + + .reqBrowseTabs a + { + padding:5px 12px; + color:#fff; + background-color:#888; + font-weight:bold; + float:left; + margin: 0 4px 0px 0; + } + + .reqBrowsePager td + { + text-align:center; + } + .reqBrowseDialog #Pager + { + margin: 5px 0 15px 0; + } + + .reqBrowseContent + { + height:310px; + overflow:auto; + clear:both; + position:relative; + } + + .reqBrowsePager + { + width:700px; + margin:0 auto; + } + + .reqBrowseContentError + { + color:#ff0000; + margin:5px; + } + .reqBrowseContent .requirementItem + { + border-bottom: 2px solid #E8E8E8; + padding: 4px 0 6px 0; + overflow:auto; + } + .reqBrowseContent .section1 + { + float:left; + width:75%; + padding-left:25px; + position:relative; + } + + .reqBrowseContent .section1 input + { + position:absolute; + left:0px; + } + + .reqBrowseContent .title + { + font-weight:bold; + } + .reqBrowseContent .section2 + { + float:right; + } + + + .progressIndicatorfade + { + position: absolute; + background-color: #FFFFFF; + } + .progressIndicator + { + background-color: 
#f4f4f4; + border: 1px solid #000000; + -border-radius: 5px; + -moz-border-radius: 5px; + -webkit-border-radius: 5px; + padding: 15px; + position: absolute; + display: block; + max-width: 70%; + max-height: 70%; + } + #progressIndicator .progressIndicatorclose + { + display: none; + } + + #progressIndicatorContent + { + font-weight: bold; + padding: 7px 10px 0 10px; + height: 25px; + } + + +/*-------- End Upload Page --------*/ + +/* --- + --- Homepage --- + --- */ +p +{ + margin: 0 0 1px 0; +} +#homePageHeader +{ + float: left; + margin-bottom: 1em; + width: 100%; +} + +.tagline { + font-size: x-small; + position: relative; + top: -11px; +} + +.logo +{ + float: left; + width: 455px; + height: 70px; + font-weight: bold; + font-size: 22px; + margin: 0 10px 0 0; + color: #3A3E43; +} +.logo > img +{ + float: right; +} +.logo > div +{ + color: #3A3E43; + font-weight: bold; + font-size: 22px; +} + +.welcomeInfo +{ + float: left; + width: 700px; +} +.welcomeInfo h2 +{ + font-size: 16px; +} +.welcomeInfo p +{ + margin: 5px 0 0 0; +} + +#siteActions > div +{ + border: 1px solid #BBBBBB; + padding: 15px 5px 0 65px; + height: 55px; + background-repeat: no-repeat; + background-position: 10px 10px; +} +#siteActions a, #siteActions p +{ + margin-top: 5px; +} + +#siteActions .label a +{ + font-size: 1.25em; + font-weight: bold; + margin-bottom: 3px; +} + +#myGalleryBox +{ + background-image: url("MyGalleryIcon.png"); +} + +#findActions +{ + float: right; + padding: 10px; + width: 225px; + height: 50px; + border: 1px solid #BBBBBB; + margin: 0 10px 0 0; +} + +#findBox div:first-child +{ + margin: 0 0 5px 0; + font-weight: bold; +} + +#legalDisclaimer +{ + margin: 0 0 10px; + color: #798072; + font-size: 0.8em; +} + +#siteActions +{ + float: right; + width: 200px; +} +#siteActions > div +{ + margin-bottom: 10px; +} +.homePageProjects +{ + width: 100%; + float: left; +} +.homePageProjects > div +{ + padding-left: 1.5em; +} +.homePageProjects > div:first-child +{ + padding-left: 0; +} + +.homePageProjects .projectList ul +{ + padding: 0; + margin: 0; +} +.homePageProjects li +{ + margin-top: .5em; + padding-bottom: 0.5em; + list-style-type: none; +} +.homePageProjects .itemRow a, .homePageProjects .itemRow a:hover +{ + color: #0054A6; +} + +.projectList > a:first-child +{ + margin-left: 1px; + float: right; +} +.projectList > a, .projectList > a:hover +{ + color: #5BAEDB; +} +.projectListTitle +{ + height: 27px; + font-size: 16px; + font-weight: bold; + line-height: 125%; + background: #E8E8E8; + padding: 5px 5px 5px 25px; +} +.homePageListPager +{ + text-align: right; + margin-bottom: -.5em; +} + +.recentlyAddedProjects +{ + float: left; + width: 32%; +} +.mostPopularProjects +{ + float: right; + width: 32%; +} +.highestRankedProjects +{ + overflow: hidden; +} +* html .highestRankedProjects +{ + float: left; +} +* html .highestRankedProjects > div +{ + width: 100%; +} + +#Pager +{ + text-align:left; +} + +/* Impromptu warning style */ +.ipWarningfade +{ + position: absolute; + background-color: #aaaaaa; +} +div.ipWarning +{ + width: 400px; + position: absolute; + border-radius: 10px; + -moz-border-radius: 10px; + -webkit-border-radius: 10px; + padding: 20px 0 20px 20px; + background-color: #FCE5E6; + border: solid 2px #EE1F25; +} +.ipWarningcontainer +{ + font-weight: bold; +} +.ipWarningclose +{ + display: none; +} +.ipWarningmessage div +{ + position: relative; +} +.ipWarningmessage div div + div +{ + text-align: center; + padding-right: 20px; +} +.ipWarningmessage div div:first-child div +{ + 
margin-left: 50px; +} +.ipWarningmessage div div:first-child img +{ + position: absolute; + margin: -20px 0 0 0; + top: 35%; +} +.ipWarningbuttons +{ + text-align: center; + margin: 20px 0 0 0; +} +.ipWarning button +{ + padding: 4px 8px 4px 8px; + margin: 2px; + font-weight: bold; + border: solid 1px #A6A3A6; + color: #FFFFFF; + background: #B8BABC; + filter: progid:DXImageTransform.Microsoft.Gradient(GradientType=0,StartColorStr='#B8BABC',EndColorStr='#949699'); +} + +#eyebrow +{ + width: 100%; +} +#siteMessage .unsupportedLocale { + margin: 10px 0 0 243px; + border: solid 1px #CCC; + background: #FCFEC5; + padding: 5px; +} + +#Footer +{ + width: 100%; + height: auto; +} + +.clear +{ + clear: both; +} + +#buildVersion +{ + clear: both; + margin-left: auto; + margin-right: auto; + padding-right: 26px; + padding-top: 8px; + text-align: right; +} + +#page +{ + clear: both; + padding-top: 10px; +} + +#page h1 +{ + padding: 10px 0px; +} + +#ownerBar +{ + background: #EFEFEF; + border: 2px solid #7FCBF5; + text-align: left; + color: Black; + margin: 10px 0 20px 0; + padding: 5px; + word-spacing: 0px; + font-size: medium; +} + +#ownerBar a +{ + color: Blue; + padding: 0 5px 0 5px; +} + + + + +/*-------- Start Tab Control --------*/ + +.tabHeaders +{ + font-weight:normal; + text-transform: uppercase; + border-bottom: solid 1px #CCC; + float: left; + width: 100%; +} + +.tabHeaders div +{ + padding: 7px 19px 8px 19px; + font-size: 0.9em; + line-height: 125%; + color:#00749E; + cursor: pointer; + float: left; + text-align: center; +} + +.tabHeaders div.current +{ + background-color:#fff; + cursor: default; + border: 1px solid #CCC; + border-bottom:none; + margin-bottom:-2px; + color:#000; +} + +.tabHeaders div a +{ + text-decoration:none; +} + +.tabContents +{ + clear: both; +} + +#MainContent .tabHeaders div.current a +{ + color:#000; +} + +.tabContents div.current +{ + display: block; +} +/*-------- End Tab Control --------*/ + +.itemContainer +{ + width: 100%; +} +.itemRow .itemTitle +{ + padding-bottom: 5px; + font-size:1.1em; +} + +.itemRow .itemBody, .itemRow .itemInfo +{ + padding:15px 17px 16px 0; +} +.itemRow .itemDescription +{ + overflow: hidden; + max-height: 80px; +} +.itemRow .itemBody +{ + vertical-align: top; + line-height: 1.4; +} +.itemRow .itemBody a.officialMicrosoftLabel +{ + color: #ACACAC; +} +.itemRow .itemInfo +{ + vertical-align: top; + padding-left: .5em; + line-height: 1.4; + width: 10em; + text-align:right; +} + +.IE7 .itemRow .itemInfo +{ + width:11em; +} + +.itemRow .itemInfo .ratingStars +{ + float: left; +} +.itemRow .itemInfo .ratingCount +{ + padding: 0 0 0 5px; + float: left; +} +.itemRow .ratingInfo +{ + text-align: center; +} + +.itemRow .affiliationLink, #editorPicksSection .affiliationLink +{ + position: relative; + top: 3px; +} + +#editorPicksSection a.officialMicrosoftLabel +{ + color: #ACACAC; +} + +.itemRow .tagsContainer label { + display:none; +} + +.editorPickedItem +{ + background-color:#F8F8F8; +} + +.editorPickedText +{ + font-size:1.25em; + padding-bottom:2px; +} +.editorPickedItem > td +{ + border-top:6px solid #fff; +} + +.dirSubHeading +{ + margin-bottom:15px; +} + +#searchPage .dirSubHeading h2 +{ + line-height:1.4; + font-size:1.1em; +} + +#searchPage .dirSubHeading h2 span +{ + padding-top:5px; + display:block; +} + +#searchPage .dirSubHeading h1, #searchPage .dirSubHeading h2 +{ + clear:none; + padding-left:0px; +} + +.dirSubHeading .dirSubLinks +{ + font-size:1.2em; + padding-top:5px; +} + + +.summaryBox +{ + padding: .25em 0 .25em 0; + 
clear: both; + line-height:1.45; +} + +/*-------- Start Rating Stars --------*/ + +.RatingStar +{ + width: 11px; + height: 11px; + padding: 0 8px 0 0; + background-position: center; + float: left; +} + +.FilledRatingStar, .HalfRatingStar, .EmptyRatingStar, .FilledRatingStarHover +{ + width: 11px; + height: 11px; + padding: 0px 1px 0px 0px; + margin-top: 2px; +} + +.FilledRatingStar +{ + background: url(../samples/fullStar.png) no-repeat; +} + +.ownerRating .FilledRatingStar +{ + background: url(../samples/fullStar.png) no-repeat; +} + +.FilledRatingStarHover +{ + background: url(../samples/fullStarHover.png) no-repeat; + +} + +.HalfRatingStar +{ + background: url(../samples/halfStar.png) no-repeat; + +} + +.EmptyRatingStar +{ + background: url(../samples/emptyStar.png) no-repeat; + +} + +.EditStarMode .RatingStar +{ + cursor: pointer; +} + + + +/*-------- End Rating Stars --------*/ + +.discussionFormTable +{ + width: 100%; + table-layout: fixed; +} + + +#ReviewsTabPane .seeAllLink, #DiscussionsTabPane .seeAllLink +{ + margin-top: 10px; + text-align: center; +} + +/*-------- Start DiscussionsTab --------*/ + +.threadActions +{ + text-align: right; + margin-top: 10px; + float: right; +} + +#DiscussionsTabPane .reply, #DiscussionsTabPane .toggleDiscussion +{ + cursor: pointer; +} + + +#defaultDicussionText, #newDiscussion +{ + padding-top: 10px; +} +#DiscussionsTabPane .posts +{ + display: block; +} +#DiscussionsTabPane .threadHeader .left +{ + float: left; +} +#DiscussionsTabPane .threadHeader .right +{ + float: right; +} +#DiscussionsTabPane .normal +{ + font-weight: normal; +} + +#DiscussionsTabPane .threadHeader +{ + position: relative; + background-color: #ECECEC; + padding: 4px 10px 4px 10px; +} + +#DiscussionsTabPane .threadHeader .title +{ + color: #000000; + font-weight: bold; + margin-bottom: .7em; +} +#DiscussionsTabPane .threadHeader .label +{ + color: #000000; +} +#DiscussionsTabPane .postMeta +{ + color: #666666; +} +#DiscussionsTabPane .threadHeader .value +{ + font-weight: bold; + color: #000000; +} + +#DiscussionsTabPane .reply +{ + font-weight: normal; + cursor: hand; +} + +#DiscussionsTabPane ul li +{ + list-style-type: none; + list-style-image: none; + padding-bottom: 10px; +} + +#DiscussionsTabPane ul +{ + padding-left: 0px; + margin-left: 0px; + padding-right: 2px; +} + +.IE #reviewList .right +{ + margin-right: -1em; +} + +#newDiscussion +{ + margin: 0 0 15px 0; +} +#newDiscussion #Title +{ + width: 50%; +} +#newDiscussion textarea +{ + width: 99%; + height: 10em; +} + +#DiscussionsTabPane +{ + margin-left: 0px; + padding: 0 1em 1em 1em; +} + + +.postMeta +{ + float: right; +} + +.postReply +{ + cursor: hand; + float: right; + font-weight: bold; +} + +.postSaveReply +{ + display: none; +} + +.postSaveReply textarea +{ + width: 99%; + margin-bottom: 4px; + height: 8em; +} + +.toggleDiscussion +{ + cursor: hand; +} + +.saveReplyErrorMessage +{ + display: none; + margin: 0 0 4px 0; + color: Red; + font-weight: bold; +} + +#discussionListItem .avatar +{ + float: left; + padding: 5px; + vertical-align: middle; +} + +#discussionListItem .discussion +{ + margin-left: 55px; + padding: 0 5px 5px 5px; + vertical-align: top; +} + +.IE7 #discussionListItem .avatar +{ + margin-top: 15px; +} + + +/*-------- End DiscussionsTab --------*/ + + +.flotChart +{ + height: 300px; +} + +#projectMenuBarTop +{ + padding: 10px 0 20px 0; + font-weight: bold; + font-size: 25px; +} +.dayHeader +{ + font-weight: bold; +} + +/*-------- Start StatsPage --------*/ +#statsPage +{ + border: none; + 
background-color: Transparent; + margin-top: 1em; +} + +#statsPage .rangeBox +{ + padding: 5px; + background-color: #ECECEC; + border: solid 1px #C2C2C2; + float: left; +} +#statsPage .statBox +{ + margin-top: 1em; + margin-bottom: 10px; + overflow: hidden; + display: none; +} +#statsPage .statBox h3 +{ + font-size: 1.1em; + display: inline; +} + +#statsPage #statMessage +{ + margin-top: 1em; + display: none; +} + +#statsPage #statDownloadBox img { + float: left; +} + +#statsPage .statDownloadLink { + padding-left: 5px; + vertical-align: middle; +} + +#pointTooltip +{ + border: solid #000000 1px; + height: 35px; + background-color: #EEEEEE; + position: absolute; + display: none; + text-align: center; + padding: 9px; + border-radius: 4px; + -moz-border-radius: 4px; + -webkit-border-radius: 4px; + z-index: 1000; + white-space: nowrap; +} + +.flotChart +{ + height: 300px; +} + +/*-------- End StatsPage --------*/ + + +/***************************************************************/ +/* TagAutoComplete Styles */ +/***************************************************************/ +.AutoCompletePanel +{ + font-size: 95%; + border: solid .1em #999; + background-color: #f0f0f0; + padding: .15em; +} + +.AutoCompletePanel div.Row +{ + color: #000; + cursor: pointer; + background-color: transparent; + padding: .15em .25em; + text-align: left; +} + +.AutoCompletePanel div.Selected +{ + color: #fff; + background-color: #6D6D6D; +} + + +/*-------- Start Subscription Form --------*/ + +#subscribeForm +{ + background-color: #D3D3D1; + border: 1px solid #000000; + -border-radius: 5px; + -moz-border-radius: 5px; + -webkit-border-radius: 5px; + padding: 15px; + position: absolute; + display: block; +} + +#subscribeForm .subscribeFormbuttons +{ + text-align: right; + margin-top: 10px; +} +#subscribeForm .subscribeFormbuttons > button +{ + color: Blue; + border: none; + background: none; + font-weight: bold; + cursor: pointer; +} + +#subscribeForm .subscribeFormclose +{ + display: none; +} + +#subscribeForm table +{ + margin-bottom: 15px; +} + +#subscribeForm table th +{ + text-align: left; + font-weight: bold; + border-bottom: 1px solid #000000; +} + +#subscribeForm table td +{ + padding: 5px 10px 5px 20px; +} + +#subscribeForm .rowHeading td +{ + font-weight: bold; + border-bottom: 1px solid #FFFFFF; +} + +#subscribeForm table tr td:first-child +{ + padding: 5px 40px 5px 0; +} + + +/*-------- End Subscription Form --------*/ + +/*-------- Start Tag Browser --------*/ + +.tagBrowserfade +{ + position: absolute; + background-color: #aaaaaa; +} +#tagBrowser +{ + background-color: #f4f4f4; + border:1px solid #000000; + -border-radius: 5px; + -moz-border-radius: 5px; + -webkit-border-radius: 5px; + padding:15px; + position: absolute; + display: block; +} + +#tagBrowser .tagBrowserbuttons +{ + text-align:right; + margin-top: 10px; +} +#tagBrowser .tagBrowserbuttons > button +{ + color: Blue; + border: none; + background: none; + font-weight: bold; + cursor:pointer; +} + +#tagBrowser .tagBrowserclose +{ + display:none; +} + +.tagBrowserContainer { + width: 450px; +} + +.tagBrowserContainer h2 { + border-bottom: 1px solid #000000; + padding-bottom: 5px; +} +.tagBrowserContainer > p { + margin: 15px 0 15px 0; +} + +.tagBrowserContainer .tags { + margin: 5px; + height: 225px; + overflow-y: scroll; +} + + +/*-------- End Tag Browser --------*/ + +/*-------- Start List Filter Box --------*/ + +div#filterInputBox +{ + overflow:auto; + min-width:225px; +} + +.filterBox +{ + width: 100%; + background: #FFFFFF; + padding: 
0px 2px; + height: 25px; + -moz-border-radius: 2px; + border-radius: 2px; + border: 1px solid #888888; + table-layout: auto; + margin-bottom:5px; +} + + .filterBox #filterImageCell + { + text-align: right; + padding: 0px; + vertical-align: middle; + } + + .IE7 .filterBox #filterImageCell + { + padding:2px 2px 0 0; + } + + .filterBox #filterImg + { + background: transparent url('searchButton.png') no-repeat 0 0; + width: 22px; + height: 22px; + } + +table.filterBox +{ + table-layout: fixed; +} + + .filterBox .stxtcell + { + padding-right: 4px; + width:90%; + } + + .filterBox .stxtcell > input + { + margin-right: 4px; + height: 26px; + line-height:26px; + width: 100%; + padding: 0px; + padding-left: 4px; + padding-top: 2px; + border: none; + } + + .IE7 .filterBox .stxtcell > input + { + height: 20px; + } + + .filterBox .stxtcell > input.stxtinptpassive + { + color: #ACACAC; + } + +/*-------- End List Filter Box --------*/ + +/*-------- Start Notifications --------*/ + +#notificationsSummaryBox +{ + font-weight: bold; + padding: .5em; +} + + +.notificationsByDay, .notifications +{ + /*list-style: none;*/ +} + +.dayHeader +{ + border-bottom: 1px solid silver; + margin-top: 1em; + padding-bottom: .5em; +} + +.notifications +{ + margin-top: .5em; +} + +ul.notifications li +{ + line-height: 1.5em; +} + +ul.notifications li.unread +{ + font-weight: bold; +} + +ul.notifications li ol, ul.notifications li ul +{ + margin-left: 1.5em; +} + +/*-------- End Notifications --------*/ + +/*-------- Start ProjectDetailsPage --------*/ + +#projectPage #projectInfo +{ + position: relative; + margin-top: 5px; + height: 100%; +} +#projectInfo .section +{ + float: left; + margin-bottom: 0px; + height: 100%; +} + + #projectInfo .section .itemBar, #projectInfo .section .itemBarLong + { + clear: both; + padding: 7px 0 7px 0; + overflow:auto; + } + + #projectInfo .section .itemBarLong:first-child + { + padding-top:3px; + } + + .IE7 #projectInfo .section .itemBar, .IE7 #projectInfo .section .itemBarLong + { + padding-top: 5px; + } + + #projectInfo .section .itemBar > label, #projectInfo .section .itemBarLong > label + { + width: 130px; + float: left; + text-transform: capitalize; + } + + #projectInfo .section .itemBar div#yourRating { + float:left; + } + + #projectInfo .section .itemBar div.RatingStar { + margin:2px 1px 0 0; + } + + #projectInfo .section .itemBar div#RatingCount { + padding: 0 0 0 3px; + } + + #projectInfo .section .itemBar .ratingsWithCountContainer img { + vertical-align:top; + float:left; + padding-right: 4px; + } + + #projectInfo .section .itemBar > span, #projectInfo .section .itemBarLong > span, #projectPage .section .itemBar > div + { + float: left; + } + + #projectInfo .section .itemBar, #projectInfo .section .itemBarLong { + width: 100%; + } + + #projectInfo .section .itemBar > span { + float: none; + } + + #projectInfo .section .itemBar > span .shareThisItem { + white-space: nowrap; + } + + #projectInfo .section .itemBarLong div + { + margin-left: 130px; + padding: 0; + } + + #projectInfo .section .viewonlinecont + { + background-color:#d3d3d3; + padding:5px 10px; + margin-top:10px; + float:left; + font-weight:bold; + } + +#projectInfo #sectionLeft +{ + width: 50%; +} +#projectInfo #sectionRight +{ + width: 50%; +} +.IE7 #projectInfo #sectionRight +{ + width: auto; +} + +#projectPage h2.projectSummary +{ + font-weight:normal; + font-size: 1.1em; + margin-bottom: 11px; + line-height:1.4; + word-wrap: break-word; +} + +.IE7 #projectPage h2.projectSummary +{ + font-family: 'Segoe UI' , 'Lucida 
Grande' ,Verdana,Arial,Helvetica,sans-serif; +} + +.IE #projectPage .projectTitle, .IE9 #projectPage .projectTitle +{ + width: 100%; +} + +#projectPage #reportAbuse +{ + float: left; + font-size: x-small; +} + +#projectPage .hiddenSidebar { + display: none; +} + +#projectPage .fullProjectBody { + margin-left:-275px; +} + +.IE8 #projectPage #userCard { + float: left; + height: auto; +} + +#projectPage #userCard .avatar img { + max-width: 100px; + max-height: 100px; +} + +#projectDetails +{ + overflow:hidden; +} + +#projectBody +{ + width: 100%; + overflow:hidden; +} + + #projectDetails > div:first-child + { + margin: 5px 0 0 260px; + } + + #projectBody > div:first-child + { + margin: 20px 0 0 260px; + } + + .IE7 #projectContent .tabHeaders + { + overflow:hidden; + margin-bottom:-20px; + } + +#projectPage .sidebar +{ + float: left; + width: 215px; + margin-right: -250px; +} + + #projectPage .sidebar .section + { + margin: 20px 0 10px 0; + } + + #projectPage .sidebar .section .titleBar h3 + { + padding: 0 0 2px 0; + } + + #projectPage .sidebar .section .itemBarRight + { + min-height: 21px; + position: relative; + padding-top: 5px; + } + + #projectPage .sidebar .section .titleBar + { + margin-bottom: 5px; + } + + #projectPage .sidebar .section .authorItem + { + padding: 0 0 5px 10px; + } + + #projectPage .sidebar .section .authorItem a + { + display:block; + float:left; + max-width:170px; + } + + #projectPage .sidebar .section .authorItem > div + { + float:right; + } + + #projectPage .sidebar #advertisement + { + margin-top: 20px; + } + + #projectPage .sidebar #advertisement .label + { + text-align: center; + } + + #projectPage .sidebar #moreFromAuthor + { + width:225px; + margin: 20px 0 10px 0; + float:left; + } + + #projectPage .sidebar #moreFromAuthor .bottomBar { + padding: 5px 0px 5px 25px; + text-align: right; + } + +#projectPage #Collections { + min-height:22px; + min-width:169px; +} + +#projectPage #Collections .bevelButton { + background-color: #8CC63F; +} + +#projectPage .bevelButton +{ + font-weight: bold; + border-radius: 3px; + -moz-border-radius: 3px; + -webkit-border-radius: 3px; + color: white; + padding: 2px 10px 3px; + text-align: center; +} + +#projectPage #DiscussionsTabPane .bevelButton +{ + font-weight: normal; + color: black; + padding: 1px 10px 1px; +} + +#projectPage #Downloads { + padding: 0 0 8px 0; + float: left; +} + #projectPage #Downloads > div:first-child { + float: left; + margin: 15px 0 0 0; + height:35px; + line-height:1.6; + width: 130px; + } + #projectPage #Downloads label + { + font-size:1.25em; + } + #projectPage #Downloads input + { + min-width: 100px; + padding: 3px 10px 3px 10px; + margin: 3px 10px 0 10px; + font-weight: bold; + float: left; + } + + #projectPage #Downloads .button + { + background-color:#007494; + color:#fff; + padding:5px 15px; + margin: 15px 15px 0 0; + float:left; + } + + #projectPage #Downloads .button:hover + { + background-color:#0095c4; + text-decoration:none; + } + +#projectPage #projectBody .attachments { + margin: 0 0 15px 0; +} + + #projectPage #projectBody .attachments label { + float: left; + } + + #projectPage #projectBody .attachments .files a, #projectPage #projectBody .attachments .files span { + float: left; + padding: 0 5px 0 5px; + } + +#publishBar +{ + border: 1px solid #707070; + background-color: #F8F8F8; + margin-top: 10px; +} + +#sourceItem { + height: 610px; +} + #sourceItem > div:first-child { + padding: 20px 5px 0 15px; + font-weight: bold; + } + #sourceItem > div+div { + height: 560px; + padding: 10px; + 
overflow: auto; + } + #sourceItem .copyCode { + font-weight: normal; + margin: 0 15px 0 0; + float: right; + } + +.sourceList { + height: 600px; + padding: 5px; + border-top: 1px solid #CCC; + margin-top: -1px; +} + + .sourceList .sourceListTabHeader + { + margin:20px 10px; + } + .sourceList .sourceListTabs + { + margin-bottom:20px; + border-bottom: 1px solid #CCC; + float:left; + width:100%; + } + .sourceList .sourceListTabs .languageTab { + padding:5px 10px 5px 10px; + font-weight: bold; + float: left; + margin: 0 3px 0px 0; + color:#00749E; + } + .sourceList .sourceListTabs .languageTab:hover + { + color: #0095C4; + } + + .sourceList .sourceListTabs .selectedLanguage { + background-color: #fff; + color: #000; + border: 1px solid #ccc; + border-bottom:none; + margin-bottom:-2px; + } + + .sourceList .sourceListTabs .unselectedLanguage { + cursor: pointer; + } + + .sourceList .endTabs { + clear: both; + } + + .sourceList .sourceListContent { + padding-top: 5px; + } + + +.sbfc, +.sbfe +{ + white-space: nowrap; + text-indent: 20px; + cursor: pointer; + padding: .2em 0em .2em 0em; + background-repeat: no-repeat; + background-position: left center; + font-weight: bold; + text-decoration: none !important; +} + +.sbfc +{ + background-image: url(../samples/node_closed.gif); +} + +.sbfe +{ + white-space: nowrap; + background-image: url(../samples/node_opened.gif); +} + +.ndbf +{ + white-space: nowrap; + text-indent:20px; +} + +.sbf +{ + white-space: nowrap; + background: url(../samples/bullet.gif) no-repeat 4px -1px; + cursor: pointer; + text-indent: 20px; + white-space: nowrap; + padding: .1em 0em .1em 0em; +} + +.sbsf, +.sbf:hover +{ + color: #000; + text-decoration: none !important; + +} + +.sbsf +{ + color: #000 !important; + background-color: rgb(232, 232, 232); +} + +.sbf:hover +{ + color: #ce8b10; +} + +/*-------- Translate --------*/ +.translatePage { + width: 900px; +} +.translatePage .fileManager { + margin-bottom:20px; +} +.translatePage .fileManager h4 { + margin-top:10px; +} +.translatePage #formContainer { + width: 100%; +} +.translatePage .formLabel { + width: 300px; + padding: 5px; +} + .translatePage .textInput { + width: 425px; +} +.translatePage TEXTAREA.richText { + height: 600px; + width: 620px; +} +.translatePage TEXTAREA.unadornedEditor { + height: 600px; +} +.translatePage .formWideLabel, .translatePage .richText { + padding: 5px; +} +.translatePage .formWideLabel, .translatePage .unadornedEditor { + width: 750px; + padding: 5px; +} +.translatePage #languageSelection { + margin: 15px; + display: inline-block; +} + +.translateTab, .translateTabSelected { + font-weight: bold; + float: left; + text-align: center; + margin: 10px; + padding: 7px; + background: #E8E8E8; + white-space: nowrap; + cursor: pointer; +} + +.translateTabSelected +{ + color: White; + background: Gray; +} + +.translateTabSelected .translateLabel, .translateTab .translateLabel +{ + white-space: nowrap; + float: left; + padding: 0 5px 0 5px; +} + +.translateTab #deleteLanguage, .translateTab #moreLanguages +{ + padding-left: 10px; +} + +.translateLabel #deleteLanguage { + color: #FFFFFF; +} +/*-------- End Translate --------*/ +/*-------- Begin Eula --------*/ + + +#eulaPagefade +{ + position: absolute; + background-color: #FFFFFF; +} + +#eulaPage +{ + background-color: #f4f4f4; + border: 1px solid #000000; + -border-radius: 5px; + -moz-border-radius: 5px; + -webkit-border-radius: 5px; + padding: 15px; + display: block; + max-width: 70%; +} + +#eulaPage .eulaPageclose +{ + text-align: right; +} + +#eulaPage 
.eulaPagemessage +{ + overflow: auto; + max-height: 350px; +} + + #eulaPage .eulaPagemessage h1 + { + margin: 0 0 5px 0; + } + +#eulaPage .eulaPagebuttons +{ + text-align: right; + margin-top: 10px; +} +#eulaPage .eulaPagebuttons > button +{ + color: Blue; + border: none; + background: none; + font-weight: bold; + cursor: pointer; +} + + + #eulaPage #eula #documentText + { + line-height: normal; + } + +/*-------- End DocumentView --------*/ +/*-------- Begin FAQ --------*/ + +#FAQPage #TableOfContents h2 +{ + padding: 5px; + border-bottom: 2px solid #EFEFEF; + margin: 0 0 10px 0; + max-width: 70%; +} + +#FAQPage .FAQSection +{ + padding: 10px 0px; + width: 70%; +} + + #FAQPage .FAQSection h2 + { + padding: 5px; + border-bottom: 2px solid #EFEFEF; + } + + #FAQPage .FAQSection h3 + { + padding: 5px; + } + + #FAQPage .FAQSection ul, #FAQPage .FAQSection ol + { + margin: 0; + } + + #FAQPage .FAQSection #description > div + { + overflow: auto; + padding-left: 10px; + } + + #FAQPage .FAQSection #description img + { + float: left; + } + + #FAQPage .FAQSection #description div > div + { + padding-top: 10px; + float: left; + } + + #FAQPage .FAQSection > div > div + { + padding: 0 15px; + } + + #FAQPage #Reputation th, #FAQPage #Reputation td + { + padding-left: 20px; + } + +/*-------- End FAQ --------*/ +/*-------- Begin DocumentView --------*/ + +#documentView #documentText +{ + line-height: normal; +} + +/*-------- End DocumentView --------*/ + +.Opera wbr:after +{ + content: "\00200B"; +} + + +.IE9 wbr:after +{ + content: "\00200B"; +} + +.IE8 wbr { + width: 0px; + display: inline-block; + overflow: hidden; +} + +/*-------- Begin FileManager --------*/ + +.uploadControlNoError + { + height:30px + } +.uploadControlWithError + { + height:80px; + } +/*-------- End FileManager --------*/ + +/*-------- Begin User Card --------*/ +#userCard .titleBar { + border-bottom: solid 5px #E8E8E8; + margin: 10px 0 5px 0; +} + #userCard .titleBar h3 + { + font-size: 1.0em; + font-weight: bold; + line-height: 125%; + padding: 0 0 2px 0; + } + +#userCard .userFeed { + float: right; +} + +#userCard .bio { + max-width:300px; + white-space: normal; +} + +#userCard .avatar +{ + padding: 5px 5px 10px 5px; + margin: 0 0 3px 0; + text-align: center; +} + +#userCard .itemBar { + padding: 5px; +} + + #userCard .itemBar label + { + float: left; + text-transform: capitalize; + } + + #userCard .itemBar label+span { + display: block; + margin-left: 100px; + } + +#userCard .collapsableSidebar { + clear:both; + width:100%; + margin-top: 5px; +} + +/* Profile Overrides */ +#userCard +{ + padding: 0 0 10px 0 +} + +#userCard .profile-usercard +{ + width:225px; +} + +#userCard .profile-usercard, #userCard .profile-inline, #userCard .profile-inline-header +{ + border:0px; + background-color:#fff; +} + +#userCard .profile-userimage-large +{ + margin-bottom:10px; + border:none; + width:auto; + height:auto; +} + +#userCard .profile-inline .profile-inline-header-details +{ + width:100%; + display:block; + clear:both; + margin-left:0px; +} + + + +/*-------- End User Card --------*/ +/*-------- Begin Description Progress Meter --------*/ + +#descriptionProgressMeter { + float: right; + border-top: 1px solid #DADADA; + border-left: 1px solid #DADADA; + width: 210px; + padding-left: 7px; +} + + #descriptionProgressMeter h4 { + font-weight: bold; + } + + #descriptionProgressMeter #progressGraphic { + border: 1px solid #888888; + width: 205px; + margin: 10px 0; + background-color: #E9E9E9; + padding: 1px 0 0 1px; + } + + 
#descriptionProgressMeter #progressGraphic div { + background-image: url("../common/progress_meter.png"); + padding-left: 5px; + } + + #descriptionProgressMeter #progressText { + font-weight: bold; + margin: 5px 0; + + } + + #descriptionProgressMeter #goodDescriptionText p+p { + padding: 5px 0; + } + + #descriptionProgressMeter #goodDescriptionText > div { + margin-top: 5px; + width: 150px; + border-radius: 5px; + -moz-border-radius: 5px; + -webkit-border-radius: 5px; + overflow: auto; + } + + #descriptionProgressMeter #goodDescriptionText div img { + float: left; + } + + #descriptionProgressMeter #goodDescriptionText div div { + margin: 7px 0 5px 10px; + } + +/*-------- End Description Progress Indicator --------*/ + +/*-------- Start Sample Pack Tab View --------*/ +.SamplePackTab #headerBar +{ + padding: 15px 5px 20px 5px; + font-weight: bold +} + .SamplePackTab #headerBar .reportCount + { + padding-top: 2px; + } + .SamplePackTab #headerBar .samplePackSort + { + float: right; + color: #666; + } +/*-------- End Sample Pack Tab View --------*/ \ No newline at end of file diff --git a/samples/winrt/ImageManipulations/description/Combined.css b/samples/winrt/ImageManipulations/description/Combined.css new file mode 100644 index 000000000..e69de29bb diff --git a/samples/winrt/ImageManipulations/description/Galleries.css b/samples/winrt/ImageManipulations/description/Galleries.css new file mode 100644 index 000000000..ac2e94ee8 --- /dev/null +++ b/samples/winrt/ImageManipulations/description/Galleries.css @@ -0,0 +1,418 @@ +/* *************************************************** +Galleries.css - Common Structure + +This is where we define common layout for structures that are truly close to common across the different +Galleries sites. To make sure this works we need to follow certain conventions. + +1. Define each logical structure in its own section with its own comment block that gives the section +a Name, Description and defines the root element if one exists (i.e #someElement). Also, mark the closing block. + +2. Indent styles in a section to represent if it is a child of a previous element. +i.e. #someDiv{ + } + #someDiv ul { + } + +3. Do not include brand specific information here like colors and fonts unless they are *really* common. + +4. If there is an element that you know will be overridden in each brand stylesheet still include it here with an empty definition. +This will aid in knowing what section to override and what selectors to use. + +i.e. #someSction a { + } + +5. 
When you add a new section also update the Table of Contents below so that we have a quick overview of the sections + + *****************************************************/ + +/**************************************************** +Table of Contents + + Global - global classes + + FileAttachmentDisplay - The list of attached files under the editor + Eyebrow - The breadcrumb control at the top of the master page + Pager - The common paging control, used for browsing pages of search results + Profile User Card - Elements in the profile usercard control + SideNav - The navigation side bar that contains the search filters + + +*****************************************************/ + +/******************************** +Name: Global +Root: none +Description: truly global classes +********************************/ +body { + text-align: left; + direction: ltr; +} + +img.rss { + background: url(../../../GlobalResources/Images/Rss.png) no-repeat; + background-position: 0px 0px; + height: 17px; + width: 17px; +} +/* End Global Section */ + +/******************************** +Name: FileAttachmentDisplay +Root: #fileAttachmentDisplay +Description: The list of attached files under the editor +********************************/ +#fileAttachmentDisplay { +} + #fileAttachmentDisplay .attachment { + margin-right: 10px; + float: left; + } + + #fileAttachmentDisplay .attachment .displayAttachment { + padding: 0px 0 13px 0; + float: left; + } + + #fileAttachmentDisplay .attachment .removeAttachment { + background-image: url('/Content/Common/delete.png'); + display: block; + width: 16px; + height: 16px; + float: left; + } +/* End FileAttachmentDisplay Section */ + + +/******************************** +Name: Eyebrow +Root: .EyebrowContainer +Description: The breadcrumb control at the top of the master page +********************************/ +.EyebrowContainer { +} + .EyebrowContainer div.EyebrowElement{ + display:inline; + } + + .EyebrowContainer .EyebrowElement{ + font-weight:normal + } + .EyebrowContainer .EyebrowLeafLink{ + color:#000; + } +/* End Eyebrow Section */ + +/******************************** +Name: Pager +Root: #Pager +Description: The common paging control, used for browsing pages of search results +********************************/ +#Pager { +} + #Pager div{ + display:inline; + } +/* End Pager Section */ + +/******************************** + +Name: Profile User Card +Root: #dashboardPage #userCard +Description: Elements in the profile usercard control + +********************************/ + #dashboardPage #userCard .profile-usercard-inline { + margin: 5px 0 10px 0; + } + + /* #dashboardPage #userCard .profile-usercard { + width: 288px; + } +/* End Profile User Card Section */ + +/******************************** + +Name: Discussion +Root: #DiscussionsTabPane +Description: Defines the layout of the dicussion + + +********************************/ +#DiscussionsTabPane { +} + + #DiscussionsTabPane .itemHidden + { + background: lightgrey; + } + + #discussionListItem { + } + + .discussion .postActions + { + float: right; + } + + #discussionListItem .postItem + { + white-space: pre-wrap; + word-wrap: break-word; + font-size:1em; + } + +/* End Discussion Section */ + + +/******************************** + +Name: SearchDefaultLocale +Root: .searchDefaultLocale +Description: Defines the layout of the include english result checkbox on the Browse Page + + +********************************/ +.searchDefaultLocale +{ + float: right; + margin: 20px 0 0 5px; +} + .searchDefaultLocale input + { + 
vertical-align:top; + } + .searchDefaultLocale span + { + margin-left: -3px; + } +/*-------- End SearchDefaultLocale --------*/ + + +/******************************** + +Name: SideNav +Root: #sideNav +Description: Defines the layout of the naviation elements on the side of the Browse Page + These represent the different filters like Code Language, Category and Tag + + +********************************/ + +#sideNav { + width: 250px; + vertical-align:top; + background-color:#eee; +} + #sideNav h3 { + } + + #sideNav .section { + padding: 0 0 10px 0; + position: relative; + } + + #sideNav .section a { + } + + #sideNav .section a:hover { + } + + #sideNav .section > div { + padding:5px 5px 5px 0; + line-height:150%; + } + + #sideNav .section ul { + list-style-type:none; + padding:0px; + margin:0px; + } + + #sideNav .section ul li { + position: relative; + padding: 5px 5px 5px 0; + } + + #sideNav .section ul li .selectedFilter { + float: left; + padding-right: 5px; + } + + #sideNav .section div.itemCount { + float: right; + } + + #sideNav .section form input[ type = "checkbox"] { + margin: 0px 4px 0px 0px; + vertical-align: middle; + } +/* End SideNav Section */ + +/*----------- Contribution Logos *******/ +.contributionLogo { + float: left; + position: relative; + margin-right: 6px; +} + +.logo_visualstudio { + background: transparent url('../common/logos/visualstudio.png') no-repeat; + width: 23px; + height: 12px; + margin-top: 3px; +} +.logo_allinonecode { + background: transparent url('../common/logos/1code.png') no-repeat; + width: 14px; + height: 16px; +} +.logo_exchange { + background: transparent url('../common/logos/exchange.png') no-repeat; + width: 14px; + height: 16px; +} +.logo_ie { + background: transparent url('../common/logos/ie.png') no-repeat; + width: 16px; + height: 16px; +} +.logo_office { + background: transparent url('../common/logos/office.png') no-repeat; + width: 17px; + height: 16px; +} +.logo_windows { + background: transparent url('../common/logos/windows.png') no-repeat; + width: 17px; + height: 16px; + } +.logo_azure { + background: transparent url('../common/logos/windowsazuredark.png') no-repeat; + width: 16px; + height: 16px; +} + +.logo_windowsphone { + background: transparent url('../common/logos/windowsphone.png') no-repeat; + width: 16px; + height: 16px; + } + + .contributionLogoTip { + position: absolute; + display: none; + border: solid 1px #CCC; + color: #333; + background-color: #F0F0F0; + font-size: 11px; + font-family: "Segoe UI",Sans-Serif; + box-shadow: 3px 3px 5px #888; + -moz-box-shadow: 3px 3px 5px #888; + z-index: 1003; + padding: 5px; + min-width: 250px; + } + +/*----------- End Contribution Logos *******/ + +.clear +{ + clear: both; +} + +.customcontributionLogoTip { + position: absolute; + display: none; + border: solid 1px #CCC; + background-color: white; + color: #333; + font-size: 11px; + font-family: "Segoe UI",Sans-Serif; + box-shadow: 3px 3px 5px #888; + -moz-box-shadow: 3px 3px 5px #888; + z-index: 1004; + padding: 5px; + min-width: 250px; +} + +.customcontributionTittle { + font-size: 14px; + margin-left: 90px; +} + +.customcontributionDiscription { + font-size: 13px; + margin: 10px 5px; + text-align: justify; +} + +.customcontribution { + float: left; + position: relative; + margin-right: 6px; +} + +.customcontributionLink { + margin-left: 5px; +} + +.customcontributionlogo { + float: left; + padding: 0 10px; + margin: 0; + width: 70px; + height: 70px; + background-repeat: no-repeat; +} + + +.logo_azure_large { + background-image: 
url('../common/logos/windowsazure_large.png'); +} +.logo_visualstudio_large { + background-image: url('../common/logos/visualstudio_large.png'); +} +.logo_exchange_large { + background-image: url('../common/logos/exchange_large.png'); +} +.logo_ie_large { + background-image: url('../common/logos/ie_large.png'); +} +.logo_office_large { + background-image: url('../common/logos/office_large.png'); +} +.logo_windows_large { + background-image: url('../common/logos/windows_large.png'); +} +.logo_windowsphone_large { + background-image: url('../common/logos/windowsphone_large.png'); +} + +/* Custome Header */ +.dirSubHeading .windowssdk .container +{ + background: #FF3300 url('wpappsbackground.png') no-repeat; + color: white; + padding: 8px 10px 18px 170px; +} + +.dirSubHeading .windowssdk .container h1, .dirSubHeading .windowssdk .container h2 { + color: white !important; +} + +.dirSubHeading .windowssdk .container p { + margin: 20px 0 0 0 !important; +} + +.dirSubHeading .windowssdk .container a { + background-color:#ffd800; + color: #2a2a2a !important; + cursor:pointer; + font-size:13px; + font-family:'Segoe UI Semibold','Segoe UI','Lucida Grande',Verdana,Arial,Helvetica,sans-serif; + padding:4px 12px 6px; +} + + + diff --git a/samples/winrt/ImageManipulations/description/Layout.css b/samples/winrt/ImageManipulations/description/Layout.css new file mode 100644 index 000000000..625f77763 --- /dev/null +++ b/samples/winrt/ImageManipulations/description/Layout.css @@ -0,0 +1,147 @@ +#container { + min-height: 768px; +} + +#leftSubHeaderContainer +{ + margin-top:20px; +} + +#title h1 +{ + font-size:25px; +} + +#subtitle h2 +{ + font-size:15px; +} + +#subtitle +{ + margin-left:10px; +} + + +#formContainer +{ + margin-left:10px; + margin-top:30px; +} + +.formLabel +{ + float:left; + width: 250px; +} + +.formRow +{ + clear:both; + padding: 10px 0 10px 10px; +} + + +.formRecaptchaRow +{ + clear:both; + float:left; + margin-top:20px; + margin-left:10px; + margin-bottom:20px; +} + +.formSubmitRow +{ + clear:both; + margin-top:20px; + margin-left:300px; + margin-bottom:20px; +} + +.formControl { + width:300px; + float:left; +} +.formControl .textInput +{ + width:300px; +} + +.formControl textarea +{ + width:425px; + height:100px; +} + +.formControl .tag +{ + width:425px; +} + +.formControl .richText +{ + margin-top:10px; + width:500px; + height:440px; +} + +.formWideLabel +{ + width:500px; +} + +.formBigLabel +{ + margin-top:20px; + font-size:20px; +} + +.formControlBelow +{ + clear:both; + margin-top:10px; + width:500px; +} + +.required +{ + color: Red; +} +.helpText +{ + color: #9D9D9D; + font-style: italic; +} + +#agreementSummary +{ + clear:both; + margin-top:10px; + width:800px; +} + +.field-validation-error, .validation-summary-errors +{ + color: #FF0000; + font-weight: bold; +} + +.tinyMCETemplate { + position: relative; + left: 400px; + width: 300px; + max-height: 300px; + overflow: auto; +} + +.IE6 .tinyMCETemplate { + left: 25px; +} + +.ownerBar { + padding: 5px; +} +.ownerBar .ownerBarOptions { + float: right; +} diff --git a/samples/winrt/ImageManipulations/description/c2e69f54-1c43-4037-b90b-5f775f1d945fBrand.css b/samples/winrt/ImageManipulations/description/c2e69f54-1c43-4037-b90b-5f775f1d945fBrand.css new file mode 100644 index 000000000..e3f039dfb --- /dev/null +++ b/samples/winrt/ImageManipulations/description/c2e69f54-1c43-4037-b90b-5f775f1d945fBrand.css @@ -0,0 +1,303 @@ +/*Global*/ +h1 { + font-size: 36px; + font-family: 'Segoe UI Light'; + color: #707070; + font-weight: normal; + 
margin-bottom: 17px !important; +} + +h2, h3, h4, h5, h6, #searchPage h3 { + font-family: 'Segoe UI', 'Lucida Grande', Verdana, Arial, Helvetica, sans-serif !important; + font-weight:normal; + color: #2A2A2A !important; +} + +a, a:link, a:visited { + color: #0095c4; +} + +body { + color:#707070; +} + +.profile-usercard { + color:#707070 !important; +} + +/*temporary setting to override msdn_windows.css +can remove once conflicting settings are removed from that file*/ + + +.LocalNavigation, .LocalNavigation .TabOn, .LocalNavigation .TabOn:hover, .LocalNavigation .TabOff, .LocalNavigation .TabOff a:hover { + display: block; + background-color:transparent !important; + color: #0095c4; +} + +.LocalNavigation .TabOff a { +color:#707070 ; +} + +/*End Global*/ + +.EyebrowContainer +{ + margin-bottom: 0 !important; +} + +#sideNav +{ + width: 215px !important; +} + +#searchPage #mainContentContainer +{ + margin-right:0 !important; + margin-left:243px !important; +} + +#searchPage .dirSubHeading h2 +{ + font-size: 14px !important; + font-weight: normal !important; + color: #454545 !important; + line-height: 1.45; +} + +#searchPage #directoryListFooter, #searchPage #Pager { + font-size: 14px; +} + +#searchPage h2, #searchPage h3 +{ + font-size: 1.25em !important; +} + +#sideNav #contributeSection h3, .sidebar #contributeSection h3, #contributeSection h3 +{ + font-size: 1.65em !important; +} + +.subMenu > h2 +{ + font-family: 'Segoe UI Light','Segoe UI', 'Lucida Grande', Verdana, Arial, Helvetica, sans-serif !important; + font-weight:normal; + font-size:30px; + margin: 15px 10px 5px 0; + padding-bottom:0px; +} + +.itemRow { +} + .itemRow .itemBody, .itemRow .itemInfo { + padding: 18px 17px 20px 0; + font-size: 14px; + line-height: 1.45em; + } + + .itemRow .itemTitle { + font-weight: normal; + } + + .itemRow .summaryBox{ + color: #454545; + } + + .Samples #MainContent .itemRow .itemTitle a { + font-weight: 600 !important; + line-height: 1.45; + } + #MainContent a.officialMicrosoftLabel + { + color: #ACACAC; + } + + +.tabContents { + border-top-width:0px; +} + +#UploadPage { + margin: 0px 0px 0px 10px; +} + #UploadPage h1 { + padding: 0; + font-size: 22px; + } + #UploadPage h2 { + color:#F39700 !important; + } + + #UploadPage #uploadPageInstruction { + margin-top:10px; + } + + #UploadPage fieldset { + margin-left:0px; + } + + #UploadPage fieldset h2 { + font-weight:normal; + } + + #UploadPage fieldset#uploadsForm{ + margin-top:25px; + } + + #UploadPage fieldset#summary textarea { + margin-left:0px; + } + + #UploadPage .projectTypeChoice > div { + height: 250px; + } + +#sideNav { +} + + #sideNav .section h3 { + background-color: transparent; + + } + + #sideNav .section UL LI { + border-bottom-width: 0px; + } + + #sideNav .section form > div { + border-bottom: none; + color: #707070; + } + #sideNav .section ul li > div.itemCount + { + color: #707070; + } + + +#searchPage { +} + + #searchPage h2, #searchPage h3 { + text-transform:none; + background-color:transparent; + font-weight:normal; + font-size:1.2em; + } + + #searchPage .browseFilterBar { + background-color:transparent; + border-width:0px; + font-weight:normal; + } + +#requestsPage { + padding-top:15px; +} + #requestsPage .tabHeaders { + overflow: visible; + } + + #requestsPage #requestsList { + border: none; + } + + + #requestsPage h2, #requestsPage h3 { + text-transform:none; + background-color:transparent; + font-weight:normal; + font-size:1.2em; + } + + .reqBrowseContent .title { + font-weight: bold !important; + color:#000 !important; + 
font-family: 'Segoe UI', 'Lucida Grande', Verdana, Arial, Helvetica, sans-serif !important; + } + + .reqDescPage #header #votenumber { + height: 30px; + padding: 9px 12px 3px 12px; + } + +#extraActions { +} + #extraActions .section + { + margin-bottom: 10px; + } + #extraActions .section a + { + font-weight:normal; + } + + #extraActions #contributeSection div img { + width:0px; + } + + + +#projectPage { +} + + #projectPage .projectTitle { + color: #707070; + margin: 5px 0px 15px 0px; + } + + #projectPage h2.projectSummary, #projectPage #projectInfo, #projectPage .tabHeaders { + font-size: 14px !important; + line-height: 1.45em; + color: #454545 !important; + font-weight: normal !important; + } + + #projectPage #projectInfo a { + color: #00749e; + } + + #projectPage #Downloads a, #projectPage #Downloads label { + font-size: 14px; + } + + #projectPage #reportAbuse { + font-size: 1em; + } + + #projectPage #publishBar a, #projectPage #publishBar a:visited { + color: #0095c4; + font-weight: normal; + } + + #projectPage #Collections .bevelButton{ + background-color: #F8F8F8; + color: #0095C4; + border: 1px solid #707070; + } + + #projectPage #DiscussionsTabPane .threadHeader .title { + font-weight:bold !important; + color:Black !important;#F8F8F8; + font-family: 'Segoe UI', 'Lucida Grande', Verdana, Arial, Helvetica, sans-serif !important; + } + + #projectPage .sidebar .section .titleBar h3 { + font-weight:normal; + font-size:1.2em; + } + +#LocalNav { +} + + #LocalNav.HeaderTabs { + margin-left:11px; + } + + +#searchPage .dirSubHeading h1 +{ + margin-bottom:17px !important; +} + + diff --git a/samples/winrt/ImageManipulations/description/iframedescription.css b/samples/winrt/ImageManipulations/description/iframedescription.css new file mode 100644 index 000000000..9abc9cdb3 --- /dev/null +++ b/samples/winrt/ImageManipulations/description/iframedescription.css @@ -0,0 +1,179 @@ +body { + color: #000000; + font-family: 'Segoe UI',Verdana,Arial; + font-size: 0.813em; + font-style: normal; + word-wrap: break-word; +} + +/*BEGIN HEADERS*/ +.h1, h1 { + color: #3A3E43; + font-family: 'Segoe UI',Verdana,Arial; + font-size: 1.4em; + font-weight: bold; + margin: 0; +} + +.h2, h2 { + color: #3A3E43; + font-family: 'Segoe UI',Verdana,Arial; + font-size: 1.2em; + font-weight: bold; +} +.h3, h3 { + color: #3A3E43; + font-family: 'Segoe UI',Verdana,Arial; + font-size: 1.077em; + font-weight: bold; +} +.h4, h4 { + color: #3A3E43; + font-family: 'Segoe UI',Verdana,Arial; + font-size: 1em; + font-weight: bold; +} +h4.subHeading { + margin-bottom: 7px; + margin-top: 13px; +} +/*END HEADERS*/ + +/*BEGIN LINKS*/ +a:link { + color: #00749E; + text-decoration: none; +} +a:hover { + text-decoration: underline; +} +a:visited { + color: #960BB4; + text-decoration: none; +} +a:focus { + outline: 1px dotted #000000; +} + +a.libraryLink:link { + text-decoration:none; + border-bottom:1px dotted; +} + +/*END LINKS*/ + +/*BEGIN IMAGES*/ +img { + border: 0 none; +} +/*END IMAGES*/ + +/*BEGIN TABLE*/ +.title table { + color: #000000; + font-family: 'Segoe UI',Verdana,Arial; + font-size: 1.077em; + font-style: normal; +} +table { + border-collapse: collapse; +} + +table, table th, table td { + border:1px solid #BBBBBB; +} +/*END TABLE*/ + +/*BEGIN LIST*/ +ul { + list-style-type: disc; + margin-left:40px; + padding-left: 0; +} +ul li { + padding-bottom: 10px; +} +ol { + margin-left:40px; + padding-left: 0; +} +ol li { + padding-bottom: 10px; +} +/*END LIST*/ + +.scriptcode { + position: relative; + padding: 8px 8px 8px 8px; + 
background: #FFFFFF; + font-size: 12px; + line-height: 125%; + font-weight:normal; +} +.scriptcode pre +{ + white-space: pre-wrap !important; /* css-3 */ + word-wrap: break-word !important; /* Internet Explorer 5.5+ */ + margin:0 0 10px 0 !important; + padding: 10px; + border-top: solid 2px #D0D2D2; + border-bottom: solid 2px #D0D2D2; + border-left: solid 1px #D0D2D2; + border-right: solid 1px #D0D2D2; +} + +.scriptcode .title { + color:#E66A38; + font-size: 12px; + font-weight:bold; + margin: 0; + min-height: 23px; +} +.scriptcode .title > span:first-child { + border-left: solid 1px #D0D2D2; +} +.scriptcode .title > span { + padding: 4px 8px 4px 8px; + display: inline-block; + border-top: 1px solid #D0D2D2; + border-right: 1px solid #D0D2D2; + border-collapse: collapse; + text-align: center; + background: white; +} +.scriptcode .title > span.otherTab { + color: #1364C4; + background: #EFF5FF; + cursor: pointer; +} + +.scriptcode .hidden { + display: none !important; + visibility: hidden !important; +} + +.scriptcode .copyCode { + padding: 8px 2px 0 2px !important; + margin-right: 15px; + position: absolute !important; + right: 0 !important; + top: 17px; + display:block !important; + background: #FFFFFF; +} +.scriptcode .pluginLinkHolder { + display: none; +} +.scriptcode .pluginEditHolderLink { + display: none; +} + +.Opera wbr +{ + display: inline-block; +} + +.IE9 wbr:after +{ +content: "\00200B"; +} diff --git a/samples/winrt/ImageManipulations/description/offline.js b/samples/winrt/ImageManipulations/description/offline.js new file mode 100644 index 000000000..f5d07c8af --- /dev/null +++ b/samples/winrt/ImageManipulations/description/offline.js @@ -0,0 +1,52 @@ +var Galleries = Galleries || { }; + +(function() { + + function findElem(parent, tagName, className) { + var elemToSearch = (parent) ? parent : document.body; + var tagMatch = elemToSearch.getElementsByTagName(tagName); + var evaluator = function(elem) { + return (className) ? (elem.className.indexOf(className) > -1) : true; + }; + + return findArrayElem(tagMatch, evaluator); + } + + function findArrayElem(array, evaluator) { + var newArray = new Array(); + for (var count = 0; count < array.length; count++) { + if (evaluator(array[count])) { + newArray.push(array[count]); + } + } + return newArray; + } + + function iterateElem(elems, delegate) { + for(var count = 0; count < elems.length; count++) { + delegate(count, elems[count]); + } + } + + function isHidden(elem) { + return (elem.offsetHeight === 0 && elem.offsetWidth === 0) || elem.style && elem.style.display === "none"; + } + + function onWindowLoad(callback) { + attachEventHandler(null, 'load', callback); + } + + function attachEventHandler(elem, event, callback) { + var elemToAttach = (elem) ? 
elem : window; + if (document.addEventListener) { + elemToAttach.addEventListener(event, callback, false); + } else if ( document.attachEvent ) { + elemToAttach.attachEvent('on' + event, callback); + } + } + + Galleries.findElem = findElem; + Galleries.iterateElem = iterateElem; + Galleries.attachEventHandler = attachEventHandler; + Galleries.onWindowLoad = onWindowLoad; +})(); \ No newline at end of file diff --git a/samples/winrt/ImageManipulations/license.rtf b/samples/winrt/ImageManipulations/license.rtf new file mode 100644 index 000000000..690a7ad07 --- /dev/null +++ b/samples/winrt/ImageManipulations/license.rtf @@ -0,0 +1,25 @@ +{\rtf1\ansi\ansicpg1252\uc1\htmautsp\deff2{\fonttbl{\f0\fcharset0 Times New Roman;}{\f2\fcharset0 MS Shell Dlg;}}{\colortbl\red0\green0\blue0;\red255\green255\blue255;}\loch\hich\dbch\pard\plain\ltrpar\itap0{\lang1033\fs16\f2\cf0 \cf0\ql{\f2 \line \li0\ri0\sa0\sb0\fi0\ql\par} +{\fs40\f2 {\ltrch MICROSOFT LIMITED PUBLIC LICENSE version 1.1}\li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 \line {\ltrch ----------------------}\line \li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 {\ltrch This license governs use of code marked as \ldblquote sample\rdblquote or \ldblquote example\rdblquote available on this web site without a license agreement, as provided under the section above titled \ldblquote NOTICE SPECIFIC TO SOFTWARE AVAILABLE ON THIS WEB SITE.\rdblquote If you use such code (the \ldblquote software\rdblquote ), you accept this license. If you do not accept the license, do not use the software.}\li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 \line \li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 {\ltrch 1. Definitions}\li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 {\ltrch The terms \ldblquote reproduce,\rdblquote \ldblquote reproduction,\rdblquote \ldblquote derivative works,\rdblquote and \ldblquote distribution\rdblquote have the same meaning here as under U.S. copyright law. }\li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 {\ltrch A \ldblquote contribution\rdblquote is the original software, or any additions or changes to the software.}\li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 {\ltrch A \ldblquote contributor\rdblquote is any person that distributes its contribution under this license.}\li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 {\ltrch \ldblquote Licensed patents\rdblquote are a contributor\rquote s patent claims that read directly on its contribution.}\li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 \line \li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 {\ltrch 2. Grant of Rights}\li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 {\ltrch (A) Copyright Grant - Subject to the terms of this license, including the license conditions and limitations in section 3, each contributor grants you a non-exclusive, worldwide, royalty-free copyright license to reproduce its contribution, prepare derivative works of its contribution, and distribute its contribution or any derivative works that you create.}\li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 {\ltrch (B) Patent Grant - Subject to the terms of this license, including the license conditions and limitations in section 3, each contributor grants you a non-exclusive, worldwide, royalty-free license under its licensed patents to make, have made, use, sell, offer for sale, import, and/or otherwise dispose of its contribution in the software or derivative works of the contribution in the software.}\li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 \line \li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 {\ltrch 3. 
Conditions and Limitations}\li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 {\ltrch (A) No Trademark License- This license does not grant you rights to use any contributors\rquote name, logo, or trademarks.}\li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 {\ltrch (B) If you bring a patent claim against any contributor over patents that you claim are infringed by the software, your patent license from such contributor to the software ends automatically.}\li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 {\ltrch (C) If you distribute any portion of the software, you must retain all copyright, patent, trademark, and attribution notices that are present in the software.}\li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 {\ltrch (D) If you distribute any portion of the software in source code form, you may do so only under this license by including a complete copy of this license with your distribution. If you distribute any portion of the software in compiled or object code form, you may only do so under a license that complies with this license.}\li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 {\ltrch (E) The software is licensed \ldblquote as-is.\rdblquote You bear the risk of using it. The contributors give no express warranties, guarantees or conditions. You may have additional consumer rights under your local laws which this license cannot change. To the extent permitted under your local laws, the contributors exclude the implied warranties of merchantability, fitness for a particular purpose and non-infringement.}\li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 {\ltrch (F) Platform Limitation - The licenses granted in sections 2(A) and 2(B) extend only to the software or derivative works that you create that run directly on a Microsoft Windows operating system product, Microsoft run-time technology (such as the .NET Framework or Silverlight), or Microsoft application platform (such as Microsoft Office or Microsoft Dynamics).}\li0\ri0\sa0\sb0\fi0\ql\par} +{\f2 \line \li0\ri0\sa0\sb0\fi0\ql\par} +} +} \ No newline at end of file From 9e06287121155f1312cc64a23a5e56bf08ffcfad Mon Sep 17 00:00:00 2001 From: Alexander Smorkalov Date: Mon, 24 Jun 2013 02:32:57 -0700 Subject: [PATCH 32/75] Windows RT sample updated. Unused scenarios removed. Grayscale conversion replaced with cv::Canny call.
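
The grayscale transform in the MediaExtensions MFT now accepts only NV12 and no longer recolors the chroma plane per pixel; instead the luma plane of each frame is handed to OpenCV. A minimal sketch of the idea, assuming the opencv2 headers added to Grayscale.cpp below (the helper name, Canny thresholds and chroma handling here are illustrative assumptions, not the exact code shipped in the sample):

    #include <windows.h>                   // BYTE, LONG, DWORD
    #include "opencv2/core/core.hpp"
    #include "opencv2/imgproc/imgproc.hpp"

    // Illustrative sketch only: run cv::Canny over the luma plane of an NV12 frame.
    // Parameters mirror the existing TransformImage_NV12 signature.
    void SketchCanny_NV12(BYTE* pDest, LONG lDestStride,
                          const BYTE* pSrc, LONG lSrcStride,
                          DWORD dwWidthInPixels, DWORD dwHeightInPixels)
    {
        // NV12 layout: full-resolution Y plane, then interleaved UV at half height.
        cv::Mat srcY(static_cast<int>(dwHeightInPixels), static_cast<int>(dwWidthInPixels),
                     CV_8UC1, const_cast<BYTE*>(pSrc), static_cast<size_t>(lSrcStride));
        cv::Mat dstY(static_cast<int>(dwHeightInPixels), static_cast<int>(dwWidthInPixels),
                     CV_8UC1, pDest, static_cast<size_t>(lDestStride));

        // Edge map written straight into the destination luma plane (placeholder thresholds).
        cv::Canny(srcY, dstY, 80, 160);

        // Neutral chroma so the edge map renders as a monochrome image and the buffer stays valid NV12.
        cv::Mat dstUV(static_cast<int>(dwHeightInPixels / 2), static_cast<int>(dwWidthInPixels),
                      CV_8UC1, pDest + lDestStride * dwHeightInPixels, static_cast<size_t>(lDestStride));
        dstUV.setTo(cv::Scalar(128));
    }

Wrapping the existing Media Foundation buffers in cv::Mat headers this way avoids an extra copy: only the Y plane is overwritten with the edge map, and the UV plane is reset to the neutral value 128.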
--- platforms/scripts/cmake_winrt.cmd | 2 +- platforms/winrt/arm.winrt.toolchain.cmake | 13 +- .../ImageManipulations/C++/AudioCapture.xaml | 62 -- .../C++/AudioCapture.xaml.cpp | 366 ------------ .../C++/AudioCapture.xaml.h | 70 --- .../ImageManipulations/C++/BasicCapture.xaml | 87 --- .../C++/BasicCapture.xaml.cpp | 535 ------------------ .../C++/BasicCapture.xaml.h | 88 --- .../ImageManipulations/C++/Constants.cpp | 2 - .../winrt/ImageManipulations/C++/Constants.h | 2 +- .../ImageManipulations/C++/MainPage.xaml | 6 +- .../C++/MediaCapture.vcxproj | 160 +++++- .../C++/MediaCapture.vcxproj.filters | 25 +- .../MediaExtensions/Grayscale/Grayscale.cpp | 250 +------- .../C++/MediaExtensions/Grayscale/Grayscale.h | 34 +- .../Grayscale/Grayscale.vcxproj | 20 +- 16 files changed, 221 insertions(+), 1501 deletions(-) delete mode 100644 samples/winrt/ImageManipulations/C++/AudioCapture.xaml delete mode 100644 samples/winrt/ImageManipulations/C++/AudioCapture.xaml.cpp delete mode 100644 samples/winrt/ImageManipulations/C++/AudioCapture.xaml.h delete mode 100644 samples/winrt/ImageManipulations/C++/BasicCapture.xaml delete mode 100644 samples/winrt/ImageManipulations/C++/BasicCapture.xaml.cpp delete mode 100644 samples/winrt/ImageManipulations/C++/BasicCapture.xaml.h diff --git a/platforms/scripts/cmake_winrt.cmd b/platforms/scripts/cmake_winrt.cmd index aafed7d09..c6d8cb8e0 100644 --- a/platforms/scripts/cmake_winrt.cmd +++ b/platforms/scripts/cmake_winrt.cmd @@ -3,4 +3,4 @@ cd build rem call "C:\Program Files\Microsoft Visual Studio 11.0\VC\bin\x86_arm\vcvarsx86_arm.bat" -cmake.exe -GNinja -DCMAKE_BUILD_TYPE=Release -DWITH_FFMPEG=OFF -DBUILD_opencv_gpu=OFF -DBUILD_opencv_python=OFF -DCMAKE_TOOLCHAIN_FILE=..\..\winrt\arm.winrt.toolchain.cmake ..\..\.. +cmake.exe -GNinja -DWITH_TBB=ON -DBUILD_TBB=ON -DCMAKE_BUILD_TYPE=Release -DWITH_FFMPEG=OFF -DBUILD_opencv_gpu=OFF -DBUILD_opencv_python=OFF -DCMAKE_TOOLCHAIN_FILE=..\..\winrt\arm.winrt.toolchain.cmake ..\..\.. 
diff --git a/platforms/winrt/arm.winrt.toolchain.cmake b/platforms/winrt/arm.winrt.toolchain.cmake index b34056cd5..ac9af117d 100644 --- a/platforms/winrt/arm.winrt.toolchain.cmake +++ b/platforms/winrt/arm.winrt.toolchain.cmake @@ -3,4 +3,15 @@ set(CMAKE_SYSTEM_PROCESSOR "arm-v7a") set(CMAKE_FIND_ROOT_PATH "${CMAKE_SOURCE_DIR}/platforms/winrt") set(CMAKE_REQUIRED_DEFINITIONS -D_ARM_WINAPI_PARTITION_DESKTOP_SDK_AVAILABLE) -add_definitions(-D_ARM_WINAPI_PARTITION_DESKTOP_SDK_AVAILABLE) \ No newline at end of file +add_definitions(-D_ARM_WINAPI_PARTITION_DESKTOP_SDK_AVAILABLE) + +set(CMAKE_CXX_FLAGS "" CACHE STRING "c++ flags") +set(CMAKE_C_FLAGS "" CACHE STRING "c flags") + +set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -ZW -EHsc -GS") +set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -GS") + + +set(CMAKE_SHARED_LINKER_FLAGS "/r:System.Runtime.WindowsRuntime.dll /r:System.Threading.Tasks.dll" CACHE STRING "shared linker flags") +set(CMAKE_MODULE_LINKER_FLAGS "/r:System.Runtime.WindowsRuntime.dll /r:System.Threading.Tasks.dll" CACHE STRING "module linker flags") +set(CMAKE_EXE_LINKER_FLAGS "/r:System.Runtime.WindowsRuntime.dll /r:System.Threading.Tasks.dll" CACHE STRING "executable linker flags") \ No newline at end of file diff --git a/samples/winrt/ImageManipulations/C++/AudioCapture.xaml b/samples/winrt/ImageManipulations/C++/AudioCapture.xaml deleted file mode 100644 index be65bcd8c..000000000 --- a/samples/winrt/ImageManipulations/C++/AudioCapture.xaml +++ /dev/null @@ -1,62 +0,0 @@ - - - - - - - - - - - - - - - - This scenario shows how to do an audio only capture using the default microphone. Click on StartRecord to start recording. - - - - - - - - - - - - - - - - - - - - - - - - - - - - diff --git a/samples/winrt/ImageManipulations/C++/AudioCapture.xaml.cpp b/samples/winrt/ImageManipulations/C++/AudioCapture.xaml.cpp deleted file mode 100644 index 37fc379d3..000000000 --- a/samples/winrt/ImageManipulations/C++/AudioCapture.xaml.cpp +++ /dev/null @@ -1,366 +0,0 @@ -//********************************************************* -// -// Copyright (c) Microsoft. All rights reserved. -// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF -// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY -// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR -// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. -// -//********************************************************* - -// -// AudioCapture.xaml.cpp -// Implementation of the AudioCapture class -// - -#include "pch.h" -#include "AudioCapture.xaml.h" -#include -using namespace concurrency; - -using namespace SDKSample::MediaCapture; - -using namespace Windows::UI::Xaml; -using namespace Windows::UI::Xaml::Navigation; -using namespace Windows::UI::Xaml::Data; -using namespace Windows::System; -using namespace Windows::Foundation; -using namespace Platform; -using namespace Windows::UI; -using namespace Windows::UI::Core; -using namespace Windows::UI::Xaml; -using namespace Windows::UI::Xaml::Controls; -using namespace Windows::UI::Xaml::Data; -using namespace Windows::UI::Xaml::Media; -using namespace Windows::Storage; -using namespace Windows::Media::MediaProperties; -using namespace Windows::Storage::Streams; -using namespace Windows::System; -using namespace Windows::UI::Xaml::Media::Imaging; - - -AudioCapture::AudioCapture() -{ - InitializeComponent(); - ScenarioInit(); -} - -/// -/// Invoked when this page is about to be displayed in a Frame. -/// -/// Event data that describes how this page was reached. The Parameter -/// property is typically used to configure the page. 
-void AudioCapture::OnNavigatedTo(NavigationEventArgs^ e) -{ - // A pointer back to the main page. This is needed if you want to call methods in MainPage such - // as NotifyUser() - rootPage = MainPage::Current; - m_eventRegistrationToken = Windows::Media::MediaControl::SoundLevelChanged += ref new EventHandler(this, &AudioCapture::SoundLevelChanged); -} - -void AudioCapture::OnNavigatedFrom(NavigationEventArgs^ e) -{ - // A pointer back to the main page. This is needed if you want to call methods in MainPage such - // as NotifyUser() - Windows::Media::MediaControl::SoundLevelChanged -= m_eventRegistrationToken; -} - -void AudioCapture::ScenarioInit() -{ - try - { - rootPage = MainPage::Current; - btnStartDevice3->IsEnabled = true; - btnStartStopRecord3->IsEnabled = false; - m_bRecording = false; - playbackElement3->Source = nullptr; - m_bSuspended = false; - ShowStatusMessage(""); - } - catch (Exception ^e) - { - ShowExceptionMessage(e); - } - -} - -void AudioCapture::ScenarioReset() -{ - ScenarioInit(); -} - - -void AudioCapture::SoundLevelChanged(Object^ sender, Object^ e) -{ - create_task(Dispatcher->RunAsync(Windows::UI::Core::CoreDispatcherPriority::High, ref new Windows::UI::Core::DispatchedHandler([this]() - { - if(Windows::Media::MediaControl::SoundLevel != Windows::Media::SoundLevel::Muted) - { - ScenarioReset(); - } - else - { - if (m_bRecording) - { - ShowStatusMessage("Stopping Record on invisibility"); - - create_task(m_mediaCaptureMgr->StopRecordAsync()).then([this](task recordTask) - { - try - { - recordTask.get(); - m_bRecording = false; - }catch (Exception ^e) - { - ShowExceptionMessage(e); - } - }); - } - } - }))); -} - -void AudioCapture::RecordLimitationExceeded(Windows::Media::Capture::MediaCapture ^currentCaptureObject) -{ - try - { - if (m_bRecording) - { - create_task(Dispatcher->RunAsync(Windows::UI::Core::CoreDispatcherPriority::High, ref new Windows::UI::Core::DispatchedHandler([this](){ - try - { - ShowStatusMessage("Stopping Record on exceeding max record duration"); - EnableButton(false, "StartStopRecord"); - create_task(m_mediaCaptureMgr->StopRecordAsync()).then([this](task recordTask) - { - try - { - recordTask.get(); - m_bRecording = false; - SwitchRecordButtonContent(); - EnableButton(true, "StartStopRecord"); - ShowStatusMessage("Stopped record on exceeding max record duration:" + m_recordStorageFile->Path); - } - catch (Exception ^e) - { - ShowExceptionMessage(e); - m_bRecording = false; - SwitchRecordButtonContent(); - EnableButton(true, "StartStopRecord"); - } - }); - - } - catch (Exception ^e) - { - m_bRecording = false; - SwitchRecordButtonContent(); - EnableButton(true, "StartStopRecord"); - ShowExceptionMessage(e); - } - - }))); - } - } - catch (Exception ^e) - { - m_bRecording = false; - SwitchRecordButtonContent(); - EnableButton(true, "StartStopRecord"); - ShowExceptionMessage(e); - } -} - -void AudioCapture::Failed(Windows::Media::Capture::MediaCapture ^currentCaptureObject, Windows::Media::Capture::MediaCaptureFailedEventArgs^ currentFailure) -{ - String ^message = "Fatal error: " + currentFailure->Message; - create_task(Dispatcher->RunAsync(Windows::UI::Core::CoreDispatcherPriority::High, - ref new Windows::UI::Core::DispatchedHandler([this, message]() - { - ShowStatusMessage(message); - }))); -} - -void AudioCapture::btnStartDevice_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e) -{ - try - { - EnableButton(false, "StartDevice"); - ShowStatusMessage("Starting device"); - auto mediaCapture = ref new 
Windows::Media::Capture::MediaCapture(); - m_mediaCaptureMgr = mediaCapture; - auto settings = ref new Windows::Media::Capture::MediaCaptureInitializationSettings(); - settings->StreamingCaptureMode = Windows::Media::Capture::StreamingCaptureMode::Audio; - create_task(mediaCapture->InitializeAsync()).then([this](task initTask) - { - try - { - initTask.get(); - - auto mediaCapture = m_mediaCaptureMgr.Get(); - EnableButton(true, "StartPreview"); - EnableButton(true, "StartStopRecord"); - EnableButton(true, "TakePhoto"); - ShowStatusMessage("Device initialized successful"); - mediaCapture->RecordLimitationExceeded += ref new Windows::Media::Capture::RecordLimitationExceededEventHandler(this, &AudioCapture::RecordLimitationExceeded); - mediaCapture->Failed += ref new Windows::Media::Capture::MediaCaptureFailedEventHandler(this, &AudioCapture::Failed); - } - catch (Exception ^ e) - { - ShowExceptionMessage(e); - } - }); - } - catch (Platform::Exception^ e) - { - ShowExceptionMessage(e); - } -} - -void AudioCapture::btnStartStopRecord_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e) -{ - try - { - String ^fileName; - EnableButton(false, "StartStopRecord"); - - if (!m_bRecording) - { - ShowStatusMessage("Starting Record"); - - fileName = AUDIO_FILE_NAME; - - task(KnownFolders::VideosLibrary->CreateFileAsync(fileName, Windows::Storage::CreationCollisionOption::GenerateUniqueName)).then([this](task fileTask) - { - try - { - this->m_recordStorageFile = fileTask.get(); - ShowStatusMessage("Create record file successful"); - - MediaEncodingProfile^ recordProfile= nullptr; - recordProfile = MediaEncodingProfile::CreateM4a(Windows::Media::MediaProperties::AudioEncodingQuality::Auto); - - create_task(m_mediaCaptureMgr->StartRecordToStorageFileAsync(recordProfile, this->m_recordStorageFile)).then([this](task recordTask) - { - try - { - recordTask.get(); - m_bRecording = true; - SwitchRecordButtonContent(); - EnableButton(true, "StartStopRecord"); - - ShowStatusMessage("Start Record successful"); - - - }catch (Exception ^e) - { - ShowExceptionMessage(e); - m_bRecording = false; - SwitchRecordButtonContent(); - EnableButton(true, "StartStopRecord"); - } - }); - } - catch (Exception ^e) - { - m_bRecording = false; - SwitchRecordButtonContent(); - EnableButton(true, "StartStopRecord"); - ShowExceptionMessage(e); - } - } - ); - } - else - { - ShowStatusMessage("Stopping Record"); - - create_task(m_mediaCaptureMgr->StopRecordAsync()).then([this](task) - { - try - { - m_bRecording = false; - EnableButton(true, "StartStopRecord"); - SwitchRecordButtonContent(); - - ShowStatusMessage("Stop record successful"); - if (!m_bSuspended) - { - task(this->m_recordStorageFile->OpenAsync(FileAccessMode::Read)).then([this](task streamTask) - { - try - { - ShowStatusMessage("Record file opened"); - auto stream = streamTask.get(); - ShowStatusMessage(this->m_recordStorageFile->Path); - playbackElement3->AutoPlay = true; - playbackElement3->SetSource(stream, this->m_recordStorageFile->FileType); - playbackElement3->Play(); - } - catch (Exception ^e) - { - ShowExceptionMessage(e); - m_bRecording = false; - SwitchRecordButtonContent(); - EnableButton(true, "StartStopRecord"); - } - }); - } - } - catch (Exception ^e) - { - m_bRecording = false; - SwitchRecordButtonContent(); - EnableButton(true, "StartStopRecord"); - ShowExceptionMessage(e); - } - }); - } - } - catch (Platform::Exception^ e) - { - EnableButton(true, "StartStopRecord"); - ShowExceptionMessage(e); - m_bRecording = false; - 
SwitchRecordButtonContent(); - } -} - - -void AudioCapture::ShowStatusMessage(Platform::String^ text) -{ - rootPage->NotifyUser(text, NotifyType::StatusMessage); -} - -void AudioCapture::ShowExceptionMessage(Platform::Exception^ ex) -{ - rootPage->NotifyUser(ex->Message, NotifyType::ErrorMessage); -} - -void AudioCapture::SwitchRecordButtonContent() -{ - { - if (m_bRecording) - { - btnStartStopRecord3->Content="StopRecord"; - } - else - { - btnStartStopRecord3->Content="StartRecord"; - } - } -} -void AudioCapture::EnableButton(bool enabled, String^ name) -{ - if (name->Equals("StartDevice")) - { - btnStartDevice3->IsEnabled = enabled; - } - - else if (name->Equals("StartStopRecord")) - { - btnStartStopRecord3->IsEnabled = enabled; - } - -} - diff --git a/samples/winrt/ImageManipulations/C++/AudioCapture.xaml.h b/samples/winrt/ImageManipulations/C++/AudioCapture.xaml.h deleted file mode 100644 index b75efdc72..000000000 --- a/samples/winrt/ImageManipulations/C++/AudioCapture.xaml.h +++ /dev/null @@ -1,70 +0,0 @@ -//********************************************************* -// -// Copyright (c) Microsoft. All rights reserved. -// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF -// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY -// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR -// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. -// -//********************************************************* - -// -// AudioCapture.xaml.h -// Declaration of the AudioCapture class -// - -#pragma once - -#include "pch.h" -#include "AudioCapture.g.h" -#include "MainPage.xaml.h" - -#define AUDIO_FILE_NAME "audio.mp4" - -namespace SDKSample -{ - namespace MediaCapture - { - /// - /// An empty page that can be used on its own or navigated to within a Frame. - /// - [Windows::Foundation::Metadata::WebHostHidden] - public ref class AudioCapture sealed - { - public: - AudioCapture(); - - protected: - virtual void OnNavigatedTo(Windows::UI::Xaml::Navigation::NavigationEventArgs^ e) override; - virtual void OnNavigatedFrom(Windows::UI::Xaml::Navigation::NavigationEventArgs^ e) override; - private: - MainPage^ rootPage; - - void ScenarioInit(); - void ScenarioReset(); - - void SoundLevelChanged(Object^ sender, Object^ e); - void RecordLimitationExceeded(Windows::Media::Capture::MediaCapture ^ mediaCapture); - void Failed(Windows::Media::Capture::MediaCapture ^ mediaCapture, Windows::Media::Capture::MediaCaptureFailedEventArgs ^ args); - - void btnStartDevice_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); - - void btnStartPreview_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); - - void btnStartStopRecord_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); - - void ShowStatusMessage(Platform::String^ text); - void ShowExceptionMessage(Platform::Exception^ ex); - - void EnableButton(bool enabled, Platform::String ^name); - void SwitchRecordButtonContent(); - - Platform::Agile m_mediaCaptureMgr; - Windows::Storage::StorageFile^ m_photoStorageFile; - Windows::Storage::StorageFile^ m_recordStorageFile; - bool m_bRecording; - bool m_bSuspended; - Windows::Foundation::EventRegistrationToken m_eventRegistrationToken; - }; - } -} diff --git a/samples/winrt/ImageManipulations/C++/BasicCapture.xaml b/samples/winrt/ImageManipulations/C++/BasicCapture.xaml deleted file mode 100644 index 2cc0b0a5f..000000000 --- a/samples/winrt/ImageManipulations/C++/BasicCapture.xaml +++ /dev/null @@ -1,87 +0,0 @@ - - - - - - - - - - - - - - - - - This scenario demonstrates how 
to use the MediaCapture API to preview the camera stream, record a video, and take a picture using default initialization settings. - You can also adjust the brightness and contrast. - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - diff --git a/samples/winrt/ImageManipulations/C++/BasicCapture.xaml.cpp b/samples/winrt/ImageManipulations/C++/BasicCapture.xaml.cpp deleted file mode 100644 index f385fa9a7..000000000 --- a/samples/winrt/ImageManipulations/C++/BasicCapture.xaml.cpp +++ /dev/null @@ -1,535 +0,0 @@ -//********************************************************* -// -// Copyright (c) Microsoft. All rights reserved. -// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF -// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY -// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR -// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. -// -//********************************************************* - -// -// BasicCapture.xaml.cpp -// Implementation of the BasicCapture class -// - -#include "pch.h" -#include "BasicCapture.xaml.h" -#include "ppl.h" - -using namespace Windows::System; -using namespace Windows::Foundation; -using namespace Platform; -using namespace Windows::UI; -using namespace Windows::UI::Core; -using namespace Windows::UI::Xaml; -using namespace Windows::UI::Xaml::Controls; -using namespace Windows::UI::Xaml::Navigation; -using namespace Windows::UI::Xaml::Data; -using namespace Windows::UI::Xaml::Media; -using namespace Windows::Storage; -using namespace Windows::Media::MediaProperties; -using namespace Windows::Storage::Streams; -using namespace Windows::System; -using namespace Windows::UI::Xaml::Media::Imaging; - -using namespace SDKSample::MediaCapture; -using namespace concurrency; - - -BasicCapture::BasicCapture() -{ - InitializeComponent(); - ScenarioInit(); -} - -/// -/// Invoked when this page is about to be displayed in a Frame. -/// -/// Event data that describes how this page was reached. The Parameter -/// property is typically used to configure the page. -void BasicCapture::OnNavigatedTo(NavigationEventArgs^ e) -{ - // A pointer back to the main page. This is needed if you want to call methods in MainPage such - // as NotifyUser() - rootPage = MainPage::Current; - m_eventRegistrationToken = Windows::Media::MediaControl::SoundLevelChanged += ref new EventHandler(this, &BasicCapture::SoundLevelChanged); -} - -void BasicCapture::OnNavigatedFrom(NavigationEventArgs^ e) -{ - // A pointer back to the main page. 
This is needed if you want to call methods in MainPage such - // as NotifyUser() - - Windows::Media::MediaControl::SoundLevelChanged -= m_eventRegistrationToken; - m_currentScenarioLoaded = false; -} - - -void BasicCapture::ScenarioInit() -{ - try - { - btnStartDevice1->IsEnabled = true; - btnStartPreview1->IsEnabled = false; - btnStartStopRecord1->IsEnabled = false; - m_bRecording = false; - m_bPreviewing = false; - btnStartStopRecord1->Content = "StartRecord"; - btnTakePhoto1->IsEnabled = false; - previewElement1->Source = nullptr; - playbackElement1->Source = nullptr; - imageElement1->Source= nullptr; - sldBrightness->IsEnabled = false; - sldContrast->IsEnabled = false; - m_bSuspended = false; - previewCanvas1->Visibility = Windows::UI::Xaml::Visibility::Collapsed; - - } - catch (Exception ^e) - { - ShowExceptionMessage(e); - } - -} - -void BasicCapture::ScenarioReset() -{ - previewCanvas1->Visibility = Windows::UI::Xaml::Visibility::Collapsed; - ScenarioInit(); -} - -void BasicCapture::SoundLevelChanged(Object^ sender, Object^ e) -{ - create_task(Dispatcher->RunAsync(Windows::UI::Core::CoreDispatcherPriority::High, ref new Windows::UI::Core::DispatchedHandler([this]() - { - if(Windows::Media::MediaControl::SoundLevel != Windows::Media::SoundLevel::Muted) - { - ScenarioReset(); - } - else - { - if (m_bRecording) - { - ShowStatusMessage("Stopping Record on invisibility"); - - create_task(m_mediaCaptureMgr->StopRecordAsync()).then([this](task recordTask) - { - m_bRecording = false; - }); - } - if (m_bPreviewing) - { - ShowStatusMessage("Stopping Preview on invisibility"); - - create_task(m_mediaCaptureMgr->StopPreviewAsync()).then([this](task previewTask) - { - try - { - previewTask.get(); - m_bPreviewing = false; - } - catch (Exception ^e) - { - ShowExceptionMessage(e); - } - }); - } - } - }))); -} - -void BasicCapture::RecordLimitationExceeded(Windows::Media::Capture::MediaCapture ^currentCaptureObject) -{ - try - { - if (m_bRecording) - { - create_task(Dispatcher->RunAsync(Windows::UI::Core::CoreDispatcherPriority::High, ref new Windows::UI::Core::DispatchedHandler([this](){ - try - { - ShowStatusMessage("Stopping Record on exceeding max record duration"); - EnableButton(false, "StartStopRecord"); - create_task(m_mediaCaptureMgr->StopRecordAsync()).then([this](task recordTask) - { - try - { - recordTask.get(); - m_bRecording = false; - SwitchRecordButtonContent(); - EnableButton(true, "StartStopRecord"); - ShowStatusMessage("Stopped record on exceeding max record duration:" + m_recordStorageFile->Path); - } - catch (Exception ^e) - { - ShowExceptionMessage(e); - m_bRecording = false; - SwitchRecordButtonContent(); - EnableButton(true, "StartStopRecord"); - } - }); - - } - catch (Exception ^e) - { - m_bRecording = false; - SwitchRecordButtonContent(); - EnableButton(true, "StartStopRecord"); - ShowExceptionMessage(e); - } - - }))); - } - } - catch (Exception ^e) - { - m_bRecording = false; - SwitchRecordButtonContent(); - EnableButton(true, "StartStopRecord"); - ShowExceptionMessage(e); - } -} - -void BasicCapture::Failed(Windows::Media::Capture::MediaCapture ^currentCaptureObject, Windows::Media::Capture::MediaCaptureFailedEventArgs^ currentFailure) -{ - String ^message = "Fatal error: " + currentFailure->Message; - create_task(Dispatcher->RunAsync(Windows::UI::Core::CoreDispatcherPriority::High, - ref new Windows::UI::Core::DispatchedHandler([this, message]() - { - ShowStatusMessage(message); - }))); -} - -void BasicCapture::btnStartDevice_Click(Platform::Object^ sender, 
Windows::UI::Xaml::RoutedEventArgs^ e) -{ - try - { - EnableButton(false, "StartDevice"); - ShowStatusMessage("Starting device"); - auto mediaCapture = ref new Windows::Media::Capture::MediaCapture(); - m_mediaCaptureMgr = mediaCapture; - create_task(mediaCapture->InitializeAsync()).then([this](task initTask) - { - try - { - initTask.get(); - - auto mediaCapture = m_mediaCaptureMgr.Get(); - EnableButton(true, "StartPreview"); - EnableButton(true, "StartStopRecord"); - EnableButton(true, "TakePhoto"); - ShowStatusMessage("Device initialized successful"); - mediaCapture->RecordLimitationExceeded += ref new Windows::Media::Capture::RecordLimitationExceededEventHandler(this, &BasicCapture::RecordLimitationExceeded); - mediaCapture->Failed += ref new Windows::Media::Capture::MediaCaptureFailedEventHandler(this, &BasicCapture::Failed); - } - catch (Exception ^ e) - { - ShowExceptionMessage(e); - } - } - ); - } - catch (Platform::Exception^ e) - { - ShowExceptionMessage(e); - } -} - -void BasicCapture::btnStartPreview_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e) -{ - m_bPreviewing = false; - try - { - ShowStatusMessage("Starting preview"); - EnableButton(false, "StartPreview"); - auto mediaCapture = m_mediaCaptureMgr.Get(); - - previewCanvas1->Visibility = Windows::UI::Xaml::Visibility::Visible; - previewElement1->Source = mediaCapture; - create_task(mediaCapture->StartPreviewAsync()).then([this](task previewTask) - { - try - { - previewTask.get(); - auto mediaCapture = m_mediaCaptureMgr.Get(); - m_bPreviewing = true; - ShowStatusMessage("Start preview successful"); - if(mediaCapture->VideoDeviceController->Brightness) - { - SetupVideoDeviceControl(mediaCapture->VideoDeviceController->Brightness, sldBrightness); - } - if(mediaCapture->VideoDeviceController->Contrast) - { - SetupVideoDeviceControl(mediaCapture->VideoDeviceController->Contrast, sldContrast); - } - - }catch (Exception ^e) - { - ShowExceptionMessage(e); - } - }); - } - catch (Platform::Exception^ e) - { - m_bPreviewing = false; - previewElement1->Source = nullptr; - EnableButton(true, "StartPreview"); - ShowExceptionMessage(e); - } -} - -void BasicCapture::btnTakePhoto_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e) -{ - try - { - ShowStatusMessage("Taking photo"); - EnableButton(false, "TakePhoto"); - - task(KnownFolders::PicturesLibrary->CreateFileAsync(PHOTO_FILE_NAME, Windows::Storage::CreationCollisionOption::GenerateUniqueName)).then([this](task getFileTask) - { - try - { - this->m_photoStorageFile = getFileTask.get(); - ShowStatusMessage("Create photo file successful"); - ImageEncodingProperties^ imageProperties = ImageEncodingProperties::CreateJpeg(); - - create_task(m_mediaCaptureMgr->CapturePhotoToStorageFileAsync(imageProperties, this->m_photoStorageFile)).then([this](task photoTask) - { - try - { - photoTask.get(); - EnableButton(true, "TakePhoto"); - ShowStatusMessage("Photo taken"); - - task(this->m_photoStorageFile->OpenAsync(FileAccessMode::Read)).then([this](task getStreamTask) - { - try - { - auto photoStream = getStreamTask.get(); - ShowStatusMessage("File open successful"); - auto bmpimg = ref new BitmapImage(); - - bmpimg->SetSource(photoStream); - imageElement1->Source = bmpimg; - } - catch (Exception^ e) - { - ShowExceptionMessage(e); - EnableButton(true, "TakePhoto"); - } - }); - } - catch (Platform::Exception ^ e) - { - ShowExceptionMessage(e); - EnableButton(true, "TakePhoto"); - } - }); - } - catch (Exception^ e) - { - ShowExceptionMessage(e); - 
EnableButton(true, "TakePhoto"); - } - }); - } - catch (Platform::Exception^ e) - { - ShowExceptionMessage(e); - EnableButton(true, "TakePhoto"); - } -} - -void BasicCapture::btnStartStopRecord_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e) -{ - try - { - String ^fileName; - EnableButton(false, "StartStopRecord"); - - if (!m_bRecording) - { - ShowStatusMessage("Starting Record"); - - fileName = VIDEO_FILE_NAME; - - task(KnownFolders::VideosLibrary->CreateFileAsync(fileName,Windows::Storage::CreationCollisionOption::GenerateUniqueName )).then([this](task fileTask) - { - try - { - this->m_recordStorageFile = fileTask.get(); - ShowStatusMessage("Create record file successful"); - - MediaEncodingProfile^ recordProfile= nullptr; - recordProfile = MediaEncodingProfile::CreateMp4(Windows::Media::MediaProperties::VideoEncodingQuality::Auto); - - create_task(m_mediaCaptureMgr->StartRecordToStorageFileAsync(recordProfile, this->m_recordStorageFile)).then([this](task recordTask) - { - try - { - recordTask.get(); - m_bRecording = true; - SwitchRecordButtonContent(); - EnableButton(true, "StartStopRecord"); - - ShowStatusMessage("Start Record successful"); - } - catch (Exception ^e) - { - ShowExceptionMessage(e); - } - }); - } - catch (Exception ^e) - { - m_bRecording = false; - SwitchRecordButtonContent(); - EnableButton(true, "StartStopRecord"); - ShowExceptionMessage(e); - } - } - ); - } - else - { - ShowStatusMessage("Stopping Record"); - - create_task(m_mediaCaptureMgr->StopRecordAsync()).then([this](task recordTask) - { - try - { - recordTask.get(); - m_bRecording = false; - EnableButton(true, "StartStopRecord"); - SwitchRecordButtonContent(); - - ShowStatusMessage("Stop record successful"); - if (!m_bSuspended) - { - task(this->m_recordStorageFile->OpenAsync(FileAccessMode::Read)).then([this](task streamTask) - { - try - { - auto stream = streamTask.get(); - ShowStatusMessage("Record file opened"); - ShowStatusMessage(this->m_recordStorageFile->Path); - playbackElement1->AutoPlay = true; - playbackElement1->SetSource(stream, this->m_recordStorageFile->FileType); - playbackElement1->Play(); - } - catch (Exception ^e) - { - ShowExceptionMessage(e); - m_bRecording = false; - EnableButton(true, "StartStopRecord"); - SwitchRecordButtonContent(); - } - }); - } - } - catch (Exception ^e) - { - m_bRecording = false; - EnableButton(true, "StartStopRecord"); - SwitchRecordButtonContent(); - ShowExceptionMessage(e); - } - }); - } - } - catch (Platform::Exception^ e) - { - EnableButton(true, "StartStopRecord"); - ShowExceptionMessage(e); - SwitchRecordButtonContent(); - m_bRecording = false; - } -} - -void BasicCapture::SetupVideoDeviceControl(Windows::Media::Devices::MediaDeviceControl^ videoDeviceControl, Slider^ slider) -{ - try - { - if ((videoDeviceControl->Capabilities)->Supported) - { - slider->IsEnabled = true; - slider->Maximum = videoDeviceControl->Capabilities->Max; - slider->Minimum = videoDeviceControl->Capabilities->Min; - slider->StepFrequency = videoDeviceControl->Capabilities->Step; - double controlValue = 0; - if (videoDeviceControl->TryGetValue(&controlValue)) - { - slider->Value = controlValue; - } - } - else - { - slider->IsEnabled = false; - } - } - catch (Platform::Exception^ e) - { - ShowExceptionMessage(e); - } -} - -// VideoDeviceControllers -void BasicCapture::sldBrightness_ValueChanged(Platform::Object^ sender, Windows::UI::Xaml::Controls::Primitives::RangeBaseValueChangedEventArgs^ e) -{ - bool succeeded = 
m_mediaCaptureMgr->VideoDeviceController->Brightness->TrySetValue(sldBrightness->Value); - if (!succeeded) - { - ShowStatusMessage("Set Brightness failed"); - } -} - -void BasicCapture::sldContrast_ValueChanged(Platform::Object^ sender, Windows::UI::Xaml::Controls::Primitives::RangeBaseValueChangedEventArgs ^e) -{ - bool succeeded = m_mediaCaptureMgr->VideoDeviceController->Contrast->TrySetValue(sldContrast->Value); - if (!succeeded) - { - ShowStatusMessage("Set Contrast failed"); - } -} - -void BasicCapture::ShowStatusMessage(Platform::String^ text) -{ - rootPage->NotifyUser(text, NotifyType::StatusMessage); -} - -void BasicCapture::ShowExceptionMessage(Platform::Exception^ ex) -{ - rootPage->NotifyUser(ex->Message, NotifyType::ErrorMessage); -} - -void BasicCapture::SwitchRecordButtonContent() -{ - if (m_bRecording) - { - btnStartStopRecord1->Content="StopRecord"; - } - else - { - btnStartStopRecord1->Content="StartRecord"; - } -} -void BasicCapture::EnableButton(bool enabled, String^ name) -{ - if (name->Equals("StartDevice")) - { - btnStartDevice1->IsEnabled = enabled; - } - else if (name->Equals("StartPreview")) - { - btnStartPreview1->IsEnabled = enabled; - } - else if (name->Equals("StartStopRecord")) - { - btnStartStopRecord1->IsEnabled = enabled; - } - else if (name->Equals("TakePhoto")) - { - btnTakePhoto1->IsEnabled = enabled; - } -} - diff --git a/samples/winrt/ImageManipulations/C++/BasicCapture.xaml.h b/samples/winrt/ImageManipulations/C++/BasicCapture.xaml.h deleted file mode 100644 index 28129efb7..000000000 --- a/samples/winrt/ImageManipulations/C++/BasicCapture.xaml.h +++ /dev/null @@ -1,88 +0,0 @@ -//********************************************************* -// -// Copyright (c) Microsoft. All rights reserved. -// THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF -// ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY -// IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR -// PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT. -// -//********************************************************* - -// -// BasicCapture.xaml.h -// Declaration of the BasicCapture class -// - -#pragma once - -#include "pch.h" -#include "BasicCapture.g.h" -#include "MainPage.xaml.h" - -using namespace Windows::UI::Xaml; -using namespace Windows::UI::Xaml::Controls; -using namespace Windows::Graphics::Display; -using namespace Windows::UI::ViewManagement; -using namespace Windows::Devices::Enumeration; -#define VIDEO_FILE_NAME "video.mp4" -#define PHOTO_FILE_NAME "photo.jpg" -namespace SDKSample -{ - namespace MediaCapture - { - /// - /// An empty page that can be used on its own or navigated to within a Frame. 
- /// - [Windows::Foundation::Metadata::WebHostHidden] - public ref class BasicCapture sealed - { - public: - BasicCapture(); - - protected: - virtual void OnNavigatedTo(Windows::UI::Xaml::Navigation::NavigationEventArgs^ e) override; - virtual void OnNavigatedFrom(Windows::UI::Xaml::Navigation::NavigationEventArgs^ e) override; - - private: - MainPage^ rootPage; - void ScenarioInit(); - void ScenarioReset(); - - void Suspending(Object^ sender, Windows::ApplicationModel::SuspendingEventArgs^ e); - void Resuming(Object^ sender, Object^ e); - - void btnStartDevice_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); - void SoundLevelChanged(Object^ sender, Object^ e); - void RecordLimitationExceeded(Windows::Media::Capture::MediaCapture ^ mediaCapture); - void Failed(Windows::Media::Capture::MediaCapture ^ mediaCapture, Windows::Media::Capture::MediaCaptureFailedEventArgs ^ args); - - - void btnStartPreview_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); - - void btnStartStopRecord_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); - - void btnTakePhoto_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e); - - void SetupVideoDeviceControl(Windows::Media::Devices::MediaDeviceControl^ videoDeviceControl, Slider^ slider); - void sldBrightness_ValueChanged(Platform::Object^ sender, Windows::UI::Xaml::Controls::Primitives::RangeBaseValueChangedEventArgs^ e); - void sldContrast_ValueChanged(Platform::Object^ sender, Windows::UI::Xaml::Controls::Primitives::RangeBaseValueChangedEventArgs^ e); - - void ShowStatusMessage(Platform::String^ text); - void ShowExceptionMessage(Platform::Exception^ ex); - - void EnableButton(bool enabled, Platform::String ^name); - void SwitchRecordButtonContent(); - - Platform::Agile m_mediaCaptureMgr; - Windows::Storage::StorageFile^ m_photoStorageFile; - Windows::Storage::StorageFile^ m_recordStorageFile; - bool m_bRecording; - bool m_bEffectAdded; - bool m_bSuspended; - bool m_bPreviewing; - Windows::UI::Xaml::WindowVisibilityChangedEventHandler ^m_visbilityHandler; - Windows::Foundation::EventRegistrationToken m_eventRegistrationToken; - bool m_currentScenarioLoaded; - }; - } -} diff --git a/samples/winrt/ImageManipulations/C++/Constants.cpp b/samples/winrt/ImageManipulations/C++/Constants.cpp index 873b98381..a26634272 100644 --- a/samples/winrt/ImageManipulations/C++/Constants.cpp +++ b/samples/winrt/ImageManipulations/C++/Constants.cpp @@ -18,7 +18,5 @@ Platform::Array^ MainPage::scenariosInner = ref new Platform::Array - + @@ -47,7 +47,7 @@ - + @@ -92,7 +92,7 @@ - + diff --git a/samples/winrt/ImageManipulations/C++/MediaCapture.vcxproj b/samples/winrt/ImageManipulations/C++/MediaCapture.vcxproj index d2f255d1b..84b6d3df5 100644 --- a/samples/winrt/ImageManipulations/C++/MediaCapture.vcxproj +++ b/samples/winrt/ImageManipulations/C++/MediaCapture.vcxproj @@ -101,14 +101,6 @@ AdvancedCapture.xaml Code - - AudioCapture.xaml - Code - - - BasicCapture.xaml - Code - MainPage.xaml @@ -127,12 +119,6 @@ Designer - - Designer - - - Designer - Designer @@ -156,14 +142,6 @@ App.xaml - - AudioCapture.xaml - Code - - - BasicCapture.xaml - Code - @@ -194,6 +172,144 @@ {ba69218f-da5c-4d14-a78d-21a9e4dec669} + + + true + true + true + true + true + true + + + true + true + true + true + true + true + + + true + true + true + true + true + true + + + true + true + true + true + true + true + + + true + true + true + true + true + true + + + true + true + true + true + true + true + + + true + true + 
true + true + true + true + + + true + true + true + true + true + true + + + true + true + true + true + true + true + + + true + true + true + true + true + true + + + true + true + true + true + true + true + + + true + true + true + true + true + true + + + true + true + true + true + true + true + + + true + true + true + true + true + true + + + true + true + true + true + true + true + + + true + true + true + true + true + true + + + true + true + true + true + true + true + + diff --git a/samples/winrt/ImageManipulations/C++/MediaCapture.vcxproj.filters b/samples/winrt/ImageManipulations/C++/MediaCapture.vcxproj.filters index 5f6124c2b..403e5ea1f 100644 --- a/samples/winrt/ImageManipulations/C++/MediaCapture.vcxproj.filters +++ b/samples/winrt/ImageManipulations/C++/MediaCapture.vcxproj.filters @@ -40,9 +40,7 @@ Sample-Utils - - @@ -56,8 +54,6 @@ - - @@ -71,8 +67,6 @@ - - @@ -85,4 +79,23 @@ {54f287f8-e4cb-4f47-97d0-4c469de6992e} + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.cpp b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.cpp index 687386ece..d41fa3481 100644 --- a/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.cpp +++ b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.cpp @@ -8,6 +8,9 @@ #include "Grayscale.h" #include "bufferlock.h" +#include "opencv2\core\core.hpp" +#include "opencv2\imgproc\imgproc.hpp" + #pragma comment(lib, "d2d1") using namespace Microsoft::WRL; @@ -80,16 +83,12 @@ NOTES ON THE MFT IMPLEMENTATION // Video FOURCC codes. -const DWORD FOURCC_YUY2 = '2YUY'; -const DWORD FOURCC_UYVY = 'YVYU'; const DWORD FOURCC_NV12 = '21VN'; // Static array of media types (preferred and accepted). const GUID g_MediaSubtypes[] = { - MFVideoFormat_NV12, - MFVideoFormat_YUY2, - MFVideoFormat_UYVY + MFVideoFormat_NV12 }; HRESULT GetImageSize(DWORD fcc, UINT32 width, UINT32 height, DWORD* pcbImage); @@ -102,27 +101,6 @@ inline T clamp(const T& val, const T& minVal, const T& maxVal) return (val < minVal ? minVal : (val > maxVal ? maxVal : val)); } - -// TransformChroma: -// Apply the transforms to calculate the output chroma values. - -void TransformChroma(const D2D1::Matrix3x2F& mat, BYTE *pu, BYTE *pv) -{ - // Normalize the chroma values to [-112, 112] range - - D2D1_POINT_2F pt = { static_cast(*pu) - 128, static_cast(*pv) - 128 }; - - pt = mat.TransformPoint(pt); - - // Clamp to valid range. - clamp(pt.x, -112.0f, 112.0f); - clamp(pt.y, -112.0f, 112.0f); - - // Map back to [16...240] range. - *pu = static_cast(pt.x + 128.0f); - *pv = static_cast(pt.y + 128.0f); -} - //------------------------------------------------------------------- // Functions to convert a YUV images to grayscale. // @@ -141,144 +119,6 @@ void TransformChroma(const D2D1::Matrix3x2F& mat, BYTE *pu, BYTE *pv) // dwHeightInPixels Frame height, in pixels. //------------------------------------------------------------------- -// Convert UYVY image. - -void TransformImage_UYVY( - const D2D1::Matrix3x2F& mat, - const D2D_RECT_U& rcDest, - _Inout_updates_(_Inexpressible_(lDestStride * dwHeightInPixels)) BYTE *pDest, - _In_ LONG lDestStride, - _In_reads_(_Inexpressible_(lSrcStride * dwHeightInPixels)) const BYTE* pSrc, - _In_ LONG lSrcStride, - _In_ DWORD dwWidthInPixels, - _In_ DWORD dwHeightInPixels) -{ - DWORD y = 0; - const DWORD y0 = min(rcDest.bottom, dwHeightInPixels); - - // Lines above the destination rectangle. 
- for ( ; y < rcDest.top; y++) - { - memcpy(pDest, pSrc, dwWidthInPixels * 2); - pSrc += lSrcStride; - pDest += lDestStride; - } - - // Lines within the destination rectangle. - for ( ; y < y0; y++) - { - WORD *pSrc_Pixel = (WORD*)pSrc; - WORD *pDest_Pixel = (WORD*)pDest; - - for (DWORD x = 0; (x + 1) < dwWidthInPixels; x += 2) - { - // Byte order is U0 Y0 V0 Y1 - // Each WORD is a byte pair (U/V, Y) - // Windows is little-endian so the order appears reversed. - - if (x >= rcDest.left && x < rcDest.right) - { - BYTE u = pSrc_Pixel[x] & 0x00FF; - BYTE v = pSrc_Pixel[x+1] & 0x00FF; - - TransformChroma(mat, &u, &v); - - pDest_Pixel[x] = (pSrc_Pixel[x] & 0xFF00) | u; - pDest_Pixel[x+1] = (pSrc_Pixel[x+1] & 0xFF00) | v; - } - else - { -#pragma warning(push) -#pragma warning(disable: 6385) -#pragma warning(disable: 6386) - pDest_Pixel[x] = pSrc_Pixel[x]; - pDest_Pixel[x+1] = pSrc_Pixel[x+1]; -#pragma warning(pop) - } - } - - pDest += lDestStride; - pSrc += lSrcStride; - } - - // Lines below the destination rectangle. - for ( ; y < dwHeightInPixels; y++) - { - memcpy(pDest, pSrc, dwWidthInPixels * 2); - pSrc += lSrcStride; - pDest += lDestStride; - } -} - - -// Convert YUY2 image. - -void TransformImage_YUY2( - const D2D1::Matrix3x2F& mat, - const D2D_RECT_U& rcDest, - _Inout_updates_(_Inexpressible_(lDestStride * dwHeightInPixels)) BYTE *pDest, - _In_ LONG lDestStride, - _In_reads_(_Inexpressible_(lSrcStride * dwHeightInPixels)) const BYTE* pSrc, - _In_ LONG lSrcStride, - _In_ DWORD dwWidthInPixels, - _In_ DWORD dwHeightInPixels) -{ - DWORD y = 0; - const DWORD y0 = min(rcDest.bottom, dwHeightInPixels); - - // Lines above the destination rectangle. - for ( ; y < rcDest.top; y++) - { - memcpy(pDest, pSrc, dwWidthInPixels * 2); - pSrc += lSrcStride; - pDest += lDestStride; - } - - // Lines within the destination rectangle. - for ( ; y < y0; y++) - { - WORD *pSrc_Pixel = (WORD*)pSrc; - WORD *pDest_Pixel = (WORD*)pDest; - - for (DWORD x = 0; (x + 1) < dwWidthInPixels; x += 2) - { - // Byte order is Y0 U0 Y1 V0 - // Each WORD is a byte pair (Y, U/V) - // Windows is little-endian so the order appears reversed. - - if (x >= rcDest.left && x < rcDest.right) - { - BYTE u = pSrc_Pixel[x] >> 8; - BYTE v = pSrc_Pixel[x+1] >> 8; - - TransformChroma(mat, &u, &v); - - pDest_Pixel[x] = (pSrc_Pixel[x] & 0x00FF) | (u<<8); - pDest_Pixel[x+1] = (pSrc_Pixel[x+1] & 0x00FF) | (v<<8); - } - else - { -#pragma warning(push) -#pragma warning(disable: 6385) -#pragma warning(disable: 6386) - pDest_Pixel[x] = pSrc_Pixel[x]; - pDest_Pixel[x+1] = pSrc_Pixel[x+1]; -#pragma warning(pop) - } - } - pDest += lDestStride; - pSrc += lSrcStride; - } - - // Lines below the destination rectangle. - for ( ; y < dwHeightInPixels; y++) - { - memcpy(pDest, pSrc, dwWidthInPixels * 2); - pSrc += lSrcStride; - pDest += lDestStride; - } -} - // Convert NV12 image void TransformImage_NV12( @@ -307,7 +147,8 @@ void TransformImage_NV12( // Lines above the destination rectangle. DWORD y = 0; - const DWORD y0 = min(rcDest.bottom, dwHeightInPixels); + + const DWORD y0 = rcDest.bottom < dwHeightInPixels ? 
rcDest.bottom : dwHeightInPixels; for ( ; y < rcDest.top/2; y++) { @@ -323,13 +164,8 @@ void TransformImage_NV12( { if (x >= rcDest.left && x < rcDest.right) { - BYTE u = pSrc[x]; - BYTE v = pSrc[x+1]; - - TransformChroma(mat, &u, &v); - - pDest[x] = u; - pDest[x+1] = v; + pDest[x] = 0; + pDest[x+1] = 0; } else { @@ -351,9 +187,9 @@ void TransformImage_NV12( } CGrayscale::CGrayscale() : - m_pSample(NULL), m_pInputType(NULL), m_pOutputType(NULL), m_pTransformFn(NULL), + m_pSample(NULL), m_pInputType(NULL), m_pOutputType(NULL), m_imageWidthInPixels(0), m_imageHeightInPixels(0), m_cbImageSize(0), - m_transform(D2D1::Matrix3x2F::Identity()), m_rcDest(D2D1::RectU()), m_bStreamingInitialized(false), + m_TransformType(Preview), m_rcDest(D2D1::RectU()), m_bStreamingInitialized(false), m_pAttributes(NULL) { InitializeCriticalSectionEx(&m_critSec, 3000, 0); @@ -1516,10 +1352,8 @@ HRESULT CGrayscale::BeginStreaming() // Get the chroma transformations. - float scale = (float)MFGetAttributeDouble(m_pAttributes, MFT_GRAYSCALE_SATURATION, 0.0f); - float angle = (float)MFGetAttributeDouble(m_pAttributes, MFT_GRAYSCALE_CHROMA_ROTATION, 0.0f); - - m_transform = D2D1::Matrix3x2F::Scale(scale, scale) * D2D1::Matrix3x2F::Rotation(angle); + // float scale = (float)MFGetAttributeDouble(m_pAttributes, MFT_GRAYSCALE_SATURATION, 0.0f); + // float angle = (float)MFGetAttributeDouble(m_pAttributes, MFT_GRAYSCALE_CHROMA_ROTATION, 0.0f); m_bStreamingInitialized = true; } @@ -1563,42 +1397,36 @@ HRESULT CGrayscale::OnProcessOutput(IMFMediaBuffer *pIn, IMFMediaBuffer *pOut) HRESULT hr = GetDefaultStride(m_pInputType, &lDefaultStride); if (FAILED(hr)) { - goto done; + return hr; } // Lock the input buffer. hr = inputLock.LockBuffer(lDefaultStride, m_imageHeightInPixels, &pSrc, &lSrcStride); if (FAILED(hr)) { - goto done; + return hr; } // Lock the output buffer. hr = outputLock.LockBuffer(lDefaultStride, m_imageHeightInPixels, &pDest, &lDestStride); if (FAILED(hr)) { - goto done; - } - - // Invoke the image transform function. - assert (m_pTransformFn != NULL); - if (m_pTransformFn) - { - (*m_pTransformFn)(m_transform, m_rcDest, pDest, lDestStride, pSrc, lSrcStride, - m_imageWidthInPixels, m_imageHeightInPixels); - } - else - { - hr = E_UNEXPECTED; - goto done; + return hr; } + //(*m_pTransformFn)(m_transform, m_rcDest, pDest, lDestStride, pSrc, lSrcStride, + // m_imageWidthInPixels, m_imageHeightInPixels); + cv::Mat InputFrame(m_imageHeightInPixels + m_imageHeightInPixels/2, m_imageWidthInPixels, CV_8UC1, pSrc, lSrcStride); + cv::Mat InputGreyScale(InputFrame, cv::Range(0, m_imageHeightInPixels), cv::Range(0, m_imageWidthInPixels)); + cv::Mat OutputFrame(m_imageHeightInPixels + m_imageHeightInPixels/2, m_imageWidthInPixels, CV_8UC1, pDest, lDestStride); + OutputFrame.setTo(cv::Scalar(128)); + cv::Mat OutputGreyScale(OutputFrame, cv::Range(0, m_imageHeightInPixels), cv::Range(0, m_imageWidthInPixels)); + cv::Canny(InputGreyScale, OutputGreyScale, 80, 90); + // Set the data size on the output buffer. hr = pOut->SetCurrentLength(m_cbImageSize); - // The VideoBufferLock class automatically unlocks the buffers. 
-done: return hr; } @@ -1626,8 +1454,6 @@ HRESULT CGrayscale::UpdateFormatInfo() m_imageHeightInPixels = 0; m_cbImageSize = 0; - m_pTransformFn = NULL; - if (m_pInputType != NULL) { hr = m_pInputType->GetGUID(MF_MT_SUBTYPE, &subtype); @@ -1635,19 +1461,7 @@ HRESULT CGrayscale::UpdateFormatInfo() { goto done; } - if (subtype == MFVideoFormat_YUY2) - { - m_pTransformFn = TransformImage_YUY2; - } - else if (subtype == MFVideoFormat_UYVY) - { - m_pTransformFn = TransformImage_UYVY; - } - else if (subtype == MFVideoFormat_NV12) - { - m_pTransformFn = TransformImage_NV12; - } - else + if (subtype != MFVideoFormat_NV12) { hr = E_UNEXPECTED; goto done; @@ -1678,20 +1492,6 @@ HRESULT GetImageSize(DWORD fcc, UINT32 width, UINT32 height, DWORD* pcbImage) switch (fcc) { - case FOURCC_YUY2: - case FOURCC_UYVY: - // check overflow - if ((width > MAXDWORD / 2) || (width * 2 > MAXDWORD / height)) - { - hr = E_INVALIDARG; - } - else - { - // 16 bpp - *pcbImage = width * height * 2; - } - break; - case FOURCC_NV12: // check overflow if ((height/2 > MAXDWORD - height) || ((height + height/2) > MAXDWORD / width)) diff --git a/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.h b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.h index b83223bce..96b4b1b96 100644 --- a/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.h +++ b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.h @@ -42,15 +42,14 @@ DEFINE_GUID(CLSID_GrayscaleMFT, DEFINE_GUID(MFT_GRAYSCALE_DESTINATION_RECT, 0x7bbbb051, 0x133b, 0x41f5, 0xb6, 0xaa, 0x5a, 0xff, 0x9b, 0x33, 0xa2, 0xcb); - -// {14782342-93E8-4565-872C-D9A2973D5CBF} -DEFINE_GUID(MFT_GRAYSCALE_SATURATION, -0x14782342, 0x93e8, 0x4565, 0x87, 0x2c, 0xd9, 0xa2, 0x97, 0x3d, 0x5c, 0xbf); - -// {E0BADE5D-E4B9-4689-9DBA-E2F00D9CED0E} -DEFINE_GUID(MFT_GRAYSCALE_CHROMA_ROTATION, -0xe0bade5d, 0xe4b9, 0x4689, 0x9d, 0xba, 0xe2, 0xf0, 0xd, 0x9c, 0xed, 0xe); - +enum ProcessingType +{ + Preview, + GrayScale, + Canny, + Zoom, + Sepia +}; template void SafeRelease(T **ppT) { @@ -61,18 +60,6 @@ template void SafeRelease(T **ppT) } } -// Function pointer for the function that transforms the image. -typedef void (*IMAGE_TRANSFORM_FN)( - const D2D1::Matrix3x2F& mat, // Chroma transform matrix. - const D2D_RECT_U& rcDest, // Destination rectangle for the transformation. - BYTE* pDest, // Destination buffer. - LONG lDestStride, // Destination stride. - const BYTE* pSrc, // Source buffer. - LONG lSrcStride, // Source stride. - DWORD dwWidthInPixels, // Image width in pixels. - DWORD dwHeightInPixels // Image height in pixels. - ); - // CGrayscale class: // Implements a grayscale video effect. @@ -244,7 +231,7 @@ private: CRITICAL_SECTION m_critSec; // Transformation parameters - D2D1::Matrix3x2F m_transform; // Chroma transform matrix. + ProcessingType m_TransformType; D2D_RECT_U m_rcDest; // Destination rectangle for the effect. // Streaming @@ -259,8 +246,5 @@ private: DWORD m_cbImageSize; // Image size, in bytes. IMFAttributes *m_pAttributes; - - // Image transform function. (Changes based on the media type.) 
- IMAGE_TRANSFORM_FN m_pTransformFn; }; #endif \ No newline at end of file diff --git a/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.vcxproj b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.vcxproj index 8af8f2c7d..c7e905936 100644 --- a/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.vcxproj +++ b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.vcxproj @@ -123,13 +123,14 @@ $(WindowsSDK_WindowsMetadata);$(AdditionalUsingDirectories) false - $(ProjectDir);$(IntermediateOutputPath);%(AdditionalIncludeDirectories);$(ProjectDir)\..\Common; + $(OPENCV_DIR)\include;$(ProjectDir);$(IntermediateOutputPath);%(AdditionalIncludeDirectories);$(ProjectDir)\..\Common Console - runtimeobject.lib;%(AdditionalDependencies);mf.lib;mfuuid.lib;mfplat.lib + runtimeobject.lib;%(AdditionalDependencies);mf.lib;mfuuid.lib;mfplat.lib;opencv_core245.lib;opencv_imgproc245.lib false Grayscale.def + $(OPENCV_DIR)\lib;%(AdditionalLibraryDirectories) mdmerge -metadata_dir "$(WindowsSDK_MetadataPath)" -o "$(ProjectDir)$(Configuration)\$(MSBuildProjectName)" -i "$(MSBuildProjectDirectory)" -v -partial @@ -146,13 +147,14 @@ $(WindowsSDK_WindowsMetadata);$(AdditionalUsingDirectories) false - $(ProjectDir);$(IntermediateOutputPath);%(AdditionalIncludeDirectories);$(ProjectDir)\..\Common; + $(OPENCV_DIR)\include;$(ProjectDir);$(IntermediateOutputPath);%(AdditionalIncludeDirectories);$(ProjectDir)\..\Common Console runtimeobject.lib;%(AdditionalDependencies);mf.lib;mfuuid.lib;mfplat.lib false Grayscale.def + $(OPENCV_DIR)\lib;%(AdditionalLibraryDirectories) mdmerge -metadata_dir "$(WindowsSDK_MetadataPath)" -o "$(SolutionDir)$(Platform)\$(Configuration)\$(MSBuildProjectName)" -i "$(MSBuildProjectDirectory)" -v -partial @@ -169,13 +171,14 @@ $(WindowsSDK_WindowsMetadata);$(AdditionalUsingDirectories) false - $(ProjectDir);$(IntermediateOutputPath);%(AdditionalIncludeDirectories);$(ProjectDir)\..\Common; + $(OPENCV_DIR)\include;$(ProjectDir);$(IntermediateOutputPath);%(AdditionalIncludeDirectories);$(ProjectDir)\..\Common Console runtimeobject.lib;%(AdditionalDependencies);mf.lib;mfuuid.lib;mfplat.lib false Grayscale.def + $(OPENCV_DIR)\lib;%(AdditionalLibraryDirectories) mdmerge -metadata_dir "$(WindowsSDK_MetadataPath)" -o "$(SolutionDir)$(Platform)\$(Configuration)\$(MSBuildProjectName)" -i "$(MSBuildProjectDirectory)" -v -partial @@ -192,13 +195,14 @@ $(WindowsSDK_WindowsMetadata);$(AdditionalUsingDirectories) false - $(ProjectDir);$(IntermediateOutputPath);%(AdditionalIncludeDirectories);$(ProjectDir)\..\Common; + $(OPENCV_DIR)\include;$(ProjectDir);$(IntermediateOutputPath);%(AdditionalIncludeDirectories);$(ProjectDir)\..\Common Console runtimeobject.lib;%(AdditionalDependencies);mf.lib;mfuuid.lib;mfplat.lib false Grayscale.def + $(OPENCV_DIR)\lib;%(AdditionalLibraryDirectories) mdmerge -metadata_dir "$(WindowsSDK_MetadataPath)" -o "$(ProjectDir)$(Configuration)\$(MSBuildProjectName)" -i "$(MSBuildProjectDirectory)" -v -partial @@ -215,13 +219,14 @@ $(WindowsSDK_WindowsMetadata);$(AdditionalUsingDirectories) false - $(ProjectDir);$(IntermediateOutputPath);%(AdditionalIncludeDirectories);$(ProjectDir)\..\Common; + $(OPENCV_DIR)\include;$(ProjectDir);$(IntermediateOutputPath);%(AdditionalIncludeDirectories);$(ProjectDir)\..\Common Console runtimeobject.lib;%(AdditionalDependencies);mf.lib;mfuuid.lib;mfplat.lib false Grayscale.def + $(OPENCV_DIR)\lib;%(AdditionalLibraryDirectories) mdmerge -metadata_dir 
"$(WindowsSDK_MetadataPath)" -o "$(SolutionDir)$(Platform)\$(Configuration)\$(MSBuildProjectName)" -i "$(MSBuildProjectDirectory)" -v -partial @@ -238,13 +243,14 @@ $(WindowsSDK_WindowsMetadata);$(AdditionalUsingDirectories) false - $(ProjectDir);$(IntermediateOutputPath);%(AdditionalIncludeDirectories);$(ProjectDir)\..\Common; + $(OPENCV_DIR)\include;$(ProjectDir);$(IntermediateOutputPath);%(AdditionalIncludeDirectories);$(ProjectDir)\..\Common Console runtimeobject.lib;%(AdditionalDependencies);mf.lib;mfuuid.lib;mfplat.lib false Grayscale.def + $(OPENCV_DIR)\lib;%(AdditionalLibraryDirectories) mdmerge -metadata_dir "$(WindowsSDK_MetadataPath)" -o "$(SolutionDir)$(Platform)\$(Configuration)\$(MSBuildProjectName)" -i "$(MSBuildProjectDirectory)" -v -partial From de9f659f1e37d798ff22c8b9a1c3a13f68bdde12 Mon Sep 17 00:00:00 2001 From: Alexander Smorkalov Date: Mon, 10 Jun 2013 11:48:53 -0700 Subject: [PATCH 33/75] Several transforms added to sample IMFTransform. --- .../C++/AdvancedCapture.xaml | 3 +- .../C++/AdvancedCapture.xaml.cpp | 45 +++--- .../C++/AdvancedCapture.xaml.h | 1 + .../ImageManipulations/C++/MainPage.xaml | 12 +- .../MediaExtensions/Grayscale/Grayscale.cpp | 153 ++++++++++++------ .../C++/MediaExtensions/Grayscale/Grayscale.h | 22 +-- 6 files changed, 146 insertions(+), 90 deletions(-) diff --git a/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml b/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml index 4e6ebfd30..e6266adb9 100644 --- a/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml +++ b/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml @@ -40,7 +40,8 @@ - + + diff --git a/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml.cpp b/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml.cpp index dc59acc2e..15890ba6b 100644 --- a/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml.cpp +++ b/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml.cpp @@ -122,7 +122,7 @@ void AdvancedCapture::ScenarioReset() void AdvancedCapture::SoundLevelChanged(Object^ sender, Object^ e) { create_task(Dispatcher->RunAsync(Windows::UI::Core::CoreDispatcherPriority::High, ref new Windows::UI::Core::DispatchedHandler([this]() - { + { if(Windows::Media::MediaControl::SoundLevel != Windows::Media::SoundLevel::Muted) { ScenarioReset(); @@ -220,7 +220,7 @@ void AdvancedCapture::RecordLimitationExceeded(Windows::Media::Capture::MediaCap void AdvancedCapture::Failed(Windows::Media::Capture::MediaCapture ^currentCaptureObject, Windows::Media::Capture::MediaCaptureFailedEventArgs^ currentFailure) { String ^message = "Fatal error" + currentFailure->Message; - create_task(Dispatcher->RunAsync(Windows::UI::Core::CoreDispatcherPriority::High, + create_task(Dispatcher->RunAsync(Windows::UI::Core::CoreDispatcherPriority::High, ref new Windows::UI::Core::DispatchedHandler([this, message]() { ShowStatusMessage(message); @@ -325,7 +325,7 @@ void AdvancedCapture::btnTakePhoto_Click(Platform::Object^ sender, Windows::UI:: EnableButton(false, "TakePhoto"); auto currentRotation = GetCurrentPhotoRotation(); - task(KnownFolders::PicturesLibrary->CreateFileAsync(TEMP_PHOTO_FILE_NAME, Windows::Storage::CreationCollisionOption::GenerateUniqueName)).then([this, currentRotation](task getFileTask) + task(KnownFolders::PicturesLibrary->CreateFileAsync(TEMP_PHOTO_FILE_NAME, Windows::Storage::CreationCollisionOption::GenerateUniqueName)).then([this, currentRotation](task getFileTask) { try { @@ -520,7 +520,7 @@ void 
AdvancedCapture::lstEnumedDevices_SelectionChanged(Platform::Object^ sender } }); } - + btnStartDevice2->IsEnabled = true; btnStartPreview2->IsEnabled = false; btnStartStopRecord2->IsEnabled = false; @@ -581,12 +581,12 @@ void AdvancedCapture::EnumerateWebcamsAsync() } void AdvancedCapture::AddEffectToImageStream() -{ +{ auto mediaCapture = m_mediaCaptureMgr.Get(); Windows::Media::Capture::VideoDeviceCharacteristic charecteristic = mediaCapture->MediaCaptureSettings->VideoDeviceCharacteristic; if((charecteristic != Windows::Media::Capture::VideoDeviceCharacteristic::AllStreamsIdentical) && - (charecteristic != Windows::Media::Capture::VideoDeviceCharacteristic::PreviewPhotoStreamsIdentical) && + (charecteristic != Windows::Media::Capture::VideoDeviceCharacteristic::PreviewPhotoStreamsIdentical) && (charecteristic != Windows::Media::Capture::VideoDeviceCharacteristic::RecordPhotoStreamsIdentical)) { Windows::Media::MediaProperties::IMediaEncodingProperties ^props = mediaCapture->VideoDeviceController->GetMediaStreamProperties(Windows::Media::Capture::MediaStreamType::Photo); @@ -596,13 +596,13 @@ void AdvancedCapture::AddEffectToImageStream() Windows::Foundation::Collections::IVectorView^ supportedPropsList = mediaCapture->VideoDeviceController->GetAvailableMediaStreamProperties(Windows::Media::Capture::MediaStreamType::Photo); { unsigned int i = 0; - while (i< supportedPropsList->Size) + while (i < supportedPropsList->Size) { Windows::Media::MediaProperties::IMediaEncodingProperties^ props = supportedPropsList->GetAt(i); String^ s = props->Type; if(props->Type->Equals("Video")) - { + { task(mediaCapture->VideoDeviceController->SetMediaStreamPropertiesAsync(Windows::Media::Capture::MediaStreamType::Photo,props)).then([this](task changeTypeTask) { try @@ -616,7 +616,7 @@ void AdvancedCapture::AddEffectToImageStream() { effectTask3.get(); m_bEffectAddedToPhoto = true; - ShowStatusMessage("Adding effect to photo stream successful"); + ShowStatusMessage("Adding effect to photo stream successful"); chkAddRemoveEffect->IsEnabled = true; } @@ -633,8 +633,7 @@ void AdvancedCapture::AddEffectToImageStream() { ShowExceptionMessage(e); chkAddRemoveEffect->IsEnabled = true; - chkAddRemoveEffect->IsChecked = false; - + chkAddRemoveEffect->IsChecked = false; } }); @@ -686,8 +685,8 @@ void AdvancedCapture::chkAddRemoveEffect_Checked(Platform::Object^ sender, Windo auto mediaCapture = m_mediaCaptureMgr.Get(); Windows::Media::Capture::VideoDeviceCharacteristic charecteristic = mediaCapture->MediaCaptureSettings->VideoDeviceCharacteristic; - ShowStatusMessage("Add effect successful to preview stream successful"); - if((charecteristic != Windows::Media::Capture::VideoDeviceCharacteristic::AllStreamsIdentical) && + ShowStatusMessage("Add effect successful to preview stream successful"); + if((charecteristic != Windows::Media::Capture::VideoDeviceCharacteristic::AllStreamsIdentical) && (charecteristic != Windows::Media::Capture::VideoDeviceCharacteristic::PreviewRecordStreamsIdentical)) { Windows::Media::MediaProperties::IMediaEncodingProperties ^props = mediaCapture->VideoDeviceController->GetMediaStreamProperties(Windows::Media::Capture::MediaStreamType::VideoRecord); @@ -703,14 +702,14 @@ void AdvancedCapture::chkAddRemoveEffect_Checked(Platform::Object^ sender, Windo m_bEffectAddedToRecord = true; AddEffectToImageStream(); chkAddRemoveEffect->IsEnabled = true; - } + } catch(Exception ^e) { ShowExceptionMessage(e); chkAddRemoveEffect->IsEnabled = true; chkAddRemoveEffect->IsChecked = false; } - }); + }); } 
else { @@ -718,7 +717,7 @@ void AdvancedCapture::chkAddRemoveEffect_Checked(Platform::Object^ sender, Windo chkAddRemoveEffect->IsEnabled = true; } - } + } else { AddEffectToImageStream(); @@ -777,7 +776,7 @@ void AdvancedCapture::chkAddRemoveEffect_Unchecked(Platform::Object^ sender, Win { ShowExceptionMessage(e); chkAddRemoveEffect->IsEnabled = true; - chkAddRemoveEffect->IsChecked = true; + chkAddRemoveEffect->IsChecked = true; } }); @@ -791,7 +790,7 @@ void AdvancedCapture::chkAddRemoveEffect_Unchecked(Platform::Object^ sender, Win { ShowExceptionMessage(e); chkAddRemoveEffect->IsEnabled = true; - chkAddRemoveEffect->IsChecked = true; + chkAddRemoveEffect->IsChecked = true; } @@ -813,7 +812,7 @@ void AdvancedCapture::chkAddRemoveEffect_Unchecked(Platform::Object^ sender, Win { ShowExceptionMessage(e); chkAddRemoveEffect->IsEnabled = true; - chkAddRemoveEffect->IsChecked = true; + chkAddRemoveEffect->IsChecked = true; } }); @@ -821,7 +820,7 @@ void AdvancedCapture::chkAddRemoveEffect_Unchecked(Platform::Object^ sender, Win else { chkAddRemoveEffect->IsEnabled = true; - chkAddRemoveEffect->IsChecked = true; + chkAddRemoveEffect->IsChecked = true; } } catch (Exception ^e) @@ -1032,3 +1031,9 @@ Windows::Media::Capture::VideoRotation AdvancedCapture::VideoRotationLookup( } } + + +void SDKSample::MediaCapture::AdvancedCapture::EffectType_SelectionChanged(Platform::Object^ sender, Windows::UI::Xaml::Controls::SelectionChangedEventArgs^ e) +{ + +} diff --git a/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml.h b/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml.h index 83556b95e..4784900d7 100644 --- a/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml.h +++ b/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml.h @@ -98,6 +98,7 @@ namespace SDKSample bool m_bRotateVideoOnOrientationChange; bool m_bReversePreviewRotation; Windows::Foundation::EventRegistrationToken m_orientationChangedEventToken; + void EffectType_SelectionChanged(Platform::Object^ sender, Windows::UI::Xaml::Controls::SelectionChangedEventArgs^ e); }; } } diff --git a/samples/winrt/ImageManipulations/C++/MainPage.xaml b/samples/winrt/ImageManipulations/C++/MainPage.xaml index 82d5494b6..e0ed0d79c 100644 --- a/samples/winrt/ImageManipulations/C++/MainPage.xaml +++ b/samples/winrt/ImageManipulations/C++/MainPage.xaml @@ -116,17 +116,7 @@ - - - - - - - - - + diff --git a/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.cpp b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.cpp index d41fa3481..e853d4627 100644 --- a/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.cpp +++ b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.cpp @@ -30,9 +30,9 @@ MFT_GRAYSCALE_DESTINATION_RECT (type = blob, UINT32[4] array) MFT_GRAYSCALE_SATURATION (type = double) - Sets the saturation level. The nominal range is [0...1]. Values beyond 1.0f + Sets the saturation level. The nominal range is [0...1]. Values beyond 1.0f result in supersaturated colors. Values below 0.0f create inverted colors. - + MFT_GRAYSCALE_CHROMA_ROTATION (type = double) Rotates the chroma values of each pixel. The attribue value is the angle of @@ -45,7 +45,7 @@ as a scaling transform. NOTES ON THE MFT IMPLEMENTATION -1. The MFT has fixed streams: One input stream and one output stream. +1. The MFT has fixed streams: One input stream and one output stream. 2. The MFT supports the following formats: UYVY, YUY2, NV12. 
@@ -56,34 +56,34 @@ NOTES ON THE MFT IMPLEMENTATION 5. If both types are set, no type can be set until the current type is cleared. 6. Preferred input types: - + (a) If the output type is set, that's the preferred type. - (b) Otherwise, the preferred types are partial types, constructed from the + (b) Otherwise, the preferred types are partial types, constructed from the list of supported subtypes. - + 7. Preferred output types: As above. -8. Streaming: - - The private BeingStreaming() method is called in response to the - MFT_MESSAGE_NOTIFY_BEGIN_STREAMING message. +8. Streaming: + + The private BeingStreaming() method is called in response to the + MFT_MESSAGE_NOTIFY_BEGIN_STREAMING message. If the client does not send MFT_MESSAGE_NOTIFY_BEGIN_STREAMING, the MFT calls - BeginStreaming inside the first call to ProcessInput or ProcessOutput. + BeginStreaming inside the first call to ProcessInput or ProcessOutput. This is a good approach for allocating resources that your MFT requires for - streaming. - -9. The configuration attributes are applied in the BeginStreaming method. If the - client changes the attributes during streaming, the change is ignored until - streaming is stopped (either by changing the media types or by sending the + streaming. + +9. The configuration attributes are applied in the BeginStreaming method. If the + client changes the attributes during streaming, the change is ignored until + streaming is stopped (either by changing the media types or by sending the MFT_MESSAGE_NOTIFY_END_STREAMING message) and then restarted. - + */ // Video FOURCC codes. -const DWORD FOURCC_NV12 = '21VN'; +const DWORD FOURCC_NV12 = '21VN'; // Static array of media types (preferred and accepted). const GUID g_MediaSubtypes[] = @@ -124,11 +124,11 @@ inline T clamp(const T& val, const T& minVal, const T& maxVal) void TransformImage_NV12( const D2D1::Matrix3x2F& mat, const D2D_RECT_U& rcDest, - _Inout_updates_(_Inexpressible_(2 * lDestStride * dwHeightInPixels)) BYTE *pDest, - _In_ LONG lDestStride, + _Inout_updates_(_Inexpressible_(2 * lDestStride * dwHeightInPixels)) BYTE *pDest, + _In_ LONG lDestStride, _In_reads_(_Inexpressible_(2 * lSrcStride * dwHeightInPixels)) const BYTE* pSrc, - _In_ LONG lSrcStride, - _In_ DWORD dwWidthInPixels, + _In_ LONG lSrcStride, + _In_ DWORD dwWidthInPixels, _In_ DWORD dwHeightInPixels) { // NV12 is planar: Y plane, followed by packed U-V plane. @@ -189,7 +189,7 @@ void TransformImage_NV12( CGrayscale::CGrayscale() : m_pSample(NULL), m_pInputType(NULL), m_pOutputType(NULL), m_imageWidthInPixels(0), m_imageHeightInPixels(0), m_cbImageSize(0), - m_TransformType(Preview), m_rcDest(D2D1::RectU()), m_bStreamingInitialized(false), + m_TransformType(Preview), m_rcDest(D2D1::RectU()), m_bStreamingInitialized(false), m_pAttributes(NULL) { InitializeCriticalSectionEx(&m_critSec, 3000, 0); @@ -786,12 +786,12 @@ HRESULT CGrayscale::GetInputStatus( return MF_E_INVALIDSTREAMNUMBER; } - // If an input sample is already queued, do not accept another sample until the + // If an input sample is already queued, do not accept another sample until the // client calls ProcessOutput or Flush. - // NOTE: It is possible for an MFT to accept more than one input sample. For - // example, this might be required in a video decoder if the frames do not - // arrive in temporal order. In the case, the decoder must hold a queue of + // NOTE: It is possible for an MFT to accept more than one input sample. 
For + // example, this might be required in a video decoder if the frames do not + // arrive in temporal order. In the case, the decoder must hold a queue of // samples. For the video effect, each sample is transformed independently, so // there is no reason to queue multiple input samples. @@ -902,12 +902,12 @@ HRESULT CGrayscale::ProcessMessage( case MFT_MESSAGE_SET_D3D_MANAGER: // Sets a pointer to the IDirect3DDeviceManager9 interface. - // The pipeline should never send this message unless the MFT sets the MF_SA_D3D_AWARE + // The pipeline should never send this message unless the MFT sets the MF_SA_D3D_AWARE // attribute set to TRUE. Because this MFT does not set MF_SA_D3D_AWARE, it is an error // to send the MFT_MESSAGE_SET_D3D_MANAGER message to the MFT. Return an error code in // this case. - // NOTE: If this MFT were D3D-enabled, it would cache the IDirect3DDeviceManager9 + // NOTE: If this MFT were D3D-enabled, it would cache the IDirect3DDeviceManager9 // pointer for use during streaming. hr = E_NOTIMPL; @@ -972,7 +972,7 @@ HRESULT CGrayscale::ProcessInput( // The client must set input and output types before calling ProcessInput. if (!m_pInputType || !m_pOutputType) { - hr = MF_E_NOTACCEPTING; + hr = MF_E_NOTACCEPTING; goto done; } @@ -1016,7 +1016,7 @@ HRESULT CGrayscale::ProcessOutput( // This MFT does not accept any flags for the dwFlags parameter. - // The only defined flag is MFT_PROCESS_OUTPUT_DISCARD_WHEN_NO_BUFFER. This flag + // The only defined flag is MFT_PROCESS_OUTPUT_DISCARD_WHEN_NO_BUFFER. This flag // applies only when the MFT marks an output stream as lazy or optional. But this // MFT has no lazy or optional streams, so the flag is not valid. @@ -1266,7 +1266,7 @@ HRESULT CGrayscale::OnCheckMediaType(IMFMediaType *pmt) goto done; } - // Reject single-field media types. + // Reject single-field media types. UINT32 interlace = MFGetAttributeUINT32(pmt, MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive); if (interlace == MFVideoInterlace_FieldSingleUpper || interlace == MFVideoInterlace_FieldSingleLower) { @@ -1350,10 +1350,13 @@ HRESULT CGrayscale::BeginStreaming() goto done; } - // Get the chroma transformations. + // Get the effect type + UINT32 effect = MFGetAttributeUINT32(m_pAttributes, MFT_IMAGE_EFFECT, 1); - // float scale = (float)MFGetAttributeDouble(m_pAttributes, MFT_GRAYSCALE_SATURATION, 0.0f); - // float angle = (float)MFGetAttributeDouble(m_pAttributes, MFT_GRAYSCALE_CHROMA_ROTATION, 0.0f); + if ((effect >= 0) && (effect < InvalidEffect)) + { + m_TransformType = (ProcessingType)effect; + } m_bStreamingInitialized = true; } @@ -1363,7 +1366,7 @@ done: } -// End streaming. +// End streaming. // This method is called if the client sends an MFT_MESSAGE_NOTIFY_END_STREAMING // message, or when the media type changes. 
In general, it should be called whenever @@ -1414,16 +1417,72 @@ HRESULT CGrayscale::OnProcessOutput(IMFMediaBuffer *pIn, IMFMediaBuffer *pOut) return hr; } - //(*m_pTransformFn)(m_transform, m_rcDest, pDest, lDestStride, pSrc, lSrcStride, - // m_imageWidthInPixels, m_imageHeightInPixels); + cv::Mat InputFrame(m_imageHeightInPixels + m_imageHeightInPixels/2, m_imageWidthInPixels, CV_8UC1, pSrc, lSrcStride); + cv::Mat InputGreyScale(InputFrame, cv::Range(0, m_imageHeightInPixels), cv::Range(0, m_imageWidthInPixels)); + cv::Mat OutputFrame(m_imageHeightInPixels + m_imageHeightInPixels/2, m_imageWidthInPixels, CV_8UC1, pDest, lDestStride); + + switch (m_TransformType) + { + case Preview: + { + InputFrame.copyTo(OutputFrame); + } break; + case GrayScale: + { + OutputFrame.setTo(cv::Scalar(128)); + cv::Mat OutputGreyScale(OutputFrame, cv::Range(0, m_imageHeightInPixels), cv::Range(0, m_imageWidthInPixels)); + InputGreyScale.copyTo(OutputGreyScale); + } break; + case Canny: + { + OutputFrame.setTo(cv::Scalar(128)); + cv::Mat OutputGreyScale(OutputFrame, cv::Range(0, m_imageHeightInPixels), cv::Range(0, m_imageWidthInPixels)); + cv::Canny(InputGreyScale, OutputGreyScale, 80, 90); + + } break; + case Sobel: + { + OutputFrame.setTo(cv::Scalar(128)); + cv::Mat OutputGreyScale(OutputFrame, cv::Range(0, m_imageHeightInPixels), cv::Range(0, m_imageWidthInPixels)); + cv::Sobel(InputGreyScale, OutputGreyScale, CV_8U, 1, 1); + } break; + case Histogram: + { + const int mHistSizeNum = 25; + const int channels[3][1] = {{0}, {1}, {2}}; + const int mHistSize[] = {25}; + const float baseRabge[] = {0.f,256.f}; + const float* ranges[] = {baseRabge}; + const cv::Scalar mColorsRGB[] = { cv::Scalar(200, 0, 0, 255), cv::Scalar(0, 200, 0, 255), + cv::Scalar(0, 0, 200, 255) }; + + cv::Mat BgrFrame; + cv::cvtColor(InputFrame, BgrFrame, cv::COLOR_YUV420sp2BGR); + int thikness = (int) (BgrFrame.cols / (mHistSizeNum + 10) / 5); + if(thikness > 5) thikness = 5; + int offset = (int) ((BgrFrame.cols - (5*mHistSizeNum + 4*10)*thikness)/2); + + // RGB + for (int c=0; c<3; c++) + { + std::vector hist; + cv::calcHist(&BgrFrame, 1, channels[c], cv::Mat(), hist, 1, mHistSize, ranges); + cv::normalize(hist, hist, BgrFrame.rows/2, 0, cv::NORM_INF); + for(int h=0; hSetCurrentLength(m_cbImageSize); @@ -1461,7 +1520,7 @@ HRESULT CGrayscale::UpdateFormatInfo() { goto done; } - if (subtype != MFVideoFormat_NV12) + if (subtype != MFVideoFormat_NV12) { hr = E_UNEXPECTED; goto done; @@ -1511,7 +1570,7 @@ HRESULT GetImageSize(DWORD fcc, UINT32 width, UINT32 height, DWORD* pcbImage) return hr; } -// Get the default stride for a video format. +// Get the default stride for a video format. 
HRESULT GetDefaultStride(IMFMediaType *pType, LONG *plStride) { LONG lStride = 0; diff --git a/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.h b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.h index 96b4b1b96..a6a6aa2f1 100644 --- a/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.h +++ b/samples/winrt/ImageManipulations/C++/MediaExtensions/Grayscale/Grayscale.h @@ -37,18 +37,18 @@ DEFINE_GUID(CLSID_GrayscaleMFT, // Configuration attributes +// {698649BE-8EAE-4551-A4CB-3EC98FBD3D86} +DEFINE_GUID(MFT_IMAGE_EFFECT, +0x698649be, 0x8eae, 0x4551, 0xa4, 0xcb, 0x3e, 0xc9, 0x8f, 0xbd, 0x3d, 0x86); -// {7BBBB051-133B-41F5-B6AA-5AFF9B33A2CB} -DEFINE_GUID(MFT_GRAYSCALE_DESTINATION_RECT, -0x7bbbb051, 0x133b, 0x41f5, 0xb6, 0xaa, 0x5a, 0xff, 0x9b, 0x33, 0xa2, 0xcb); enum ProcessingType { - Preview, - GrayScale, - Canny, - Zoom, - Sepia + GrayScale, + Canny, + Sobel, + Histogram, + InvalidEffect }; template void SafeRelease(T **ppT) @@ -63,9 +63,9 @@ template void SafeRelease(T **ppT) // CGrayscale class: // Implements a grayscale video effect. -class CGrayscale +class CGrayscale : public Microsoft::WRL::RuntimeClass< - Microsoft::WRL::RuntimeClassFlags< Microsoft::WRL::RuntimeClassType::WinRtClassicComMix >, + Microsoft::WRL::RuntimeClassFlags< Microsoft::WRL::RuntimeClassType::WinRtClassicComMix >, ABI::Windows::Media::IMediaExtension, IMFTransform > { @@ -231,7 +231,7 @@ private: CRITICAL_SECTION m_critSec; // Transformation parameters - ProcessingType m_TransformType; + ProcessingType m_TransformType; D2D_RECT_U m_rcDest; // Destination rectangle for the effect. // Streaming From bf22567c09ce2eba16cc5ac179a44b9822ef4283 Mon Sep 17 00:00:00 2001 From: Alexander Smorkalov Date: Fri, 14 Jun 2013 15:01:09 -0700 Subject: [PATCH 34/75] Transform selection implemented in sample GUI. Gistogram output does not work propertly due color conversion problems. --- .../C++/AdvancedCapture.xaml | 39 +- .../C++/AdvancedCapture.xaml.cpp | 505 ++++-------------- .../C++/AdvancedCapture.xaml.h | 10 +- .../winrt/ImageManipulations/C++/App.xaml.cpp | 2 + .../ImageManipulations/C++/MainPage.xaml.cpp | 12 +- .../C++/MediaCapture.vcxproj | 4 +- .../MediaExtensions/Grayscale/Grayscale.cpp | 66 ++- .../C++/MediaExtensions/Grayscale/Grayscale.h | 2 +- .../C++/Package.appxmanifest | 3 - 9 files changed, 149 insertions(+), 494 deletions(-) diff --git a/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml b/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml index e6266adb9..07db96f27 100644 --- a/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml +++ b/samples/winrt/ImageManipulations/C++/AdvancedCapture.xaml @@ -33,38 +33,31 @@ - - - - - - - - - + +
Media capture using capture device sample
This sample demonstrates how to use the MediaCapture API to capture video, audio, and pictures from a capture device, such as a webcam.

Specifically, this sample covers:

  • Previewing video from a capture device, such as a webcam, connected to the computer.
  • Capturing video from a capture device, such as a webcam, connected to the computer.
  • Taking a picture from a capture device, such as a webcam, connected to the computer.
  • Enumerating cameras connected to the computer.
  • Adding a video effect to a video.
  • Recording audio from a capture device connected to the computer.

For more information on capturing video in your app, see "Quickstart: capturing a photo or video using the camera dialog" and "Quickstart: capturing video using the MediaCapture API".
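As a rough illustration of the preview-and-effect scenario listed above (a sketch only, not the sample's code), the following C++/CX fragment initializes a MediaCapture object, starts preview into a XAML CaptureElement, and then attaches a video effect by its activatable class ID. The function name, the previewElement parameter, and the "GrayscaleTransform.GrayscaleEffect" class ID are assumptions for the example; error handling is omitted.

    #include <ppltasks.h>

    void StartPreviewWithEffect(Windows::UI::Xaml::Controls::CaptureElement^ previewElement)
    {
        using namespace concurrency;
        using namespace Windows::Media::Capture;

        auto mediaCapture = ref new MediaCapture();
        create_task(mediaCapture->InitializeAsync())
            .then([mediaCapture, previewElement]()
        {
            // Route the preview stream to the XAML CaptureElement and start it.
            previewElement->Source = mediaCapture;
            return mediaCapture->StartPreviewAsync();
        }).then([mediaCapture]()
        {
            // Assumed activatable class ID for the effect MFT; no effect settings are passed.
            return mediaCapture->AddEffectAsync(
                MediaStreamType::VideoPreview, L"GrayscaleTransform.GrayscaleEffect", nullptr);
        });
    }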
Important

This sample uses the Media Extension feature of Windows 8 to add functionality to the Microsoft Media Foundation pipeline. A Media Extension consists of a hybrid object that implements both Component Object Model (COM) and Windows Runtime interfaces. The COM interfaces interact with the Media Foundation pipeline. The Windows Runtime interfaces activate the component and interact with the Windows Store app.

In most situations, it is recommended that you use Visual C++ with Component Extensions (C++/CX) to interact with the Windows Runtime. But in the case of hybrid components that implement both COM and Windows Runtime interfaces, such as Media Extensions, this is not possible. C++/CX can only create Windows Runtime objects. So, for hybrid objects it is recommended that you use the Windows Runtime C++ Template Library to interact with the Windows Runtime. Be aware that the Windows Runtime C++ Template Library has limited support for implementing COM interfaces.
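For orientation, here is a minimal skeleton of such a hybrid object written with the Windows Runtime C++ Template Library (WRL). It mirrors the shape of the CGrayscale class in the Grayscale MFT patched above (a WinRtClassicComMix RuntimeClass implementing both ABI::Windows::Media::IMediaExtension and IMFTransform), but the class name, the activatable class string, and the empty method body are placeholders, not the sample's implementation.

    #include <wrl.h>
    #include <wrl/implements.h>
    #include <mftransform.h>
    #include <windows.media.h>

    // Sketch only: a real MFT must also implement every IMFTransform method
    // (GetStreamCount, ProcessInput, ProcessOutput, ...), omitted here.
    class MyVideoEffect
        : public Microsoft::WRL::RuntimeClass<
              Microsoft::WRL::RuntimeClassFlags<Microsoft::WRL::RuntimeClassType::WinRtClassicComMix>,
              ABI::Windows::Media::IMediaExtension,
              IMFTransform>
    {
        InspectableClass(L"MyCompany.MyVideoEffect", BaseTrust)  // hypothetical runtime class name

    public:
        // IMediaExtension: receives the IPropertySet passed to MediaCapture::AddEffectAsync.
        IFACEMETHODIMP SetProperties(ABI::Windows::Foundation::Collections::IPropertySet* /*pConfiguration*/)
        {
            return S_OK;
        }

        // IMFTransform methods would follow here.
    };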
To obtain an evaluation copy of Windows 8, go to Windows 8.

To obtain an evaluation copy of Microsoft Visual Studio 2012, go to Visual Studio 2012.

Related topics

Windows 8 app samples

Roadmaps
  Adding multimedia
  Capturing or rendering audio, video, and images
  Designing UX for apps
  Roadmap for apps using C# and Visual Basic
  Roadmap for apps using C++
  Roadmap for apps using JavaScript

Tasks
  Quickstart: capturing a photo or video using the camera dialog
  Quickstart: capturing video using the MediaCapture API

Reference
  AddEffectAsync
  ClearEffectsAsync
  MediaCapture
  MediaCaptureSettings
  MediaEncodingProfile
  StartRecordToStorageFileAsync
  Windows.Media.Capture

Operating system requirements

  Client: Windows 8
  Server: Windows Server 2012
Build the sample

  1. Start Visual Studio Express 2012 for Windows 8 and select File > Open > Project/Solution.
  2. Go to the directory in which you unzipped the sample. Go to the directory named for the sample, and double-click the Visual Studio Express 2012 for Windows 8 Solution (.sln) file.
  3. Press F7 or use Build > Build Solution to build the sample.
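One prerequisite that this patch series adds to the build steps above: the Grayscale project now takes its headers from $(OPENCV_DIR)\include, adds $(OPENCV_DIR)\lib to the additional library directories, and links the OpenCV core and imgproc libraries (opencv_core245.lib, opencv_imgproc245.lib). An OPENCV_DIR environment variable pointing at an OpenCV 2.4.5 build therefore has to be defined before opening Visual Studio and building the solution. The path below is only an example, not a required location:

    rem Hypothetical location - point OPENCV_DIR at your own OpenCV 2.4.5 build.
    set OPENCV_DIR=C:\opencv\build\x86\vc11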
Run the sample

To debug the app and then run it, press F5 or use Debug > Start Debugging. To run the app without debugging, press Ctrl+F5 or use Debug > Start Without Debugging.
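Stepping back to the code: the heart of the Grayscale.cpp changes earlier in this series is that OnProcessOutput wraps each locked NV12 buffer in cv::Mat headers and processes it with OpenCV instead of the old chroma-matrix transform. The sketch below shows that pattern in isolation; the function and variable names are illustrative, and the buffers, strides, and dimensions are assumed to come from the MFT exactly as in the patch (GetDefaultStride plus the VideoBufferLock helpers).

    #include <windows.h>
    #include <opencv2/core/core.hpp>
    #include <opencv2/imgproc/imgproc.hpp>

    // NV12 layout: 'height' rows of Y samples followed by height/2 rows of interleaved U/V.
    // The cv::Mat headers wrap the locked buffers directly; no pixel data is copied here.
    void ProcessNV12WithCanny(BYTE* pSrc, LONG srcStride,
                              BYTE* pDest, LONG destStride,
                              DWORD width, DWORD height)
    {
        cv::Mat inputFrame((int)(height + height / 2), (int)width, CV_8UC1, pSrc, (size_t)srcStride);
        cv::Mat inputY(inputFrame, cv::Range(0, (int)height), cv::Range(0, (int)width));

        cv::Mat outputFrame((int)(height + height / 2), (int)width, CV_8UC1, pDest, (size_t)destStride);
        outputFrame.setTo(cv::Scalar(128));    // 128 in the U/V plane means neutral chroma
        cv::Mat outputY(outputFrame, cv::Range(0, (int)height), cv::Range(0, (int)width));

        cv::Canny(inputY, outputY, 80, 90);    // same thresholds as the patch
    }

In the later patch the same Y-plane views are routed through copyTo, Canny, Sobel, or the histogram-drawing path depending on the ProcessingType value that BeginStreaming reads from the MFT_IMAGE_EFFECT attribute.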