Should CMake fail to complete the configuration,
you can always have a look at the file
CMakeFiles/CMakeOutput.log in the build
folder to find the configuration log. It usually contains useful hints to
quickly resolve problems.
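A quick way to surface those hints (the build-folder name "build" is an assumption; adjust it to yours):

```shell
# Show the first error lines from the CMake configure log, if present.
LOG=build/CMakeFiles/CMakeOutput.log
if [ -f "$LOG" ]; then
  grep -in "error" "$LOG" | head -n 20
else
  echo "No configure log found at $LOG"
fi
```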
error C1002: compiler is out of heap space in pass 2
error LNK1102: out of memory
This has so far only occurred on virtual machines with very limited memory. Even in this case, simply restarting the build with a single thread often resolved the issue.
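For a Makefile-based build, restarting single-threaded can look like this (a sketch; use the equivalent flag of your build tool, e.g. /m:1 for MSBuild):

```shell
# Limit make to one job to reduce peak memory usage.
# Run this from within the build folder.
if [ -f Makefile ]; then
  make -j 1
else
  echo "Run this from the build folder containing the Makefile."
fi
```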
/usr/bin/ld: warning: libpng15.so.15, needed by [...]/anaconda/lib/libopencv_highgui.so, not found
This issue arises when using a current Anaconda/Miniconda installation
with the OpenCV package installed. The fix is to add the Anaconda
installation's lib path to the
LD_LIBRARY_PATH by adding the following
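For example, for a bash shell (the install location $HOME/anaconda is an assumption; adjust it for Miniconda or a custom path):

```shell
# Prepend Anaconda's lib folder so its libpng (and friends) are found.
export LD_LIBRARY_PATH="$HOME/anaconda/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
```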
Note that other software will then also detect and use Anaconda's OpenCV libraries.
error while loading shared libraries: libmx.so: cannot open shared object file: No such file or directory
Resolution: add the MATLAB folder containing the libraries and binary
files to your LD_LIBRARY_PATH:
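For example (the MATLAB version and install prefix are assumptions; adjust them to your setup):

```shell
# MATLAB ships libmx.so in <install>/bin/<arch>; glnxa64 is 64-bit Linux.
export LD_LIBRARY_PATH="/usr/local/MATLAB/R2014a/bin/glnxa64${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
```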
and run the build again. To make this permanent, open
your shell startup file (e.g., ~/.bashrc) and
add this line at the end.
predict-methods of a
Resolution: even for the high-level languages Python and Matlab, I decided to implement a strict, non-converting interface for all functions (in contrast to, e.g., scikit-learn). The reasoning behind this is simple: when working with more than 100 GB of image data, silently starting a copy operation to convert its type could be fatal if there is not enough memory. Additionally, due to restrictions of Boost Python, it was necessary to always require 2D arrays. Go through the following checklist:
Check that both arrays (data AND annotations) are two-dimensional, i.e., check
X.ndim for both inputs! Are they both 2? If not, you can use, e.g.,
np.atleast_2d(X) to enforce this.
Check the datatype of the inputs (data AND annotations) (e.g., visually in Matlab, in
Python via X.dtype). Do they match the expected datatypes?
If not, you can cast in Python with, e.g.,
X.astype('uint32'), and in
Matlab with, e.g., uint32(X).
In Python, check that both input matrices (data AND annotations) are in C-contiguous layout,
i.e., check X.flags['C_CONTIGUOUS']. Is it
True? If not, you can
use the command
np.ascontiguousarray(X) to enforce this.
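The checklist above can be sketched in Python (the function name and the float32/uint32 dtypes are placeholders; use the dtypes your forest requires):

```python
import numpy as np

def prepare_inputs(X, Y, data_dtype=np.float32, annot_dtype=np.uint32):
    """Bring data and annotations into the strict format the library
    expects: 2D, matching dtype, C-contiguous. (Sketch only; the dtype
    arguments are assumptions -- use the ones your forest requires.)"""
    X = np.atleast_2d(X)                      # 1. both arrays must be 2D
    Y = np.atleast_2d(Y)
    assert X.ndim == 2 and Y.ndim == 2
    X = X.astype(data_dtype, copy=False)      # 2. cast only if needed
    Y = Y.astype(annot_dtype, copy=False)
    X = np.ascontiguousarray(X)               # 3. enforce C-contiguous layout
    Y = np.ascontiguousarray(Y)
    assert X.flags['C_CONTIGUOUS'] and Y.flags['C_CONTIGUOUS']
    return X, Y
```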
The class is templated as
Array<dtype, ndims, ncontiguous>.
dtype is obviously the datatype of the stored content,
ndims specifies how many dimensions the array has, and
ncontiguous specifies the number of dimensions that are row-major contiguous. If
ncontiguous == ndims, the array is row-major contiguous. Within the library, currently only row-major contiguous arrays are used, so the second and third parameters will always be the same.
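To see what this distinction means on the NumPy side (the Array<...> mapping in the comments refers to the class described above; the NumPy calls are standard):

```python
import numpy as np

A = np.arange(6, dtype=np.uint32).reshape(2, 3)
print(A.flags['C_CONTIGUOUS'])   # True: would map to Array<uint32, 2, 2>
B = A.T                          # a transpose is a column-major view
print(B.flags['C_CONTIGUOUS'])   # False: not row-major contiguous
C = np.ascontiguousarray(B)      # copying restores row-major contiguity
print(C.flags['C_CONTIGUOUS'])   # True
```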
A required .dll file is missing. You can find out which one by opening the library's .dll file in the x64 Dependency Walker. If you are using the precompiled library for Python, most probably the Visual C++ x64 redistributables for VS2013 are missing (they can be downloaded from Microsoft).
Resolution: this is a very annoying issue related to the way MATLAB searches for the system libraries. Luckily, it is easy to resolve. I especially like the last line of the article ;) :
After running this command in my terminal, the world calms down.
The boost_python3 library cannot be found
Resolution: The precompiled boost libraries are suffixed with their full Python 3 version. You can just create a symlink to them by running
cd /usr/lib/x86_64-linux-gnu/
sudo ln -s libboost_python-py3[X].so libboost_python3.so
replacing [X] with the provided Python version.