I'm building Skia on Windows following this link.
For Windows x64 the build was quite smooth. But not for 32 bit.
1) I tried specifying target_cpu = "x86" instead of target_cpu = "x64". gn gen works fine, but ninja throws errors complaining that the paths to Visual Studio contain spaces, all of them similar to the following:
"C:\Programs " is not a valid path.
2) I tried generating .sln files and building from within the IDE (an alternative mentioned in the link). However, I can't even get the x64 version to compile this way (lots of non-zero exit codes from ninja, with no further messages).
3) I tried using the toolchain that the website claims to be "the only way to support 32 bit builds". The toolchain is downloaded with the following command (to be executed in the Skia directory):
python infra/bots/assets/win_toolchain/download.py -t C:/toolchain
I managed to work around loads of intricacies (gsutil conflicts, missing .py extensions, path variables, logging in to the Google Cloud service) and I'm now stuck at this:
Logged in as xxxxxxxxxxxxxxxx
AccessDeniedException: 403 Caller does not have storage.objects.list access to bucket skia-buildbots.
I'm not tied to any particular way of building it, so long as it generates the libs for me. But with a large project that has so many external dependencies, I don't think it's easy to brew my own build.
One solution I've found:
Open the out\Release\toolchain.ninja text file (or the toolchain.ninja specific to your configuration)
Remove the following string (you can use a "Replace Text" with an empty string in your text editor):
C:/Program Files (x86)/Microsoft Visual Studio 14.0/win_sdk/bin/SetEnv.cmd /x86 && C:/Program Files (x86)/Microsoft Visual Studio 14.0/VC/bin/amd64_x86/
from everywhere it appears (this is the string for the x86 case; for x64 the string might be different)
And use ninja -C out/Release dm as usual
This way we are using a toolchain where the cl.exe, ml.exe and link.exe commands are called directly (so they must be accessible through the PATH environment variable).
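A hedged way to make sure the x86 versions of those tools are on PATH before running ninja (the path below assumes a default VS 2015 install; adjust it to your installation):
"C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\vcvarsall.bat" x86
ninja -C out/Release dm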
Another solution, based on @dacap's, but editing the GN configuration instead.
Change the file gn/toolchain/BUILD.gn from:
...
  if (msvc == 2015) {
    bin = "$win_vc/bin/amd64"
  } else {
    bin = "$win_vc/Tools/MSVC/$win_toolchain_version/bin/HostX64/$target_cpu"
  }
  env_setup = ""
  if (target_cpu == "x86") {
    # Toolchain asset includes a script that configures for x86 building.
    # We don't support x86 builds with local MSVC installations.
    env_setup = "cmd /c $win_sdk/bin/SetEnv.cmd /x86 && "
  }
...
to:
...
  if (msvc == 2015) {
    if (target_cpu == "x86") {
      bin = "$win_vc/bin"
    } else {
      bin = "$win_vc/bin/amd64"
    }
  } else {
    bin = "$win_vc/Tools/MSVC/$win_toolchain_version/bin/HostX64/$target_cpu"
  }
  env_setup = ""
  #if (target_cpu == "x86") {
  #  # Toolchain asset includes a script that configures for x86 building.
  #  # We don't support x86 builds with local MSVC installations.
  #  env_setup = "cmd /c $win_sdk/bin/SetEnv.cmd /x86 && "
  #}
...
It seems that (as of Skia m67) @WinCloud's fix has been partially merged upstream (you still have to remove the env_setup part, though).
However, as noted in the comments, it will crash during OpenGL initialization.
I've fixed all that (at least to the point where the demo apps are runnable) and, as a little extra, fixed .lib compatibility with Visual Studio's Debug configurations.
Included are .bat files that build the "no system libs" configuration with Clang (as the readme explicitly states that VC++ builds may have performance issues). To use them, just download the latest LLVM from https://releases.llvm.org/download.html and install it into the default location (tested with 6.0.0).
If you need DLL runtime linking you'll have to edit the gn/BUILD.gn file: add the /MD flag by default and change /MTd to /MDd for debug.
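A rough illustration of that edit (hedged: exactly where gn/BUILD.gn sets the CRT flags differs between Skia versions, so treat this as a sketch rather than the verbatim file contents):
# Switch the MSVC runtime from the static CRT to the DLL CRT.
if (is_debug) {
  cflags += [ "/MDd" ]  # was /MTd
} else {
  cflags += [ "/MD" ]   # was /MT
}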
Here's a patch based on the chrome/m67 branch:
https://gist.github.com/Alexx999/39eae9182eecaa3dc06e73fdb3a1e7d9
I am in the process of creating a pure x64 version of my application. In order to do that, I also need an x64 installer. I've read online that the NSIS code does support x64, but since they don't distribute an x64 build, I need to build it from source (including all plugins, etc.).
I've been able to build NSIS (v3.0.4) from source for x86 using Python 2.7/SCons 3.1.1/VS 2012/Zlib 1.2.7.
scons ZLIB_W32=C:\Source\zlib-1.2.7
But when I add the TARGET_ARCH=amd64 to the scons command,
scons ZLIB_W32=C:\Source\zlib-1.2.7 TARGET_ARCH=amd64
It doesn't work. Initially it built but didn't link because zlib was still x86.
C:\Source\zlib-1.2.7\lib\zdll.lib : warning LNK4272: library machine type 'x86' conflicts with target machine type 'x64'
However, after rebuilding zlib1.dll (using VS2012) as x64 (dumpbin confirms this), I now get an error that it can't find zlib:
scons ZLIB_W32=C:\Source\zlib-1.2.7 TARGET_ARCH=amd64
scons: Reading SConscript files ...
WARNING: VER_PACKED not set, defaulting to 0x03003666!
Delete("nsis-18-Nov-2019.cvs")
Delete(".instdist")
Delete(".test")
Using Microsoft tools configuration (14.2)
Checking for memset requirement... (cached) yes
Checking for memcpy requirement... (cached) yes
Checking for C library gdi32... (cached) yes
Checking for C library user32... (cached) yes
Checking for C library pthread... (cached) no
Checking for C library iconv... (cached) no
Checking for C library shlwapi... (cached) yes
Checking for C library oleaut32... (cached) yes
Checking for C library version... (cached) yes
Checking for C library zdll... no
Checking for C library z... no
zlib (win32) is missing!
Note that I have made sure that the directory structure matches in the x64 build so that the following files exist:
C:\Source\zlib-1.2.7\zlib1.dll
C:\Source\zlib-1.2.7\lib\zlib1.lib
C:\Source\zlib-1.2.7\include\zconf.h
C:\Source\zlib-1.2.7\include\zlib.h
It did occur to me that I'm telling scons to look for x86 zlib (hence the W32 in ZLIB_W32) but I didn't see an option for telling scons to look for x64 zlib in the -h output.
What am I missing?
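(For reference, dumpbin can also be used to confirm which architecture the import library itself targets; the path is just the one from this question:)
dumpbin /headers C:\Source\zlib-1.2.7\lib\zlib1.lib | findstr /i machine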
-UPDATE 1-
I'm making progress but am not out of the woods yet. I've identified several issues with my build. First, I wasn't actually using VS2012 like I thought (I have several versions installed); see above where the scons output says 14.2 (VS2019). Oops. Unfortunately, simply adding MSVC_VERSION=11.0 to my command line didn't fix it; it seems that the nsis project isn't passing this along to scons. The only way I could figure out how to do this was to modify the nsis SConstruct file from:
######################################################################
####### Build Environment ###
######################################################################
path = ARGUMENTS.get('PATH', '')
toolset = ARGUMENTS.get('TOOLSET', '')
arch = ARGUMENTS.get('TARGET_ARCH', 'x86')
if toolset and path:
    defenv = Environment(TARGET_ARCH = arch, ENV = {'PATH' : path}, TOOLS = toolset.split(',') + ['zip'])
else:
    if path:
        defenv = Environment(TARGET_ARCH = arch, ENV = {'PATH' : path})
    if toolset:
        defenv = Environment(TARGET_ARCH = arch, TOOLS = toolset.split(',') + ['zip'])
    if not toolset and not path:
        defenv = Environment(TARGET_ARCH = arch)
Export('defenv')
to:
######################################################################
####### Build Environment ###
######################################################################
path = ARGUMENTS.get('PATH', '')
toolset = ARGUMENTS.get('TOOLSET', '')
arch = ARGUMENTS.get('TARGET_ARCH', 'x86')
vs_version = ARGUMENTS.get('MSVC_VERSION', '')
if toolset and path:
    defenv = Environment(TARGET_ARCH = arch, ENV = {'PATH' : path}, TOOLS = toolset.split(',') + ['zip'], MSVC_VERSION = vs_version)
else:
    if path:
        defenv = Environment(TARGET_ARCH = arch, ENV = {'PATH' : path}, MSVC_VERSION = vs_version)
    if toolset:
        defenv = Environment(TARGET_ARCH = arch, TOOLS = toolset.split(',') + ['zip'], MSVC_VERSION = vs_version)
    if not toolset and not path:
        defenv = Environment(TARGET_ARCH = arch, MSVC_VERSION = vs_version)
Export('defenv')
Now my build output correctly identifies as VS2012:
C:\Source\nsis\nsis-code-r7069-NSIS-tags-v304>scons ZLIB_W32=C:\Source\zlib MSVC_VERSION=11.0
scons: Reading SConscript files ...
WARNING: VER_PACKED not set, defaulting to 0x03003666!
Delete("nsis-19-Nov-2019.cvs")
Delete(".instdist")
Delete(".test")
Using Microsoft tools configuration (11.0)
Checking for memset requirement... (cached) yes
<snip>
It seems like there should be a better way to allow this but I'm not familiar enough (yet) with scons or nsis's usage of it to submit a PR to fix this. Or maybe it already exists and I'm just not smart enough to find it (yet).
Now, on to building x64 nsis. Big thanks to @Anders for narrowing down which files are needed and to @bdbaddog for suggesting to look into the scons config.log, where I found that I had named the zlib .dll import library incorrectly when I built it for x64. After changing the output lib to
C:\Source\zlib-1.2.7\lib\zdll.lib
It at least tries to build but fails at linking...
-edit to update 1- All of the linking errors that were here previously were a bad rabbit hole I went down yesterday. I'm not sure whether it was caused by scons leaving some temp files somewhere, or VS doing it, or me just being careless, but today when I started trying to track down the linking errors I couldn't reproduce them (there was a machine reboot yesterday, and who knows what might have been locked/cached until then).
-UPDATE 2-
This update removes all the misleading parts from update 1. The current situation is that the x64 NSIS build (using both my zlib-1.2.7-amd64 build AND the official zlib-1.2.8-amd64 pre-built binaries from the NSIS wiki) fails at linking with the same error:
System.obj : error LNK2019: unresolved external symbol CallProc2 referenced in function Call
build\urelease\System\amd64-unicode\System.dll : fatal error LNK1120: 1 unresolved externals
scons: *** [build\urelease\System\amd64-unicode\System.dll] Error 1120
scons: building terminated because of errors.
So far I'm not having any luck figuring out what CallProc2 is but I think this means zlib is no longer at fault here.
-UPDATE 3-
Thanks to @Anders for the direction here, since I was really stumped on the CallProc2 issue: the CallProc2 function is isolated to the System plugin, and the error is caused by that plugin not building correctly. There is some discussion in the comments below trying to find the cause, but for now I'm just excluding that plugin to get to a setup that works before coming back to this.
Right now, the scons build completes and produces x64 binaries.
C:\Source\nsis\build\urelease\makensisw\>dumpbin /HEADERS ./makensisw.exe
Microsoft (R) COFF/PE Dumper Version 11.00.61030.0
Copyright (C) Microsoft Corporation. All rights reserved.
Dump of file ./makensisw.exe
PE signature found
File Type: EXECUTABLE IMAGE
FILE HEADER VALUES
8664 machine (x64)
However, the installer build now results in the following error:
scons dist-installer ZLIB_W32=C:\Source\zlib-1.2.8-x64 MSVC_VERSION=11.0
link /nologo /map /subsystem:console,5.01 /OUT:build\urelease\VPatch\Source\GenPat\GenPat.exe /LIBPATH:C:\Source\zlib-1.2.8-x64\lib zdll.lib build\urelease\VPatch\Source\GenPat\adler32.obj build\urelease\VPatch\Source\GenPat\Checksums.obj build\urelease\VPatch\Source\GenPat\ChunkedFile.obj build\urelease\VPatch\Source\GenPat\FileFormat1.obj build\urelease\VPatch\Source\GenPat\GlobalTypes.obj build\urelease\VPatch\Source\GenPat\main.obj build\urelease\VPatch\Source\GenPat\md5.obj build\urelease\VPatch\Source\GenPat\PatchGenerator.obj build\urelease\VPatch\Source\GenPat\POSIXUtil.obj
adler32.obj : error LNK2019: unresolved external symbol _adler32 referenced in function "unsigned long __cdecl Checksum::adler32(unsigned long,unsigned char const *,unsigned int)" (?adler32@Checksum@@YAKKPBEI@Z)
build\urelease\VPatch\Source\GenPat\GenPat.exe : fatal error LNK1120: 1 unresolved externals
scons: *** [build\urelease\VPatch\Source\GenPat\GenPat.exe] Error 1120
Unfortunately, without the installer, I'm kinda lost again. The x64 binaries run but when trying to compile any script (I believe), I get the following error (note that I copied makensis and zlib1.dll into the same directory for this):
C:\Source>makensis.exe ./myapp.nsi
Error: reading stub "C:\Stubs\zlib-amd64-unicode"
Error initalizing CEXEBuild: error setting default stub
-UPDATE 4-
I have finally gotten a build to work successfully after excluding the System plugin. This works both when compiling zlib from source for x64 and with the prebuilt version from the NSIS wiki (linked in @Anders' answer below):
scons ZLIB_W32=C:\Source\zlib-1.2.8-x64 TARGET_ARCH=amd64 MSVC_VERSION=11.0 SKIPPLUGINS=System
and I was able to deploy to my dev system using (from an admin enabled prompt since I was installing to Program Files):
scons PREFIX="C:\Program Files (x86)\NSIS" install ZLIB_W32=C:\Source\zlib-1.2.8-x64 TARGET_ARCH=amd64 SKIPPLUGINS=System
Unfortunately, you cannot build the NSIS installer itself from here (dist-installer), because the NSIS installer apparently depends on the System plugin. So this scenario isn't sufficient for prepping a build machine unless you are going to build everything from scratch on that machine.
Also, I'm not out of the woods yet because I also use the System plugin for my installer so I need to figure out why that isn't working.
However, I was able to compile and run a pure x64 bare minimum installer package with the above setup:
# name the installer
OutFile "Installer.exe"
# default section start; every NSIS script has at least one section.
Section
# default section end
SectionEnd
The above was straight out of the NSIS docs.
My environment:
cl.exe:
Microsoft (R) C/C++ Optimizing Compiler Version 19.24.28316 for x64
nsis version: 3.06.1
zlib: Zlib-1.2.8-win64-AMD64
I modified SCons/Tool/masm.py:
C:\miniconda3\Lib\site-packages\SCons-4.0.1-py3.7.egg\SCons\Tool>diff -c masm.old.py masm.py
*** masm.old.py Fri Oct 09 22:02:45 2020
--- masm.py Fri Oct 09 21:58:06 2020
***************
*** 61,66 ****
--- 61,68 ----
shared_obj.add_emitter(suffix, SCons.Defaults.SharedObjectEmitter)
env['AS'] = 'ml'
+ if env.get('TARGET_ARCH')=='amd64':
+ env['AS'] = 'ml64'
env['ASFLAGS'] = SCons.Util.CLVar('/nologo')
env['ASPPFLAGS'] = '$ASFLAGS'
env['ASCOM'] = '$AS $ASFLAGS /c /Fo$TARGET $SOURCES'
Also modified nsis-3.06.1-src\Contrib\System\SConscript as follows:
C:\dev\nsis-3.06.1-src\Contrib\System>diff -c SConscript.old SConscript
*** SConscript.old Fri Oct 09 22:06:31 2020
--- SConscript Fri Oct 09 19:17:40 2020
***************
*** 4,9 ****
--- 4,10 ----
Source/Buffers.c
Source/Plugin.c
Source/System.c
+ Source/Call-amd64.S
""")
libs = Split("""
Then C:\dev\nsis-3.06.1-src>scons TARGET_ARCH=amd64 ended successfully.
I have
\zlib1.dll
\include\zconf.h
\include\zlib.h
\lib\libzdll.a
\lib\zdll.lib
\lib\zlib.lib
You can get pre-built zlib for NSIS here.
In the past I have actually used Process Monitor to figure out which file SCons is looking for; sadly, that seems to be the fastest way to find out.
Regarding CallProc2: add "System" to SKIPPLUGINS to get past this issue and make sure everything else compiles. CallProc2 is implemented in the amd64 .S file and used by System.c. You need the 64-bit Microsoft assembler to be detected correctly to build the .S file.
If you only need to compile NSIS once then you can cheat: open a Visual Studio command prompt and execute ml64.exe /c Call-amd64.S, then copy the .obj file to the same build directory as System.obj and run SCons again.
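Spelled out as commands (hedged: the paths assume the source and build layout shown earlier in this thread):
cd C:\dev\nsis-3.06.1-src\Contrib\System\Source
ml64.exe /c Call-amd64.S
copy Call-amd64.obj C:\dev\nsis-3.06.1-src\build\urelease\System\amd64-unicode\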
To do it properly you need to investigate why Scons is not compiling the .S file. Could be a configuration issue or a bug in the related Sconscript file.
So I am trying to build and test a CMake project with the Android NDK in Android Studio. I can get my library to compile, but it doesn't seem to want to pull any third-party dependencies over. I've been reading through the toolchain and looking for better documentation, with no luck. Can someone tell me what I am missing?
cmake_minimum_required(VERSION 3.4.1)
set(SFML_PATH ${ANDROID_NDK}/sources/sfml)
set(SFML_LIB_PATH ${SFML_PATH}/lib/${ANDROID_NDK_ABI_NAME})
set(SFML_LIB_SYSTEM ${SFML_LIB_PATH}/libsfml-system.so)
set(SFML_LIB_AUDIO ${SFML_LIB_PATH}/libsfml-audio.so)
set(SFML_LIB_GRAPHICS ${SFML_LIB_PATH}/libsfml-graphics.so)
set(SFML_LIB_NETWORK ${SFML_LIB_PATH}/libsfml-network.so)
set(SFML_LIB_WINDOW ${SFML_LIB_PATH}/libsfml-window.so)
set(SFML_LIB_ACTIVITY ${SFML_LIB_PATH}/libsfml-activity.so)
set(SFML_LIB_MAIN ${SFML_LIB_PATH}/libsfml-main.a)
set(SFML_LIBS ${SFML_LIB_SYSTEM} ${SFML_LIB_GRAPHICS} ${SFML_LIB_AUDIO} ${SFML_LIB_WINDOW} ${SFML_LIB_ACTIVITY})
include_directories(${SFML_PATH}/include)
link_directories(${SFML_LIB_PATH})
add_library(native-lib SHARED
src/main/cpp/native-lib.cpp)
target_link_libraries(native-lib log ${SFML_LIBS})
#file(COPY ${SFML_LIBS} DESTINATION ${__android_install_path})
FOREACH(SFML_LIB ${SFML_LIB})
    execute_process( COMMAND "${CMAKE_COMMAND}" -E copy_if_different "${SFML_LIB}" "${LIBRARY_OUTPUT_PATH}/${SFML_LIB}" RESULT_VARIABLE __fileCopyProcess )
    MESSAGE("Lib: ${SFML_LIB}")
ENDFOREACH(SFML_LIB)
Above is my CMakeLists.txt. I have done a little hacking with the paths to get it to compile with SFML, as I have not found good documentation on CMake with Android yet.
Could you add more info about "but it doesn't seem to want to pull any third-party dependencies over"?
This sample: https://github.com/googlesamples/android-ndk/tree/master/hello-libs has static and shared 3rd-party libs; you may try it.
For the shared dependent libs, you will need to pack them into the APK; that is done inside Gradle, cmake will not do it.
The above example shows that: basically, they need to be copied into your app/src/main/jniLibs too so they will be packed into the APK and pushed to your Android phone/tablet, where they can be loaded at runtime.
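As a rough CMake sketch of that copy step (hedged: it assumes the Gradle externalNativeBuild setup where ANDROID_ABI is defined, and it reuses SFML_LIBS from the question's CMakeLists):
# Copy the prebuilt SFML .so files under src/main/jniLibs so Gradle packs them into the APK.
set(JNI_LIBS_DIR "${CMAKE_SOURCE_DIR}/src/main/jniLibs/${ANDROID_ABI}")
file(MAKE_DIRECTORY ${JNI_LIBS_DIR})
foreach(sfml_lib ${SFML_LIBS})
    get_filename_component(lib_name ${sfml_lib} NAME)
    execute_process(COMMAND ${CMAKE_COMMAND} -E copy_if_different "${sfml_lib}" "${JNI_LIBS_DIR}/${lib_name}")
endforeach()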
I have also tried putting a group of libraries into one directory, using
link_directories(...)
and then just putting the lib names directly into
target_link_libraries(...)
which also works (roughly as sketched below). Make sure you have the right libs for the ABIs you intend to support in your app [it looks like you are just building for one ABI].
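A minimal sketch of that variant (the library names assume the libsfml-*.so files and SFML_LIB_PATH from the question):
link_directories(${SFML_LIB_PATH})
add_library(native-lib SHARED src/main/cpp/native-lib.cpp)
target_link_libraries(native-lib
    log
    sfml-system sfml-window sfml-graphics sfml-audio sfml-activity)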
The process can be a little long; it will depend on your Android skills. An example could be similar to this process:
Cross-compile SFML.
Create your JNI bridge.
Generate the project with CMake and compile.
Copy your files to Android Studio and create the Java library-loading code.
I assume that you have cross-compiled SFML and know how the cross-compiling process works; if I am wrong, check these links below:
Tutorial:
https://github.com/SFML/SFML/wiki/Tutorial:-Building-SFML-for-Android
Source code:
https://github.com/SFML/SFML
Toolchain:
https://github.com/SFML/SFML/blob/master/cmake/toolchains/android.toolchain.cmake
Changes to your CMake: add this line:
find_package(SFML REQUIRED)
In CMake, point it at your SFML build directory and CMake will fill in your variables automatically, for instance variables like these (see the sketch after the list):
set(SFML_PATH ${ANDROID_NDK}/sources/sfml)
set(SFML_LIB_PATH ${SFML_PATH}/lib/${ANDROID_NDK_ABI_NAME})
set(SFML_LIB_SYSTEM ${SFML_LIB_PATH}/libsfml-system.so)
set(SFML_LIB_AUDIO ${SFML_LIB_PATH}/libsfml-audio.so)
set(SFML_LIB_GRAPHICS ${SFML_LIB_PATH}/libsfml-graphics.so)
set(SFML_LIB_NETWORK ${SFML_LIB_PATH}/libsfml-network.so)
set(SFML_LIB_WINDOW ${SFML_LIB_PATH}/libsfml-window.so)
set(SFML_LIB_ACTIVITY ${SFML_LIB_PATH}/libsfml-activity.so)
set(SFML_LIB_MAIN ${SFML_LIB_PATH}/libsfml-main.a)
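A hedged sketch of that approach, using the FindSFML.cmake module shipped with SFML 2.x instead of hard-coded .so paths (the module path and install location below are only examples, and the native-lib target is the one from the question):
list(APPEND CMAKE_MODULE_PATH "/path/to/SFML/cmake/Modules")  # example path to FindSFML.cmake
set(SFML_ROOT "${ANDROID_NDK}/sources/sfml")                  # where the Android SFML build was installed
find_package(SFML 2 REQUIRED system window graphics audio network)
include_directories(${SFML_INCLUDE_DIR})
target_link_libraries(native-lib ${SFML_LIBRARIES} log)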
There are two ways to make android studio native apps:
Easy way:
Create the JNI bridge:
Cross-compile with your CMake script and copy your lib to app/src/main/jniLibs
Load the library at execution time
code:
try
{
    Log.v(LOG_TAG, "adding your library");
    System.loadLibrary("your_library");  // library name without the "lib" prefix and ".so" suffix
}
catch(UnsatisfiedLinkError e)
{
    Log.e(LOG_TAG, e.getMessage());
}
More complete way (it allows debugging the library):
Create your NDK module in Gradle, for example:
android.ndk {
    moduleName = "your_library"
    cppFlags.add("-fexceptions")
    //cppFlags.add("-std=c++11")
    //cFlags.add("-fopenmp")
    cppFlags.add("-I" + file("src/main/jni").absolutePath)
    stl = "gnustl_shared" // Which STL library to use: gnustl or stlport
    ldLibs.addAll(["android", "EGL", "GLESv2", "dl", "log", "z"])
    String libsDir = curDir.absolutePath + "/src/main/jniLibs/armeabi/"
    ldLibs.add(libsDir + "your_native_lib.so")
}
I am trying to use MSVC and MSVC_VERSION in a toolchain file:
cmake-toolchain-file.cmake
message (STATUS "Toolchain MSVC=${MSVC} MSVC_VERSION=${MSVC_VERSION}")
CMakeLists.txt
message (STATUS "Project MSVC=${MSVC} MSVC_VERSION=${MSVC_VERSION}")
Output is
> cmake .. -DCMAKE_TOOLCHAIN_FILE=../cmake/cmake-toolchain-file.cmake
-- Toolchain MSVC= MSVC_VERSION=
-- Project MSVC=1 MSVC_VERSION=1800
-- Bulding target for: windows-x86
-- Configuring done
-- Generating done
-- Build files have been written to: D:/test/build
Is it normal behavior that I get MSVC and MSVC_VERSION from toolchain file empty?
Is it normal behavior that I get MSVC and MSVC_VERSION from toolchain file empty?
Yes. These variables are initialized by the project() command, which is also where the toolchain file is read. Try adding some extra messages:
cmake_minimum_required(VERSION 3.0)
message(STATUS "Before project: MSVC(${MSVC})")
project(Foo)
message(STATUS "After project: MSVC(${MSVC})")
Result:
-- Before project: MSVC()
-- Toolchain MSVC()
-- Toolchain MSVC()
...
-- After project: MSVC(1)
-- Configuring done
-- Generating done
So I guess you have one hidden question in mind:
Is it confusing?
Yes, for the Visual Studio generators. Since you can usually set critical stuff like the path to the compiler in the toolchain file, it doesn't make sense to read compiler-related variables like CMAKE_CXX_COMPILER_ID before the toolchain is processed, i.e. before the project() command. But when you set the generator to Visual Studio, your compiler is always MSVC. This gives you a hint for a workaround: just check the CMAKE_GENERATOR variable in the toolchain file instead of MSVC_VERSION (though this will not work for the NMake generator, of course):
if("${CMAKE_GENERATOR}" STREQUAL "Visual Studio 12 2013")
message("Toolchain: MSVC 1800")
endif()
if("${CMAKE_GENERATOR}" STREQUAL "Visual Studio 9 2008")
message("Toolchain: MSVC 1500")
endif()
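If you don't want to hard-code each generator name, a variation on the same idea (matching the generator prefix) also works:
if(CMAKE_GENERATOR MATCHES "Visual Studio")
    message(STATUS "Toolchain: compiler will be MSVC")
endif()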
I am trying to generate some Boost 1.58 libraries that I need (chrono, regex and thread) for Visual Studio 2012 and link the libraries with CMake. I am having real problems getting CMake and Visual Studio to find or link the libs, depending on the configuration I set.
I am finally using the following configuration:
bjam.exe --link=static --threading multi --variant=debug stage
But this doesn't seem to generate static libs.
How should I generate the libs and search them with CMake in order for Visual Studio to link them properly?
I finally came up with the solution and I think it is detailed enough to become a generic answer.
Visual Studio searches for dynamic libraries, so we need to compile the Boost libraries as shared, multithreaded, debug and release, with runtime-link shared. On Windows, using bjam.exe, all the options have the prefix "--" except link, so the right way to build the libraries is:
bjam.exe link=shared --threading=multi --variant=debug --variant=release --with-chrono --with-regex --with-thread stage
This will generate the libs and DLLs in the folder Boost/stage/lib, so we need to set an environment variable such as Boost_LIBRARY_DIR to C:/Boost/stage/lib, for example (see also the note just below).
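A hedged note: CMake's FindBoost also accepts BOOST_ROOT and BOOST_LIBRARYDIR as hints, either as environment variables or as CMake variables, so an alternative is something like (paths are examples):
set(BOOST_ROOT "C:/Boost")
set(BOOST_LIBRARYDIR "C:/Boost/stage/lib")
find_package(Boost COMPONENTS thread chrono regex REQUIRED)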
There are more options that may come in handy:
runtime-link=shared/static
toolset=msvc-11.0
The libraries will have a name like this for release:
boost_chrono-vc110-mt-1_58.lib
boost_chrono-vc110-mt-1_58.dll
And for debug:
boost_chrono-vc110-mt-gd-1_58.lib
boost_chrono-vc110-mt-gd-1_58.dll
In order for CMake to link them properly we need to write the following in the CMakeLists.txt:
add_definitions( -DBOOST_ALL_DYN_LINK ) # if not, VS will give linking errors about redefinitions
set(Boost_USE_STATIC_LIBS OFF )
set(Boost_USE_MULTITHREADED ON)
set(Boost_USE_STATIC_RUNTIME OFF)
find_package(Boost COMPONENTS thread chrono regex REQUIRED )
INCLUDE_DIRECTORIES(${Boost_INCLUDE_DIRS})
TARGET_LINK_LIBRARIES( ${PROJ_NAME} ${Boost_LIBRARIES} )
bjam.exe --link=static --threading multi --variant=debug stage
But this doesn't seem to generate static libs.
Building the special stage target places Boost library binaries in the stage\lib\ subdirectory of the Boost tree.
More about building Boost on Windows here
CMake:
SET (CMAKE_BUILD_TYPE Debug) # in order to link with boost debug libs you may need to set that to build your program in debug mode (or do that from command line)
FIND_PACKAGE (Boost 1.58 COMPONENTS "chrono" "regex" "thread" REQUIRED)
#ADD_DEFINITIONS(-DBOOST_ALL_DYN_LINK) # make sure you don't have this as it will try to link with boost .dll's
INCLUDE_DIRECTORIES(${Boost_INCLUDE_DIRS})
LINK_DIRECTORIES(${Boost_LIBRARY_DIRS})
TARGET_LINK_LIBRARIES(${EXE_OR_LIB_NAME} ${Boost_LIBRARIES})
I downloaded some source code from codeproject.com.
In the zip file, only .cpp and .hpp files were present, but no .dsw (VC++ workspace) file.
How do I compile the files in VC++?
Source code link: http://www.codeproject.com/Articles/13852/BasicExcel-A-Class-to-Read-and-Write-to-Microsoft
As far as I know, VC++ requires a .dsw. That said, you can easily create one by creating a project + workspace and importing all the files into the project.
Unless the code requires special compile options (unlikely if it comes without a project file), this should work fine.
When you compile in Visual C++, cl.exe is executed in order to compile a program.
So a way to compile that does not require a .dsw file is to run the compilation from the command line. This is an example for a source file main.cpp:
call vsvars32.bat && cl /MD /Od /EHsc /GR /Zi /Oy- main.cpp
On my computer the full path to vsvars32.bat is
C:\Program Files (x86)\Microsoft Visual Studio 9.0\Common7\Tools\vsvars32.bat
.dsw is an ancient format that has been replaced by .sln and .vcxproj files. What I would suggest is File -> New -> Project from Existing Code: browse to the directory with the cpp/hpp files and let Visual Studio create the project with default compiler switches, so you won't have to run cl.exe from the command line as suggested here.
That being said, the chances of circa-2006 C++ code working in an IDE four versions later are small. You will almost certainly have to understand and fix the code, or find a more recent project that does what you need.