This new cmake module makes it possible to generate CMake package files. This may
ease porting for CMake projects that export CMake package information for other
dependent projects. The module uses the templates provided by the CMake
installation wherever possible (so CMake needs to be installed).
A custom_target with install set to true is always built by default,
even if build_by_default is explicitly set to false.
Ensure that an explicit build_by_default value is respected. To keep
backward compatibility, if build_by_default is not set explicitly and
install is true, build_by_default still defaults to true.
Fixes #4107
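A minimal sketch of the intended precedence, in Python; the function name and
the kwargs dict are illustrative stand-ins, not the actual Meson interpreter code:

    def resolve_build_by_default(kwargs):
        # kwargs stands in for the custom_target() keyword arguments.
        if 'build_by_default' in kwargs:
            # An explicit value always wins, even when install is true.
            return kwargs['build_by_default']
        # Backward compatibility: installed targets are built by default.
        return bool(kwargs.get('install', False))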
* Fixed spelling
* Merged the Buildoptions and Projectinfo interpreter
* Moved detect_compilers to Environment
* Added back the removed test case
* Split detect_compilers and moved even more code into Environment
* Moved set_default_options to coredata
* Small code simplification in mintro.run
* Move cmd_line_options back to `environment`
We don't actually wish to persist something this unstructured, so we
shouldn't make it a field on `coredata`. It would also be data
denormalization since the information we already store in coredata
depends on the CLI args.
The returned not-found object could be of any type, because we were
returning the first of the failed attempts. It can also happen that we
have no dependency object at all, in which case we should return a
NotFoundDependency() object as well instead of raising an exception.
That exception was raised before too, but dependency_impl() was
calling find_external_dependency() inside a try block, so it was hidden.
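A self-contained sketch of the intended behaviour, with a stub class standing
in for Meson's real NotFoundDependency and an illustrative lookup loop:

    class NotFoundDependency:
        # Stub standing in for Meson's real class; only found() matters here.
        def found(self):
            return False

    def find_external_dependency(name, candidates):
        # candidates: callables that each try one lookup method and return
        # a dependency-like object.
        for make_dep in candidates:
            dep = make_dep()
            if dep.found():
                return dep
        # No candidate succeeded, or there were none at all: hand back a
        # uniform not-found object instead of the first failure or an
        # exception.
        return NotFoundDependency()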
When req_version is None (e.g. in the pcap-config case), the None ends up
printed in the logs.
Take this opportunity to reformat the message to look more like
ExternalProgram messages.
No need to catch exceptions in dependency_fallback(); they are already
handled in do_subproject(). This ensures subproject errors are handled
the same way when doing a dependency() fallback as when calling
subproject() directly.
Otherwise Python gets confused and that makes testing difficult.
Also minimally emulate the behaviour of the normal object to keep the rest
of the code happy.
If a configure_file has install_dir set, the supported install
argument is ignored, while it should actually have higher priority
than install_dir itself.
Also check that the correct types are used for `install` and `install_dir`.
Add a test to verify this.
Fixes #3983
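A rough sketch of the intended priority and type checks; the helper name and
the kwargs handling are illustrative, not the actual configure_file() implementation:

    def should_install(kwargs):
        install = kwargs.get('install', None)
        install_dir = kwargs.get('install_dir', '')
        if install is not None and not isinstance(install, bool):
            raise TypeError('"install" must be a boolean')
        if not isinstance(install_dir, str):
            raise TypeError('"install_dir" must be a string')
        if install is not None:
            return install            # an explicit install: value wins
        return install_dir != ''      # otherwise follow install_dir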
On OpenBSD, the main PowerPC machine is known as macppc, while the processor
family is powerpc:
$ uname -{m,p}
macppc powerpc
$ echo 'import platform; print (platform.machine())' | python3
macppc
$ echo 'import platform; print (platform.processor())' | python3
powerpc
This allows e.g. GLib's configure to properly detect that CPU:
Build machine cpu family: ppc
Build machine cpu: macppc
whereas before it failed with: WARNING: Unknown CPU family 'macppc' <snip>
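An illustrative Python sketch of the detection idea; the real table in Meson's
environment code covers far more cases:

    import platform

    def detect_cpu_family():
        trial = platform.machine().lower()
        if trial == 'macppc':
            # OpenBSD/macppc: fall back to the processor name, 'powerpc'.
            trial = platform.processor().lower()
        if trial in ('powerpc', 'ppc'):
            return 'ppc'
        return trial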
While trying to cross-compile for Windows from Linux (WSL), with
paths like this:
'-L/mnt/c/Program Files (x86)/Microsoft Visual Studio/2017/\
Community/VC/Tools/MSVC/14.15.26726/lib/x64'
I found that the spaces and brackets in the paths weren't properly
escaped by the Ninja backend.
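The Ninja backend has its own escaping rules, but the general requirement can
be illustrated with plain shell quoting in Python (the path below is shortened
for illustration):

    import shlex

    link_arg = '-L/mnt/c/Program Files (x86)/Microsoft Visual Studio/lib/x64'
    print(shlex.quote(link_arg))
    # '-L/mnt/c/Program Files (x86)/Microsoft Visual Studio/lib/x64'
    # Single-quoted, so the spaces and parentheses survive the shell.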
This adds a hidden option to dump the otherwise hidden persistent
state in coredata.dat.
This interface is unstable, as Meson makes no compatibility promises about
coredata.dat.
When Python sees an invalid character in a filename for the current locale,
instead of clobbering it, it saves it as an invalid codepoint called a
surrogate. We need to explicitly instruct the encoder to write those out
as-is. In the JSON file, we replace them instead to produce valid JSON.
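A minimal Python sketch of the approach (the helper name and file paths are
illustrative): filenames decoded with surrogateescape are written back the
same way, while the JSON output replaces the surrogates so it stays valid.

    import json

    def write_outputs(filenames, list_path, json_path):
        # Plain text list: pass the surrogates through unchanged.
        with open(list_path, 'w', encoding='utf-8',
                  errors='surrogateescape') as f:
            f.write('\n'.join(filenames) + '\n')
        # JSON: replace unencodable surrogates instead of failing.
        with open(json_path, 'w', encoding='utf-8', errors='replace') as f:
            json.dump(filenames, f, ensure_ascii=False)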