Sometimes, distros want to configure a project so that it does not
use any bundled library. In this case, meson.build might want
to do something like this, where slirp is a combo option
with values auto/system/internal:
    slirp = dependency('', required: false)
    if get_option('slirp') != 'internal'
      slirp = dependency('slirp',
                         required: get_option('slirp') == 'system')
    endif
    if not slirp.found()
      slirp = subproject('libslirp', ...).variable('...')
    endif
and we cannot use "fallback" because the "system" value should never
look for a subproject.
This worked until 0.54.x, but in 0.55.x this breaks because of the
automatic subproject search. Note that the desired effect here is
backwards compared to the policy of doing an automatic search on
"required: true"; we only want to do the search if "required" is false!
It would be possible to look for the dependency with `required: false`
and issue the error manually, but it's ugly and it may produce an error
message that looks "different" from Meson's.
Instead, with this change it is possible to achieve this effect in an
even simpler way:
    slirp = dependency('slirp',
                       required: get_option('slirp') != 'auto',
                       allow_fallback: get_option('slirp') == 'system' ? false : ['slirp', 'libslirp_dep'])
The patch also adds support for "allow_fallback: true", which is
simple and enables automatic fallback to a wrap even for non-required
dependencies.
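For example, a non-required dependency can now opt in to the automatic
wrap fallback explicitly (a minimal sketch; 'foo' is just a placeholder
name):

    foo_dep = dependency('foo', required: false, allow_fallback: true)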
Force_fallback is not an interpreter keyword argument, and there
is no reason to handle it as one since it is not used anywhere
else (and in fact is explicitly ignored by get_dep_identifier).
Use a Python keyword argument instead, which makes the code
simpler.
Those functions are a common source of issues when used in a subproject
because they point to the parent project root, which is rarely what is
expected and is a violation of subproject isolation.
This means that, in the common case of a simple meson.build which
doesn't contain any 'native: true' targets, we won't require a native
compiler when cross-compiling, without needing any changes in the
meson.build.
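As a sketch of the common case (names are illustrative), a build file
like the following no longer requires a native compiler when
cross-compiling; adding 'native: true' to the target would still
require one:

    project('hello', 'c')
    executable('hello', 'hello.c')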
When that statement gets evaluated, the interpreter remembers the
version target, and if it was part of the evaluation of an `if`
condition then the target meson version is temporarily overridden
within that if-block.
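A minimal sketch of the pattern this enables, assuming the statement in
question is a meson.version().version_compare() check:

    if meson.version().version_compare('>=0.55')
      # within this block the target meson version is treated as 0.55,
      # so 0.55 features can be used here without a warning
    endif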
Fixes: #7590
- Exceptions raised during subproject setup were ignored.
- Allow c_stdlib in native file; it was already half supported (see the
sketch after this list).
- Eliminate usage of subproject variable name by overriding
'<lang>_stdlib' dependency name.
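A minimal sketch of setting c_stdlib in a native machine file, assuming
the value is simply the name of the subproject providing the stdlib
('mylibc' is a placeholder):

    [properties]
    c_stdlib = 'mylibc'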
This creates a full set of options in environment that mirrors those in
coredata. This mirroring of the coredata structure is convenient because
lookups in env (such as when initializing compilers) become a straight
dict lookup, with no list iteration. It also means that all of the
command-line and machine-file options are read and stored in the correct
order before they're ever accessed, simplifying the logic of using them.
This is like the project options, but for meson builtin options.
The only real differences here have to do with the differences between
meson builtin options and project options. Some meson options can be set
on a per-machine basis (build.pkg_config_path vs pkg_config_path);
others can be set on a per-subproject basis but should inherit the
parent setting.
The 'native:' keyword was only added in 0.54. For projects declaring
meson_version >= 0.54, warn, because those projects can and should set
the keyword. For projects declaring support for older versions, don't
warn and use the default implicitly.
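A minimal sketch of setting the keyword explicitly, assuming this is
the 'native:' keyword of add_languages():

    add_languages('cpp', native: false, required: false)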
Fixes https://github.com/mesonbuild/meson/issues/6849.
We don't need the legacy variable name system used for dependency()
fallbacks, because meson.override_find_program() is already widely used,
so we can just rely on it.
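A minimal sketch of the override mechanism in a subproject (names are
illustrative):

    mygen = find_program('mygen.py')
    meson.override_find_program('mygen', mygen)

The parent project can then simply call find_program('mygen') and pick
up the subproject's program without referring to any subproject
variable.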
This fixes the following common pattern, where we don't want to
implicitly fall back on the first line:

    foo_dep = dependency('foo', required: false)
    if not foo_dep.found()
      foo_dep = cc.find_library('foo', required: false)
      if not foo_dep.found()
        foo_dep = dependency('foo', fallback: 'foo')
      endif
    endif
When meson is being run with a python that seems to have been
installed from the Windows Store, replace the general WindowsApps
directory in search paths with dirname(sys.executable), and also handle
failures with pathlib.resolve on WindowsApps exe files.
This is done with the new cmake subproject options object
that is similar to the already existing configuration data
object. It is consumed by the new `options` kwarg of the
cmake.subproject function.
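A minimal sketch of the intended usage, assuming the options object is
created with cmake.subproject_options() (subproject and define names
are placeholders):

    cmake = import('cmake')
    opts = cmake.subproject_options()
    opts.add_cmake_defines({'SOME_DEFINE': 'value'})
    sub = cmake.subproject('mysub', options: opts)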
D lang compilers have an option -release (or similar) which turns off
asserts, contracts, and other runtime type checking. This patch wires
that up to the b_ndebug flag.
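A minimal sketch of one way to request this behaviour from a project's
default options (the project name is a placeholder):

    project('demo', 'd', default_options: ['b_ndebug=true'])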
Fixes #7082
Ideally we wouldn't need to have the defaultdict here and could just
rely on it being set as soon as project is called. There is a corner
case exercised by test case common/35 run program, which is that if a
FeatureNew or FeatureDeprecated is called to generate the meson version,
it will be unset; to work around this I've changed the type from a dict
to a defaultdict with '' as the default value.
A better fix would probably be to store all of the
FeatureNew/FeatureDeprecated checks until the end, then evaluate them,
but for now this results in no loss of functionality, only more
functionality, even if it isn't perfect.
This gives the version that the feature was deprecated in, and doesn't
print the warning if the project supports versions of meson in which
the feature wasn't deprecated.
The implementation of this function has changed enough that the name
doesn't really reflect what it actually does. It basically returns true
unless you're cross compiling, need an exe_wrapper, and don't have one.
The original function remains but is marked as deprecated.
This makes one small change to the meson source language, which is that it
defines that can_run_host_binaries will return true in build == host
compilation, which was the behavior that already existed. Previously
this was undefined in build == host compilation.
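A minimal sketch of how the renamed function is used in a build file:

    if meson.can_run_host_binaries()
      # binaries built for the host machine can be executed here,
      # either natively or through an exe_wrapper
    endif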
Gtest can output JUnit results with a command line switch. We can parse
this to get more detailed results than the return code, and put those in
our own JUnit output. We basically just throw away the top-level
'testsuites' object, then fix up the names of the tests, and shove that
into our JUnit report.
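A minimal sketch of opting a test into this handling, assuming it goes
through the 'gtest' test protocol (target and dependency names are
illustrative):

    foo_test = executable('foo_test', 'foo_test.cpp',
                          dependencies: dependency('gtest'))
    test('foo_test', foo_test, protocol: 'gtest')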