In most cases, instead pass `for_machine`, the name of the relevant
machine (what the compiler targets, what the target runs on, etc.). This allows
us to use the cross code path in the native case, deduplicating the
code.
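As a rough illustration of the idea (class and method names here are simplified stand-ins, not Meson's actual internals), per-machine data is keyed by a machine choice so the same lookup serves both cross and native builds:

```python
import enum
from typing import Dict, List

class MachineChoice(enum.Enum):
    """Which machine something is 'for' (simplified; not Meson's real class)."""
    BUILD = 0   # the machine doing the compiling
    HOST = 1    # the machine the compiled output runs on

class Environment:
    """Keeps per-machine data keyed by MachineChoice instead of a cross/native flag."""
    def __init__(self) -> None:
        self.binaries: Dict[MachineChoice, Dict[str, List[str]]] = {
            MachineChoice.BUILD: {'c': ['cc']},
            MachineChoice.HOST: {'c': ['cc']},
        }

    def lookup_binary(self, name: str, for_machine: MachineChoice) -> List[str]:
        # Native builds simply have identical BUILD and HOST tables, so the
        # same ("cross") code path serves both cases.
        return self.binaries[for_machine][name]

env = Environment()
env.binaries[MachineChoice.HOST]['c'] = ['arm-linux-gnueabihf-gcc']  # a cross setup
print(env.lookup_binary('c', MachineChoice.HOST))   # compiler for the host machine
print(env.lookup_binary('c', MachineChoice.BUILD))  # compiler for the build machine
```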
As one can see, environment got bigger as more information is kept
structured there, while ninjabackend got smaller. Overall a few
lines were added, but the hope is that what's added is a lot simpler than
what's removed.
Since gpgme 1.13.0, pkg-config files are available, and this is the
preferred way to detect the dependency. Without this, projects that wish
to generate pkg-config files that list gpgme in Requires.private now have
their custom dependency() fallbacks overridden with an incorrect
configtool dependency.
Instead of special-casing threads in the backends and
everywhere else, do what we did for OpenMP and create a real
dependency. Then make use of the fact that dependencies can now have
sub-dependencies to add threads.
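A minimal sketch of that shape (attribute and class names are illustrative, not Meson's exact internals): threads becomes an ordinary dependency object, and other dependencies pull it in as a sub-dependency instead of every backend special-casing it:

```python
from typing import List

class Dependency:
    """Simplified stand-in for Meson's dependency objects."""
    def __init__(self, name: str, compile_args=None, link_args=None, ext_deps=None) -> None:
        self.name = name
        self.compile_args: List[str] = compile_args or []
        self.link_args: List[str] = link_args or []
        self.ext_deps: List['Dependency'] = ext_deps or []   # sub-dependencies

    def get_link_args(self) -> List[str]:
        # Link args of a dependency include those of its sub-dependencies.
        args = list(self.link_args)
        for dep in self.ext_deps:
            args += dep.get_link_args()
        return args

# Threads as a real dependency instead of a backend special case ...
threads = Dependency('threads', compile_args=['-pthread'], link_args=['-pthread'])
# ... which other dependencies can simply list as a sub-dependency.
some_lib = Dependency('somelib', link_args=['-lsomelib'], ext_deps=[threads])
print(some_lib.get_link_args())  # ['-lsomelib', '-pthread']
```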
Currently we special-case OpenMP like we do threads, with a special
`need_openmp` method. This seems like a great idea, but doesn't work
out in practice, and it complicates the OpenMP
implementation. If GCC is built without OpenMP support, for example, we
still add -fopenmp to the command line, which results in
compilation errors.
This patch discards that approach and treats OpenMP like a normal dependency,
removes the need_openmp() method, and sets the compile_args attribute
from the compiler.
Fixes #5115
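A sketch of the new shape (helper names are illustrative, not the real Meson code): instead of backends unconditionally appending -fopenmp, the dependency asks the compiler for its OpenMP flags and fails detection if the compiler has none:

```python
class OpenMPDependencySketch:
    """Illustrative only: models the 'normal dependency' approach to OpenMP."""
    def __init__(self, compiler) -> None:
        # openmp_flags() is assumed to return e.g. ['-fopenmp'], or [] if the
        # compiler was built without OpenMP support.
        flags = compiler.openmp_flags()
        self.is_found = bool(flags)
        self.compile_args = list(flags)
        self.link_args = list(flags)

class FakeGcc:
    def openmp_flags(self):
        return ['-fopenmp']  # would be [] for a GCC built without OpenMP

dep = OpenMPDependencySketch(FakeGcc())
print(dep.is_found, dep.compile_args)  # True ['-fopenmp']
```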
For dynamic linking, some mingw releases don't link
correctly with pythonXX.lib in all cases.
This patch forces mingw to link against
pythonXX.dll instead of the .lib file, which has
better compatibility.
Note that the old msys 1.0 platform is detected
as 'windows' instead of 'mingw'.
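A rough sketch of the kind of selection involved (paths and names are illustrative only, not the actual patch):

```python
import os
import sys
import sysconfig

def python_link_target(is_mingw: bool) -> str:
    """Choose what a Windows toolchain links against (illustrative sketch only)."""
    ver = sysconfig.get_config_var('py_version_nodot')  # e.g. '311'
    if is_mingw:
        # Some mingw releases don't link correctly against pythonXX.lib,
        # so point them at the DLL itself instead.
        return os.path.join(sys.base_prefix, f'python{ver}.dll')
    return os.path.join(sys.base_prefix, 'libs', f'python{ver}.lib')
```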
Instead of only doing a naive filesystem search, also run the linker
so that it can tell us whether the -F path specified actually contains
the framework we're looking for.
Unfortunately, `extraframework` searching is still not 100% correct,
since we want to search in either /Library/Frameworks or
in /System/Library/Frameworks but not in both. The -Z flag disables
searching in those prefixes and would in theory allow this, but then
you cannot force the linker to look in them by manually adding -F
args, so that doesn't work.
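A hedged sketch of the verification step (the helper and command names are placeholders, not the actual implementation): once the filesystem search finds a candidate -F path, try to link a trivial program against the framework from that path and only accept the path if the link succeeds:

```python
import os
import subprocess
import tempfile

def framework_links(compiler_cmd, framework: str, path: str) -> bool:
    """Return True if the linker can actually find `framework` under `path` (sketch)."""
    with tempfile.TemporaryDirectory() as d:
        src = os.path.join(d, 'check.c')
        with open(src, 'w') as f:
            f.write('int main(void) { return 0; }\n')
        cmd = compiler_cmd + [src, '-F' + path, '-framework', framework,
                              '-o', os.path.join(d, 'check')]
        return subprocess.run(cmd, capture_output=True).returncode == 0

# Usage sketch: keep only the -F paths the linker itself accepts, e.g.
# framework_links(['clang'], 'OpenGL', '/System/Library/Frameworks')
```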
Somehow I overlooked this class when adjusting everything else in c59ec874
et seq.
Update Python3Dependency so it has a _factory() method which returns a list
of constructors, depending on the method: kwarg, which integrates better
with the reporting in find_external_dependency()
This ensures the return of a PkgConfigDependency object, if the dependency
is found with pkgconfig, so we don't have to specifically implement
get_pkgconfig_variable() ourselves.
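Sketching the pattern (simplified; the real signatures differ): _factory() returns partially-applied constructors chosen by the method kwarg, and the caller decides which ones to actually invoke:

```python
import functools

class PkgConfigDepSketch:
    def __init__(self, name, env, kwargs):
        self.is_found = False  # real detection would shell out to pkg-config

class SysconfigDepSketch:
    def __init__(self, env, kwargs):
        self.is_found = True   # real detection would inspect sysconfig

def python3_factory(env, kwargs):
    """Return a list of bound constructors, filtered by the method kwarg (sketch)."""
    method = kwargs.get('method', 'auto')
    candidates = []
    if method in ('auto', 'pkg-config'):
        # functools.partial binds the arguments now, but nothing is
        # detected until the caller invokes the constructor.
        candidates.append(functools.partial(PkgConfigDepSketch, 'python3', env, kwargs))
    if method in ('auto', 'sysconfig'):
        candidates.append(functools.partial(SysconfigDepSketch, env, kwargs))
    return candidates

candidates = python3_factory(env=None, kwargs={'method': 'auto'})
deps = [make() for make in candidates]   # only now do the constructors run
```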
- If successful, we should identify the method which was successful
- If successful, we should report the version found (if known)
- If failing, we should identify the methods we tried
- Some dependency detectors which had no reporting now gain it
There are all kinds of complexities, inconsistencies and special cases hidden
in the existing behaviour, e.g.:
- boost reports modules requested, and BOOST_ROOT (if set)
- gtest/gmock report if they are a prebuilt library or header only
- mpi reports the language
- qt reports modules requested, and the config tool used or tried
- configtool reports the config tool used
- llvm reports if missing modules are optional (one per line)
We add some simple hooks to allow the dependency object to expose the
currently reported information to the consolidated reporting.
Note that PkgConfigDependency() takes a silent: keyword which is used
internally to suppress reporting. This behaviour isn't needed in
find_external_dependency().
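A minimal sketch of what such a hook can look like (method name chosen for illustration): the dependency keeps whatever extra detail it wants to surface, and the consolidated reporting simply asks for it:

```python
class GtestDepSketch:
    """Illustrative only: a dependency that exposes extra info for reporting."""
    def __init__(self, prebuilt: bool) -> None:
        self.is_found = True
        self.prebuilt = prebuilt

    def log_info(self) -> str:
        # Extra detail the consolidated reporter appends after "found"/"not found".
        return 'prebuilt' if self.prebuilt else 'building self'

dep = GtestDepSketch(prebuilt=False)
print(f"Dependency gtest found: YES ({dep.log_info()})")
```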
Give ConfigToolDependency() a finish_init callback, so that tool-specific
initialization can be called from the constructor, rather than after
construction in the factory class.
v2:
finalize -> finish_init for clarity
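Sketched minimally (names illustrative, not the real class): the constructor accepts an optional callable and runs it once the generic detection work is done, so subclasses no longer need a second initialization step after construction:

```python
class ConfigToolDepSketch:
    """Illustrative: run tool-specific setup from the constructor itself."""
    def __init__(self, name, finish_init=None):
        self.name = name
        self.is_found = True          # pretend the config tool was located
        self.compile_args = []
        if finish_init is not None:
            finish_init(self)         # tool-specific initialization, still inside __init__

def _llvm_finish(dep):
    dep.compile_args = ['-I/usr/lib/llvm/include']  # illustrative value

llvm = ConfigToolDepSketch('llvm-config', finish_init=_llvm_finish)
print(llvm.compile_args)
```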
find_external_dependency() now makes and iterates over a list of callables
which are constructors with bound arguments for the dependency objects we
are going to attempt to make, so we can consolidate reporting on these
attempts and handle failures in that function.
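A sketch of that loop (simplified error handling and logging; not the actual function body): each candidate constructor is tried in turn, failures are collected, and a single consistent report is produced:

```python
def find_external_dependency_sketch(name, candidates, required=True):
    """Try each bound constructor in turn and consolidate reporting (sketch)."""
    tried = []
    for make_dep in candidates:
        try:
            dep = make_dep()
        except Exception as e:          # a failed detection attempt
            tried.append(str(e))
            continue
        if dep.is_found:
            version = getattr(dep, 'version', None) or 'unknown'
            print(f'Dependency {name} found: YES ({version})')
            return dep
        tried.append(type(dep).__name__)
    print(f'Dependency {name} found: NO (tried {", ".join(tried) or "nothing"})')
    if required:
        raise RuntimeError(f'Dependency "{name}" not found')
    return None
```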
We now pass the current subproject to every FeatureNew and
FeatureDeprecated call. This requires a bunch of rework to:
1. Ensure that we have access to the subproject in the list of
   arguments when used as a decorator (see _get_callee_args, and the
   sketch after this list).
2. Pass the subproject to .use() when it's called manually.
3. Note that we can't do feature checks for new features in
   meson_options.txt because that's parsed before we know the
   meson_version from project().
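As a rough sketch of point 1 (simplified; the real _get_callee_args handles several call shapes): the decorator digs the subproject out of the wrapped call's arguments so the check is recorded against the right project:

```python
import functools

def feature_new_sketch(feature: str, version: str):
    """Illustrative decorator: looks up the subproject from the callee's kwargs."""
    def decorator(func):
        @functools.wraps(func)
        def wrapped(*args, **kwargs):
            # In the real code, _get_callee_args knows where each wrapped
            # function keeps its subproject; here we just assume a kwarg.
            subproject = kwargs.get('subproject', '')
            print(f'Feature {feature} (since {version}) used in subproject {subproject!r}')
            return func(*args, **kwargs)
        return wrapped
    return decorator

@feature_new_sketch('dict kwargs', '0.49.0')
def func_example(*args, subproject='', **kwargs):
    return kwargs

func_example(subproject='mysub')
```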
D is not a 'c-like' language, but it can link to C libraries. The same
might be true of Rust in the future, and of Go when we add support for it.
This contains no functionality changes.