In commit 89146e84c9 we added some
complicated code to verify the llvm framework's "combination" matrix
lookup. It expects to find llvm with both cmake and config-tool, with
the same version. But the sanity check is wonky: it only checks that both
lookups report the same found status, so if neither is found we proceed
to convert the string "unknown" into a mapping of semver integers, which
is guaranteed to fail.
This can happen for example if the system llvm exists in the general
case, but actual modules cannot be found because the system llvm does
not distribute static modules. For example, this is the case on Gentoo.
Abort more obviously by just insisting that both be found. If they
aren't both found, anyone investigating knows to start by looking at why
they weren't found.
This issue was encountered while working on a contribution to nixpkgs.
Nix allows the store to be installed on a separate, case-sensitive APFS
volume. When the store is on a case-sensitive volume, these tests fail
because they try to use `foundation` instead of `Foundation`.
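The lookup in question is presumably something along these lines (a
minimal meson.build sketch; the exact dependency call is an assumption,
not copied from the tests):

```meson
# Framework names are matched case-sensitively on a case-sensitive
# volume, so the capitalized spelling is required.
foundation_dep = dependency('appleframeworks', modules : ['Foundation'])
```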
Currently you can only use one of qt4, qt5, or qt6 in a single project
when using a machine file, because the config-tool lookup for qt only
looks at `qmake` in the machine files instead of looking up the binary
names directly.
Allow specifying `qmake`, `qmake4`, `qmake5`, and `qmake6`.
This is necessary for gstreamer, which can build separate qt5 and qt6
plugins that are distributed as static libraries, so the user can pick
which one to use.
* DubDependency._check_dub returns the version
* check for a compatible Dub version
  Dub versions starting at 1.32 have a new cache structure in which
  Meson does not know where to find compatible artifacts
* skip D tests involving Dub
* refactor _check_dub
  makes mypy happier
* make linters happy
* localize some logic
This matches the tests for Python extensions.
Also include some other cleanups to these `meson.build` files:
- Adding `python_dep` is no longer needed; this is automatic now.
- Use a single line for `import('python').find_installation()`
  (see the sketch below), because the result of `import('python')`
  by itself is not used for anything.
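A minimal sketch of the resulting pattern (module and file names are
illustrative):

```meson
py = import('python').find_installation()
# no separate python_dep needed any more; the dependency is added
# automatically by the python module
py.extension_module('example', 'example.c', install : true)
```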
For all source `*.py` files installed via either py.install_sources() or
an `install_dir: py.get_install_dir()`, produce `*.pyc` files at install
time. Controllable via a module option.
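For illustration, both installation paths are covered, assuming the
module option in question is `python.bytecompile` (package and file
names are made up):

```meson
py = import('python').find_installation()
# sources installed through the python module get byte-compiled...
py.install_sources('mypkg/__init__.py', subdir : 'mypkg')
# ...and so do plain *.py files installed into the python install dir
install_data('extra.py', install_dir : py.get_install_dir() / 'mypkg')
```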
Case 1:
- Prog links to static lib A
- A uses link_whole on static lib B
- B links to static lib C
- Prog's dependencies should be A and C, but not B, which is already
  included in A.
Case 2 (see the meson.build sketch below):
- Same as case 1, but with A being installed.
- To be useful, A must also include all objects from C, which is not
  installed.
- Prog only needs to link against A.
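A meson.build sketch of the two cases (sources are placeholders):

```meson
lib_c = static_library('C', 'c.c')
lib_b = static_library('B', 'b.c', link_with : lib_c)
lib_a = static_library('A', 'a.c', link_whole : lib_b, install : true)
prog = executable('prog', 'main.c', link_with : lib_a)
# case 1: prog must link against A and C, but not B (already merged
#         into A)
# case 2: with A installed, A must also bundle C's objects, since C is
#         not installed; prog then only links against A
```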
On Windows, the SDL2 library is generally provided with only CMake config
files. This commit allows meson to fall back on CMake as a last resort to
find the SDL2 library.
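From a project's point of view nothing changes; a plain lookup simply
gains one more fallback (a sketch):

```meson
# on Windows this can now succeed via SDL2's CMake config files when
# the usual pkg-config / config-tool lookups find nothing
sdl2_dep = dependency('sdl2')
```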
import('python').find_installation('python').version() causes an
exception, because of a missing initialization, when `find_installation()`
receives a name or a path.
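A minimal reproducer (the interpreter name is just an example):

```meson
py = import('python').find_installation('python3')
# raised an exception before this fix due to the missing initialization
message(py.version())
```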
This allows changing the crate name with which a library ends up being
available inside the Rust code, similar to cargo's dependency renaming
feature or `extern crate foo as bar` inside Rust code.
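A sketch of how this could look, assuming the keyword argument ends up
named `rust_dependency_map` (all other names are made up):

```meson
foo_dep = dependency('foo-rs')
executable('app', 'main.rs',
  dependencies : foo_dep,
  # expose the 'foo' crate to the Rust code as 'bar', roughly the
  # equivalent of `extern crate foo as bar`
  rust_dependency_map : {'foo' : 'bar'},
)
```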
Rust has a `debug_assert!()` macro, which is designed to be toggled on
the command line. In cargo, it is on by default in debug builds and off
by default in release builds. This matches what meson's b_ndebug option
does in `if-release` mode.
If the optional first "mainlib" argument is present, then we infer several
values from it. Otherwise, some of those values fall back to a generic
default, and two of them -- name and description -- become mandatory.
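In meson.build terms (library name and description are illustrative,
and the two generate() calls are alternatives, not meant to coexist):

```meson
pkg = import('pkgconfig')
mylib = library('mylib', 'mylib.c')
# with a main library, name, description and friends are inferred
pkg.generate(mylib)
# without one, name and description must be given explicitly
pkg.generate(name : 'mylib', description : 'An example library')
```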
In commit e84f293f67, we removed
validation for description as part of refactoring that never actually
validated anything.
The paths in meson.build use / as the path separator; however, the paths
constructed during the directory-structure walk use native path
separators, so they never compare equal to the excluded ones.
Normalize the exclusion paths before the comparison.
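The sort of call affected, with illustrative names (exclusion entries
are spelled with '/', as usual in meson.build):

```meson
install_subdir('assets',
  install_dir : get_option('datadir') / 'myproject',
  exclude_directories : ['generated/cache'],
  exclude_files : ['notes/private.txt'],
)
```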
Rust doesn't have a concept of dependency compile arguments, i.e.
something like headers. Dependencies are linked in and all required
metadata is provided by the linker flags.
During evaluation of codeblocks, we start off with an iteration of
nodes, and then while evaluating them we may update the global
self.current_node context. When catching and formatting errors, we
didn't take into account that the node might be updated from the
original top-level iteration.
Switch to formatting errors using self.current_node instead, to ensure
we point at the most accurate likely cause of an error.
Also update the current node in a few more places, so that function
calls always see the function call as the current node, even if the most
recently parsed node was an argument to the function call.
Fixes #11643
Extend the "common/include order" test to ensure that the build
directory is preferred over the source directory. For example,
when using `configure_file()`, the resulting file should be
preferred over a file with the same name in the source directory.
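Roughly what the extended test exercises (names are illustrative):

```meson
conf = configuration_data()
conf.set('DEFINED_IN_BUILD_DIR', 1)
# the generated header in the build dir must shadow a stale header of
# the same name sitting in the source dir
configure_file(input : 'config.h.in', output : 'config.h',
  configuration : conf)
executable('ordertest', 'ordertest.c',
  include_directories : include_directories('.'))
```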
Allows getting closer to `./run_project_tests.py -- -Dwerror=true`.
- when argc and argv are not *both* used, there's a standard, compliant
  mechanism to mark the unused one as such
- generated code should not be built with -Werror
- more thoroughly comment out some commented code
Because we base the pickled data name on the name property of the
command being run... and for built targets, `exe.name` is always just
the name. However, for an ExternalProgram this is just whatever string
we searched for, so it is NOT necessarily just the basename.
This became a bigger issue once we started using generator() with the
actual program in commit 6aeec80836,
rather than first casting it to a string, because the VS backend
*always* uses the meson_exe approach for various reasons related to VS
being VS.
Outside of that, it's difficult to actually get an ExternalProgram
object passed to meson_exe -- CustomTarget lowers it to a string,
capture is handled via argparse instead of pickling, etc.
Fixes #11593
We will still try to load `meson_options.txt` if `meson.options` doesn't
exist. Because there are some advantages to using `meson.options` even
with older versions of meson (such as better text editor handling)
we will not warn about the existence of a `meson.options` file if a
`meson_options.txt` file or symlink also exists.
The name `meson.options` was picked instead of alternative proposals,
such as `meson_options.build`, for a couple of reasons:
1. `meson.options` is shorter
2. While the syntax is the same, only the `option()` function may be
called in `meson.options`, and it may not be called in `meson.build`
3. While the two files share a syntax and elementary types (strings,
arrays, etc), they have different purposes: `meson.build` declares
build targets, `meson.options` declares options. This is similar to
the difference between C's `.c` and `.h` extensions.
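For reference, a `meson.options` file uses exactly the same `option()`
syntax as `meson_options.txt` does today (the option shown is made up):

```meson
# meson.options
option('docs', type : 'feature', value : 'auto',
  description : 'Build the documentation')
```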
As an implementation detail `Interpreter.option_file` has been removed,
as it is used exactly once, in the `project()` call to read the options,
and we can just calculate it there and not store it.
Fixes: #11176
It was totally subproject-unsafe, and set a super bad example. This
is bad, because doxygen is annoying to get right and we occasionally
tell people to go use our example test case.
There is a fun nuance here that makes doxygen unpredictably work on
some versions and fail on others. Specifically, values must be quoted
in doxygen 1.8, but not in doxygen 1.9, or they break -- but only if the
output directory contains spaces. This was "fixed" in commit
ef91bacb7a
which actually caused it to act like an unquoted OUTPUT_DIRECTORY is not
provided at all, and then fixed for real in commit
eb3d1eb5ad
For portability, quote it regardless, just to be on the safe side.
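A hedged sketch of the quoting, assuming the example wires the output
directory into the Doxyfile via configure_file (variable and file names
here are illustrative, not necessarily the test's actual ones):

```meson
cdata = configuration_data()
# set_quoted() wraps the value in double quotes, so an output directory
# containing spaces survives both doxygen 1.8 and 1.9
cdata.set_quoted('OUTPUT_DIR', meson.current_build_dir())
configure_file(input : 'Doxyfile.in', output : 'Doxyfile',
  configuration : cdata)
```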
Fixes #11579
It can only be used for projects that don't have any rules at all, i.e.
they are purely using Meson to:
- configure files
- run (script?) tests
- install files that exist by the end of the setup stage
This can be useful e.g. for Meson itself, a pure python project.
This was added in f774609 to only change the access time of the
coredata file if the coredata struct actually changed. However,
this doesn't work as pickle serializations aren't guaranteed to
be stable. Instead, let's manually check if options have changed
values and skip the save if they haven't changed.
We also extend the associated unit test to cover all the option
types and to ensure that configure does get executed if one of the
options changes value.
This test is intended to test really long output, so it prints 100k
lines of stdout/stderr. It completes in two seconds on my machine, but
the default 30-second timeout is apparently not enough on CI: on
Windows we often get flaky failures due to this, e.g. the test gets to
within 200 lines of the end before timing out.
Bump the timeout by 2x. We know this isn't particularly surprising
behavior, and allowing the test another 30 seconds won't hang the CI.
But it will save us from some spurious failures and restarted jobs.
In commit eaf365cb3e we explicitly sorted
them for neatness, with the rationale that we were restoring intentional
behavior and we only need a set for stylistic purposes.
This actually wasn't true, because we never sorted them to begin with
(we did sort the version numbers), but sorting them is fine. The bigger
issue is that we actually used a set to avoid printing the same feature
type multiple times. Now we do print them multiple times -- because each
registered feature includes the unique node.
Fix this by using both sorted() and a set.
Also fix the tests that should, in retrospect, have flagged this as an
issue: they were added later in the same series to check something else
entirely, happen to cover this too, and were presumably copied directly
from stdout as-is...
We need to know the project minimum version before evaluating the rest
of the function. There are three basic approaches:
- try to set it inside KwargInfo
- just run a minimal version of func_project for this, then load
everything after
- drop down to the AST and set it before anything else
In order to handle FeatureNew emitted by a FunctionNode evaluated
before project() due to being inlined, such as `version: run_command()`,
only option 3 suffices; the rest all happen way too late. Since we have
just added AST handling support for erroring out, we can do that to set
the version as well.
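The problematic shape is a project() call whose arguments already need
interpretation before project() itself runs, e.g. (kwarg values are
illustrative):

```meson
project('demo', 'c',
  # run_command() is an inlined FunctionNode evaluated while project()
  # is still being set up, so the minimum version must be known first
  version : run_command('git', 'describe', '--tags',
    check : true).stdout().strip(),
  meson_version : '>= 1.1.0',
)
```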