This saves on a 1500-line import at startup and may be skipped entirely
if no compiled languages are used. In exchange, we move the
implementation to a new file that is imported instead.
Followup to commit ab20eb5bbc.
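A minimal sketch of the pattern being described, with a placeholder module
name rather than the real one: the heavy module is imported only at the
point of first use, so projects that never reach that code path never pay
the import cost.

```python
def compile_with_heavy_tool(sources):
    # Deferred import: the large module is loaded on first call, and never
    # loaded at all if no compiled languages are used.
    import heavy_impl  # hypothetical stand-in for the split-out module
    return heavy_impl.compile(sources)
```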
Linkers that aren't actually ar were refactored into a base class in
commit 253ff71e6b; that refactoring, however, didn't
take into account that we were doing checks for this.
I noticed when building a project that uses a proc macro that Meson
passed -C prefer-dynamic for the executable, and not the proc macro,
while cargo passed -C prefer-dynamic for the proc macro, but not for
the executable. Meson's behavior broke setting -C panic=abort on the
executable.
As far as we can tell, because we explicitly pass each library path to
rustc, the only thing -C prefer-dynamic affects in Meson is how the
standard libraries are linked. Generally, one does not want the
standard libraries to be dynamically linked, because if the Rust
compiler is ever updated, anything linked against the old standard
libraries will likely break, due to the lack of a stable Rust ABI.
Therefore, I've reorganised Meson's behavior around the principle that
the standard libraries should only be dynamically linked when Rust
dynamic linking has already been opted into in some other way. The
details of how this manifests are now explained in the documentation.
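As a rough sketch of that principle (the helper and crate-type check here
are illustrative assumptions, not Meson's actual internals), `-C
prefer-dynamic` would only be emitted for targets that already imply
dynamically linked Rust code:

```python
def rust_std_link_args(crate_type: str) -> list:
    # Sketch only: proc-macro and dylib crates already require dynamic
    # linking of Rust code, so a dynamic libstd is consistent there.
    if crate_type in {'proc-macro', 'dylib'}:
        return ['-C', 'prefer-dynamic']
    # Executables and rlibs keep the statically linked standard library,
    # so a compiler update cannot break them via the unstable Rust ABI.
    return []
```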
We are going to truncate it down below with int() anyway, no need to
carry extra precision. And mypy complains if the type changes between
float and int.
Specifically, when those files can be created by a build rule with one
version of meson.build, and created as e.g. a shared_library alias
symlink in another version of meson.build, the cleandead command will
delete important files just because they don't happen to be created by
ninja itself.
Work around this by making a dummy rule that exists solely to insert the
files into the build graph to trick ninja into not deleting them.
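A minimal sketch of the kind of dummy edge this refers to (the writer
function and naming are illustrative, not Meson's actual code): a phony
build statement lists the files as known parts of the graph without ever
running a command for them.

```python
def write_keepalive_rule(ninja_file, kept_files):
    # Sketch only: a phony edge with the files as outputs and no command;
    # `ninja -t cleandead` then no longer treats them as stale outputs.
    ninja_file.write('build {}: phony\n'.format(' '.join(kept_files)))
```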
Closes #11861
ninja's configured command for regenerating a build directory on any
action that *requires* reconfiguring specifies the source and build
directories as they were known during initial project generation. This
means that if the build directory is no longer the *same* build
directory, we will regenerate... the original location, rather than the
location we want.
After that, ninja notices that build.ninja is still out of date, so it
goes and reconfigures again. And again. And again.
This is probably well-intentioned, but endless reconfigure loops are a
kind of evil beyond all evils. There are no valid options here
whatsoever other than:
- doing what the user actually meant
- spawning a clear error message describing why meson refuses to work,
then exiting with a fatal error
But it turns out that it's actually pretty easy to do what the user
actually meant, and reconfigure the current build directory instead of
the original one. This permanently breaks the link between the two.
Fixes #6131
This allows changing the crate name with which a library ends up being
available inside the Rust code, similar to cargo's dependency renaming
feature or `extern crate foo as bar` inside Rust code.
This will help with the writing of tools to generate
VisualStudio project and solution files, and possibly
for other IDEs as well.
- Used compilers for the `host`, `build` and `target` machines are
  listed in `intro-compilers.json`
- Information about the `host`, `build` and `target` machines is
  listed in `intro-machines.json`
- `intro-dependencies.json` now includes internal dependencies,
and relations between dependencies.
- `intro-targets.json` now includes dependencies, `vs_module_defs`,
`win_subsystem`, and linker parameters.
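For illustration, a short example of consuming one of these files. It
assumes an already-configured build directory named `builddir` and only
reads the `name` and `type` keys of each target; the rest of the JSON
layout is not shown here.

```python
import json
from pathlib import Path

builddir = Path('builddir')  # assumed: an already-configured build dir

# The introspection files are written under <builddir>/meson-info/.
targets = json.loads((builddir / 'meson-info' / 'intro-targets.json').read_text())
for target in targets:
    print(target['name'], target['type'])
```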
We need to remember its value when reconfiguring, but the Build object
is not reused, only coredata is.
This also makes the CLI more consistent by allowing the `-Dvsenv=true` syntax.
Fixes: #11309
The proc-macro code was not running at all because of a missing dash in
the crate type, and the proc macro dylib path was not generated as a
bare path, but included the `-o ` command-line parameter prefix.
As meson requires source_dir!=build_dir and stores the rust-project.json
inside the build directory, while software like rust-analyzer expects it
at the root of the source directory, manual steps are needed for making
them work together.
One option, as described in the documentation, is per project
configuration. Another option, that works correctly with
compile-commands.json and clangd, is to store a symlink to the file in
the build directory at the root of the source directory.
As currently rust-project.json stores paths relative to the location of
the file itself and rust-analyzer does not resolve symlinks, this does
not work.
To solve this, store absolute paths in rust-project.json as is already
done in compile_commands.json for the directory.
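As a hedged illustration of the symlink option mentioned above (the paths
are placeholders for a real project layout):

```python
import os

source_dir = '/path/to/project'                    # placeholder source root
build_dir = os.path.join(source_dir, 'builddir')   # placeholder build dir

link = os.path.join(source_dir, 'rust-project.json')
target = os.path.join(build_dir, 'rust-project.json')
# Point a symlink at the file Meson writes into the build directory; with
# absolute paths inside the file, rust-analyzer can follow it from the
# source root.
if not os.path.lexists(link):
    os.symlink(target, link)
```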
Rustc as of version 1.61.0 has support for controlling when
whole-archive linking takes place; before this, it tried to make a
good guess about what you wanted, which worked most of the time. This is
now implemented.
Additionally, rustc makes some assumptions about library names
(specifically static names) that meson does not keep. This can be fixed
with rustc 1.67, where a new +verbatim modifier has been added. We can
then force rustc to use the name we give it. Before that, we can sneak
through `/WHOLEARCHIVE:` in cases of dynamic linking (into a dll or
exe), but we can't force the archiver to do what we want (rustc
considers the archiver to be an implementation detail). The only
solution I can come up with is to copy the library to the format that
rustc expects. I've run into some issues with that as well, so we warn
in that case.
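A rough sketch of the `-l` modifier syntax being relied on here; the
helper and its name handling are illustrative assumptions, not Meson's
actual code.

```python
def static_link_arg(libname: str, rustc_version: tuple) -> str:
    # Sketch only: request whole-archive linking explicitly (supported
    # since rustc 1.61) and, on new enough compilers, pass the file name
    # verbatim so rustc does not rewrite it to its own naming convention.
    modifiers = ['+whole-archive']
    if rustc_version >= (1, 67):
        modifiers.append('+verbatim')
        libname += '.lib'  # assumed MSVC-style file name for the example
    return '-lstatic:{}={}'.format(','.join(modifiers), libname)
```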
The decision to leave static-into-static linking broken on MSVC for 1.61–1.66
was made because:
1) The work around is non-trivial, and we would have to support that
workaround for a long time
2) The number of users of Rust in Meson is small
3) The number of users of Rust in Meson on Windows, with MSVC is tiny
4) Using rustup to update rustc on Windows is trivial, and solves the
problem completely
Fixes: #10723
Fixes: #11247
Co-authored-by: Nirbheek Chauhan <nirbheek@centricular.com>
This solves rebuild issues when e.g. importing a .pxd header from a .pyx
file, just like C/C++ source headers. The transpiler needs to run again
in this case.
This functionality is present in the 3.0.0 alphas of cython, and is also
backported to 0.29.33.
Fixes #9049
We want to use as much default ninja behavior as we can, so reuse $out
instead of repeating the output file as a string in $ARGS, and raise
that into the build rule so it is only listed once.
Given the construct `foo = (bar == baz)` some people like parentheses
and some do not. They're pointless and don't mean anything, though. I
don't feel this is particularly helpful to code clarity, tbh, and pylint
now notices this and warns about it in our current pylint config.
I think this is reasonable, so let's remove the odd parens.
The code below this already handles being passed an Executable or
ExternalProgram, and it does it correctly, since it handles host
binaries that need an exe_wrapper correctly, while the code in the
generator paths doesn't.
The xcode backend is, as always, problematic; it doesn't handle things
the same way as the ninja and vs backends, and generates a shell
script instead of using meson as a wrapper when needed (it seems likely
that just forcing the meson path for xcode would be better). I don't
have a working mac to develop a fix for, so I've left a todo comment
there.
Fixes: #11264
It is possible, albeit possibly inadvisable, for the exact combination
of MSVC and "$CXX has C++ specific flags in it" to occur. When this
happens, and cl.exe is given a filename ending in .c, it complains that
you cannot compile a .c file with that option.
Instead, pick the first filename matching that language and use that as
the temporary filename. This more or less matches what we do in
compiler-time checks. And it's the proper thing to do, rather than
assume that cl.exe, when detected as the current C++ compiler, can
*also* compile C because it's *also* a C compiler.
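A small sketch of that selection; the attribute and file names are
assumptions for illustration, not the actual Meson code.

```python
def temp_source_name(compiler) -> str:
    # Sketch only: take the first suffix the compiler claims for its own
    # language instead of hard-coding '.c', so a C++-only cl.exe run is
    # never handed a .c file.
    suffix = compiler.file_suffixes[0]  # e.g. 'cpp' for a C++ compiler
    return 'testfile.' + suffix
```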
Fixes #11257
When auto-generating e.g. a `clang-format` target, we first check to see
if the user has already defined one, and if so we don't bother creating
our own. We check for two things:
- if a ninja target already exists, skip
- if a run_target was defined, skip
The second check is *obviously* a duplicate of the first check. But the
first check never actually worked, because all_outputs was only
generated *after* generating all utility rules and actually writing out
the build.ninja file. The check itself compares against nothing, and
always evaluates to false no matter what.
Fix this by reordering the target creation logic so we track outputs
immediately, but only error about them later. Now, we no longer need to
special-case run_target at all, so we can drop that whole logic from
build.py and interpreter.py, and simplify the tracked state.
Fixes defining an `alias_target()` for a utility, which tried to
auto-generate another rule and errored out. Also fixes doing the same
thing with a `custom_target()` although I cannot imagine why anyone
would want to produce an output file named `clang-format` (unless clang
itself decided to migrate to Meson, which would be cool but feels
unlikely).
Which adds the `use-set-for-membership` check. It's generally faster in
python to use a set with the `in` keyword, because it's a hash check
instead of a linear walk. This is especially true with strings, where
the walk is actually O(n^2): one loop over the container, and an inner
loop over the strings (as string comparison works by checking that
`a[n] == b[n]` in a loop).
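A toy comparison of the two membership tests the check is about (the
keyword list is just example data):

```python
KEYWORDS_LIST = ['if', 'elif', 'else', 'foreach', 'endforeach']
KEYWORDS_SET = {'if', 'elif', 'else', 'foreach', 'endforeach'}

def is_keyword_slow(word: str) -> bool:
    return word in KEYWORDS_LIST  # linear walk, one string compare per element

def is_keyword_fast(word: str) -> bool:
    return word in KEYWORDS_SET   # a single hash lookup
```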
Also, I'm tired of complaining about this in reviews, let the tools do
it for me :)
This finds uses of deny-listed functions; the deny list defaults to map
and filter. These functions should be replaced by comprehensions in
idiomatic python because:
1. comprehensions are more heavily optimized and are often faster
2. They avoid the need for lambdas in some cases, which makes them
faster
3. you can do the equivalent in one statement rather than two, which
is faster
4. They're easier to read
5. if you need a concrete instance (i.e., a list) then you don't have
to convert the iterator to a list afterwards
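A small before/after of the replacement the check suggests (the file
names are just example data):

```python
names = ['foo.c', 'bar.h', 'baz.c']

# map/filter: two lambdas plus a list() call to get a concrete list
c_upper = list(map(lambda n: n.upper(), filter(lambda n: n.endswith('.c'), names)))

# comprehension: one statement that filters and transforms in a single pass
c_upper = [n.upper() for n in names if n.endswith('.c')]
```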
This introduces a new type of BuildTarget: CompileTarget. From the ninja
backend's POV it is the same thing as any other build target, except that
it skips the final link step. It could be used in the future for
transpilers too.