When performing isinstance checks, subtype instances are accepted
automatically, but we don't use isinstance here because we need
strict type equality *without allowing subtypes*.
Comparing type() == type() is a value comparison that usually produces
the same results as an identity comparison, despite being semantically
off. pycodestyle learned to detect this and warns you to use a strict
identity comparison instead.
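A minimal illustration of the distinction (class names are hypothetical):
```
class Base: pass
class Sub(Base): pass

x, y = Sub(), Sub()

isinstance(x, Base)   # True: isinstance accepts subtypes
type(x) == type(y)    # True, but a value comparison; pycodestyle flags it (E721)
type(x) is type(y)    # True: the strict identity comparison we want
type(x) is Base       # False: subtypes are not allowed
```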
We allow custom_target(), but have checked for it via hasattr ever
since the initial implementation way back in commit
66a6ea984b. This is a bit broken, because
various objects might have that attribute while still not being
supposed to work. We can instead use isinstance checks, like the ones
immediately above; these are more accurate and avoid crashes on things
that aren't targets at all, such as run_target(). Although
custom_target indices are actually targets, those didn't work either.
Fixes #9648
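A sketch of the change (the checked attribute is a placeholder; the
class names are the ones Meson's build module uses for targets):
```
# Before: duck typing; anything with the attribute slips through
if hasattr(t, 'some_attribute'):  # placeholder for the real attribute
    ...

# After: explicit, and covers custom_target indices too
if isinstance(t, (build.BuildTarget, build.CustomTarget,
                  build.CustomTargetIndex)):
    ...
```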
This fixes two issues in constructing the default installation path
when install_dir is not specified:
- inside a subproject, install_data() would construct the destination
path using the parent project name instead of the current project
name;
- when preserve_path is specified, install_data() would construct the
destination path omitting the project name.
Fixes #11910.
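A sketch of the corrected computation (the function and parameter names
are illustrative, not Meson internals):
```
import os

def default_data_install_dir(datadir: str, current_project: str) -> str:
    # Base the destination on the *current* (sub)project's name; the
    # old code used the parent project's name, and dropped the name
    # entirely when preserve_path was given.
    return os.path.join(datadir, current_project)
```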
The function name adheres to the pattern used by many other Meson DSL
implementation methods, so a docstring stating that this is the
implementation of the functionality without argument validation adds
little. The docstring was also separated from the function declaration
by a blank line; removing it avoids having to decide whether that blank
line should be there.
The only real differences between these generators are the file
extension and the templates themselves. We can use a shared abstract
class with a few abstract properties to provide all of this through a
single base class (see the sketch after this list). This results in
less code duplication and easier maintenance.
I've made a few cleanups to the shared template:
- use `str.capitalize()` instead of `str.upper()[0] + str[1:]`
- use `open` as a context manager
- use f-strings
- put some duplicate calculations in the initializer
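A sketch of the shape this takes (names are illustrative):
```
import abc

class SampleImpl(metaclass=abc.ABCMeta):
    def __init__(self, name: str) -> None:
        # duplicated calculations now done once, in the initializer
        self.name = name
        self.capitalized = name.capitalize()

    @property
    @abc.abstractmethod
    def source_ext(self) -> str: ...

    @property
    @abc.abstractmethod
    def exe_template(self) -> str: ...

    def create_executable(self) -> None:
        source_name = f'{self.name}.{self.source_ext}'
        with open(source_name, 'w') as f:  # open as a context manager
            f.write(self.exe_template.format(project_name=self.name))
```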
Since it's deprecated anyway, we don't really want to plumb it all the
way down into the build and backend layers. Instead, we can just turn
it into a `win_subsystem` value in the interpreter if `win_subsystem`
isn't already set.
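A sketch of the translation, assuming the deprecated kwarg in question
is `gui_app` (the error message is hypothetical):
```
# In the interpreter, before kwargs are passed down:
if kwargs.get('gui_app') is not None:
    if kwargs.get('win_subsystem') is not None:
        raise InterpreterException('Can only set one of gui_app and win_subsystem')
    kwargs['win_subsystem'] = 'windows' if kwargs['gui_app'] else 'console'
    del kwargs['gui_app']
```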
When the pkgconfig module is imported but not used, it will append None
to the devenv list. This list is not expected to contain None, which
causes Meson to crash. This can happen in complex build setups (reported
from mesa) where pkgconfig is only used in some configurations.
Fixes: #12032
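A sketch of the guard (the accessor name is hypothetical):
```
devenv = module.get_devenv()  # None when the module was never used
if devenv is not None:
    self.devenv.append(devenv)
```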
We need to verify that we don't produce multiple rules for the same
file. This uniqueness check is easy to do with a set, but the original
implementation used a dict with True as the value. That isn't a
terrible way to implement a set -- we do it for our own internal
OrderedSet implementation, even, and back in prehistory (Python 2.3) the
standard library's set type was implemented the same way. But that was
deleted and replaced with a fast native implementation, and we should
do the same.
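The change in miniature (the exception message is illustrative):
```
# Before: a dict standing in for a set
seen = {}
if fname in seen:
    raise MesonException(f'Multiple rules generate {fname}')
seen[fname] = True

# After: an actual set
seen = set()
if fname in seen:
    raise MesonException(f'Multiple rules generate {fname}')
seen.add(fname)
```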
I originally refactored this wrapper function in commit
dd2f1c4c57 and at the time I made it take
the following call pattern:
```
self.create_phony_target(self.all_outputs, ..., implicit_outs=None)
```
I am not sure *why*. There are a couple of problems here:
- When creating a phony target, there are never implicit outs, and we
never try to create them.
- It's invalid to call it without self.all_outputs as the first
argument, so really we should just use that directly.
This made the function signature pointlessly longer and more
complicated.
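Roughly, the simplification (other parameters elided and names
illustrative):
```
# Before (sketch):
def create_phony_target(self, all_outputs, dummy_outfile, rulename,
                        phony_infilename, implicit_outs=None):
    ...

self.create_phony_target(self.all_outputs, 'some-alias', 'phony',
                         'some-input', implicit_outs=None)

# After (sketch): implicit_outs is gone and all_outputs is implied
def create_phony_target(self, dummy_outfile, rulename, phony_infilename):
    ...

self.create_phony_target('some-alias', 'phony', 'some-input')
```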
We always expect the args parameter in the wrapped function to
eventually receive a tuple. But in one specific optargs condition we
passed it along without any fixup at all.
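A sketch of the missing normalization in the wrapper:
```
# Make sure the wrapped function always sees a tuple, including on
# the optargs path that previously forwarded args untouched:
if not isinstance(args, tuple):
    args = tuple(args)
```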
We know exactly what type we need, since the interpreter function is
correctly typed and thinly wraps over this. But we didn't even get the
container type correct. :(
This would be either the value of `kwargs['install']` or `False`. There
isn't any case that `BuildTarget.need_install` handles which
`BuildTarget.install` doesn't handle, if we just initialize it
correctly. So, just set Target.install correctly in the super
initializer, and do away with need_install.
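Schematically (constructor details elided):
```
class Target:
    def __init__(self, name: str, install: bool = False) -> None:
        self.name = name
        # Set once here; no separate need_install flag to keep in sync.
        self.install = install

class BuildTarget(Target):
    def __init__(self, name: str, kwargs: dict) -> None:
        super().__init__(name, install=kwargs.get('install', False))
```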
When running setup with the `--profile-self` option, there are
currently no logs after "Found ninja...". However, there are still some
lengthy processes for generating targets and ninja.build. This adds more
log entries, emitted only when profiling, for the sole purpose of
displaying the timestamps of the different steps in ninja generation.
Jar has very little overlap with other target types: jar sources *must*
be .java, and no other target allows .java sources. As such, crafting a
useful `build_target` invocation that allows both `jar` and anything
else is difficult, and of dubious usefulness. Just use `jar()` directly
instead.
This depends on the changes to make all of the jar()-specific keyword
arguments be handled by typed_kwargs, so that the deprecation messages
are correct and consistent.
Also move it into the Jar class. This is an exclusively Jar keyword
argument and is only used in java paths, so there's no reason to have
it in all build targets.
This moves them to a list of shared objects inside the type_checking
module. This is based on my experience of fully typing all of these
functions several times, and will make it easier to slowly land all of
the changes we want to make.
We already import a bunch of objects directly from ..build, but don't
use them nearly as much as we could. This resulted both in longer lines
and a minor performance difference, since Python has to resolve the
name binding the long way. There's no reason not to rewrite these names
to use the direct imports.
Found while investigating the fact that Executable was imported but
never used. It's easier to just use it.
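The pattern, in short:
```
# Before: the attribute is resolved through the module on every use
from .. import build
exe = build.Executable(...)

# After: the name is bound once, at import time
from ..build import Executable
exe = Executable(...)
```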
This detects cases where module A imports a function from B, and C then
imports that same function from A instead of B. The re-export is not
part of A's API contract, so relying on it causes innocent refactoring
to break things.
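For example:
```
# b.py
def helper() -> None: ...

# a.py
from b import helper   # a uses helper internally

# c.py
from a import helper   # flagged: helper is not part of a's API
from b import helper   # fine: imported from where it is defined
```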
A linker mixin has to be able to align with the base linker it will be
used with, in order to reference super(). Since the mixins didn't
inherit from anything, calls to super() resulted in mypy errors, which
we silenced with ignores and casts. Use the same trick we use for
compilers, and make the mixin inherit from the base linker type when
running under mypy, and from object at runtime.
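The trick looks roughly like this (the mixin name and the added flag are
illustrative):
```
import typing as T

if T.TYPE_CHECKING:
    from .linkers import DynamicLinker
    _Base = DynamicLinker   # mypy type-checks super() against the real base
else:
    _Base = object          # no runtime change to the MRO

class ExampleLinkerMixin(_Base):
    def get_allow_undefined_args(self) -> T.List[str]:
        # super() now resolves for mypy, so no "type: ignore" or cast
        return super().get_allow_undefined_args() + ['--illustrative-flag']
```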