Performed using https://github.com/ilevkivskyi/com2ann
This has no actual effect on the codebase, as type checkers (still)
support both, and negligible effect on runtime performance, since
`from __future__ import annotations` ameliorates that. Technically, the
bytecode would be bigger for non-function-local annotations, of which we
have many either way.
So if it doesn't really matter, why do a large-scale refactor? Simple:
because people keep wanting to, but it's getting nickel-and-dimed. If
we're going to do this, we might as well do it consistently in one shot,
using tooling that guarantees repeatability and correctness.
Repeat with:
```
com2ann mesonbuild/
```
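For reference, the conversion turns PEP 484 type comments into real
annotations, roughly like this (an illustrative before/after, not a diff
from this repository):
```
# Before: type comments
def add(a, b):
    # type: (int, int) -> int
    return a + b

flags = []  # type: list[str]

# After: real annotations (with `from __future__ import annotations`
# at the top of the module, so they are not evaluated at runtime)
def add(a: int, b: int) -> int:
    return a + b

flags: list[str] = []
```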
It's shorter and more descriptive. Although we always enforce the same
rules either way, a unified decorator is one less line of code at each
location, and its error message also tells you how many arguments you
*did* pass when too few are given.
Which adds the `use-set-for-membership` check. In Python it's generally
faster to test membership against a set with the `in` keyword, because
that's a hash lookup instead of a linear walk. This is especially true
with strings, where the walk is effectively O(n^2): one loop over the
container and an inner loop over each string (since string comparison
works by checking `a[n] == b[n]` in a loop).
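For example (an illustrative sketch, not code from this repository):
```
value = 'cpp'

# linear walk: each element is compared in turn, string by string
found = value in ('c', 'cpp', 'objc')

# hash lookup: what use-set-for-membership asks for
found = value in {'c', 'cpp', 'objc'}
```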
Also, I'm tired of complaining about this in reviews, let the tools do
it for me :)
This finds uses of deny-listed functions; the deny list defaults to
`map` and `filter`. In idiomatic Python these should be replaced by
comprehensions (see the sketch after this list) because:
1. comprehensions are more heavily optimized and are often faster
2. they avoid the need for lambdas in some cases, which makes them
   faster
3. you can do the equivalent in one statement rather than two, which
   is faster
4. they're easier to read
5. if you need a concrete instance (i.e. a list) then you don't have
   to convert the iterator to a list afterwards
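A hypothetical illustration of the kind of rewrite this check encourages:
```
names = ['ninja', 'meson', 'cmake']

# deny-listed: map/filter, two lambdas, and a list() call to get a list
upper = list(map(lambda s: s.upper(), filter(lambda s: s != 'cmake', names)))

# idiomatic: one comprehension, no lambdas, already a concrete list
upper = [s.upper() for s in names if s != 'cmake']
```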
Add a MissingCompiler class, returned by the compiler detection methods
instead of None; accessing such an object raises a DependencyException.
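A minimal sketch of the null-object idea (simplified and with
hypothetical details, not the actual class):
```
class DependencyException(Exception):
    """Stand-in for Meson's real exception type."""

class MissingCompiler:
    """Null object returned when no compiler could be detected.

    Unlike None it can be stored and passed around freely, but any
    attempt to actually use it raises a clear error instead of an
    opaque AttributeError.
    """

    def __getattr__(self, name):
        raise DependencyException('no toolchain found')

    def __bool__(self):
        # stays falsy, so existing `if compiler:` checks keep working
        return False
```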
Fixes #10586
Co-authored-by: duckflyer <duckflyer@gmail.com>
flake8 6 upgrades to pyflakes 3, which in turn removes support for
parsing `# type:` style annotations.
https://github.com/PyCQA/pyflakes/pull/684
This caused one file to fail linting, because it had a typing import
which was only used by a type comment.
```
mesonbuild/cmake/interpreter.py:55:5: F401 '.common.CMakeConfiguration' imported but unused
```
Updating it to actual annotations allows pyflakes to detect its usage
again, and flake8 passes. Do the whole file while we are here.
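The failure mode, roughly (a hypothetical file with the same shape as
the real one):
```
import typing as T

if T.TYPE_CHECKING:
    from .common import CMakeConfiguration

# Before: only referenced from a type comment, which pyflakes 3 no
# longer parses, so the import above is reported as unused (F401)
def configure(conf):
    # type: (CMakeConfiguration) -> None
    pass

# After: referenced from a real (quoted) annotation, so pyflakes sees
# the usage again
def configure(conf: 'CMakeConfiguration') -> None:
    pass
```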
This catches some optimization problems, mostly in the use of `all()`
and `any()`. Basically, writing `any([x == 5 for x in f])` instead of
`any(x == 5 for x in f)` reduces performance because the entire concrete
list must first be created and then iterated over, while in the second
form `f` is iterated and checked element by element.
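For instance (illustrative only):
```
f = range(1_000_000)

# builds a million-element list first, then scans it
found = any([x == 5 for x in f])

# lazily checks one element at a time and stops at the first match
found = any(x == 5 for x in f)
```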
In various situations we want to figure out what type of compiler we
have, because we want to know stuff like "is it the pgi one", or "does
it use msvc style". The compiler object has this property already, via
an API specifically designed to communicate this info, but instead we
performed isinstance checks on a compiler class.
This is confusing and indirect, and has the side effect of requiring
more imports everywhere. We should do away with it.
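A sketch of the difference, assuming the compiler object exposes
something like `get_id()` and `get_argument_syntax()`:
```
def is_pgi(compiler) -> bool:
    # instead of importing the concrete PGI compiler classes and doing
    # isinstance() checks, ask the object through its own API
    return compiler.get_id() == 'pgi'

def uses_msvc_syntax(compiler) -> bool:
    return compiler.get_argument_syntax() == 'msvc'
```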
Sometimes, the machine file can include compiler command line options,
in order to pick the correct multilib. For example, Meson uses "$cc
--print-search-dirs" to find the library search path, where $cc is the
cc from the machine file. Because the outputs of "gcc -m32
--print-search-dirs" and "gcc --print-search-dirs" are different, this
only works if you have
```
[binaries]
cc = ['gcc', '-m32']
```
in the machine file. Right now, however, the cmake module assumes that
the compiler listed in the machine file is either a compiler, or a
"launcher" followed by the compiler. Check if the second argument
starts with a slash (for Microsoft-like compilers) or a dash (for
everyone else), and if so presume that it is a compiler argument rather
than a launched compiler, so that CMAKE_*_COMPILER_LAUNCHER need not be
defined.
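A sketch of that heuristic (hypothetical helper, not the actual cmake
module code):
```
def split_launcher(cmd):
    """cmd is the compiler entry from the machine file,
    e.g. ['gcc', '-m32'] or ['ccache', 'gcc']."""
    if len(cmd) > 1 and not cmd[1].startswith(('-', '/')):
        # the second element looks like a program, not a flag:
        # treat the first element as a launcher (e.g. ccache)
        return cmd[0], cmd[1:]
    # the second element starts with '-' or '/': it is a compiler
    # argument, so there is no launcher to define
    return None, cmd
```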
This was imported both inside and outside of T.TYPE_CHECKING; the
runtime import can be removed by quote-deferring one of the use sites.
Update:
In between then and now, this got removed from T.TYPE_CHECKING, move it
back there rather than preserving the runtime import.
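The general pattern, with hypothetical names:
```
import typing as T

if T.TYPE_CHECKING:
    from .build import Target  # only the type checker needs this

def process(target: 'Target') -> None:
    # the quoted annotation is never evaluated at runtime, so no
    # runtime import of Target is required
    ...
```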
I ran into one of these from LGTM, and it would be nice if pylint could
warn me as part of my local development process instead of waiting for
the CI to tell me.
We have a lot of these. Some of them are harmless, if unidiomatic, such
as `if (condition)`; others, like `assert(...)`, are potentially
dangerous: `assert(condition)` works as expected, but
`assert(condition, message)` results in an assertion that never
triggers, because what you're actually asserting is the truthiness of a
two-element tuple, which is always true.
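Concretely:
```
x = 1  # try changing this to 0: only the real assertions below fail

# fine: asserts the condition
assert x == 1

# broken: the parentheses make this a two-element tuple, which is
# always truthy, so this assertion can never fire
assert (x == 1, 'x should be one')

# correct way to attach a message
assert x == 1, 'x should be one'
```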
* backends: Add a Visual Studio 2013 backend
This is more-or-less a quick port from the VS2015 backend, except that
we update the Visual Studio version strings and toolset versions
accordingly. Also correct the generator string for Visual Studio 2015
in mesonbuild/cmake/common.py.
* backend: Add VS2012 backend
Similar to what we did for Visual Studio 2013, add a Visual Studio 2012
backend.
* vs2010backend.py: Implement `link_whole:` if needed
We actually need Visual Studio 2015 Update 2 to use `/WHOLEARCHIVE:`,
which is what we are currently using for `link_whole:` on Visual Studio.
For Visual Studio versions before that, we need to expand from the
static targets that were indicated by `link_whole:`, and any of the
sub-dependent targets that were pulled in via the dependent target's
`link_whole:`. This will ensure `link_whole:` actually works in such
cases.
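A rough sketch of the expansion idea for the older toolsets
(hypothetical helper and attribute names, not the actual backend code):
```
def expand_link_whole(target, seen=None):
    """Collect the static libraries /WHOLEARCHIVE: would have pulled
    in, walking through the dependencies' own link_whole lists."""
    if seen is None:
        seen = set()
    libs = []
    for lib in target.link_whole_targets:  # hypothetical attribute
        if lib in seen:
            continue
        seen.add(lib)
        libs.append(lib)
        # also pick up anything the dependency itself links whole
        libs.extend(expand_link_whole(lib, seen))
    return libs
```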
* vs2010backend.py: Handle objects from generated sources
Unfortunately, we can't use backends.determine_ext_objs() reliably, as
the Visual Studio backends handle this differently.
* vs2010backend.py: Fix generating VS2010 projects
Visual Studio 2010 (at least the Express Edition) does not set the envvar
%VisualStudioVersion% in its command prompt, so fix generating VS2010
projects by taking this into account, so that we can determine the location
of vcvarsall.bat correctly.
* whole archive test: Disable on vs2012/2013 backends too
The Visual Studio 2012/2013 IDE has problems handling the items that would be
generated from this test case, so skip this test when using
--backend=vs[2012|2013]. This test does work for the Ninja backend when
VS2012 or VS2013 is used, though.
Consolidate this error message with the Xcode and vs2010 backends.
* docs: Add the new vs2012 and vs2013 backends
Let people know that we have backends for vs2012 and 2013. Also let
people know that generating Visual Studio 2010 projects has been fixed
and that the pre-vs2015 backends now handle the `link_whole:` project
option.