1. They (and the others) all use PerMachineDefaultable. It's not the
best class, but consistency comes first. (It and all the others can be
improved across the board later.)
2. They use `None` as the default argument so as not to mutate what's
effectively a global variable. (Thanks @dcbaker!)
3. They have a `fallback` field to centralize authority on when
environment variables should be consulted (see the sketch after this
list).
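For illustration, a minimal sketch of the pattern the three points above
describe; the class and field names below are assumptions, not Meson's
real API:

    class PerMachineConfig:
        """Hypothetical holder following the pattern above."""
        def __init__(self, values=None, fallback=True):
            # `None` default: a dict literal in the signature would be one
            # shared (effectively global) object, mutated by every caller
            # that omits the argument.
            self.values = values if values is not None else {}
            # `fallback` centralizes the decision of whether environment
            # variables may still be consulted for these values.
            self.fallback = fallback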
First of all, I'd like compilers and other modules that environment.py
currently imports to be able to take these without creating
hard-to-follow module cycles.
Second of all, environment.py's exact purpose seems a bit obscured.
Splitting the data types (and basic pure functions) from the more
complex logic that infers that data seems like a good way to separate
concerns.
This allows the person running configure (either a developer, user, or
distro maintainer) to keep a configuration of where various kinds of
files should end up.
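For example, a native or cross file could carry this kind of
configuration in a section along the following lines (section and key
names here are only illustrative of the idea):

    [paths]
    prefix = '/usr'
    libdir = 'lib64'
    bindir = 'bin'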
Instead use coredata.compiler_options.<machine>. This brings the cross
and native code paths closer together, since both now use that.
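Roughly, consumers then do a per-machine lookup along these lines (a
sketch; attribute and key names are assumptions and may differ from the
real code):

    # Hypothetical sketch, not the exact Meson code.
    def get_c_args(coredata, for_machine):
        # Both the cross and the native code path go through the same
        # per-machine table; `for_machine` selects build vs. host.
        return coredata.compiler_options[for_machine]['c_args'].value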
Command line options are interpreted just as before, for backwards
compatibility. This does introduce some funny conditionals. In the
future, I'd like to change the interpretation of command line options so
- The logic is cross-agnostic, i.e. there are no conditions affected by
`is_cross_build()`.
- Compiler args for both the build and host machines can always be
controlled by the command line.
- Compiler args for both machines can always be controlled separately.
First, I noticed there was a dangling use of now-removed cross_info in
the CMake lookup. No tests had caught this, but it means that CMake deps
were totally broken. [It also meant that CMake could not be specified
from a native file.]
In a previous PR of mine which removed cross_info, I overhauled finding
pkg-config a bit so that the native and cross paths were shared. I
noticed that the CMake code greatly resembled the pkg-config code, so I
set about fixing it to match.
I then realized I could refactor things further, separating caching,
finding alternatives, and validating them, while also making the
validations less duplicated. So I ended up changing pkg-config lookup a
lot too (and CMake again, to keep matching).
Overall, I think I have the proper idiom for tool lookup now, repeated in
two places. I think it would make sense next to share this logic between
these two, compilers, static linkers, and any other tool similarly
specifiable. Either the `BinaryTable` class in environment.py, or a new
class for `Compiler` and friends to subclass, would be good candidates
for this.
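In pseudocode, the idiom is roughly the following (all names are
illustrative, not the real Meson helpers):

    # Hypothetical sketch of the lookup idiom described above.
    def find_tool(env, name, for_machine, validate):
        cache = env.tool_cache[for_machine]
        # 1. Caching: reuse a previously validated result.
        if name in cache:
            return cache[name]
        # 2. Alternatives: candidates from machine files, environment
        #    variables, then built-in defaults, in that order.
        for candidate in env.lookup_candidates(name, for_machine):
            # 3. Validation: shared instead of duplicated per caller.
            tool = validate(candidate)
            if tool is not None:
                cache[name] = tool
                return tool
        return None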
* Fixed spelling
* Merged the Buildoptions and Projectinfo interpreter
* Moved detect_compilers to Environment
* Added removed test case
* Split detect_compilers and moved even more code into Environment
* Moved set_default_options to coredata
* Small code simplification in mintro.run
* Move cmd_line_options back to `environment`
We don't actually wish to persist something this unstructured, so we
shouldn't make it a field on `coredata`. It would also be data
denormalization since the information we already store in coredata
depends on the CLI args.
On OpenBSD, the main PowerPC machine is known as macppc, while the processor
family is powerpc:
$ uname -{m,p}
macppc powerpc
$ echo 'import platform; print (platform.machine())' | python3
macppc
$ echo 'import platform; print (platform.processor())' | python3
powerpc
This allows e.g. GLib's configure step to properly detect that CPU:
Build machine cpu family: ppc
Build machine cpu: macppc
While it failed before with: WARNING: Unknown CPU family 'macppc' <snip>
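A simplified sketch of the resulting detection (the real logic in
environment.py covers many more families):

    import platform

    def detect_cpu_family():
        trial = platform.machine().lower()       # 'macppc' on OpenBSD/PowerPC
        if trial == 'macppc':
            # The processor family, not the machine name, is what we want.
            trial = platform.processor().lower()  # 'powerpc'
        if trial in ('powerpc', 'ppc'):
            return 'ppc'
        return trial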
Building a cross compiler (`build == host != target`) is not cross
compiling. As such, it doesn't make sense to handle it under
`is_cross_build`.
(N.B. Building a standard library for a cross compiler would require
cross compiling, but Meson has support to do such a thing as part of a
compiler build currently.)
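A sketch of the distinction, where build/host/target stand for the
respective machine descriptions (the real is_cross_build() reads them
from the environment rather than taking arguments):

    def is_cross_build(build_machine, host_machine, target_machine):
        # Only build != host is cross compiling; build == host != target
        # just means we are *building* a cross compiler.
        return build_machine != host_machine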
We can't just do compiler.has_builtin_define('_M_IX86'), because the
VisualStudioCCompiler class doesn't implement has_builtin_define(), and
getting the compiler to disgorge its builtin defines isn't easy...
But we can now use the target we stored when we identified the compiler.
Also update comment appropriately
Store the MSVC compiler target architecture ('x86', 'x64' or 'ARM' (this
is ARM64, I believe)), rather than just whether or not it is x64.
The regex used for the target architecture should be ok, based on this
list [1] of version outputs, but we assume x86 if there is no match, for
safety's sake.
[1] https://stackoverflow.com/a/1233332/1951600
Also detect arch even if cl outputs version to stdout.
Ditto for clang-cl
Future work: is_64 is now only used in get_instruction_set_args()
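The shape of the detection is roughly as follows (the regex and default
here are illustrative, not the exact pattern Meson uses):

    import re

    def detect_cl_target(version_banner: str) -> str:
        # cl.exe / clang-cl banners end in something like
        # '... Compiler Version 19.16.27023 for x64'.
        m = re.search(r'\bfor\s+(x86|x64|ARM64|ARM)\b', version_banner)
        # Assume x86 if nothing matches, for safety's sake.
        return m.group(1) if m else 'x86'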
Starting with VS 2017, `Platform` is not always set (f.ex., if you use
VsDevCmd.bat directly instead of vcvars*.bat), but `VSCMD_ARG_HOST_ARCH`
is always set, so try that first.
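A sketch of that lookup order (further fallbacks elided):

    import os

    arch = os.environ.get('VSCMD_ARG_HOST_ARCH') or os.environ.get('Platform')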
samu prints a different message when the build is a no-op, so make
assertBuildIsNoop consider that as well.
Also, if compile_commands.json cannot be found, just skip the test. This
seems reasonable since meson just produces a warning if `ninja -t compdb`
fails.
Finally, only capture stdout in run_meson_command_tests.py, since the
backend may print messages the tests don't recognize to stderr.
Fixes #3405.
Write command line options into a separate file to be able to
reconfigure from scratch in case coredata cannot be loaded. The most
common case is when we are reconfiguring with a newer meson version.
This means that we should try as much as possible to maintain backward
compatibility for the cmd_line.txt file format.
The main difference from a normal reconfigure is that it will use new
default option values and will re-read environment variables such as
CFLAGS.
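A sketch of the idea (section and key names here are assumptions, not
necessarily the real cmd_line.txt layout):

    import configparser
    import os

    def write_cmd_line_file(build_dir, options):
        # Keep the user's options in a small, stable text file so a newer
        # Meson can reconfigure from scratch even when coredata fails to
        # unpickle.
        cp = configparser.ConfigParser()
        cp['options'] = {k: str(v) for k, v in options.items()}
        path = os.path.join(build_dir, 'meson-private', 'cmd_line.txt')
        with open(path, 'w') as f:
            cp.write(f)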
Handle clang's cl or clang-cl being in PATH, or set in CC/CXX
Future work: checking the name of the executable here seems like a bad idea.
These compilers will fail to be detected if they are renamed.
v2:
Update compiler.get_argument_type() test
Fix comparisons of id inside CCompiler, backends and elsewhere
v3:
ClangClCPPCompiler should be a subclass of ClangClCCompiler, as well
Future work: mocking in test_find_library_patterns() is affected, as we
now test for a subclass, rather than self.id in CCompiler.get_library_naming()