Switch from build.compiler to environment.coredata.compiler, and likewise
from build.cross_compiler to environment.coredata.cross_compiler, in the
backends. (These only seem to actually be used in the Ninja backend.)
Currently C++ inherits C, which can lead to diamond problems. By pulling
the shared code out into a standalone mixin class that the C, C++, ObjC, and
ObjC++ compilers can inherit and override as necessary, we remove one
source of diamonding. I've chosen to split this out into its own file,
as the CLikeCompiler class is over 1000 lines by itself. This also
stops the VisualStudio-derived classes from inheriting from each other, to
avoid the same C -> CPP inheritance problems. This is all one giant
patch because there just isn't a clean way to separate it.
I've done the same for Fortran, since it effectively inherited
CCompiler (I say effectively because what it actually did was gross
beyond explanation). It's probably not correct, but it seems to work for
now. There really is a lot of layering violation going on in the
compilers, and a really good scrubbing would do this code a lot of good.
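For illustration, the rough shape of the mixin layout (a heavily simplified
sketch; the real classes carry far more state and functionality):

    class Compiler:
        """Language-independent base class."""
        language = None

    class CLikeCompiler:
        """Mixin holding behaviour shared by all C-like compilers.
        It is not a Compiler subclass, so C++ no longer has to inherit
        from CCompiler to get this functionality."""
        def get_output_args(self, target):
            return ['-o', target]

    class CCompiler(CLikeCompiler, Compiler):
        language = 'c'

    class CPPCompiler(CLikeCompiler, Compiler):
        language = 'cpp'   # shares CLikeCompiler without subclassing CCompiler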
After the previous commit, outfile is now passed down to lots of things
which don't use it, as they only create build statements rather than
writing them out. Remove these unnecessary args.
Store the build statements and then write them all out, rather than
writing them out as we go.
Construct a NinjaBuildElement for the 'PHONY' target, rather than
writing it literally to the build.ninja file.
After the previous commit, outfile is now passed down to lots of things
which don't use it, as they only create rules, rather than writing them
out. Remove these unnecessary args.
Store the rules and then write them all out, rather than writing them
out as we go.
Store the rule broken down into the parts which do and don't go into the
rspfile, so we can construct either an rsp or a non-rsp version of the
rule.
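As a sketch of the idea (the class and the exact rule syntax here are
illustrative, not the actual NinjaRule implementation):

    class Rule:
        def __init__(self, name, command, args):
            self.name = name
            self.command = command  # part that always stays on the command line
            self.args = args        # part that may be moved into an rspfile

        def write(self, outfile, use_rsp):
            outfile.write('rule {}\n'.format(self.name))
            if use_rsp:
                outfile.write(' command = {} @$out.rsp\n'.format(' '.join(self.command)))
                outfile.write(' rspfile = $out.rsp\n')
                outfile.write(' rspfile_content = {}\n'.format(' '.join(self.args)))
            else:
                outfile.write(' command = {}\n'.format(' '.join(self.command + self.args)))

Because the parts are stored rather than written immediately, the same rule
can be emitted in either form when the rules are finally flushed to
build.ninja.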
Setting this variable to contain additional commands to symlink shlib
aliases was removed in commit c0bf3e8d, so it's always unset now, and
thus does nothing.
This isn't safe given the way Python implements default arguments.
Basically, Python evaluates the default value once and stores a reference
to that instance, and then if the argument is not provided it uses that
same instance. That means two calls to the same function get the same
instance, so if one of them mutates it, every subsequent call that gets
the default will receive the mutated instance. The Python idiom for this
is to use None as the default and replace the None inside the function:

    from typing import List, Optional

    def contains(value: str, container: Optional[List[str]] = None) -> bool:
        return value in (container or [])

If there is no chance of mutation it's less code to use `or` and take
advantage of None being falsy. If you may want to mutate the value
passed in you need a ternary (this example is stupid):

    def add(value: str, container: Optional[List[str]] = None) -> None:
        container = container if container is not None else []
        container.append(value)

I've used `or` everywhere I'm sure that the value will not be mutated by
the function, and erred toward caution by using ternaries for the rest.
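For reference, a minimal demonstration of the pitfall itself (the function
name is made up for the example):

    def append_to(value, container=[]):  # the default list is created once, at definition time
        container.append(value)
        return container

    append_to(1)  # returns [1]
    append_to(2)  # returns [1, 2] -- the same list object is reused across calls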
If find_program() returns a file from the source directory, anything
that uses it should add the file to the dependencies, so that they are
rebuilt whenever the script changes. Generator is not doing that.
While at it, I am doing two related fixes:
- Generator is not checking whether the generator program actually was
found, resulting in a Python error involving NoneType if it isn't. To
minimize backwards-compatibility issues, I am only raising the error when
g.process() is actually called.
- The error message for custom_target with a nonexistent program
erroneously mentions a not-found external program "nonexistingprogram".
The new error is similar to the one I am adding for generators.
Instead of special-casing threads in the backends and everywhere else,
do what we did for OpenMP: create a real dependency. Then make use of
the fact that dependencies can now have sub-dependencies to add threads.
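In sketch form, the shape of the change is roughly this (simplified; method
names such as thread_flags() are approximations of the compiler interface):

    class Dependency:
        def __init__(self):
            self.compile_args = []
            self.link_args = []
            self.ext_deps = []   # sub-dependencies

        def get_compile_args(self):
            args = list(self.compile_args)
            for dep in self.ext_deps:
                args += dep.get_compile_args()
            return args

    class ThreadDependency(Dependency):
        def __init__(self, compiler):
            super().__init__()
            # ask the compiler for its threading flags instead of
            # hard-coding them in each backend
            self.compile_args = compiler.thread_flags()
            self.link_args = compiler.thread_link_flags()

A dependency that needs threading can then simply append a ThreadDependency
to its ext_deps, rather than every backend knowing about pthreads.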
commit b02b2d6d0d462310b313588ca7705d391e830eeb (Michael Hirsch, Ph.D, 2019-03-10): cleanup
commit 3311ff5fb12577c78671bf2ff2787d28b86ba5fa (Michael Hirsch, Ph.D, 2019-03-10): more robust
commit 8030dcb76698b148ee47ecded1f33b6d3821cca2 (Michael Hirsch, Ph.D, 2019-03-10): inwork compiles OK but needs smod filenames
This patch creates an enum for selecting libtype as static, shared,
prefer-static, or prefer-shared. It also renames 'static-shared' to
'prefer_static' and 'shared-static' to 'prefer_shared'. This is just a
refactor, with no behavioral or user-facing changes.
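Roughly, the enum looks like this (a sketch; the member values and exact
spelling in the real code may differ):

    import enum

    class LibType(enum.Enum):
        """Which kind of library a lookup should accept."""
        STATIC = 0
        SHARED = 1
        PREFER_STATIC = 2   # previously the string 'static-shared'
        PREFER_SHARED = 3   # previously the string 'shared-static'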
Currently we special-case OpenMP like we do threads, with a special
`need_openmp` method. This seems like a great idea, but it doesn't work
out in practice, and it complicates the OpenMP implementation. If GCC is
built without OpenMP support, for example, we still add -fopenmp to the
command line, which results in compilation errors.
This patch discards that and treats OpenMP like a normal dependency: it
removes the need_openmp() method and sets the compile_args attributes
from the compiler.
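A minimal sketch of that shape (simplified; the support check shown here is
hypothetical, and the real detection is more involved):

    class OpenMPDependency:
        def __init__(self, compiler):
            self.is_found = False
            self.compile_args = []
            self.link_args = []
            flags = compiler.openmp_flags()   # e.g. ['-fopenmp']
            # Only mark the dependency as found when the compiler actually
            # supports OpenMP; otherwise -fopenmp would be added blindly
            # and compilation would fail.
            if compiler.has_openmp_support():  # hypothetical probe
                self.is_found = True
                self.compile_args = flags
                self.link_args = flags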
Fixes #5115
This does two things:
* On Windows, GCC-like compilers now always specify the subsystem
explicitly (either -mwindows or -mconsole). MSVC is already explicit.
* The gui_app linker flags are now added after those mandated by
external dependencies. This is because some misguided libraries (such
as SDL) think that hijacking `main()` and forcing `-mwindows` into the
link flags is clever. We must unconditionally override such misuses so
that gui_app works as intended.
In addition to MSVC, which was worked around previously, GCC also does
not list includes from the PCH in the depfile by default, unless
-fpch-deps is given. I think it's best to stay safe and not rely on any
particular behavior from the compiler here.
Instead use coredata.compiler_options.<machine>. This brings the cross
and native code paths closer together, since both now use that.
Command line options are interpreted just as before, for backwards
compatibility. This does introduce some funny conditionals. In the
future, I'd like to change the interpretation of command line options so that:
- The logic is cross-agnostic, i.e. there are no conditions affected by
`is_cross_build()`.
- Compiler args for both the build and host machines can always be
controlled by the command line.
- Compiler args for both machines can always be controlled separately.
macOS provides the tool `lipo` to check the architectures supported by an
object file (executable, static library, dylib, etc.). This is especially
useful for fat archives, but it also helps with thin archives.
Without this check, the linker will fail to link to the library we
mistakenly 'found', like so:
ld: warning: ignoring file /path/to/libfoo.a, missing required architecture armv7 in file /path/to/libfoo.a
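A minimal sketch of such a check (the helper name is made up; it shells out
to `lipo -info`, whose output ends with the list of architectures in the
file):

    import subprocess

    def darwin_get_archs(libpath):
        """Return the architectures contained in a Mach-O file, per lipo."""
        out = subprocess.check_output(['lipo', '-info', libpath],
                                      universal_newlines=True)
        # Typical output:
        #   Architectures in the fat file: libfoo.a are: armv7 arm64
        #   Non-fat file: libfoo.a is architecture: x86_64
        return out.strip().split(': ')[-1].split()

A library lookup can then reject a candidate whose returned list does not
contain the architecture being targeted.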