Compare commits

...

2 Commits

Author SHA1 Message Date
Wolfgang Stöggl 32f64aa2cb docs: Add backend vs2019 to table in Builtin-options.md [skip ci] 5 years ago
Wolfgang Stöggl f037e7ef45 Fix typos found by codespell 5 years ago
55 changed files (changed line counts in parentheses):

  1. docs/markdown/Build-options.md (2)
  2. docs/markdown/Builtin-options.md (4)
  3. docs/markdown/CMake-module.md (4)
  4. docs/markdown/Continuous-Integration.md (2)
  5. docs/markdown/Contributing.md (4)
  6. docs/markdown/Dependencies.md (6)
  7. docs/markdown/Design-rationale.md (2)
  8. docs/markdown/FAQ.md (2)
  9. docs/markdown/Kconfig-module.md (2)
 10. docs/markdown/Localisation.md (2)
 11. docs/markdown/Pkg-config-files.md (2)
 12. docs/markdown/Reference-manual.md (4)
 13. docs/markdown/Rewriter.md (2)
 14. docs/markdown/Unit-tests.md (2)
 15. docs/markdown/Vala.md (2)
 16. mesonbuild/ast/interpreter.py (2)
 17. mesonbuild/ast/introspection.py (4)
 18. mesonbuild/backend/ninjabackend.py (2)
 19. mesonbuild/cmake/data/run_ctgt.py (6)
 20. mesonbuild/cmake/fileapi.py (4)
 21. mesonbuild/cmake/interpreter.py (10)
 22. mesonbuild/cmake/traceparser.py (4)
 23. mesonbuild/compilers/compilers.py (2)
 24. mesonbuild/compilers/cpp.py (2)
 25. mesonbuild/compilers/cuda.py (4)
 26. mesonbuild/compilers/mixins/emscripten.py (2)
 27. mesonbuild/compilers/mixins/visualstudio.py (2)
 28. mesonbuild/coredata.py (4)
 29. mesonbuild/dependencies/base.py (4)
 30. mesonbuild/dependencies/dev.py (6)
 31. mesonbuild/envconfig.py (2)
 32. mesonbuild/environment.py (6)
 33. mesonbuild/interpreter.py (2)
 34. mesonbuild/interpreterbase.py (4)
 35. mesonbuild/linkers.py (2)
 36. mesonbuild/mconf.py (2)
 37. mesonbuild/mesonlib.py (8)
 38. mesonbuild/mesonmain.py (10)
 39. mesonbuild/mintro.py (2)
 40. mesonbuild/mlog.py (2)
 41. mesonbuild/modules/cmake.py (2)
 42. mesonbuild/modules/gnome.py (2)
 43. mesonbuild/modules/unstable_simd.py (2)
 44. mesonbuild/mtest.py (2)
 45. mesonbuild/munstable_coredata.py (2)
 46. mesonbuild/rewriter.py (12)
 47. run_unittests.py (18)
 48. test cases/common/187 find override/otherdir/meson.build (2)
 49. test cases/common/194 check header/meson.build (10)
 50. test cases/common/203 function attributes/meson.build (2)
 51. test cases/common/218 dependency get_variable method/meson.build (14)
 52. test cases/common/34 has header/meson.build (10)
 53. test cases/common/62 install subdir/installed_files.txt (2)
 54. test cases/common/62 install subdir/nested_elided/sub/dircheck/ninth.dat (0)
 55. test cases/unit/65 cmake parser/meson.build (2)

@@ -94,7 +94,7 @@ if d.found()
 endif
 ```
-If the value of a `feature` option is set to `auto`, that value is overriden by
+If the value of a `feature` option is set to `auto`, that value is overridden by
 the global `auto_features` option (which defaults to `auto`). This is intended
 to be used by packagers who want to have full control on which dependencies are
 required and which are disabled, and not rely on build-deps being installed

@@ -55,7 +55,7 @@ particularly the paths section may be necessary.
 ### Core options
-Options that are labled "per machine" in the table are set per machine.
+Options that are labeled "per machine" in the table are set per machine.
 Prefixing the option with `build.` just affects the build machine configuration,
 while unprefixed just affects the host machine configuration, respectively.
 Using the option as-is with no prefix affects all machines. For example:
@@ -69,7 +69,7 @@ Using the option as-is with no prefix affects all machines. For example:
 | Option | Default value | Description | Is per machine |
 | ------ | ------------- | ----------- | -------------- |
 | auto_features {enabled, disabled, auto} | auto | Override value of all 'auto' features | no |
-| backend {ninja, vs,<br>vs2010, vs2015, vs2017, xcode} | ninja | Backend to use | no |
+| backend {ninja, vs,<br>vs2010, vs2015, vs2017, vs2019, xcode} | ninja | Backend to use | no |
 | buildtype {plain, debug,<br>debugoptimized, release, minsize, custom} | debug | Build type to use | no |
 | debug | true | Debug | no |
 | default_library {shared, static, both} | shared | Default library type | no |

@@ -86,14 +86,14 @@ and supports the following methods:
 - `dependency(target)` returns a dependency object for any CMake target.
 - `include_directories(target)` returns a meson `include_directories()`
-object for the specified target. Using this function is not neccessary
+object for the specified target. Using this function is not necessary
 if the dependency object is used.
 - `target(target)` returns the raw build target.
 - `target_type(target)` returns the type of the target as a string
 - `target_list()` returns a list of all target *names*.
 - `get_variable(name)` fetches the specified variable from inside
 the subproject. Usually `dependency()` or `target()` should be
-prefered to extract build targets.
+preferred to extract build targets.
 ## CMake configuration files

@@ -74,7 +74,7 @@ install:
 - cmd: if %arch%==x86 (set PYTHON_ROOT=C:\python37) else (set PYTHON_ROOT=C:\python37-x64)
 # Print out dependency paths
 - cmd: echo Using Python at %PYTHON_ROOT%
-# Add neccessary paths to PATH variable
+# Add necessary paths to PATH variable
 - cmd: set PATH=%cd%;C:\ninja-build;%PYTHON_ROOT%;%PYTHON_ROOT%\Scripts;%PATH%
 # Install meson
 - cmd: pip install meson

@@ -38,7 +38,7 @@ Every new feature requires some extra steps, namely:
 into `run_unittests.py`.
 - Must be registered with the [FeatureChecks framework](Release-notes-for-0.47.0.md#feature-detection-based-on-meson_version-in-project)
 that will warn the user if they try to use a new feature while
-targetting an older meson version.
+targeting an older meson version.
 - Needs a release note snippet inside `docs/markdown/snippets/` with
 a heading and a brief paragraph explaining what the feature does
 with an example.
@@ -84,7 +84,7 @@ In a simplified list form the split would look like the following:
 ## Strategy for merging pull requests to trunk
-Meson's merge strategy should fullfill the following guidelines:
+Meson's merge strategy should fulfill the following guidelines:
 - preserve as much history as possible

@@ -137,7 +137,7 @@ of all the work behind the scenes to make this work.
 You can use the keyword `method` to let meson know what method to use
 when searching for the dependency. The default value is `auto`.
-Aditional dependencies methods are `pkg-config`, `config-tool`, `cmake`,
+Additional dependencies methods are `pkg-config`, `config-tool`, `cmake`,
 `system`, `sysconfig`, `qmake`, `extraframework` and `dub`.
 ```meson
@@ -159,11 +159,11 @@ to use both the old-style `<NAME>_LIBRARIES` variables as well as
 imported targets.
 It is possible to manually specify a list of CMake targets that should
-be used with the `modules` property. Howerver, this step is optional
+be used with the `modules` property. However, this step is optional
 since meson tries to automatically guess the correct target based on the
 name of the dependency.
-Depending on the dependency it may be neccessary to explicitly specify
+Depending on the dependency it may be necessary to explicitly specify
 a CMake target with the `modules` property if meson is unable to guess
 it automatically.

@@ -226,7 +226,7 @@ First we build a shared library named foobar. It is marked
 installable, so running `ninja install` installs it to the library
 directory (the system knows which one so the user does not have to
 care). Then we build a test executable which is linked against the
-library. It will no tbe installed, but instead it is added to the list
+library. It will not be installed, but instead it is added to the list
 of unit tests, which can be run with the command `ninja test`.
 Above we mentioned precompiled headers as a feature not supported by

@@ -324,7 +324,7 @@ that Windows developers should be able to contribute using nothing but
 Visual Studio.
 At the time of writing (April 2018) there are only three languages
-that could fullfill these requirements:
+that could fulfill these requirements:
 - C
 - C++

@@ -14,7 +14,7 @@ configurations in meson projects.
 **Note**: this does not provide kconfig frontend tooling to generate a
 configuration. You still need something such as kconfig frontends (see
 link below) to parse your Kconfig files, and then (after you've
-choosen the configuration options), output a ".config" file.
+chosen the configuration options), output a ".config" file.
 [kconfig-frontends]: http://ymorin.is-a-geek.org/projects/kconfig-frontends

@@ -7,7 +7,7 @@ short-description: Localization with GNU Gettext
 Localising your application with GNU gettext takes a little effort but is quite straightforward. We'll create a `po` subdirectory at your project root directory for all the localisation info.
 ## Generating .pot and .po files
-In your main meson.build file include the `po` subdirectory in the build proces.
+In your main meson.build file include the `po` subdirectory in the build process.
 subdir('po')

@@ -18,4 +18,4 @@ pkg.generate(libraries : libs,
 This causes a file called `simple.pc` to be created and placed into the install directory during the install phase.
-More infromation on the pkg-config module and the parameters can be found on the [pkgconfig-module](Pkgconfig-module.md) page.
+More information on the pkg-config module and the parameters can be found on the [pkgconfig-module](Pkgconfig-module.md) page.

@@ -697,7 +697,7 @@ Keyword arguments are the following:
 [`dependency()`](#dependency) for argument format. The version of the program
 is determined by running `program_name --version` command. If stdout is empty
 it fallbacks to stderr. If the output contains more text than simply a version
-number, only the first occurence of numbers separated by dots is kept.
+number, only the first occurrence of numbers separated by dots is kept.
 If the output is more complicated than that, the version checking will have to
 be done manually using [`run_command()`](#run_command).
@@ -1152,7 +1152,7 @@ You should use this instead of [`shared_library`](#shared_library),
 to toggle your entire project (including subprojects) from shared to
 static with only one option. This option applies to libraries being
 built internal to the entire project. For external dependencies, the
-default library type prefered is shared. This can be adapted on a per
+default library type preferred is shared. This can be adapted on a per
 library basis using the [dependency()](#dependency)) `static` keyword.
 The keyword arguments for this are the same as for
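The version-detection behaviour described in the first hunk above (keep only the first occurrence of numbers separated by dots) can be sketched with a small regex. This is an illustrative approximation, not Meson's actual implementation.

```python
import re

def extract_version(output: str) -> str:
    """Keep only the first occurrence of dot-separated numbers, mimicking
    the find_program() version-check behaviour described in the docs.
    (Illustrative sketch only, not Meson's actual code.)"""
    match = re.search(r'\d+(\.\d+)+', output)
    # Fall back to the raw output when no dotted number is found.
    return match.group(0) if match else output.strip()

print(extract_version('git version 2.25.1'))       # -> 2.25.1
print(extract_version('foo 1.2.3 (build 4.5.6)'))  # -> 1.2.3
```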

@@ -107,7 +107,7 @@ meson rewrite default-options {set/delete} <opt1> <value1> <opt2> <value2> ...
 ## Limitations
-Rewriting a meson file is not guranteed to keep the indentation of the modified
+Rewriting a meson file is not guaranteed to keep the indentation of the modified
 functions. Additionally, comments inside a modified statement will be removed.
 Furthermore, all source files will be sorted alphabetically.

@@ -169,7 +169,7 @@ $ meson test --gdb --gdb-path /path/to/gdb testname
 $ meson test --print-errorlogs
 ```
-Meson will report the output produced by the failing tests along with other useful informations as the environmental variables. This is useful, for example, when you run the tests on Travis-CI, Jenkins and the like.
+Meson will report the output produced by the failing tests along with other useful information as the environmental variables. This is useful, for example, when you run the tests on Travis-CI, Jenkins and the like.
 For further information see the command line help of Meson by running `meson test -h`.

@@ -82,7 +82,7 @@ development files. The VAPI is installed in Vala's standard search path and so
 works just as seamlessly using the `dependency()` function.
-### Targetting a version of GLib
+### Targeting a version of GLib
 Meson's [`dependency()`](Reference-manual.md#dependency) function allows a
 version check of a library. This is often used to check a minimum version is
 installed. When setting a minimum version of GLib, Meson will also pass this to

@@ -289,7 +289,7 @@ class AstInterpreter(interpreterbase.InterpreterBase):
 l = quick_resolve(node.left)
 r = quick_resolve(node.right)
 if isinstance(l, str) and isinstance(r, str):
-result = l + r # String concatination detected
+result = l + r # String concatenation detected
 else:
 result = self.flatten_args(l, include_unknown_args, id_loop_detect) + self.flatten_args(r, include_unknown_args, id_loop_detect)
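The branch in the hunk above can be illustrated in isolation: concatenate only when both sides resolved to plain strings, otherwise flatten both sides and join them as lists. The `flatten` callable here is a hypothetical stand-in for the interpreter's `flatten_args` method.

```python
def resolve_add(l, r, flatten):
    # Mirrors the AstInterpreter branch: string + string concatenates
    # directly; anything else is flattened to lists and joined.
    # (Sketch; `flatten` stands in for self.flatten_args.)
    if isinstance(l, str) and isinstance(r, str):
        return l + r  # string concatenation detected
    return flatten(l) + flatten(r)

# A minimal stand-in flattener: wrap non-lists in a list.
flatten = lambda v: v if isinstance(v, list) else [v]
print(resolve_add('foo', 'bar', flatten))  # -> 'foobar'
print(resolve_add(['a'], 'b', flatten))    # -> ['a', 'b']
```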

@@ -35,7 +35,7 @@ class IntrospectionHelper:
 class IntrospectionInterpreter(AstInterpreter):
 # Interpreter to detect the options without a build directory
-# Most of the code is stolen from interperter.Interpreter
+# Most of the code is stolen from interpreter.Interpreter
 def __init__(self, source_root, subdir, backend, visitors=None, cross_file=None, subproject='', subproject_dir='subprojects', env=None):
 visitors = visitors if visitors is not None else []
 super().__init__(source_root, subdir, visitors=visitors)
@@ -161,7 +161,7 @@ class IntrospectionInterpreter(AstInterpreter):
 name = args[0]
 srcqueue = [node]
-# Process the soruces BEFORE flattening the kwargs, to preserve the original nodes
+# Process the sources BEFORE flattening the kwargs, to preserve the original nodes
 if 'sources' in kwargs:
 srcqueue += mesonlib.listify(kwargs['sources'])

@@ -165,7 +165,7 @@ class NinjaBuildElement:
 # This is the only way I could find to make this work on all
 # platforms including Windows command shell. Slash is a dir separator
 # on Windows, too, so all characters are unambiguous and, more importantly,
-# do not require quoting, unless explicitely specified, which is necessary for
+# do not require quoting, unless explicitly specified, which is necessary for
 # the csc compiler.
 line = line.replace('\\', '/')
 outfile.write(line)

@@ -7,14 +7,14 @@ import os
 import sys
 commands = [[]]
-SEPERATOR = ';;;'
+SEPARATOR = ';;;'
 # Generate CMD parameters
 parser = argparse.ArgumentParser(description='Wrapper for add_custom_command')
 parser.add_argument('-d', '--directory', type=str, metavar='D', required=True, help='Working directory to cwd to')
 parser.add_argument('-o', '--outputs', nargs='+', metavar='O', required=True, help='Expected output files')
 parser.add_argument('-O', '--original-outputs', nargs='+', metavar='O', required=True, help='Output files expected by CMake')
-parser.add_argument('commands', nargs=argparse.REMAINDER, help='A "{}" seperated list of commands'.format(SEPERATOR))
+parser.add_argument('commands', nargs=argparse.REMAINDER, help='A "{}" separated list of commands'.format(SEPARATOR))
 # Parse
 args = parser.parse_args()
@@ -24,7 +24,7 @@ if len(args.outputs) != len(args.original_outputs):
 sys.exit(1)
 for i in args.commands:
-if i == SEPERATOR:
+if i == SEPARATOR:
 commands += [[]]
 continue
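The loop in this hunk splits one flat argument list into per-command lists on the `';;;'` separator. A self-contained sketch of that shape of loop, outside the argparse wrapper:

```python
def split_commands(argv, separator=';;;'):
    """Split a flat argument list into sub-command lists on `separator`,
    the same shape of loop used by run_ctgt.py (illustrative sketch)."""
    commands = [[]]
    for arg in argv:
        if arg == separator:
            # Start a new sub-command on each separator token.
            commands.append([])
            continue
        commands[-1].append(arg)
    return commands

print(split_commands(['cp', 'a', 'b', ';;;', 'touch', 'c']))
# -> [['cp', 'a', 'b'], ['touch', 'c']]
```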

@@ -155,7 +155,7 @@ class CMakeFileAPI:
 link_flags += [i['fragment']]
 # TODO The `dependencies` entry is new in the file API.
-# maybe we can make use of that in addtion to the
+# maybe we can make use of that in addition to the
 # implicit dependency detection
 tgt_data = {
 'artifacts': [x.get('path', '') for x in tgt.get('artifacts', [])],
@@ -175,7 +175,7 @@ class CMakeFileAPI:
 processed_src_idx = []
 for cg in tgt.get('compileGroups', []):
 # Again, why an array, when there is usually only one element
-# and arguments are seperated with spaces...
+# and arguments are separated with spaces...
 flags = []
 for i in cg.get('compileCommandFragments', []):
 flags += [i['fragment']]

@@ -404,7 +404,7 @@ class ConverterCustomTarget:
 # Modify the original outputs if they are relative. Again,
 # relative paths are relative to ${CMAKE_CURRENT_BINARY_DIR}
-# and the first disclaimer is stil in effect
+# and the first disclaimer is still in effect
 def ensure_absolute(x: str):
 if os.path.isabs(x):
 return x
@@ -562,9 +562,9 @@ class CMakeInterpreter:
 raise CMakeException('Failed to configure the CMake subproject')
 def initialise(self, extra_cmake_options: List[str]) -> None:
-# Run configure the old way becuse doing it
+# Run configure the old way because doing it
 # with the server doesn't work for some reason
-# Aditionally, the File API requires a configure anyway
+# Additionally, the File API requires a configure anyway
 self.configure(extra_cmake_options)
 # Continue with the file API If supported
@@ -869,7 +869,7 @@ class CMakeInterpreter:
 def process_custom_target(tgt: ConverterCustomTarget) -> None:
 # CMake allows to specify multiple commands in a custom target.
 # To map this to meson, a helper script is used to execute all
-# commands in order. This addtionally allows setting the working
+# commands in order. This additionally allows setting the working
 # directory.
 tgt_var = tgt.name # type: str
@@ -893,7 +893,7 @@ class CMakeInterpreter:
 command += ['-O'] + tgt.original_outputs
 command += ['-d', tgt.working_dir]
-# Generate the commands. Subcommands are seperated by ';;;'
+# Generate the commands. Subcommands are separated by ';;;'
 for cmd in tgt.command:
 command += [resolve_source(x) for x in cmd] + [';;;']

@@ -499,11 +499,11 @@ class CMakeTraceParser:
 if curr_str is None:
 curr_str = i
 elif os.path.isfile(curr_str):
-# Abort concatination if curr_str is an existing file
+# Abort concatenation if curr_str is an existing file
 fixed_list += [curr_str]
 curr_str = i
 elif not reg_start.match(curr_str):
-# Abort concatination if curr_str no longer matches the regex
+# Abort concatenation if curr_str no longer matches the regex
 fixed_list += [curr_str]
 curr_str = i
 elif reg_end.match(i) or os.path.exists('{} {}'.format(curr_str, i)):

@@ -554,7 +554,7 @@ class CompilerArgs(list):
 default_dirs = self.compiler.get_default_include_dirs()
 bad_idx_list = []
 for i, each in enumerate(new):
-# Remove the -isystem and the path if the path is a dafault path
+# Remove the -isystem and the path if the path is a default path
 if (each == '-isystem' and
 i < (len(new) - 1) and
 new[i + 1] in default_dirs):

@@ -552,7 +552,7 @@ class IntelClCPPCompiler(VisualStudioLikeCPPCompilerMixin, IntelVisualStudioLike
 IntelVisualStudioLikeCompiler.__init__(self, target)
 def get_options(self):
-# This has only been tested with verison 19.0,
+# This has only been tested with version 19.0,
 cpp_stds = ['none', 'c++11', 'vc++11', 'c++14', 'vc++14', 'c++17', 'vc++17', 'c++latest']
 return self._get_options_impl(super().get_options(), cpp_stds)

@@ -157,7 +157,7 @@ class CudaCompiler(Compiler):
 raise EnvironmentException('Executables created by {0} compiler {1} are not runnable.'.format(self.language, self.name_string()))
 # Interpret the result of the sanity test.
-# As mentionned above, it is not only a sanity test but also a GPU
+# As mentioned above, it is not only a sanity test but also a GPU
 # architecture detection test.
 if stde == '':
 self.detected_cc = stdo
@@ -191,7 +191,7 @@ class CudaCompiler(Compiler):
 def get_option_compile_args(self, options):
 args = []
 # On Windows, the version of the C++ standard used by nvcc is dictated by
-# the combination of CUDA version and MSVC verion; the --std= is thus ignored
+# the combination of CUDA version and MSVC version; the --std= is thus ignored
 # and attempting to use it will result in a warning: https://stackoverflow.com/a/51272091/741027
 if not is_windows():
 std = options['cuda_std']

@@ -37,7 +37,7 @@ class EmscriptenMixin:
 if mode == 'preprocess':
 return None
 # Unlike sane toolchains, emcc infers the kind of output from its name.
-# This is the only reason why this method is overriden; compiler tests
+# This is the only reason why this method is overridden; compiler tests
 # do not work well with the default exe/obj suffices.
 if mode == 'link':
 suffix = 'js'

@@ -83,7 +83,7 @@ class VisualStudioLikeCompiler(metaclass=abc.ABCMeta):
 A number of compilers attempt to mimic MSVC, with varying levels of
 success, such as Clang-CL and ICL (the Intel C/C++ Compiler for Windows).
-This classs implements as much common logic as possible.
+This class implements as much common logic as possible.
 """
 std_warn_args = ['/W3']

@@ -124,7 +124,7 @@ class UserIntegerOption(UserOption[int]):
 try:
 return int(valuestring)
 except ValueError:
-raise MesonException('Value string "%s" is not convertable to an integer.' % valuestring)
+raise MesonException('Value string "%s" is not convertible to an integer.' % valuestring)
 class UserUmaskOption(UserIntegerOption, UserOption[Union[str, int]]):
 def __init__(self, description, value, yielding=None):
@@ -412,7 +412,7 @@ class CoreData:
 real.append(copy)
 # Also replace the command line argument, as the pipe
-# probably wont exist on reconfigure
+# probably won't exist on reconfigure
 filenames[i] = copy
 continue
 if sys.platform != 'win32':
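The `UserIntegerOption` hunk above wraps `int()` conversion to produce a descriptive error message. A standalone sketch of the same pattern; Meson raises its own `MesonException`, while this example substitutes a plain `ValueError`:

```python
def to_int(valuestring: str) -> int:
    """Convert an option string to int, raising a descriptive error on
    failure, in the spirit of UserIntegerOption. (Sketch; Meson raises
    MesonException, a plain ValueError is used here.)"""
    try:
        return int(valuestring)
    except ValueError:
        # Re-raise with a message that names the offending value.
        raise ValueError('Value string "%s" is not convertible to an integer.' % valuestring)

print(to_int('42'))  # -> 42
```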

@@ -197,7 +197,7 @@ class Dependency:
 """Create a new dependency that contains part of the parent dependency.
 The following options can be inherited:
-links -- all link_with arguemnts
+links -- all link_with arguments
 includes -- all include_directory and -I/-isystem calls
 sources -- any source, header, or generated sources
 compile_args -- any compile args
@@ -450,7 +450,7 @@ class ConfigToolDependency(ExternalDependency):
 return cls.__new__(cls)
 def find_config(self, versions=None):
-"""Helper method that searchs for config tool binaries in PATH and
+"""Helper method that searches for config tool binaries in PATH and
 returns the one that best matches the given version requirements.
 """
 if not isinstance(versions, list) and versions is not None:

@@ -33,7 +33,7 @@ from typing import List, Tuple
 def get_shared_library_suffix(environment, for_machine: MachineChoice):
-"""This is only gauranteed to work for languages that compile to machine
+"""This is only guaranteed to work for languages that compile to machine
 code, not for languages like C# that use a bytecode and always end in .dll
 """
 m = environment.machines[for_machine]
@@ -292,7 +292,7 @@ class LLVMDependencyConfigTool(ConfigToolDependency):
 if not self.static and mode == 'static':
 # If llvm is configured with LLVM_BUILD_LLVM_DYLIB but not with
 # LLVM_LINK_LLVM_DYLIB and not LLVM_BUILD_SHARED_LIBS (which
-# upstreams doesn't recomend using), then llvm-config will lie to
+# upstream doesn't recommend using), then llvm-config will lie to
 # you about how to do shared-linking. It wants to link to a a bunch
 # of individual shared libs (which don't exist because llvm wasn't
 # built with LLVM_BUILD_SHARED_LIBS.
@@ -305,7 +305,7 @@ class LLVMDependencyConfigTool(ConfigToolDependency):
 except DependencyException:
 lib_ext = get_shared_library_suffix(environment, self.for_machine)
 libdir = self.get_config_value(['--libdir'], 'link_args')[0]
-# Sort for reproducability
+# Sort for reproducibility
 matches = sorted(glob.iglob(os.path.join(libdir, 'libLLVM*{}'.format(lib_ext))))
 if not matches:
 if self.required:

@@ -350,7 +350,7 @@ This is probably wrong, it should always point to the native compiler.''' % evar
 First tries looking in explicit map, then tries environment variable.
 """
-# Try explict map, don't fall back on env var
+# Try explicit map, don't fall back on env var
 command = self.binaries.get(name)
 if command is not None:
 command = mesonlib.stringlistify(command)

@@ -771,7 +771,7 @@ class Environment:
 will be appended. This means that if a space is required (such as
 with swift which wants `-Xlinker --version` and *not*
 `-Xlinker=--version`) you must pass as a list.
-:extra_args: Any addtional arguments rquired (such as a source file)
+:extra_args: Any additional arguments required (such as a source file)
 """
 extra_args = typing.cast(typing.List[str], extra_args or [])
 if isinstance(prefix, str):
@@ -1316,7 +1316,7 @@ class Environment:
 # figure out what linker rustc is using for a non-nightly compiler
 # (On nightly you can pass -Z print-link-args). So we're going to
 # hard code the linker based on the platform.
-# Currenty gnu ld is used for everything except apple by
+# Currently gnu ld is used for everything except apple by
 # default, and apple ld is used on mac.
 # TODO: find some better way to figure this out.
 if self.machines[for_machine].is_darwin():
@@ -1376,7 +1376,7 @@ class Environment:
 linker = MSVCDynamicLinker(for_machine, version=version)
 else:
 # Getting LDC on windows to give useful linker output when not
-# doing real work is painfully hard. It ships with a verison of
+# doing real work is painfully hard. It ships with a version of
 # lld-link, so just assume that we're going to use lld-link
 # with it.
 _, o, _ = Popen_safe(['lld-link.exe', '--version'])

@@ -2111,7 +2111,7 @@ class Interpreter(InterpreterBase):
 # Re-initialize machine descriptions. We can do a better job now because we
 # have the compilers needed to gain more knowledge, so wipe out old
-# inferrence and start over.
+# inference and start over.
 machines = self.build.environment.machines.miss_defaulting()
 machines.build = environment.detect_machine_info(self.coredata.compilers.build)
 self.build.environment.machines = machines.default_missing()

@@ -243,7 +243,7 @@ class FeatureNew(FeatureCheckBase):
 return 'Project specifies a minimum meson_version \'{}\' but uses features which were added in newer versions:'.format(tv)
 def log_usage_warning(self, tv):
-mlog.warning('Project targetting \'{}\' but tried to use feature introduced '
+mlog.warning('Project targeting \'{}\' but tried to use feature introduced '
 'in \'{}\': {}'.format(tv, self.feature_version, self.feature_name))
 class FeatureDeprecated(FeatureCheckBase):
@@ -258,7 +258,7 @@ class FeatureDeprecated(FeatureCheckBase):
 return 'Deprecated features used:'
 def log_usage_warning(self, tv):
-mlog.deprecation('Project targetting \'{}\' but tried to use feature '
+mlog.deprecation('Project targeting \'{}\' but tried to use feature '
 'deprecated since \'{}\': {}'
 ''.format(tv, self.feature_version, self.feature_name))

@@ -161,7 +161,7 @@ class ArmarLinker(ArLinker):
 self.std_args = ['-csr']
 def can_linker_accept_rsp(self) -> bool:
-# armar cann't accept arguments using the @rsp syntax
+# armar can't accept arguments using the @rsp syntax
 return False

@@ -62,7 +62,7 @@ class Conf:
 self.source_dir = os.path.abspath(os.path.realpath(self.build_dir))
 intr = mintro.IntrospectionInterpreter(self.source_dir, '', 'ninja', visitors = [AstIDGenerator()])
 intr.analyze()
-# Reenable logging just in case
+# Re-enable logging just in case
 mlog.enable()
 self.coredata = intr.coredata
 self.default_values_only = True

@@ -398,7 +398,7 @@ class PerMachineDefaultable(PerMachine[typing.Optional[_T]]):
 super().__init__(None, None)
 def default_missing(self) -> "PerMachine[typing.Optional[_T]]":
-"""Default host to buid
+"""Default host to build
 This allows just specifying nothing in the native case, and just host in the
 cross non-compiler case.
@@ -416,7 +416,7 @@ class PerThreeMachineDefaultable(PerMachineDefaultable, PerThreeMachine[typing.O
 PerThreeMachine.__init__(self, None, None, None)
 def default_missing(self) -> "PerThreeMachine[typing.Optional[_T]]":
-"""Default host to buid and target to host.
+"""Default host to build and target to host.
 This allows just specifying nothing in the native case, just host in the
 cross non-compiler case, and just target in the native-built
@@ -688,9 +688,9 @@ def default_prefix():
 def get_library_dirs() -> typing.List[str]:
 if is_windows():
-return ['C:/mingw/lib'] # TODO: get programatically
+return ['C:/mingw/lib'] # TODO: get programmatically
 if is_osx():
-return ['/usr/lib'] # TODO: get programatically
+return ['/usr/lib'] # TODO: get programmatically
 # The following is probably Debian/Ubuntu specific.
 # /usr/local/lib is first because it contains stuff
 # installed by the sysadmin and is probably more up-to-date

@@ -33,11 +33,11 @@ from .wrap import wraptool
 class CommandLineParser:
 def __init__(self):
 self.term_width = shutil.get_terminal_size().columns
-self.formater = lambda prog: argparse.HelpFormatter(prog, max_help_position=int(self.term_width / 2), width=self.term_width)
+self.formatter = lambda prog: argparse.HelpFormatter(prog, max_help_position=int(self.term_width / 2), width=self.term_width)
 self.commands = {}
 self.hidden_commands = []
-self.parser = argparse.ArgumentParser(prog='meson', formatter_class=self.formater)
+self.parser = argparse.ArgumentParser(prog='meson', formatter_class=self.formatter)
 self.subparsers = self.parser.add_subparsers(title='Commands',
 description='If no command is specified it defaults to setup command.')
 self.add_command('setup', msetup.add_arguments, msetup.run,
@@ -60,7 +60,7 @@ class CommandLineParser:
 help_msg='Manage subprojects')
 self.add_command('help', self.add_help_arguments, self.run_help_command,
 help_msg='Print help of a subcommand')
-self.add_command('rewrite', lambda parser: rewriter.add_arguments(parser, self.formater), rewriter.run,
+self.add_command('rewrite', lambda parser: rewriter.add_arguments(parser, self.formatter), rewriter.run,
 help_msg='Modify the project definition')
 # Hidden commands
@@ -74,10 +74,10 @@ class CommandLineParser:
 # FIXME: Cannot have hidden subparser:
 # https://bugs.python.org/issue22848
 if help_msg == argparse.SUPPRESS:
-p = argparse.ArgumentParser(prog='meson ' + name, formatter_class=self.formater)
+p = argparse.ArgumentParser(prog='meson ' + name, formatter_class=self.formatter)
 self.hidden_commands.append(name)
 else:
-p = self.subparsers.add_parser(name, help=help_msg, aliases=aliases, formatter_class=self.formater)
+p = self.subparsers.add_parser(name, help=help_msg, aliases=aliases, formatter_class=self.formatter)
 add_arguments_func(p)
 p.set_defaults(run_func=run_func)
 for i in [name] + aliases:
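The renamed `formatter` attribute in the hunks above relies on the fact that argparse's `formatter_class` only needs to be a callable taking `prog`, so a lambda can pre-bind layout parameters such as the help column position. A minimal standalone sketch of that pattern (the width values are example assumptions):

```python
import argparse

# argparse calls formatter_class(prog), so any callable works; a lambda
# lets us pre-bind terminal-width-dependent layout parameters, the same
# trick CommandLineParser uses. (Sketch; widths are example values.)
term_width = 80
formatter = lambda prog: argparse.HelpFormatter(
    prog, max_help_position=term_width // 2, width=term_width)

parser = argparse.ArgumentParser(prog='demo', formatter_class=formatter)
parser.add_argument('--verbose', action='store_true', help='Enable verbose output')
print(parser.format_help())
```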

@@ -388,7 +388,7 @@ def run(options):
 backend = backends.get_backend_from_name(options.backend, None)
 intr = IntrospectionInterpreter(sourcedir, '', backend.name, visitors = [AstIDGenerator(), AstIndentationGenerator(), AstConditionLevel()])
 intr.analyze()
-# Reenable logging just in case
+# Re-enable logging just in case
 mlog.enable()
 for key, val in intro_types.items():
 if (not options.all and not getattr(options, key, False)) or 'no_bd' not in val:

@@ -164,7 +164,7 @@ def force_print(*args: str, **kwargs: Any) -> None:
 cleaned = raw.encode('ascii', 'replace').decode('ascii')
 print(cleaned, end='')
-# We really want a heterogenous dict for this, but that's in typing_extensions
+# We really want a heterogeneous dict for this, but that's in typing_extensions
 def debug(*args: Union[str, AnsiDecorator], **kwargs: Any) -> None:
 arr = process_markup(args, False)
 if log_file is not None:

@@ -138,7 +138,7 @@ class CmakeModule(ExtensionModule):
 cmakebin = dependencies.ExternalProgram('cmake', silent=False)
 p, stdout, stderr = mesonlib.Popen_safe(cmakebin.get_command() + ['--system-information', '-G', 'Ninja'])[0:3]
 if p.returncode != 0:
-mlog.log('error retrieving cmake informations: returnCode={0} stdout={1} stderr={2}'.format(p.returncode, stdout, stderr))
+mlog.log('error retrieving cmake information: returnCode={0} stdout={1} stderr={2}'.format(p.returncode, stdout, stderr))
 return False
 match = re.search('\nCMAKE_ROOT \\"([^"]+)"\n', stdout.strip())

@@ -47,7 +47,7 @@ native_glib_version = None
 def gir_has_option(intr_obj, option):
 try:
 g_ir_scanner = intr_obj.find_program_impl('g-ir-scanner')
-# Handle overriden g-ir-scanner
+# Handle overridden g-ir-scanner
 if isinstance(getattr(g_ir_scanner, "held_object", g_ir_scanner), interpreter.OverrideProgram):
 assert option in ['--extra-library', '--sources-top-dirs']
 return True

@@ -60,7 +60,7 @@ class SimdModule(ExtensionModule):
 for iset in self.isets:
 if iset not in kwargs:
 continue
-iset_fname = kwargs[iset] # Migth also be an array or Files. static_library will validate.
+iset_fname = kwargs[iset] # Might also be an array or Files. static_library will validate.
 args = compiler.get_instruction_set_args(iset)
 if args is None:
 mlog.log('Compiler supports %s:' % iset, mlog.red('NO'))

@@ -818,7 +818,7 @@ Timeout: %4d
 # project_name:suite_name
 # so we need to select only the test belonging to project_name
-# this if hanlde the first case (i.e., SUITE == suite_name)
+# this if handle the first case (i.e., SUITE == suite_name)
 # in this way we can run tests belonging to different
 # (sub)projects which share the same suite_name

@@ -79,7 +79,7 @@ def run(options):
if all_backends or backend.startswith('vs'):
print('Meson command used in build file regeneration: ' + ' '.join(v))
elif k == 'pkgconf_envvar':
-print('Last seen PKGCONFIG enviroment variable value: ' + v)
+print('Last seen PKGCONFIG environment variable value: ' + v)
elif k == 'version':
print('Meson version: ' + v)
elif k == 'cross_files':

@@ -34,14 +34,14 @@ import json, os, re, sys
class RewriterException(MesonException):
pass
-def add_arguments(parser, formater=None):
+def add_arguments(parser, formatter=None):
parser.add_argument('-s', '--sourcedir', type=str, default='.', metavar='SRCDIR', help='Path to source directory.')
parser.add_argument('-V', '--verbose', action='store_true', default=False, help='Enable verbose output')
parser.add_argument('-S', '--skip-errors', dest='skip', action='store_true', default=False, help='Skip errors instead of aborting')
subparsers = parser.add_subparsers(dest='type', title='Rewriter commands', description='Rewrite command to execute')
# Target
-tgt_parser = subparsers.add_parser('target', help='Modify a target', formatter_class=formater)
+tgt_parser = subparsers.add_parser('target', help='Modify a target', formatter_class=formatter)
tgt_parser.add_argument('-s', '--subdir', default='', dest='subdir', help='Subdirectory of the new target (only for the "add_target" action)')
tgt_parser.add_argument('--type', dest='tgt_type', choices=rewriter_keys['target']['target_type'][2], default='executable',
help='Type of the target to add (only for the "add_target" action)')
@@ -51,7 +51,7 @@ def add_arguments(parser, formater=None):
tgt_parser.add_argument('sources', nargs='*', help='Sources to add/remove')
# KWARGS
-kw_parser = subparsers.add_parser('kwargs', help='Modify keyword arguments', formatter_class=formater)
+kw_parser = subparsers.add_parser('kwargs', help='Modify keyword arguments', formatter_class=formatter)
kw_parser.add_argument('operation', choices=rewriter_keys['kwargs']['operation'][2],
help='Action to execute')
kw_parser.add_argument('function', choices=list(rewriter_func_kwargs.keys()),
@@ -60,13 +60,13 @@ def add_arguments(parser, formater=None):
kw_parser.add_argument('kwargs', nargs='*', help='Pairs of keyword and value')
# Default options
-def_parser = subparsers.add_parser('default-options', help='Modify the project default options', formatter_class=formater)
+def_parser = subparsers.add_parser('default-options', help='Modify the project default options', formatter_class=formatter)
def_parser.add_argument('operation', choices=rewriter_keys['default_options']['operation'][2],
help='Action to execute')
def_parser.add_argument('options', nargs='*', help='Key, value pairs of configuration option')
# JSON file/command
-cmd_parser = subparsers.add_parser('command', help='Execute a JSON array of commands', formatter_class=formater)
+cmd_parser = subparsers.add_parser('command', help='Execute a JSON array of commands', formatter_class=formatter)
cmd_parser.add_argument('json', help='JSON string or file to execute')
class RequiredKeys:
@@ -882,7 +882,7 @@ def list_to_dict(in_list: List[str]) -> Dict[str, str]:
try:
for i in it:
# calling next(it) is not a mistake, we're taking the next element from
-# the iterator, avoiding te need to preprocess it into a sequence of
+# the iterator, avoiding the need to preprocess it into a sequence of
# key value pairs.
result[i] = next(it)
except StopIteration:

@@ -1961,7 +1961,7 @@ class AllPlatformTests(BasePlatformTests):
https://github.com/mesonbuild/meson/pull/4555
Reverted to the first file only because of https://github.com/mesonbuild/meson/pull/4547#discussion_r244173438
-TODO Change the format to a list officialy in a followup PR
+TODO Change the format to a list officially in a followup PR
'''
if self.backend is not Backend.ninja:
raise unittest.SkipTest('{!r} backend can\'t install files'.format(self.backend.name))
@@ -3213,7 +3213,7 @@ recommended as it is not supported on some platforms''')
# build user of library
self.new_builddir()
-# replace is needed because meson mangles platform pathes passed via LDFLAGS
+# replace is needed because meson mangles platform paths passed via LDFLAGS
self.init(os.path.join(testdirbase, 'exe'),
override_envvars={"LDFLAGS": '{}{}'.format(libdir_flag, libdir.replace('\\', '/'))})
self.build()
@@ -3440,12 +3440,12 @@ recommended as it is not supported on some platforms''')
testdir = os.path.join(self.unit_test_dir, '41 featurenew subprojects')
out = self.init(testdir)
# Parent project warns correctly
-self.assertRegex(out, "WARNING: Project targetting '>=0.45'.*'0.47.0': dict")
+self.assertRegex(out, "WARNING: Project targeting '>=0.45'.*'0.47.0': dict")
# Subprojects warn correctly
-self.assertRegex(out, r"\|WARNING: Project targetting '>=0.40'.*'0.44.0': disabler")
-self.assertRegex(out, r"\|WARNING: Project targetting '!=0.40'.*'0.44.0': disabler")
+self.assertRegex(out, r"\|WARNING: Project targeting '>=0.40'.*'0.44.0': disabler")
+self.assertRegex(out, r"\|WARNING: Project targeting '!=0.40'.*'0.44.0': disabler")
# Subproject has a new-enough meson_version, no warning
-self.assertNotRegex(out, "WARNING: Project targetting.*Python")
+self.assertNotRegex(out, "WARNING: Project targeting.*Python")
# Ensure a summary is printed in the subproject and the outer project
self.assertRegex(out, r"\|WARNING: Project specifies a minimum meson_version '>=0.40'")
self.assertRegex(out, r"\| \* 0.44.0: {'disabler'}")
@@ -4338,18 +4338,18 @@ class FailureTests(BasePlatformTests):
def test_using_too_recent_feature(self):
# Here we use a dict, which was introduced in 0.47.0
self.assertMesonOutputs("dict = {}",
-".*WARNING.*Project targetting.*but.*",
+".*WARNING.*Project targeting.*but.*",
meson_version='>= 0.46.0')
def test_using_recent_feature(self):
# Same as above, except the meson version is now appropriate
self.assertMesonDoesNotOutput("dict = {}",
-".*WARNING.*Project targetting.*but.*",
+".*WARNING.*Project targeting.*but.*",
meson_version='>= 0.47')
def test_using_too_recent_feature_dependency(self):
self.assertMesonOutputs("dependency('pcap', required: false)",
-".*WARNING.*Project targetting.*but.*",
+".*WARNING.*Project targeting.*but.*",
meson_version='>= 0.41.0')
def test_vcs_tag_featurenew_build_always_stale(self):

@@ -10,7 +10,7 @@ e = executable('six', 'main.c', src)
test('six', e)
-# The same again, but this time with a program that was genererated
+# The same again, but this time with a program that was generated
# with configure_file.
gen = find_program('gencodegen')

@@ -2,11 +2,11 @@ project('check header', 'c', 'cpp')
host_system = host_machine.system()
-non_existant_header = 'ouagadougou.h'
+non_existent_header = 'ouagadougou.h'
# Copy it into the builddir to ensure that it isn't found even if it's there
-configure_file(input : non_existant_header,
-output : non_existant_header,
+configure_file(input : non_existent_header,
+output : non_existent_header,
copy: true)
fallback = ''
@@ -43,6 +43,6 @@ foreach comp : [meson.get_compiler('c'), meson.get_compiler('cpp')]
# This header exists in the source and the builddir, but we still must not
# find it since we are looking in the system directories.
-assert(not comp.check_header(non_existant_header, prefix : fallback),
-'Found non-existant header.')
+assert(not comp.check_header(non_existent_header, prefix : fallback),
+'Found non-existent header.')
endforeach

@@ -20,7 +20,7 @@ c = meson.get_compiler('c')
cpp = meson.get_compiler('cpp')
if c.get_id() == 'pgi'
-error('MESON_SKIP_TEST: PGI supports its own set of features, will need a seperate list for PGI to test it.')
+error('MESON_SKIP_TEST: PGI supports its own set of features, will need a separate list for PGI to test it.')
endif
expected_result = not ['msvc', 'clang-cl', 'intel-cl'].contains(c.get_id())

@@ -8,7 +8,7 @@ default = 'asufoiqwjtl;adjfbpiuqwoehtl;ajdfl;ghal;sdjg'
dep = dependency('zlib', method: 'pkg-config', required : false)
if not dep.found()
-warning('Skipping pkg-config tests as zlib is not avialable or is not pkg-config')
+warning('Skipping pkg-config tests as zlib is not available or is not pkg-config')
else
# Test for regular pkg-config
# We don't know what the value will be, but we know it should be the same
@@ -16,9 +16,9 @@ else
assert(dep.get_pkgconfig_variable('prefix') == dep.get_variable(pkgconfig : 'prefix'),
'Got different values from get_pkgconfig_variable and get_variable(pkgconfig: )')
assert(dep.get_variable(pkgconfig : default, default_value : default) == default,
-'pkg-config didnt get default when we should have.')
+'pkg-config didn\'t get default when we should have.')
assert(dep.get_variable(pkgconfig : 'prefix', default_value : default) != default,
-'pkg-config got default when we shouldnt have.')
+'pkg-config got default when we shouldn\'t have.')
endif
dep_ct = dependency('llvm', method : 'config-tool', required : false)
@@ -28,9 +28,9 @@ else
assert(dep_ct.get_configtool_variable('has-rtti') == dep_ct.get_variable(configtool : 'has-rtti'),
'Got different values from get_configtool_variable and get_variable(configtool: )')
assert(dep_ct.get_variable(configtool : default, default_value : default) == default,
-'config-tool didnt get default when we should have.')
+'config-tool didn\'t get default when we should have.')
assert(dep_ct.get_variable(configtool : 'has-rtti', default_value : default) != default,
-'config-tool got default when we shouldnt have.')
+'config-tool got default when we shouldn\'t have.')
endif
dep_cm = dependency('llvm', method : 'cmake', required : false)
@@ -42,9 +42,9 @@ else
'RTTI information for cmake and config tools disagree')
endif
assert(dep_cm.get_variable(cmake : default, default_value : default) == default,
-'cmake didnt get default when we should have.')
+'cmake didn\'t get default when we should have.')
assert(dep_cm.get_variable(cmake : 'LLVM_ENABLE_RTTI', default_value : default) != default,
-'cmake config-tool got default when we shouldnt have.')
+'cmake config-tool got default when we shouldn\'t have.')
endif
idep = declare_dependency()

@@ -2,11 +2,11 @@ project('has header', 'c', 'cpp')
host_system = host_machine.system()
-non_existant_header = 'ouagadougou.h'
+non_existent_header = 'ouagadougou.h'
# Copy it into the builddir to ensure that it isn't found even if it's there
-configure_file(input : non_existant_header,
-output : non_existant_header,
+configure_file(input : non_existent_header,
+output : non_existent_header,
configuration : configuration_data())
# Test that the fallback to __has_include also works on all compilers
@@ -48,7 +48,7 @@ foreach fallback : fallbacks
# This header exists in the source and the builddir, but we still must not
# find it since we are looking in the system directories.
-assert(not comp.has_header(non_existant_header, prefix : fallback),
-'Found non-existant header.')
+assert(not comp.has_header(non_existent_header, prefix : fallback),
+'Found non-existent header.')
endforeach
endforeach

@@ -1,6 +1,6 @@
usr/share/dircheck/fifth.dat
usr/share/dircheck/seventh.dat
-usr/share/dircheck/nineth.dat
+usr/share/dircheck/ninth.dat
usr/share/eighth.dat
usr/share/fourth.dat
usr/share/sixth.dat

@@ -15,5 +15,5 @@ assert(dep.get_variable(cmake : 'CACHED_STRING_WS') == 'foo bar', 'set(CACHED ST
assert(dep.get_variable(cmake : 'CACHED_STRING_ARRAY_NS') == ['foo', 'bar'], 'set(CACHED STRING) without spaces is incorrect')
assert(dep.get_variable(cmake : 'CACHED_STRING_ARRAY_WS') == ['foo', 'foo bar', 'bar'], 'set(CACHED STRING[]) with spaces is incorrect')
-# We don't suppor this, so it should be unset.
+# We don't support this, so it should be unset.
assert(dep.get_variable(cmake : 'ENV{var}', default_value : 'sentinal') == 'sentinal', 'set(ENV) should be ignored')