v1.7.35
efroemling committed Jun 16, 2024
1 parent df0cf5f commit ce26be2
Showing 12 changed files with 167 additions and 142 deletions.
128 changes: 64 additions & 64 deletions .efrocachemap

Large diffs are not rendered by default.

15 changes: 1 addition & 14 deletions CHANGELOG.md
@@ -1,4 +1,4 @@
### 1.7.35 (build 21884, api 8, 2024-06-12)
### 1.7.35 (build 21888, api 8, 2024-06-16)
- Fixed an issue where the engine would block at exit on some versions of Linux
  until Ctrl-D was pressed in the calling terminal.
- V2 accounts have been around for a while now, so the old V1 device login
@@ -47,19 +47,6 @@
two forms. Now it is possible to provide both.
- Spaz classes now have a `default_hitpoints` which makes customizing that
easier (Thanks rabbitboom!)
- (WORK IN PROGRESS) As of this version, servers are *required* to be accessible
via ipv4 to appear in the public listing. So they may need to provide an ipv4
address in their config if the automatically detected one is ipv6. This should
reduce the confusion of ipv6-only servers appearing greyed out for lots of
ipv4-only people. Pretty much everyone can connect to ipv4.
- (WORK IN PROGRESS) There is now more personalized error feedback for the
connectivity checks when poking `Make My Party Public` or when launching the
command line server. Hopefully this will help navigate the new dual ipv4/ipv6
situation.
- (WORK IN PROGRESS) The low level `ConnectionToHostUDP` class can now accept
multiple `SockAddr`s; it will attempt to contact the host on all of them and
use whichever responds first. This allows us to pass both ipv4 and ipv6
addresses when available and transparently use whichever is more performant.
- Added `docker-build`, `docker-run`, `docker-clean` and `docker-save` targets
to Makefile.
- Fixed an issue in Assault where being teleported back to base with a sticky
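The first-responder scheme described in the (now removed-as-WIP) `ConnectionToHostUDP` note can be sketched roughly as follows. This is an illustrative Python sketch, not the engine's C++ implementation; the real class must also juggle mixed IPv4/IPv6 socket families, retries, and handshakes, which a single-family localhost example glosses over.

```python
import select
import socket


def first_responding_addr(addrs, probe=b'ping', timeout=2.0):
    """Probe every candidate address and return whichever answers first.

    Returns None if nothing responds within the timeout. All addrs are
    assumed to be the same family here (the real multi-family case
    needs one socket per family).
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setblocking(False)
    try:
        # Fire a probe at every candidate at once...
        for addr in addrs:
            sock.sendto(probe, addr)
        # ...and take whichever endpoint replies first.
        ready, _, _ = select.select([sock], [], [], timeout)
        if not ready:
            return None
        _, addr = sock.recvfrom(4096)
        return addr
    finally:
        sock.close()
```

With both an IPv4 and an IPv6 address for the same host, this naturally and transparently picks the faster path.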
89 changes: 52 additions & 37 deletions Makefile
@@ -31,7 +31,7 @@ help: env
# Set env-var BA_ENABLE_COMPILE_COMMANDS_DB=1 to enable creating/updating a
# cmake compile-commands database for use with things like clangd.
ifeq ($(BA_ENABLE_COMPILE_COMMANDS_DB),1)
PREREQ_COMPILE_COMMANDS_DB = .cache/compile_commands_db/compile_commands.json
ENV_COMPILE_COMMANDS_DB = .cache/compile_commands_db/compile_commands.json
endif

# pcommandbatch can be much faster when running hundreds or thousands of
@@ -46,19 +46,19 @@ else
PCOMMANDBATCH = $(PCOMMAND)
endif

# Prereq targets that should be safe to run anytime; even if project-files
# Env targets that should be safe to run anytime; even if project-files
# are out of date.
ENV_REQS_SAFE = .cache/checkenv $(PCOMMANDBATCHBIN) .dir-locals.el .mypy.ini \
.pyrightconfig.json .pylintrc .clang-format \
ballisticakit-cmake/.clang-format .editorconfig tools/cloudshell \
tools/bacloud
.pyrightconfig.json .pylintrc .clang-format \
ballisticakit-cmake/.clang-format .editorconfig tools/cloudshell \
tools/bacloud tools/pcommand

# Prereq targets that may break if the project needs updating should go here.
# Env targets that may break if the project needs updating should go here.
# An example is compile-command-databases; these might try to run cmake and
# fail if the CMakeList files don't match what's on disk. If such a target was
# included in ENV_REQS_SAFE it would try to build *before* project updates
# which would leave us stuck in a broken state.
ENV_REQS_POST_UPDATE_ONLY = $(PREREQ_COMPILE_COMMANDS_DB)
ENV_REQS_POST_UPDATE_ONLY = $(ENV_COMPILE_COMMANDS_DB)

# Target that should be built before building almost any other target. This
# installs tool config files, sets up the Python virtual environment, etc.
@@ -1245,30 +1245,30 @@ CHECK_CLEAN_SAFETY = $(PCOMMAND) check_clean_safety
# Some tool configs that need filtering (mainly injecting projroot path).
TOOL_CFG_INST = $(PCOMMAND) tool_config_install

# Anything that affects tool-config generation.
# Anything required for tool-config generation.
TOOL_CFG_SRC = tools/efrotools/toolconfig.py config/projectconfig.json \
.venv/.efro_venv_complete tools/pcommand
tools/pcommand

# Anything that should trigger an environment-check when changed.
ENV_SRC = tools/batools/build.py .venv/.efro_venv_complete tools/pcommand
ENV_SRC = tools/batools/build.py .venv/.efro_venv_complete

# Generate a pcommand script hard-coded to use our virtual environment.
# This is a prereq dependency so should not itself depend on env.
tools/pcommand: tools/efrotools/genwrapper.py tools/efrotools/pyver.py
# This is an env dependency so should not itself depend on env.
tools/pcommand: tools/efrotools/genwrapper.py .venv/.efro_venv_complete
@echo Generating tools/pcommand...
@PYTHONPATH=tools python3 -m \
efrotools.genwrapper pcommand batools.pcommandmain tools/pcommand

# Generate a cloudshell script hard-coded to use our virtual environment.
# This is a prereq dependency so should not itself depend on env.
tools/cloudshell: tools/efrotools/genwrapper.py tools/efrotools/pyver.py
# This is an env dependency so should not itself depend on env.
tools/cloudshell: tools/efrotools/genwrapper.py .venv/.efro_venv_complete
@echo Generating tools/cloudshell...
@PYTHONPATH=tools python3 -m \
efrotools.genwrapper cloudshell efrotoolsinternal.cloudshell tools/cloudshell

# Generate a bacloud script hard-coded to use our virtual environment.
# This is a prereq dependency so should not itself depend on env.
tools/bacloud: tools/efrotools/genwrapper.py tools/efrotools/pyver.py
# This is an env dependency so should not itself depend on env.
tools/bacloud: tools/efrotools/genwrapper.py .venv/.efro_venv_complete
@echo Generating tools/bacloud...
@PYTHONPATH=tools python3 -m \
efrotools.genwrapper bacloud batools.bacloud tools/bacloud
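The three rules above all follow one pattern: generate a tiny launcher script hard-wired to the project venv. A rough Python sketch of the idea follows; the real `efrotools.genwrapper` output format and signature are assumptions here, not project code.

```python
import os
import stat


def gen_wrapper(module: str, func: str, outpath: str, venv: str = '.venv') -> None:
    """Write a small launcher script pinned to a venv's interpreter.

    Illustrative only; the real efrotools.genwrapper surely differs in
    its details.
    """
    python = os.path.abspath(os.path.join(venv, 'bin', 'python'))
    script = (
        f'#!{python}\n'
        f'import sys\n'
        f'from {module} import {func}\n'
        f'sys.exit({func}())\n'
    )
    with open(outpath, 'w', encoding='utf-8') as outfile:
        outfile.write(script)
    # Mark it executable like any other tool script.
    os.chmod(outpath, os.stat(outpath).st_mode | stat.S_IXUSR)
```

Because the shebang points straight at the venv's Python, the scripts work without activating the environment first.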
@@ -1300,40 +1300,54 @@ SKIP_ENV_CHECKS ?= 0
VENV_PYTHON ?= python3.12

# Increment this to force all downstream venvs to fully rebuild. Useful after
# removing requirements since upgrading in place will never uninstall stuff.
# removing requirements since upgrading venvs in place will never uninstall
# stuff.
VENV_STATE = 1

# Rebuild our virtual environment whenever reqs, Python version, or explicit
# state number changes. This is a dependency of env so it should not itself
# depend on env. Note that we list pcommand as a requirement but can't use it
# in here until the end when the venv is up. Also note that we try to update
# venvs in place when possible, but when Python version or venv-state changes
# we blow it away and start over to be safe.
.venv/.efro_venv_complete: tools/pcommand config/requirements.txt \
tools/efrotools/pyver.py
# Update our virtual environment whenever reqs changes, Python version
# changes, our venv's Python symlink breaks (can happen for minor Python
# updates), or explicit state number changes. This is a dependency of env so
# should not itself depend on env.
.venv/.efro_venv_complete: \
config/requirements.txt \
tools/efrotools/pyver.py \
.venv/bin/$(VENV_PYTHON) \
.venv/.efro_venv_state_$(VENV_STATE)
# Update venv in place when possible; otherwise create from scratch.
@[ -f .venv/bin/$(VENV_PYTHON) ] \
&& [ -f .venv/.efro_venv_state_$(VENV_STATE) ] \
&& echo Updating existing $(VENV_PYTHON) virtual environment in \'.venv\'... \
|| (echo Creating new $(VENV_PYTHON) virtual environment in \'.venv\'... \
&& rm -rf .venv)
$(VENV_PYTHON) -m venv .venv
&& rm -rf .venv && $(VENV_PYTHON) -m venv .venv \
&& touch .venv/.efro_venv_state_$(VENV_STATE))
.venv/bin/pip install --upgrade pip
.venv/bin/pip install -r config/requirements.txt
touch .venv/.efro_venv_state_$(VENV_STATE) \
.venv/.efro_venv_complete # Done last to enforce fully-built venvs.
@$(PCOMMAND) echo \
GRN Project virtual environment for BLD $(VENV_PYTHON) RST GRN \
at BLD .venv RST GRN is ready to use.

.cache/checkenv: $(ENV_SRC)
@touch .venv/.efro_venv_complete # Done last to signal fully-built venv.
@echo Project virtual environment for $(VENV_PYTHON) at .venv is ready to use.

# We don't actually create anything with this target, but its existence allows
# .efro_venv_complete to run when these bits don't exist, and that target
# *does* recreate this stuff. Note to self: previously I tried splitting
# things up more and recreating the venv in this target, but that led to
# unintuitive dependency behavior. For example, a python update could cause
# the .venv/bin/$(VENV_PYTHON) symlink to break, which would cause that target
# to blow away and rebuild the venv, but then the reestablished symlink might
# have an old modtime (since modtime is that of python itself) which could
# cause .efro_venv_complete to think it was already up to date and not run,
# leaving us with a half-built venv. So the way we do it now ensures the venv
# update always happens in full and seems mostly foolproof.
.venv/bin/$(VENV_PYTHON) .venv/.efro_venv_state_$(VENV_STATE):
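The empty rule above exists purely so `make` re-runs `.efro_venv_complete` whenever the interpreter symlink or state marker goes missing. The decision that feeds — update in place versus rebuild from scratch — boils down to two file checks, sketched here in Python for clarity (paths mirror the Makefile; this is not project code):

```python
import os


def venv_action(venv: str = '.venv', python: str = 'python3.12',
                state: int = 1) -> str:
    """Mirror the Makefile's update-vs-rebuild decision for the venv.

    os.path.isfile() follows symlinks, so a broken interpreter symlink
    (as happens after minor Python updates) reads as missing, matching
    the shell's '[ -f ... ]' test.
    """
    have_python = os.path.isfile(os.path.join(venv, 'bin', python))
    have_state = os.path.isfile(
        os.path.join(venv, f'.efro_venv_state_{state}'))
    return 'update-in-place' if (have_python and have_state) else 'rebuild'
```

Either way the recipe then runs in full, so the completion marker is only touched once the venv is genuinely whole.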

.cache/checkenv: $(ENV_SRC) $(PCOMMAND)
@if [ $(SKIP_ENV_CHECKS) -ne 1 ]; then \
$(PCOMMAND) checkenv && mkdir -p .cache && touch .cache/checkenv; \
fi

$(PCOMMANDBATCHBIN): src/tools/pcommandbatch/pcommandbatch.c \
PCOMMANDBATCHSRC = src/tools/pcommandbatch/pcommandbatch.c \
src/tools/pcommandbatch/cJSON.c
@$(MAKE) tools/pcommand
@$(PCOMMAND) build_pcommandbatch $^ $@

$(PCOMMANDBATCHBIN): $(PCOMMANDBATCHSRC) $(PCOMMAND)
@$(PCOMMAND) build_pcommandbatch $(PCOMMANDBATCHSRC) $(PCOMMANDBATCHBIN)

# CMake build-type lowercase
CM_BT_LC = $(shell echo $(CMAKE_BUILD_TYPE) | tr A-Z a-z)
@@ -1368,6 +1382,7 @@ ballisticakit-cmake/.clang-format: .clang-format
# compile commands for all files; lets try to keep it up to date
# whenever CMakeLists changes.
.cache/compile_commands_db/compile_commands.json: \
$(PCOMMANDBATCH) \
ballisticakit-cmake/CMakeLists.txt
@$(PCOMMANDBATCH) echo BLU Updating compile commands db...
@mkdir -p .cache/compile_commands_db
12 changes: 6 additions & 6 deletions config/requirements.txt
@@ -1,14 +1,14 @@
cpplint==1.6.1
dmgbuild==1.6.1
filelock==3.14.0
filelock==3.15.1
furo==2024.5.6
mypy==1.10.0
pbxproj==4.1.0
pdoc==14.5.0
pur==7.3.1
pylint==3.2.2
pur==7.3.2
pylint==3.2.3
pylsp-mypy==0.6.8
pytest==8.2.1
pytest==8.2.2
python-daemon==3.0.1
python-lsp-black==2.0.0
python-lsp-server==1.11.0
@@ -17,5 +17,5 @@ Sphinx==7.3.7
tomlkit==0.12.5
types-certifi==2021.10.8.3
types-filelock==3.2.7
types-requests==2.32.0.20240523
typing_extensions==4.12.0
types-requests==2.32.0.20240602
typing_extensions==4.12.2
2 changes: 1 addition & 1 deletion src/assets/ba_data/python/baenv.py
@@ -52,7 +52,7 @@

# Build number and version of the ballistica binary we expect to be
# using.
TARGET_BALLISTICA_BUILD = 21884
TARGET_BALLISTICA_BUILD = 21888
TARGET_BALLISTICA_VERSION = '1.7.35'


4 changes: 2 additions & 2 deletions src/ballistica/base/platform/base_platform.cc
@@ -64,8 +64,8 @@ auto BasePlatform::GetPublicDeviceUUID() -> std::string {
// We used to plug version in directly here, but that caused uuids to
// shuffle too rapidly during periods of rapid development. This
// keeps it more constant.
// __last_rand_uuid_component_shuffle_date__ 2024 6 12
auto rand_uuid_component{"WI5XDVM7QQBD4G6O0GS2DW6IPJ4VQT9X"};
// __last_rand_uuid_component_shuffle_date__ 2024 6 13
auto rand_uuid_component{"1URRE62C7234VP9L1BUPJ1P7QT7Q8YW3"};

inputs.emplace_back(rand_uuid_component);
auto gil{Python::ScopedInterpreterLock()};
2 changes: 1 addition & 1 deletion src/ballistica/shared/ballistica.cc
@@ -39,7 +39,7 @@ auto main(int argc, char** argv) -> int {
namespace ballistica {

// These are set automatically via script; don't modify them here.
const int kEngineBuildNumber = 21884;
const int kEngineBuildNumber = 21888;
const char* kEngineVersion = "1.7.35";
const int kEngineApiVersion = 8;

6 changes: 3 additions & 3 deletions tools/efro/error.py
@@ -73,9 +73,9 @@ class RemoteError(Exception):
occurs remotely. The error string can consist of a remote stack
trace or a simple message depending on the context.
Communication systems should raise more specific error types locally
when more introspection/control is needed; this is intended somewhat
as a catch-all.
Communication systems should aim to communicate specific errors
gracefully as standard message responses when specific details are
needed; this is intended more as a catch-all.
"""

def __init__(self, msg: str, peer_desc: str):
14 changes: 7 additions & 7 deletions tools/efro/message/_protocol.py
@@ -45,7 +45,7 @@ def __init__(
forward_communication_errors: bool = False,
forward_clean_errors: bool = False,
remote_errors_include_stack_traces: bool = False,
log_remote_errors: bool = True,
log_errors_on_receiver: bool = True,
) -> None:
"""Create a protocol with a given configuration.
@@ -62,8 +62,8 @@ def __init__(
When an exception is not covered by the optional forwarding
mechanisms above, it will come across as efro.error.RemoteError
and the exception will be logged on the receiver
end - at least by default (see details below).
and the exception will be logged on the receiver end - at least
by default (see details below).
If 'remote_errors_include_stack_traces' is True, stringified
stack traces will be returned with efro.error.RemoteError
@@ -77,8 +77,8 @@ def __init__(
goal is usually to avoid returning opaque RemoteErrors and to
instead return something meaningful as part of the expected
response type (even if that value itself represents a logical
error state). If 'log_remote_errors' is False, however, such
exceptions will not be logged on the receiver. This can be
error state). If 'log_errors_on_receiver' is False, however, such
exceptions will *not* be logged on the receiver. This can be
useful in combination with 'remote_errors_include_stack_traces'
and 'forward_clean_errors' in situations where all error
logging/management will be happening on the sender end. Be
@@ -168,7 +168,7 @@ def _reg_sys(reg_tp: type[SysResponse], reg_id: int) -> None:
self.remote_errors_include_stack_traces = (
remote_errors_include_stack_traces
)
self.log_remote_errors = log_remote_errors
self.log_errors_on_receiver = log_errors_on_receiver

@staticmethod
def encode_dict(obj: dict) -> str:
@@ -219,7 +219,7 @@ def error_to_response(self, exc: Exception) -> tuple[SysResponse, bool]:
),
error_type=ErrorSysResponse.ErrorType.REMOTE,
),
self.log_remote_errors,
self.log_errors_on_receiver,
)
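The renamed flag threads through `error_to_response`, which pairs the outgoing wire response with a should-we-log-it-here bit for the receiver. A stripped-down sketch of that shape (the class names and constructor here are simplified assumptions, not the actual efro.message API):

```python
class SysResponse:
    """Stand-in for the wire-format response base class."""


class ErrorSysResponse(SysResponse):
    """Stand-in for the error response carrying a message string."""

    def __init__(self, error_message: str) -> None:
        self.error_message = error_message


def error_to_response(
    exc: Exception, *, log_errors_on_receiver: bool = True
) -> tuple[SysResponse, bool]:
    """Map an exception to (response, should-log-locally).

    The caller logs the error only when the second element is True,
    letting all error handling move to the sender end if desired.
    """
    return ErrorSysResponse(str(exc)), log_errors_on_receiver
```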

def _to_dict(
3 changes: 2 additions & 1 deletion tools/efro/message/_receiver.py
@@ -38,6 +38,7 @@ class MyClass:
# MyMessageReceiver fills out handler() overloads to ensure all
# registered handlers have valid types/return-types.
@receiver.handler
def handle_some_message_type(self, message: SomeMsg) -> SomeResponse:
# Deal with this message type here.
@@ -47,7 +47,7 @@ def handle_some_message_type(self, message: SomeMsg) -> SomeResponse:
obj.receiver.handle_raw_message(some_raw_data)
Any unhandled Exception occurring during message handling will result in
an Exception being raised on the sending end.
an efro.error.RemoteError being raised on the sending end.
"""

is_async = False
23 changes: 17 additions & 6 deletions tools/efro/message/_sender.py
@@ -20,22 +20,33 @@

class MessageSender:
"""Facilitates sending messages to a target and receiving responses.
This is instantiated at the class level and used to register unbound
class methods to handle raw message sending.
These are instantiated at the class level and used to register unbound
class methods to handle raw message sending. Generally this class is not
used directly; instead, autogenerated subclasses providing type-safe
overloads are used.
Example:
(In this example, MyMessageSender is an autogenerated class that
inherits from MessageSender).
class MyClass:
msg = MyMessageSender(some_protocol)
msg = MyMessageSender()
@msg.send_method
def send_raw_message(self, message: str) -> str:
# Actually send the message here.
# MyMessageSender class should provide overloads for send(), send_async(),
# etc. to ensure all sending happens with valid types.
obj = MyClass()
obj.msg.send(SomeMessageType())
# The generated MyMessageSender class provides overloads for
# send(), send_async(), etc., giving type-safety for message types
# and their associated response types.
# Thus, given the statement below, a type-checker would know that
# 'response' is a SomeResponseType or whatever is associated with
# SomeMessageType.
response = obj.msg.send(SomeMessageType())
"""

def __init__(self, protocol: MessageProtocol) -> None:
11 changes: 11 additions & 0 deletions tools/efrotools/pybuild.py
@@ -480,6 +480,17 @@ def apple_patch(python_dir: str) -> None:
"""New test."""
patch_modules_setup(python_dir, 'apple')

# Filter an instance of 'itms-services' that appeared in Python 3.12
# and which was getting me rejected from the app store.
fname = os.path.join(python_dir, 'Lib', 'urllib', 'parse.py')
ftxt = readfile(fname)
ftxt = replace_exact(
ftxt,
"'wss', 'itms-services']",
"'wss', 'i!t!m!s!-!s!e!r!v!i!c!e!s'.replace('!', '')]",
)
writefile(fname, ftxt)
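A `replace_exact` helper like the one used above only makes sense if it fails loudly when the expected text is absent, so a patch can never silently no-op after a Python upgrade moves the string. A minimal sketch of that assumed contract (the real efrotools helper's signature may differ):

```python
def replace_exact(text: str, old: str, new: str, count: int = 1) -> str:
    """Replace old with new, insisting old occurs exactly `count` times.

    Raising on a mismatch turns a stale patch into a build failure
    rather than a quietly unpatched file.
    """
    found = text.count(old)
    if found != count:
        raise ValueError(
            f'expected {count} occurrence(s) of {old!r}; found {found}')
    return text.replace(old, new)
```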


def patch_modules_setup(python_dir: str, baseplatform: str) -> None:
"""Muck with the Setup.* files Python uses to build modules."""
