
VHACD does not compile with musl libc #31555

Closed
jameswestman opened this issue Aug 22, 2019 · 10 comments · Fixed by #34250

@jameswestman (Contributor)

Godot version: 3bd49da
OS: Alpine Linux (Docker)

Issue description:
Godot Engine does not compile on Alpine (or other musl-based systems) due to the use of PTHREAD_MUTEX_RECURSIVE_NP in the third-party VHACD module. This appears to be a glibc-specific macro.

Here is the relevant log output:

[Initial build] Compiling ==> thirdparty/vhacd/src/VHACD.cpp
In file included from thirdparty/vhacd/inc/btScalar.h:190,
                 from thirdparty/vhacd/inc/btAlignedAllocator.h:23,
                 from thirdparty/vhacd/inc/btAlignedObjectArray.h:19,
                 from thirdparty/vhacd/inc/btConvexHullComputer.h:18,
                 from thirdparty/vhacd/src/VHACD.cpp:30:
thirdparty/vhacd/inc/vhacdMutex.h: In constructor 'VHACD::Mutex::Mutex()':
thirdparty/vhacd/inc/vhacdMutex.h:97:60: error: 'PTHREAD_MUTEX_RECURSIVE_NP' was not declared in this scope
         VHACD_VERIFY(pthread_mutexattr_settype(&mutexAttr, PTHREAD_MUTEX_RECURSIVE_NP) == 0);
                                                            ^~~~~~~~~~~~~~~~~~~~~~~~~~
thirdparty/vhacd/inc/vhacdMutex.h:97:9: note: in expansion of macro 'VHACD_VERIFY'
         VHACD_VERIFY(pthread_mutexattr_settype(&mutexAttr, PTHREAD_MUTEX_RECURSIVE_NP) == 0);
         ^~~~~~~~~~~~
thirdparty/vhacd/inc/vhacdMutex.h:97:60: note: suggested alternative: 'PTHREAD_MUTEX_RECURSIVE'
         VHACD_VERIFY(pthread_mutexattr_settype(&mutexAttr, PTHREAD_MUTEX_RECURSIVE_NP) == 0);
                                                            ^~~~~~~~~~~~~~~~~~~~~~~~~~
thirdparty/vhacd/inc/vhacdMutex.h:97:9: note: in expansion of macro 'VHACD_VERIFY'
         VHACD_VERIFY(pthread_mutexattr_settype(&mutexAttr, PTHREAD_MUTEX_RECURSIVE_NP) == 0);
         ^~~~~~~~~~~~
scons: *** [thirdparty/vhacd/src/VHACD.x11.tools.64.o] Error 1
scons: building terminated because of errors.
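
For reference, the compiler's suggested alternative is the portable spelling: PTHREAD_MUTEX_RECURSIVE is defined by POSIX, while the _NP ("non-portable") suffix is a glibc extension that musl does not provide. Below is a minimal sketch of setting up a recursive mutex with the portable constant; it is illustrative only, not taken from the VHACD sources, and the helper name make_recursive_mutex is made up.

#include <pthread.h>

// Illustrative helper, not part of VHACD: create a recursive mutex using the
// POSIX-standard PTHREAD_MUTEX_RECURSIVE constant, which musl also defines.
static int make_recursive_mutex(pthread_mutex_t *mutex) {
    pthread_mutexattr_t attr;
    if (pthread_mutexattr_init(&attr) != 0) {
        return -1;
    }
    if (pthread_mutexattr_settype(&attr, PTHREAD_MUTEX_RECURSIVE) != 0) {
        pthread_mutexattr_destroy(&attr);
        return -1;
    }
    int ret = pthread_mutex_init(mutex, &attr);
    pthread_mutexattr_destroy(&attr);
    return ret;
}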

@bojidar-bg (Contributor)

Same file, a bit higher up:

#if defined(__APPLE__)
#define PTHREAD_MUTEX_RECURSIVE_NP PTHREAD_MUTEX_RECURSIVE
#endif

I guess it tries to detect Clang by checking for __APPLE__, but that evidently isn't enough?
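
One way the guard could be broadened (a sketch in the spirit of the eventual fix, not the exact upstream diff) is to key the alias on the missing macro itself rather than on __APPLE__, so any libc that lacks the glibc-only name, such as musl, still gets a definition:

#if !defined(PTHREAD_MUTEX_RECURSIVE_NP)
// Sketch: fall back to the portable POSIX constant wherever the glibc-specific
// alias is not defined (macOS, musl, ...).
#define PTHREAD_MUTEX_RECURSIVE_NP PTHREAD_MUTEX_RECURSIVE
#endif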

@Calinou (Member) commented Nov 10, 2019

Pull request opened upstream: kmammou/v-hacd#70

@chiguireitor

Something similar happens when cross-compiling from Linux to Windows with MinGW.

thirdparty/vhacd/src/VHACD-ASYNC.cpp:307:7: error: 'thread' in namespace 'std' does not name a type
  307 |  std::thread      *mThread{ nullptr };
      |       ^~~~~~
thirdparty/vhacd/src/VHACD-ASYNC.cpp:10:1: note: 'std::thread' is defined in header '<thread>'; did you forget to '#include <thread>'?
    9 | #include <float.h>
  +++ |+#include <thread>
   10 | 
thirdparty/vhacd/src/VHACD-ASYNC.cpp:315:15: error: 'mutex' in namespace 'std' does not name a type
  315 |  mutable std::mutex      mMessageMutex;
      |               ^~~~~
thirdparty/vhacd/src/VHACD-ASYNC.cpp:10:1: note: 'std::mutex' is defined in header '<mutex>'; did you forget to '#include <mutex>'?
    9 | #include <float.h>
  +++ |+#include <mutex>
   10 | 
[ 50%] thirdparty/vhacd/src/VHACD-ASYNC.cpp: In member function 'virtual bool VHACD::MyHACD_API::Compute(const double*, uint32_t, const uint32_t*, uint32_t, const VHACD::IVHACD::Parameters&)':
thirdparty/vhacd/src/VHACD-ASYNC.cpp:53:3: error: 'mThread' was not declared in this scope; did you mean 'fread'?
   53 |   mThread = new std::thread([this, countPoints, countTriangles, _desc]()
      |   ^~~~~~~
      |   fread
thirdparty/vhacd/src/VHACD-ASYNC.cpp:53:17: error: expected type-specifier
   53 |   mThread = new std::thread([this, countPoints, countTriangles, _desc]()
      |                 ^~~
thirdparty/vhacd/src/VHACD-ASYNC.cpp:57:4: error: expected primary-expression before ')' token
   57 |   });
      |    ^
thirdparty/vhacd/src/VHACD-ASYNC.cpp: In member function 'virtual void VHACD::MyHACD_API::Cancel()':
thirdparty/vhacd/src/VHACD-ASYNC.cpp:166:7: error: 'mThread' was not declared in this scope; did you mean 'fread'?
  166 |   if (mThread)
      |       ^~~~~~~
      |       fread
thirdparty/vhacd/src/VHACD-ASYNC.cpp:169:11: error: type '<type error>' argument given to 'delete', expected pointer
  169 |    delete mThread;
      |           ^~~~~~~
thirdparty/vhacd/src/VHACD-ASYNC.cpp: In member function 'virtual void VHACD::MyHACD_API::Update(double, double, double, const char*, const char*)':
thirdparty/vhacd/src/VHACD-ASYNC.cpp:235:3: error: 'mMessageMutex' was not declared in this scope; did you mean 'mMessage'?
  235 |   mMessageMutex.lock();
      |   ^~~~~~~~~~~~~
      |   mMessage
thirdparty/vhacd/src/VHACD-ASYNC.cpp: In member function 'virtual void VHACD::MyHACD_API::Log(const char*)':
thirdparty/vhacd/src/VHACD-ASYNC.cpp:247:3: error: 'mMessageMutex' was not declared in this scope; did you mean 'mMessage'?
  247 |   mMessageMutex.lock();
      |   ^~~~~~~~~~~~~
      |   mMessage
thirdparty/vhacd/src/VHACD-ASYNC.cpp: In member function 'void VHACD::MyHACD_API::processPendingMessages() const':
thirdparty/vhacd/src/VHACD-ASYNC.cpp:267:4: error: 'mMessageMutex' was not declared in this scope; did you mean 'mMessage'?
  267 |    mMessageMutex.lock();
      |    ^~~~~~~~~~~~~
      |    mMessage
thirdparty/vhacd/src/VHACD-ASYNC.cpp:275:4: error: 'mMessageMutex' was not declared in this scope; did you mean 'mMessage'?
  275 |    mMessageMutex.lock();
      |    ^~~~~~~~~~~~~
      |    mMessage

@akien-mga (Member) commented Dec 6, 2019

@chiguireitor That's a different issue, though. What version of MinGW are you using? It might be too old; we successfully use mingw-w64 6.0.0 with GCC 9.1 to build the official Godot binaries for Windows from Linux. Your version doesn't seem to support C++11.
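
A quick way to check which thread model a MinGW toolchain uses is to compile a trivial program that touches std::thread and std::mutex (a sketch, not part of Godot); with older GCC releases it only builds when the toolchain is configured for posix threads, which is what the errors above point to:

// Toolchain check sketch (not part of Godot): fails to compile on MinGW
// toolchains built with the win32 thread model, where libstdc++ historically
// did not provide std::thread and std::mutex.
#include <mutex>
#include <thread>

int main() {
    std::mutex m;
    std::thread t([&m]() {
        std::lock_guard<std::mutex> lock(m); // trivial use of both types
    });
    t.join();
    return 0;
}

If this fails with errors like the ones above, switching the update-alternatives entry to the -posix variant (as mentioned later in this thread) is the usual remedy.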

@chiguireitor

I just followed the instructions to install it on Ubuntu 19.10. I'm away from my build box; I'll check back when I'm at home.

If it makes a difference, it's built from the latest master head, so stuff could be broken.

@chiguireitor
Copy link

OK, I finally made it work: I had to switch the MinGW alternative from the -windows suffix to the -posix suffix. This should be noted in the documentation, though.

@jameswestman (Contributor, Author)

The original issue has not been fixed. I just tried compiling on Alpine Linux and it does not work.

@chiguireitor

Did you try using POSIX threads? On Ubuntu I know there's the update-alternatives utility, but I don't know how it works on Alpine. Also, Alpine uses musl instead of glibc, so you might be hitting a slight incompatibility there.

@jameswestman (Contributor, Author)

Yes, the incompatibility is caused by Alpine's use of musl. The pull request above fixes the problem, but the fix also needs to land in Godot's bundled copy of V-HACD.

akien-mga reopened this Dec 10, 2019
jameswestman added a commit to jameswestman/godot that referenced this issue Dec 10, 2019
On some systems, including Alpine Linux, musl is used instead of
glibc. This commit patches the third-party V-HACD module to provide
a macro not provided by musl.

Fixes godotengine#31555.
marstaik pushed a commit to marstaik/godot that referenced this issue Dec 24, 2019
On some systems, including Alpine Linux, musl is used instead of
glibc. This commit patches the third-party V-HACD module to provide
a macro not provided by musl.

Fixes godotengine#31555.
@LinuxUserGD (Contributor)

@chiguireitor Seems to be #40853 now
