Mumble overlay with EVE on Linux

Hi there!

I'm trying to get the Mumble overlay to work on Linux.

My config atm:

  • Ubuntu 16.04.5 LTS
  • AMDGPU-PRO 18.30-633530 (Radeon RX 580)
  • wine-staging 3.14.0
  • mumble:i386 1.3.0
  • EVE set up with DX9

Running EVE with mumble-overlay = crash:
[screenshot: MumbleOverlayEveCrash]

Funnily enough, when I start wine with EVE's environment + FurMark, the overlay works.

The reason I installed the AMDGPU-PRO driver is that it worked the last time I tried it (it was buggy, but at least it didn't crash EVE).

Does anyone know how to fix this?

TIA,
Zoe

Replacing the AMDGPU-PRO 18.30 with Mesa:

Vendor: X.Org (0x1002)
Device: Radeon RX 580 Series (POLARIS10 / DRM 3.23.0 / 4.15.0-33-generic, LLVM 6.0.0) (0x67df)
Version: 18.0.5
Accelerated: yes
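(That block is the GLX_MESA_query_renderer summary you can get from glxinfo, in case anyone wants to compare their own driver:)

# from the mesa-utils package; -B prints the brief Vendor/Device/Version/Accelerated summary
glxinfo -B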

EVE + overlay: no crash, but no overlay
FurMark + overlay: works

I think you need a 32-bit overlay,

and you need to make an sh file that runs exefile with the proper parameters from the launcher, prepending the overlay command to the wine command (a minimal sketch follows below).

A 32-bit overlay is required to inject into 32-bit OpenGL/Vulkan games like EVE.
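For example, a minimal wrapper along those lines (the wine path and the script itself are just assumptions here, adjust them to your install):

#!/bin/sh
# hypothetical wrapper: prepend mumble-overlay to the wine call;
# "$@" is whatever the launcher passes in (exefile.exe plus its parameters)
exec /usr/bin/mumble-overlay /opt/wine-staging/bin/wine "$@"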

I know, that’s why I included the screenshot of FurMark, which is a 32-bit OpenGL benchmark that I execute the same way as I run EVE Online:
/usr/bin/mumble-overlay /opt/wine-staging/bin/wine FurMark.exe

I also wrote a minimal OpenGL application that utilizes glXSwapBuffers (that’s where EVE Online crashed, you can see the screenshot above):
[screenshot]

But it would be nice to get the overlay working over EVE Online.

Did you try to put the overlay command in the “wine” command, not in the launcher command?

For example mine:

#!/bin/sh
#/usr/bin/wine /home/$"·$%$·/apitrace-msvc/x86/bin/apitrace.exe trace -a d3d11 "$@"
DXVK_LOG_LEVEL=none DXVK_USE_PIPECOMPILER=1 /usr/bin/wine "$@"

You just need the last line; adapt it as you like. In my case DXVK_LOG_LEVEL=none means more FPS and less hammering of the SSD, and DXVK_USE_PIPECOMPILER=1 makes things faster on Radeon cards.

Then you make that .sh file executable, and finally you set that script as the "Custom wine" command.

The "$@" passes along everything the launcher sends to Wine as parameters.
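As a concrete example, assuming the script above was saved as ~/eve-wine.sh (placeholder path):

chmod +x ~/eve-wine.sh
# then enter ~/eve-wine.sh as the "Custom wine" command in the launcher settings;
# whatever the launcher would normally pass to wine arrives in the script as "$@"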

Yes, I use a custom wine bash script:
MUMBLE_OVERLAY_DEBUG=1 /usr/bin/mumble-overlay /opt/wine-staging/bin/wine "$@" 2>&1 | tee /tmp/eve

And the relevant lines in /tmp/eve are:
MumbleOverlay: Mumble overlay library loaded
MumbleOverlay: Iterating dlsym table 0xf7b09210 0xf7b094b0 42
MumbleOverlay: Original dlsym at 0xf7b09d90

And I don’t use DXVK and I run the client with DX9.

Some updates:

  • kernel 4.18.5-041805-generic
  • mumble:i386 updated to 1.3.0~2870~gf7221c1
  • wine-staging 3.15

OpenGL Version 4.5 (Core Profile) Mesa 18.0.5, Vendor X.Org, Renderer Radeon RX 580 Series (POLARIS10 / DRM 3.26.0 / 4.18.5-041805-generic, LLVM 6.0.0), Shader 4.50

Got a ton of MumbleOverlay messages:
MumbleOverlay: Request for symbol __wine_process_init (0x7ce5c2a8:0xf7b5ed90)
ERROR: ld.so: object '/usr/lib/mumble/libmumble.so.1' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS32): ignored.

MumbleOverlay: Current context is: 0xeeded080
MumbleOverlay: GLX version 1.4

MumbleOverlay: Sending init overlay msg with w h 1024 700
MumbleOverlay: Sending init overlay msg with w h 1920 1080
MumbleOverlay: SHMEM /MumbleOverlayMemory355
MumbleOverlay: Failed to map memory
MumbleOverlay: BLIT 0 0 1024 700
MumbleOverlay: BLIT 0 0 1024 700
MumbleOverlay: ACTIVE 0 0 552 168

MumbleOverlay: Optimzied fullscreen blit

It’s still not working, but these changes are promising.

FurMark 1.20.1.0 still works and has no problem mapping memory:
MumbleOverlay: Sending init overlay msg with w h 634 375
MumbleOverlay: SHMEM /MumbleOverlayMemory369
MumbleOverlay: BLIT 0 0 634 375
MumbleOverlay: Optimzied fullscreen blit
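One more thing I want to check (just an idea, assuming the overlay uses POSIX shared memory, which normally shows up under /dev/shm on Linux): whether the segment named in the log actually exists and is readable while EVE is running:

# hypothetical check; the names follow the SHMEM lines in the logs above
ls -l /dev/shm/MumbleOverlayMemory*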

What kind of DX9 are you using? Native DX9 with Nine, or Wine's slow DX9 => OpenGL layer?

Whatever comes with Ubuntu, so the default…
