Friends, hello everyone!
First of all, I want to thank everyone who took part in the discussion about the accessibility of EVE Online. Whatever your point of view, it truly matters, because every opinion helps us understand how to make the game more accessible and comfortable for everyone.
As for 3D sound, I’d like to point out that it’s not an essential feature. In EVE Online, most actions and events already have their own unique sounds that players can use as cues. Audio beacons for determining the exact position of objects in space don’t play a critical role. What’s far more important is ensuring access to the interface and controls.
I’d also like to touch on how information is perceived. In real life, we blind people don’t receive the full range of visual information either. Yet we live, learn, work, and successfully adapt to the world by using other ways of interacting with it. So accessibility in a game can be achieved without having to voice absolutely everything.
To make the idea clearer, let me give an example using the “Overview” panel.
Imagine that by pressing a key combination (for example, Ctrl + ]) an accessibility mode is activated. It doesn’t cause any visual changes, but it allows interaction with the game through text-to-speech. In this mode, you can move up and down through available sections such as “Skills,” “Market,” “Ship Fitting,” “Hangar,” and, when in space, “Overview” and others.
Let’s say we’re in open space, in a system with asteroid belts. Using the arrow keys, we navigate to “Overview,” press the right arrow, and enter it. Inside, there are several preconfigured tabs for different activities: “Mining,” “Structures,” and so on. We choose “Mining,” press the right arrow again to open it, and get a list of asteroid belts. After selecting the desired belt, we press the right arrow once more to open the context menu (the equivalent of a right-click for sighted players). We choose the needed action, press Enter — and it’s executed.
The same logic could be used for navigation in other sections as well — the “Skills” window, “Ship Fitting,” and so forth. Almost everything in EVE Online is already based on an object structure. Making it accessible through a tree-style system of spoken menus is entirely feasible. The same data used by sighted players would be used here too, which means the information and interaction speed would, in many cases, be nearly identical.
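To make the tree-menu idea concrete, here is a minimal sketch in Python of how such spoken navigation could work. Everything in it is hypothetical: the menu structure, the item names, and the `speak()` stub (a real build would call a screen-reader or TTS API, not `print`). It is a sketch of the interaction model described above, not CCP's code.

```python
spoken = []

def speak(text):
    """Stand-in for text-to-speech output (hypothetical; a real build
    would call a screen-reader or TTS engine)."""
    spoken.append(text)
    print(text)

# Every node is (label, children); context-menu actions are leaves.
MENU = ("Root", [
    ("Skills", []),
    ("Market", []),
    ("Overview", [
        ("Mining", [
            ("Asteroid Belt I", [("Warp to", []), ("Approach", [])]),
        ]),
        ("Structures", []),
    ]),
])

class MenuCursor:
    """Arrow-key navigation: up/down move between siblings,
    right enters the selected item (or executes a leaf action),
    left goes back up one level."""

    def __init__(self, root):
        self.stack = [(root, 0)]  # (node, selected child index) per level

    def _node(self):
        node, i = self.stack[-1]
        return node[1][i]  # currently selected child

    def down(self):
        node, i = self.stack[-1]
        if i + 1 < len(node[1]):
            self.stack[-1] = (node, i + 1)
        speak(self._node()[0])

    def up(self):
        node, i = self.stack[-1]
        if i > 0:
            self.stack[-1] = (node, i - 1)
        speak(self._node()[0])

    def right(self):
        child = self._node()
        if child[1]:                      # has sub-items: enter it
            self.stack.append((child, 0))
            speak(self._node()[0])
        else:                             # leaf: execute the action
            speak(f"{child[0]} executed")

    def left(self):
        if len(self.stack) > 1:
            self.stack.pop()
        speak(self._node()[0])

# Walking the example from the post: Overview -> Mining -> belt -> Warp to
c = MenuCursor(MENU)
c.down(); c.down()   # speaks "Market", then "Overview"
c.right()            # enter Overview: speaks "Mining"
c.right()            # enter Mining: speaks "Asteroid Belt I"
c.right()            # open context menu: speaks "Warp to"
c.right()            # execute the action: speaks "Warp to executed"
```

The point of the sketch is that the cursor only ever touches labels and a selection index, the same object data the visual client already has, which is why the spoken path and the mouse path can stay in sync.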
Thank you to everyone paying attention to this topic. I hope my example clearly shows that accessibility isn’t a separate interface, but an alternative way of interacting with the same game world.