Performance critical systems such as our rendering engine known as Trinity are indeed written in C++.
This was an exaggeration; I had two old computers that were about to die anyway. The GPU did not die, though what happened (and still happens on my computer now) is that if I don't run in potato mode, my fans start going wild with 2+ accounts. The gas clouds were, however, putting a huge load on my computer (I think in some Guristas cosmic sites).
My point is, we don't get feedback on our bug reports, so we can't know if/when a bug we reported is supposed to be fixed.
I've seen people saying that using Vulkan would be good for Linux support, but graphics APIs aside, what else is stopping EVE from running on Linux?
I play EVE on Linux using Wine 4.0 with DXGI (older versions of Wine might want to use DXVK). Also, if you use upstream Wine, you'll want to apply this patch (https://forum.ubuntuusers.de/topic/wine-selbst-bauen-und-dann-installieren/ - it's in German, but the part you want is the GetKeyState / GetKeyboardState patch); from what I can see, the main Wine devs feel that fixing this bug is heretical because it doesn't follow Windows behaviour, but then Windows doesn't mess up modifier key state when Alt-Tabbing windows…
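For anyone attempting a similar setup, a minimal launch-script sketch is below. Treat it as a hedged illustration, not the exact setup described above: the prefix location and the path to the EVE executable are hypothetical and need adjusting to your own install, and the DXVK_HUD line only does anything if DXVK is actually installed in that prefix.

```shell
#!/bin/sh
# Hypothetical paths - point these at your own Wine prefix and EVE install.
export WINEPREFIX="$HOME/.wine-eve"   # dedicated prefix keeps EVE isolated
export DXVK_HUD=fps                   # optional FPS overlay; only takes effect with DXVK
exec wine "$WINEPREFIX/drive_c/EVE/eve.exe"
```

A dedicated prefix keeps EVE's Windows-side dependencies separate from any other Wine applications you run, so a broken update in one doesn't take down the other.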
In short Wine has already moved to translating DX calls to Vulkan calls for all but DX9.
I play everywhere; I rotate empires and hubs to keep standings up, switch to LS for escalations, and do occasional WH dives. Still having problems. It never disappears; it's just a bigger or smaller issue depending on the place, but it's always there.
Graphically intensive environments are not bugs; they are just how the game was made. I cannot reasonably file a bug on something that isn't a bug at all.
Currently I have to open my graphics settings every time I go into a cloud environment (which is often) and turn down shader settings until I'm in semi-potato mode, then go back and turn the graphics settings back up when I leave. It is annoying, but I don't want to play in semi-potato mode all the time, and the clouds kill my graphics card.
I would also note that since turning shader quality down far enough turns off clouds completely, I believe it would not be difficult to put clouds into their own shader quality category and let me shut down only the clouds; then I could keep all the rest of the visual content at the highest shader quality.
If your desire is for me to enjoy as much visual content as possible, then giving me the option to just turn off clouds is how you will achieve that goal. If not, then I will be forced to keep dropping the shader quality on ALL items in a scene just to get rid of the clouds.
Let me be clear: the clouds WILL be shut off no matter what choice you make, because they are so graphics-intensive that turning them off is a must. The question is whether I have to surrender ALL SHADER CONTENT in a scene just to get rid of them.
Thought I would add that, so as not to come across as a stubborn ass, I will file the bug report you recommended. But again, it isn't a bug at all, just pure computational overload from the many points of data needed to calculate the clouds in the scene.
This is fantastic news for Windows and Mac users.
Now what about us Linux users? I'm running Mint 19, and it's going to be a pain having to crank up my Windows drive just to play EVE. It takes away access to everything else that I do.
BTW, is there any chance, with all this new technology coming up, that things like better officers' quarters, walking in stations, and similar features will happen?
EVE needs a facelift, and this is a great opportunity to make it happen.
Please donât disappoint.
That IS a bug.
When the user experience differs from what the devs expected, this is a bug.
Hope it will be backward compatible. I am retired and on a fixed income; upgrading is not an option for me, so I and my 4 Omega accounts will be gone.
Well…
The problem is not that you will switch to 64-bit; it is the DirectX 12 part that scares me.
I mean, I still remember the DirectX 11 switch, which was followed by very bad moves like the removal of the "Load Station Environment" option, the new small moving bubbles of light inside stations, and other additions that made me go down from 3 simultaneous clients to a maximum of 1.5 in potato mode (meaning I launched the 2nd one only for as long as strictly required; it was too stressful for the computer).
What you did only meant less playability. I only got less.
Since then, I have upgraded my computer, but I refuse to do it yet again. I have 24 GB of RAM and a 1070 Ti, but still an i5-4590 (I am waiting for the Ryzen 3), so I can see some erratic behaviour with 3 clients online (plus the rest, like Discord etc.), as the CPU is a bottleneck. I don't want to see this kind of configuration go down to 1.5 clients maximum.
And according to Process Monitor and GPU-Z, in terms of GPU consumption a single EVE process sometimes goes from around 5% (potato) or 9% (medium) up to 18% GPU, with spikes to 30%, with ZERO interaction with the game, on a goddamn 1070 Ti. Just to show me a docked ship and a billboard advert I don't care about? Come on!
There was a precedent, so I am rather confident that you will make bad moves again for no valid reason. Of course, you could add options in the menu for all the new additions you make, but… you stated that you wanted to remove such options in the future, so… I won't be surprised by a disaster.
Someone should make a ganker character called DirectX 12
There are a lot of particle clouds in rarely used dungeons that the developers probably haven't touched in a long time. Things like expedition sites, anomalies, less popular missions, no man's land COSMOS. I just recently noticed how performance-intensive one of the Drone expeditions was (Mare Sargassum, if I recall correctly), with 9-16 "patches" of clouds overheating the GPU up to 80-85 °C when zoomed in.
I understand that fixing all clouds in all existing content is unrealistic, but spot bug reporting also feels like band-aiding that has gone on for many years now without a solid solution (if we rule out turning off "Effects" in the settings).
And by "solid solution" (theoretical, as it's probably impossible to expect in the near future) I mean removing all the old cloud objects and replacing them with environments like those in the new event sites.
It would be more fitting to name it Windows 10. Oh, how I hate that crappy OS…
They are not ending support for DX11.
lol, I switched back to DX9 because my clients froze when playing on two monitors (DX11); as I remember, it happened when I flew to an anomaly where non-optimised particle clouds tend to be.
Hoping the Mac client isn't too much of a pain for you. Really not in the mood to buy another PC just for EVE, but I do like my toys.
Honestly though, if you have a GFX card predating DX11 support, you need to upgrade. There's no way anyone on this planet exclusively plays EVE and EVE alone; even the most diehard capsuleers play something else besides EVE. It's not expensive at all to get a GFX card that supports DX11 and DX12. If you can currently afford to play EVE on a computer that runs the game in something more than potato mode, chances are you can afford a card that will carry you into EVE's future with DX12.

There's really no excuse. EVE is VERY forgiving on specs, but put that ancient card to rest and get with the times. You can easily find a GTX 970 on eBay for $300 (probably cheaper now, as it's been a year since I bought mine). Not saying you need to invest in a 1080 or 2080, but considering EVE is extremely lax on specs, invest the money. You can't expect CCP to support potato mode for extremely long periods of time, when right now anything made in the last 10 years will run EVE. I can't be 100% certain, but I bet if I put an Nvidia 8800GT in my rig, it would run EVE at a decent resolution/framerate/settings.

Say what you want about CCP due to their tactics lately, but in all my years no other game company has supported dead-and-gone hardware the way they have. They keep supporting hardware after the original company drops support. (Their running tally for HW support is 10 years, so if you have a 10-year-old GFX card, it will run EVE. It may not be max graphics, but it will get the job done.)
PS: I don't have a modern rig by any means, so I'm not one of those enthusiasts who bins a grand or more every year on a new upgrade. My rig is 5+ years old, and I'm trying to eke out as much performance as I can while funds build up for a brand new rig at $2,800 or more. But the fact is, EVE is one of the only games that will support hardware over 5 years old and keep supporting it as if it were brand new. So if you miss the "cut", don't be pissed. Go on eBay and buy a used, newer GFX card. Let that old card rest, salute it for working as long as it did, and keep on trucking. Again, based on my purchase a year ago: $300 nets you the ability to play EVE for another 5 years at minimum, plus enjoy the latest and greatest games on at least high settings (some maxed out on Ultra). That's with a GTX 970 + AMD FX 8150 with 32 GB of RAM.
But has CCP considered the side-effects for multiclient-players properly?
While it may look statistically sound that most people run a 64-bit operating system, dropping the 32-bit client, and thus raising the minimum requirements, will affect most people running more than one client.
With 8 GB of RAM and a future minimum requirement of 4 GB per client (as opposed to 2 GB on 32-bit), you can only run two clients on that computer.
So you are effectively halving the number of clients a computer can run, and I expect this will cost some subscriptions if people need to severely upgrade RAM and GPU, or need to replace their machines because they hit a limit elsewhere (like SSE4.1, as was already mentioned for otherwise perfectly working 64-bit multicore Phenom systems).
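The arithmetic behind that halving is easy to spell out. A quick sketch using the numbers above (8 GB of RAM, 2 GB per 32-bit client vs. 4 GB per 64-bit client); note it ignores what the OS itself consumes, so real headroom is smaller:

```shell
total_ram_gb=8   # the 8 GB machine from the example above
echo "32-bit clients: $((total_ram_gb / 2))"   # 8 GB / 2 GB each -> 4 clients
echo "64-bit clients: $((total_ram_gb / 4))"   # 8 GB / 4 GB each -> 2 clients
```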
Mass-test tomorrow: https://www.eveonline.com/article/pnu5rz/2019-03-05-64-bit-client-mass-test