A bug is when a program runs with an error, fault, failure, or flaw, and/or the programmer did not foresee that outcome of their programming choices.
I offer the following ‘reasonable’ support that cloudy environments in EVE are not bugs but rather a development choice by CCP. Do any of these statements prove that CCP is aware that cloudy environments are causing problems for some people’s graphics cards? No.
Does my support prove that there isn’t some bug buried deep in EVE’s code causing the problem, rather than my contention that it is because such environments are graphically intensive due to high pixel counts? No, it does not do that either. But if ‘reasonable’ support offered between reasonable people is sufficient, it is far more than enough support.
Cloudy environments have been in the game for the entire five years I’ve been playing, and over the years I’ve seen people comment on how much cloudy environments strain their graphics cards.
There are graphics settings in the game that let you shut off clouds entirely, but they are grouped with a bunch of other graphics settings under a single option, so if you want to get rid of clouds you also lose a bunch of other visual content that isn’t a problem.
I seriously doubt that even in the beginning CCP’s devs could not foresee that high-pixel-count environments would strain graphics cards, since every game ever made must make choices about graphics intensity versus the ability of people to actually play at those settings. It is, to put it bluntly, programming 101 kind of stuff.
Since a bug requires either a defect in the programming (which this is not) or an unforeseeable outcome of a perfectly functioning program (which this is not either), we must come to the reasonable conclusion that it is not a bug but rather a development choice by CCP, with a predictable and known outcome: cloudy environments tax some players’ graphics cards so heavily that they must run such environments in semi-potato or full potato mode to avoid laggy gameplay.
Okay, to test, I’m running Win 10 64-bit + DX12 now, on an older laptop: 4 cores, two graphics cards and 8 GB RAM… and I must say, EVE looks better in DX11 than DX9 with no performance issues at high graphics settings. Except, yeah, the clouds in some anoms are a pain in the ass… the drop is about 60% from the usual 80-100 FPS.
But from what I understand, the 64-bit client should give roughly +1x more performance on that laptop?
About DX12… well, I believe that change is driven by Microsoft, because they dropped support for DX9 as a very old API, along with Windows XP/7 and similarly old operating systems, and EVE is not the only game that has to move on. World of Warcraft, World of Tanks and many more made that change without any performance impact but with huge graphical improvements. Most newer systems run Windows 10 with DX12 built in.
EDIT: EVE was running on DX11 according to the in-game FPS monitor.
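For a sense of what that drop means in practice, here is a quick back-of-the-envelope calculation using only the numbers from the post above (80-100 FPS baseline, ~60% drop in cloudy anoms); these are the poster’s estimates, not measurements of the client itself.

```python
# Rough arithmetic for the reported cloud-anomaly slowdown.
# Inputs are the numbers quoted in the post: 80-100 FPS baseline, ~60% drop.
baseline_fps = (80, 100)
drop = 0.60

for fps in baseline_fps:
    cloudy_fps = fps * (1 - drop)
    frame_time_ms = 1000 / fps
    cloudy_frame_time_ms = 1000 / cloudy_fps
    print(f"{fps} FPS -> {cloudy_fps:.0f} FPS "
          f"({frame_time_ms:.1f} ms -> {cloudy_frame_time_ms:.1f} ms per frame)")

# 80 FPS -> 32 FPS (12.5 ms -> 31.2 ms), 100 FPS -> 40 FPS (10.0 ms -> 25.0 ms)
```

In other words, a 60% drop roughly triples the time the GPU spends on each frame, which is why it feels so much worse than the percentage alone suggests.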
Among the specs of a video game, the ability for the user to play the game in a reasonable environment (the one described by the “minimum requirements”) is a contract of the development.
If players whose hardware meets the minimum requirements are not able to play correctly, then that is a bug.
Not all bugs are critical bugs, and for users, the experience of a bug is enough to characterize a bug.
It may be a design bug, it may be a very low-priority bug, it may even be a bug that everybody actually benefits from; it’s still a bug nonetheless.
From the client’s point of view, a bug is when expectation does not meet reality. That’s why users may report false bugs: their expectations are not the same as the devs’.
You never know. Sometimes it’s just code written against one framework, then a new version of the same framework does the same thing but “nicer” and thus more GPU-intensive, and since the people who were in charge of binding the framework to the game are no longer around, nobody can fix it.
Really, dev work is very complex. Many people think they know better than you, and spit on you while they have no idea of the challenges you face.
The best users can do is keep it civil and report every bug/enhancement they perceive. Comments like “programming 101” are a hindrance to both you and the devs.
Are you multiboxing on all accounts on the lowest settings, in potato mode, and with all effects etc. turned off? I bet not, so you aren’t running at the 2 GiB of RAM per client maximum.
They updated the requirements because the 64-bit client seems (from my testing on Duality) to use a bit more than 2 GiB of RAM in normal gameplay (no fleet battles). So add ~500 MiB per client for the 64-bit version, not double the amount needed.
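As a rough illustration of that difference, here is a small sketch that estimates total client memory for a multiboxer under both assumptions; the ~2 GiB per 32-bit client and ~+0.5 GiB per 64-bit client figures are the estimates from this thread, not official numbers.

```python
# Rough multibox memory estimate based on the figures quoted in this thread:
# ~2 GiB per 32-bit client (near its cap), ~0.5 GiB extra per 64-bit client.
def total_client_ram(clients: int, per_client_gib: float) -> float:
    """Total RAM in GiB used by `clients` simultaneous EVE clients."""
    return clients * per_client_gib

for clients in (1, 3, 5, 10):
    ram_32 = total_client_ram(clients, 2.0)   # 32-bit client estimate
    ram_64 = total_client_ram(clients, 2.5)   # 64-bit client, ~500 MiB more
    print(f"{clients:2d} clients: ~{ram_32:.1f} GiB (32-bit) vs ~{ram_64:.1f} GiB (64-bit)")
```

So even at ten clients the difference is about 5 GiB total, not a doubling of the requirement.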
Applying lights and shadows over dynamic clouds is problematic in real-time rendered scenes. It requires a lot of GPU resources. With DirectX 12 and modern GPU power it’s still possible to do without a huge loss in performance. For example, using DirectX 11, moving clouds and atmosphere were well implemented in Dark Souls 3. Here are some in-game screenshots:
Yeah. Something should be done about all the cloud effects in this game: remastered, reworked, whatever.
In my opinion, the clouds may have been generated using the DX9 SDK. DirectX 9 has a lot of unoptimized algorithms for rendering advanced graphics. The problem with DirectX 9 comes down to two things: it performs too many visual computations on the CPU, and it contains many methods that are poorly supported by modern GPUs.
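To illustrate why full-screen cloud effects scale so badly with resolution (the “high pixel count” point made earlier in the thread), here is a toy estimate of the shading workload for a ray-marched cloud pass; the step count and resolutions are purely illustrative assumptions, not anything taken from EVE’s actual renderer.

```python
# Toy estimate: a full-screen cloud/atmosphere pass that ray-marches
# N steps per pixel performs roughly pixel_count * N shading samples per frame.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
steps_per_pixel = 32  # illustrative ray-march step count, not EVE's real value

for name, (w, h) in resolutions.items():
    pixels = w * h
    samples_per_frame = pixels * steps_per_pixel
    print(f"{name}: {pixels:,} pixels -> {samples_per_frame:,} cloud samples per frame")

# The per-frame cost grows linearly with pixel count, which is why the same
# cloud effect that is tolerable at 1080p can tank the frame rate at 4K.
```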
Some previous posts seem to suggest that this 64-bit client will improve things like TiDi; however, this doesn’t make sense to me. My impression is that the 64-bit client will improve things that are computed locally, such as graphics, but server-side computations will be largely unchanged. Is this the case?
PS: thank you for hinting that Linux will still have a chance at running the new client.
Some causes of lag and crashes in large battles are client-related, often due to the memory cap. The 64-bit client will help with those, which some people confuse with TiDi.
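For context on what “the memory cap” means here: a 32-bit process can only address 2^32 bytes in total, and on Windows a 32-bit user-mode process typically gets only part of that as usable space. The sketch below is just that arithmetic, not a description of EVE’s internals.

```python
# Why a 32-bit client hits a wall in big fights: address-space arithmetic.
GIB = 1024 ** 3

addressable_32bit = 2 ** 32 / GIB   # 4 GiB total address space for a 32-bit process
addressable_64bit = 2 ** 64 / GIB   # far more than any real machine has installed

print(f"32-bit process address space: {addressable_32bit:.0f} GiB total")
print(f"64-bit process address space: {addressable_64bit:,.0f} GiB (effectively unlimited)")

# A 32-bit client whose memory use balloons during a large battle eventually
# runs out of addresses and crashes, regardless of how much RAM the machine
# has; a 64-bit build removes that particular ceiling. TiDi, by contrast, is
# a server-side throttle and is unaffected by the client's address space.
```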