Devblog: EVE 64-bit & DirectX 12

A bug is when a program runs with an error, fault, failure, or flaw, and/or when the programmer did not foresee that outcome of their programming choices.

I offer the following ‘reasonable’ support for the claim that cloudy environments in EVE are not bugs but rather a development choice by CCP. Do any of these statements prove that CCP is aware that cloudy environments are causing problems for some people’s graphics cards? No.

Does my support prove that there isn’t some bug buried deep in EVE’s code causing the problem, rather than my contention that such environments are graphically intensive due to high pixel counts? No, it does not do that either. But if ‘reasonable’ support offered between reasonable people is sufficient, it is far more than enough.

  1. Cloudy environments have been in the game for the entire five years I’ve been playing, and over the years I’ve seen people comment about how much cloudy environments strain their graphics cards.

  2. There are graphics settings in the game that let you shut off clouds entirely, but they are grouped with a bunch of other settings under one button, so if you want to get rid of clouds you also get rid of a lot of other visual content that isn’t a problem.

  3. I seriously doubt that, even in the beginning, CCP’s devs could not foresee that high-pixel environments would strain graphics cards, since every game ever made must balance graphics intensity against players’ ability to run the game at those settings. It is, to put it bluntly, programming-101 kind of stuff.

  4. A bug requires either a defect in programming (which this is not) or an unforeseeable outcome of a perfectly functioning program (which this is not either).

We must therefore come to the reasonable conclusion that it is not a bug but a development choice by CCP, with the predictable and known outcome that cloudy environments tax some players’ graphics cards so much that they must run such environments in semi-potato or full-potato mode to avoid laggy gameplay.


Okay, to test I’m running Win 10 64-bit + DX12 now, on an older laptop: 4 cores, two graphics cards, and 8 GB RAM… and I must say, EVE looks better in DX11 than DX9, with no performance issues at high graphics settings. Except, yeah, clouds in some anoms are a pain in the ass… the drop is about 60% from the usual 80–100 FPS.

But from what I understand, the 64-bit client should give +1x more performance on that laptop?

About DX12… well, I believe that change is dictated by Microsoft, because they dropped support for DX9 as a very old API, along with Windows XP/7 and similarly old operating systems, and EVE is not the only game that has to move on. World of Warcraft, World of Tanks, and many more made that change without any performance impact but with huge graphical improvements. Most newer systems run Windows 10 with DX12 built in.

EDIT: EVE was running on DX11 according to the in-game FPS monitor.

I played EVE for several years on my MacBook. Now I can’t because the QT lib isn’t supported by the launcher anymore.

I’ve not left, I just can’t play.

New technology is wonderful; that doesn’t mean it’s necessary.


Give players an option to turn off citadel advertisements and it’ll be used more than skill extractors were to vacuum out mining barges.


Among the specs of a video game, the ability of users to play in a reasonable environment (the one described by the “minimum requirements”) is a contract made by the developers.
If players whose hardware meets the minimum requirements are not able to play correctly, then that is a bug.

Not all bugs are critical bugs, and for users the experience of a bug is enough to characterize one.
It may be a design bug, it may be a very low-priority bug, it may even be a bug that everybody actually benefits from; it’s still a bug nonetheless.

From the client’s point of view, a bug is when expectations do not meet reality. That’s why users may report false bugs: their expectations are not the same as the devs’.

You never know. Sometimes it’s just coding against one framework, then using a new version of the same framework that does the same thing but “nicer” and thus more GPU-intensive; and since the person who was in charge of binding the framework to the game is no longer around, nobody can fix it.
Really, development is very complex work. Many people think they know better than you and spit on you while having no idea of the challenges you face.
The best users can do is keep it civil and report every bug/enhancement they perceive. Comments like “programming 101” are a hindrance to both you and the devs.


I know tons of people who play on Mac; are you sure you didn’t just have some strange bug?


Why not use Vulkan instead? Then it’s open source and supports all OSes :thinking:


It doesn’t state that EVE will not run if you have 8 GB of RAM, or that one cannot run 3 (or more) clients; it’s simply not recommended…

Then again, RAM prices are dropping again, so hold out a few months and buy more RAM #16GB #32GB :wink:


MacOS does not support Vulkan, so no use :wink:

FYI Microsoft seems to have (back)ported DirectX12 to Windows 7 (for WoW mainly).


Are you multiboxing all accounts on lowest settings, in potato mode, with all effects etc. turned off? I bet not, so you aren’t running at the 2 GiB per-client maximum.

They updated the requirements because the 64-bit client seems (from my testing on Duality) to use a bit more than 2 GiB of RAM in normal gameplay (no fleet battles). So add ~500 MiB per client for the 64-bit version, not double the amount needed.
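The arithmetic behind this can be sketched in a few lines (the per-client figures are this poster’s own estimates from Duality, not official CCP numbers):

```python
# Rough multibox RAM estimate using the figures from the post above:
# ~2 GiB per client, plus ~512 MiB extra for the 64-bit build.
GIB_IN_MIB = 1024

def total_ram_mib(clients, per_client_mib=2 * GIB_IN_MIB, extra_64bit_mib=512):
    """Estimated RAM for `clients` simultaneous 64-bit EVE clients, in MiB."""
    return clients * (per_client_mib + extra_64bit_mib)

for n in (1, 2, 3):
    print(f"{n} client(s) -> {total_ram_mib(n) / GIB_IN_MIB:.1f} GiB")
# Three clients land around 7.5 GiB, which is why 8 GB is merely
# "not recommended" rather than impossible.
```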

Just forget about playing EVE on a Mac and play it on a PC with Windows 10; it’s way superior.

It’s not a full backport, but it is substantial enough that this was a factor in our decision making to go with DirectX 12.


Nope. That’s a screenshot from a pre-rendered scene in the Prometheus movie (the best prequel of the Alien series, btw). Here’s another one I made.

Applying lights and shadows over dynamic clouds is problematic in real-time rendered scenes; it requires a lot of GPU resources. With DirectX 12 and modern GPU power it’s still possible without a huge loss in performance. For example, using DirectX 11, moving clouds and atmosphere were well implemented in Dark Souls 3. Here are some in-game screenshots:

[in-game screenshot]

  • this scene contains clouds and fog, but they are blurred and faded compared to the original movie sample.

Yeah. Something should be done with all the cloud effects in this game: remastered, reworked, whatever.

In my opinion, the clouds may have been built with the DX9 SDK. DirectX 9 has a lot of unoptimized paths for rendering advanced graphics. The trouble with DirectX 9 comes down to two things: it performs too many visual computations on the CPU, and it contains many methods that modern GPUs support poorly.
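As a loose analogy for the CPU-overhead point (this is a toy model, not EVE’s renderer, and every number in it is made up): older APIs pay a fixed CPU cost per draw call, so thousands of tiny cloud draws become CPU-bound, while batching amortizes that cost:

```python
# Toy draw-call cost model: fixed CPU overhead per call plus GPU work per quad.
CALL_OVERHEAD_US = 50      # hypothetical CPU cost per draw call (microseconds)
GPU_COST_PER_QUAD_US = 1   # hypothetical GPU cost per cloud quad

def frame_cost_us(quads, quads_per_call):
    """Total frame cost when `quads` are submitted in batches of `quads_per_call`."""
    calls = -(-quads // quads_per_call)  # ceiling division
    return calls * CALL_OVERHEAD_US + quads * GPU_COST_PER_QUAD_US

print(frame_cost_us(10_000, 1))      # one quad per call: dominated by CPU overhead
print(frame_cost_us(10_000, 1_000))  # big batches: dominated by actual GPU work
```

The ~50x gap between the two calls is the kind of difference batching-friendly APIs like DX12 are designed to exploit.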

Windows 7 is over. Upgrade it, for God’s sake. People complained the same way back when Windows XP was dropped in favor of Windows 7.


Some previous posts seem to think that the 64-bit client will improve things like TiDi; however, this doesn’t make sense to me. My impression is that the 64-bit client will improve things that are computed locally, such as graphics, but that server-side computations will be largely unchanged. Is this the case?

PS - thank you for hinting that Linux will still have a chance at running the new client :wink:


Some causes of lag and crashes in large battles are client-related, often due to the memory cap. The 64-bit client will help with these, which some people confuse with TiDi.
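The memory cap mentioned here is just pointer arithmetic; a quick sketch (the 2 GiB default / 4 GiB large-address-aware split is standard Windows behavior for 32-bit processes, not anything EVE-specific):

```python
# Address-space arithmetic behind the 32-bit client's memory cap.
def addressable_gib(pointer_bits):
    """Maximum bytes a pointer of the given width can address, in GiB."""
    return 2 ** pointer_bits / 2 ** 30

print(addressable_gib(32))  # 4.0 GiB total; a 32-bit Windows process gets
                            # 2 GiB by default (4 GiB if large-address-aware)
print(addressable_gib(64))  # ~1.7e10 GiB: the cap effectively disappears
```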

Here you have something in work/testing as a possible TiDi solution.


Less Microsoft lock-in is always better. Vulkan is the future, not DX12. If at all possible, please go the Vulkan route.
