Devblog: EVE 64-bit & DirectX 12

lol ikr!!! IAMxTROLL my system is ASUS 64-bit, i7, 16GB w/ dual 1080 Ti

We have seen how they perform. In most cases, better. When not implemented right? Yeah, sure, a little worse. Still worth it though.

And with that my time in EVE is over. I am one of those 0.5% and don't have the means to build a new PC anytime soon.
Well ■■■■ and I just came back to EVE after years too =[

Are you sure your hardware won't support 64-bit, and it isn't just your OS that is 32-bit?
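If you have Python installed, a quick way to check both halves of that question is the sketch below (a minimal example using only the standard library; the WOW64 environment-variable check is Windows-specific):

```python
import os
import platform
import struct

# Pointer size of the running interpreter: 8 bytes on a 64-bit build, 4 on 32-bit.
pointer_bytes = struct.calcsize("P")
print("interpreter:", pointer_bytes * 8, "bit")

# What the OS reports about the machine, e.g. 'AMD64' or 'x86_64' on 64-bit hardware.
print("machine:", platform.machine())

# On Windows, this variable is set only when a 32-bit process runs under WOW64
# on a 64-bit OS; elsewhere it is simply absent.
print("running under WOW64:", "PROCESSOR_ARCHITEW6432" in os.environ)
```

If the machine string says 64-bit but the interpreter (or the EVE client) says 32-bit, the hardware is fine and only the software is the limitation.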

1 Like

This is a serious request, not sarcasm. Can someone explain why a 64-bit client for EVE would be better or more beneficial than the current 32-bit client?

A 32-bit program can only address about 4 GB of RAM (in practice roughly 3.5 GB usable); a 64-bit program can use far more.
In big TiDi fights, crashes from running out of memory are common; this should reduce those significantly.
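The back-of-the-envelope arithmetic behind that claim (these are the architectural pointer limits, not what any particular OS actually hands a process):

```python
# A 32-bit pointer can name 2**32 distinct bytes; a 64-bit pointer, 2**64.
# Windows reserves part of the 32-bit range for the kernel, which is why a
# 32-bit client sees ~2-3.5 GB usable rather than the full 4 GiB.
GIB = 2**30

addr_32 = 2**32
addr_64 = 2**64

print(f"32-bit address space: {addr_32 // GIB} GiB")
print(f"64-bit address space: {addr_64 // GIB:,} GiB")
```

The 64-bit figure comes out to about 17 billion GiB, so for practical purposes the address-space ceiling disappears entirely.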

3 Likes

Thanks for the clarification; I did not realize a 32-bit application can only use that much memory/RAM. I was thinking a 64-bit OS running a 32-bit application would not run into memory limits like that. Good to know I was incorrect.

On the downside, 64-bit programs do use slightly more memory, because pointers double in size from 4 to 8 bytes.
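Illustrative arithmetic only (the one-million-reference figure is a made-up example, not an EVE measurement), showing why a pointer-heavy program pays a fixed per-reference cost in a 64-bit build:

```python
# Every pointer grows from 4 bytes (32-bit) to 8 bytes (64-bit), so any
# structure full of object references gets bigger just by recompiling.
n_refs = 1_000_000
extra_bytes = n_refs * (8 - 4)

print(f"{n_refs:,} references cost {extra_bytes // 1_000_000} MB more in a 64-bit build")
```

In practice the overhead is usually a few percent of total footprint, which is why it rarely matters compared to the 4 GB ceiling being lifted.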

DX11 has existed since 2009, so why the hurry now? I'll keep up the irony until I get the satire… :man_facepalming:

Funny thing is everyone will upgrade and see zero difference…

2 Likes

That's utter nonsense. You don't run out of memory, especially not 3.5 GB of it, and EVE doesn't have a memory leak. TiDi is done server-side, not client-side, so it is a server issue, not a client issue.

You definitely do run into the limit of the address space, and it has nothing to do with TiDi - only with the things your client needs to do.

You don't actually need a memory leak to run out of memory these days. EVE has changed in many ways, but it never moved to a 64-bit address space.

Server-side things that CCP shall improve on their grand quest to slay the legendary "node process" and slice it into little bits (with a long and shiny blade, preferably): RPC dropping in heavy TiDi (THIS is, most often, the cause of your frustration and "My module did not deactivate/activate!")

Client-side things that CCP shall improve on their grand quest to replace the stonework of Castle EVE in one night with a magic spell: Fix large battle crashes, improve client performance during said large battles.

Meh large battles.

It will only use more RAM for everyone 99% of the time.

Where did I hear something like this before?

Firstly, glad to hear the minimum standards are progressing to something that makes development and testing easier - modern toolsets are designed for 64-bit systems and more recent APIs.

As the question has been asked and I'm a wonk for the detail: "what does a 64-bit application give you?" isn't really the question. It's more "what is the problem with a 32-bit application on a 64-bit system?"

Summary: running a 32-bit application on 64-bit hardware and operating systems is slow and limiting. A 64-bit client will run faster for several reasons.

A bit of dating information: the Prescott Pentium 4 processor (2004) was 64-bit; Windows 7 was released in 2009 and was almost invariably deployed as the 64-bit version. Even Vista post-dates 64-bit processors becoming common and was deployed in 64-bit mode on 64-bit processors.
You are almost inevitably running a 64-bit OS on a 64-bit architecture. As CCP have noted, "almost all" is 99.5% of clients.

Firstly, if you have a PC with a 64-bit processor, then to run a mix of 64-bit and 32-bit code the processor has to mode-switch internally - this is an appreciable overhead (I suspect because it requires the instruction pipelines to be flushed first, but I'd have to check). The scale of the overhead depends on how often you have to switch, which brings us to the next issue…
Windows handles 32-bit applications using WOW64 (Windows-on-Windows 64-bit) - a translation layer reshaping 32-bit OS calls into ones the underlying 64-bit OS will accept. Overhead. And of course, those 64-bit parts of the OS running within a 32-bit application drive more mode switching on the processor.
The 32-bit instruction set (the old x86 architecture) is a subset of the AMD64 (64-bit) architecture. Basically, there are useful things a 64-bit CPU can do in 64-bit mode that are not available to a 32-bit application - encryption instructions are a classic example, but there are many other "in hardware" services. So a 32-bit application has to do a lot more in software - and is thus slower - than it would be using the capabilities already in the hardware (capabilities that you paid for!).
The 64-bit architecture also has more internal registers, which reduces spilling registers to memory for many tasks and means code compiles into fewer operations, making it faster; it can address wider memory spaces; there's a plethora of improvements over operating in 32-bit mode.
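As a toy illustration of the "fewer operations" point (a sketch in Python, not how a compiler actually works): operating on a 64-bit value with only 32-bit operations means splitting it into halves and propagating a carry by hand, where a 64-bit ALU does the same work in a single instruction.

```python
MASK32 = (1 << 32) - 1

def add64_on_32bit(a: int, b: int) -> int:
    """Emulate one 64-bit add using two 32-bit adds plus carry handling,
    the way a 32-bit program has to do 64-bit arithmetic."""
    lo = (a & MASK32) + (b & MASK32)
    carry = lo >> 32
    hi = ((a >> 32) + (b >> 32) + carry) & MASK32
    return (hi << 32) | (lo & MASK32)

# Matches native 64-bit addition (modulo 2**64), at several times the work.
a, b = 0xFFFF_FFFF_0000_0001, 0x0000_0000_FFFF_FFFF
assert add64_on_32bit(a, b) == (a + b) & ((1 << 64) - 1)
```

The same multiplication applies to anything pointer- or size-related, which is part of why the same source code simply compiles into fewer instructions for a 64-bit target.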

And I've not led with "more RAM addressing", as that's a bit of a red herring - nice, but not critical. The win is in being able to access the capabilities of the processor and removing some of the performance overheads associated with running a 32-bit application on a 64-bit platform.

Basically, the 64-bit client is the right place to be from a performance perspective, and while I don't know the tools CCP use, I'd be stunned if it didn't make support a lot easier.

Dropping DirectX 9 support is a similar issue - it removes a layer of complexity for CCP.

10 Likes

Do the minimum requirements represent fully-enabled graphics, or potato mode?

I like that you are moving to a closer-to-the-metal API, but as a Linux player I would love it if you went with Vulkan. I think the Mac users would like it too, because there's a Vulkan-to-Metal translation layer now as well.

I know it might be more work, and MS won't be there to help.

Some of you are excited about ray tracing, but in its current form it is pointless. I am by far no expert, but I would imagine the ray count and pixel sample count would need to be insane just to get a decent picture. Not to mention how dark it might make everything - just like in space for real :slight_smile:

Do we need CF/SLI in EVE? Just being able to run a full-screen toon on each monitor/GPU would probably be more popular.

I understand that DX12 might come with better support in the industry, but the "cross-platform" ability it would give you may help in the future.

Spreading cables from each monitor to each GPU is backward IMO, but if EVE is the only thing you play, I guess it works.
The problem begins if you use your GPUs regularly in other games/programs like benchmarks etc. - then you need to constantly mess around at the back of your rig.

There is also the question of horsepower: two cards can render a lot more, and focused on a single client (if done properly) a busy scene will start to lag/drop frames much later, rendering-wise.

A special bonus is DX12 itself: the way multi-GPU works there, unlike in DX11, the VRAM of each GPU can be combined, so you could pick up literally garbage-level GPUs, say an R9 270/280 (5 years old), for pocket change, get 6 GB of VRAM combined, and be able to play EVE without investing serious cash.

This would be the first time the bottom end of users is left with options to tag along until a proper upgrade, or until those GPUs die out. Kind of a neat thing.