tbh, SLI and Crossfire are dead
very few games support it anymore, if any, and you'll mostly just get more problems, and in some cases actually less performance than with a single card
Not sure if Eve can take advantage of SLI when running on a single monitor, but if you play on two or more monitors, you can plug one of those monitors into the second card and then tell Eve to run a client on that card/monitor. So, with the right setup, yes, it would ease the load on your first card.

However, that's also a very expensive solution… wait, why do you care? Is it because your GPU fans are loud? Adding a second card might reduce the heat generated by the first card, but it will also add more fans making noise, generate more total heat in the case, and possibly hurt airflow to the first card. So it won't necessarily reduce fan noise. You might be better off turning down a few settings a skosh, making sure your computer has good airflow (i.e. don't block intakes), blowing out dust filters/heat sinks with some compressed air, or possibly getting a new case with better airflow.
Also, 75% utilization on an RTX 3090 with 3 clients? Are you running them at 4k? I ask because that seems kind of high for 3 clients and a card that beefy.
As far as why I care - I was just curious. The headset I have blocks out any noise from the fans (which I don't hear anyway). I spent a decent amount of money on the rig, so I want to be able to throw whatever I want at it without hesitation.
There should be enough space between the two cards if I did install a second one. I just wasn't aware I could set each screen to run independently of the others (is this something I can do for the entire rig or just EVE?)
It's a Corsair 7000X RGB case… I have 3 140mm intake fans up front and 4 120mm intake fans on the side, plus 4 140mm exhaust fans (3 are connected to the radiator for the AIO CPU cooler).
Looking at Task Manager, it's using 2.2GB of the 24GB of VRAM available, but the 3D graph (whatever that means) is at 50%.
The case fans are louder than the GPU fans (I think)… I honestly couldn't tell you, because I haven't played anything that forces the rig to spin up all its fans to keep the system cool.
If you plug your monitors into different cards, you can usually control what gets run on each card. Sometimes the program has a setting that lets you select the card (Eve does), sometimes you have to manually move the program (i.e. click and drag the title bar) to the monitor/card you want it on, and sometimes the program (usually a game) will automatically run on whichever card/monitor you have set as your primary monitor in Windows; there's a rough sketch of the Windows-side setting below. Hopefully that made sense.
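If you want to poke at the Windows side of that programmatically, Windows 10/11 stores the per-app GPU preference (the same thing the Settings > System > Display > Graphics page edits) in the registry. A minimal sketch in Python, with a made-up exe path; note that the 0/1/2 values only express "let Windows decide / power saving / high performance", so with two identical discrete cards you may still need the in-game card setting:

```python
import winreg

# Registry key behind Windows' Settings > System > Display > Graphics page.
KEY_PATH = r"Software\Microsoft\DirectX\UserGpuPreferences"

def set_gpu_preference(exe_path: str, preference: int) -> None:
    """Set a per-app GPU preference: 0 = let Windows decide,
    1 = power saving, 2 = high performance."""
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
        # The value name is the full path to the executable; the data is
        # a string like "GpuPreference=2;".
        winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ,
                          f"GpuPreference={preference};")

# Hypothetical install path -- point this at wherever your Eve client lives.
set_gpu_preference(r"C:\EVE\eve.exe", 2)
```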
Anyway, your GPU usage seems rather high for 3 clients at 1080p, and I'm just trying to figure out why. There could conceivably be something wrong, but not necessarily. For example, if you're running your clients at 120 or 240Hz, that would certainly account for it. Of course, if that's the case, I wouldn't sweat the high GPU usage, because extremely high framerates won't help you do better in Eve like they can in FPSes. And, of course, your ability to enjoy pretty graphics is greatly diminished when you start trying to multibox more clients at smaller resolutions. So, if you ever did want to throw more clients into the mix, you might as well start turning down frame rates and graphics settings.
Speaking of which, you don't need to upgrade if your system is doing what you need it to do. Start thinking about upgrading when that's no longer the case. And this is especially true considering how much a second RTX card will set you back. Moreover, the RTX 4000 series is rumored to be coming out later this year.
I'll have to see if there is a better program to download that monitors performance. The old-fashioned Task Manager shows a 3D usage of 50% now, but I'm not sure exactly what that means - if that's 50% of the card being utilized or what. GPU RAM is only 2GB of the 24GB.
And the Corsair iCUE software only shows fan RPMs.
Speaking of which, I've seen Task Manager lie to me before when it comes to GPU utilization. I don't know how often or why it happens, just that it can happen, so it's worth double-checking with something else (quick sketch below).
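If you want a second opinion besides Task Manager and iCUE, NVIDIA's drivers ship with a command line tool called nvidia-smi that reports utilization, VRAM, and temps, and it's easy to poll from a script. A rough sketch in Python (assumes nvidia-smi is on your PATH, which the GeForce driver install normally handles):

```python
import subprocess
import time

# Fields to report; run `nvidia-smi --help-query-gpu` for the full list.
QUERY = "utilization.gpu,memory.used,memory.total,temperature.gpu"

def poll_gpu() -> str:
    """Return one CSV line per GPU: util %, VRAM used/total (MiB), temp (C)."""
    return subprocess.check_output(
        ["nvidia-smi",
         f"--query-gpu={QUERY}",
         "--format=csv,noheader,nounits"],
        text=True,
    ).strip()

# Print a reading every 2 seconds while your clients are running.
while True:
    print(poll_gpu())
    time.sleep(2)
```

GPU-Z, HWiNFO, or MSI Afterburner will show you the same numbers with nicer graphs if you'd rather not script it.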
Anyway, the top portion of Task Manager's GPU tab shows GPU (Graphics Processing Unit) engine utilization, while the bottom section shows VRAM (video memory) utilization. Now, GPUs don't have cores in the same way that CPUs do, but they do have subsystems (referred to as engines) that are designed to be good at particular tasks and that can operate in parallel with each other. I don't know if this is the most accurate description, but I think of it like this: the GPU is a workshop, the engines are the workers, and the physical silicon is the tools. And while the engines/workers can work on different stuff, they have to share the silicon/tools between themselves.
Anyway, the 3D engine is the one that renders graphics (graphics cards can do other stuff too, such as encoding/decoding video, crypto mining, and deep learning), so that's the one that's going to show a bunch of utilization when you're playing a game.
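If you're curious, you can watch the raw per-engine numbers that Task Manager rolls up; Windows exposes them as "GPU Engine" performance counters. A rough sketch in Python using the built-in typeperf tool (the counter path is an assumption based on the standard Windows 10+ counter set, and instance names can vary by driver):

```python
import subprocess

# Windows publishes per-engine GPU utilization as performance counters;
# Task Manager's "3D" graph is aggregated from the engtype_3D instances.
COUNTER = r"\GPU Engine(*engtype_3D)\Utilization Percentage"

# Take 5 samples, one per second, and print the raw CSV typeperf emits.
output = subprocess.check_output(
    ["typeperf", COUNTER, "-sc", "5"],
    text=True,
)
print(output)
```

Each column in the output is one engine instance tied to a process, which is why the raw counters can look a lot busier than Task Manager's single summarized graph.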
A 3090 at 1080p is already seriously overpowered.
To give you something to benchmark against…
I run four clients on two 1440p monitors (165Hz) on a 3070 Ti.
Usage on the card rarely goes over 50%, and that's with all the client settings maxed.