GeForce GTX 1650 for the Proliant ML350 G6


Just for fun, placing a GeForce GTX 1650 in an HP Proliant ML350 G6.

In this video, I'll be replacing the GeForce GT710 with a GTX 1650 and then comparing the two in a few game benchmarks.

The hardware:
HP Proliant ML350 G6
Intel Xeon 5540 x2
20GB RAM (10GB per CPU)
Kingston 240GB SSD (connected to the onboard SATA controller)
GeForce GT710 and GTX1650

Software:
Windows 10 Pro 22H2
Devil May Cry 4 benchmark (DX10 version)
Crysis pre-launch demo
World of Warcraft: Dragonflight (trial account)

For me, the GTX 1650 has been quite an interesting card ever since it came out. Sure, it didn't get positive reviews because of its price-to-performance ratio, but from a technical point of view I did find the card interesting: it's the fastest gaming card that doesn't require an extra power connector!
So I had wanted the card for some time, but the price was quite high for just a 'fun project card', so I waited (prices tend to go down, right?). Then came the 'mining craze' and all GPU prices skyrocketed... including the GTX 1650, going well over 400 euros (for simplicity's sake, let's say 1 euro is about 1 US dollar, so 400 dollars for a GTX 1650... meh, don't think so).
Earlier this year prices came down again. The card was still on the expensive side (170 euros), but I was afraid it would go EOL, so I bought the cheapest one I could find: the Inno3D GTX 1650 with GDDR6.

It's a bit of a messy video and I'm not really happy with the quality, but hey, it's just a few benchmarks on the old GT710 compared with the GTX 1650 in my Proliant system.

With the GTX 1650 you actually can game on a Proliant server :)

A few notes to keep in mind:
The PCI-e slot on the Proliant may be x16 in length, but only x8 lanes are wired. Those lanes are also Gen 2 (PCI-e 2.0), while the card works best in a full x16 slot at Gen 3 (PCI-e 3.0), so it's going to affect performance a bit; but with a low-to-midrange card, I guess not too much.
I did some tests on an Ivy Bridge i5 (so full x16 Gen 3 support) and it was obviously faster (didn't record it).
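As a rough indication of what that slot limitation means (my own back-of-the-envelope estimate, not something measured in the video), the per-direction link bandwidth can be worked out from each generation's transfer rate and line encoding:

```python
# Rough per-direction PCI-e bandwidth estimate: transfer rate times line-code
# efficiency, ignoring protocol overhead (so real-world figures are a bit lower).
GT_PER_LANE = {2: 5.0, 3: 8.0, 4: 16.0}               # GT/s per lane per generation
ENCODING    = {2: 8 / 10, 3: 128 / 130, 4: 128 / 130}  # line-code efficiency

def bandwidth_gb_s(gen: int, lanes: int) -> float:
    """Usable one-direction bandwidth in GB/s."""
    return GT_PER_LANE[gen] * ENCODING[gen] * lanes / 8

for gen, lanes in [(2, 8), (3, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: ~{bandwidth_gb_s(gen, lanes):.1f} GB/s")
```

That works out to roughly 4 GB/s for the Proliant's Gen 2 x8 slot versus roughly 15.8 GB/s for a Gen 3 x16 slot; a low-midrange card like the GTX 1650 rarely pushes enough data to saturate even the smaller link, which is why the hit stays modest.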

To run Windows 10 properly on the Proliant, I had to set 'Power Regulator' to 'OS controlled' in the BIOS; otherwise the CPUs remained stuck at 1600MHz (instead of 2.53GHz). However, turbo boost doesn't seem to work, as it never went to 2.66GHz like it used to do in Windows 7.
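If you want to check what clock the CPUs are actually reporting under Windows without rebooting into the BIOS, here's a minimal sketch (it assumes the third-party psutil package is installed; Task Manager's Performance tab shows the same figure, and on some systems psutil reports the nominal rather than the live clock):

```python
# Minimal check of the reported CPU clock on Windows 10.
# Assumes 'pip install psutil'; the 'current' value may be nominal on some systems.
import psutil

freq = psutil.cpu_freq()
print(f"current: {freq.current:.0f} MHz, max: {freq.max:.0f} MHz")
# Stuck around 1600 MHz -> power management is capping the CPUs;
# around 2533 MHz -> the Xeons are running at their base clock.
```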

The Crysis demo used is the pre-launch demo. It doesn't have any optimisation patches and it has a few quirks. Which screen resolutions are selectable, for example, varies per system: even though both the card and the monitor can do 1080p, that resolution may not show up in the list. Most systems couldn't go higher than 1680x1050 (the other options simply weren't there), and sometimes it wouldn't even offer widescreen (even with the same monitor and cards that handle 1080p just fine).
The same system (the ML350 in the video), same card (GTX 1650) and same monitor (1080p) would only do 1680x1050 under Windows 7, but could do 1080p under Windows 10.
Weird, and so far, this has been the only game that did this.
I made a save game right before an in-game cinematic, so it's easier to compare cards while the scenes are exactly the same (it's rendered in-game, not a pre-rendered cinematic).

If you're wondering how I connected the SSD: it's on the motherboard's onboard SATA controller, not the SAS bays in the front. You have to change the boot order in the BIOS (place the SATA controller above the SAS controller), but other than that it boots fine.
The Kingston is a cheap SATA-based drive, but still much quicker than a conventional mechanical hard drive. I was surprised that even a cache-less SSD was so much quicker than RAID 0 10K rpm hard drives! (Benchmarks of SAS drives might show high numbers, but when loading an OS or games, the SSD was clearly faster and more responsive.)
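The gap is mostly down to random access latency rather than raw sequential throughput. As an illustration (my own sketch, not part of the video), you can time a batch of small random reads against a large existing file on each drive:

```python
# Time a few thousand 4 KiB reads at random offsets in an existing large file.
# TEST_FILE is a placeholder path: point it at a multi-GB file on the drive
# you want to test. Results are indicative only (OS caching is not bypassed).
import os, random, time

TEST_FILE = r"C:\temp\testfile.bin"   # hypothetical path, pick your own
BLOCK, COUNT = 4096, 2000

size = os.path.getsize(TEST_FILE)
fd = os.open(TEST_FILE, os.O_RDONLY | getattr(os, "O_BINARY", 0))
start = time.perf_counter()
for _ in range(COUNT):
    os.lseek(fd, random.randrange(0, size - BLOCK), os.SEEK_SET)
    os.read(fd, BLOCK)
elapsed = time.perf_counter() - start
os.close(fd)
print(f"{COUNT} random 4 KiB reads in {elapsed:.2f} s "
      f"({elapsed / COUNT * 1000:.2f} ms per read)")
```

On a mechanical drive every one of those reads costs a head seek of several milliseconds; on even a cheap SATA SSD it's a fraction of a millisecond, which is exactly what you feel when booting the OS or loading a game.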

"Why not test game this, or game that?!" Honestly, I haven't got many PC games, and not going to buy them just to run them for benchmarks. Some games have account requirements, limited installs, or stuff like building up shader cache (stuttery mess, and then switching card it has to rebuild the cache making comparing components long boring and anoying). Tried something like 3dMark06 (yeah old, but I can tell, a GT710 struggles with that one too), but it wouldn't do the CPU tests so you never get a score out of it (3dmark doesn't seem to like dual cpu or something).

"What about the Radeon RX6400?" Interesting card...sort of (low power, no extra power connector needed), but at the time I bought the 1650, the RX6400 was in the same price range or more expensive (it dropped in price now). It also has the problem of having only 4 lanes on the card itself, which greatly affects performance on anything that isn't PCI-e gen 4 or above. On a Gen4 board, it's about the same as the GDDR5 version of the 1650, but on gen 3 performance drops a lot, and gen 2... well, performance tanks.
