What are you running here?
AI bots that use integrated graphics
I’m sure you could replace all of these with a single mid-range CPU lmao.
Yes, but I run integrated graphics on each and it’s required.
But why?
To run the required software
You could still do that with a single powerful desktop?
10" racks
Consolidating power might not be possible at 12 V; even 20 amps only covers 4-6 computers. Just get 10" PDUs, they have 3 plugs each. I know, there are 17 computers and 3 switches, so that’s like 7 PDUs…
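The "20 amps is 4-6 computers" math above can be sketched out. This is a rough sketch assuming each mini PC draws somewhere around 3-5 A at 12 V under load; the per-unit figures are guesses, so measure your own boxes before sizing anything:

```python
# Rough power-budget math for a single 12 V / 20 A rail.
# ASSUMPTION: each mini PC draws roughly 3-5 A at 12 V (36-60 W)
# under load -- check the actual power bricks before buying.

RAIL_AMPS = 20               # assumed 12 V / 20 A supply
DRAW_LOW_A, DRAW_HIGH_A = 3, 5  # assumed per-machine draw, amps

units_best = RAIL_AMPS // DRAW_LOW_A    # optimistic: light load
units_worst = RAIL_AMPS // DRAW_HIGH_A  # conservative: peak load

print(f"{units_worst}-{units_best} machines per 20 A rail")
```

which lines up with the 4-6 computers per supply mentioned above, and is why a full 17-machine setup needs either several rails or one much bigger PSU.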
20 shelves
2x 10u 10" racks…
You’ll need to keep 10U empty to allow for expandability.
Do you have some amazon links or pic links for those?
https://www.serverrack24.com/10-inch-12u-server-rack-with-glass-door-312x310x618mm-wxdxh.html
I don’t have USA links sorry but that’s the general gist
Serious question: what is this for…
I run certain AI bots that require integrated CPU graphics to run 24/7.
How to make this better?
Replace it with a real server and virtualize the snot out of it.
Won’t work, I’m running integrated graphics on each separately.
Meh, this isn’t what it seems. OP is just being cute. My guess is that he is imaging these hosts from a WDS server or similar. I’ve done deployments that mimic this setup exactly (though considerably more neatly), and it’s very typical to misconstrue what is actually happening at a glance.
I run AI bots that require graphics.
I was up against this last year: 130 Lenovo M75q’s, and how to fit all 130 into a 42U rack including power bricks. The solution was 3D printing.
Any pics of this you could share?
Just close the door.
What door?
Get a mini rack system that fits on the big shelf and stack up the box things. Then go from there. And tie wraps for the power cables.
What about the constant 85-degree heat that it’s emitting?
Table top fan obviously
How will this help with the heat without causing noise at the same time?
you can replace the 17 power supplies with a server PSU… these should all be 12 V DC input with standard 2.1 x 5.5 mm barrel connectors…
flip them up onto their faces, with the rear facing up. if you want to get fancy you could even make a base that integrates a little wedge and a rod for the power button.
get at least a 24 port switch and micro ethernet cables.
this could be cleaned up to a single row on one shelf with no visible wires other than the ethernet and power lead running up and to the back. 1 switch under the machines… 1 PSU instead of 17… even one power cord…
or a little cable management at the very least…
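Sizing the single server PSU suggested above can be sketched the same way. This assumes a worst-case draw of about 5 A per mini PC and a 20% headroom margin; both numbers are guesses for illustration, not measurements of these specific machines:

```python
# Sketch: sizing one 12 V server PSU to replace 17 individual bricks.
# ASSUMPTIONS: ~5 A peak draw per mini PC, 20% safety headroom --
# measure real load before committing to a PSU.

MACHINES = 17
PEAK_AMPS_EACH = 5   # assumed worst-case per-unit draw at 12 V
HEADROOM = 1.2       # 20% margin over combined peak

required_amps = MACHINES * PEAK_AMPS_EACH * HEADROOM
required_watts = required_amps * 12

print(f"PSU should supply >= {required_amps:.0f} A "
      f"({required_watts:.0f} W) at 12 V")
```

under those assumptions you land around a 100 A / 1200 W class 12 V supply, which is exactly the territory where a used server PSU is far cheaper than 17 separate bricks.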
I didn’t know about server PSUs, so I’ll check that out. 24-port switches are expensive; all 3 of my switches together are far cheaper than those big switches, though I’m not sure whether adding more 8-port switches means I’ll be daisy-chaining.
your clients are 1 GbE… you can get a brand-new unmanaged 24-port GbE switch for $50/£56, a fraction of the cost of one of those machines… https://www.amazon.co.uk/Tenda-Ethernet-Internet-Splitter-TEG1024D/dp/B09DPLVLPY/ -- not to mention you can get a used managed switch for less than half that.
if £56 is going to break the bank, then get some double-sided velcro and clean up your mess. wish you had mentioned your obscenely limited budget…
Budget is not an issue, but noise is. I sleep next to those machines; the only thing between me and them is the desk you see on the right. Looking at that large switch, I think it would have fans and such that would make a lot of noise.
the switch I linked to is fanless.
This looks like a problem that would be solved cheaper with less mess by a used Lenovo Thinkstation P720, a pair of Xeon Gold 6140s, a dozen 32GB DIMMs, a 4TB NVMe, and a copy of VMware.
Why on earth do you have so many minis?
But can it be solved if I need to run graphics on each of those devices, and one virtualized GPU isn’t good enough?
I only have a passing interest in homelab stuff, but I just wanted to say that I got really excited because I thought those were a bunch of PS2s. Thought this was gonna be some weird FFXI or Battlefront LAN system.
Well, at least they all do run some graphics, so there’s some truth to that.