Project:Infrastructure/Shopping list

This page is used to list things on infra's shopping list. The goal is to solicit feedback from the community on available options and pricing.

Executive Summary
Replace dead and loaner network hardware with new systems capable of supporting future needs.

Rationale
We have a temporary loaner switch at the Oregon State University Open Source Lab (OSUOSL) because the Cisco WS-C2970G-24T-E switch purchased in 2011 failed during 2020. OSUOSL wants its loaner back, so we need new switchgear. The old switch provided multiple VLANs, covering both OOB/IPMI and public/normal traffic.

Additionally, the hypervisor systems have high-bandwidth needs and presently use only direct cross-over 10Gbit links, making it impossible to expand to more hypervisor systems or offer 10Gbit-or-better service to other hardware.

See the Discussion page for notes on potential FS.com switches.

Requirements:

 * At least 12x 10GBASE-T ports
 * Transceiver-based ports (SFP/SFP+/QSFP+/SFP28): not presently in use, but possibly wanted for future expansion
 * Present 10GBASE-T hosts:
   * oriole & ovenbird each have 2x 10GBASE-T
   * catbus has 4x 10GBASE-T
 * Future hosts:
   * Ganeti replacements (2-3 hosts): at least 2x (10GBASE-T or SFP+ or SFP28 or QSFP28)
 * Remote management
 * VLAN support

Nice to have:

 * Bonding/trunking
 * Redundant PSU: this only guards against a PSU failure; we don't have redundant power feeds in the rack (both breakers come from the same circuit)
 * 25G/40G options

Proposal

 * 1x FS.com S5860-20SQ (high-speed systems) - $1600 USD
 * One of:
   * 1x FS.com S3910-24TF (OOB/IPMI/embedded/non-10G systems) - $369 USD
   * 1x FS.com S3910-24TS (OOB/IPMI/embedded/non-10G systems; adds 10G uplink) - $769 USD
 * Various SFP+/SFP28/QSFP28 transceivers (DAC/AOC/regular)
   * Upper bound: 26 x $65 USD = extra $1690 (20x SFP+, 4x SFP28, 2x QSFP+)
   * Slashbeast recommends using DAC if we just want 10Gbps; e.g. https://www.fs.com/products/21254.html?attribute=1318&id=222508
   * About $15 per 10G DAC instead of $65 for a transceiver, but we need SFP+-compatible kit.
   * More for 25G/40G DAC options.
 * Optical cabling
   * Wild estimate: $500 USD total
 * Estimated total: $4159-$4559 USD
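The cost range above can be sanity-checked with a quick calculation; all figures are the list prices quoted in this proposal, and the variable names are just illustrative labels:

```python
# Sanity-check the proposal cost range using the list prices quoted above.
S5860_20SQ = 1600        # high-speed switch
S3910_24TF = 369         # OOB switch, no 10G uplink
S3910_24TS = 769         # OOB switch with 10G uplink
TRANSCEIVERS = 26 * 65   # upper bound: 20x SFP+, 4x SFP28, 2x QSFP+
OPTICAL_CABLING = 500    # wild estimate

low = S5860_20SQ + S3910_24TF + TRANSCEIVERS + OPTICAL_CABLING
high = S5860_20SQ + S3910_24TS + TRANSCEIVERS + OPTICAL_CABLING
print(f"${low}-${high} USD")  # -> $4159-$4559 USD

# If ~$15 DAC cables replace ~$65 10G transceivers, each swapped
# port saves about $50; 25G/40G DAC pricing is higher.
dac_saving_per_port = 65 - 15
```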

Computing
We have some aging hardware at the OSUOSL. We need generic computing boxes.

Goal
We don't need the fastest, beefiest boxes. Primarily we need:
 * Fast storage (SSD/NVMe): almost all our operations are IO-bound.
 * Power-optimized: we are constrained on power at this hosting location, so we don't want the fastest power-hungry Threadripper; we would rather take a low/mid-range part with better power curves.
 * Space-optimized: 1-2U preferred.
 * We'd rather have 4-5 moderately sized units than 1-2 huge beefy units, especially with Ganeti, where the failure domain can be contained.

Hard requirements

 * 1) 10Gbit Ethernet ports (preference for 10GBASE-T, since there are no SFP+ systems in place so far)
 * 2) >5TB usable storage on each system (after any RAID [MD, LVM, ZFS])
 * 3) 15T of storage per system, pre-RAID.
 * 4) 4 or more NVMe slots: U.2 (2.5") / E1.S / E1.L / E3.* (Robin suggests something like https://ark.intel.com/content/www/us/en/ark/products/186674/intel-ssd-d5p4326-series-15-36tb-2-5in-pcie-3-1-x4-3d2-qlc.html)
 * 5) Dual PSU
 * 6) OS boot disk: if the chassis has dedicated 2x M.2 slots, populate them with OS disks.
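How requirement 2 (>5TB usable) relates to requirement 3 (15T pre-RAID) depends on the RAID layout. A rough sketch of the idealized capacity math (the helper and the 4T example drive size are illustrative, not a spec for any drive on this page; real MD/LVM/ZFS overheads are not modeled):

```python
# Idealized usable-capacity estimates for common RAID layouts.
def usable_tb(disk_tb: float, n_disks: int, level: str) -> float:
    raw = disk_tb * n_disks
    if level == "raid1":      # mirror: half of raw
        return raw / 2
    if level == "raid5":      # one disk's worth of parity
        return disk_tb * (n_disks - 1)
    if level == "raid10":     # striped mirrors: half of raw
        return raw / 2
    raise ValueError(level)

# Example: 4x 4T NVMe (hypothetical sizing) in RAID5
print(usable_tb(4, 4, "raid5"))   # -> 12, comfortably above the 5TB floor
```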

Migrate to VM:

 * vulture.gentoo.org (GSOC box): a Dell SC1435, 4GB RAM, dual Opteron 2210, 2x ST3500630NS (500GB 7200 RPM spinning rust).

Replacing:

 * jacamar.gentoo.org (CI box)
   * whitebox Supermicro
   * 64GB RAM
   * dual Opteron 6272s
   * 2x Samsung 860 1T (RAID1)
   * 4x WD 1T (RAID10)
   * 2x WD mixed (old disks from other hosts)
   * sucks power
   * writes 100GiB/day


 * dipper.gentoo.org (masterdistfiles.gentoo.org)
   * Dell PowerEdge R415
   * 2x 6-core Opterons
   * 32G RAM
   * 8T usable disk available, ~5T used
   * 4x Seagate ST33000651AS in hardware RAID5
   * writes 200GiB/day


 * oriole.gentoo.org & ovenbird.gentoo.org (hypervisors in Ganeti cluster)
   * whitebox Supermicro (2U SuperServer 2028TP-DECTR)
   * 128GB RAM (each)
   * 2x E5-2620 v4 (each)
   * Disks (each):
     * 2x Samsung 850 1T (RAID1)
     * 4x Seagate ST2000NX0273 2T (mixed RAID1 LVs over the disks)
   * ~5T worth of data present
   * writes 1TiB/day (each)
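The daily write figures above matter when picking NVMe drives, since SSDs are rated for a total write volume (TBW). A quick way to turn GiB/day into expected drive life; the 2800 TBW figure is a hypothetical example, not a rating of any drive mentioned on this page:

```python
# Estimate years of drive life from daily writes and rated endurance (TBW).
# Ignores write amplification, which would shorten these figures.
def years_to_tbw(gib_per_day: float, rated_tbw: float) -> float:
    tb_per_day = gib_per_day * 2**30 / 1e12   # GiB/day -> decimal TB/day
    return rated_tbw / tb_per_day / 365

# oriole/ovenbird each write ~1TiB/day (1024 GiB)
print(round(years_to_tbw(1024, 2800), 1))  # -> 7.0
```

Writes would also be spread across the RAID members, so per-drive wear depends on the chosen layout.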

Recommendations (unaudited)

 * kitten suggests just going with the Dell R700-series (both Intel and AMD available). Basic configurations are about $4-5k per unit: fancy enterprise boxes, with processors included, but the $4-5k does not cover disks or RAM.
 * Slashbeast suggests going more workstation-class (rather than server). Barebones: https://www.newegg.com/asrock-rack-1u4lw-x570-2l2t-amd-ryzen-5000-series-desktop-processors-amd-ryzen-4000-g-series-desktop/p/N82E16816775021 - we'd have to price out the other components.
 * https://www.dell.com/en-us/work/shop/servers-storage-and-networking/poweredge-r6515-rack-server/spd/poweredge-r6515/pe_r6515_13732_vi_vp?configurationid=0c73bea1-bab9-4099-9124-70ad7a48f921 (barebones Dell)