
Retiring the Raspberry Pi cluster #3102

Closed

rvagg opened this issue Nov 28, 2022 · 16 comments

@rvagg (Member) commented Nov 28, 2022

I propose that it's time to retire the Raspberry Pi cluster. It has been an excellent addition to the CI pipeline, it has had a very good life, and it helped us solve a lot of ARM problems; but I believe it's time to let it go.

Currently I think it's only being used for test runs for <18, which should just be Node.js 16; that release is now in maintenance and is due to retire in 10 months. We're now primarily doing ARMv7 testing emulated (I believe).

Aside from the lack of use, the reasons I'd like to retire it are:

  • Maintenance burden, which hasn't been so bad recently since the cluster gets so little use (I haven't had to tinker with the machines for a few months), but the longer they hang on the more complicated it gets, because they are running ageing setups that are increasingly difficult to keep working
  • Their electricity usage isn't free, I pay for that, and electricity is getting more expensive here
  • Security has always been a concern since they sit inside my home network--they're well isolated, but it's still not an awesome setup for a number of reasons
  • Space and a desire to tidy up my tech setup--I have too much tech going on in my garage and the bulk of it exists to support this cluster

The status is something like this:

  • We're still running Pi 2s and Pi 3s. 7 of the 2s are currently up and 5 of the 3s are currently up (there would be more if I were tending to them more carefully!), see https://ci.nodejs.org/label/pi2-docker/ and https://ci.nodejs.org/label/pi3-docker/
  • I have a pile of 1B+s that we retired a couple of years ago; I've been cannibalising their SD cards to keep the remaining Pis up as cards get corrupted
  • The original setup of this cluster came entirely from community contributions. I kept a spreadsheet tracking all of this and could dig up details if it's useful for anyone; I and many community people purchased the individual Pis, and some people contributed funds to purchase basic network switch and power equipment to run it. There was also a $1000 (IIRC) contribution from Dav Glass, which we slowly burned through on some initial setup (cables mainly) and then on ongoing SD card purchases. We've used that up now, and in the process we burned out some hardware on top of the SD cards (I think I burned out a switch that I contributed, and also an SSD that was originally used for the NFS root for them all; there are also some dud Pis that are either not usable or lock up quickly after boot & use). They continue to use NFS boot & root off another server in my garage (see the sketch after this list), but I'm also keen to retire that computer because it's not doing much else anymore.
  • The most recent "upgrade" involved replacing all of their cases with 3D-printed stacks, which make them much nicer to organise. I think I posted about that here somewhere; that was fun.
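
A minimal sketch of what the NFS boot & root arrangement described above typically looks like on a Pi; the server address, export path, and options here are hypothetical placeholders, not the cluster's actual configuration:

```
# On the NFS server, /etc/exports publishes one root filesystem per Pi
# (hypothetical address and path):
/srv/nfsroot/pi2-01  192.168.1.0/24(rw,sync,no_subtree_check,no_root_squash)

# On each Pi's SD card, /boot/cmdline.txt points the kernel at that export,
# so the corruption-prone card is only needed for the early boot stage:
console=serial0,115200 root=/dev/nfs nfsroot=192.168.1.10:/srv/nfsroot/pi2-01,vers=3 ip=dhcp rootwait
```

This is also why a single failed SSD on the NFS server could affect the whole cluster: every Pi's root filesystem lived on it.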
@sxa (Member) commented Nov 28, 2022

I feel it's a bit sad to see them go, but I completely understand the reasons for this (especially energy consumption at the moment), and we're now doing a lot of testing in arm32 containers on our arm64 hosts, which is a LOT faster. (Note for clarity: it's actually better than just emulation; the tests run natively in 32-bit Docker containers on our arm64 systems, whose CPUs still support the 32-bit armv7l instructions.) Ref: #2775

The only real disadvantage of doing it exclusively in the containers on arm64 hosts is that we're not running on a 32-bit kernel, which could theoretically surface unique problems. If we considered that a problem and wanted to retain something "real" for the purposes of problem reproduction, then providers such as mythic-beasts in the UK can provide Pi 3 hosting from £57.50/yr, if we really wanted to keep one going. Although there are probably enough Pis out there that we could find someone to diagnose on if needed! I host a couple of ODROID-HC1 (8-core arm32) systems with SSDs for another OSS project at home to cover this scenario and yeah, they can be a bit of a pain when they go down! Luckily those two systems only consume about 10W of power.
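
As an illustration of the container approach described above, here is a minimal sketch, assuming Docker and the official arm32v7 images (the actual images and jobs used in CI may differ):

```sh
# Run a 32-bit armv7 userland natively on an arm64 host; no qemu involved,
# because the CPU itself executes the armv7l instructions.
docker run --rm --platform linux/arm/v7 arm32v7/debian:bullseye \
  sh -c 'dpkg --print-architecture && uname -m'
# Prints "armhf" for the userland; uname may report armv7l or armv8l
# depending on kernel personality, since the kernel underneath is still 64-bit.
```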

@sxa (Member) commented Nov 28, 2022

I'd also like to say a very big thank you to you for setting this up and maintaining it over the years!

@richardlau (Member)

Also echoing a big thank you for hosting these for so long! And of course to everyone who sponsored/donated the Pis in the cluster.

In terms of practicalities, the easiest first step would be to disable the https://ci.nodejs.org/job/node-test-binary-arm-12+/ job. I think https://ci.nodejs.org/job/git-nodesource-update-reference/ could also go, but we'd need to double-check that it won't break https://ci.nodejs.org/job/node-test-binary-armv7l/ (the job that runs in the containers). I suspect it won't, but I did copy it from the 12+ job, so it may have references to it.

> Currently I think it's only being used for test runs for <18, which should just be Node.js 16; that release is now in maintenance and is due to retire in 10 months. We're now primarily doing ARMv7 testing emulated (I believe).

FWIW Node.js 14 is still in maintenance until April.

@jasnell (Member) commented Nov 28, 2022

Yeah sad to see it go but completely understandable. Thanks for keeping these going all these years.

@mcollina (Member)

This is the end of an era ❤️

@rvagg (Member, Author) commented Dec 1, 2022

@richardlau I've disabled git-nodesource-update-reference because it looks safe to do so; it only runs on the pi3-docker label. I don't see an equivalent for the node-test-binary-armv7l job, but I think that just means we don't get the bare git clone reference updated on a regular interval, which was mainly useful to speed up checkouts on the slow Pis (although I'm doubtful that was even helpful, since we were cloning the cross-compile build anyway and not nodejs/node anymore!)
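
For context, a minimal sketch of the reference-clone technique that job supported; the cache path is a hypothetical placeholder, and as noted above the test jobs had moved on to cloning the cross-compile build rather than nodejs/node:

```sh
# A bare cache repo lives on the worker and is refreshed on a schedule
# (roughly what git-nodesource-update-reference did):
git clone --bare https://github.com/nodejs/node.git /srv/git-cache/node.git
git -C /srv/git-cache/node.git fetch origin

# Jobs then clone against the cache, so most objects come from local disk
# rather than the network, which mattered on the slow Pi SD cards:
git clone --reference /srv/git-cache/node.git \
  https://github.com/nodejs/node.git workspace
```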

@mhdawson (Member) commented Dec 2, 2022

I'll echo the big thanks for running the cluster, and your suggestions make sense to me.

@rvagg (Member, Author) commented Feb 8, 2023

Reopening because the inventory still needs to be cleaned up.

@targos (Member) commented Apr 13, 2023

Inventory cleaned up in #3303 and #3308

Maybe one last step before closing this: Delete vagg-arm.nodejs.org from the DNS config.

@nschonni (Member)

Could probably remove the picture at the top of the README too: https://github.com/nodejs/build#readme

@mhdawson (Member)

@rvagg a big thanks for hosting the cluster over all these years. I know it took a lot of time/effort and really appreciate it.

@targos (Member) commented Apr 17, 2023

I just removed the DNS record for vagg-arm.nodejs.org
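
A quick way to verify, assuming the dig utility (a hypothetical check, not a command from the thread):

```sh
# Should print nothing once the record is gone and caches have expired:
dig +short vagg-arm.nodejs.org
```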

@github-actions (bot)

This issue is stale because it has been open many days with no activity. It will be closed soon unless the stale label is removed or a comment is made.

The github-actions bot added the stale label Feb 12, 2024
@sxa (Member) commented Feb 12, 2024

I'm personally happy with closing this unless @rvagg wants additional actions relating to clearing out his garage ;-)
Feel free to reopen if desired.

@sxa closed this as completed Feb 12, 2024
@rvagg (Member, Author) commented Feb 12, 2024

🤷 I have a pile of Pi's in my garage now that are mostly from donations; they're old enough to not be all that useful, but they're fresh enough for me to feel uncomfortable binning them. So I'll hang on to them for a while longer in case someone around here wants to try and repurpose them for something.

Glad to simplify my setup and turn my power bill down a fraction though!

@sxa (Member) commented Feb 12, 2024

Side note: I didn't realise the ODROID-XU3 was octacore like the XU4 (I have an XU4 and a couple of HC1s, which are pretty much the same but with no video and a SATA slot).
