Bootstrap failing

The web UI upgrade told me I needed to upgrade from the command line, so I’m trying a bootstrap, but it’s failing. I’ve tried commenting out all the plugins apart from docker_manager, in case one of them was the problem, but no change. Here’s the end of the output; any ideas?
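
For reference, this is roughly what I ran; a sketch assuming a standard /var/discourse install with the container config at containers/web_only.yml (file and container names may differ on other setups):

cd /var/discourse
# comment out the plugin "git clone" lines under the after_code hook
# in containers/web_only.yml, leaving only docker_manager, then:
./launcher bootstrap web_only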

I, [2022-02-20T00:22:02.802120 #1]  INFO -- : > cd /var/www/discourse && [ ! -d 'node_modules' ] || su discourse -c 'yarn install --production && yarn cache clean'
I, [2022-02-20T00:22:02.833608 #1]  INFO -- :
I, [2022-02-20T00:22:02.835084 #1]  INFO -- : > cd /var/www/discourse && su discourse -c 'bundle exec rake plugin:pull_compatible_all'
WARNING: Plugins were activated before running `rake plugin:pull_compatible_all`
  You should prefix this command with LOAD_PLUGINS=0
I, [2022-02-20T00:22:17.738184 #1]  INFO -- : docker_manager is already at latest compatible version

I, [2022-02-20T00:22:17.739383 #1]  INFO -- : > cd /var/www/discourse && su discourse -c 'bundle exec rake db:migrate'
I, [2022-02-20T00:22:32.881239 #1]  INFO -- :
I, [2022-02-20T00:22:32.882131 #1]  INFO -- : > cd /var/www/discourse && su discourse -c 'bundle exec rake themes:update assets:precompile'
sh: 1: yarn: not found
I, [2022-02-20T00:22:42.298037 #1]  INFO -- : Checking 'Hamburger Theme Selector' for 'default'... up to date
Checking 'discourse-category-banners' for 'default'... up to date
Checking 'Hamburger links component' for 'default'...


FAILED
--------------------
Pups::ExecError: cd /var/www/discourse && su discourse -c 'bundle exec rake themes:update assets:precompile' failed with return #<Process::Status: pid 5174 exit 127>
Location of failure: /pups/lib/pups/exec_command.rb:112:in `spawn'
exec failed with the params {"cd"=>"$home", "hook"=>"assets_precompile", "cmd"=>["su discourse -c 'bundle exec rake themes:update assets:precompile'"]}
20aaf3f585012c6c3468f5a0f408f8546bc0a13db1da166771719e0197b4dab6
** FAILED TO BOOTSTRAP ** please scroll up and look for earlier error messages, there may be more than one.
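
If it helps: exit 127 is the shell’s “command not found” status, which matches the sh: 1: yarn: not found line a little further up, so the themes:update / assets:precompile step can’t find yarn. A rough check I can run, assuming my container is named web_only:

cd /var/discourse
./launcher enter web_only       # opens a shell inside the running container
su discourse -c 'which yarn'    # prints a path if yarn is installed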

Is this a standard install?

How much RAM and swap? What does

   df -h

say?

Yes, it was originally installed using discourse-setup, unless my mind is going. It was then split into separate data and web_only containers following the instructions on here, and nowadays it also has an nginx proxy in front of it. It has successfully bootstrapped in the past, though.

$ free -mh
              total        used        free      shared  buff/cache   available
Mem:          3.8Gi       1.8Gi       790Mi       278Mi       1.2Gi       1.4Gi
Swap:         2.0Gi       436Mi       1.6Gi
$ df -h
Filesystem      Size  Used Avail Use% Mounted on
udev            1.9G     0  1.9G   0% /dev
tmpfs           386M   40M  346M  11% /run
/dev/sda1        38G   21G   16G  58% /
tmpfs           1.9G   96K  1.9G   1% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
tmpfs           1.9G     0  1.9G   0% /sys/fs/cgroup
/dev/sdb         79G   36G   44G  46% /var/discourse
/dev/sdc         42G   41G  719M  99% /mnt/HC_Volume_3697529
tmpfs           386M     0  386M   0% /run/user/1002
/dev/sdd         61G   53M   58G   1% /var/discourse/shared/web-only/backups

Okay, I really should avoid doing this stuff when I’m a little tired, as it leads to doing totally stupid things, like completely missing out the git pull and blanking on that fact no matter how many times I looked :person_facepalming:!

I wondered briefly if there could be something odd with the deprecated alt-logo theme component (which I forked just before the max-version was committed, since I still have to use it), but thankfully not.

Sorry to have bothered you for something so silly & obvious!
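
For anyone who hits the same thing: the fix was simply updating discourse_docker itself before bootstrapping again, roughly:

cd /var/discourse
git pull                        # the step I’d missed
./launcher bootstrap web_only

(./launcher rebuild web_only should do the bootstrap plus the container restart in one go.)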

Oops!

It’s good to keep the knowledge around; someone else may hit this same problem.

Sounds like you did these too, but also make sure that you

git checkout main

That one got me recently.
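
Something like this, assuming discourse_docker is checked out at /var/discourse:

cd /var/discourse
git status           # shows which branch you’re on
git checkout main    # the default branch is now main rather than master
git pull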

And if you haven’t done a rebuild of data lately, you’ll get an error about Redis not being up to date.
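
Roughly, for a two-container setup like this one (rebuild data first, since it holds PostgreSQL and Redis):

cd /var/discourse
./launcher rebuild data         # data container: PostgreSQL and Redis
./launcher rebuild web_only     # then the web container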

I didn’t get any errors about Redis, but I had planned on doing a data rebuild anyway as it had been a little while, which I’ve now done today. Thanks for the suggestion, though.

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.