88 Commits

99b324fb17 cmd/mbf: update pkgserver title text
This makes more sense for its purpose.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-17 18:57:26 +09:00
kat
6f50811dc9 cmd/mbf: bring back pkgserver's favicon!
It existed in mae's #33, but ozy was not satisfied with including
a binary file identical to https://hakurei.app's favicon, and hence
removed it. However, it is possible to explicitly specify the favicon
with a link tag [1]; provided the content security policy is not too
strict, this should work fine.

[1]: https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/Attributes/rel#icon
2026-05-17 19:47:42 +10:00
6b87bac401 cmd/mbf: clone pkgserver order slices
These are no longer arrays, so they must be cloned before sorting.
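
The hazard can be sketched in Go; cloneSorted and the example values are illustrative, not the actual pkgserver order slices:

```go
package main

import (
	"fmt"
	"slices"
)

// cloneSorted returns a sorted copy of in, leaving in untouched.
// Sorting a shared slice in place would be visible to every other
// holder of that slice, which arrays (copied by value) never suffered.
func cloneSorted(in []string) []string {
	out := slices.Clone(in)
	slices.Sort(out)
	return out
}

func main() {
	order := []string{"hakurei", "azalea", "rosa"}
	fmt.Println(cloneSorted(order)) // [azalea hakurei rosa]
	fmt.Println(order)              // [hakurei azalea rosa]
}
```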

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-17 18:30:57 +09:00
a967aa3b6e internal/rosa/kernel: arch-specific headers checksum
These headers differ by target architecture.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-17 18:01:11 +09:00
38bc2c7508 internal/rosa: pass stage alongside state
This cleans up many function signatures.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-17 17:50:30 +09:00
30eb0d6a61 internal/rosa: key metadata by string
For upcoming azalea integration. The API is quite ugly right now to ease migration.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-17 15:56:53 +09:00
c2ff9c9fa5 internal/rosa/azalea: evaluator
Performance is sufficient for the use case, even though this is the lowest-effort approach I could come up with:

BenchmarkParse-128        	   55100	     21494 ns/op
BenchmarkEvaluate-128     	  131670	      9248 ns/op
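
Those figures come from the standard go test benchmark harness; the same machinery can be driven programmatically, sketched here with a stand-in evaluate since the azalea API is not part of this log:

```go
package main

import (
	"fmt"
	"strings"
	"testing"
)

// evaluate is a stand-in for the azalea evaluator; the real API is
// not shown here, so this just does token-counting busywork.
func evaluate(expr string) int {
	return strings.Count(expr, "&&")
}

func main() {
	// testing.Benchmark drives the closure exactly like `go test -bench`
	// does, producing the iteration count and ns/op figures quoted above.
	r := testing.Benchmark(func(b *testing.B) {
		for i := 0; i < b.N; i++ {
			evaluate("a && b && c")
		}
	})
	fmt.Println(r.N > 0, r.NsPerOp() >= 0) // true true
}
```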

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-17 12:44:34 +09:00
d38d306147 internal/rosa/azalea: ast and parser
This syntax is not final, but acts as a stopgap solution and a proof of concept.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-16 14:29:28 +09:00
c32c06b2e8 internal/rosa: mesa artifact
This has many dependencies.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-14 05:12:35 +09:00
61199f734c internal/rosa/glslang: remove headers prefix
Upstream maintainers tried to be clever here, which breaks CMake paths.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-14 04:57:38 +09:00
87cf0d4e6b internal/rosa/mesa: libva artifact
Required by mesa.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-14 04:40:04 +09:00
cf0dffa0f5 internal/rosa/mesa: libglvnd enable glx
Required to break circular dependency.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-14 04:35:50 +09:00
686d7ec63a internal/rosa/x: xserver artifact
Required by libglvnd test suite.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-14 04:15:48 +09:00
4c653b1151 internal/rosa/x: xkeyboard-config artifact
Required by xserver test suite.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-14 03:59:22 +09:00
0b0a63d151 internal/rosa/x: libxcb-util-wm artifact
Required by xserver xephyr.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-14 03:46:04 +09:00
6231cfe2aa internal/rosa/x: libxcb-util-image artifact
Required by xserver xephyr.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-14 03:36:45 +09:00
712e80890b internal/rosa/x: libxcb-util artifact
Required by xserver xephyr.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-14 03:25:24 +09:00
3fe7d48014 internal/rosa/x: libxcb-render-util artifact
Required by xserver.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-14 03:09:37 +09:00
16f9d39427 internal/rosa: libepoxy artifact
Required by xserver.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-14 02:16:55 +09:00
c1cd5ba07b internal/rosa: libtirpc artifact
Required by xserver.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-14 02:07:25 +09:00
7b0cd2e472 internal/rosa/x: libXdmcp artifact
Required by xserver.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-14 01:44:37 +09:00
e580307528 internal/rosa/x: libxcvt artifact
Required by xserver.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-14 01:24:00 +09:00
ee1dffb676 internal/rosa/x: libXfont2 artifact
Required by xserver.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-14 01:17:27 +09:00
f095fcf181 internal/rosa/x: font-util and libfontenc artifact
Required by libXfont2.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-14 01:14:12 +09:00
ca8a130130 internal/rosa: freetype artifact
Required by libXfont.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-14 00:54:42 +09:00
0ad6b00e41 internal/rosa/x: xkbcomp artifact
Required by xserver.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 22:24:08 +09:00
ad0f1cf36b internal/rosa/x: libxkbfile artifact
Required by xkbcomp.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 22:22:02 +09:00
b12d924fa2 internal/rosa: pixman artifact
Required by xserver.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 22:07:53 +09:00
c31d8ae41a internal/rosa/x: libXfixes artifact
Required by libva.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 21:36:47 +09:00
6dbbf15c0e internal/rosa: lm_sensors artifact
Generally useful, and an optional dependency of mesa.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 20:11:37 +09:00
be7de68a42 internal/rosa/perl: Test::Cmd artifact
Required by lm_sensors test suite.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 20:05:43 +09:00
a759cf3666 internal/pkg: check exec substitution
This relies on the testtool taking ident as a relevant input, in order to assert that substitution succeeded.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 19:43:53 +09:00
8c2dd3e984 internal/pkg: verify status kind
While it is still impossible to reliably determine the expected contents of these status files, this checks their kind against the expected substitution behaviour.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 19:27:58 +09:00
67038d5af4 internal/pkg: log fault in tests when available
This would otherwise only be available in verbose output, interleaved with everything else.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 18:58:18 +09:00
53d8d12e7f internal/rosa/git: disable flaky test
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 18:51:11 +09:00
7997d79e56 cmd/mbf: display and destroy fault entries
This change extends cmd/mbf commands for working with fault entries.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 19:06:09 +09:00
f2f1726190 internal/pkg: record cure faults
These are useful for troubleshooting. This change records them in a separate directory.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 17:58:18 +09:00
f63203cb0a internal/pkg: populate substitute status
These are not created when taking the fast path, but should be inherited from the alternative.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 16:16:37 +09:00
19555c7670 internal/rosa/gtk: glib 2.88.0 to 2.88.1
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 00:48:38 +09:00
a3beab8959 internal/rosa/libucontext: 1.5 to 1.5.1
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 00:48:15 +09:00
2ea786d6a9 internal/rosa/libbsd: libmd 1.1.0 to 1.2.0
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 00:47:57 +09:00
747d4ec4b0 internal/rosa/libexpat: 2.8.0 to 2.8.1
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 00:47:32 +09:00
b76e6f6519 internal/rosa/tamago: 1.26.2 to 1.26.3
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 00:47:05 +09:00
840d8f68bf internal/rosa/git: disable flaky test
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 00:38:59 +09:00
4bede7ecdd internal/pkg: discontinue DCE resolution on signal
This serves as a stopgap measure to skip long-running DCE resolutions.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 00:29:01 +09:00
487a03b5a3 internal/pkg: deduplicate DCE by ident
This eliminates edge cases where target artifacts do not compare equal.
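
Deduplicating by a string identity rather than by comparing artifact values can be sketched like this; artifact and dedupByIdent are illustrative stand-ins, not the internal/pkg types:

```go
package main

import "fmt"

// artifact is a stand-in: real target artifacts may hold slices or
// maps, which makes direct value comparison fragile or impossible.
type artifact struct {
	Ident string
	Deps  []string
}

// dedupByIdent keeps the first artifact seen per ident string,
// sidestepping value comparison entirely.
func dedupByIdent(in []artifact) []artifact {
	seen := make(map[string]struct{}, len(in))
	var out []artifact
	for _, a := range in {
		if _, ok := seen[a.Ident]; ok {
			continue
		}
		seen[a.Ident] = struct{}{}
		out = append(out, a)
	}
	return out
}

func main() {
	as := []artifact{{Ident: "x"}, {Ident: "y"}, {Ident: "x"}}
	fmt.Println(len(dedupByIdent(as))) // 2
}
```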

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-13 00:18:27 +09:00
8f3c22896a internal/pkg: DCE benchmark unwrap only
This eliminates noise at lower depths.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-12 19:56:59 +09:00
a167c1aba5 internal/pkg: hold artifact in DCE
This is significantly slower but enables much better error reporting.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-12 19:45:25 +09:00
a6008ef68b internal/pkg: benchmark early DCE
This error path has never had decent performance; now is a good time to improve that.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-12 18:59:25 +09:00
5228b27362 internal/rosa/glslang: 16.2.0 to 16.3.0
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-12 17:53:35 +09:00
f00d3a07ad internal/rosa/python: trove-classifiers 2026.4.28.13 to 2026.5.7.17
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-12 17:53:17 +09:00
f9538bc21b internal/rosa/python: 3.14.4 to 3.14.5
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-12 17:52:53 +09:00
6ae5efec56 internal/rosa/gnu: gcc 15.2.0 to 16.1.0
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-12 17:52:31 +09:00
14f4c59c8c internal/rosa/llvm: 22.1.4 to 22.1.5
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-12 17:52:13 +09:00
688d43417b internal/pkg: rename measured exec type
This type is no longer exclusive to KindExecNet.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-12 15:23:33 +09:00
9f8fafa39b internal/rosa: measure kernel headers
This makes version bumps robust and much less tedious.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-12 15:19:57 +09:00
6643cfbeee internal/pkg: optionally measure exec artifact
Useful for verifying deterministic output without enabling network access.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-12 15:11:17 +09:00
dcde38f2e9 internal/rosa/llvm: set exclusive bit
This was missed when improving bootstrap.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-12 15:08:09 +09:00
deebbf6b1a internal/rosa/git: disable more flaky tests
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-12 04:13:02 +09:00
0c557798bc internal/rosa/curl: 8.19.0 to 8.20.0
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-12 04:12:40 +09:00
327e6ed5a2 internal/rosa/kernel: 6.12.84 to 6.12.87
This change also pins the header version constants to the same values, to be updated manually on a real API change. This eliminates rebuilds when bumping the kernel version.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-12 04:05:30 +09:00
76c7a423a9 internal/rosa/git: disable more flaky tests
Again, causing too many spurious failures.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-12 03:18:12 +09:00
6e113b8836 internal/pkg: content-based dependency substitution
This change introduces a new fast path for FloodArtifact. It is taken when a curing artifact's controlled relevant inputs are identical by content to those of an already-cured artifact, and the two are otherwise identical.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-12 00:19:42 +09:00
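The fast path described above amounts to content-addressed lookup: hash the controlled inputs by content, and reuse a cured output when the same key has already been cured. The following is an illustrative sketch only; `inputKey` and the map are made-up stand-ins, not FloodArtifact's real bookkeeping.

```go
package main

import (
	"crypto/sha256"
	"fmt"
)

// inputKey is a hypothetical content key: a hash over an artifact's
// relevant inputs by content, independent of how they were produced.
// (A real implementation would delimit inputs to avoid concatenation
// collisions; omitted here for brevity.)
func inputKey(inputs ...[]byte) [32]byte {
	h := sha256.New()
	for _, in := range inputs {
		h.Write(in)
	}
	var key [32]byte
	copy(key[:], h.Sum(nil))
	return key
}

func main() {
	// cured maps input keys to already-cured outputs (illustrative only).
	cured := map[[32]byte]string{}

	a := inputKey([]byte("toolchain"), []byte("source-1.0"))
	cured[a] = "/cache/out-a"

	// A second artifact whose controlled inputs are identical by content
	// takes the fast path: reuse the cured output instead of rebuilding.
	b := inputKey([]byte("toolchain"), []byte("source-1.0"))
	if out, ok := cured[b]; ok {
		fmt.Println("substituted:", out)
	}
}
```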
ce9f4b5f71 internal/rosa: vim artifact
Very useful for troubleshooting.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-10 21:45:56 +09:00
8f727273ef internal/pkg: add riscv64 sums
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-10 17:12:30 +09:00
d0a63b942e internal/pkg: add arm64 sums
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-10 16:42:42 +09:00
7f2126df32 internal/rosa/hakurei: 0.4.1 to 0.4.2
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-10 16:30:12 +09:00
0cf0e18e35 release: 0.4.2
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-10 16:16:59 +09:00
ee5c0dd135 cmd/dist: optionally skip tests
Works around incomplete syscall translation by qemu.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-10 04:15:07 +09:00
92c48d82e2 internal/rosa/go: respect check flag
These tests are also quite expensive, so optionally skip them.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-10 04:01:06 +09:00
c79a4fe7f8 internal/rosa/stage0: add riscv64 tarball
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-09 10:51:19 +09:00
0aeb2bccfb internal/rosa: libconfig artifact
Required by mesa.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-09 00:33:27 +09:00
50e079b99f internal/rosa: xcb-util-keysyms artifact
Required by mesa.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-09 00:16:06 +09:00
fb2cb5005a internal/rosa: libdisplay-info artifact
Required by mesa.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-09 00:07:43 +09:00
6e73c28a92 internal/rosa: hwdata artifact
Required by libdisplay-info.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-09 00:05:40 +09:00
2c08aa3674 internal/rosa/glslang: disable broken arm64 tests
These just fail on arm64, so disable them.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-08 23:56:19 +09:00
1af73ae7b4 internal/rosa/go: 1.26.2 to 1.26.3
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-08 23:25:57 +09:00
c9aa5e04b1 internal/rosa/go: bootstrap 1.25.9 to 1.25.10
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-08 23:20:39 +09:00
70a38bd3b0 internal/rosa: libarchive artifact
Required by mesa; also a cleaner implementation than GNU's.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-08 23:16:33 +09:00
533b15da89 internal/rosa/mksh: respect check flag
This skips the test suite when OptSkipCheck is set.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-08 21:20:20 +09:00
a890e1d0e5 cmd/mbf: optionally override non-native flags
This is a clean workaround for configuration differences to save time during development.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-08 13:45:36 +09:00
e3520835bb cmd/mbf: optionally register all targets
This enables non-native cures from the daemon.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-08 13:29:58 +09:00
0e56847754 cmd/mbf: add arm64 magic
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-08 00:23:09 +09:00
145d03b366 cmd/mbf: optional emulated target architecture
This enables transparent cross-compilation without breaking purity.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-07 20:29:31 +09:00
2886228d40 internal/rosa/qemu: build static binaries
Dynamic linking here barely saves space, and static binaries are required for binfmt.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-07 20:25:13 +09:00
e1e499b79e internal/rosa/git: disable more broken tests
These are causing many spurious failures.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-07 20:06:11 +09:00
65b7dd8b37 internal/rosa: configurable architecture
This enables curing via binfmt.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-07 20:01:44 +09:00
8d72b9e5bd internal/pkg: optionally register binfmt
This transparently supports curing foreign exec artifacts.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-05-07 19:43:06 +09:00
126 changed files with 4922 additions and 1451 deletions

all.sh

@@ -1,10 +1,3 @@
#!/bin/sh -e
TOOLCHAIN_VERSION="$(go version)"
cd "$(dirname -- "$0")/"
echo "Building cmd/dist using ${TOOLCHAIN_VERSION}."
FLAGS=''
if test -n "$VERBOSE"; then
FLAGS="$FLAGS -v"
fi
go run $FLAGS --tags=dist ./cmd/dist
HAKUREI_DIST_MAKE='' exec "$(dirname -- "$0")/cmd/dist/dist.sh"

cmd/dist/dist.sh vendored Executable file

@@ -0,0 +1,10 @@
#!/bin/sh -e
TOOLCHAIN_VERSION="$(go version)"
cd "$(dirname -- "$0")/../.."
echo "Building cmd/dist using ${TOOLCHAIN_VERSION}."
FLAGS=''
if test -n "$VERBOSE"; then
FLAGS="$FLAGS -v"
fi
go run $FLAGS --tags=dist ./cmd/dist

cmd/dist/main.go vendored

@@ -46,6 +46,7 @@ func main() {
log.SetPrefix("")
verbose := os.Getenv("VERBOSE") != ""
runTests := os.Getenv("HAKUREI_DIST_MAKE") == ""
version := getenv("HAKUREI_VERSION", "untagged")
prefix := getenv("PREFIX", "/usr")
destdir := getenv("DESTDIR", "dist")
@@ -101,13 +102,15 @@ func main() {
)
log.Println()
log.Println("##### Testing Hakurei.")
mustRun(
ctx, "go", "test",
"-ldflags=-buildid= -linkmode external -extldflags=-static",
"./...",
)
log.Println()
if runTests {
log.Println("##### Testing Hakurei.")
mustRun(
ctx, "go", "test",
"-ldflags=-buildid= -linkmode external -extldflags=-static",
"./...",
)
log.Println()
}
log.Println("##### Creating distribution.")
const suffix = ".tar.gz"


@@ -7,6 +7,7 @@ import (
"testing"
"hakurei.app/check"
"hakurei.app/container"
"hakurei.app/internal/pkg"
"hakurei.app/message"
)
@@ -19,9 +20,15 @@ type cache struct {
// Should generally not be used directly.
c *pkg.Cache
cures, jobs int
hostAbstract, idle bool
verboseInit bool
cures, jobs int
// Primarily to work around missing landlock LSM.
hostAbstract bool
// Set SCHED_IDLE.
idle bool
// Unset [pkg.CSuppressInit].
verboseInit bool
// Loaded artifact of [rosa.QEMU].
qemu pkg.Artifact
base string
}
@@ -74,6 +81,39 @@ func (cache *cache) open() (err error) {
cache.jobs,
base,
)
if err != nil {
return
}
done <- struct{}{}
if cache.qemu != nil {
var pathname *check.Absolute
pathname, _, err = cache.c.Cure(cache.qemu)
if err != nil {
cache.c.Close()
return
}
pkg.RegisterArch("riscv64", container.BinfmtEntry{
Offset: 0,
Magic: "\x7fELF\x02\x01\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\xf3\x00",
Mask: "\xff\xff\xff\xff\xff\xff\xff\x00\xff\xff\xff\xff\xff\xff\xff\xff\xfe\xff\xff\xff",
Interpreter: pathname.Append(
"system/bin",
"qemu-riscv64",
),
})
pkg.RegisterArch("arm64", container.BinfmtEntry{
Offset: 0,
Magic: "\x7fELF\x02\x01\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\xb7\x00",
Mask: "\xff\xff\xff\xff\xff\xff\xff\xfc\xff\xff\xff\xff\xff\xff\xff\xff\xfe\xff\xff\xff",
Interpreter: pathname.Append(
"system/bin",
"qemu-aarch64",
),
})
}
return
}
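The registered entries mirror how the kernel's binfmt_misc matches an interpreter: each byte of the file header is ANDed with the mask and compared against the magic. A standalone sketch of that comparison, using the riscv64 values above — note the 0xfe mask byte at offset 16 (the ELF e_type field), which lets both ET_EXEC (2) and ET_DYN (3, e.g. PIE) binaries match:

```go
package main

import "fmt"

// matches reports whether a file header matches a binfmt_misc-style
// magic/mask pair: (header[i] & mask[i]) == magic[i] for every byte.
func matches(header, magic, mask []byte) bool {
	if len(header) < len(magic) || len(magic) != len(mask) {
		return false
	}
	for i := range magic {
		if header[i]&mask[i] != magic[i] {
			return false
		}
	}
	return true
}

func main() {
	// First 20 bytes of a little-endian ELF64 riscv64 executable:
	// \x7fELF, class 2 (64-bit), data 1 (LSB), version 1, padding,
	// e_type 2 (ET_EXEC) at offset 16, e_machine 0xf3 (EM_RISCV) at 18.
	magic := []byte("\x7fELF\x02\x01\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\xf3\x00")
	mask := []byte("\xff\xff\xff\xff\xff\xff\xff\x00\xff\xff\xff\xff\xff\xff\xff\xff\xfe\xff\xff\xff")

	fmt.Println(matches(magic, magic, mask)) // ET_EXEC matches

	hdr := append([]byte(nil), magic...)
	hdr[16] = 3 // ET_DYN: 3 & 0xfe == 2, still matches
	fmt.Println(matches(hdr, magic, mask))
}
```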


@@ -6,6 +6,7 @@ import (
"io"
"os"
"strings"
"unique"
"hakurei.app/internal/pkg"
"hakurei.app/internal/rosa"
@@ -35,17 +36,20 @@ func commandInfo(
}
}
t := rosa.Native().Std()
for i, name := range args {
if p, ok := rosa.ResolveName(name); !ok {
handle := rosa.ArtifactH(unique.Make(name))
if meta := rosa.Native().Get(handle); meta == nil {
return fmt.Errorf("unknown artifact %q", name)
} else {
var suffix string
if version := rosa.Std.Version(p); version != rosa.Unversioned {
a, version := t.MustLoad(handle)
if version != rosa.Unversioned {
suffix += "-" + version
}
mustPrintln("name : " + name + suffix)
meta := rosa.GetMetadata(p)
mustPrintln("description : " + meta.Description)
if meta.Website != "" {
mustPrintln("website : " +
@@ -54,9 +58,9 @@ func commandInfo(
if len(meta.Dependencies) > 0 {
mustPrint("depends on :")
for _, d := range meta.Dependencies {
s := rosa.GetMetadata(d).Name
if version := rosa.Std.Version(d); version != rosa.Unversioned {
s += "-" + version
s := rosa.Native().MustGet(d).Name
if _, _version := t.Load(d); _version != rosa.Unversioned {
s += "-" + _version
}
mustPrint(" " + s)
}
@@ -68,7 +72,7 @@ func commandInfo(
if r == nil {
var f io.ReadSeekCloser
err = cm.Do(func(cache *pkg.Cache) (err error) {
f, err = cache.OpenStatus(rosa.Std.Load(p))
f, err = cache.OpenStatus(a)
return
})
if err != nil {
@@ -87,7 +91,7 @@ func commandInfo(
}
}
} else if err = cm.Do(func(cache *pkg.Cache) (err error) {
status, n := r.ArtifactOf(cache.Ident(rosa.Std.Load(p)))
status, n := r.ArtifactOf(cache.Ident(a))
if status == nil {
mustPrintln(
statusPrefix + "not in report",


@@ -10,6 +10,7 @@ import (
"strings"
"syscall"
"testing"
"unique"
"unsafe"
"hakurei.app/internal/pkg"
@@ -20,6 +21,14 @@ import (
func TestInfo(t *testing.T) {
t.Parallel()
_t := rosa.Native().Std()
_, qemuVersion := _t.Load(rosa.QEMU)
_, glibVersion := _t.Load(rosa.GLib)
zlib, zlibVersion := _t.Load(rosa.Zlib)
_, zstdVersion := _t.Load(rosa.Zstd)
_, hakureiVersion := _t.Load(rosa.Hakurei)
_, hakureiDistVersion := _t.Load(rosa.HakureiDist)
testCases := []struct {
name string
args []string
@@ -29,24 +38,24 @@ func TestInfo(t *testing.T) {
wantErr any
}{
{"qemu", []string{"qemu"}, nil, "", `
name : qemu-` + rosa.Std.Version(rosa.QEMU) + `
name : qemu-` + qemuVersion + `
description : a generic and open source machine emulator and virtualizer
website : https://www.qemu.org
depends on : glib-` + rosa.Std.Version(rosa.GLib) + ` zstd-` + rosa.Std.Version(rosa.Zstd) + `
depends on : glib-` + glibVersion + ` zstd-` + zstdVersion + `
`, nil},
{"multi", []string{"hakurei", "hakurei-dist"}, nil, "", `
name : hakurei-` + rosa.Std.Version(rosa.Hakurei) + `
name : hakurei-` + hakureiVersion + `
description : low-level userspace tooling for Rosa OS
website : https://hakurei.app
name : hakurei-dist-` + rosa.Std.Version(rosa.HakureiDist) + `
name : hakurei-dist-` + hakureiDistVersion + `
description : low-level userspace tooling for Rosa OS (distribution tarball)
website : https://hakurei.app
`, nil},
{"nonexistent", []string{"zlib", "\x00"}, nil, "", `
name : zlib-` + rosa.Std.Version(rosa.Zlib) + `
name : zlib-` + zlibVersion + `
description : lossless data-compression library
website : https://zlib.net
@@ -56,12 +65,12 @@ website : https://zlib.net
"zstd": "internal/pkg (amd64) on satori\n",
"hakurei": "internal/pkg (amd64) on satori\n\n",
}, "", `
name : zlib-` + rosa.Std.Version(rosa.Zlib) + `
name : zlib-` + zlibVersion + `
description : lossless data-compression library
website : https://zlib.net
status : not yet cured
name : zstd-` + rosa.Std.Version(rosa.Zstd) + `
name : zstd-` + zstdVersion + `
description : a fast compression algorithm
website : https://facebook.github.io/zstd
status : internal/pkg (amd64) on satori
@@ -70,19 +79,19 @@ status : internal/pkg (amd64) on satori
{"status cache perm", []string{"zlib"}, map[string]string{
"zlib": "\x00",
}, "", `
name : zlib-` + rosa.Std.Version(rosa.Zlib) + `
name : zlib-` + zlibVersion + `
description : lossless data-compression library
website : https://zlib.net
`, func(cm *cache) error {
return &os.PathError{
Op: "open",
Path: filepath.Join(cm.base, "status", pkg.Encode(cm.c.Ident(rosa.Std.Load(rosa.Zlib)).Value())),
Path: filepath.Join(cm.base, "status", pkg.Encode(cm.c.Ident(zlib).Value())),
Err: syscall.EACCES,
}
}},
{"status report", []string{"zlib"}, nil, strings.Repeat("\x00", len(pkg.Checksum{})+8), `
name : zlib-` + rosa.Std.Version(rosa.Zlib) + `
name : zlib-` + zlibVersion + `
description : lossless data-compression library
website : https://zlib.net
status : not in report
@@ -131,8 +140,8 @@ status : not in report
if tc.status != nil {
for name, status := range tc.status {
p, ok := rosa.ResolveName(name)
if !ok {
a, _ := _t.Load(rosa.ArtifactH(unique.Make(name)))
if a == nil {
t.Fatalf("invalid name %q", name)
}
perm := os.FileMode(0400)
@@ -143,7 +152,7 @@ status : not in report
return os.WriteFile(filepath.Join(
cm.base,
"status",
pkg.Encode(cache.Ident(rosa.Std.Load(p)).Value()),
pkg.Encode(cache.Ident(a).Value()),
), unsafe.Slice(unsafe.StringData(status), len(status)), perm)
}); err != nil {
t.Fatalf("Do: error = %v", err)


@@ -30,7 +30,7 @@ var (
// handleInfo writes constant system information.
func handleInfo(w http.ResponseWriter, _ *http.Request) {
infoPayloadOnce.Do(func() {
infoPayload.Count = int(rosa.PresetUnexportedStart)
infoPayload.Count = int(rosa.Native().Count())
infoPayload.HakureiVersion = info.Version()
})
// TODO(mae): cache entire response if no additional fields are planned
@@ -91,7 +91,7 @@ func (index *packageIndex) handleGet(w http.ResponseWriter, r *http.Request) {
if err != nil || i >= len(index.sorts[0]) || i < 0 {
http.Error(
w, "index must be an integer between 0 and "+
strconv.Itoa(int(rosa.PresetUnexportedStart-1)),
strconv.Itoa(len(index.sorts[0])-1),
http.StatusBadRequest,
)
return
@@ -125,7 +125,7 @@ func (index *packageIndex) handleSearch(w http.ResponseWriter, r *http.Request)
if err != nil || i >= len(index.sorts[0]) || i < 0 {
http.Error(
w, "index must be an integer between 0 and "+
strconv.Itoa(int(rosa.PresetUnexportedStart-1)),
strconv.Itoa(len(index.sorts[0])-1),
http.StatusBadRequest,
)
return


@@ -3,7 +3,6 @@ package pkgserver
import (
"net/http"
"net/http/httptest"
"slices"
"strconv"
"testing"
@@ -32,7 +31,7 @@ func TestAPIInfo(t *testing.T) {
checkPayload(t, resp, struct {
Count int `json:"count"`
HakureiVersion string `json:"hakurei_version"`
}{int(rosa.PresetUnexportedStart), info.Version()})
}{rosa.Native().Count(), info.Version()})
}
func TestAPIGet(t *testing.T) {
@@ -96,8 +95,8 @@ func TestAPIGet(t *testing.T) {
t.Run("index", func(t *testing.T) {
t.Parallel()
checkValidate(
t, "limit=1&sort=0&index", 0, int(rosa.PresetUnexportedStart-1),
"index must be an integer between 0 and "+strconv.Itoa(int(rosa.PresetUnexportedStart-1)),
t, "limit=1&sort=0&index", 0, rosa.Native().Count()-1,
"index must be an integer between 0 and "+strconv.Itoa(rosa.Native().Count()-1),
)
})
@@ -108,74 +107,4 @@ func TestAPIGet(t *testing.T) {
"sort must be an integer between 0 and "+strconv.Itoa(int(sortOrderEnd)),
)
})
checkWithSuffix := func(name, suffix string, want []*metadata) {
t.Run(name, func(t *testing.T) {
t.Parallel()
w := newRequest(suffix)
resp := w.Result()
checkStatus(t, resp, http.StatusOK)
checkAPIHeader(t, w.Header())
checkPayloadFunc(t, resp, func(got *struct {
Values []*metadata `json:"values"`
}) bool {
return slices.EqualFunc(got.Values, want, func(a, b *metadata) bool {
return (a.Version == b.Version ||
a.Version == rosa.Unversioned ||
b.Version == rosa.Unversioned) &&
a.HasReport == b.HasReport &&
a.Name == b.Name &&
a.Description == b.Description &&
a.Website == b.Website
})
})
})
}
checkWithSuffix("declarationAscending", "?limit=2&index=1&sort=0", []*metadata{
{
Metadata: rosa.GetMetadata(1),
Version: rosa.Std.Version(1),
},
{
Metadata: rosa.GetMetadata(2),
Version: rosa.Std.Version(2),
},
})
checkWithSuffix("declarationAscending offset", "?limit=3&index=5&sort=0", []*metadata{
{
Metadata: rosa.GetMetadata(5),
Version: rosa.Std.Version(5),
},
{
Metadata: rosa.GetMetadata(6),
Version: rosa.Std.Version(6),
},
{
Metadata: rosa.GetMetadata(7),
Version: rosa.Std.Version(7),
},
})
checkWithSuffix("declarationDescending", "?limit=3&index=0&sort=1", []*metadata{
{
Metadata: rosa.GetMetadata(rosa.PresetUnexportedStart - 1),
Version: rosa.Std.Version(rosa.PresetUnexportedStart - 1),
},
{
Metadata: rosa.GetMetadata(rosa.PresetUnexportedStart - 2),
Version: rosa.Std.Version(rosa.PresetUnexportedStart - 2),
},
{
Metadata: rosa.GetMetadata(rosa.PresetUnexportedStart - 3),
Version: rosa.Std.Version(rosa.PresetUnexportedStart - 3),
},
})
checkWithSuffix("declarationDescending offset", "?limit=1&index=37&sort=1", []*metadata{
{
Metadata: rosa.GetMetadata(rosa.PresetUnexportedStart - 38),
Version: rosa.Std.Version(rosa.PresetUnexportedStart - 38),
},
})
}


@@ -23,7 +23,7 @@ const (
// packageIndex refers to metadata by name and various sort orders.
type packageIndex struct {
sorts [sortOrderEnd + 1][rosa.PresetUnexportedStart]*metadata
sorts [sortOrderEnd + 1][]*metadata
names map[string]*metadata
search searchCache
// Taken from [rosa.Report] if available.
@@ -32,8 +32,8 @@ type packageIndex struct {
// metadata holds [rosa.Metadata] extended with additional information.
type metadata struct {
p rosa.PArtifact
*rosa.Metadata
handle rosa.ArtifactH
*rosa.Artifact
// Populated via [rosa.Toolchain.Version], [rosa.Unversioned] is equivalent
// to the zero value. Otherwise, the zero value is invalid.
@@ -56,15 +56,17 @@ func (index *packageIndex) populate(report *rosa.Report) (err error) {
index.handleAccess = report.HandleAccess
}
var work [rosa.PresetUnexportedStart]*metadata
handles := rosa.Native().Collect()
work := make([]*metadata, len(handles))
index.names = make(map[string]*metadata)
ir := pkg.NewIR()
for p := range rosa.PresetUnexportedStart {
for i, handle := range handles {
a, version := rosa.Native().Std().MustLoad(handle)
m := metadata{
p: p,
handle: handle,
Metadata: rosa.GetMetadata(p),
Version: rosa.Std.Version(p),
Artifact: rosa.Native().MustGet(handle),
Version: version,
}
if m.Version == "" {
return errors.New("invalid version from " + m.Name)
@@ -74,32 +76,32 @@ func (index *packageIndex) populate(report *rosa.Report) (err error) {
}
if report != nil {
id := ir.Ident(rosa.Std.Load(p))
id := ir.Ident(a)
m.ids = pkg.Encode(id.Value())
m.status, m.Size = report.ArtifactOf(id)
m.HasReport = m.Size >= 0
}
work[p] = &m
work[i] = &m
index.names[m.Name] = &m
}
index.sorts[declarationAscending] = work
index.sorts[declarationDescending] = work
index.sorts[declarationDescending] = slices.Clone(work)
slices.Reverse(index.sorts[declarationDescending][:])
index.sorts[nameAscending] = work
index.sorts[nameAscending] = slices.Clone(work)
slices.SortFunc(index.sorts[nameAscending][:], func(a, b *metadata) int {
return strings.Compare(a.Name, b.Name)
})
index.sorts[nameDescending] = index.sorts[nameAscending]
index.sorts[nameDescending] = slices.Clone(index.sorts[nameAscending])
slices.Reverse(index.sorts[nameDescending][:])
index.sorts[sizeAscending] = work
index.sorts[sizeAscending] = slices.Clone(work)
slices.SortFunc(index.sorts[sizeAscending][:], func(a, b *metadata) int {
return cmp.Compare(a.Size, b.Size)
})
index.sorts[sizeDescending] = index.sorts[sizeAscending]
index.sorts[sizeDescending] = slices.Clone(index.sorts[sizeAscending])
slices.Reverse(index.sorts[sizeDescending][:])
return


@@ -4,11 +4,12 @@
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<link rel="stylesheet" href="style.css">
<title>Hakurei PkgServer</title>
<link rel="icon" href="https://hakurei.app/favicon.ico"/>
<title>Rosa OS Packages</title>
<script src="index.js"></script>
</head>
<body>
<h1>Hakurei PkgServer</h1>
<h1>Rosa OS Packages</h1>
<div class="top-controls" id="top-controls-regular">
<p>Showing entries <span id="entry-counter"></span>.</p>
<span id="search-bar">
@@ -54,4 +55,4 @@
</footer>
<script>main();</script>
</body>
</html>
</html>


@@ -67,9 +67,13 @@ func main() {
var (
flagQuiet bool
flagQEMU bool
flagArch string
flagCheck bool
flagLTO bool
flagCrossOverride int
addr net.UnixAddr
)
c := command.New(os.Stderr, log.Printf, "mbf", func([]string) error {
@@ -93,13 +97,36 @@ func main() {
if !flagLTO {
flags |= rosa.OptLLVMNoLTO
}
rosa.DropCaches(flags)
rosa.Native().DropCaches("", flags)
cross := flagArch != "" && flagArch != runtime.GOARCH
if flagQEMU || cross {
cm.qemu, _ = rosa.Native().Std().Load(rosa.QEMU)
}
if cross {
if flagCrossOverride != -1 {
flags = flagCrossOverride
}
rosa.Native().DropCaches(flagArch, flags)
if !rosa.HasStage0() {
return pkg.UnsupportedArchError(flagArch)
}
}
return nil
}).Flag(
&flagQuiet,
"q", command.BoolFlag(false),
"Do not print cure messages",
).Flag(
&flagQEMU,
"register", command.BoolFlag(false),
"Enable additional target architectures",
).Flag(
&flagArch,
"arch", command.StringFlag(runtime.GOARCH),
"Target architecture",
).Flag(
&flagLTO,
"lto", command.BoolFlag(false),
@@ -108,6 +135,10 @@ func main() {
&flagCheck,
"check", command.BoolFlag(true),
"Run test suites",
).Flag(
&flagCrossOverride,
"cross-flags", command.IntFlag(-1),
"Override non-native target preset flags",
).Flag(
&cm.verboseInit,
"v", command.BoolFlag(false),
@@ -300,12 +331,12 @@ func main() {
n atomic.Uint64
)
w := make(chan rosa.PArtifact)
w := make(chan rosa.ArtifactH)
var wg sync.WaitGroup
for range max(flagJobs, 1) {
wg.Go(func() {
for p := range w {
meta := rosa.GetMetadata(p)
meta := rosa.Native().MustGet(p)
if meta.ID == 0 {
continue
}
@@ -318,8 +349,9 @@ func main() {
continue
}
_, version := rosa.Native().Std().Load(p)
if current, latest :=
rosa.Std.Version(p),
version,
meta.GetLatest(v); current != latest {
n.Add(1)
@@ -333,9 +365,9 @@ func main() {
}
done:
for i := range rosa.PresetEnd {
for _, p := range rosa.Native().Collect() {
select {
case w <- rosa.PArtifact(i):
case w <- p:
break
case <-ctx.Done():
break done
@@ -380,9 +412,9 @@ func main() {
"stage3",
"Check for toolchain 3-stage non-determinism",
func(args []string) (err error) {
t := rosa.Std
s := rosa.Std
if flagGentoo != "" {
t -= 3 // magic number to discourage misuse
s -= 3 // magic number to discourage misuse
var checksum pkg.Checksum
if len(flagChecksum) != 0 {
@@ -390,7 +422,7 @@ func main() {
return
}
}
rosa.SetGentooStage3(flagGentoo, checksum)
rosa.Native().SetGentooStage3(flagGentoo, checksum)
}
var (
@@ -399,9 +431,8 @@ func main() {
)
if err = cm.Do(func(cache *pkg.Cache) (err error) {
pathname, _, err = cache.Cure(
(t - 2).Load(rosa.LLVM),
)
llvm, _ := rosa.Native().New(s - 2).Load(rosa.LLVM)
pathname, _, err = cache.Cure(llvm)
return
}); err != nil {
return
@@ -409,18 +440,16 @@ func main() {
log.Println("stage1:", pathname)
if err = cm.Do(func(cache *pkg.Cache) (err error) {
pathname, checksum[0], err = cache.Cure(
(t - 1).Load(rosa.LLVM),
)
llvm, _ := rosa.Native().New(s - 1).Load(rosa.LLVM)
pathname, checksum[0], err = cache.Cure(llvm)
return
}); err != nil {
return
}
log.Println("stage2:", pathname)
if err = cm.Do(func(cache *pkg.Cache) (err error) {
pathname, checksum[1], err = cache.Cure(
t.Load(rosa.LLVM),
)
llvm, _ := rosa.Native().New(s).Load(rosa.LLVM)
pathname, checksum[1], err = cache.Cure(llvm)
return
}); err != nil {
return
@@ -441,9 +470,8 @@ func main() {
if flagStage0 {
if err = cm.Do(func(cache *pkg.Cache) (err error) {
pathname, _, err = cache.Cure(
t.Load(rosa.Stage0),
)
stage0, _ := rosa.Native().New(s).Load(rosa.Stage0)
pathname, _, err = cache.Cure(stage0)
return
}); err != nil {
return
@@ -475,6 +503,8 @@ func main() {
flagExport string
flagRemote bool
flagNoReply bool
flagFaults bool
flagPop bool
flagBoot bool
flagStd bool
@@ -486,10 +516,6 @@ func main() {
if len(args) != 1 {
return errors.New("cure requires 1 argument")
}
p, ok := rosa.ResolveName(args[0])
if !ok {
return fmt.Errorf("unknown artifact %q", args[0])
}
t := rosa.Std
if flagBoot {
@@ -498,11 +524,16 @@ func main() {
t -= 1
}
a, _ := rosa.Native().New(t).Load(rosa.ArtifactH(unique.Make(args[0])))
if a == nil {
return fmt.Errorf("unknown artifact %q", args[0])
}
switch {
default:
var pathname *check.Absolute
err := cm.Do(func(cache *pkg.Cache) (err error) {
pathname, _, err = cache.Cure(t.Load(p))
pathname, _, err = cache.Cure(a)
return
})
if err != nil {
@@ -544,7 +575,7 @@ func main() {
return err
}
if err = pkg.NewIR().EncodeAll(f, rosa.Std.Load(p)); err != nil {
if err = pkg.NewIR().EncodeAll(f, a); err != nil {
_ = f.Close()
return err
}
@@ -555,7 +586,7 @@ func main() {
return cm.Do(func(cache *pkg.Cache) error {
return cache.EnterExec(
ctx,
t.Load(p),
a,
true, os.Stdin, os.Stdout, os.Stderr,
rosa.AbsSystem.Append("bin", "mksh"),
"sh",
@@ -567,7 +598,6 @@ func main() {
if flagNoReply {
flags |= remoteNoReply
}
a := t.Load(p)
pathname, err := cureRemote(ctx, &addr, a, flags)
if !flagNoReply && err == nil {
log.Println(pathname)
@@ -583,6 +613,49 @@ func main() {
}
return err
case flagFaults:
var faults []pkg.Fault
if err := cm.Do(func(cache *pkg.Cache) (err error) {
faults, err = cache.ReadFaults(a)
return
}); err != nil {
return err
}
for _, fault := range faults {
log.Printf("%s: %s ago", fault.String(), time.Since(fault.Time()))
}
return nil
case flagPop:
var faults []pkg.Fault
if err := cm.Do(func(cache *pkg.Cache) (err error) {
faults, err = cache.ReadFaults(a)
return
}); err != nil {
return err
}
if len(faults) == 0 {
return errors.New("no fault entries found")
}
fault := faults[len(faults)-1]
r, err := fault.Open()
if err != nil {
return err
}
if _, err = io.Copy(os.Stdout, r); err != nil {
_ = r.Close()
return err
}
fmt.Println()
if err = r.Close(); err != nil {
return err
}
log.Printf("faulting cure terminated %s ago", time.Since(fault.Time()))
return fault.Destroy()
}
},
).Flag(
@@ -613,9 +686,40 @@ func main() {
&flagStd,
"std", command.BoolFlag(false),
"Build on the intermediate toolchain",
).Flag(
&flagFaults,
"faults", command.BoolFlag(false),
"Display fault entries of the specified artifact",
).Flag(
&flagPop,
"pop", command.BoolFlag(false),
"Display and destroy the most recent fault entry",
)
}
c.NewCommand(
"clear",
"Remove all fault entries from the cache",
func([]string) error {
return cm.Do(func(*pkg.Cache) error {
pathname := filepath.Join(cm.base, "fault")
dents, err := os.ReadDir(pathname)
if err != nil {
return err
}
for _, dent := range dents {
msg.Verbosef("destroying entry %s", dent.Name())
if err = os.Remove(filepath.Join(pathname, dent.Name())); err != nil {
return err
}
}
log.Printf("destroyed %d fault entries", len(dents))
return nil
})
},
)
c.NewCommand(
"abort",
"Abort all pending cures on the daemon",
@@ -633,27 +737,26 @@ func main() {
"shell",
"Interactive shell in the specified Rosa OS environment",
func(args []string) error {
presets := make([]rosa.PArtifact, len(args)+3)
handles := make([]rosa.ArtifactH, len(args)+3)
for i, arg := range args {
p, ok := rosa.ResolveName(arg)
if !ok {
handles[i] = rosa.ArtifactH(unique.Make(arg))
if rosa.Native().Get(handles[i]) == nil {
return fmt.Errorf("unknown artifact %q", arg)
}
presets[i] = p
}
base := rosa.LLVM
if !flagWithToolchain {
base = rosa.Musl
}
presets = append(presets,
handles = append(handles,
base,
rosa.Mksh,
rosa.Toybox,
)
root := make(pkg.Collect, 0, 6+len(args))
root = rosa.Std.AppendPresets(root, presets...)
root = rosa.Native().Std().Append(root, handles...)
if err := cm.Do(func(cache *pkg.Cache) error {
_, _, err := cache.Cure(&root)


@@ -9,7 +9,7 @@ import (
)
func TestMain(m *testing.M) {
rosa.DropCaches(rosa.OptLLVMNoLTO)
rosa.Native().DropCaches("", rosa.OptLLVMNoLTO)
os.Exit(m.Run())
}
@@ -35,10 +35,10 @@ func TestCureAll(t *testing.T) {
}
})
for i := range rosa.PresetEnd {
p := rosa.PArtifact(i)
t.Run(rosa.GetMetadata(p).Name, func(t *testing.T) {
_, err := cureRemote(t.Context(), &addr, rosa.Std.Load(p), 0)
for _, handle := range rosa.Native().Collect() {
a, _ := rosa.Native().Std().MustLoad(handle)
t.Run(rosa.Native().MustGet(handle).Name, func(t *testing.T) {
_, err := cureRemote(t.Context(), &addr, a, 0)
if err != nil {
t.Error(err)
}


@@ -20,11 +20,14 @@
};
virtualisation = {
# Hopefully reduces spurious test failures:
memorySize = if pkgs.stdenv.hostPlatform.is32bit then 2046 else 8192;
diskSize = 6 * 1024;
qemu.options = [
# Increase test performance:
"-smp 8"
"-smp 16"
];
};


@@ -28,7 +28,7 @@ testers.nixosTest {
# For go tests:
(pkgs.writeShellScriptBin "sharefs-workload-hakurei-tests" ''
cp -r "${self.packages.${system}.hakurei.src}" "/sdcard/hakurei" && cd "/sdcard/hakurei"
${fhs}/bin/hakurei-fhs -c 'CC="clang -O3 -Werror" go test ./...'
${fhs}/bin/hakurei-fhs -c 'ROSA_SKIP_BINFMT=1 CC="clang -O3 -Werror" go test ./...'
'')
];


@@ -9,8 +9,10 @@ import (
"os"
"os/exec"
"path/filepath"
"runtime"
"slices"
"strconv"
"sync"
"syscall"
"time"
"unique"
@@ -94,6 +96,32 @@ func MustPath(pathname string, writable bool, a ...Artifact) ExecPath {
return ExecPath{check.MustAbs(pathname), a, writable}
}
var (
binfmt map[string]container.BinfmtEntry
binfmtMu sync.RWMutex
)
// RegisterArch arranges for [KindExec] and [KindExecNet] to support a new
// architecture via a binfmt_misc entry. Each architecture must be registered
// at most once.
func RegisterArch(arch string, e container.BinfmtEntry) {
if arch == "" {
panic(UnsupportedArchError(arch))
}
binfmtMu.Lock()
defer binfmtMu.Unlock()
if binfmt == nil {
binfmt = make(map[string]container.BinfmtEntry)
}
if _, ok := binfmt[arch]; ok {
panic("attempting to register " + strconv.Quote(arch) + " twice")
}
binfmt[arch] = e
}
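The register-once pattern used by RegisterArch above (panic on duplicate or empty key, lazy map allocation under a write lock, read-locked lookup) can be sketched in isolation; `Entry`, `Register` and `Lookup` here are illustrative stand-ins, not the package's types:

```go
package main

import (
	"fmt"
	"sync"
)

// Entry stands in for a binfmt_misc registration payload.
type Entry struct{ Magic string }

var (
	registry   map[string]Entry
	registryMu sync.RWMutex
)

// Register stores e under arch exactly once; an empty name or a duplicate
// registration panics, mirroring the RegisterArch contract.
func Register(arch string, e Entry) {
	if arch == "" {
		panic("invalid architecture name")
	}
	registryMu.Lock()
	defer registryMu.Unlock()
	if registry == nil {
		registry = make(map[string]Entry)
	}
	if _, ok := registry[arch]; ok {
		panic("attempting to register " + arch + " twice")
	}
	registry[arch] = e
}

// Lookup is the read path taken at container setup time; it only needs
// the shared read lock.
func Lookup(arch string) (Entry, bool) {
	registryMu.RLock()
	defer registryMu.RUnlock()
	e, ok := registry[arch]
	return e, ok
}

func main() {
	Register("cafe", Entry{Magic: "\xca\xfe\xba\xbe"})
	if e, ok := Lookup("cafe"); ok {
		fmt.Printf("magic length: %d\n", len(e.Magic))
	}
}
```

An RWMutex fits because registration happens once at init while lookups occur on every cross-architecture container build.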
const (
// ExecTimeoutDefault replaces out of range [NewExec] timeout values.
ExecTimeoutDefault = 15 * time.Minute
@@ -110,6 +138,8 @@ type execArtifact struct {
// Caller-supplied user-facing reporting name, guaranteed to be nonzero
// during initialisation.
name string
// Target architecture.
arch string
// Caller-supplied inner mount points.
paths []ExecPath
@@ -132,28 +162,40 @@ type execArtifact struct {
var _ fmt.Stringer = new(execArtifact)
// execNetArtifact is like execArtifact but implements [KnownChecksum] and has
// its resulting container keep the host net namespace.
type execNetArtifact struct {
// execMeasuredArtifact is like execArtifact but implements [KnownChecksum] and
// has its resulting container optionally keep the host net namespace.
type execMeasuredArtifact struct {
checksum Checksum
// Whether to keep host net namespace.
hostNet bool
execArtifact
}
var _ KnownChecksum = new(execNetArtifact)
var _ KnownChecksum = new(execMeasuredArtifact)
// Checksum returns the caller-supplied checksum.
func (a *execNetArtifact) Checksum() Checksum { return a.checksum }
func (a *execMeasuredArtifact) Checksum() Checksum { return a.checksum }
// Kind returns the hardcoded [Kind] constant.
func (*execNetArtifact) Kind() Kind { return KindExecNet }
// Kind returns [KindExecNet], or [KindExec] if hostNet is false.
func (a *execMeasuredArtifact) Kind() Kind {
if a == nil || a.hostNet {
return KindExecNet
}
return KindExec
}
// Cure cures the [Artifact] in the container described by the caller. The
// container retains host networking.
func (a *execNetArtifact) Cure(f *FContext) error {
return a.cure(f, true)
// container optionally retains host networking.
func (a *execMeasuredArtifact) Cure(f *FContext) error {
return a.cure(f, a.hostNet)
}
// ErrNetChecksum is panicked by [NewExec] if host net namespace is requested
// with a nil checksum.
var ErrNetChecksum = errors.New("attempting to keep net namespace without checksum")
// NewExec returns a new [Artifact] that executes the program path in a
// container with specified paths bind mounted read-only in order. A private
// instance of /proc and /dev is made available to the container.
@@ -167,7 +209,7 @@ func (a *execNetArtifact) Cure(f *FContext) error {
// regular or symlink.
//
// If checksum is non-nil, the resulting [Artifact] implements [KnownChecksum]
// and its container runs in the host net namespace.
// and its container optionally runs in the host net namespace.
//
// The container is allowed to run for the specified duration before the initial
// process and all processes originating from it are terminated. A zero or
@@ -178,10 +220,10 @@ func (a *execNetArtifact) Cure(f *FContext) error {
// container and does not affect curing outcome. Because of this, it is omitted
// from parameter data for computing identifier.
func NewExec(
name string,
name, arch string,
checksum *Checksum,
timeout time.Duration,
exclusive bool,
hostNet, exclusive bool,
dir *check.Absolute,
env []string,
@@ -193,17 +235,23 @@ func NewExec(
if name == "" {
name = "exec-" + filepath.Base(pathname.String())
}
if arch == "" {
arch = runtime.GOARCH
}
if timeout <= 0 {
timeout = ExecTimeoutDefault
}
if timeout > ExecTimeoutMax {
timeout = ExecTimeoutMax
}
a := execArtifact{name, paths, dir, env, pathname, args, timeout, exclusive}
a := execArtifact{name, arch, paths, dir, env, pathname, args, timeout, exclusive}
if checksum == nil {
if hostNet {
panic(ErrNetChecksum)
}
return &a
}
return &execNetArtifact{*checksum, a}
return &execMeasuredArtifact{*checksum, hostNet, a}
}
// Kind returns the hardcoded [Kind] constant.
@@ -211,6 +259,7 @@ func (*execArtifact) Kind() Kind { return KindExec }
// Params writes paths, executable pathname and args.
func (a *execArtifact) Params(ctx *IContext) {
ctx.WriteString(a.arch)
ctx.WriteString(a.name)
ctx.WriteUint32(uint32(len(a.paths)))
@@ -257,11 +306,26 @@ func (a *execArtifact) Params(ctx *IContext) {
}
}
// UnsupportedArchError describes an unsupported or invalid architecture.
type UnsupportedArchError string
func (e UnsupportedArchError) Error() string {
if e == "" {
return "invalid architecture name"
}
return "unsupported architecture " + string(e)
}
// readExecArtifact interprets IR values and returns the address of execArtifact
// or execNetArtifact.
func readExecArtifact(r *IRReader, net bool) Artifact {
r.DiscardAll()
arch := r.ReadString()
if arch == "" {
panic(UnsupportedArchError(arch))
}
name := r.ReadString()
sz := r.ReadUint32()
@@ -312,22 +376,17 @@ func readExecArtifact(r *IRReader, net bool) Artifact {
exclusive := r.ReadUint32() != 0
checksum, ok := r.Finalise()
var checksumP *Checksum
if net {
if !ok {
panic(ErrExpectedChecksum)
}
checksumVal := checksum.Value()
checksumP = &checksumVal
} else {
if ok {
panic(ErrUnexpectedChecksum)
}
if ok {
checksumP = new(checksum.Value())
}
if net && !ok {
panic(ErrExpectedChecksum)
}
return NewExec(
name, checksumP, timeout, exclusive, dir, env, pathname, args, paths...,
name, arch, checksumP, timeout, net, exclusive, dir, env, pathname, args, paths...,
)
}
@@ -442,6 +501,17 @@ func (a *execArtifact) makeContainer(
z.Env = slices.Concat(a.env, []string{EnvJobs + "=" + strconv.Itoa(jobs)})
z.Grow(len(a.paths) + 4)
if a.arch != runtime.GOARCH {
binfmtMu.RLock()
e, ok := binfmt[a.arch]
binfmtMu.RUnlock()
if !ok {
return nil, UnsupportedArchError(a.arch)
}
z.Binfmt = []container.BinfmtEntry{e}
z.InitAsRoot = true
}
for i, b := range a.paths {
if i == overlayWorkIndex {
if err = os.MkdirAll(work.String(), 0700); err != nil {
@@ -528,9 +598,9 @@ func (c *Cache) EnterExec(
case *execArtifact:
e = f
case *execNetArtifact:
case *execMeasuredArtifact:
e = &f.execArtifact
hostNet = true
hostNet = f.hostNet
default:
return ErrNotExec


@@ -1,6 +1,7 @@
package pkg_test
import (
"bytes"
_ "embed"
"encoding/gob"
"errors"
@@ -8,12 +9,14 @@ import (
"net"
"os"
"os/exec"
"path/filepath"
"slices"
"testing"
"unique"
"hakurei.app/check"
"hakurei.app/container"
"hakurei.app/hst"
"hakurei.app/internal/info"
"hakurei.app/internal/pkg"
"hakurei.app/internal/stub"
@@ -27,6 +30,17 @@ import (
//go:embed internal/testtool/testtool
var testtoolBin []byte
func init() {
pathname, err := filepath.Abs("internal/testtool/testtool")
if err != nil {
panic(err)
}
pkg.RegisterArch("cafe", container.BinfmtEntry{
Magic: expected.Magic,
Interpreter: check.MustAbs(pathname),
})
}
func TestExec(t *testing.T) {
t.Parallel()
@@ -36,14 +50,21 @@ func TestExec(t *testing.T) {
"check": {Mode: 0400, Data: []byte{0}},
}
wantOfflineEncode := pkg.Encode(wantOffline.hash())
failingArtifact := &stubArtifact{
kind: pkg.KindTar,
params: []byte("doomed artifact"),
cure: func(t *pkg.TContext) error {
return stub.UniqueError(0xcafe)
},
}
checkWithCache(t, []cacheTestCase{
{"offline", pkg.CValidateKnown, nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
{"offline", pkg.CValidateKnown | checkDestroySubstitutes, nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
testtool, testtoolDestroy := newTesttool()
cureMany(t, c, []cureStep{
{"container", pkg.NewExec(
"exec-offline", nil, 0, false,
"exec-offline", "", new(wantOffline.hash()), 0, false, false,
pkg.AbsWork,
[]string{"HAKUREI_TEST=1"},
check.MustAbs("/opt/bin/testtool"),
@@ -65,31 +86,47 @@ func TestExec(t *testing.T) {
pkg.MustPath("/opt", false, testtool),
), ignorePathname, wantOffline, nil},
{"error passthrough", pkg.NewExec(
"", nil, 0, true,
{"substitution", pkg.NewExec(
"exec-offline", "", new(wantOffline.hash()), 0, false, false,
pkg.AbsWork,
[]string{"HAKUREI_TEST=1"},
check.MustAbs("/opt/bin/testtool"),
[]string{"testtool"},
pkg.MustPath("/proc/nonexistent", false, &stubArtifact{
pkg.MustPath("/file", false, newStubFile(
pkg.KindHTTPGet,
pkg.ID{0xfe, 0},
nil,
nil, nil,
)),
// substitution miss fails in testtool due to differing idents
pkg.MustPath("/.hakurei", false, &stubArtifact{
kind: pkg.KindTar,
params: []byte("doomed artifact"),
params: []byte("empty directory (substituted)"),
cure: func(t *pkg.TContext) error {
return stub.UniqueError(0xcafe)
return os.MkdirAll(t.GetWorkDir().String(), 0700)
},
}),
pkg.MustPath("/opt", false, testtool),
), ignorePathname, wantOffline, nil},
{"error passthrough", pkg.NewExec(
"", "", nil, 0, false, true,
pkg.AbsWork,
[]string{"HAKUREI_TEST=1"},
check.MustAbs("/opt/bin/testtool"),
[]string{"testtool"},
pkg.MustPath("/proc/nonexistent", false, failingArtifact),
), nil, nil, &pkg.DependencyCureError{
{
Ident: unique.Make(pkg.ID(pkg.MustDecode(
"Sowo6oZRmG6xVtUaxB6bDWZhVsqAJsIJWUp0OPKlE103cY0lodx7dem8J-qQF0Z1",
))),
A: failingArtifact,
Err: stub.UniqueError(0xcafe),
},
}},
{"invalid paths", pkg.NewExec(
"", nil, 0, false,
"", "", nil, 0, false, false,
pkg.AbsWork,
[]string{"HAKUREI_TEST=1"},
check.MustAbs("/opt/bin/testtool"),
@@ -100,18 +137,41 @@ func TestExec(t *testing.T) {
})
// check init failure passthrough
var exitError *exec.ExitError
if _, _, err := c.Cure(pkg.NewExec(
"", nil, 0, false,
initFailureArtifact := pkg.NewExec(
"", "", nil, 0, false, false,
pkg.AbsWork,
nil,
check.MustAbs("/opt/bin/testtool"),
[]string{"testtool"},
)); !errors.As(err, &exitError) ||
)
var exitError *exec.ExitError
if _, _, err := c.Cure(initFailureArtifact); !errors.As(err, &exitError) ||
exitError.ExitCode() != hst.ExitFailure {
t.Fatalf("Cure: error = %v, want init exit status 1", err)
}
var faultStatus []byte
if faults, err := c.ReadFaults(initFailureArtifact); err != nil {
t.Fatal(err)
} else if len(faults) != 1 {
t.Fatalf("ReadFaults: %v", faults)
} else if faultStatus, err = os.ReadFile(faults[0].String()); err != nil {
t.Fatal(err)
} else if err = faults[0].Destroy(); err != nil {
t.Fatal(err)
} else {
t.Logf("destroyed expected fault at %s", faults[0].Time().UTC())
}
if !bytes.HasPrefix(faultStatus, []byte(
"internal/pkg ",
)) || !bytes.Contains(faultStatus, []byte(
"\ninit: fork/exec /opt/bin/testtool: no such file or directory\n",
)) {
t.Errorf("unexpected status:\n%s", string(faultStatus))
}
destroyStatus(t, base, 2, 1)
testtoolDestroy(t, base, c)
}, expectsFS{
".": {Mode: fs.ModeDir | 0700},
@@ -123,15 +183,19 @@ func TestExec(t *testing.T) {
"checksum/OLBgp1GsljhM2TJ-sbHjaiH9txEUvgdDTAzHv2P24donTt6_529l-9Ua0vFImLlb": {Mode: 0400, Data: []byte{}},
"identifier": {Mode: fs.ModeDir | 0700},
"identifier/IY91PCtOpCYy21AaIK0c9f8-Z6fb2_2ewoHWkt4dxoLf0GOrWqS8yAGFLV84b1Dw": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/" + wantOfflineEncode)},
"identifier/QwS7SmiatdqryQYgESdGw7Yw2PcpNf0vNfpvUA0t92BTlKiUjfCrXyMW17G2X77X": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"identifier/_gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/OLBgp1GsljhM2TJ-sbHjaiH9txEUvgdDTAzHv2P24donTt6_529l-9Ua0vFImLlb")},
"identifier/" + expected.Offline: {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/" + wantOfflineEncode)},
"identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"substitute": {Mode: fs.ModeDir | 0700},
"temp": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
}},
{"net", pkg.CValidateKnown, nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
{"net", pkg.CValidateKnown | checkDestroySubstitutes, nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
testtool, testtoolDestroy := newTesttool()
wantNet := expectsFS{
@@ -141,7 +205,7 @@ func TestExec(t *testing.T) {
}
cureMany(t, c, []cureStep{
{"container", pkg.NewExec(
"exec-net", new(wantNet.hash()), 0, false,
"exec-net", "", new(wantNet.hash()), 0, true, false,
pkg.AbsWork,
[]string{"HAKUREI_TEST=1"},
check.MustAbs("/opt/bin/testtool"),
@@ -164,6 +228,7 @@ func TestExec(t *testing.T) {
), ignorePathname, wantNet, nil},
})
destroyStatus(t, base, 2, 0)
testtoolDestroy(t, base, c)
}, expectsFS{
".": {Mode: fs.ModeDir | 0700},
@@ -179,16 +244,18 @@ func TestExec(t *testing.T) {
"identifier/_gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/OLBgp1GsljhM2TJ-sbHjaiH9txEUvgdDTAzHv2P24donTt6_529l-9Ua0vFImLlb")},
"identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"substitute": {Mode: fs.ModeDir | 0700},
"temp": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
}},
{"overlay root", pkg.CValidateKnown, nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
{"overlay root", pkg.CValidateKnown | checkDestroySubstitutes, nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
testtool, testtoolDestroy := newTesttool()
cureMany(t, c, []cureStep{
{"container", pkg.NewExec(
"exec-overlay-root", nil, 0, false,
"exec-overlay-root", "", nil, 0, false, false,
pkg.AbsWork,
[]string{"HAKUREI_TEST=1", "HAKUREI_ROOT=1"},
check.MustAbs("/opt/bin/testtool"),
@@ -205,6 +272,7 @@ func TestExec(t *testing.T) {
), ignorePathname, wantOffline, nil},
})
destroyStatus(t, base, 2, 0)
testtoolDestroy(t, base, c)
}, expectsFS{
".": {Mode: fs.ModeDir | 0700},
@@ -218,16 +286,18 @@ func TestExec(t *testing.T) {
"identifier/" + expected.OvlRoot: {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/" + wantOfflineEncode)},
"identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"substitute": {Mode: fs.ModeDir | 0700},
"temp": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
}},
{"overlay work", pkg.CValidateKnown, nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
{"overlay work", pkg.CValidateKnown | checkDestroySubstitutes, nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
testtool, testtoolDestroy := newTesttool()
cureMany(t, c, []cureStep{
{"container", pkg.NewExec(
"exec-overlay-work", nil, 0, false,
"exec-overlay-work", "", nil, 0, false, false,
pkg.AbsWork,
[]string{"HAKUREI_TEST=1", "HAKUREI_ROOT=1"},
check.MustAbs("/work/bin/testtool"),
@@ -249,6 +319,7 @@ func TestExec(t *testing.T) {
), ignorePathname, wantOffline, nil},
})
destroyStatus(t, base, 2, 0)
testtoolDestroy(t, base, c)
}, expectsFS{
".": {Mode: fs.ModeDir | 0700},
@@ -262,16 +333,18 @@ func TestExec(t *testing.T) {
"identifier/" + expected.Work: {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/" + wantOfflineEncode)},
"identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"substitute": {Mode: fs.ModeDir | 0700},
"temp": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
}},
{"multiple layers", pkg.CValidateKnown, nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
{"multiple layers", pkg.CValidateKnown | checkDestroySubstitutes, nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
testtool, testtoolDestroy := newTesttool()
cureMany(t, c, []cureStep{
{"container", pkg.NewExec(
"exec-multiple-layers", nil, 0, false,
"exec-multiple-layers", "", nil, 0, false, false,
pkg.AbsWork,
[]string{"HAKUREI_TEST=1", "HAKUREI_ROOT=1"},
check.MustAbs("/opt/bin/testtool"),
@@ -315,6 +388,7 @@ func TestExec(t *testing.T) {
), ignorePathname, wantOffline, nil},
})
destroyStatus(t, base, 2, 0)
testtoolDestroy(t, base, c)
}, expectsFS{
".": {Mode: fs.ModeDir | 0700},
@@ -333,16 +407,18 @@ func TestExec(t *testing.T) {
"identifier/" + expected.Layers: {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/" + wantOfflineEncode)},
"identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"substitute": {Mode: fs.ModeDir | 0700},
"temp": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
}},
{"overlay layer promotion", pkg.CValidateKnown, nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
{"overlay layer promotion", pkg.CValidateKnown | checkDestroySubstitutes, nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
testtool, testtoolDestroy := newTesttool()
cureMany(t, c, []cureStep{
{"container", pkg.NewExec(
"exec-layer-promotion", nil, 0, true,
"exec-layer-promotion", "", nil, 0, false, true,
pkg.AbsWork,
[]string{"HAKUREI_TEST=1", "HAKUREI_ROOT=1"},
check.MustAbs("/opt/bin/testtool"),
@@ -365,6 +441,7 @@ func TestExec(t *testing.T) {
), ignorePathname, wantOffline, nil},
})
destroyStatus(t, base, 2, 0)
testtoolDestroy(t, base, c)
}, expectsFS{
".": {Mode: fs.ModeDir | 0700},
@@ -379,6 +456,75 @@ func TestExec(t *testing.T) {
"identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"identifier/" + expected.Promote: {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/" + wantOfflineEncode)},
"substitute": {Mode: fs.ModeDir | 0700},
"temp": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
}},
{"binfmt", pkg.CValidateKnown | checkDestroySubstitutes, nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
if info.CanDegrade && os.Getenv("ROSA_SKIP_BINFMT") != "" {
t.Skip("binfmt_misc test explicitly skipped")
}
cureMany(t, c, []cureStep{
{"container", pkg.NewExec(
"exec-binfmt", "cafe", nil, 0, false, true,
pkg.AbsWork,
[]string{"HAKUREI_TEST=1", "HAKUREI_BINFMT=1"},
check.MustAbs("/opt/bin/sample"),
[]string{"sample"},
pkg.MustPath("/", true, &stubArtifact{
kind: pkg.KindTar,
params: []byte("empty directory"),
cure: func(t *pkg.TContext) error {
return os.MkdirAll(t.GetWorkDir().String(), 0700)
},
}),
pkg.MustPath("/opt", false, overrideIdent{pkg.ID{0xfe, 0xff}, &stubArtifact{
kind: pkg.KindTar,
cure: func(t *pkg.TContext) error {
work := t.GetWorkDir()
if err := os.MkdirAll(
work.Append("bin").String(),
0700,
); err != nil {
return err
}
return os.WriteFile(t.GetWorkDir().Append(
"bin",
"sample",
).String(), []byte(expected.Full), 0500)
},
}}),
), ignorePathname, expectsFS{
".": {Mode: fs.ModeDir | 0500},
"check": {Mode: 0400, Data: []byte("binfmt")},
}, nil},
})
destroyStatus(t, base, 2, 0)
}, expectsFS{
".": {Mode: fs.ModeDir | 0700},
"checksum": {Mode: fs.ModeDir | 0700},
"checksum/5aevg3YpDxjqQZ-pdvXK7YqgkL5JKqcoStYQxeD96kuYar6K2mRQWMHib6NQRnpV": {Mode: fs.ModeDir | 0500},
"checksum/5aevg3YpDxjqQZ-pdvXK7YqgkL5JKqcoStYQxeD96kuYar6K2mRQWMHib6NQRnpV/bin": {Mode: fs.ModeDir | 0700},
"checksum/5aevg3YpDxjqQZ-pdvXK7YqgkL5JKqcoStYQxeD96kuYar6K2mRQWMHib6NQRnpV/bin/sample": {Mode: 0500, Data: []byte("\xca\xfe\xba\xbe\xfd\xfd:3")},
"checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU": {Mode: fs.ModeDir | 0500},
"checksum/UnDo4B5KneEUY5b4vRUk_y9MWgkWuw2N8f8a2XayO686xXur-aZmX2-7n_8tKMe3": {Mode: fs.ModeDir | 0500},
"checksum/UnDo4B5KneEUY5b4vRUk_y9MWgkWuw2N8f8a2XayO686xXur-aZmX2-7n_8tKMe3/check": {Mode: 0400, Data: []byte("binfmt")},
"identifier": {Mode: fs.ModeDir | 0700},
"identifier/6VQTJ1lI5BmVuI1YFYJ8ClO3MRORvTTrcWFDcUU-l5Ga8EofxCxGlSTYN-u8dKj_": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/UnDo4B5KneEUY5b4vRUk_y9MWgkWuw2N8f8a2XayO686xXur-aZmX2-7n_8tKMe3")},
"identifier/_v8AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/5aevg3YpDxjqQZ-pdvXK7YqgkL5JKqcoStYQxeD96kuYar6K2mRQWMHib6NQRnpV")},
"identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"substitute": {Mode: fs.ModeDir | 0700},
"temp": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
}},


@@ -29,6 +29,8 @@ func TestFile(t *testing.T) {
"identifier": {Mode: fs.ModeDir | 0700},
"identifier/3376ALA7hIUm2LbzH2fDvRezgzod1eTK_G6XjyOgbM2u-6swvkFaF0BOwSl_juBi": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/vsAhtPNo4waRNOASwrQwcIPTqb3SBuJOXw2G4T1mNmVZM-wrQTRllmgXqcIIoRcX")},
"substitute": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
}},
})


@@ -1,2 +0,0 @@
// Package expected contains expected identifiers for exec artifact tests.
package expected


@@ -0,0 +1,9 @@
// Package expected contains data shared between test helper and test harness.
package expected
const (
// Magic are magic bytes in the binfmt test case.
Magic = "\xca\xfe\xba\xbe\xfd\xfd"
// Full is the full content of the binfmt test case executable.
Full = Magic + ":3"
)
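binfmt_misc selects an interpreter by comparing a file's leading bytes against each registered magic, which is why the test executable's content begins with Magic. A minimal sketch of that prefix match against the constants above; the `matches` helper is illustrative, not kernel code:

```go
package main

import (
	"bytes"
	"fmt"
)

const (
	// Magic mirrors the binfmt test case's magic bytes.
	Magic = "\xca\xfe\xba\xbe\xfd\xfd"
	// Full mirrors the full content of the test executable.
	Full = Magic + ":3"
)

// matches reports whether data would be claimed by a binfmt_misc entry
// with the given magic at offset zero, as in these tests.
func matches(data []byte, magic string) bool {
	return bytes.HasPrefix(data, []byte(magic))
}

func main() {
	fmt.Println(matches([]byte(Full), Magic))        // true
	fmt.Println(matches([]byte("#!/bin/sh"), Magic)) // false
}
```

When the magic matches, the kernel invokes the registered interpreter with the matched file's path appended, which is what the testtool's HAKUREI_BINFMT branch asserts via its expected argv.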


@@ -1,10 +1,10 @@
package expected
const (
Offline = "dztPS6jRjiZtCF4_p8AzfnxGp6obkhrgFVsxdodbKWUoAEVtDz3MykepJB4kI_ks"
OvlRoot = "RdMA-mubnrHuu3Ky1wWyxauSYCO0ZH_zCPUj3uDHqkfwv5sGcByoF_g5PjlGiClb"
Layers = "p1t_drXr34i-jZNuxDMLaMOdL6tZvQqhavNafGynGqxOZoXAUTSn7kqNh3Ovv3DT"
Net = "G8qPxD9puvvoOVV7lrT80eyDeIl3G_CCFoKw12c8mCjMdG1zF7NEPkwYpNubClK3"
Promote = "xXTIYcXmgJWNLC91c417RRrNM9cjELwEZHpGvf8Fk_GNP5agRJp_SicD0w9aMeLJ"
Work = "5hlaukCirnXE4W_RSLJFOZN47Z5RiHnacXzdFp_70cLgiJUGR6cSb_HaFftkzi0-"
Offline = "q5ktDTq0miP-VvB2blxqXQeaRXCUWgP_KbC18KNtUDtyoaI_h5mHmGuPMArVEBDs"
OvlRoot = "NacZGXwuRkTvcHaG08a22ujJ8qCWN0RSoFlRSR5FSt0ZcBbJ28FRvkYsHEtX7G8i"
Layers = "WBJDrATtX6rIE5yAu8ePX3WmDF0Tt9kFiue0m3cRnyRoVx1my8a67fh3CAW486oP"
Net = "CmYtj2sNB3LHtqiDuck_Lz3MjLLIiwyP8N4NDitQ1Icvv__LVP9p8tm-sHeQaKKp"
Promote = "TX3eCloaQFkV-SZIH6Jg6E5WKH--rcXY1P0jnZKmLFKWrNqnOzd4G9eIBh6i5ywN"
Work = "OuNiLSC68pZhAOr1YQ4WbV1tzASA0nxLEBcK7lO7MqxDY_j8dmP_C612RTuF23Lu"
)


@@ -0,0 +1,10 @@
package expected
const (
Offline = "WapqyoPxbWSnq07dWHt71mHaJXq99pAjJfFlELlJljSiZMhTFqqlzU1_mN86shSj"
OvlRoot = "V9anFOiRvjGfAeBhLl14AL8TKdWZyD0WTPYe4fS9mOBw8iW5Lmarvt6TG6MV8uWm"
Layers = "tKx7JNRoSBdK_7MdzI-nwTNV2wmiPzwYdcd17oLmXKL_iLmUzUiA79qTqdrTasrv"
Net = "aXyDLzBCJ9XltXZIfetEVsEkrqHfcXuD5XE_FcUnYbN3emwL55N6P8LlHzNfGnM5"
Promote = "3k4V16n96Lq04gjFSKmm4sFjyQ883FFBNXgTy9s_DjeTwxT3pg_iacEh8yMb_S4m"
Work = "6Q49MhFWRE3Ne6MycwAotgl1GtoU5WCHqJNWG2byYZCY-zX-IxPrWiKk7bKkNzhE"
)


@@ -0,0 +1,10 @@
package expected
const (
Offline = "Z6yXE5gOJScL3srmnVMWgCXccDiUNZ5snSrf6RkXuU1_U0rX_kGVwsfHUgNG_awd"
OvlRoot = "zYXJHFRLuxvUhuisZEXgGgVvdQd6piMfp5jmtT6jdVjvC2gICXquOq-UTwlrSD5I"
Layers = "_F8EDazHbcLeT0sVSQXRN_kn9IjduqJcDYgzXpsT-hpKU4EBcZ0PISN2zchpqMbm"
Net = "CA_FAaSIYJgapBEHV40doxpH23PdUEy_6s1TZc7wfSPN0XYqwGpMceXXDSabGveO"
Promote = "_3LPrLp--4h9k4GsNNApu9hHtAafq-GUhfU6d4hJKBDKT3bz_szOsvkXxc5sK53d"
Work = "FEgHeiCD_WT4wsfB-9kDH5n6cRWCEYtJmXdKZgmUUukAOoXumH_hLlosXREC-tqq"
)


@@ -24,6 +24,19 @@ func main() {
log.SetFlags(0)
log.SetPrefix("testtool: ")
if os.Getenv("HAKUREI_BINFMT") == "1" {
wantArgs := []string{"/interpreter", "/opt/bin/sample"}
if !slices.Equal(os.Args, wantArgs) {
log.Fatalf("Args: %q, want %q", os.Args, wantArgs)
}
if err := os.WriteFile("check", []byte("binfmt"), 0400); err != nil {
log.Fatal(err)
}
return
}
environ := slices.DeleteFunc(slices.Clone(os.Environ()), func(s string) bool {
return s == "CURE_JOBS="+strconv.Itoa(runtime.NumCPU())
})


@@ -76,6 +76,9 @@ type IContext struct {
// Written to by various methods, should be zeroed after [Artifact.Params]
// returns and must not be exposed directly.
w io.Writer
// Optional [Artifact] to cureRes lookup table; replaces [IRKindIdent]
// identifier values with checksum values if non-nil.
inputs map[Artifact]cureRes
}
// irZero is a zero IR word.
@@ -163,7 +166,15 @@ func (i *IContext) WriteIdent(a Artifact) {
defer i.ic.putIdentBuf(buf)
IRKindIdent.encodeHeader(0).put(buf[:])
*(*ID)(buf[wordSize:]) = i.ic.Ident(a).Value()
if i.inputs != nil {
res, ok := i.inputs[a]
if !ok {
panic(InvalidLookupError(i.ic.Ident(a).Value()))
}
*(*ID)(buf[wordSize:]) = res.checksum.Value()
} else {
*(*ID)(buf[wordSize:]) = i.ic.Ident(a).Value()
}
i.mustWrite(buf[:])
}
@@ -207,19 +218,44 @@ func (i *IContext) WriteString(s string) {
// Encode writes a deterministic, efficient representation of a to w and returns
// the first non-nil error encountered while writing to w.
func (ic *irCache) Encode(w io.Writer, a Artifact) (err error) {
return ic.encode(w, a, nil)
}
// encode implements Encode but replaces identifiers with their cured checksums
// when inputs is non-nil. Caller must acquire Cache.identMu.
func (ic *irCache) encode(
w io.Writer,
a Artifact,
inputs map[Artifact]cureRes,
) (err error) {
deps := a.Dependencies()
idents := make([]*extIdent, len(deps))
for i, d := range deps {
dbuf, did := ic.unsafeIdent(d, true)
if dbuf == nil {
dbuf = ic.getIdentBuf()
binary.LittleEndian.PutUint64(dbuf[:], uint64(d.Kind()))
*(*ID)(dbuf[wordSize:]) = did.Value()
} else {
ic.storeIdent(d, dbuf)
if inputs == nil {
for i, d := range deps {
dbuf, did := ic.unsafeIdent(d, true)
if dbuf == nil {
dbuf = ic.getIdentBuf()
binary.LittleEndian.PutUint64(dbuf[:], uint64(d.Kind()))
*(*ID)(dbuf[wordSize:]) = did.Value()
} else {
ic.storeIdent(d, dbuf)
}
defer ic.putIdentBuf(dbuf)
idents[i] = dbuf
}
} else {
for i, d := range deps {
res, ok := inputs[d]
if !ok {
return InvalidLookupError(ic.Ident(d).Value())
}
dbuf := ic.getIdentBuf()
binary.LittleEndian.PutUint64(dbuf[:], uint64(d.Kind()))
*(*ID)(dbuf[wordSize:]) = res.checksum.Value()
defer ic.putIdentBuf(dbuf)
idents[i] = dbuf
}
defer ic.putIdentBuf(dbuf)
idents[i] = dbuf
}
slices.SortFunc(idents, func(a, b *extIdent) int {
return bytes.Compare(a[:], b[:])
@@ -244,7 +280,7 @@ func (ic *irCache) Encode(w io.Writer, a Artifact) (err error) {
}
func() {
i := IContext{ic, w}
i := IContext{ic, w, inputs}
defer panicToError(&err)
defer func() { i.ic, i.w = nil, nil }()
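The encode path above sorts the per-dependency identifier buffers with `bytes.Compare` before writing them out, so the resulting representation does not depend on the order in which dependencies were declared. A minimal standalone sketch of that determinism step (the `sortIdents` helper is hypothetical, standing in for the fixed-size `*extIdent` buffers used by the real code):

```go
package main

import (
	"bytes"
	"fmt"
	"slices"
)

// sortIdents orders identifier buffers bytewise so the encoded
// dependency list is identical regardless of declaration order.
func sortIdents(idents [][]byte) {
	slices.SortFunc(idents, func(a, b []byte) int {
		return bytes.Compare(a, b)
	})
}

func main() {
	idents := [][]byte{{0xff, 1}, {0x00, 9}, {0xff, 0}}
	sortIdents(idents)
	fmt.Println(idents) // sorted bytewise, lowest first
}
```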

View File

@@ -39,7 +39,7 @@ func TestIRRoundtrip(t *testing.T) {
)},
{"exec offline", pkg.NewExec(
"exec-offline", nil, 0, false,
"exec-offline", "", nil, 0, false, false,
pkg.AbsWork,
[]string{"HAKUREI_TEST=1"},
check.MustAbs("/opt/bin/testtool"),
@@ -59,9 +59,9 @@ func TestIRRoundtrip(t *testing.T) {
)},
{"exec net", pkg.NewExec(
"exec-net",
"exec-net", "",
(*pkg.Checksum)(bytes.Repeat([]byte{0xfc}, len(pkg.Checksum{}))),
0, false,
0, false, false,
pkg.AbsWork,
[]string{"HAKUREI_TEST=1"},
check.MustAbs("/opt/bin/testtool"),
@@ -80,6 +80,28 @@ func TestIRRoundtrip(t *testing.T) {
)),
)},
{"exec measured", pkg.NewExec(
"exec-measured", "",
(*pkg.Checksum)(bytes.Repeat([]byte{0xfd}, len(pkg.Checksum{}))),
0, false, false,
pkg.AbsWork,
[]string{"HAKUREI_TEST=1"},
check.MustAbs("/opt/bin/testtool"),
[]string{"testtool", "measured"},
pkg.MustPath("/file", false, pkg.NewFile("file", []byte(
"stub file",
))), pkg.MustPath("/.hakurei", false, pkg.NewHTTPGetTar(
nil, "file:///hakurei.tar",
pkg.Checksum(bytes.Repeat([]byte{0xfd}, len(pkg.Checksum{}))),
pkg.TarUncompressed,
)), pkg.MustPath("/opt", false, pkg.NewHTTPGetTar(
nil, "file:///testtool.tar.gz",
pkg.Checksum(bytes.Repeat([]byte{0xfd}, len(pkg.Checksum{}))),
pkg.TarGzip,
)),
)},
{"file anonymous", pkg.NewFile("", []byte{0})},
{"file", pkg.NewFile("stub", []byte("stub"))},
}
@@ -110,6 +132,7 @@ func TestIRRoundtrip(t *testing.T) {
".": {Mode: fs.ModeDir | 0700},
"checksum": {Mode: fs.ModeDir | 0700},
"identifier": {Mode: fs.ModeDir | 0700},
"substitute": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
},
}

View File

@@ -90,6 +90,7 @@ func TestHTTPGet(t *testing.T) {
".": {Mode: fs.ModeDir | 0700},
"checksum": {Mode: fs.ModeDir | 0700},
"identifier": {Mode: fs.ModeDir | 0700},
"substitute": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
}},
@@ -159,6 +160,8 @@ func TestHTTPGet(t *testing.T) {
"identifier": {Mode: fs.ModeDir | 0700},
"identifier/oM-2pUlk-mOxK1t3aMWZer69UdOQlAXiAgMrpZ1476VoOqpYVP1aGFS9_HYy-D8_": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/fLYGIMHgN1louE-JzITJZJo2SDniPu-IHBXubtvQWFO-hXnDVKNuscV7-zlyr5fU")},
"substitute": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
}},
})

View File

@@ -4,6 +4,7 @@ package pkg
import (
"bufio"
"bytes"
"cmp"
"context"
"crypto/sha512"
"encoding/base64"
@@ -15,6 +16,7 @@ import (
"io/fs"
"maps"
"os"
"os/signal"
"path/filepath"
"runtime"
"slices"
@@ -24,6 +26,7 @@ import (
"sync/atomic"
"syscall"
"testing"
"time"
"unique"
"unsafe"
@@ -247,7 +250,14 @@ func (t *TContext) destroy(errP *error) {
*errP = errors.Join(*errP, err)
}
if *errP != nil {
*errP = errors.Join(*errP, os.Remove(t.statusPath.String()))
*errP = errors.Join(*errP, os.Rename(
t.statusPath.String(), t.cache.base.Append(
dirFault,
t.ids+"."+strconv.FormatUint(uint64(
time.Now().UnixNano(),
), 10),
).String(),
))
}
t.status = nil
}
@@ -508,33 +518,34 @@ const (
)
const (
// fileLock is the file name appended to Cache.base for guaranteeing
// exclusive access to the cache directory.
// fileLock is the lock file for exclusive access to the cache directory.
fileLock = "lock"
// fileVariant is the file name appended to Cache.base holding the variant
// identification string set by a prior call to [SetExtension].
// fileVariant is a file holding the variant identification string set by a
// prior call to [SetExtension].
fileVariant = "variant"
// dirIdentifier is the directory name appended to Cache.base for storing
// artifacts named after their [ID].
// dirSubstitute holds symlinks to artifacts by checksum, named after their
// substitute identifier.
dirSubstitute = "substitute"
// dirIdentifier holds symlinks to artifacts by checksum, named after their
// IR-based identifier.
dirIdentifier = "identifier"
// dirChecksum is the directory name appended to Cache.base for storing
// artifacts named after their [Checksum].
// dirChecksum holds artifacts named after their [Checksum].
dirChecksum = "checksum"
// dirStatus is the directory name appended to Cache.base for storing
// artifact metadata and logs named after their [ID].
// dirStatus holds artifact metadata and logs named after their IR-based
// identifier. For [FloodArtifact], the same file is also available under
// its substitute identifier.
dirStatus = "status"
// dirFault holds status files of faulted cures.
dirFault = "fault"
// dirWork is the directory name appended to Cache.base for working
// pathnames set up during [Cache.Cure].
// dirWork holds working pathnames set up during [Cache.Cure].
dirWork = "work"
// dirTemp is the directory name appended to Cache.base for scratch space
// pathnames allocated during [Cache.Cure].
// dirTemp holds scratch space allocated during [Cache.Cure].
dirTemp = "temp"
// dirExecScratch is the directory name appended to Cache.base for scratch
// space setting up the container started by [Cache.EnterExec]. Exclusivity
// via Cache.inExec.
// dirExecScratch is scratch space set up for the container started by
// [Cache.EnterExec]. Exclusivity via Cache.inExec.
dirExecScratch = "scratch"
// checksumLinknamePrefix is prepended to the encoded [Checksum] value
@@ -621,6 +632,9 @@ const (
// CSuppressInit arranges for verbose output of the container init to be
// suppressed regardless of [message.Msg] state.
CSuppressInit
// CIgnoreSubstitutes disables content-based dependency substitution.
CIgnoreSubstitutes
)
// toplevel holds [context.WithCancel] over caller-supplied context, where all
@@ -676,6 +690,11 @@ type Cache struct {
// Synchronises access to dirChecksum.
checksumMu sync.RWMutex
// Presence of an alternative in the cache. Keys are not valid identifiers
// and must not be used as such.
substitute map[unique.Handle[ID]]unique.Handle[Checksum]
// Synchronises access to substitute and corresponding filesystem entries.
substituteMu sync.RWMutex
// Identifier to content pair cache.
ident map[unique.Handle[ID]]unique.Handle[Checksum]
// Identifier to error pair for unrecoverably faulted [Artifact].
@@ -886,11 +905,14 @@ func (c *Cache) Scrub(checks int) error {
checks = runtime.NumCPU()
}
c.substituteMu.Lock()
defer c.substituteMu.Unlock()
c.identMu.Lock()
defer c.identMu.Unlock()
c.checksumMu.Lock()
defer c.checksumMu.Unlock()
c.substitute = make(map[unique.Handle[ID]]unique.Handle[Checksum])
c.ident = make(map[unique.Handle[ID]]unique.Handle[Checksum])
c.identErr = make(map[unique.Handle[ID]]error)
c.artifact.Clear()
@@ -998,47 +1020,52 @@ func (c *Cache) Scrub(checks int) error {
wg.Wait()
}
dir = c.base.Append(dirIdentifier)
if entries, readdirErr := os.ReadDir(dir.String()); readdirErr != nil {
addErr(dir, readdirErr)
} else {
wg.Add(len(entries))
for _, ent := range entries {
w <- checkEntry{ent, func(ent os.DirEntry, want *Checksum) bool {
got := p.Get().(*Checksum)
defer p.Put(got)
for _, suffix := range []string{
dirSubstitute,
dirIdentifier,
} {
dir = c.base.Append(suffix)
if entries, readdirErr := os.ReadDir(dir.String()); readdirErr != nil {
addErr(dir, readdirErr)
} else {
wg.Add(len(entries))
for _, ent := range entries {
w <- checkEntry{ent, func(ent os.DirEntry, want *Checksum) bool {
got := p.Get().(*Checksum)
defer p.Put(got)
pathname := dir.Append(ent.Name())
if linkname, err := os.Readlink(
pathname.String(),
); err != nil {
seMu.Lock()
se.Errs[pathname.Handle()] = append(se.Errs[pathname.Handle()], err)
se.DanglingIdentifiers = append(se.DanglingIdentifiers, *want)
seMu.Unlock()
return false
} else if err = Decode(got, filepath.Base(linkname)); err != nil {
seMu.Lock()
lnp := dir.Append(linkname)
se.Errs[lnp.Handle()] = append(se.Errs[lnp.Handle()], err)
se.DanglingIdentifiers = append(se.DanglingIdentifiers, *want)
seMu.Unlock()
return false
}
if _, err := os.Stat(pathname.String()); err != nil {
if !errors.Is(err, os.ErrNotExist) {
addErr(pathname, err)
pathname := dir.Append(ent.Name())
if linkname, err := os.Readlink(
pathname.String(),
); err != nil {
seMu.Lock()
se.Errs[pathname.Handle()] = append(se.Errs[pathname.Handle()], err)
se.DanglingIdentifiers = append(se.DanglingIdentifiers, *want)
seMu.Unlock()
return false
} else if err = Decode(got, filepath.Base(linkname)); err != nil {
seMu.Lock()
lnp := dir.Append(linkname)
se.Errs[lnp.Handle()] = append(se.Errs[lnp.Handle()], err)
se.DanglingIdentifiers = append(se.DanglingIdentifiers, *want)
seMu.Unlock()
return false
}
seMu.Lock()
se.DanglingIdentifiers = append(se.DanglingIdentifiers, *want)
seMu.Unlock()
return false
}
return true
}}
if _, err := os.Stat(pathname.String()); err != nil {
if !errors.Is(err, os.ErrNotExist) {
addErr(pathname, err)
}
seMu.Lock()
se.DanglingIdentifiers = append(se.DanglingIdentifiers, *want)
seMu.Unlock()
return false
}
return true
}}
}
wg.Wait()
}
wg.Wait()
}
dir = c.base.Append(dirStatus)
@@ -1186,6 +1213,52 @@ func (c *Cache) finaliseIdent(
close(done)
}
// zeroChecksum is a zero [Checksum] handle, used for comparison only.
var zeroChecksum unique.Handle[Checksum]
// loadSubstitute returns a checksum corresponding to a substitute identifier,
// or zeroChecksum if an alternative is not available.
func (c *Cache) loadSubstitute(
substitute unique.Handle[ID],
) (unique.Handle[Checksum], error) {
c.substituteMu.RLock()
if checksum, ok := c.substitute[substitute]; ok {
c.substituteMu.RUnlock()
return checksum, nil
}
linkname, err := os.Readlink(c.base.Append(
dirSubstitute,
Encode(substitute.Value()),
).String())
c.substituteMu.RUnlock()
if err != nil {
if !errors.Is(err, os.ErrNotExist) {
return zeroChecksum, err
}
c.substituteMu.Lock()
c.substitute[substitute] = zeroChecksum
c.substituteMu.Unlock()
return zeroChecksum, nil
}
var checksum unique.Handle[Checksum]
buf := c.getIdentBuf()
err = Decode((*Checksum)(buf[:]), filepath.Base(linkname))
if err == nil {
checksum = unique.Make(Checksum(buf[:]))
c.substituteMu.Lock()
c.substitute[substitute] = checksum
c.substituteMu.Unlock()
}
c.putIdentBuf(buf)
return checksum, err
}
// Done returns a channel that is closed when the ongoing cure of an [Artifact]
// referred to by the specified identifier completes. Done may return nil if
// no ongoing cure of the specified identifier exists.
@@ -1420,8 +1493,8 @@ func (c *Cache) Cure(a Artifact) (
// CureError wraps a non-nil error returned attempting to cure an [Artifact].
type CureError struct {
Ident unique.Handle[ID]
Err error
A Artifact
Err error
}
// Unwrap returns the underlying error.
@@ -1434,40 +1507,63 @@ func (e *CureError) Error() string { return e.Err.Error() }
type DependencyCureError []*CureError
// unwrapM recursively expands underlying errors into a caller-supplied map.
func (e *DependencyCureError) unwrapM(me map[unique.Handle[ID]]*CureError) {
func (e *DependencyCureError) unwrapM(
ctx context.Context,
ir *IRCache,
me map[unique.Handle[ID]]*CureError,
) {
for _, err := range *e {
if _, ok := me[err.Ident]; ok {
if ctx.Err() != nil {
break
}
id := ir.Ident(err.A)
if _, ok := me[id]; ok {
continue
}
if _e, ok := err.Err.(*DependencyCureError); ok {
_e.unwrapM(me)
_e.unwrapM(ctx, ir, me)
continue
}
me[err.Ident] = err
me[id] = err
}
}
// unwrap recursively expands and deduplicates underlying errors.
func (e *DependencyCureError) unwrap() DependencyCureError {
func (e *DependencyCureError) unwrap(
ctx context.Context,
ir *IRCache,
) DependencyCureError {
me := make(map[unique.Handle[ID]]*CureError)
e.unwrapM(me)
errs := slices.AppendSeq(
make(DependencyCureError, 0, len(me)),
maps.Values(me),
)
e.unwrapM(ctx, ir, me)
type ent struct {
id unique.Handle[ID]
err *CureError
}
errs := make([]*ent, 0, len(me))
for id, err := range me {
errs = append(errs, &ent{id, err})
}
var identBuf [2]ID
slices.SortFunc(errs, func(a, b *CureError) int {
identBuf[0], identBuf[1] = a.Ident.Value(), b.Ident.Value()
slices.SortFunc(errs, func(a, b *ent) int {
identBuf[0], identBuf[1] = a.id.Value(), b.id.Value()
return slices.Compare(identBuf[0][:], identBuf[1][:])
})
return errs
_errs := make(DependencyCureError, len(errs))
for i, v := range errs {
_errs[i] = v.err
}
return _errs
}
// Unwrap returns a deduplicated slice of underlying errors.
func (e *DependencyCureError) Unwrap() []error {
errs := e.unwrap()
ctx, cancel := signal.NotifyContext(context.Background(), os.Interrupt)
defer cancel()
errs := e.unwrap(ctx, NewIR())
_errs := make([]error, len(errs))
for i, err := range errs {
_errs[i] = err
@@ -1477,14 +1573,23 @@ func (e *DependencyCureError) Unwrap() []error {
// Error returns a user-facing multiline error message.
func (e *DependencyCureError) Error() string {
errs := e.unwrap()
ctx, cancel := signal.NotifyContext(context.Background(), os.Interrupt)
defer cancel()
ir := NewIR()
errs := e.unwrap(ctx, ir)
if len(errs) == 0 {
return "invalid dependency cure outcome"
}
var buf strings.Builder
buf.WriteString("errors curing dependencies:")
for _, err := range errs {
buf.WriteString("\n\t" + Encode(err.Ident.Value()) + ": " + err.Error())
buf.WriteString("\n\t" +
reportName(err.A, ir.Ident(err.A)) + ": " +
err.Error())
}
if ctx.Err() != nil {
buf.WriteString("\nerror resolution cancelled")
}
return buf.String()
}
@@ -1654,16 +1759,44 @@ func (c *Cache) cure(a Artifact, curesExempt bool) (
return
}
var checksums string
var (
checksums string
substitute unique.Handle[ID]
alternative *check.Absolute
)
defer func() {
if err == nil && checksums != "" {
linkname := checksumLinknamePrefix + checksums
err = os.Symlink(
checksumLinknamePrefix+checksums,
linkname,
pathname.String(),
)
if err == nil {
err = zeroTimes(pathname.String())
}
if err == nil && alternative != nil {
c.substituteMu.Lock()
err = os.Symlink(
linkname,
alternative.String(),
)
if errors.Is(err, os.ErrExist) {
c.msg.Verbosef(
"creating alternative over %s for artifact %s",
Encode(substitute.Value()), ids,
)
err = nil
}
if err == nil {
err = zeroTimes(alternative.String())
}
if err == nil && checksum != zeroChecksum {
c.substitute[substitute] = checksum
}
c.substituteMu.Unlock()
}
}
}()
@@ -1860,11 +1993,64 @@ func (c *Cache) cure(a Artifact, curesExempt bool) (
f.deps[deps[i]] = p
}
sh := sha512.New384()
err = c.encode(sh, a, f.deps)
if err != nil {
return
}
buf := c.getIdentBuf()
sh.Sum(buf[wordSize:wordSize])
substitute = unique.Make(ID(buf[wordSize:]))
substitutes := Encode(substitute.Value())
c.putIdentBuf(buf)
alternative = c.base.Append(
dirSubstitute,
substitutes,
)
if c.flags&CIgnoreSubstitutes == 0 {
var substituteChecksum unique.Handle[Checksum]
substituteChecksum, err = c.loadSubstitute(substitute)
if err != nil {
return
}
if substituteChecksum != zeroChecksum {
checksum = substituteChecksum
checksums = Encode(checksum.Value())
checksumPathname = c.base.Append(
dirChecksum,
checksums,
)
if _, err = os.Lstat(c.base.Append(
dirStatus,
substitutes,
).String()); err == nil {
err = os.Symlink(substitutes, c.base.Append(
dirStatus,
ids,
).String())
} else if errors.Is(err, os.ErrNotExist) {
err = nil
}
return
}
}
defer f.destroy(&err)
if err = c.enterCure(a, curesExempt); err != nil {
return
}
err = ca.Cure(&f)
if err == nil && f.status != nil {
err = os.Link(c.base.Append(
dirStatus,
ids,
).String(), c.base.Append(
dirStatus,
substitutes,
).String())
}
c.exitCure(a, curesExempt)
if err != nil {
return
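The hunk above derives the substitute identifier by hashing the artifact's encoded IR with SHA-384 after replacing each dependency identifier with its cured checksum, so two artifacts whose inputs have identical content collide on the same substitute. A flattened sketch of that content-based derivation, skipping the IR walk (the `substituteID` helper is an assumption for illustration):

```go
package main

import (
	"crypto/sha512"
	"encoding/base64"
	"fmt"
)

// substituteID hashes an artifact's parameters together with the
// checksums of its cured dependencies; identical content yields an
// identical identifier regardless of how the dependencies were named.
func substituteID(params []byte, depChecksums [][]byte) string {
	h := sha512.New384()
	h.Write(params)
	for _, c := range depChecksums {
		h.Write(c)
	}
	return base64.RawURLEncoding.EncodeToString(h.Sum(nil))
}

func main() {
	id1 := substituteID([]byte("exec"), [][]byte{[]byte("dep-checksum")})
	id2 := substituteID([]byte("exec"), [][]byte{[]byte("dep-checksum")})
	fmt.Println(id1 == id2, len(id1)) // deterministic, 48 bytes base64-encoded
}
```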
@@ -1957,7 +2143,7 @@ func (pending *pendingArtifactDep) cure(c *Cache) {
}
pending.errsMu.Lock()
*pending.errs = append(*pending.errs, &CureError{c.Ident(pending.a), err})
*pending.errs = append(*pending.errs, &CureError{pending.a, err})
pending.errsMu.Unlock()
}
@@ -1973,6 +2159,52 @@ func (c *Cache) OpenStatus(a Artifact) (r io.ReadSeekCloser, err error) {
return
}
// Fault holds the pathname and termination time of an [Artifact] fault entry.
type Fault struct {
*check.Absolute
t uint64
}
// Time returns the instant at which the fault occurred.
func (f Fault) Time() time.Time { return time.Unix(0, int64(f.t)) }
// Open opens the underlying entry for reading.
func (f Fault) Open() (io.ReadCloser, error) { return os.Open(f.Absolute.String()) }
// Destroy removes the underlying fault entry.
func (f Fault) Destroy() error { return os.Remove(f.Absolute.String()) }
// ReadFaults returns fault entries for an [Artifact].
func (c *Cache) ReadFaults(a Artifact) (faults []Fault, err error) {
prefix := Encode(c.Ident(a).Value()) + "."
var dents []os.DirEntry
if dents, err = os.ReadDir(c.base.Append(dirFault).String()); err != nil {
return
}
for _, dent := range dents {
name := dent.Name()
if !strings.HasPrefix(name, prefix) {
continue
}
var t uint64
t, err = strconv.ParseUint(name[len(prefix):], 10, 64)
if err != nil {
return
}
faults = append(faults, Fault{c.base.Append(
dirFault,
name,
), t})
}
slices.SortFunc(faults, func(a, b Fault) int {
return cmp.Compare(a.t, b.t)
})
return
}
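ReadFaults above matches fault entries named `<ident>.<unixnano>` (the naming scheme established by destroy, which renames faulted status files with a nanosecond timestamp suffix) and sorts them chronologically. A self-contained sketch of that name parsing (the `parseFaultTimes` helper is hypothetical):

```go
package main

import (
	"fmt"
	"sort"
	"strconv"
	"strings"
	"time"
)

// parseFaultTimes extracts nanosecond timestamps from fault entry
// names of the form "<ident>.<unixnano>" for one identifier and
// returns them in chronological order.
func parseFaultTimes(names []string, ident string) ([]time.Time, error) {
	prefix := ident + "."
	var out []time.Time
	for _, name := range names {
		if !strings.HasPrefix(name, prefix) {
			continue // entry belongs to another artifact
		}
		t, err := strconv.ParseUint(name[len(prefix):], 10, 64)
		if err != nil {
			return nil, err
		}
		out = append(out, time.Unix(0, int64(t)))
	}
	sort.Slice(out, func(i, j int) bool { return out[i].Before(out[j]) })
	return out, nil
}

func main() {
	ts, err := parseFaultTimes([]string{"abc.200", "abc.100", "other.50"}, "abc")
	fmt.Println(len(ts), ts[0].UnixNano(), ts[1].UnixNano(), err)
}
```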
// Abort cancels all pending cures and waits for them to clean up, but does not
// close the cache.
func (c *Cache) Abort() {
@@ -2072,9 +2304,11 @@ func open(
}
for _, name := range []string{
dirSubstitute,
dirIdentifier,
dirChecksum,
dirStatus,
dirFault,
dirWork,
} {
if err := os.MkdirAll(
@@ -2097,6 +2331,7 @@ func open(
irCache: zeroIRCache(),
substitute: make(map[unique.Handle[ID]]unique.Handle[Checksum]),
ident: make(map[unique.Handle[ID]]unique.Handle[Checksum]),
identErr: make(map[unique.Handle[ID]]error),
identPending: make(map[unique.Handle[ID]]*pendingCure),

View File

@@ -21,6 +21,7 @@ import (
"syscall"
"testing"
"testing/fstest"
"time"
"unique"
"unsafe"
@@ -244,6 +245,41 @@ func newDestroyArtifactFunc(a pkg.Artifact) func(
}
}
// destroyStatus counts non-substitution status entries and destroys them.
func destroyStatus(t *testing.T, base *check.Absolute, c, s int) {
dents, err := os.ReadDir(base.Append("status").String())
if err != nil {
t.Fatal(err)
}
var gotC, gotS int
for _, dent := range dents {
if err = os.Remove(base.Append(
"status",
dent.Name(),
).String()); err != nil {
t.Fatal(err)
}
if dent.Type().IsRegular() {
gotC++
continue
}
if dent.Type()&fs.ModeSymlink == fs.ModeSymlink {
gotS++
continue
}
t.Errorf("%s: %s", dent.Name(), dent.Type())
}
if gotC != c {
t.Errorf("status: c = %d, want %d", gotC, c)
}
if gotS != s {
t.Errorf("status: s = %d, want %d", gotS, s)
}
}
func TestIdent(t *testing.T) {
t.Parallel()
@@ -392,6 +428,12 @@ type cacheTestCase struct {
want expectsFS
}
const (
// checkDestroySubstitutes arranges for substitutes to be destroyed before
// measurement during checkWithCache.
checkDestroySubstitutes = 1 << (iota + 32)
)
// checkWithCache runs a slice of cacheTestCase.
func checkWithCache(t *testing.T, testCases []cacheTestCase) {
t.Helper()
@@ -472,10 +514,27 @@ func checkWithCache(t *testing.T, testCases []cacheTestCase) {
}
}
// destroy non-deterministic status files
if err := os.RemoveAll(base.Append("status").String()); err != nil {
// destroy non-deterministic substitutes
if tc.flags&checkDestroySubstitutes != 0 {
substitute := base.Append("substitute")
if err := os.RemoveAll(substitute.String()); err != nil {
t.Fatal(err)
} else if err = os.Mkdir(substitute.String(), 0700); err != nil {
t.Fatal(err)
}
}
// destroy empty status directory
if err := syscall.Rmdir(base.Append("status").String()); err != nil {
t.Error(expectsFrom(base.Append("status").String()))
t.Fatal(err)
}
// destroy empty fault directory
if err := os.Remove(base.Append("fault").String()); err != nil {
t.Fatal(err)
}
want := tc.want.hash()
var checksum pkg.Checksum
@@ -539,6 +598,21 @@ func cureMany(t *testing.T, c *pkg.Cache, steps []cureStep) {
for _, step := range steps {
t.Log("cure step:", step.name)
if pathname, checksum, err := c.Cure(step.a); !reflect.DeepEqual(err, step.err) {
faults, _err := c.ReadFaults(step.a)
if _err != nil {
t.Errorf("ReadFaults: error = %v", _err)
}
var p []byte
for _, fault := range faults {
p, _err = os.ReadFile(fault.String())
if _err != nil {
t.Error(_err)
continue
}
t.Log(string(p))
t.Logf("faulting cure terminated %s ago", time.Since(fault.Time()))
}
t.Fatalf("Cure: error = %v, want %v", err, step.err)
} else if step.pathname != ignorePathname && !pathname.Is(step.pathname) {
t.Fatalf("Cure: pathname = %q, want %q", pathname, step.pathname)
@@ -635,6 +709,15 @@ func TestCache(t *testing.T) {
"identifier",
"cafebabecafebabecafebabecafebabecafebabecafebabecafebabecafebabe",
)
failingFile := newStubFile(
pkg.KindHTTPGet,
pkg.ID{0xff, 3},
nil,
nil, struct {
_ []byte
stub.UniqueError
}{UniqueError: 0xbad},
)
cureMany(t, c, []cureStep{
{"initial file", newStubFile(
@@ -716,22 +799,14 @@ func TestCache(t *testing.T) {
{"noncomparable error", &stubArtifactF{
kind: pkg.KindExec,
params: []byte("artifact with dependency returning noncomparable error"),
deps: []pkg.Artifact{newStubFile(
pkg.KindHTTPGet,
pkg.ID{0xff, 3},
nil,
nil, struct {
_ []byte
stub.UniqueError
}{UniqueError: 0xbad},
)},
deps: []pkg.Artifact{failingFile},
cure: func(f *pkg.FContext) error {
panic("attempting to cure impossible artifact")
},
}, nil, nil, &pkg.DependencyCureError{
{
Ident: unique.Make(pkg.ID{0xff, 3}),
A: failingFile,
Err: struct {
_ []byte
stub.UniqueError
@@ -788,6 +863,8 @@ func TestCache(t *testing.T) {
"identifier/cafebabecafebabecafebabecafebabecafebabecafebabecafebabecafebabe": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/0bSFPu5Tnd-2Jj0Mv6co23PW2t3BmHc7eLFj9TgY3eIBg8zislo7xZYNBqovVLcq")},
"identifier/deadbeefdeadbeefdeadbeefdeadbeefdeadbeefdeadbeefdeadbeefdeadbeef": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/0bSFPu5Tnd-2Jj0Mv6co23PW2t3BmHc7eLFj9TgY3eIBg8zislo7xZYNBqovVLcq")},
"substitute": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
}},
@@ -972,11 +1049,87 @@ func TestCache(t *testing.T) {
}}, nil, nil, pkg.InvalidFileModeError(
fs.ModeSymlink | 0777,
)},
{"alternative", &stubArtifactF{
kind: pkg.KindExec,
params: []byte("substitutable artifact"),
deps: []pkg.Artifact{newStubFile(
pkg.KindHTTPGet,
pkg.ID{0xff, 8},
nil,
[]byte("substitutable dependency"),
nil,
)},
cure: func(f *pkg.FContext) error {
return makeSample(&f.TContext)
},
}, base.Append(
"identifier",
"xMDWovje7OfyIaDy_2VnjpKxRqSOQ_LoeD946t-3WsS2V2SeMJ7nDGrNfpa4Pbc-",
), want, nil},
{"substitutable", &stubArtifactF{
kind: pkg.KindExec,
params: []byte("substitutable artifact"),
deps: []pkg.Artifact{newStubFile(
pkg.KindHTTPGet,
pkg.ID{0xff, 10},
nil,
[]byte("substitutable dependency"),
nil,
)},
cure: func(f *pkg.FContext) error {
panic("substitution missed")
},
}, base.Append(
"identifier",
"k2ilgG5KQ9NXnMoT2oB6NdwOnSPRn_H24oXQc4l6qOYIxIG9XfuEczeyrR8UEv_f",
), want, nil},
})
if c0, err := unsafeOpen(
t.Context(),
message.New(nil),
0, 0, 0, base, false,
); err != nil {
t.Fatalf("open: error = %v", err)
} else {
t.Cleanup(c0.Close)
cureMany(t, c0, []cureStep{
{"substitutable", &stubArtifactF{
kind: pkg.KindExec,
params: []byte("substitutable artifact"),
deps: []pkg.Artifact{newStubFile(
pkg.KindHTTPGet,
pkg.ID{0xff, 0xff, 0xfd, 0xfd},
nil,
[]byte("substitutable dependency"),
nil,
)},
cure: func(f *pkg.FContext) error {
panic("substitution missed")
},
}, base.Append(
"identifier",
"_EmV5nsYZ2UWHgRmLDMU8i-rJWDx-kv5_1pFrzQI7vMMCM5mAXivO8UZtVfOqMR_",
), want, nil},
})
}
if dents, err := os.ReadDir(base.Append("status").String()); err != nil {
t.Fatal(err)
} else if len(dents) > 0 {
t.Errorf("ReadDir: %v", dents)
}
}, expectsFS{
".": {Mode: fs.ModeDir | 0700},
"checksum": {Mode: fs.ModeDir | 0700},
"checksum/7lfQ4QwSpV8nw7IDh0JiQ_jqUPrPv3_Vfie034RxsSy-cy4vO8DVvxgpx2LW08oO": {Mode: 0400, Data: []byte("substitutable dependency")},
"checksum/qRN6in76LndiiOZJheHkwyW8UT1N5-f-bXvHfDvwrMw2fSkOoZdh8pWE1qhLk65b": {Mode: fs.ModeDir | 0500},
"checksum/qRN6in76LndiiOZJheHkwyW8UT1N5-f-bXvHfDvwrMw2fSkOoZdh8pWE1qhLk65b/check": {Mode: 0400, Data: []byte{0, 0}},
"checksum/qRN6in76LndiiOZJheHkwyW8UT1N5-f-bXvHfDvwrMw2fSkOoZdh8pWE1qhLk65b/lib": {Mode: fs.ModeDir | 0700},
@@ -986,6 +1139,15 @@ func TestCache(t *testing.T) {
"identifier": {Mode: fs.ModeDir | 0700},
"identifier/HnySzeLQvSBZuTUcvfmLEX_OmH4yJWWH788NxuLuv7kVn8_uPM6Ks4rqFWM2NZJY": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/qRN6in76LndiiOZJheHkwyW8UT1N5-f-bXvHfDvwrMw2fSkOoZdh8pWE1qhLk65b")},
"identifier/Zx5ZG9BAwegNT3zQwCySuI2ktCXxNgxirkGLFjW4FW06PtojYVaCdtEw8yuntPLa": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/qRN6in76LndiiOZJheHkwyW8UT1N5-f-bXvHfDvwrMw2fSkOoZdh8pWE1qhLk65b")},
"identifier/_EmV5nsYZ2UWHgRmLDMU8i-rJWDx-kv5_1pFrzQI7vMMCM5mAXivO8UZtVfOqMR_": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/qRN6in76LndiiOZJheHkwyW8UT1N5-f-bXvHfDvwrMw2fSkOoZdh8pWE1qhLk65b")},
"identifier/___9_QAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/7lfQ4QwSpV8nw7IDh0JiQ_jqUPrPv3_Vfie034RxsSy-cy4vO8DVvxgpx2LW08oO")},
"identifier/_wgAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/7lfQ4QwSpV8nw7IDh0JiQ_jqUPrPv3_Vfie034RxsSy-cy4vO8DVvxgpx2LW08oO")},
"identifier/_woAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/7lfQ4QwSpV8nw7IDh0JiQ_jqUPrPv3_Vfie034RxsSy-cy4vO8DVvxgpx2LW08oO")},
"identifier/k2ilgG5KQ9NXnMoT2oB6NdwOnSPRn_H24oXQc4l6qOYIxIG9XfuEczeyrR8UEv_f": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/qRN6in76LndiiOZJheHkwyW8UT1N5-f-bXvHfDvwrMw2fSkOoZdh8pWE1qhLk65b")},
"identifier/xMDWovje7OfyIaDy_2VnjpKxRqSOQ_LoeD946t-3WsS2V2SeMJ7nDGrNfpa4Pbc-": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/qRN6in76LndiiOZJheHkwyW8UT1N5-f-bXvHfDvwrMw2fSkOoZdh8pWE1qhLk65b")},
"substitute": {Mode: fs.ModeDir | 0700},
"substitute/OyBGorh72Z9kVw35JUa8FbqDbpR4DqT-MX1jic0uKN5PdYmUBiAF38BRsIRnBigf": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/qRN6in76LndiiOZJheHkwyW8UT1N5-f-bXvHfDvwrMw2fSkOoZdh8pWE1qhLk65b")},
"work": {Mode: fs.ModeDir | 0700},
}},
@@ -1061,6 +1223,7 @@ func TestCache(t *testing.T) {
".": {Mode: fs.ModeDir | 0700},
"checksum": {Mode: fs.ModeDir | 0700},
"identifier": {Mode: fs.ModeDir | 0700},
"substitute": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
}},
@@ -1119,6 +1282,7 @@ func TestCache(t *testing.T) {
".": {Mode: fs.ModeDir | 0700},
"checksum": {Mode: fs.ModeDir | 0700},
"identifier": {Mode: fs.ModeDir | 0700},
"substitute": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
}},
@@ -1180,6 +1344,8 @@ func TestCache(t *testing.T) {
"identifier/_wAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/Aubi5EG4_Y8DhL9bQ3Q4HFBhLRF7X5gt9D3CNCQfT-TeBtlRXc7Zi_JYZEMoCC7M")},
"identifier/_wEAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/Aubi5EG4_Y8DhL9bQ3Q4HFBhLRF7X5gt9D3CNCQfT-TeBtlRXc7Zi_JYZEMoCC7M")},
"substitute": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
}},
@@ -1230,6 +1396,7 @@ func TestCache(t *testing.T) {
".": {Mode: fs.ModeDir | 0700},
"checksum": {Mode: fs.ModeDir | 0700},
"identifier": {Mode: fs.ModeDir | 0700},
"substitute": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
}},
}
@@ -1289,6 +1456,11 @@ func TestErrors(t *testing.T) {
{"UnsupportedVariantError", pkg.UnsupportedVariantError(
"rosa",
), `unsupported variant "rosa"`},
{"UnsupportedArchError zero", pkg.UnsupportedArchError(""),
"invalid architecture name"},
{"UnsupportedArchError", pkg.UnsupportedArchError("riscv64"),
"unsupported architecture riscv64"},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
@@ -1404,6 +1576,14 @@ errors during scrub:
func TestDependencyCureError(t *testing.T) {
t.Parallel()
makeIdent := func(ident ...byte) pkg.Artifact {
var a overrideIdent
copy(a.id[:], ident)
// does not compare equal
a.TrivialArtifact = new(stubArtifact)
return a
}
testCases := []struct {
name string
err pkg.DependencyCureError
@@ -1411,51 +1591,51 @@ func TestDependencyCureError(t *testing.T) {
unwrap []error
}{
{"simple", pkg.DependencyCureError{
{Ident: unique.Make(pkg.ID{0xff, 9}), Err: stub.UniqueError(0xbad09)},
{Ident: unique.Make(pkg.ID{0xff, 0}), Err: stub.UniqueError(0xbad00)},
{Ident: unique.Make(pkg.ID{0xff, 0xf}), Err: stub.UniqueError(0xbad0f)},
{Ident: unique.Make(pkg.ID{0xff, 1}), Err: stub.UniqueError(0xbad01)},
{A: makeIdent(0xff, 9), Err: stub.UniqueError(0xbad09)},
{A: makeIdent(0xff, 0), Err: stub.UniqueError(0xbad00)},
{A: makeIdent(0xff, 0xf), Err: stub.UniqueError(0xbad0f)},
{A: makeIdent(0xff, 1), Err: stub.UniqueError(0xbad01)},
}, `errors curing dependencies:
_wAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA: unique error 765184 injected by the test suite
_wEAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA: unique error 765185 injected by the test suite
_wkAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA: unique error 765193 injected by the test suite
_w8AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA: unique error 765199 injected by the test suite`, []error{
&pkg.CureError{Ident: unique.Make(pkg.ID{0xff, 0}), Err: stub.UniqueError(0xbad00)},
&pkg.CureError{Ident: unique.Make(pkg.ID{0xff, 1}), Err: stub.UniqueError(0xbad01)},
&pkg.CureError{Ident: unique.Make(pkg.ID{0xff, 9}), Err: stub.UniqueError(0xbad09)},
&pkg.CureError{Ident: unique.Make(pkg.ID{0xff, 0xf}), Err: stub.UniqueError(0xbad0f)},
&pkg.CureError{A: makeIdent(0xff, 0), Err: stub.UniqueError(0xbad00)},
&pkg.CureError{A: makeIdent(0xff, 1), Err: stub.UniqueError(0xbad01)},
&pkg.CureError{A: makeIdent(0xff, 9), Err: stub.UniqueError(0xbad09)},
&pkg.CureError{A: makeIdent(0xff, 0xf), Err: stub.UniqueError(0xbad0f)},
}},
{"dedup", pkg.DependencyCureError{
-{Ident: unique.Make(pkg.ID{0xff, 9}), Err: stub.UniqueError(0xbad09)},
-{Ident: unique.Make(pkg.ID{0xff, 0}), Err: stub.UniqueError(0xbad00)},
-{Ident: unique.Make(pkg.ID{0xff, 0xfd}), Err: &pkg.DependencyCureError{
-{Ident: unique.Make(pkg.ID{0xff, 9}), Err: stub.UniqueError(0xbad09)},
-{Ident: unique.Make(pkg.ID{0xff, 0xc}), Err: &pkg.DependencyCureError{
-{Ident: unique.Make(pkg.ID{0xff, 0xf}), Err: stub.UniqueError(0xbad0f)},
-{Ident: unique.Make(pkg.ID{0xff, 0}), Err: stub.UniqueError(0xbad00)},
+{A: makeIdent(0xff, 9), Err: stub.UniqueError(0xbad09)},
+{A: makeIdent(0xff, 0), Err: stub.UniqueError(0xbad00)},
+{A: makeIdent(0xff, 0xfd), Err: &pkg.DependencyCureError{
+{A: makeIdent(0xff, 9), Err: stub.UniqueError(0xbad09)},
+{A: makeIdent(0xff, 0xc), Err: &pkg.DependencyCureError{
+{A: makeIdent(0xff, 0xf), Err: stub.UniqueError(0xbad0f)},
+{A: makeIdent(0xff, 0), Err: stub.UniqueError(0xbad00)},
}},
-{Ident: unique.Make(pkg.ID{0xff, 0}), Err: stub.UniqueError(0xbad00)},
-{Ident: unique.Make(pkg.ID{0xff, 0}), Err: stub.UniqueError(0xbad00)},
+{A: makeIdent(0xff, 0), Err: stub.UniqueError(0xbad00)},
+{A: makeIdent(0xff, 0), Err: stub.UniqueError(0xbad00)},
}},
-{Ident: unique.Make(pkg.ID{0xff, 0xff}), Err: &pkg.DependencyCureError{
-{Ident: unique.Make(pkg.ID{0xff, 9}), Err: stub.UniqueError(0xbad09)},
-{Ident: unique.Make(pkg.ID{0xff, 0xc}), Err: &pkg.DependencyCureError{
-{Ident: unique.Make(pkg.ID{0xff, 0}), Err: stub.UniqueError(0xbad00)},
+{A: makeIdent(0xff, 0xff), Err: &pkg.DependencyCureError{
+{A: makeIdent(0xff, 9), Err: stub.UniqueError(0xbad09)},
+{A: makeIdent(0xff, 0xc), Err: &pkg.DependencyCureError{
+{A: makeIdent(0xff, 0), Err: stub.UniqueError(0xbad00)},
}},
-{Ident: unique.Make(pkg.ID{0xff, 0}), Err: stub.UniqueError(0xbad00)},
+{A: makeIdent(0xff, 0), Err: stub.UniqueError(0xbad00)},
}},
-{Ident: unique.Make(pkg.ID{0xff, 0xf}), Err: stub.UniqueError(0xbad0f)},
-{Ident: unique.Make(pkg.ID{0xff, 1}), Err: stub.UniqueError(0xbad01)},
+{A: makeIdent(0xff, 0xf), Err: stub.UniqueError(0xbad0f)},
+{A: makeIdent(0xff, 1), Err: stub.UniqueError(0xbad01)},
}, `errors curing dependencies:
_wAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA: unique error 765184 injected by the test suite
_wEAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA: unique error 765185 injected by the test suite
_wkAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA: unique error 765193 injected by the test suite
_w8AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA: unique error 765199 injected by the test suite`, []error{
-&pkg.CureError{Ident: unique.Make(pkg.ID{0xff, 0}), Err: stub.UniqueError(0xbad00)},
-&pkg.CureError{Ident: unique.Make(pkg.ID{0xff, 1}), Err: stub.UniqueError(0xbad01)},
-&pkg.CureError{Ident: unique.Make(pkg.ID{0xff, 9}), Err: stub.UniqueError(0xbad09)},
-&pkg.CureError{Ident: unique.Make(pkg.ID{0xff, 0xf}), Err: stub.UniqueError(0xbad0f)},
+&pkg.CureError{A: makeIdent(0xff, 0), Err: stub.UniqueError(0xbad00)},
+&pkg.CureError{A: makeIdent(0xff, 1), Err: stub.UniqueError(0xbad01)},
+&pkg.CureError{A: makeIdent(0xff, 9), Err: stub.UniqueError(0xbad09)},
+&pkg.CureError{A: makeIdent(0xff, 0xf), Err: stub.UniqueError(0xbad0f)},
}},
}
for _, tc := range testCases {
@@ -1496,6 +1676,25 @@ func (a earlyFailureF) Cure(*pkg.FContext) error {
return stub.UniqueError(0xcafe)
}
func BenchmarkEarlyDCE(b *testing.B) {
msg := message.New(log.New(os.Stderr, "dce: ", 0))
msg.SwapVerbose(testing.Verbose())
c, err := pkg.Open(b.Context(), msg, 0, 0, 0, check.MustAbs(b.TempDir()))
if err != nil {
b.Fatal(err)
}
_, _, err = c.Cure(earlyFailureF(8))
if !errors.Is(err, stub.UniqueError(0xcafe)) {
b.Fatalf("Cure: error = %v", err)
}
c.Close()
dce := err.(*pkg.DependencyCureError)
for b.Loop() {
dce.Unwrap()
}
}
func TestDependencyCureErrorEarly(t *testing.T) {
t.Parallel()
@@ -1509,6 +1708,7 @@ func TestDependencyCureErrorEarly(t *testing.T) {
".": {Mode: fs.ModeDir | 0700},
"checksum": {Mode: fs.ModeDir | 0700},
"identifier": {Mode: fs.ModeDir | 0700},
+"substitute": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
}},
})


@@ -83,6 +83,8 @@ func TestTar(t *testing.T) {
"identifier/W5S65DEhawz_WKaok5NjUKLmnD9dNl5RPauNJjcOVcB3VM4eGhSaLGmXbL8vZpiw": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/" + wantEncode)},
"identifier/rg7F1D5hwv6o4xctjD5zDq4i5MD0mArTsUIWfhUbik8xC6Bsyt3mjXXOm3goojTz": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/" + wantEncode)},
+"substitute": {Mode: fs.ModeDir | 0700},
+"temp": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
}},
@@ -105,6 +107,8 @@ func TestTar(t *testing.T) {
"identifier/W5S65DEhawz_WKaok5NjUKLmnD9dNl5RPauNJjcOVcB3VM4eGhSaLGmXbL8vZpiw": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/" + wantExpandEncode)},
"identifier/_v1blm2h-_KA-dVaawdpLas6MjHc6rbhhFS8JWwx8iJxZGUu8EBbRrhr5AaZ9PJL": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/" + wantExpandEncode)},
+"substitute": {Mode: fs.ModeDir | 0700},
+"temp": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
}},


@@ -65,7 +65,7 @@ ln -s ../../system/bin/perl /usr/bin
), version
}
func init() {
-artifactsM[Attr] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newAttr,
Name: "attr",
@@ -73,7 +73,7 @@ func init() {
Website: "https://savannah.nongnu.org/projects/attr/",
ID: 137,
-}
+})
}
func (t Toolchain) newACL() (pkg.Artifact, string) {
@@ -94,7 +94,7 @@ func (t Toolchain) newACL() (pkg.Artifact, string) {
), version
}
func init() {
-artifactsM[ACL] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newACL,
Name: "acl",
@@ -106,5 +106,5 @@ func init() {
},
ID: 16,
-}
+})
}


@@ -1,406 +0,0 @@
package rosa
import (
"context"
"encoding/json"
"errors"
"fmt"
"net/http"
"strconv"
"sync"
"hakurei.app/internal/pkg"
)
// PArtifact is a lazily-initialised [pkg.Artifact] preset.
type PArtifact int
const (
LLVM PArtifact = iota
// EarlyInit is the Rosa OS init program.
EarlyInit
// ImageSystem is the Rosa OS /system image.
ImageSystem
// ImageInitramfs is the Rosa OS initramfs archive.
ImageInitramfs
// Kernel is the generic Rosa OS Linux kernel.
Kernel
// KernelHeaders is an installation of kernel headers for [Kernel].
KernelHeaders
// KernelSource is a writable kernel source tree installed to [AbsUsrSrc].
KernelSource
// Firmware is firmware blobs for use with the Linux kernel.
Firmware
ACL
ArgpStandalone
Attr
Autoconf
Automake
BC
Bash
Binutils
Bison
Bzip2
CMake
Connman
Coreutils
Curl
DBus
DTC
Diffutils
Elfutils
Fakeroot
Findutils
Flex
Fuse
GMP
GLib
Gawk
GenInitCPIO
Gettext
Git
Glslang
GnuTLS
Go
Gperf
Grep
Gzip
Hakurei
HakureiDist
IPTables
Kmod
LIT
LibX11
LibXau
LibXext
LibXrandr
LibXrender
LibXxf86vm
Libbsd
Libcap
Libdrm
Libev
Libexpat
Libffi
Libgd
Libglvnd
Libiconv
Libmd
Libmnl
Libnftnl
Libpciaccess
Libpng
Libpsl
Libseccomp
Libtasn1
Libtool
Libucontext
Libunistring
Libxshmfence
Libxml2
Libxslt
Libxtrans
M4
MPC
MPFR
Make
Meson
Mksh
MuslFts
MuslObstack
NSS
NSSCACert
Ncurses
Nettle
Ninja
OpenSSL
P11Kit
PCRE2
Parallel
Patch
Perl
PerlLocaleGettext
PerlMIMECharset
PerlModuleBuild
PerlPodParser
PerlSGMLS
PerlTermReadKey
PerlTextCharWidth
PerlTextWrapI18N
PerlUnicodeLineBreak
PerlYAMLTiny
PkgConfig
Procps
Python
PythonFlitCore
PythonHatchling
PythonIniConfig
PythonMako
PythonMarkupSafe
PythonPackaging
PythonPathspec
PythonPluggy
PythonPyTest
PythonPyYAML
PythonPycparser
PythonPygments
PythonSetuptools
PythonSetuptoolsSCM
PythonTroveClassifiers
PythonVCSVersioning
PythonWheel
QEMU
Rdfind
Readline
Rsync
Sed
SPIRVHeaders
SPIRVLLVMTranslator
SPIRVTools
SquashfsTools
Strace
TamaGo
Tar
Texinfo
Toybox
toyboxEarly
Unzip
UtilLinux
Wayland
WaylandProtocols
XCB
XCBProto
XDGDBusProxy
XZ
XorgProto
Zlib
Zstd
// PresetUnexportedStart is the first unexported preset.
PresetUnexportedStart
llvmSource = iota - 1
// earlyCompilerRT is an early, standalone compiler-rt installation for the
// standalone runtimes build.
//
// earlyCompilerRT must only be loaded by [LLVM].
earlyCompilerRT
// earlyRuntimes is an early, standalone installation of LLVM runtimes to
// work around the cmake build system leaking the system LLVM installation
// when invoking the newly built toolchain.
//
// earlyRuntimes must only be loaded by [LLVM].
earlyRuntimes
buildcatrust
utilMacros
// Musl is a standalone libc that does not depend on the toolchain.
Musl
// muslHeaders is a system installation of [Musl] headers.
muslHeaders
// gcc is a hacked-to-pieces GCC toolchain meant for use in intermediate
// stages only. This preset and its direct output must never be exposed.
gcc
// nettle3 is an older version of [Nettle].
nettle3
// Stage0 is a tarball containing all compile-time dependencies of artifacts
// part of the [Std] toolchain.
Stage0
// PresetEnd is the total number of presets and does not denote a preset.
PresetEnd
)
// P represents multiple [PArtifact] and is stable through JSON.
type P []PArtifact
// MarshalJSON represents [PArtifact] by their [Metadata.Name].
func (s P) MarshalJSON() ([]byte, error) {
names := make([]string, len(s))
for i, p := range s {
names[i] = GetMetadata(p).Name
}
return json.Marshal(names)
}
// UnmarshalJSON resolves the value created by MarshalJSON back to [P].
func (s *P) UnmarshalJSON(data []byte) error {
var names []string
if err := json.Unmarshal(data, &names); err != nil {
return err
}
*s = make(P, len(names))
for i, name := range names {
if p, ok := ResolveName(name); !ok {
return fmt.Errorf("unknown artifact %q", name)
} else {
(*s)[i] = p
}
}
return nil
}
// Metadata is stage-agnostic information of a [PArtifact] not directly
// representable in the resulting [pkg.Artifact].
type Metadata struct {
f func(t Toolchain) (a pkg.Artifact, version string)
// Unique package name.
Name string `json:"name"`
// Short user-facing description.
Description string `json:"description"`
// Project home page.
Website string `json:"website,omitempty"`
// Runtime dependencies.
Dependencies P `json:"dependencies"`
// Project identifier on [Anitya].
//
// [Anitya]: https://release-monitoring.org/
ID int `json:"-"`
// Optional custom version checking behaviour.
latest func(v *Versions) string
}
// GetLatest returns the latest version described by v.
func (meta *Metadata) GetLatest(v *Versions) string {
if meta.latest != nil {
return meta.latest(v)
}
return v.Latest
}
// Unversioned denotes an unversioned [PArtifact].
const Unversioned = "\x00"
// UnpopulatedIDError is returned by [Metadata.GetLatest] for an instance of
// [Metadata] where ID is not populated.
type UnpopulatedIDError struct{}
func (UnpopulatedIDError) Unwrap() error { return errors.ErrUnsupported }
func (UnpopulatedIDError) Error() string { return "Anitya ID is not populated" }
// Versions are package versions returned by Anitya.
type Versions struct {
// The latest version for the project, as determined by the version sorting algorithm.
Latest string `json:"latest_version"`
// List of all versions that aren't flagged as pre-release.
Stable []string `json:"stable_versions"`
// List of all versions stored, sorted from newest to oldest.
All []string `json:"versions"`
}
// getStable returns the first Stable version, or Latest if that is unavailable.
func (v *Versions) getStable() string {
if len(v.Stable) == 0 {
return v.Latest
}
return v.Stable[0]
}
// GetVersions returns versions fetched from Anitya.
func (meta *Metadata) GetVersions(ctx context.Context) (*Versions, error) {
if meta.ID == 0 {
return nil, UnpopulatedIDError{}
}
var resp *http.Response
if req, err := http.NewRequestWithContext(
ctx,
http.MethodGet,
"https://release-monitoring.org/api/v2/versions/?project_id="+
strconv.Itoa(meta.ID),
nil,
); err != nil {
return nil, err
} else {
req.Header.Set("User-Agent", "Rosa/1.1")
if resp, err = http.DefaultClient.Do(req); err != nil {
return nil, err
}
}
var v Versions
err := json.NewDecoder(resp.Body).Decode(&v)
return &v, errors.Join(err, resp.Body.Close())
}
var (
// artifactsM is an array of [PArtifact] metadata.
artifactsM [PresetEnd]Metadata
// artifacts stores the result of Metadata.f.
artifacts [_toolchainEnd][len(artifactsM)]struct {
a pkg.Artifact
v string
}
// artifactsOnce is for lazy initialisation of artifacts.
artifactsOnce [_toolchainEnd][len(artifactsM)]sync.Once
// presetOpts globally modifies behaviour of presets.
presetOpts int
)
const (
// OptSkipCheck skips running all test suites.
OptSkipCheck = 1 << iota
// OptLLVMNoLTO disables LTO in all [LLVM] stages.
OptLLVMNoLTO
)
// Flags returns the current preset flags.
func Flags() int { return presetOpts }
// zero zeros the value pointed to by p.
func zero[T any](p *T) { var v T; *p = v }
// DropCaches arranges for all cached [pkg.Artifact] to be freed some time after
// it returns. Must not be used concurrently with any other function from this
// package.
func DropCaches(flags int) {
presetOpts = flags
zero(&artifacts)
zero(&artifactsOnce)
}
// GetMetadata returns [Metadata] of a [PArtifact].
func GetMetadata(p PArtifact) *Metadata { return &artifactsM[p] }
// construct constructs a [pkg.Artifact] corresponding to a [PArtifact] once.
func (t Toolchain) construct(p PArtifact) {
artifactsOnce[t][p].Do(func() {
artifacts[t][p].a, artifacts[t][p].v = artifactsM[p].f(t)
})
}
// Load returns the resulting [pkg.Artifact] of [PArtifact].
func (t Toolchain) Load(p PArtifact) pkg.Artifact {
t.construct(p)
return artifacts[t][p].a
}
// Version returns the version string of [PArtifact].
func (t Toolchain) Version(p PArtifact) string {
t.construct(p)
return artifacts[t][p].v
}
// ResolveName returns a [PArtifact] by name.
func ResolveName(name string) (p PArtifact, ok bool) {
for i := range PresetUnexportedStart {
if name == artifactsM[i].Name {
return i, true
}
}
return 0, false
}


@@ -1,81 +0,0 @@
package rosa_test
import (
"testing"
"hakurei.app/internal/rosa"
)
func TestLoad(t *testing.T) {
t.Parallel()
for i := range rosa.PresetEnd {
p := rosa.PArtifact(i)
t.Run(rosa.GetMetadata(p).Name, func(t *testing.T) {
t.Parallel()
rosa.Std.Load(p)
})
}
}
func BenchmarkAll(b *testing.B) {
flags := rosa.Flags()
b.Cleanup(func() { rosa.DropCaches(flags) })
for b.Loop() {
for i := range rosa.PresetEnd {
rosa.Std.Load(rosa.PArtifact(i))
}
b.StopTimer()
rosa.DropCaches(0)
b.StartTimer()
}
}
func TestResolveName(t *testing.T) {
t.Parallel()
for i := range rosa.PresetUnexportedStart {
p := i
name := rosa.GetMetadata(p).Name
t.Run(name, func(t *testing.T) {
t.Parallel()
if got, ok := rosa.ResolveName(name); !ok {
t.Fatal("ResolveName: ok = false")
} else if got != p {
t.Fatalf("ResolveName: %d, want %d", got, p)
}
})
}
}
func TestResolveNameUnexported(t *testing.T) {
t.Parallel()
for i := rosa.PresetUnexportedStart; i < rosa.PresetEnd; i++ {
p := i
name := rosa.GetMetadata(p).Name
t.Run(name, func(t *testing.T) {
t.Parallel()
if got, ok := rosa.ResolveName(name); ok {
t.Fatalf("ResolveName: resolved unexported preset %d", got)
}
})
}
}
func TestUnique(t *testing.T) {
t.Parallel()
names := make(map[string]struct{})
for i := range rosa.PresetEnd {
name := rosa.GetMetadata(rosa.PArtifact(i)).Name
if _, ok := names[name]; ok {
t.Fatalf("name %s is not unique", name)
}
names[name] = struct{}{}
}
}


@@ -26,11 +26,11 @@ install -D -m755 libargp.a /work/system/lib/libargp.a
), version
}
func init() {
-artifactsM[ArgpStandalone] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newArgpStandalone,
Name: "argp-standalone",
Description: "hierarchical argument parsing library broken out from glibc",
Website: "http://www.lysator.liu.se/~nisse/misc/",
-}
+})
}


@@ -0,0 +1,348 @@
// Package azalea implements a proof-of-concept, domain-specific language for
// Rosa OS software packaging.
package azalea
import (
"errors"
"io"
"strconv"
"text/scanner"
)
// idents are runes accepted in an identifier.
var idents = [...]bool{
'0': true, '1': true, '2': true, '3': true, '4': true, '5': true, '6': true,
'7': true, '8': true, '9': true,
'A': true, 'B': true, 'C': true, 'D': true, 'E': true, 'F': true, 'G': true,
'H': true, 'I': true, 'J': true, 'K': true, 'L': true, 'M': true, 'N': true,
'O': true, 'P': true, 'Q': true, 'R': true, 'S': true, 'T': true, 'U': true,
'V': true, 'W': true, 'X': true, 'Y': true, 'Z': true,
'a': true, 'b': true, 'c': true, 'd': true, 'e': true, 'f': true, 'g': true,
'h': true, 'i': true, 'j': true, 'k': true, 'l': true, 'm': true, 'n': true,
'o': true, 'p': true, 'q': true, 'r': true, 's': true, 't': true, 'u': true,
'v': true, 'w': true, 'x': true, 'y': true, 'z': true,
'-': true, '_': true,
}
// TokenError describes an unexpected token.
type TokenError [2]rune
func (e TokenError) Error() string {
return "expected " + scanner.TokenString(e[0]) +
", found " + scanner.TokenString(e[1])
}
// ExprError is an unexpected token encountered while parsing an expression.
type ExprError rune
func (e ExprError) Error() string {
return "unexpected token " + scanner.TokenString(rune(e))
}
// must1 returns v, or panics if err is not nil.
func must1[T any](v T, err error) T {
if err != nil {
panic(err)
}
return v
}
// parser retains the current token.
type parser struct {
s scanner.Scanner
tok rune
}
// scan advances the underlying scanner to the next token, storing its result.
func (p *parser) scan() rune { p.tok = p.s.Scan(); return p.tok }
// expects panics with [TokenError] for an unexpected tok.
func (p *parser) expects(expects rune) {
if p.tok != expects {
panic(TokenError{expects, p.tok})
}
}
// scanAs advances the scanner for an expected token.
func (p *parser) scanAs(expects rune) { p.scan(); p.expects(expects) }
// An Int is the value represented by an integer literal.
type Int int64
func (v Int) GoString() string {
return "azalea.Int(" + strconv.FormatInt(int64(v), 10) + ")"
}
// parseInt parses the current token as a base 10 representation of a 64-bit
// signed integer.
func (p *parser) parseInt() Int {
v, err := strconv.ParseInt(p.s.TokenText(), 10, 64)
return must1(Int(v), err)
}
// A String holds the unquoted content of a string literal.
type String string
func (v String) GoString() string {
return "azalea.String(" + strconv.Quote(string(v)) + ")"
}
// parseString parses the current token as a string.
func (p *parser) parseString() String {
s, err := strconv.Unquote(p.s.TokenText())
return must1(String(s), err)
}
// An Ident holds the name of an identifier.
type Ident string
func (v Ident) GoString() string {
return "azalea.Ident(" + strconv.Quote(string(v)) + ")"
}
// A Val is a sequence of statements joined by the '+' operator. Only the [String] type
// supports concatenation.
type Val []any
// parseVal parses until the end of the [Val].
func (p *parser) parseVal() (v Val) {
v = append(v, p.parseExpr())
for p.tok == '+' {
p.scan()
v = append(v, p.parseExpr())
}
return
}
// An Array holds statements in an array.
type Array []Val
// A KV holds a key/value pair.
type KV struct {
K String
V Val
}
// An Arg represents an argument of [Func].
type Arg struct {
K []Ident
V Val
R bool
}
// Func is a function call or package declaration.
type Func struct {
// Function or package identifier.
Ident Ident
// Whether this is a package declaration.
Package bool
// Key-value arguments.
Args []Arg
}
// parseExpr parses the current expression.
func (p *parser) parseExpr() any {
switch p.tok {
case scanner.Int:
v := p.parseInt()
p.scan()
return v
case scanner.String, scanner.RawString:
v := p.parseString()
p.scan()
return v
case scanner.Ident:
var v Func
v.Ident = Ident(p.s.TokenText())
if v.Package = v.Ident == "package"; v.Package {
p.scanAs(scanner.Ident)
v.Ident = Ident(p.s.TokenText())
}
p.scan()
switch p.tok {
case '{':
for {
p.scan()
switch p.tok {
case '}':
p.scan()
return v
case scanner.Ident:
break
default:
panic(TokenError{scanner.Ident, p.tok})
}
arg := Arg{K: []Ident{Ident(p.s.TokenText())}}
delim := true
arg:
for {
p.scan()
switch p.tok {
case ',':
if delim {
delim = false
continue
}
panic(ExprError(p.tok))
case scanner.Ident:
if delim {
panic(TokenError{',', p.tok})
}
delim = true
arg.K = append(arg.K, Ident(p.s.TokenText()))
default:
break arg
}
}
switch p.tok {
case '=':
break
case '*':
arg.R = true
p.scanAs('=')
default:
panic(TokenError{'=', p.tok})
}
p.scan()
arg.V = p.parseVal()
v.Args = append(v.Args, arg)
p.expects(';')
}
default:
return v.Ident
}
case '{':
var v []KV
for {
p.scan()
switch p.tok {
case '}':
p.scan()
return v
case scanner.String:
pair := KV{K: p.parseString()}
p.scan()
switch p.tok {
case ';':
break
case ':':
p.scan()
pair.V = p.parseVal()
p.expects(';')
break
default:
panic(ExprError(p.tok))
}
v = append(v, pair)
default:
panic(ExprError(p.tok))
}
}
case '[':
var (
v Array
delim bool
)
p.scan()
for {
switch p.tok {
case ',':
if delim {
p.scan()
delim = false
continue
}
panic(ExprError(','))
case ']':
p.scan()
return v
case scanner.EOF:
panic(ExprError(scanner.EOF))
default:
if delim {
panic(TokenError{',', p.tok})
}
delim = true
break
}
v = append(v, p.parseVal())
}
default:
panic(ExprError(p.tok))
}
}
// ScanError is the number of errors encountered while parsing all expressions.
type ScanError int
func (ScanError) Error() string {
return "aborting due to scanning errors"
}
// Parse parses expressions from r.
func Parse(r io.Reader) (e []any, err error) {
var p parser
p.s.Init(r)
p.s.Mode = scanner.ScanIdents |
scanner.ScanInts |
scanner.ScanStrings |
scanner.ScanRawStrings |
scanner.ScanComments |
scanner.SkipComments
p.s.IsIdentRune = func(ch rune, i int) bool {
if i == 0 && ch >= '0' && ch <= '9' {
return false
}
return ch > 0 && ch < rune(len(idents)) && idents[ch]
}
defer func() {
v := recover()
if v == nil {
return
}
_err, ok := v.(error)
if !ok {
panic(v)
}
if err == nil {
err = _err
return
}
err = errors.Join(err, _err)
}()
p.scan()
for p.tok != scanner.EOF {
e = append(e, p.parseExpr())
}
if p.s.ErrorCount != 0 {
err = ScanError(p.s.ErrorCount)
}
return
}


@@ -0,0 +1,169 @@
package azalea_test
import (
_ "embed"
"reflect"
"strings"
"testing"
"text/scanner"
. "hakurei.app/internal/rosa/azalea"
)
//go:embed testdata/gcc.az
var sample string
func TestParse(t *testing.T) {
t.Parallel()
testCases := []struct {
name string
data string
want []any
err error
}{
{"invalid", "}", nil, ExprError('}')},
{"bad sep", "f{v?}", nil, TokenError{'=', '?'}},
{"bad ident", "f{9}", nil, TokenError{scanner.Ident, scanner.Int}},
{"share bad sep", "f { v,,v = v; }", nil, ExprError(',')},
{"share missing sep", "f { v v }", nil, TokenError{',', scanner.Ident}},
{"ident", `v`, []any{Ident("v")}, nil},
{"concat", `f { v = v+"\xfd"+p{}+9; }`, []any{Func{
Ident: "f",
Args: []Arg{{K: []Ident{"v"}, V: Val{
Ident("v"),
String("\xfd"),
Func{Ident: "p"},
Int(9),
}}},
}}, nil},
{"truncated string concat", `f { v = v+; }`, nil,
ExprError(';')},
{"empty pairs", `{}`, []any{[]KV(nil)}, nil},
{"short kv", `{"\x00":v;}`, []any{[]KV{
{K: "\x00", V: Val{Ident("v")}},
}}, nil},
{"truncated kv", `{"\x00"`, nil, ExprError(scanner.EOF)},
{"ident kv", `{v="";}`, nil, ExprError(scanner.Ident)},
{"empty array", `[]`, []any{Array(nil)}, nil},
{"integer array", `[9]`, []any{Array{{Int(9)}}}, nil},
{"short array", `[ "\x00" ]`, []any{
Array{{String("\x00")}},
}, nil},
{"short array delim", `[ "\x00", ]`, []any{
Array{{String("\x00")}},
}, nil},
{"missing array value", `[ "\x00", , v ]`, nil, ExprError(',')},
{"missing array delimiter", `[ v0 v1 ]`, nil, TokenError{',', scanner.Ident}},
{"truncated array", `[ "\x00"`, nil,
ExprError(scanner.EOF)},
{"gcc", sample, []any{Func{
Ident: Ident("gcc"),
Package: true,
Args: []Arg{
{K: []Ident{Ident("description")}, V: Val{String("The GNU Compiler Collection")}},
{K: []Ident{Ident("website")}, V: Val{String("https://www.gnu.org/software/gcc")}},
{K: []Ident{Ident("anitya")}, V: Val{Int(6502)}},
{K: []Ident{Ident("version")}, V: Val{String("16.1.0")}, R: true},
{K: []Ident{Ident("source")}, V: Val{Func{
Ident: Ident("remoteTar"),
Args: []Arg{
{K: []Ident{Ident("url")}, V: Val{
String("https://ftp.tsukuba.wide.ad.jp/software/gcc/releases/"),
String("gcc-"),
Ident("version"),
String("/gcc-"),
Ident("version"),
String(".tar.gz"),
}},
{K: []Ident{Ident("checksum")}, V: Val{String("4ASoWbxaA2FW7PAB0zzHDPC5XnNhyaAyjtDPpGzceSLeYnEIXsNYZR3PA_Zu5P0K")}},
{K: []Ident{Ident("compress")}, V: Val{Ident("gzip")}},
},
}}},
{K: []Ident{Ident("patches")}, V: Val{Array{
{String("musl-off64_t-loff_t.patch")},
{String("musl-legacy-lfs.patch")},
}}},
{K: []Ident{Ident("exclusive")}, V: Val{Ident("true")}},
{K: []Ident{Ident("exec")}, V: Val{Func{
Ident: Ident("make"),
Args: []Arg{
{K: []Ident{Ident("configure")}, V: Val{[]KV{
{K: String("disable-multilib")},
{K: String("enable-default-pie")},
{K: String("disable-nls")},
{K: String("with-gnu-as")},
{K: String("with-gnu-ld")},
{K: String("with-system-zlib")},
{K: String("enable-languages"), V: Val{String("c,c++,go")}},
{K: String("with-native-system-header-dir"), V: Val{String("/system/include")}},
{K: String("with-multilib-list"), V: Val{Func{
Ident: Ident("arch"),
Args: []Arg{
{K: []Ident{Ident("amd64"), Ident("arm64")}, V: Val{String("''")}},
{K: []Ident{Ident("default")}, V: Val{Ident("unset")}},
},
}}},
}}},
{K: []Ident{Ident("make")}, V: Val{Array{
{String("BOOT_CFLAGS='-O2 -g'")},
{
Func{Ident: Ident("noop"), Args: []Arg{{K: []Ident{Ident("key")}, V: Val{Ident("value")}}}},
String("\x00"),
},
{String("bootstrap")},
}}},
{K: []Ident{Ident("skip-check")}, V: Val{Ident("true")}},
},
}}},
{K: []Ident{Ident("inputs")}, V: Val{Array{
{Ident("binutils")},
{Ident("mpc")},
{Ident("zlib")},
{Ident("libucontext")},
{Ident("kernel-headers")},
}}},
},
}}, nil},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
t.Parallel()
p, err := Parse(strings.NewReader(tc.data))
if !reflect.DeepEqual(p, tc.want) {
t.Errorf("Parse: %#v, want %#v", p, tc.want)
}
if !reflect.DeepEqual(err, tc.err) {
t.Errorf("Parse: error = %v, want %v", err, tc.err)
}
})
}
}
func BenchmarkParse(b *testing.B) {
r := strings.NewReader(sample)
for b.Loop() {
if _, err := Parse(r); err != nil {
b.Fatal(err)
}
b.StopTimer()
r.Reset(sample)
b.StartTimer()
}
}


@@ -0,0 +1,308 @@
package azalea
import (
"errors"
"fmt"
"maps"
"reflect"
"unique"
)
// Value are types supported by the language.
type Value interface {
bool | int64 | string | []string | [][2]string
}
type (
// FArg is an argument passed to [F].
FArg struct {
K unique.Handle[Ident]
V any
R bool
}
// FArgs are arguments passed to [F].
FArgs []FArg
// F is the implementation of a [Func].
F struct {
F func(isPackage bool, args FArgs) (v any, set bool, err error)
V map[unique.Handle[Ident]]any
}
)
// Apply applies named arguments and rejects unused arguments.
func (args FArgs) Apply(v map[unique.Handle[Ident]]any) error {
for _, arg := range args {
if arg.V == nil {
// unset
continue
}
r, ok := v[arg.K]
if !ok {
if arg.R {
continue
}
return UndefinedError(arg.K.Value())
}
err := storeE(r, arg.V)
if err != nil {
return err
}
}
return nil
}
// A Frame refers to local variables and debugging information.
type Frame struct {
// Local constants.
Val map[unique.Handle[Ident]]any
// Functions.
Func map[unique.Handle[Ident]]F
}
// UnsupportedExprError is an expression with invalid concrete type.
type UnsupportedExprError struct{ E any }
func (e UnsupportedExprError) Error() string {
return fmt.Sprintf("unsupported expression %#v", e.E)
}
// UndefinedError is an identifier not defined in any stack frame visible to the
// expression containing it.
type UndefinedError Ident
func (e UndefinedError) Error() string {
return "undefined: " + string(e)
}
// evaluate is evaluateAny with a type parameter.
func evaluate[T Value](s []Frame, expr any, rp *T) bool {
return evaluateAny(s, expr, rp)
}
// TypeError is an unexpected type during evaluation.
type TypeError struct {
Concrete, Asserted reflect.Type
}
func (e TypeError) Error() string {
return "expected " + e.Asserted.String() + ", got " + e.Concrete.String()
}
func (e TypeError) Is(err error) bool {
var v TypeError
return errors.As(err, &v) &&
e.Asserted == v.Asserted &&
e.Concrete == v.Concrete
}
// storeE is a convenience function to set the value of a result pointer.
func storeE(rp any, r any) error {
pv := reflect.ValueOf(rp).Elem()
v := reflect.ValueOf(r)
pt, vt := pv.Type(), v.Type()
if !vt.AssignableTo(pt) {
return TypeError{vt, pt}
}
pv.Set(v)
return nil
}
// store is like storeE, but panics if error is non-nil.
func store[T Value](rp any, r T) {
err := storeE(rp, r)
if err != nil {
panic(err)
}
}
// EvaluationError is an error and the expression it occurred in.
type EvaluationError struct {
Expr any
Err error
}
// Unwrap returns the underlying error.
func (e EvaluationError) Unwrap() error { return e.Err }
// Error returns a very long error description that should not be presented
// to the user directly.
func (e EvaluationError) Error() string {
return fmt.Sprintf("expression %#v: %v", e.Expr, e.Err)
}
// evaluateAny implements [Evaluate].
func evaluateAny(s []Frame, expr, rp any) bool {
defer func() {
r := recover()
if r == nil {
return
}
err, ok := r.(error)
if !ok {
panic(r)
}
if _, ok = err.(EvaluationError); ok {
panic(err)
}
panic(EvaluationError{expr, err})
}()
switch e := expr.(type) {
case Int:
store(rp, int64(e))
return true
case String:
store(rp, string(e))
return true
case Ident:
var (
v any
ok bool
)
for i := range s {
v, ok = s[len(s)-1-i].Val[unique.Make(e)]
if ok {
break
}
}
if !ok {
panic(UndefinedError(e))
}
if err := storeE(rp, v); err != nil {
panic(err)
}
return true
case Val:
if len(e) == 1 {
switch v := e[0].(type) {
case Ident:
switch v {
case "unset":
return false
case "true":
store(rp, true)
return true
case "false":
store(rp, false)
return true
default:
return evaluateAny(s, v, rp)
}
default:
return evaluateAny(s, e[0], rp)
}
}
var v string
for i := range e {
var _r string
if evaluate(s, e[i], &_r) {
v += _r
}
}
store(rp, v)
return true
case Array:
r := make([]string, 0, len(e))
for i := range e {
var _r string
if evaluate(s, e[i], &_r) {
r = append(r, _r)
}
}
store(rp, r)
return true
case []KV:
r := make([][2]string, 0, len(e))
for i := range e {
var _r string
if e[i].V == nil || evaluate(s, e[i].V, &_r) {
r = append(r, [2]string{string(e[i].K), _r})
}
}
store(rp, r)
return true
case Func:
var (
f F
ok bool
)
for i := range s {
f, ok = s[len(s)-1-i].Func[unique.Make(e.Ident)]
if ok {
break
}
}
if !ok {
panic(UndefinedError(e.Ident))
}
argc := len(e.Args)
for _, arg := range e.Args {
argc += len(arg.K) - 1
}
fargs := make([]FArg, 0, len(e.Args))
s = append(s, Frame{Val: maps.Clone(f.V)})
fp := &s[len(s)-1]
for _, arg := range e.Args {
farg := FArg{R: arg.R}
if !evaluateAny(s, arg.V, &farg.V) {
farg.V = nil
}
for _, name := range arg.K {
h := unique.Make(name)
farg.K = h
fargs = append(fargs, farg)
if arg.R && farg.V != nil {
if fp.Val == nil {
fp.Val = make(map[unique.Handle[Ident]]any)
}
(*fp).Val[h] = farg.V
}
}
}
v, set, err := f.F(e.Package, fargs)
if err != nil {
panic(err)
} else if v != nil {
if err = storeE(rp, v); err != nil {
panic(err)
}
}
return set
default:
panic(UnsupportedExprError{expr})
}
}
// Evaluate evaluates a statement and returns its value.
func Evaluate[T Value](s []Frame, expr any) (v T, set bool, err error) {
defer func() {
r := recover()
if r == nil {
return
}
_err, ok := r.(error)
if !ok {
panic(r)
}
err = _err
}()
set = evaluate[T](s, expr, &v)
return
}


@@ -0,0 +1,358 @@
package azalea_test
import (
"errors"
"fmt"
"reflect"
"strings"
"testing"
"unique"
. "hakurei.app/internal/rosa/azalea"
)
// makeStackCheck creates a stack with a single frame with a single function "f"
// which calls the check function internally.
func makeStackCheck(check func(args FArgs) (any, error)) []Frame {
return []Frame{{Func: map[unique.Handle[Ident]]F{
unique.Make(Ident("f")): {F: func(
isPackage bool,
args FArgs,
) (v any, set bool, err error) {
set = true
v, err = check(args)
if isPackage {
err = errors.New("unexpected package")
}
return
}},
}}}
}
func TestEvaluate(t *testing.T) {
t.Parallel()
testCases := []struct {
name string
data string
s []Frame
want string
err error
}{
{"apply unset", `f { v = unset; }`, makeStackCheck(func(
args FArgs,
) (v any, err error) {
v = "\xfd"
err = args.Apply(map[unique.Handle[Ident]]any{
unique.Make(Ident("v")): &v,
})
return
}), "\xfd", nil},
{"apply bad type", `f { v = 9; }`, makeStackCheck(func(
args FArgs,
) (v any, err error) {
v = "\xfd"
err = args.Apply(map[unique.Handle[Ident]]any{
unique.Make(Ident("v")): &v,
})
return
}), "", TypeError{
Concrete: reflect.TypeFor[int64](),
Asserted: reflect.TypeFor[string](),
}},
{"apply undefined", `f { v = 9; }`, makeStackCheck(func(
args FArgs,
) (v any, err error) {
v = "\xfd"
err = args.Apply(map[unique.Handle[Ident]]any{})
return
}), "", EvaluationError{
Expr: Func{
Ident: Ident("f"),
Args: []Arg{
{K: []Ident{"v"}, V: Val{Int(9)}},
},
},
Err: UndefinedError("v"),
}},
{"apply bound undefined", `f { _v* = "\x00"; v = _v; }`, makeStackCheck(func(
args FArgs,
) (v any, err error) {
v = "\xfd"
err = args.Apply(map[unique.Handle[Ident]]any{
unique.Make(Ident("v")): &v,
})
return
}), "\x00", nil},
{"undefined function", `f {}`, nil, "", EvaluationError{
Expr: Func{Ident: "f"},
Err: UndefinedError("f"),
}},
{"error wrap deep", `f { v = nil; }`, makeStackCheck(func(
FArgs,
) (any, error) {
panic("unreachable")
}), "", EvaluationError{
Expr: Ident("nil"),
Err: UndefinedError("nil"),
}},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
t.Parallel()
var expr Func
if e, err := Parse(strings.NewReader(tc.data)); err != nil {
t.Fatal(err)
} else if len(e) != 1 {
t.Fatalf("got expression %#v", e)
} else {
expr = e[0].(Func)
}
r, set, err := Evaluate[string](tc.s, expr)
if set != (err == nil) {
t.Error("Evaluate: unexpected unset")
}
if r != tc.want {
t.Errorf("Evaluate: %q, want %q", r, tc.want)
}
var errEquals bool
if errors.As(err, new(TypeError)) {
errEquals = errors.Is(err, tc.err)
} else {
errEquals = reflect.DeepEqual(err, tc.err)
}
if !errEquals {
t.Errorf("Evaluate: error = %v, want %v", err, tc.err)
}
})
}
}

func TestEvaluateGCC(t *testing.T) {
t.Parallel()
var gcc Func
if e, err := Parse(strings.NewReader(sample)); err != nil {
t.Fatal(err)
} else {
gcc = e[0].(Func)
}
var got [3]FArgs
if r, set, err := Evaluate[string]([]Frame{{
Func: map[unique.Handle[Ident]]F{
unique.Make(Ident("gcc")): {F: func(
isPackage bool,
args FArgs,
) (v any, set bool, err error) {
v = "\x00"
if !isPackage {
err = errors.New("not a package")
}
set = true
got[0] = args
return
}, V: map[unique.Handle[Ident]]any{
unique.Make(Ident("binutils")): "binutils",
unique.Make(Ident("mpc")): "mpc",
unique.Make(Ident("zlib")): "zlib",
unique.Make(Ident("libucontext")): "libucontext",
unique.Make(Ident("kernel-headers")): "kernel-headers",
}},
unique.Make(Ident("remoteTar")): {F: func(
isPackage bool,
args FArgs,
) (v any, set bool, err error) {
if isPackage {
err = errors.New("unexpected package")
return
}
var url, checksum string
var compress int
if err = args.Apply(map[unique.Handle[Ident]]any{
unique.Make(Ident("url")): &url,
unique.Make(Ident("checksum")): &checksum,
unique.Make(Ident("compress")): &compress,
}); err != nil {
return
}
if compress != 0xcafe {
err = fmt.Errorf("unexpected compress %#v", compress)
}
set = true
v = url + "?checksum=" + checksum
return
}, V: map[unique.Handle[Ident]]any{
unique.Make(Ident("gzip")): 0xcafe,
}},
unique.Make(Ident("make")): {F: func(
isPackage bool,
args FArgs,
) (v any, set bool, err error) {
v = args
if isPackage {
err = errors.New("unexpected package")
}
set = true
return
}},
unique.Make(Ident("arch")): {F: func(
isPackage bool,
args FArgs,
) (v any, set bool, err error) {
set = false
if isPackage {
err = errors.New("unexpected package")
}
got[1] = args
return
}},
unique.Make(Ident("noop")): {F: func(
isPackage bool,
args FArgs,
) (v any, set bool, err error) {
if isPackage {
err = errors.New("unexpected package")
}
set = true
got[2] = args
return
}, V: map[unique.Handle[Ident]]any{
unique.Make(Ident("value")): "\xfd",
}},
},
}}, gcc); err != nil {
t.Fatal(err)
} else if r != "\x00" {
t.Fatalf("package: %q", r)
} else if !set {
t.Fatal("package: unset")
}
want := [...]FArgs{
{
{K: unique.Make(Ident("description")), V: "The GNU Compiler Collection"},
{K: unique.Make(Ident("website")), V: "https://www.gnu.org/software/gcc"},
{K: unique.Make(Ident("anitya")), V: int64(6502)},
{K: unique.Make(Ident("version")), V: "16.1.0", R: true},
{K: unique.Make(Ident("source")), V: "https://ftp.tsukuba.wide.ad.jp/software/gcc/releases/gcc-16.1.0/gcc-16.1.0.tar.gz?checksum=4ASoWbxaA2FW7PAB0zzHDPC5XnNhyaAyjtDPpGzceSLeYnEIXsNYZR3PA_Zu5P0K"},
{K: unique.Make(Ident("patches")), V: []string{"musl-off64_t-loff_t.patch", "musl-legacy-lfs.patch"}},
{K: unique.Make(Ident("exclusive")), V: true},
{K: unique.Make(Ident("exec")), V: FArgs{
{K: unique.Make(Ident("configure")), V: [][2]string{
{"disable-multilib", ""},
{"enable-default-pie", ""},
{"disable-nls", ""},
{"with-gnu-as", ""},
{"with-gnu-ld", ""},
{"with-system-zlib", ""},
{"enable-languages", "c,c++,go"},
{"with-native-system-header-dir", "/system/include"},
}},
{K: unique.Make(Ident("make")), V: []string{
"BOOT_CFLAGS='-O2 -g'",
"\x00",
"bootstrap",
}},
{K: unique.Make(Ident("skip-check")), V: true},
}},
{K: unique.Make(Ident("inputs")), V: []string{
"binutils",
"mpc",
"zlib",
"libucontext",
"kernel-headers",
}},
},
{
{K: unique.Make(Ident("amd64")), V: "''"},
{K: unique.Make(Ident("arm64")), V: "''"},
{K: unique.Make(Ident("default"))},
},
{{K: unique.Make(Ident("key")), V: "\xfd"}},
}
if !reflect.DeepEqual(got, want) {
t.Errorf("package: args = %#v, want %#v", got, want)
}
}

func BenchmarkEvaluate(b *testing.B) {
var gcc Func
if e, err := Parse(strings.NewReader(sample)); err != nil {
b.Fatal(err)
} else {
gcc = e[0].(Func)
}
s := []Frame{{
Func: map[unique.Handle[Ident]]F{
unique.Make(Ident("gcc")): {F: func(
bool,
FArgs,
) (v any, set bool, err error) {
return
}, V: map[unique.Handle[Ident]]any{
unique.Make(Ident("binutils")): "binutils",
unique.Make(Ident("mpc")): "mpc",
unique.Make(Ident("zlib")): "zlib",
unique.Make(Ident("libucontext")): "libucontext",
unique.Make(Ident("kernel-headers")): "kernel-headers",
}},
unique.Make(Ident("remoteTar")): {F: func(
bool,
FArgs,
) (v any, set bool, err error) {
return
}, V: map[unique.Handle[Ident]]any{
unique.Make(Ident("gzip")): 0xcafe,
}},
unique.Make(Ident("make")): {F: func(
bool,
FArgs,
) (v any, set bool, err error) {
return
}},
unique.Make(Ident("arch")): {F: func(
bool,
FArgs,
) (v any, set bool, err error) {
return
}},
unique.Make(Ident("noop")): {F: func(
bool,
FArgs,
) (v any, set bool, err error) {
return
}, V: map[unique.Handle[Ident]]any{
unique.Make(Ident("value")): "\xfd",
}},
},
}}
for b.Loop() {
if _, _, err := Evaluate[string](s, gcc); err != nil {
b.Fatal(err)
}
}
}

internal/rosa/azalea/testdata/gcc.az

@@ -0,0 +1,57 @@
package gcc {
description = "The GNU Compiler Collection";
website = "https://www.gnu.org/software/gcc";
anitya = 6502;
version* = "16.1.0";
source = remoteTar {
url = "https://ftp.tsukuba.wide.ad.jp/software/gcc/releases/"+
"gcc-"+version+"/gcc-"+version+".tar.gz";
checksum = "4ASoWbxaA2FW7PAB0zzHDPC5XnNhyaAyjtDPpGzceSLeYnEIXsNYZR3PA_Zu5P0K";
compress = gzip;
};
patches = [
"musl-off64_t-loff_t.patch",
"musl-legacy-lfs.patch",
];
// GCC spends most of its time in its many configure scripts; however, it
// also saturates the CPU for a considerable amount of time.
exclusive = true;
exec = make {
configure = {
"disable-multilib";
"enable-default-pie";
"disable-nls";
"with-gnu-as";
"with-gnu-ld";
"with-system-zlib";
"enable-languages": "c,c++,go";
"with-native-system-header-dir": "/system/include";
"with-multilib-list": arch {
amd64, arm64 = "''";
default = unset;
};
};
make = [
"BOOT_CFLAGS='-O2 -g'",
noop { key = value; } + "\x00",
"bootstrap",
];
// This toolchain is hacked to pieces; it is not expected to ever work
// well in its current state. That does not matter as long as the
// toolchain it produces passes its own test suite.
skip-check = true;
};
inputs = [
binutils,
mpc,
zlib,
libucontext,
kernel-headers,
];
}

View File

@@ -5,7 +5,6 @@ import (
"io"
"net/http"
"os"
"runtime"
"time"
"hakurei.app/fhs"
@@ -86,13 +85,13 @@ func (a busyboxBin) Cure(t *pkg.TContext) (err error) {
// newBusyboxBin returns a [pkg.Artifact] containing a busybox installation from
// the https://busybox.net/downloads/binaries/ binary release.
func newBusyboxBin() pkg.Artifact {
func (s *S) newBusyboxBin() pkg.Artifact {
var version, url, checksum string
switch runtime.GOARCH {
switch s.arch {
case "amd64":
version = "1.35.0"
url = "https://busybox.net/downloads/binaries/" +
version + "-" + linuxArch() + "-linux-musl/busybox"
version + "-" + s.linuxArch() + "-linux-musl/busybox"
checksum = "L7OBIsPu9enNHn7FqpBT1kOg_mCLNmetSeNMA3i4Y60Z5jTgnlX3qX3zcQtLx5AB"
case "arm64":
version = "1.31.0"
@@ -101,11 +100,11 @@ func newBusyboxBin() pkg.Artifact {
checksum = "npJjBO7iwhjW6Kx2aXeSxf8kXhVgTCDChOZTTsI8ZfFfa3tbsklxRiidZQdrVERg"
default:
panic("unsupported target " + runtime.GOARCH)
panic("unsupported target " + s.arch)
}
return pkg.NewExec(
"busybox-bin-"+version, nil, pkg.ExecTimeoutMax, false,
"busybox-bin-"+version, s.arch, nil, pkg.ExecTimeoutMax, false, false,
fhs.AbsRoot, []string{
"PATH=/system/bin",
},

View File

@@ -26,7 +26,7 @@ func (t Toolchain) newBzip2() (pkg.Artifact, string) {
}), version
}
func init() {
artifactsM[Bzip2] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newBzip2,
Name: "bzip2",
@@ -34,5 +34,5 @@ func init() {
Website: "https://sourceware.org/bzip2/",
ID: 237,
}
})
}

View File

@@ -106,7 +106,7 @@ index 2ead810437..f85cbb8b1c 100644
), version
}
func init() {
artifactsM[CMake] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newCMake,
Name: "cmake",
@@ -114,7 +114,7 @@ func init() {
Website: "https://cmake.org/",
ID: 306,
}
})
}
// CMakeHelper is the [CMake] build system helper.
@@ -164,7 +164,7 @@ func (*CMakeHelper) createDir() bool { return true }
func (*CMakeHelper) wantsDir() string { return "/cure/" }
// script generates the cure script.
func (attr *CMakeHelper) script(name string) string {
func (attr *CMakeHelper) script(s *S, name string) string {
if attr == nil {
attr = new(CMakeHelper)
}
@@ -180,7 +180,7 @@ func (attr *CMakeHelper) script(name string) string {
}
script := attr.Script
if !attr.SkipTest && presetOpts&OptSkipCheck == 0 {
if !attr.SkipTest && s.opts&OptSkipCheck == 0 {
script += "\n" + test
}

View File

@@ -90,7 +90,7 @@ func (t Toolchain) newConnman() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Connman] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newConnman,
Name: "connman",
@@ -105,5 +105,5 @@ func init() {
},
ID: 337,
}
})
}

View File

@@ -4,8 +4,8 @@ import "hakurei.app/internal/pkg"
func (t Toolchain) newCurl() (pkg.Artifact, string) {
const (
version = "8.19.0"
checksum = "YHuVLVVp8q_Y7-JWpID5ReNjq2Zk6t7ArHB6ngQXilp_R5l3cubdxu3UKo-xDByv"
version = "8.20.0"
checksum = "xyHXwrngIRGMasuzhn-I5MSCOhktwINbsWt1f_LuR-5jRVvyx_g6U1EQfDLEbr9r"
)
return t.NewPackage("curl", version, newTar(
"https://curl.se/download/curl-"+version+".tar.bz2",
@@ -39,7 +39,7 @@ chmod +w tests/data && rm -f tests/data/test459
), version
}
func init() {
artifactsM[Curl] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newCurl,
Name: "curl",
@@ -52,5 +52,5 @@ func init() {
},
ID: 381,
}
})
}

View File

@@ -29,7 +29,7 @@ func (t Toolchain) newDBus() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[DBus] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newDBus,
Name: "dbus",
@@ -42,7 +42,7 @@ func init() {
},
ID: 5356,
}
})
}
func (t Toolchain) newXDGDBusProxy() (pkg.Artifact, string) {
@@ -65,7 +65,7 @@ func (t Toolchain) newXDGDBusProxy() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[XDGDBusProxy] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newXDGDBusProxy,
Name: "xdg-dbus-proxy",
@@ -77,5 +77,5 @@ func init() {
},
ID: 58434,
}
})
}

View File

@@ -31,7 +31,7 @@ func (t Toolchain) newDTC() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[DTC] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newDTC,
Name: "dtc",
@@ -39,5 +39,5 @@ func init() {
Website: "https://git.kernel.org/pub/scm/utils/dtc/dtc.git/",
ID: 16911,
}
})
}

View File

@@ -39,7 +39,7 @@ func (t Toolchain) newElfutils() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Elfutils] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newElfutils,
Name: "elfutils",
@@ -55,5 +55,5 @@ func init() {
},
ID: 5679,
}
})
}

View File

@@ -46,7 +46,7 @@ index f135ad9..85c784c 100644
), version
}
func init() {
artifactsM[Fakeroot] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newFakeroot,
Name: "fakeroot",
@@ -54,5 +54,5 @@ func init() {
Website: "https://salsa.debian.org/clint/fakeroot",
ID: 12048,
}
})
}

View File

@@ -20,7 +20,7 @@ func (t Toolchain) newFlex() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Flex] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newFlex,
Name: "flex",
@@ -28,5 +28,5 @@ func init() {
Website: "https://github.com/westes/flex/",
ID: 819,
}
})
}

internal/rosa/freetype.go

@@ -0,0 +1,27 @@
package rosa
import "hakurei.app/internal/pkg"
func (t Toolchain) newFreetype() (pkg.Artifact, string) {
const (
version = "2.14.3"
checksum = "-WfLv8fVJNyCHpP_lriiDzOcVbBL9ajdQ3tl8AzIIUa9-8sVpU9irxOmSMgRHWYz"
)
return t.NewPackage("freetype", version, newTar(
"https://download.savannah.gnu.org/releases/freetype/"+
"freetype-"+version+".tar.gz",
checksum,
pkg.TarGzip,
), nil, (*MakeHelper)(nil)), version
}
func init() {
native.MustRegister(&Artifact{
f: Toolchain.newFreetype,
Name: "freetype",
Description: "a freely available software library to render fonts",
Website: "http://www.freetype.org/",
ID: 854,
})
}

View File

@@ -31,7 +31,7 @@ func (t Toolchain) newFuse() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Fuse] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newFuse,
Name: "fuse",
@@ -39,5 +39,5 @@ func init() {
Website: "https://github.com/libfuse/libfuse/",
ID: 861,
}
})
}

View File

@@ -55,6 +55,14 @@ disable_test t9300-fast-import
disable_test t0211-trace2-perf
disable_test t1517-outside-repo
disable_test t2200-add-update
disable_test t0027-auto-crlf
disable_test t7513-interpret-trailers
disable_test t7703-repack-geometric
disable_test t7002-mv-sparse-checkout
disable_test t1451-fsck-buffer
disable_test t4104-apply-boundary
disable_test t4200-rerere
disable_test t5515-fetch-merge-logic
`,
Check: []string{
"-C t",
@@ -80,7 +88,7 @@ disable_test t2200-add-update
), version
}
func init() {
artifactsM[Git] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newGit,
Name: "git",
@@ -94,7 +102,7 @@ func init() {
},
ID: 5350,
}
})
}
// NewViaGit returns a [pkg.Artifact] for cloning a git repository.
@@ -105,7 +113,7 @@ func (t Toolchain) NewViaGit(
return t.New(strings.TrimSuffix(
path.Base(url),
".git",
)+"-src-"+path.Base(rev), 0, t.AppendPresets(nil,
)+"-src-"+path.Base(rev), THostNet, t.Append(nil,
NSSCACert,
Git,
), &checksum, nil, `

View File

@@ -22,7 +22,7 @@ func (t Toolchain) newSPIRVHeaders() (pkg.Artifact, string) {
}), version
}
func init() {
artifactsM[SPIRVHeaders] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newSPIRVHeaders,
Name: "spirv-headers",
@@ -49,7 +49,7 @@ func init() {
}
return v.Latest
},
}
})
}
func (t Toolchain) newSPIRVTools() (pkg.Artifact, string) {
@@ -72,7 +72,7 @@ func (t Toolchain) newSPIRVTools() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[SPIRVTools] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newSPIRVTools,
Name: "spirv-tools",
@@ -86,13 +86,13 @@ func init() {
ID: 14894,
latest: (*Versions).getStable,
}
})
}
func (t Toolchain) newGlslang() (pkg.Artifact, string) {
const (
version = "16.2.0"
checksum = "6_UuF9reLRDaVkgO-9IfB3kMwme3lQZM8LL8YsJwPdUFkrjzxJtf2A9X3w9nFxj2"
version = "16.3.0"
checksum = "xyqDf8k3-D0_BXHGi0uLgMglnJ05Rf3j73QgbDs3sGtKNdBIQhY8JfqX1NcNoJQN"
)
return t.NewPackage("glslang", version, newFromGitHub(
"KhronosGroup/glslang",
@@ -116,7 +116,7 @@ func (t Toolchain) newGlslang() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Glslang] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newGlslang,
Name: "glslang",
@@ -124,7 +124,7 @@ func init() {
Website: "https://github.com/KhronosGroup/glslang",
ID: 205796,
}
})
}
func (t Toolchain) newSPIRVLLVMTranslator() (pkg.Artifact, string) {
@@ -132,15 +132,75 @@ func (t Toolchain) newSPIRVLLVMTranslator() (pkg.Artifact, string) {
version = "22.1.2"
checksum = "JZAaV5ewYcm-35YA_U2BM2IcsQouZtX1BLZR0zh2vSlfEXMsT5OCtY4Gh5RJkcGy"
)
skipChecks := []string{
// error: line 13: OpTypeCooperativeMatrixKHR Scope is limited to Workgroup and Subgroup
"cooperative_matrix_constant_null.spvasm",
}
switch t.arch {
case "arm64":
skipChecks = append(skipChecks,
// LLVM ERROR: unsupported calling convention
"DebugInfo/COFF/no-cus.ll",
"DebugInfo/Generic/2009-11-05-DeadGlobalVariable.ll",
"DebugInfo/Generic/2009-11-10-CurrentFn.ll",
"DebugInfo/Generic/2010-01-05-DbgScope.ll",
"DebugInfo/Generic/2010-03-12-llc-crash.ll",
"DebugInfo/Generic/2010-03-24-MemberFn.ll",
"DebugInfo/Generic/2010-04-19-FramePtr.ll",
"DebugInfo/Generic/2010-06-29-InlinedFnLocalVar.ll",
"DebugInfo/Generic/2010-10-01-crash.ll",
"DebugInfo/Generic/PR20038.ll",
"DebugInfo/Generic/constant-pointers.ll",
"DebugInfo/Generic/dead-argument-order.ll",
"DebugInfo/Generic/debug-info-eis-option.ll",
"DebugInfo/Generic/def-line.ll",
"DebugInfo/Generic/discriminator.ll",
"DebugInfo/Generic/dwarf-public-names.ll",
"DebugInfo/Generic/enum.ll",
"DebugInfo/Generic/func-using-decl.ll",
"DebugInfo/Generic/global.ll",
"DebugInfo/Generic/imported-name-inlined.ll",
"DebugInfo/Generic/incorrect-variable-debugloc1.ll",
"DebugInfo/Generic/inline-scopes.ll",
"DebugInfo/Generic/inlined-arguments.ll",
"DebugInfo/Generic/inlined-vars.ll",
"DebugInfo/Generic/linear-dbg-value.ll",
"DebugInfo/Generic/linkage-name-abstract.ll",
"DebugInfo/Generic/member-order.ll",
"DebugInfo/Generic/missing-abstract-variable.ll",
"DebugInfo/Generic/multiline.ll",
"DebugInfo/Generic/namespace_function_definition.ll",
"DebugInfo/Generic/namespace_inline_function_definition.ll",
"DebugInfo/Generic/noscopes.ll",
"DebugInfo/Generic/ptrsize.ll",
"DebugInfo/Generic/restrict.ll",
"DebugInfo/Generic/two-cus-from-same-file.ll",
"DebugInfo/Generic/version.ll",
"DebugInfo/LocalAddressSpace.ll",
"DebugInfo/UnknownBaseType.ll",
"DebugInfo/expr-opcode.ll",
)
}
return t.NewPackage("spirv-llvm-translator", version, newFromGitHub(
"KhronosGroup/SPIRV-LLVM-Translator",
"v"+version, checksum,
), &PackageAttr{
Patches: []KV{
{"remove-early-prefix", `diff --git a/CMakeLists.txt b/CMakeLists.txt
index c000a77e..86f79b03 100644
index c000a77e..f18f3fde 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -164,7 +164,7 @@ install(
${LLVM_SPIRV_INCLUDE_DIRS}/LLVMSPIRVOpts.h
${LLVM_SPIRV_INCLUDE_DIRS}/LLVMSPIRVExtensions.inc
DESTINATION
- ${CMAKE_INSTALL_PREFIX}/include/LLVMSPIRVLib
+ include/LLVMSPIRVLib
)
configure_file(LLVMSPIRVLib.pc.in ${CMAKE_BINARY_DIR}/LLVMSPIRVLib.pc @ONLY)
@@ -172,5 +172,5 @@ install(
FILES
${CMAKE_BINARY_DIR}/LLVMSPIRVLib.pc
@@ -148,14 +208,12 @@ index c000a77e..86f79b03 100644
- ${CMAKE_INSTALL_PREFIX}/lib${LLVM_LIBDIR_SUFFIX}/pkgconfig
+ lib${LLVM_LIBDIR_SUFFIX}/pkgconfig
)
`},
;`},
},
// litArgs emits shell syntax
ScriptEarly: `
export LIT_OPTS=` + litArgs(true,
// error: line 13: OpTypeCooperativeMatrixKHR Scope is limited to Workgroup and Subgroup
"cooperative_matrix_constant_null.spvasm") + `
export LIT_OPTS=` + litArgs(true, skipChecks...) + `
`,
}, &CMakeHelper{
Cache: []KV{
@@ -174,7 +232,7 @@ export LIT_OPTS=` + litArgs(true,
), version
}
func init() {
artifactsM[SPIRVLLVMTranslator] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newSPIRVLLVMTranslator,
Name: "spirv-llvm-translator",
@@ -186,5 +244,5 @@ func init() {
},
ID: 227273,
}
})
}

View File

@@ -1,7 +1,6 @@
package rosa
import (
"runtime"
"slices"
"strconv"
"strings"
@@ -64,7 +63,7 @@ chmod +w tests/test-c32ispunct.sh && echo '#!/bin/sh' > tests/test-c32ispunct.sh
), version
}
func init() {
artifactsM[M4] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newM4,
Name: "m4",
@@ -72,7 +71,7 @@ func init() {
Website: "https://www.gnu.org/software/m4/",
ID: 1871,
}
})
}
func (t Toolchain) newBison() (pkg.Artifact, string) {
@@ -99,7 +98,7 @@ func (t Toolchain) newBison() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Bison] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newBison,
Name: "bison",
@@ -107,7 +106,7 @@ func init() {
Website: "https://www.gnu.org/software/bison/",
ID: 193,
}
})
}
func (t Toolchain) newSed() (pkg.Artifact, string) {
@@ -126,7 +125,7 @@ func (t Toolchain) newSed() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Sed] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newSed,
Name: "sed",
@@ -134,7 +133,7 @@ func init() {
Website: "https://www.gnu.org/software/sed/",
ID: 4789,
}
})
}
func (t Toolchain) newAutoconf() (pkg.Artifact, string) {
@@ -161,7 +160,7 @@ func (t Toolchain) newAutoconf() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Autoconf] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newAutoconf,
Name: "autoconf",
@@ -174,7 +173,7 @@ func init() {
},
ID: 141,
}
})
}
func (t Toolchain) newAutomake() (pkg.Artifact, string) {
@@ -206,7 +205,7 @@ test_disable '#!/bin/sh' t/pr9.sh
), version
}
func init() {
artifactsM[Automake] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newAutomake,
Name: "automake",
@@ -218,7 +217,7 @@ func init() {
},
ID: 144,
}
})
}
func (t Toolchain) newLibtool() (pkg.Artifact, string) {
@@ -232,7 +231,7 @@ func (t Toolchain) newLibtool() (pkg.Artifact, string) {
pkg.TarGzip,
), nil, &MakeHelper{
// _Z2a2c: symbol not found
SkipCheck: t.isStage0(),
SkipCheck: t.stage.isStage0(),
Check: []string{
"TESTSUITEFLAGS=" + jobsFlagE,
@@ -244,7 +243,7 @@ func (t Toolchain) newLibtool() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Libtool] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibtool,
Name: "libtool",
@@ -252,7 +251,7 @@ func init() {
Website: "https://www.gnu.org/software/libtool/",
ID: 1741,
}
})
}
func (t Toolchain) newGzip() (pkg.Artifact, string) {
@@ -270,7 +269,7 @@ func (t Toolchain) newGzip() (pkg.Artifact, string) {
}), version
}
func init() {
artifactsM[Gzip] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newGzip,
Name: "gzip",
@@ -278,7 +277,7 @@ func init() {
Website: "https://www.gnu.org/software/gzip/",
ID: 1290,
}
})
}
func (t Toolchain) newGettext() (pkg.Artifact, string) {
@@ -316,7 +315,7 @@ touch gettext-tools/autotools/archive.dir.tar
), version
}
func init() {
artifactsM[Gettext] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newGettext,
Name: "gettext",
@@ -324,7 +323,7 @@ func init() {
Website: "https://www.gnu.org/software/gettext/",
ID: 898,
}
})
}
func (t Toolchain) newDiffutils() (pkg.Artifact, string) {
@@ -349,7 +348,7 @@ test_disable '#!/bin/sh' tests/cmp
}, (*MakeHelper)(nil)), version
}
func init() {
artifactsM[Diffutils] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newDiffutils,
Name: "diffutils",
@@ -357,7 +356,7 @@ func init() {
Website: "https://www.gnu.org/software/diffutils/",
ID: 436,
}
})
}
func (t Toolchain) newPatch() (pkg.Artifact, string) {
@@ -381,7 +380,7 @@ test_disable '#!/bin/sh' tests/need-filename
}, (*MakeHelper)(nil)), version
}
func init() {
artifactsM[Patch] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newPatch,
Name: "patch",
@@ -389,7 +388,7 @@ func init() {
Website: "https://savannah.gnu.org/projects/patch/",
ID: 2597,
}
})
}
func (t Toolchain) newBash() (pkg.Artifact, string) {
@@ -411,7 +410,7 @@ func (t Toolchain) newBash() (pkg.Artifact, string) {
}), version
}
func init() {
artifactsM[Bash] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newBash,
Name: "bash",
@@ -419,7 +418,7 @@ func init() {
Website: "https://www.gnu.org/software/bash/",
ID: 166,
}
})
}
func (t Toolchain) newCoreutils() (pkg.Artifact, string) {
@@ -457,7 +456,7 @@ test_disable 'int main(){return 0;}' gnulib-tests/test-lchown.c
), version
}
func init() {
artifactsM[Coreutils] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newCoreutils,
Name: "coreutils",
@@ -465,7 +464,7 @@ func init() {
Website: "https://www.gnu.org/software/coreutils/",
ID: 343,
}
})
}
func (t Toolchain) newTexinfo() (pkg.Artifact, string) {
@@ -485,7 +484,7 @@ func (t Toolchain) newTexinfo() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Texinfo] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newTexinfo,
Name: "texinfo",
@@ -498,7 +497,7 @@ func init() {
},
ID: 4958,
}
})
}
func (t Toolchain) newGperf() (pkg.Artifact, string) {
@@ -515,7 +514,7 @@ func (t Toolchain) newGperf() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Gperf] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newGperf,
Name: "gperf",
@@ -523,7 +522,7 @@ func init() {
Website: "https://www.gnu.org/software/gperf/",
ID: 1237,
}
})
}
func (t Toolchain) newGawk() (pkg.Artifact, string) {
@@ -543,7 +542,7 @@ func (t Toolchain) newGawk() (pkg.Artifact, string) {
}), version
}
func init() {
artifactsM[Gawk] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newGawk,
Name: "gawk",
@@ -551,7 +550,7 @@ func init() {
Website: "https://www.gnu.org/software/gawk/",
ID: 868,
}
})
}
func (t Toolchain) newGrep() (pkg.Artifact, string) {
@@ -576,7 +575,7 @@ test_disable 'int main(){return 0;}' gnulib-tests/test-c32ispunct.c
), version
}
func init() {
artifactsM[Grep] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newGrep,
Name: "grep",
@@ -584,7 +583,7 @@ func init() {
Website: "https://www.gnu.org/software/grep/",
ID: 1251,
}
})
}
func (t Toolchain) newFindutils() (pkg.Artifact, string) {
@@ -607,7 +606,7 @@ echo 'int main(){return 0;}' > tests/xargs/test-sigusr.c
), version
}
func init() {
artifactsM[Findutils] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newFindutils,
Name: "findutils",
@@ -615,7 +614,7 @@ func init() {
Website: "https://www.gnu.org/software/findutils/",
ID: 812,
}
})
}
func (t Toolchain) newBC() (pkg.Artifact, string) {
@@ -636,7 +635,7 @@ func (t Toolchain) newBC() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[BC] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newBC,
Name: "bc",
@@ -644,7 +643,7 @@ func init() {
Website: "https://www.gnu.org/software/bc/",
ID: 170,
}
})
}
func (t Toolchain) newLibiconv() (pkg.Artifact, string) {
@@ -659,7 +658,7 @@ func (t Toolchain) newLibiconv() (pkg.Artifact, string) {
), nil, (*MakeHelper)(nil)), version
}
func init() {
artifactsM[Libiconv] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibiconv,
Name: "libiconv",
@@ -667,7 +666,7 @@ func init() {
Website: "https://www.gnu.org/software/libiconv/",
ID: 10656,
}
})
}
func (t Toolchain) newTar() (pkg.Artifact, string) {
@@ -701,7 +700,7 @@ func (t Toolchain) newTar() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Tar] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newTar,
Name: "tar",
@@ -709,7 +708,7 @@ func init() {
Website: "https://www.gnu.org/software/tar/",
ID: 4939,
}
})
}
func (t Toolchain) newParallel() (pkg.Artifact, string) {
@@ -731,7 +730,7 @@ ln -s ../system/bin/bash /bin/
), version
}
func init() {
artifactsM[Parallel] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newParallel,
Name: "parallel",
@@ -743,7 +742,7 @@ func init() {
},
ID: 5448,
}
})
}
func (t Toolchain) newLibunistring() (pkg.Artifact, string) {
@@ -768,7 +767,7 @@ test_disable 'int main(){return 0;}' tests/test-c32ispunct.c
), version
}
func init() {
artifactsM[Libunistring] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibunistring,
Name: "libunistring",
@@ -776,7 +775,7 @@ func init() {
Website: "https://www.gnu.org/software/libunistring/",
ID: 1747,
}
})
}
func (t Toolchain) newLibtasn1() (pkg.Artifact, string) {
@@ -791,7 +790,7 @@ func (t Toolchain) newLibtasn1() (pkg.Artifact, string) {
), nil, (*MakeHelper)(nil)), version
}
func init() {
artifactsM[Libtasn1] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibtasn1,
Name: "libtasn1",
@@ -799,7 +798,7 @@ func init() {
Website: "https://www.gnu.org/software/libtasn1/",
ID: 1734,
}
})
}
func (t Toolchain) newReadline() (pkg.Artifact, string) {
@@ -821,7 +820,7 @@ func (t Toolchain) newReadline() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Readline] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newReadline,
Name: "readline",
@@ -833,7 +832,7 @@ func init() {
},
ID: 4173,
}
})
}
func (t Toolchain) newGnuTLS() (pkg.Artifact, string) {
@@ -843,7 +842,7 @@ func (t Toolchain) newGnuTLS() (pkg.Artifact, string) {
)
var configureExtra []KV
switch runtime.GOARCH {
switch t.arch {
case "arm64":
configureExtra = []KV{
{"disable-hardware-acceleration"},
@@ -998,7 +997,7 @@ index 1b78b8cf1..350156a86 100644
), version
}
func init() {
artifactsM[GnuTLS] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newGnuTLS,
Name: "gnutls",
@@ -1014,7 +1013,7 @@ func init() {
},
ID: 1221,
}
})
}
func (t Toolchain) newBinutils() (pkg.Artifact, string) {
@@ -1031,7 +1030,7 @@ func (t Toolchain) newBinutils() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Binutils] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newBinutils,
Name: "binutils",
@@ -1039,7 +1038,7 @@ func init() {
Website: "https://www.gnu.org/software/binutils/",
ID: 7981,
}
})
}
func (t Toolchain) newGMP() (pkg.Artifact, string) {
@@ -1061,7 +1060,7 @@ func (t Toolchain) newGMP() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[GMP] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newGMP,
Name: "gmp",
@@ -1069,7 +1068,7 @@ func init() {
Website: "https://gmplib.org/",
ID: 1186,
}
})
}
func (t Toolchain) newMPFR() (pkg.Artifact, string) {
@@ -1087,7 +1086,7 @@ func (t Toolchain) newMPFR() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[MPFR] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newMPFR,
Name: "mpfr",
@@ -1099,7 +1098,7 @@ func init() {
},
ID: 2019,
}
})
}
func (t Toolchain) newMPC() (pkg.Artifact, string) {
@@ -1126,7 +1125,7 @@ func (t Toolchain) newMPC() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[MPC] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newMPC,
Name: "mpc",
@@ -1138,17 +1137,17 @@ func init() {
},
ID: 1667,
}
})
}
func (t Toolchain) newGCC() (pkg.Artifact, string) {
const (
version = "15.2.0"
checksum = "TXJ5WrbXlGLzy1swghQTr4qxgDCyIZFgJry51XEPTBZ8QYbVmFeB4lZbSMtPJ-a1"
version = "16.1.0"
checksum = "4ASoWbxaA2FW7PAB0zzHDPC5XnNhyaAyjtDPpGzceSLeYnEIXsNYZR3PA_Zu5P0K"
)
var configureExtra []KV
switch runtime.GOARCH {
switch t.arch {
case "amd64", "arm64":
configureExtra = append(configureExtra, KV{"with-multilib-list", "''"})
}
@@ -1347,7 +1346,7 @@ ln -s system/lib /work/
), version
}
func init() {
artifactsM[gcc] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newGCC,
Name: "gcc",
@@ -1363,5 +1362,5 @@ func init() {
},
ID: 6502,
}
})
}

View File

@@ -1,7 +1,6 @@
package rosa
import (
"runtime"
"slices"
"hakurei.app/internal/pkg"
@@ -10,7 +9,7 @@ import (
// newGoBootstrap returns the Go bootstrap toolchain.
func (t Toolchain) newGoBootstrap() pkg.Artifact {
const checksum = "8o9JL_ToiQKadCTb04nvBDkp8O1xiWOolAxVEqaTGodieNe4lOFEjlOxN3bwwe23"
return t.New("go1.4-bootstrap", 0, t.AppendPresets(nil,
return t.New("go1.4-bootstrap", 0, t.Append(nil,
Bash,
), nil, []string{
"CGO_ENABLED=0",
@@ -35,7 +34,11 @@ func (t Toolchain) newGo(
script string,
extra ...pkg.Artifact,
) pkg.Artifact {
return t.New("go"+version, 0, t.AppendPresets(extra,
name := "all"
if t.opts&OptSkipCheck != 0 {
name = "make"
}
return t.New("go"+version, 0, t.Append(extra,
Bash,
), nil, slices.Concat([]string{
"CC=cc",
@@ -48,7 +51,7 @@ cp -r /usr/src/go /work/system
cd /work/system/go/src
chmod -R +w ..
`+script+`
./all.bash
./`+name+`.bash
mkdir /work/system/bin
ln -s \
@@ -69,17 +72,17 @@ func (t Toolchain) newGoLatest() (pkg.Artifact, string) {
finalEnv []string
)
switch runtime.GOARCH {
switch t.arch {
case "amd64":
bootstrapExtra = append(bootstrapExtra, t.newGoBootstrap())
case "arm64", "riscv64":
bootstrapEnv = append(bootstrapEnv, "GOROOT_BOOTSTRAP=/system")
bootstrapExtra = t.AppendPresets(bootstrapExtra, gcc)
bootstrapExtra = t.Append(bootstrapExtra, gcc)
finalEnv = append(finalEnv, "CGO_ENABLED=0")
default:
panic("unsupported target " + runtime.GOARCH)
panic("unsupported target " + t.arch)
}
go119 := t.newGo(
@@ -103,8 +106,8 @@ echo \
"YtrDka402BOAEwywx03Vz4QlVwoBiguJHzG7PuythMCPHXS8CVMLvzmvgEbu4Tzu",
[]string{"CGO_ENABLED=0"}, `
sed -i \
's,/lib/ld-musl-`+linuxArch()+`.so.1,/system/bin/linker,' \
cmd/link/internal/`+runtime.GOARCH+`/obj.go
's,/lib/ld-musl-`+t.linuxArch()+`.so.1,/system/bin/linker,' \
cmd/link/internal/`+t.arch+`/obj.go
rm \
crypto/tls/handshake_client_test.go \
@@ -121,18 +124,18 @@ echo \
"wcI32bl1tkqbgcelGtGWPI4RtlEddd-PTd76Eb-k7nXA5LbE9yTNdIL9QSOOxMOs",
[]string{"CGO_ENABLED=0"}, `
sed -i \
's,/lib/ld-musl-`+linuxArch()+`.so.1,/system/bin/linker,' \
cmd/link/internal/`+runtime.GOARCH+`/obj.go
's,/lib/ld-musl-`+t.linuxArch()+`.so.1,/system/bin/linker,' \
cmd/link/internal/`+t.arch+`/obj.go
`, go121,
)
go125 := t.newGo(
"1.25.9",
"gShJb9uOMk5AxqPSwvn53ZO56S6PyP6nfojzrHUiJ3krAvrgjJpYa6-DPA-jxbpN",
"1.25.10",
"TwKwatkpwal-j9U2sDSRPEdM3YesI4Gm88YgGV59wtU-L85K9gA7UPy9SCxn6PMb",
[]string{"CGO_ENABLED=0"}, `
sed -i \
's,/lib/ld-musl-`+linuxArch()+`.so.1,/system/bin/linker,' \
cmd/link/internal/`+runtime.GOARCH+`/obj.go
's,/lib/ld-musl-`+t.linuxArch()+`.so.1,/system/bin/linker,' \
cmd/link/internal/`+t.arch+`/obj.go
rm \
os/root_unix_test.go \
@@ -141,16 +144,16 @@ rm \
)
const (
version = "1.26.2"
checksum = "v-6BE89_1g3xYf-9oIYpJKFXlo3xKHYJj2_VGkaUq8ZVkIVQmLwrto-xGG03OISH"
version = "1.26.3"
checksum = "lEiFocZFnN5fKvZzmwVdqc9pYUjAuhzqZGbuiOqxUP4XdcY8yECisKcqsQ_eNn1N"
)
return t.newGo(
version,
checksum,
finalEnv, `
sed -i \
's,/lib/ld-musl-`+linuxArch()+`.so.1,/system/bin/linker,' \
cmd/link/internal/`+runtime.GOARCH+`/obj.go
's,/lib/ld-musl-`+t.linuxArch()+`.so.1,/system/bin/linker,' \
cmd/link/internal/`+t.arch+`/obj.go
sed -i \
's/cpu.X86.HasAVX512VBMI/& \&\& cpu.X86.HasPOPCNT/' \
internal/runtime/gc/scan/scan_amd64.go
@@ -163,7 +166,7 @@ rm \
), version
}
func init() {
artifactsM[Go] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newGoLatest,
Name: "go",
@@ -171,5 +174,5 @@ func init() {
Website: "https://go.dev/",
ID: 1227,
}
})
}


@@ -7,8 +7,8 @@ import (
func (t Toolchain) newGLib() (pkg.Artifact, string) {
const (
version = "2.88.0"
checksum = "T79Cg4z6j-sDZ2yIwvbY4ccRv2-fbwbqgcw59F5NQ6qJT6z4v261vbYp3dHO6Ma3"
version = "2.88.1"
checksum = "Rkszn6W4RHjyspyqfXdVAVawdwDJCuS0Zu0f7qot7tbJhnw2fUDoUUJB40m-1MCX"
)
return t.NewPackage("glib", version, t.newTagRemote(
"https://gitlab.gnome.org/GNOME/glib.git",
@@ -42,7 +42,7 @@ func (t Toolchain) newGLib() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[GLib] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newGLib,
Name: "glib",
@@ -56,5 +56,5 @@ func init() {
},
ID: 10024,
}
})
}


@@ -14,7 +14,7 @@ go build -o /bin/hostname /usr/src/hostname/main.go
hostname = ""
}
return t.New("hakurei"+suffix+"-"+hakureiVersion, 0, t.AppendPresets(nil,
return t.New("hakurei"+suffix+"-"+hakureiVersion, 0, t.Append(nil,
Go,
PkgConfig,
@@ -58,7 +58,7 @@ func main() {
)))
}
func init() {
artifactsM[Hakurei] = Metadata{
native.MustRegister(&Artifact{
f: func(t Toolchain) (pkg.Artifact, string) {
return t.newHakurei("", `
mkdir -p /work/system/libexec/hakurei/
@@ -93,17 +93,21 @@ mkdir -p /work/system/bin/
Website: "https://hakurei.app/",
ID: 388834,
}
artifactsM[HakureiDist] = Metadata{
})
native.MustRegister(&Artifact{
f: func(t Toolchain) (pkg.Artifact, string) {
name := "all"
if t.opts&OptSkipCheck != 0 {
name = "make"
}
return t.newHakurei("-dist", `
export HAKUREI_VERSION
DESTDIR=/work /usr/src/hakurei/all.sh
DESTDIR=/work /usr/src/hakurei/`+name+`.sh
`, true), hakureiVersion
},
Name: "hakurei-dist",
Description: "low-level userspace tooling for Rosa OS (distribution tarball)",
Website: "https://hakurei.app/",
}
})
}


@@ -4,13 +4,13 @@ package rosa
import "hakurei.app/internal/pkg"
const hakureiVersion = "0.4.1"
const hakureiVersion = "0.4.2"
// hakureiSource is the source code of a hakurei release.
var hakureiSource = newTar(
"https://git.gensokyo.uk/rosa/hakurei/archive/"+
"v"+hakureiVersion+".tar.gz",
"8bHvZcjUQOXUPbKL-qq99pHFTPnn-h7j1fkJudbGs8waLm3OmkI6eHfQev5bug2y",
"jadgaOrxv5ABGvzQ_Rk0aPGz7U8K-427TbMhQNQ32scSizEnlR44Pu7NoWYWVZWq",
pkg.TarGzip,
)

internal/rosa/hwdata.go (new file)

@@ -0,0 +1,34 @@
package rosa
import "hakurei.app/internal/pkg"
func (t Toolchain) newHwdata() (pkg.Artifact, string) {
const (
version = "0.407"
checksum = "6p1XD0CRuzt6hLfjv4ShKBW934BexmoPkRrmwxD4J63fBVCzVBRHyF8pVJdW_Xjm"
)
return t.NewPackage("hwdata", version, newFromGitHub(
"vcrhonek/hwdata",
"v"+version, checksum,
), &PackageAttr{
Writable: true,
EnterSource: true,
}, &MakeHelper{
// awk: fatal: cannot open file `hwdata.spec' for reading: No such file or directory
InPlace: true,
// lspci: Unknown option 'A' (see "lspci --help")
SkipCheck: true,
}), version
}
func init() {
native.MustRegister(&Artifact{
f: Toolchain.newHwdata,
Name: "hwdata",
Description: "contains various hardware identification and configuration data",
Website: "https://github.com/vcrhonek/hwdata",
ID: 5387,
})
}


@@ -6,7 +6,7 @@ import (
)
func init() {
artifactsM[EarlyInit] = Metadata{
native.MustRegister(&Artifact{
Name: "earlyinit",
Description: "Rosa OS initramfs init program",
@@ -24,15 +24,14 @@ go build -trimpath -v -o /work/system/libexec/hakurei -ldflags="-s -w
echo
`, false), Unversioned
},
}
})
}
func (t Toolchain) newImageSystem() (pkg.Artifact, string) {
return t.New("system.img", TNoToolchain, t.AppendPresets(nil,
return t.New("system.img", TNoToolchain, t.Append(nil,
SquashfsTools,
), nil, nil, `
mksquashfs /mnt/system /work/system.img
`, pkg.Path(fhs.AbsRoot.Append("mnt"), false, t.AppendPresets(nil,
`, pkg.Path(fhs.AbsRoot.Append("mnt"), false, t.Append(nil,
Musl,
Mksh,
Toybox,
@@ -43,16 +42,15 @@ mksquashfs /mnt/system /work/system.img
)...)), Unversioned
}
func init() {
artifactsM[ImageSystem] = Metadata{
native.MustRegister(&Artifact{
Name: "system-image",
Description: "Rosa OS system image",
f: Toolchain.newImageSystem,
}
})
}
func (t Toolchain) newImageInitramfs() (pkg.Artifact, string) {
return t.New("initramfs", TNoToolchain, t.AppendPresets(nil,
return t.New("initramfs", TNoToolchain, t.Append(nil,
Zstd,
EarlyInit,
GenInitCPIO,
@@ -66,10 +64,10 @@ file /init /system/libexec/hakurei/earlyinit 0555 0 0
`)))), Unversioned
}
func init() {
artifactsM[ImageInitramfs] = Metadata{
native.MustRegister(&Artifact{
Name: "initramfs-image",
Description: "Rosa OS initramfs image",
f: Toolchain.newImageInitramfs,
}
})
}


@@ -2,12 +2,12 @@ package rosa
import "hakurei.app/internal/pkg"
const kernelVersion = "6.12.84"
const kernelVersion = "6.12.87"
var kernelSource = newTar(
"https://git.kernel.org/pub/scm/linux/kernel/git/stable/linux.git/"+
"snapshot/linux-"+kernelVersion+".tar.gz",
"GJLUEu68r3DpLYoTcMl4wA_ThMBs_Zwc0gZsp82ii_3AOfcVxpI639IKfq2jAAY2",
"QTl5teIy0K5KsOLYGHQ3FbnPCZNRH2bySXVzghiOoHDdM3zAcSPUkmdly85lMzHx",
pkg.TarGzip,
)
@@ -19,18 +19,35 @@ chmod -R +w /work/usr/src/linux/
`, pkg.Path(AbsUsrSrc.Append("linux"), false, kernelSource)), kernelVersion
}
func init() {
artifactsM[KernelSource] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newKernelSource,
Name: "kernel-source",
Description: "a writable kernel source tree installed to /usr/src/linux",
Website: "https://kernel.org/",
}
})
}
func (t Toolchain) newKernelHeaders() (pkg.Artifact, string) {
checksum := perArch[string]{
"amd64": "lCmBNcMeUmXifg0vecKOPy3GAaFcJSmOPnf3wit9xYTDSTsFADPt1xxUFfmTn1fD",
"arm64": "PlRxp4JzZeMGx7CScRlT1NBzc2NVyJlb8Gm8sa3ofFght9ZT101ZJhcIXiCkHSHM",
}
return t.NewPackage("kernel-headers", kernelVersion, kernelSource, &PackageAttr{
Flag: TEarly,
Flag: TEarly,
KnownChecksum: new(mustDecode(checksum.unwrap(t.S))),
Paths: []pkg.ExecPath{
// updated manually for API changes
pkg.Path(AbsUsrSrc.Append("version.h"), false, pkg.NewFile(
"version.h", []byte(`#define LINUX_VERSION_CODE 396372
#define KERNEL_VERSION(a,b,c) (((a) << 16) + ((b) << 8) + ((c) > 255 ? 255 : (c)))
#define LINUX_VERSION_MAJOR 6
#define LINUX_VERSION_PATCHLEVEL 12
#define LINUX_VERSION_SUBLEVEL 84
`),
)),
},
}, &MakeHelper{
SkipConfigure: true,
@@ -43,19 +60,23 @@ func (t Toolchain) newKernelHeaders() (pkg.Artifact, string) {
"INSTALL_HDR_PATH=/work/system",
"headers_install",
},
Install: "# headers installed during make",
Install: `
cat \
/usr/src/version.h > \
/work/system/include/linux/version.h
`,
},
Rsync,
), kernelVersion
}
func init() {
artifactsM[KernelHeaders] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newKernelHeaders,
Name: "kernel-headers",
Description: "an installation of kernel headers",
Website: "https://kernel.org/",
}
})
}
func (t Toolchain) newKernel() (pkg.Artifact, string) {
@@ -1254,7 +1275,7 @@ rm -v /work/system/lib/modules/` + kernelVersion + `/build
), kernelVersion
}
func init() {
artifactsM[Kernel] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newKernel,
Name: "kernel",
@@ -1262,7 +1283,7 @@ func init() {
Website: "https://kernel.org/",
ID: 375621,
}
})
}
func (t Toolchain) newGenInitCPIO() (pkg.Artifact, string) {
@@ -1272,12 +1293,12 @@ cc -o /work/system/bin/gen_init_cpio /usr/src/linux/usr/gen_init_cpio.c
`, pkg.Path(AbsUsrSrc.Append("linux"), false, kernelSource)), kernelVersion
}
func init() {
artifactsM[GenInitCPIO] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newGenInitCPIO,
Name: "gen_init_cpio",
Description: "a program in the kernel source tree for creating initramfs archive",
}
})
}
func (t Toolchain) newFirmware() (pkg.Artifact, string) {
@@ -1319,7 +1340,7 @@ func (t Toolchain) newFirmware() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Firmware] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newFirmware,
Name: "firmware",
@@ -1327,5 +1348,5 @@ func init() {
Website: "https://git.kernel.org/pub/scm/linux/kernel/git/firmware/linux-firmware.git/",
ID: 141464,
}
})
}


@@ -32,7 +32,7 @@ func (t Toolchain) newKmod() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Kmod] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newKmod,
Name: "kmod",
@@ -46,5 +46,5 @@ func init() {
},
ID: 1517,
}
})
}

internal/rosa/libarchive.go (new file)

@@ -0,0 +1,100 @@
package rosa
import "hakurei.app/internal/pkg"
func (t Toolchain) newLibarchive() (pkg.Artifact, string) {
const (
version = "3.8.7"
checksum = "CUJK4MDQmZmATClgQBH2Wt-7Ts4iiSUlg1J_TVb6-5IK3rVUgVLIMc5k-bnWB9w3"
)
return t.NewPackage("libarchive", version, newFromGitHub(
"libarchive/libarchive",
"v"+version, checksum,
), &PackageAttr{
Paths: []pkg.ExecPath{
pkg.Path(AbsUsrSrc.Append(
"CTestCustom.cmake",
), false, pkg.NewFile("CTestCustom.cmake", []byte(`
list(APPEND CTEST_CUSTOM_TESTS_IGNORE
"libarchive_test_archive_string_conversion_fail_c"
"libarchive_test_archive_string_conversion_fail_latin1"
"libarchive_test_archive_string_update_utf8_koi8"
"libarchive_test_gnutar_filename_encoding_KOI8R_UTF8"
"libarchive_test_gnutar_filename_encoding_KOI8R_CP866"
"libarchive_test_gnutar_filename_encoding_CP1251_UTF8"
"libarchive_test_gnutar_filename_encoding_Russian_Russia"
"libarchive_test_gnutar_filename_encoding_EUCJP_UTF8"
"libarchive_test_gnutar_filename_encoding_EUCJP_CP932"
"libarchive_test_gnutar_filename_encoding_CP932_UTF8"
"libarchive_test_pax_filename_encoding_KOI8R"
"libarchive_test_pax_filename_encoding_CP1251"
"libarchive_test_pax_filename_encoding_EUCJP"
"libarchive_test_pax_filename_encoding_CP932"
"libarchive_test_read_format_cpio_filename_UTF8_eucJP"
"libarchive_test_read_format_cpio_filename_CP866_KOI8R"
"libarchive_test_read_format_cpio_filename_KOI8R_CP866"
"libarchive_test_read_format_cpio_filename_UTF8_KOI8R"
"libarchive_test_read_format_cpio_filename_UTF8_CP866"
"libarchive_test_read_format_cpio_filename_eucJP_CP932"
"libarchive_test_read_format_cpio_filename_UTF8_CP932"
"libarchive_test_read_format_cpio_filename_CP866_CP1251"
"libarchive_test_read_format_cpio_filename_CP866_CP1251_win"
"libarchive_test_read_format_cpio_filename_KOI8R_CP1251"
"libarchive_test_read_format_cpio_filename_UTF8_CP1251"
"libarchive_test_read_format_gtar_filename_CP866_KOI8R"
"libarchive_test_read_format_gtar_filename_KOI8R_CP866"
"libarchive_test_read_format_gtar_filename_eucJP_CP932"
"libarchive_test_read_format_gtar_filename_CP866_CP1251"
"libarchive_test_read_format_gtar_filename_CP866_CP1251_win"
"libarchive_test_read_format_gtar_filename_KOI8R_CP1251"
"libarchive_test_read_format_rar_unicode_CP932"
"libarchive_test_read_format_zip_filename_CP932_eucJP"
"libarchive_test_read_format_zip_filename_UTF8_eucJP"
"libarchive_test_read_format_zip_filename_CP866_KOI8R"
"libarchive_test_read_format_zip_filename_KOI8R_CP866"
"libarchive_test_read_format_zip_filename_UTF8_KOI8R"
"libarchive_test_read_format_zip_filename_UTF8_CP866"
"libarchive_test_read_format_zip_filename_CP932_CP932"
"libarchive_test_read_format_zip_filename_UTF8_CP932"
"libarchive_test_read_format_zip_filename_CP866_CP1251"
"libarchive_test_read_format_zip_filename_CP866_CP1251_win"
"libarchive_test_read_format_zip_filename_KOI8R_CP1251"
"libarchive_test_read_format_zip_filename_UTF8_CP1251"
"libarchive_test_ustar_filename_encoding_KOI8R_UTF8"
"libarchive_test_ustar_filename_encoding_KOI8R_CP866"
"libarchive_test_ustar_filename_encoding_CP1251_UTF8"
"libarchive_test_ustar_filename_encoding_Russian_Russia"
"libarchive_test_ustar_filename_encoding_EUCJP_UTF8"
"libarchive_test_ustar_filename_encoding_EUCJP_CP932"
"libarchive_test_ustar_filename_encoding_CP932_UTF8"
"libarchive_test_zip_filename_encoding_KOI8R"
"libarchive_test_zip_filename_encoding_ru_RU_CP1251"
"libarchive_test_zip_filename_encoding_Russian_Russia"
"libarchive_test_zip_filename_encoding_EUCJP"
"libarchive_test_zip_filename_encoding_CP932"
"libarchive_test_read_format_cab_filename"
"libarchive_test_read_format_lha_filename"
"libarchive_test_read_format_tar_filename"
"libarchive_test_read_format_ustar_filename"
"libarchive_test_read_append_wrong_filter"
)
`))),
},
Writable: true,
ScriptEarly: `
install -Dv /usr/src/CTestCustom.cmake /cure/
`,
}, (*CMakeHelper)(nil)), version
}
func init() {
native.MustRegister(&Artifact{
f: Toolchain.newLibarchive,
Name: "libarchive",
Description: "multi-format archive and compression library",
Website: "https://www.libarchive.org/",
ID: 1558,
})
}


@@ -4,8 +4,8 @@ import "hakurei.app/internal/pkg"
func (t Toolchain) newLibmd() (pkg.Artifact, string) {
const (
version = "1.1.0"
checksum = "9apYqPPZm0j5HQT8sCsVIhnVIqRD7XgN7kPIaTwTqnTuUq5waUAMq4M7ev8CODJ1"
version = "1.2.0"
checksum = "1rJ6joAO0wwMZvSfnRNkc1MOhywyAq7SM8VmF92NvDtv7Qdl1LRbjm5fg_DFFtGj"
)
return t.NewPackage("libmd", version, t.newTagRemote(
"https://git.hadrons.org/git/libmd.git",
@@ -21,7 +21,7 @@ install -D /usr/src/libmd/src/helper.c src/helper.c
), version
}
func init() {
artifactsM[Libmd] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibmd,
Name: "libmd",
@@ -29,7 +29,7 @@ func init() {
Website: "https://www.hadrons.org/software/libmd/",
ID: 15525,
}
})
}
func (t Toolchain) newLibbsd() (pkg.Artifact, string) {
@@ -50,7 +50,7 @@ func (t Toolchain) newLibbsd() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Libbsd] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibbsd,
Name: "libbsd",
@@ -58,5 +58,5 @@ func init() {
Website: "https://libbsd.freedesktop.org/",
ID: 1567,
}
})
}


@@ -42,7 +42,7 @@ ln -s ../system/bin/bash /bin/
), version
}
func init() {
artifactsM[Libcap] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibcap,
Name: "libcap",
@@ -50,5 +50,5 @@ func init() {
Website: "https://sites.google.com/site/fullycapable/",
ID: 1569,
}
})
}


@@ -0,0 +1,50 @@
package rosa
import "hakurei.app/internal/pkg"
func (t Toolchain) newLibconfig() (pkg.Artifact, string) {
const (
version = "1.8.2"
checksum = "fD32hjeAZuTz98g6WYHRwsxphrgrEFqxi5Z1jlJemPckPBfxpS3i5HgshAuA6vmT"
)
return t.NewPackage("libconfig", version, newFromGitHub(
"hyperrealm/libconfig",
"v"+version,
checksum,
), &PackageAttr{
Patches: []KV{
{"disable-broken-tests", `diff --git a/tests/tests.c b/tests/tests.c
index eba7eae..f916d2e 100644
--- a/tests/tests.c
+++ b/tests/tests.c
@@ -753,7 +753,6 @@ int main(int argc, char **argv)
int failures;
TT_SUITE_START(LibConfigTests);
- TT_SUITE_TEST(LibConfigTests, ParsingAndFormatting);
TT_SUITE_TEST(LibConfigTests, ParseInvalidFiles);
TT_SUITE_TEST(LibConfigTests, ParseInvalidStrings);
TT_SUITE_TEST(LibConfigTests, BigInt1);
@@ -768,7 +767,6 @@ int main(int argc, char **argv)
TT_SUITE_TEST(LibConfigTests, OverrideSetting);
TT_SUITE_TEST(LibConfigTests, SettingLookups);
TT_SUITE_TEST(LibConfigTests, ReadStream);
- TT_SUITE_TEST(LibConfigTests, BinaryAndHex);
TT_SUITE_RUN(LibConfigTests);
failures = TT_SUITE_NUM_FAILURES(LibConfigTests);
TT_SUITE_END(LibConfigTests);
`},
},
}, (*CMakeHelper)(nil)), version
}
func init() {
native.MustRegister(&Artifact{
f: Toolchain.newLibconfig,
Name: "libconfig",
Description: "a simple library for processing structured configuration files",
Website: "https://hyperrealm.github.io/libconfig/",
ID: 1580,
})
}


@@ -0,0 +1,30 @@
package rosa
import "hakurei.app/internal/pkg"
func (t Toolchain) newLibdisplayInfo() (pkg.Artifact, string) {
const (
version = "0.3.0"
checksum = "yjOqPUHHYgRtpqGw5RI1n2Q1_hO5j0LiFNMbjcRWV4Nf71XwwoC9fZMlKBDeLchT"
)
return t.NewPackage("libdisplay-info", version, newFromGitLab(
"gitlab.freedesktop.org",
"emersion/libdisplay-info",
version, checksum,
), nil, (*MesonHelper)(nil),
Diffutils,
Hwdata,
), version
}
func init() {
native.MustRegister(&Artifact{
f: Toolchain.newLibdisplayInfo,
Name: "libdisplay-info",
Description: "EDID and DisplayID library",
Website: "https://gitlab.freedesktop.org/emersion/libdisplay-info",
ID: 326668,
})
}

internal/rosa/libepoxy.go (new file)

@@ -0,0 +1,33 @@
package rosa
import "hakurei.app/internal/pkg"
func (t Toolchain) newLibepoxy() (pkg.Artifact, string) {
const (
version = "1.5.10"
checksum = "OHI8wshrlGw6BMGrmSyejJtwzM2gPhyFJrTsKxULyKMmYrfgcOe7Iw2ibVoUND_Q"
)
return t.NewPackage("libepoxy", version, newFromGitHub(
"anholt/libepoxy",
version,
checksum,
), nil, &MesonHelper{
Setup: []KV{
{"Dglx", "no"},
{"Degl", "no"},
},
},
LibX11,
), version
}
func init() {
native.MustRegister(&Artifact{
f: Toolchain.newLibepoxy,
Name: "libepoxy",
Description: "a library for handling OpenGL function pointer management",
Website: "https://github.com/anholt/libepoxy",
ID: 6090,
})
}


@@ -14,7 +14,7 @@ func (t Toolchain) newLibev() (pkg.Artifact, string) {
), nil, (*MakeHelper)(nil)), version
}
func init() {
artifactsM[Libev] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibev,
Name: "libev",
@@ -22,5 +22,5 @@ func init() {
Website: "http://libev.schmorp.de/",
ID: 1605,
}
})
}


@@ -8,8 +8,8 @@ import (
func (t Toolchain) newLibexpat() (pkg.Artifact, string) {
const (
version = "2.8.0"
checksum = "pnwZ_JSif-OfoWIwk2JYXWHagOWMA3Sh-Ea0p-4Rz9U9mDEeAebhyvnfD7OYOMCk"
version = "2.8.1"
checksum = "iMEtbOJhQfGof2GxSlxffQSI1va_NDDQ9VIuqcPbNZ0291Dr8wttD5QecYyjIQap"
)
return t.NewPackage("libexpat", version, newFromGitHubRelease(
"libexpat/libexpat",
@@ -22,7 +22,7 @@ func (t Toolchain) newLibexpat() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Libexpat] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibexpat,
Name: "libexpat",
@@ -30,5 +30,5 @@ func init() {
Website: "https://libexpat.github.io/",
ID: 770,
}
})
}


@@ -18,7 +18,7 @@ func (t Toolchain) newLibffi() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Libffi] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibffi,
Name: "libffi",
@@ -26,5 +26,5 @@ func init() {
Website: "https://sourceware.org/libffi/",
ID: 1611,
}
})
}


@@ -24,7 +24,7 @@ mkdir /dev/shm/gd
), version
}
func init() {
artifactsM[Libgd] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibgd,
Name: "libgd",
@@ -36,5 +36,5 @@ func init() {
},
ID: 880,
}
})
}


@@ -22,7 +22,7 @@ func (t Toolchain) newLibpng() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Libpng] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibpng,
Name: "libpng",
@@ -34,5 +34,5 @@ func init() {
},
ID: 1705,
}
})
}


@@ -25,7 +25,7 @@ test_disable 'int main(){return 0;}' tests/test-is-public-builtin.c
), version
}
func init() {
artifactsM[Libpsl] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibpsl,
Name: "libpsl",
@@ -33,5 +33,5 @@ func init() {
Website: "https://rockdaboot.github.io/libpsl/",
ID: 7305,
}
})
}


@@ -43,7 +43,7 @@ index adccef3..65a277a 100644
), version
}
func init() {
artifactsM[Libseccomp] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibseccomp,
Name: "libseccomp",
@@ -51,5 +51,5 @@ func init() {
Website: "https://github.com/seccomp/libseccomp/",
ID: 13823,
}
})
}

internal/rosa/libtirpc.go (new file)

@@ -0,0 +1,44 @@
package rosa
import (
"strings"
"hakurei.app/internal/pkg"
)
func (t Toolchain) newLibtirpc() (pkg.Artifact, string) {
const (
version = "1.3.7"
checksum = "nzFfu7LNvnSNiNAryD1vtnNWnU-Xqee8KqfXUKoBf5yjb5-dkeRkYuRijdCoYLof"
)
return t.NewPackage("libtirpc", version, t.newTagRemote(
"git://linux-nfs.org/~steved/libtirpc",
"libtirpc-"+
strings.Join(strings.SplitN(version, ".", 3), "-"),
checksum,
), nil, &MakeHelper{
Generate: "sh -e ./bootstrap",
Configure: []KV{
{"CFLAGS", `"$(pkg-config --cflags libbsd-overlay) ${CFLAGS:-}"`},
{"disable-gssapi"},
},
},
Automake,
Libtool,
PkgConfig,
Libbsd,
KernelHeaders,
), version
}
func init() {
native.MustRegister(&Artifact{
f: Toolchain.newLibtirpc,
Name: "libtirpc",
Description: "a port of Sun's Transport-Independent RPC library to Linux",
Website: "https://sourceforge.net/projects/libtirpc/",
ID: 1740,
})
}


@@ -4,8 +4,8 @@ import "hakurei.app/internal/pkg"
func (t Toolchain) newLibucontext() (pkg.Artifact, string) {
const (
version = "1.5"
checksum = "Ggk7FMmDNBdCx1Z9PcNWWW6LSpjGYssn2vU0GK5BLXJYw7ZxZbA2m_eSgT9TFnIG"
version = "1.5.1"
checksum = "mUgeyJknjMxT-5fORzz-rqhZfP3Y7EZGBhOwvhuX7MsF4Pk9wkuwtrLf5IML-jWu"
)
return t.NewPackage("libucontext", version, newFromGitHub(
"kaniini/libucontext",
@@ -21,13 +21,13 @@ func (t Toolchain) newLibucontext() (pkg.Artifact, string) {
SkipConfigure: true,
InPlace: true,
Make: []string{
"ARCH=" + linuxArch(),
"ARCH=" + t.linuxArch(),
},
Install: "make prefix=/system DESTDIR=/work install",
}), version
}
func init() {
artifactsM[Libucontext] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibucontext,
Name: "libucontext",
@@ -35,5 +35,5 @@ func init() {
Website: "https://github.com/kaniini/libucontext/",
ID: 17085,
}
})
}


@@ -20,7 +20,7 @@ func (t Toolchain) newLibxml2() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Libxml2] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibxml2,
Name: "libxml2",
@@ -28,5 +28,5 @@ func init() {
Website: "https://gitlab.gnome.org/GNOME/libxml2/",
ID: 1783,
}
})
}


@@ -26,7 +26,7 @@ func (t Toolchain) newLibxslt() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Libxslt] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibxslt,
Name: "libxslt",
@@ -38,5 +38,5 @@ func init() {
},
ID: 13301,
}
})
}


@@ -2,7 +2,6 @@ package rosa
import (
"regexp"
"runtime"
"slices"
"strings"
@@ -31,9 +30,9 @@ func litArgs(verbose bool, skipChecks ...string) string {
}
func (t Toolchain) newEarlyCompilerRT() (pkg.Artifact, string) {
version := t.Version(llvmSource)
source, version := t.Load(llvmSource)
major, _, _ := strings.Cut(version, ".")
return t.NewPackage("early-compiler-rt", version, t.Load(llvmSource), &PackageAttr{
return t.NewPackage("early-compiler-rt", version, source, &PackageAttr{
Flag: TExclusive,
}, &CMakeHelper{
Append: []string{"compiler-rt"},
@@ -72,10 +71,10 @@ ln -s \
"/work/system/lib/clang/` + major + `/lib/"
ln -s \
"clang_rt.crtbegin-` + linuxArch() + `.o" \
"clang_rt.crtbegin-` + t.linuxArch() + `.o" \
"/work/system/lib/${ROSA_TRIPLE}/crtbeginS.o"
ln -s \
"clang_rt.crtend-` + linuxArch() + `.o" \
"clang_rt.crtend-` + t.linuxArch() + `.o" \
"/work/system/lib/${ROSA_TRIPLE}/crtendS.o"
`,
},
@@ -86,7 +85,7 @@ ln -s \
), version
}
func init() {
artifactsM[earlyCompilerRT] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newEarlyCompilerRT,
Name: "early-compiler-rt",
@@ -95,12 +94,12 @@ func init() {
Dependencies: P{
Musl,
},
}
})
}
func (t Toolchain) newEarlyRuntimes() (pkg.Artifact, string) {
version := t.Version(llvmSource)
return t.NewPackage("early-runtimes", version, t.Load(llvmSource), &PackageAttr{
source, version := t.Load(llvmSource)
return t.NewPackage("early-runtimes", version, source, &PackageAttr{
Flag: TExclusive,
}, &CMakeHelper{
Append: []string{"runtimes"},
@@ -146,7 +145,7 @@ func (t Toolchain) newEarlyRuntimes() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[earlyRuntimes] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newEarlyRuntimes,
Name: "early-runtimes",
@@ -155,12 +154,12 @@ func init() {
Dependencies: P{
earlyCompilerRT,
},
}
})
}
func (t Toolchain) newLLVM() (pkg.Artifact, string) {
var early PArtifact = muslHeaders
if t.isStage0() {
early := muslHeaders
if t.stage.isStage0() {
// The LLVM build system uses the system installation when building with
// LLVM_LINK_LLVM_DYLIB, since it builds runtimes after the fact, using
// the just-built toolchain. This is unacceptable in stage0 due to the
@@ -218,7 +217,7 @@ func (t Toolchain) newLLVM() (pkg.Artifact, string) {
}, ";") + "'"},
}
if !t.isStage0() {
if !t.stage.isStage0() {
skipChecks := []string{
// expensive, pointless to run here
"benchmarks",
@@ -248,7 +247,7 @@ func (t Toolchain) newLLVM() (pkg.Artifact, string) {
// unwind: fails on musl
"eh_frame_fde_pc_range",
}
switch runtime.GOARCH {
switch t.arch {
case "arm64":
skipChecks = append(skipChecks,
// LLVM: intermittently crashes
@@ -258,7 +257,7 @@ func (t Toolchain) newLLVM() (pkg.Artifact, string) {
)
}
if presetOpts&OptLLVMNoLTO == 0 {
if t.opts&OptLLVMNoLTO == 0 {
cache = append(cache, []KV{
// very expensive
{"LLVM_ENABLE_LTO", "Thin"},
@@ -274,8 +273,10 @@ func (t Toolchain) newLLVM() (pkg.Artifact, string) {
}...)
}
version := t.Version(llvmSource)
return t.NewPackage("llvm", version, t.Load(llvmSource), nil, &CMakeHelper{
source, version := t.Load(llvmSource)
return t.NewPackage("llvm", version, source, &PackageAttr{
Flag: TExclusive,
}, &CMakeHelper{
Append: []string{"llvm"},
Cache: cache,
@@ -290,7 +291,7 @@ ln -s clang++ /work/system/bin/c++
// installation into test environment, and the tests end up testing the
// system installation instead. Tests are disabled on stage0 and relies
// on 3-stage determinism to test later stages.
SkipTest: t.isStage0(),
SkipTest: t.stage.isStage0(),
Test: `
chmod +w /bin && ln -s \
@@ -319,11 +320,11 @@ ninja ` + jobsFlagE + ` check-all
}
func init() {
const (
version = "22.1.4"
checksum = "Bk3t-tV5sD5T0bqefFMcLeFuAwXnhFipywZmqst5hAZs97QQWGKB_5XyAFjj5tDB"
version = "22.1.5"
checksum = "32gOaLPHcUlo3hkdk5RbFumTE01XKeCAYZcpvn8IDHF95egXVfDFSl6eZL3ChMen"
)
artifactsM[llvmSource] = Metadata{
native.MustRegister(&Artifact{
f: func(t Toolchain) (pkg.Artifact, string) {
return t.NewPatchedSource("llvm", version, newFromGitHub(
"llvm/llvm-project",
@@ -336,9 +337,9 @@ func init() {
Description: "LLVM monorepo with Rosa OS patches",
ID: 1830,
}
})
artifactsM[LLVM] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLLVM,
Name: "llvm",
@@ -350,5 +351,5 @@ func init() {
Zstd,
Musl,
},
}
})
}


@@ -0,0 +1,59 @@
package rosa
import "hakurei.app/internal/pkg"
func (t Toolchain) newLMSensors() (pkg.Artifact, string) {
const (
version = "3-6-2"
checksum = "7JYNutrihe-FP6r3ftf96uFZJJWPfxnBHL0ALSMA-vovaXVRr-sAjlLitw7WWpCI"
)
return t.NewPackage("lm_sensors", version, newFromGitHub(
"lm-sensors/lm-sensors",
"V"+version,
checksum,
), &PackageAttr{
Writable: true,
Chmod: true,
EnterSource: true,
ScriptEarly: `
ln -s \
../../system/bin/perl \
/usr/bin/
`,
}, &MakeHelper{
InPlace: true,
SkipConfigure: true,
Make: []string{
"CC=cc",
"ETCDIR=/system/etc",
"PREFIX=/system",
},
Check: []string{
"CC=cc",
"check",
},
Install: "make DESTDIR=/work PREFIX=/system install",
},
Perl,
PerlTestCmd,
M4,
Bison,
Flex,
), version
}
func init() {
native.MustRegister(&Artifact{
f: Toolchain.newLMSensors,
Name: "lm_sensors",
Description: "user-space support for hardware monitoring drivers",
Website: "https://hwmon.wiki.kernel.org/lm_sensors",
ID: 1831,
})
}


@@ -27,7 +27,7 @@ cd "$(mktemp -d)"
))), version
}
func init() {
artifactsM[Make] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newMake,
Name: "make",
@@ -35,7 +35,7 @@ func init() {
Website: "https://www.gnu.org/software/make/",
ID: 1877,
}
})
}
// MakeHelper is the [Make] build system helper.
@@ -123,7 +123,7 @@ func (attr *MakeHelper) wantsDir() string {
}
// script generates the cure script.
func (attr *MakeHelper) script(name string) string {
func (attr *MakeHelper) script(s *S, name string) string {
if attr == nil {
attr = new(MakeHelper)
}
@@ -194,7 +194,7 @@ make \
}
scriptMake += "\n"
if !attr.SkipCheck && presetOpts&OptSkipCheck == 0 {
if !attr.SkipCheck && s.opts&OptSkipCheck == 0 {
scriptMake += attr.ScriptCheckEarly + `make \
` + jobsFlagE + ` \
`


@@ -1,6 +1,10 @@
package rosa
import "hakurei.app/internal/pkg"
import (
"strings"
"hakurei.app/internal/pkg"
)
func (t Toolchain) newLibglvnd() (pkg.Artifact, string) {
const (
@@ -12,20 +16,38 @@ func (t Toolchain) newLibglvnd() (pkg.Artifact, string) {
"glvnd/libglvnd",
"v"+version,
checksum,
), nil, (*MesonHelper)(nil),
), nil, &MesonHelper{
Setup: []KV{
{"Dx11", "enabled"},
{"Dglx", "enabled"},
},
ScriptCompiled: `
export DISPLAY=':0'
Xvfb &
XVFB_PID="$!"
trap 'kill $XVFB_PID && wait $XVFB_PID' EXIT
`,
},
Binutils, // symbols check fail with llvm nm
Xserver, // test suite wants X server
LibXext,
), version
}
func init() {
artifactsM[Libglvnd] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibglvnd,
Name: "libglvnd",
Description: "The GL Vendor-Neutral Dispatch library",
Website: "https://gitlab.freedesktop.org/glvnd/libglvnd",
Dependencies: P{
LibXext,
},
ID: 12098,
}
})
}
func (t Toolchain) newLibdrm() (pkg.Artifact, string) {
@@ -50,7 +72,7 @@ func (t Toolchain) newLibdrm() (pkg.Artifact, string) {
), version
}
func init() {
artifactsM[Libdrm] = Metadata{
native.MustRegister(&Artifact{
f: Toolchain.newLibdrm,
Name: "libdrm",
@@ -62,5 +84,182 @@ func init() {
},
ID: 1596,
}
})
}
func (t Toolchain) newLibva() (pkg.Artifact, string) {
const (
version = "2.23.0"
checksum = "UmF5tPyWIG_w5kiR3KFpoYbF7UUcaak5tyc-RhOheNTwQlLkPlifreFYCM9FQxbq"
)
return t.NewPackage("libva", version, newFromGitHub(
"intel/libva",
version,
checksum,
), nil, &MesonHelper{
Setup: []KV{
{"Dwith_x11", "yes"},
{"Dwith_glx", "yes"},
{"Dwith_wayland", "yes"},
},
},
Libdrm,
LibXfixes,
Libglvnd,
Wayland,
KernelHeaders,
), version
}
func init() {
native.MustRegister(&Artifact{
f: Toolchain.newLibva,
Name: "libva",
Description: "an implementation for VA-API (Video Acceleration API)",
Website: "https://01.org/vaapi",
Dependencies: P{
Libdrm,
LibXfixes,
Libglvnd,
Wayland,
},
ID: 1752,
})
}
func (t Toolchain) newMesa() (pkg.Artifact, string) {
const (
version = "26.1.0"
checksum = "zU0fjqevySBaoi_5SLW3e2UffmGeBdxOuHmAHTH68n2hV-sjYoqg30koLqFXuk5y"
)
return t.NewPackage("mesa", version, newFromGitLab(
"gitlab.freedesktop.org",
"mesa/mesa",
"mesa-"+version,
checksum,
), nil, &MesonHelper{
Setup: []KV{
{"Dplatforms", "x11,wayland"},
{"Dvideo-codecs", "all"},
{"Dglvnd", "enabled"},
{"Dgbm", "enabled"},
{"Dgallium-drivers", strings.Join([]string{
"asahi", // Apple AGX
"crocus", // Intel legacy
"etnaviv", // Vivante GPU designs (mostly NXP/Marvell SoCs)
"freedreno", // Qualcomm Adreno (all Qualcomm SoCs)
"i915", // Intel extra legacy
"iris", // new Intel (Broadwell+)
"lima", // ARM Mali 4xx
"llvmpipe", // software renderer
"nouveau", // Nvidia
"panfrost", // ARM Mali Midgard and up (T/G series)
"r300", // very old AMD
"r600", // less old AMD
"radeonsi", // new AMD (GCN+)
"softpipe", // older software renderer
"svga", // VMWare virtualized GPU
"tegra", // Nvidia Tegra SoCs
"v3d", // Broadcom VC5 (Raspberry Pi 4)
"vc4", // Broadcom VC4 (Raspberry Pi 0-3)
"virgl", // QEMU virtualized GPU (aka VirGL)
"zink", // generic OpenGL over Vulkan, experimental
// d3d12: WSL emulated GPU (aka Dozen)
// ethosu: accelerator
// rocket: accelerator
}, ",")},
{"Dvulkan-drivers", strings.Join([]string{
"amd", // AMD (aka RADV)
"broadcom", // Broadcom VC5 (Raspberry Pi 4, aka V3D)
"freedreno", // Qualcomm Adreno (all Qualcomm SoCs)
"intel", // new Intel (aka ANV)
"intel_hasvk", // Intel Haswell/Broadwell, "legacy" Vulkan driver (https://www.phoronix.com/news/Intel-HasVK-Drop-Dead-Code)
"panfrost", // ARM Mali Midgard and up (T/G series)
"swrast", // software renderer (aka Lavapipe)
"virtio", // QEMU virtualized GPU (aka VirGL)
"imagination", // PowerVR Rogue
"asahi", // Apple AGX
"gfxstream", // Android virtualized GPU
// nouveau: Nouveau (aka NVK), requires rust
// microsoft-experimental: WSL virtualized GPU (aka DZN/Dozen)
// kosmickrisp: macOS-specific
}, ",")},
{"Dvulkan-layers", strings.Join([]string{
"device-select",
"intel-nullhw",
"overlay",
"screenshot",
"anti-lag",
"vram-report-limit",
}, ",")},
{"Dfreedreno-kmds", "msm,virtio"},
{"Damdgpu-virtio", "true"},
},
},
M4,
PythonPackaging,
PythonMako,
PythonPyYAML,
PythonPycparser,
Glslang,
SPIRVLLVMTranslator,
Zlib,
Zstd,
Gzip,
Ncurses,
Libglvnd,
Libexpat,
Libva,
Libdrm,
Elfutils,
Bison,
Flex,
LMSensors,
Libconfig,
LibdisplayInfo,
Wayland,
WaylandProtocols,
Libxshmfence,
LibXxf86vm,
LibXrandr,
LibxcbUtilKeysyms,
Libpng,
Libarchive,
KernelHeaders,
), version
}
func init() {
native.MustRegister(&Artifact{
f: Toolchain.newMesa,
Name: "mesa",
Description: "open source implementations of OpenGL, OpenGL ES, Vulkan, OpenCL, and more",
Website: "https://mesa3d.org",
Dependencies: P{
Libdrm,
Elfutils,
LMSensors,
LibdisplayInfo,
Wayland,
Libxshmfence,
LibXxf86vm,
LibXrandr,
LibxcbUtilKeysyms,
Libpng,
},
ID: 1970,
latest: (*Versions).getStable,
})
}


@@ -52,7 +52,7 @@ python3 ./run_project_tests.py \
), version
}
func init() {
-artifactsM[Meson] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newMeson,
Name: "meson",
@@ -67,7 +67,7 @@ func init() {
},
ID: 6472,
-}
+})
}
// MesonHelper is the [Meson] build system helper.
@@ -106,7 +106,7 @@ func (*MesonHelper) createDir() bool { return false }
func (*MesonHelper) wantsDir() string { return `"$(mktemp -d)"` }
// script generates the cure script.
-func (attr *MesonHelper) script(name string) string {
+func (attr *MesonHelper) script(s *S, name string) string {
if attr == nil {
attr = new(MesonHelper)
}
@@ -117,7 +117,7 @@ func (attr *MesonHelper) script(name string) string {
}
var scriptTest string
-if !attr.SkipTest && presetOpts&OptSkipCheck == 0 {
+if !attr.SkipTest && s.opts&OptSkipCheck == 0 {
scriptTest = `
meson test \
--print-errorlogs`


@@ -7,7 +7,11 @@ func (t Toolchain) newMksh() (pkg.Artifact, string) {
version = "59c"
checksum = "0Zj-k4nXEu3IuJY4lvwD2OrC2t27GdZj8SPy4DoaeuBRH1padWb7oREpYgwY8JNq"
)
-return t.New("mksh-"+version, 0, t.AppendPresets(nil,
+scriptTest := "./test.sh -C regress:no-ctty\n"
+if t.opts&OptSkipCheck != 0 {
+scriptTest = ""
+}
+return t.New("mksh-"+version, 0, t.Append(nil,
Perl,
Coreutils,
), nil, []string{
@@ -18,8 +22,7 @@ cd "$(mktemp -d)"
sh /usr/src/mksh/Build.sh -r
CPPFLAGS="${CPPFLAGS} -DMKSH_BINSHPOSIX -DMKSH_BINSHREDUCED" \
sh /usr/src/mksh/Build.sh -r -L
-./test.sh -C regress:no-ctty
+`+scriptTest+`
mkdir -p /work/system/bin/
cp -v mksh /work/system/bin/
cp -v lksh /work/system/bin/sh
@@ -33,7 +36,7 @@ ln -vs ../system/bin/sh /work/bin/
))), version
}
func init() {
-artifactsM[Mksh] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newMksh,
Name: "mksh",
@@ -41,5 +44,5 @@ func init() {
Website: "https://www.mirbsd.org/mksh",
ID: 5590,
-}
+})
}


@@ -24,7 +24,7 @@ func (t Toolchain) newMuslFts() (pkg.Artifact, string) {
), version
}
func init() {
-artifactsM[MuslFts] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newMuslFts,
Name: "musl-fts",
@@ -32,5 +32,5 @@ func init() {
Website: "https://github.com/void-linux/musl-fts",
ID: 26980,
-}
+})
}


@@ -24,7 +24,7 @@ func (t Toolchain) newMuslObstack() (pkg.Artifact, string) {
), version
}
func init() {
-artifactsM[MuslObstack] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newMuslObstack,
Name: "musl-obstack",
@@ -32,5 +32,5 @@ func init() {
Website: "https://github.com/void-linux/musl-obstack",
ID: 146206,
-}
+})
}


@@ -14,7 +14,7 @@ func (t Toolchain) newMusl(headers bool) (pkg.Artifact, string) {
SkipCheck: true,
Script: `
mkdir -p /work/system/bin
-COMPAT_LINKER_NAME="ld-musl-` + linuxArch() + `.so.1"
+COMPAT_LINKER_NAME="ld-musl-` + t.linuxArch() + `.so.1"
ln -vs ../lib/libc.so /work/system/bin/linker
ln -vs ../lib/libc.so /work/system/bin/ldd
ln -vs libc.so "/work/system/lib/${COMPAT_LINKER_NAME}"
@@ -34,9 +34,9 @@ rmdir -v /work/lib
}
env := []string{
-"LDFLAGS=" + earlyLDFLAGS(false),
+"LDFLAGS=" + t.earlyLDFLAGS(false),
}
-if t.isStage0() {
+if t.stage.isStage0() {
env = append(env,
"CC=clang",
"AR=ar",
@@ -75,7 +75,7 @@ index 715948f4..c2fece68 100644
), version
}
func init() {
-artifactsM[Musl] = Metadata{
+native.MustRegister(&Artifact{
f: func(t Toolchain) (pkg.Artifact, string) {
return t.newMusl(false)
},
@@ -85,14 +85,14 @@ func init() {
Website: "https://musl.libc.org/",
ID: 11688,
-}
+})
-artifactsM[muslHeaders] = Metadata{
+native.MustRegister(&Artifact{
f: func(t Toolchain) (pkg.Artifact, string) {
return t.newMusl(true)
},
Name: "musl-headers",
Description: "system installation of musl headers",
-}
+})
}


@@ -26,7 +26,7 @@ func (t Toolchain) newNcurses() (pkg.Artifact, string) {
), version
}
func init() {
-artifactsM[Ncurses] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newNcurses,
Name: "ncurses",
@@ -34,5 +34,5 @@ func init() {
Website: "https://invisible-island.net/ncurses/",
ID: 373226,
-}
+})
}


@@ -39,7 +39,7 @@ index d223ac2..a7878d0 100644
), version
}
func init() {
-artifactsM[Libmnl] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newLibmnl,
Name: "libmnl",
@@ -47,7 +47,7 @@ func init() {
Website: "https://www.netfilter.org/projects/libmnl/",
ID: 1663,
-}
+})
}
func (t Toolchain) newLibnftnl() (pkg.Artifact, string) {
@@ -77,7 +77,7 @@ func (t Toolchain) newLibnftnl() (pkg.Artifact, string) {
), version
}
func init() {
-artifactsM[Libnftnl] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newLibnftnl,
Name: "libnftnl",
@@ -89,7 +89,7 @@ func init() {
},
ID: 1681,
-}
+})
}
func (t Toolchain) newIPTables() (pkg.Artifact, string) {
@@ -131,7 +131,7 @@ chmod +w /etc/ && ln -s ../usr/src/iptables/etc/ethertypes /etc/
), version
}
func init() {
-artifactsM[IPTables] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newIPTables,
Name: "iptables",
@@ -143,5 +143,5 @@ func init() {
},
ID: 1394,
-}
+})
}


@@ -19,7 +19,7 @@ func (t Toolchain) newNettle() (pkg.Artifact, string) {
), version
}
func init() {
-artifactsM[Nettle] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newNettle,
Name: "nettle",
@@ -31,5 +31,5 @@ func init() {
},
ID: 2073,
-}
+})
}


@@ -19,7 +19,7 @@ func (t Toolchain) newNettle3() (pkg.Artifact, string) {
), version
}
func init() {
-artifactsM[nettle3] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newNettle3,
Name: "nettle3",
@@ -29,5 +29,5 @@ func init() {
Dependencies: P{
GMP,
},
-}
+})
}


@@ -7,9 +7,11 @@ func (t Toolchain) newNinja() (pkg.Artifact, string) {
version = "1.13.2"
checksum = "ygKWMa0YV2lWKiFro5hnL-vcKbc_-RACZuPu0Io8qDvgQlZ0dxv7hPNSFkt4214v"
)
+python, _ := t.Load(Python)
+bash, _ := t.Load(Bash)
return t.New("ninja-"+version, 0, []pkg.Artifact{
-t.Load(Python),
-t.Load(Bash),
+python,
+bash,
}, nil, nil, `
cd "$(mktemp -d)"
python3 /usr/src/ninja/configure.py \
@@ -37,7 +39,7 @@ cp ninja /work/system/bin/
))), version
}
func init() {
-artifactsM[Ninja] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newNinja,
Name: "ninja",
@@ -45,5 +47,5 @@ func init() {
Website: "https://ninja-build.org/",
ID: 2089,
-}
+})
}


@@ -67,7 +67,7 @@ cp -r \
), version
}
func init() {
-artifactsM[NSS] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newNSS,
Name: "nss",
@@ -79,7 +79,7 @@ func init() {
},
ID: 2503,
-}
+})
}
func init() {
@@ -87,7 +87,7 @@ func init() {
version = "0.5.1"
checksum = "oxjnuIrPVMPvD6x8VFLqB7EdbfuhouGQdtPuHDpEHGzoyH5nkxqtYN9UthMY9noA"
)
-artifactsM[buildcatrust] = newPythonPackage(
+native.newPythonPackage(
"buildcatrust", 233988,
"transform certificate stores between formats",
"https://github.com/nix-community/buildcatrust",
@@ -103,7 +103,7 @@ rm buildcatrust/tests/test_nonhermetic.py
}
func (t Toolchain) newNSSCACert() (pkg.Artifact, string) {
-return t.New("nss-cacert", 0, t.AppendPresets(nil,
+return t.New("nss-cacert", 0, t.Append(nil,
Bash,
NSS,
@@ -120,11 +120,11 @@ buildcatrust \
`), Unversioned
}
func init() {
-artifactsM[NSSCACert] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newNSSCACert,
Name: "nss-cacert",
Description: "bundle of X.509 certificates of public Certificate Authorities",
Website: "https://curl.se/docs/caextract.html",
-}
+})
}


@@ -40,7 +40,7 @@ func (t Toolchain) newOpenSSL() (pkg.Artifact, string) {
), version
}
func init() {
-artifactsM[OpenSSL] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newOpenSSL,
Name: "openssl",
@@ -51,5 +51,5 @@ func init() {
// strange malformed tags treated as pre-releases in Anitya
latest: (*Versions).getStable,
-}
+})
}


@@ -23,7 +23,7 @@ func (t Toolchain) newP11Kit() (pkg.Artifact, string) {
), version
}
func init() {
-artifactsM[P11Kit] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newP11Kit,
Name: "p11-kit",
@@ -36,5 +36,5 @@ func init() {
},
ID: 2582,
-}
+})
}


@@ -32,7 +32,7 @@ ln -s ../system/bin/toybox /bin/echo
), version
}
func init() {
-artifactsM[PCRE2] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newPCRE2,
Name: "pcre2",
@@ -40,5 +40,5 @@ func init() {
Website: "https://pcre2project.github.io/pcre2/",
ID: 5832,
-}
+})
}


@@ -50,7 +50,7 @@ chmod +w /system/bin && rm -f /system/bin/ps # perl does not like toybox ps
}), version
}
func init() {
-artifactsM[Perl] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newPerl,
Name: "perl",
@@ -61,7 +61,7 @@ func init() {
// odd-even versioning
latest: (*Versions).getStable,
-}
+})
}
// newViaPerlModuleBuild installs a perl module via Build.PL.
@@ -69,12 +69,12 @@ func (t Toolchain) newViaPerlModuleBuild(
name, version string,
source pkg.Artifact,
patches []KV,
-extra ...PArtifact,
+extra ...ArtifactH,
) pkg.Artifact {
if name == "" || version == "" {
panic("names must be non-empty")
}
-return t.New("perl-"+name, 0, t.AppendPresets(nil,
+return t.New("perl-"+name, 0, t.Append(nil,
slices.Concat(P{Perl}, extra)...,
), nil, nil, `
cd /usr/src/`+name+`
@@ -100,7 +100,7 @@ func (t Toolchain) newPerlModuleBuild() (pkg.Artifact, string) {
), nil), version
}
func init() {
-artifactsM[PerlModuleBuild] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newPerlModuleBuild,
Name: "perl-Module::Build",
@@ -110,7 +110,7 @@ func init() {
Dependencies: P{
Perl,
},
-}
+})
}
// newViaPerlMakeMaker installs a perl module via Makefile.PL.
@@ -118,7 +118,7 @@ func (t Toolchain) newViaPerlMakeMaker(
name, version string,
source pkg.Artifact,
patches []KV,
-extra ...PArtifact,
+extra ...ArtifactH,
) pkg.Artifact {
return t.NewPackage("perl-"+name, version, source, &PackageAttr{
// uses source tree as scratch space
@@ -154,7 +154,7 @@ func (t Toolchain) newPerlLocaleGettext() (pkg.Artifact, string) {
), nil), version
}
func init() {
-artifactsM[PerlLocaleGettext] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newPerlLocaleGettext,
Name: "perl-Locale::gettext",
@@ -162,7 +162,7 @@ func init() {
Website: "https://metacpan.org/release/Locale-gettext",
ID: 7523,
-}
+})
}
func (t Toolchain) newPerlPodParser() (pkg.Artifact, string) {
@@ -178,7 +178,7 @@ func (t Toolchain) newPerlPodParser() (pkg.Artifact, string) {
), nil), version
}
func init() {
-artifactsM[PerlPodParser] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newPerlPodParser,
Name: "perl-Pod::Parser",
@@ -186,7 +186,7 @@ func init() {
Website: "https://metacpan.org/release/Pod-Parser",
ID: 3244,
-}
+})
}
func (t Toolchain) newPerlSGMLS() (pkg.Artifact, string) {
@@ -202,7 +202,7 @@ func (t Toolchain) newPerlSGMLS() (pkg.Artifact, string) {
), nil), version
}
func init() {
-artifactsM[PerlSGMLS] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newPerlSGMLS,
Name: "perl-SGMLS",
@@ -224,7 +224,7 @@ func init() {
}
return v.Latest
},
-}
+})
}
func (t Toolchain) newPerlTermReadKey() (pkg.Artifact, string) {
@@ -240,7 +240,7 @@ func (t Toolchain) newPerlTermReadKey() (pkg.Artifact, string) {
), nil), version
}
func init() {
-artifactsM[PerlTermReadKey] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newPerlTermReadKey,
Name: "perl-Term::ReadKey",
@@ -248,7 +248,7 @@ func init() {
Website: "https://metacpan.org/release/TermReadKey",
ID: 3372,
-}
+})
}
func (t Toolchain) newPerlTextCharWidth() (pkg.Artifact, string) {
@@ -264,7 +264,7 @@ func (t Toolchain) newPerlTextCharWidth() (pkg.Artifact, string) {
), nil), version
}
func init() {
-artifactsM[PerlTextCharWidth] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newPerlTextCharWidth,
Name: "perl-Text::CharWidth",
@@ -272,7 +272,7 @@ func init() {
Website: "https://metacpan.org/release/Text-CharWidth",
ID: 14380,
-}
+})
}
func (t Toolchain) newPerlTextWrapI18N() (pkg.Artifact, string) {
@@ -290,7 +290,7 @@ func (t Toolchain) newPerlTextWrapI18N() (pkg.Artifact, string) {
), version
}
func init() {
-artifactsM[PerlTextWrapI18N] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newPerlTextWrapI18N,
Name: "perl-Text::WrapI18N",
@@ -302,7 +302,7 @@ func init() {
},
ID: 14385,
-}
+})
}
func (t Toolchain) newPerlMIMECharset() (pkg.Artifact, string) {
@@ -318,7 +318,7 @@ func (t Toolchain) newPerlMIMECharset() (pkg.Artifact, string) {
), nil), version
}
func init() {
-artifactsM[PerlMIMECharset] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newPerlMIMECharset,
Name: "perl-MIME::Charset",
@@ -326,7 +326,7 @@ func init() {
Website: "https://metacpan.org/release/MIME-Charset",
ID: 3070,
-}
+})
}
func (t Toolchain) newPerlUnicodeLineBreak() (pkg.Artifact, string) {
@@ -344,7 +344,7 @@ func (t Toolchain) newPerlUnicodeLineBreak() (pkg.Artifact, string) {
), version
}
func init() {
-artifactsM[PerlUnicodeLineBreak] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newPerlUnicodeLineBreak,
Name: "perl-Unicode::LineBreak",
@@ -356,7 +356,7 @@ func init() {
},
ID: 6033,
-}
+})
}
func (t Toolchain) newPerlYAMLTiny() (pkg.Artifact, string) {
@@ -372,7 +372,7 @@ func (t Toolchain) newPerlYAMLTiny() (pkg.Artifact, string) {
), nil), version
}
func init() {
-artifactsM[PerlYAMLTiny] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newPerlYAMLTiny,
Name: "perl-YAML::Tiny",
@@ -380,5 +380,29 @@ func init() {
Website: "https://metacpan.org/release/YAML-Tiny",
ID: 3549,
-}
+})
}
func (t Toolchain) newPerlTestCmd() (pkg.Artifact, string) {
const (
version = "1.09"
checksum = "gpGUwyC9IozDiYSgW_kXARNfXsTPFa6cTowJmmCBbPqcs2-pONZca_SB06FGy-7H"
)
return t.newViaPerlMakeMaker("Test::Cmd", version, newFromCPAN(
"NEILB",
"Test-Cmd",
version,
checksum,
), nil), version
}
func init() {
native.MustRegister(&Artifact{
f: Toolchain.newPerlTestCmd,
Name: "perl-Test::Cmd",
Description: "portable testing of commands and scripts",
Website: "https://metacpan.org/release/Test-Cmd",
ID: 6014,
})
}

internal/rosa/pixman.go Normal file

@@ -0,0 +1,31 @@
package rosa
import "hakurei.app/internal/pkg"
func (t Toolchain) newPixman() (pkg.Artifact, string) {
const (
version = "0.46.4"
checksum = "iECDxLG9SxUrvGHqeDoaBa-b3uqdT5DC4zudjtrwb8Wodq82pyacmFNEAo4SDsiE"
)
return t.NewPackage("pixman", version, newFromGitLab(
"gitlab.freedesktop.org",
"pixman/pixman",
"pixman-"+version,
checksum,
), nil, &MesonHelper{
Setup: []KV{
{"Dtests", "enabled"},
},
}), version
}
func init() {
native.MustRegister(&Artifact{
f: Toolchain.newPixman,
Name: "pixman",
Description: "a low-level software library for pixel manipulation",
Website: "https://pixman.org/",
ID: 3648,
})
}


@@ -24,7 +24,7 @@ func (t Toolchain) newPkgConfig() (pkg.Artifact, string) {
), version
}
func init() {
-artifactsM[PkgConfig] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newPkgConfig,
Name: "pkg-config",
@@ -32,5 +32,5 @@ func init() {
Website: "https://pkgconfig.freedesktop.org/",
ID: 3649,
-}
+})
}


@@ -27,7 +27,7 @@ func (t Toolchain) newProcps() (pkg.Artifact, string) {
), version
}
func init() {
-artifactsM[Procps] = Metadata{
+native.MustRegister(&Artifact{
f: Toolchain.newProcps,
Name: "procps",
@@ -35,5 +35,5 @@ func init() {
Website: "https://gitlab.com/procps-ng/procps",
ID: 3708,
-}
+})
}

Some files were not shown because too many files have changed in this diff.