81 Commits

Author SHA1 Message Date
1d8b1807aa internal/rosa/go: alternative bootstrap path
For targets where the bootstrap toolchain is not available.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-08 11:40:33 +09:00
2f3e323c46 internal/rosa/gnu: gcc toolchain artifact
This toolchain is hacked to pieces. It works well enough to bootstrap Go, though.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-08 01:00:15 +09:00
1fc9c3200f internal/rosa: libucontext artifact
Required by GCC on musl.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-07 22:33:12 +09:00
096a25ad3a cmd/mbf: dump IR of artifact presets
This exposes IR outside test cases, which is useful for verifying the correctness of alternative IR emitters.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-07 17:21:43 +09:00
ffd2f979fb internal/pkg: skip duplicate early
This significantly increases IR generation performance.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-07 17:11:41 +09:00
31a8cc9b5c internal/rosa/gnu: binutils artifact
Appears to be required by GCC; it complains about the tools installed by LLVM.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-07 14:02:23 +09:00
bb3f60fc74 internal/rosa/gnu: gmp, mpfr, mpc artifacts
Required by GCC.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-07 13:55:40 +09:00
697c91e04d internal/rosa/cmake: expose earlier build script
This allows for more flexible build setups.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-07 13:23:13 +09:00
3f7b8b4332 internal/rosa/git: git clone helper
For obtaining sources of projects that stubbornly refuse to provide release tarballs.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-06 21:10:59 +09:00
fa94155f42 internal/rosa/etc: resolv.conf
Required by programs that download from the internet inside a measured execArtifact.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-06 21:04:59 +09:00
233bd163fb internal/rosa/git: disable flaky test
This fails intermittently.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-06 20:45:52 +09:00
f9b69c94bc internal/rosa/ssl: prefix CA paths
This makes prefixes consistent with everything else since this will end up in the final Rosa OS image.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-06 20:41:58 +09:00
68aefa6d59 internal/rosa/openssl: fix paths
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-06 19:53:51 +09:00
159fd55dbb internal/rosa/ssl: fix dependencies
These used to be provided by busybox.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-05 19:12:48 +09:00
ce6b3ff53b internal/rosa: unzip artifact
Because the zip format is too awful and cannot be streamed anyway, supporting it natively comes with no benefit.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-05 19:10:32 +09:00
30afa0e2ab internal/rosa/git: compile with http support
This should be able to fetch repositories deterministically.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-05 18:51:02 +09:00
9b751de078 internal/rosa/gnu: fix test suite flags
This sets the correct flag and also avoids changing ident per system.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-05 18:29:47 +09:00
d77ad3bb6e internal/rosa: curl artifact
Required for http support in git.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-05 18:15:16 +09:00
0142fc90b0 internal/rosa/make: post-configure script
Required for some projects with broken build scripts.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-05 18:13:48 +09:00
3c9f7cfcd0 internal/rosa: libpsl artifact
Required by curl.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-05 18:06:33 +09:00
a3526b3ceb internal/rosa: openssl artifact
Optional for many programs.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-05 18:03:18 +09:00
6ad21e2288 internal/rosa: register custom artifacts
This also encodes extra information for iana-etc.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-05 17:50:48 +09:00
27e2e3f996 internal/rosa/llvm: drop git dependency
This was added quite early and has no effect. Remove it to avoid compiling git early.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-05 17:44:58 +09:00
e0c720681b internal/pkg: standardise artifact IR
This should hopefully provide good separation between the artifact curing backend implementation and the (still work in progress) language. Making the IR parseable also guarantees uniqueness of the representation.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-05 08:24:09 +09:00
f982b13a59 internal/pkg: improve error resolution
This was taking way too long for early failures.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-03 10:01:44 +09:00
443911ada1 internal/rosa: use stage3 mirror
These get taken down periodically, causing far too many rebuilds. Use a mirror until a more elegant solution is available.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-02 23:43:34 +09:00
d7a3706db3 internal/rosa/x: regenerate build system
These come with 16-year-old scripts that do not understand aarch64 or really anything else relevant to Rosa OS.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-02 19:57:39 +09:00
3226dc44dc internal/rosa/gnu: libtool artifact
Required when generating autotools build systems.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-02 19:52:08 +09:00
9f98d12ad8 internal/rosa/gnu: automake artifact
This is very expensive. Avoid.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-02 18:49:18 +09:00
550e83dda9 internal/rosa/gnu: grep artifact
Some GNU software does not like the grep in toybox.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-02 18:38:01 +09:00
7877b4e627 cmd/mbf: print extra stage3 information
This includes the ident of all three stages and the matching checksum if the check passes.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-02 18:33:16 +09:00
47ce6f5bd0 internal/rosa/llvm: conditionally add Rosa OS paths
This change also moves rpath flags to a more appropriate method.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-02-02 07:29:35 +09:00
48f4ccba33 internal/rosa/llvm: add rosa vendor
This cleans up checks specific to Rosa OS, and fixes stack overflow in llvm under certain conditions.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-31 22:47:13 +09:00
c31884bee4 internal/rosa: disable broken tests
These fail when running as users with supplementary groups, since they are unmapped in the container. This was not the case in the development container where all groups were dropped, so the failure was missed.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-31 14:12:40 +09:00
f8661ad479 internal/rosa/hakurei: backport test case fix
This patch will be removed in the next release.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-31 12:00:21 +09:00
536f0cbae6 internal/rosa/gnu: gettext 0.26 to 1.0
This now requires kernel headers for some reason.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-31 11:41:07 +09:00
8d872ff1cd internal/rosa: fetch from gnu mirror
GNU infrastructure is extraordinarily flaky and fetching from it killed the server too many times.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-31 11:26:48 +09:00
bf14a412e4 container: fix host-dependent test cases
These are not fully controlled by hakurei and may change depending on host configuration.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-31 10:59:56 +09:00
8b4576bc5f internal/rosa: migrate to make helper
This migrates artifacts that the helper cannot produce an identical instance of.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-31 08:55:33 +09:00
29ebc52e26 internal/rosa/hakurei: suffix variants
This makes log output more useful.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-31 05:03:15 +09:00
5f81aac0e2 internal/rosa: make helper
This change only migrates artifacts that remain unchanged under the helper, so it should not cause any rebuilds.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-31 05:01:22 +09:00
47490823be internal/rosa: improve cmake interface
This should make the call site look better for new artifacts.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-31 02:42:01 +09:00
1ac8ca7a80 internal/rosa: isolate make implementation
This will come with a helper eventually.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-31 02:33:14 +09:00
fd8b2fd522 internal/rosa: fix up dependencies
These are no longer provided by the (incomplete) toybox implementations, so they need to be specified explicitly.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-30 03:11:16 +09:00
20a8519044 internal/rosa/mksh: also build lksh
This is better suited for /system/bin/sh. Full mksh is still included, installed at /system/bin/mksh.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-30 00:54:30 +09:00
8c4fd00c50 internal/rosa/ninja: build in $TMPDIR
This used to build in /work/system/bin/ and unfortunately leaves its garbage there. This behaviour is from very early stages of this package, and was never fixed. This change updates it to use the "$(mktemp -d)" convention that every other artifact uses.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-30 00:36:43 +09:00
bc3dd6fbb0 internal/rosa: chmod via patch helper
This works around the zfs overlay mount overhead and significantly reduces I/O in general.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-30 00:29:44 +09:00
616ed29edf internal/rosa: early toybox variant
This is a variant of toybox with unfinished tools enabled, for artifacts that will end up in a dependency loop without them.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-30 00:24:14 +09:00
9d9b7294a4 internal/rosa: flags for toolchain-dependent artifact
This is much cleaner to add extra flags to.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-29 20:44:43 +09:00
6c1e2f10a7 internal/rosa: remove busybox artifact
This is no longer used and its implementation is unacceptably shaky.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-29 19:58:51 +09:00
abf96d2283 internal/rosa: replace busybox with toybox
The busybox artifact does not run on aarch64, and the workarounds required for it to compile successfully on x86_64 are unacceptably shaky. This change fully replaces it with toybox.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-29 19:51:18 +09:00
6c90e879da internal/rosa/llvm: enable asan
This is required by the test suite of the latest toybox.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-29 18:30:37 +09:00
d1b404dc3a internal/rosa: findutils artifact
Required by the llvm test suite; the compiler-rt sanitiser-related tests fail with toybox xargs.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-29 18:24:01 +09:00
744e4e0632 internal/rosa: sed artifact
Required by various GNU programs as they are not happy with toybox sed.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-29 18:17:18 +09:00
85eda49b2b internal/rosa: xz artifact
Wanted to avoid this as much as possible. Unfortunately, newer versions of GNU findutils are only distributed as xz tarballs, and findutils is required for the llvm compiler-rt sanitisers.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-29 18:06:17 +09:00
b26bc05bb0 internal/rosa: remove unused receiver
This returns the preset itself; it is up to the caller to load the underlying artifact.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-29 17:52:50 +09:00
2d63ea8fee internal/rosa: gzip artifact
Toybox does not implement this, and it is used by many programs, including toybox itself.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-29 17:46:30 +09:00
dd4326418c internal/rosa: toybox artifact
This compiles surprisingly quickly and requires no workarounds, unlike busybox.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-29 17:44:30 +09:00
79c0106ea0 internal/rosa: replace busybox dash with mksh
Toybox does not provide a shell, mksh fills that gap.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-29 02:00:32 +09:00
536db533de internal/rosa: install bash as sh
This works around software relying on bashisms even when explicitly invoking sh.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-29 01:25:44 +09:00
07927006a8 internal/pkg: set User-Agent header
Avoid living under the default user agent and being at the mercy of some IDS.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-29 01:17:35 +09:00
77ea27b038 internal/rosa: mksh artifact
This provides a shell, as part of the effort to replace busybox.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-29 00:51:32 +09:00
e76bc6a13a internal/rosa: resolve preset by name
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-28 20:57:51 +09:00
cc403c96d8 internal/rosa: remove busybox patch
This allows different versions of busybox to be attempted, to find one that works on arm.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-28 01:24:56 +09:00
66118ba941 internal/rosa: gawk artifact
Replaces broken awk in busybox.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-28 01:22:42 +09:00
823ba08dbc internal/rosa: use patch helper
This is significantly cleaner and runs somewhat faster.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-28 00:58:34 +09:00
660835151e internal/rosa: disable busybox SHA1_HWACCEL
This also pretties up the build output and sets the correct linker path.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-28 00:51:09 +09:00
53e6df7e81 internal/rosa: remove uname
This does not change ident based on target.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-27 23:37:29 +09:00
bd80327a8f internal/rosa: add arm64 strings
This enables building on arm64.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-27 21:12:39 +09:00
41f9aebbb7 internal/pkg: allow multiarch
The armv8l busybox binary release needs this to run correctly.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-27 21:07:18 +09:00
a2a0e36802 internal/rosa: cross-platform stage3
The stage3 binary seed is arch-specific.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-27 20:58:31 +09:00
fbe93fc771 internal/rosa/busybox: cross-platform binary
The initial binary seed is arch-specific.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-27 20:54:11 +09:00
968d8dbaf1 internal/pkg: encode checksum in ident
This also rearranges the ident IR to be more predictable, and avoids an inconsistent state that was obvious and somewhat easy to get into.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-27 20:18:34 +09:00
f1758a6fa8 internal/rosa: nss artifacts
Not used by anything for now, but will be part of Rosa OS.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-27 08:17:58 +09:00
88aaa4497c internal/rosa/hakurei: dist tarball
The patch will be removed in the next release.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-27 07:34:45 +09:00
b7ea68de35 internal/rosa/hakurei: isolate hakurei helper
For creating the dist tarball, which runs in the same environment.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-27 07:28:13 +09:00
67e453f5c4 dist: run tests
This used to be impossible due to nix jank, which has since been addressed.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-27 07:00:39 +09:00
67092c835a internal/rosa/hakurei: v0.3.3 to v0.3.4
This now contains the sharefs program which pulls in fuse.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-27 05:40:53 +09:00
18918d9a0d internal/rosa: fuse artifact
Required by hakurei for sharefs.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-27 05:34:42 +09:00
380ca4e022 internal/rosa: pytest artifact
Required by libfuse. This pulls in many dependencies.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-27 05:20:37 +09:00
887aef8514 internal/rosa: python pip helper
Fuse requires pytest, which depends on many packages. This helper eases the pain of packaging them.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-01-27 05:14:59 +09:00
60 changed files with 3504 additions and 1198 deletions

View File

@@ -113,6 +113,7 @@ func main() {
"stage3",
"Check for toolchain 3-stage non-determinism",
func(args []string) (err error) {
_, _, _, stage1 := (rosa.Std - 2).NewLLVM()
_, _, _, stage2 := (rosa.Std - 1).NewLLVM()
_, _, _, stage3 := rosa.Std.NewLLVM()
var (
@@ -120,6 +121,11 @@ func main() {
checksum [2]unique.Handle[pkg.Checksum]
)
if pathname, _, err = cache.Cure(stage1); err != nil {
return err
}
log.Println("stage1:", pathname)
if pathname, checksum[0], err = cache.Cure(stage2); err != nil {
return err
}
@@ -134,105 +140,60 @@ func main() {
Got: checksum[0].Value(),
Want: checksum[1].Value(),
}
} else {
log.Println(
"stage2 is identical to stage3",
"("+pkg.Encode(checksum[0].Value())+")",
)
}
return
},
)
c.NewCommand(
"cure",
"Cure the named artifact and show its path",
func(args []string) error {
if len(args) != 1 {
return errors.New("cure requires 1 argument")
}
var p rosa.PArtifact
switch args[0] {
case "acl":
p = rosa.ACL
case "attr":
p = rosa.Attr
case "autoconf":
p = rosa.Autoconf
case "bash":
p = rosa.Bash
case "busybox":
p = rosa.Busybox
case "cmake":
p = rosa.CMake
case "coreutils":
p = rosa.Coreutils
case "diffutils":
p = rosa.Diffutils
case "gettext":
p = rosa.Gettext
case "git":
p = rosa.Git
case "go":
p = rosa.Go
case "gperf":
p = rosa.Gperf
case "hakurei":
p = rosa.Hakurei
case "kernel-headers":
p = rosa.KernelHeaders
case "libXau":
p = rosa.LibXau
case "libexpat":
p = rosa.Libexpat
case "libseccomp":
p = rosa.Libseccomp
case "libxml2":
p = rosa.Libxml2
case "libffi":
p = rosa.Libffi
case "libgd":
p = rosa.Libgd
case "m4":
p = rosa.M4
case "make":
p = rosa.Make
case "meson":
p = rosa.Meson
case "ninja":
p = rosa.Ninja
case "patch":
p = rosa.Patch
case "perl":
p = rosa.Perl
case "pkg-config":
p = rosa.PkgConfig
case "python":
p = rosa.Python
case "rsync":
p = rosa.Rsync
case "setuptools":
p = rosa.Setuptools
case "wayland":
p = rosa.Wayland
case "wayland-protocols":
p = rosa.WaylandProtocols
case "xcb":
p = rosa.XCB
case "xcb-proto":
p = rosa.XCBProto
case "xproto":
p = rosa.Xproto
case "zlib":
p = rosa.Zlib
{
var (
flagDump string
)
c.NewCommand(
"cure",
"Cure the named artifact and show its path",
func(args []string) error {
if len(args) != 1 {
return errors.New("cure requires 1 argument")
}
if p, ok := rosa.ResolveName(args[0]); !ok {
return fmt.Errorf("unsupported artifact %q", args[0])
} else if flagDump == "" {
pathname, _, err := cache.Cure(rosa.Std.Load(p))
if err == nil {
log.Println(pathname)
}
return err
} else {
f, err := os.OpenFile(
flagDump,
os.O_WRONLY|os.O_CREATE|os.O_EXCL,
0644,
)
if err != nil {
return err
}
default:
return fmt.Errorf("unsupported artifact %q", args[0])
}
if err = cache.EncodeAll(f, rosa.Std.Load(p)); err != nil {
_ = f.Close()
return err
}
pathname, _, err := cache.Cure(rosa.Std.Load(p))
if err == nil {
log.Println(pathname)
}
return err
},
)
return f.Close()
}
},
).
Flag(
&flagDump,
"dump", command.StringFlag(""),
"Write IR to specified pathname and terminate",
)
}
c.MustParse(os.Args[1:], func(err error) {
if cache != nil {

View File

@@ -275,12 +275,12 @@ var containerTestCases = []struct {
),
earlyMnt(
ent("/", "/dev", "ro,nosuid,nodev,relatime", "tmpfs", "devtmpfs", ignore),
ent("/null", "/dev/null", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
ent("/zero", "/dev/zero", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
ent("/full", "/dev/full", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
ent("/random", "/dev/random", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
ent("/urandom", "/dev/urandom", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
ent("/tty", "/dev/tty", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
ent("/null", "/dev/null", ignore, "devtmpfs", "devtmpfs", ignore),
ent("/zero", "/dev/zero", ignore, "devtmpfs", "devtmpfs", ignore),
ent("/full", "/dev/full", ignore, "devtmpfs", "devtmpfs", ignore),
ent("/random", "/dev/random", ignore, "devtmpfs", "devtmpfs", ignore),
ent("/urandom", "/dev/urandom", ignore, "devtmpfs", "devtmpfs", ignore),
ent("/tty", "/dev/tty", ignore, "devtmpfs", "devtmpfs", ignore),
ent("/", "/dev/pts", "rw,nosuid,noexec,relatime", "devpts", "devpts", "rw,mode=620,ptmxmode=666"),
ent("/", "/dev/mqueue", "rw,nosuid,nodev,noexec,relatime", "mqueue", "mqueue", "rw"),
ent("/", "/dev/shm", "rw,nosuid,nodev,relatime", "tmpfs", "tmpfs", ignore),
@@ -293,12 +293,12 @@ var containerTestCases = []struct {
),
earlyMnt(
ent("/", "/dev", "ro,nosuid,nodev,relatime", "tmpfs", "devtmpfs", ignore),
ent("/null", "/dev/null", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
ent("/zero", "/dev/zero", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
ent("/full", "/dev/full", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
ent("/random", "/dev/random", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
ent("/urandom", "/dev/urandom", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
ent("/tty", "/dev/tty", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
ent("/null", "/dev/null", ignore, "devtmpfs", "devtmpfs", ignore),
ent("/zero", "/dev/zero", ignore, "devtmpfs", "devtmpfs", ignore),
ent("/full", "/dev/full", ignore, "devtmpfs", "devtmpfs", ignore),
ent("/random", "/dev/random", ignore, "devtmpfs", "devtmpfs", ignore),
ent("/urandom", "/dev/urandom", ignore, "devtmpfs", "devtmpfs", ignore),
ent("/tty", "/dev/tty", ignore, "devtmpfs", "devtmpfs", ignore),
ent("/", "/dev/pts", "rw,nosuid,noexec,relatime", "devpts", "devpts", "rw,mode=620,ptmxmode=666"),
ent("/", "/dev/shm", "rw,nosuid,nodev,relatime", "tmpfs", "tmpfs", ignore),
),
@@ -696,6 +696,9 @@ func init() {
mnt[i].VfsOptstr = strings.TrimSuffix(mnt[i].VfsOptstr, ",relatime")
mnt[i].VfsOptstr = strings.TrimSuffix(mnt[i].VfsOptstr, ",noatime")
cur.FsOptstr = strings.Replace(cur.FsOptstr, ",seclabel", "", 1)
mnt[i].FsOptstr = strings.Replace(mnt[i].FsOptstr, ",seclabel", "", 1)
if !cur.EqualWithIgnore(mnt[i], "\x00") {
fail = true
log.Printf("[FAIL] %s", cur)

dist/release.sh (vendored, 21 lines changed)
View File

@@ -2,19 +2,30 @@
cd "$(dirname -- "$0")/.."
VERSION="${HAKUREI_VERSION:-untagged}"
pname="hakurei-${VERSION}"
out="dist/${pname}"
out="${DESTDIR:-dist}/${pname}"
echo '# Preparing distribution files.'
mkdir -p "${out}"
cp -v "README.md" "dist/hsurc.default" "dist/install.sh" "${out}"
cp -rv "dist/comp" "${out}"
echo
echo '# Building hakurei.'
go generate ./...
go build -trimpath -v -o "${out}/bin/" -ldflags "-s -w -buildid= -extldflags '-static'
go build -trimpath -v -o "${out}/bin/" -ldflags "-s -w
-buildid= -extldflags '-static'
-X hakurei.app/internal/info.buildVersion=${VERSION}
-X hakurei.app/internal/info.hakureiPath=/usr/bin/hakurei
-X hakurei.app/internal/info.hsuPath=/usr/bin/hsu
-X main.hakureiPath=/usr/bin/hakurei" ./...
echo
rm -f "./${out}.tar.gz" && tar -C dist -czf "${out}.tar.gz" "${pname}"
rm -rf "./${out}"
(cd dist && sha512sum "${pname}.tar.gz" > "${pname}.tar.gz.sha512")
echo '# Testing hakurei.'
go test -ldflags='-buildid= -extldflags=-static' ./...
echo
echo '# Creating distribution.'
rm -f "${out}.tar.gz" && tar -C "${out}/.." -vczf "${out}.tar.gz" "${pname}"
rm -rf "${out}"
(cd "${out}/.." && sha512sum "${pname}.tar.gz" > "${pname}.tar.gz.sha512")
echo

View File

@@ -143,19 +143,27 @@
"bin/mount.fuse.sharefs" = "${hakurei}/libexec/sharefs";
};
dist = pkgs.runCommand "${hakurei.name}-dist" { buildInputs = hakurei.targetPkgs ++ [ pkgs.pkgsStatic.musl ]; } ''
# go requires XDG_CACHE_HOME for the build cache
export XDG_CACHE_HOME="$(mktemp -d)"
dist =
pkgs.runCommand "${hakurei.name}-dist"
{
buildInputs = hakurei.targetPkgs ++ [
pkgs.pkgsStatic.musl
];
}
''
cd $(mktemp -d) \
&& cp -r ${hakurei.src}/. . \
&& chmod +w cmd && cp -r ${hsu.src}/. cmd/hsu/ \
&& chmod -R +w .
# get a different workdir as go does not like /build
cd $(mktemp -d) \
&& cp -r ${hakurei.src}/. . \
&& chmod +w cmd && cp -r ${hsu.src}/. cmd/hsu/ \
&& chmod -R +w .
export HAKUREI_VERSION="v${hakurei.version}"
CC="clang -O3 -Werror" ./dist/release.sh && mkdir $out && cp -v "dist/hakurei-$HAKUREI_VERSION.tar.gz"* $out
'';
CC="musl-clang -O3 -Werror -Qunused-arguments" \
GOCACHE="$(mktemp -d)" \
HAKUREI_TEST_SKIP_ACL=1 \
PATH="${pkgs.pkgsStatic.musl.bin}/bin:$PATH" \
DESTDIR="$out" \
HAKUREI_VERSION="v${hakurei.version}" \
./dist/release.sh
'';
}
);

View File

@@ -24,7 +24,7 @@ var (
)
func TestUpdate(t *testing.T) {
if os.Getenv("GO_TEST_SKIP_ACL") == "1" {
if os.Getenv("HAKUREI_TEST_SKIP_ACL") == "1" {
t.Skip("acl test skipped")
}

View File

@@ -28,14 +28,6 @@ type FlatEntry struct {
| data []byte |
*/
// wordSize is the boundary which binary segments are always aligned to.
const wordSize = 8
// alignSize returns the padded size for aligning sz.
func alignSize(sz int) int {
return sz + (wordSize-(sz)%wordSize)%wordSize
}
// Encode encodes the entry for transmission or hashing.
func (ent *FlatEntry) Encode(w io.Writer) (n int, err error) {
pPathSize := alignSize(len(ent.Path))

View File

@@ -76,7 +76,7 @@ func TestFlatten(t *testing.T) {
"checksum/fLYGIMHgN1louE-JzITJZJo2SDniPu-IHBXubtvQWFO-hXnDVKNuscV7-zlyr5fU": {Mode: 0400, Data: []byte("\x7f\xe1\x69\xa2\xdd\x63\x96\x26\x83\x79\x61\x8b\xf0\x3f\xd5\x16\x9a\x39\x3a\xdb\xcf\xb1\xbc\x8d\x33\xff\x75\xee\x62\x56\xa9\xf0\x27\xac\x13\x94\x69")},
"identifier": {Mode: fs.ModeDir | 0700},
"identifier/NqVORkT6L9HX6Za7kT2zcibY10qFqBaxEjPiYFrBQX-ZFr3yxCzJxbKOP0zVjeWb": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/fLYGIMHgN1louE-JzITJZJo2SDniPu-IHBXubtvQWFO-hXnDVKNuscV7-zlyr5fU")},
"identifier/oM-2pUlk-mOxK1t3aMWZer69UdOQlAXiAgMrpZ1476VoOqpYVP1aGFS9_HYy-D8_": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/fLYGIMHgN1louE-JzITJZJo2SDniPu-IHBXubtvQWFO-hXnDVKNuscV7-zlyr5fU")},
"work": {Mode: fs.ModeDir | 0700},
}, []pkg.FlatEntry{
@@ -86,10 +86,10 @@ func TestFlatten(t *testing.T) {
{Mode: 0400, Path: "checksum/fLYGIMHgN1louE-JzITJZJo2SDniPu-IHBXubtvQWFO-hXnDVKNuscV7-zlyr5fU", Data: []byte("\x7f\xe1\x69\xa2\xdd\x63\x96\x26\x83\x79\x61\x8b\xf0\x3f\xd5\x16\x9a\x39\x3a\xdb\xcf\xb1\xbc\x8d\x33\xff\x75\xee\x62\x56\xa9\xf0\x27\xac\x13\x94\x69")},
{Mode: fs.ModeDir | 0700, Path: "identifier"},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/NqVORkT6L9HX6Za7kT2zcibY10qFqBaxEjPiYFrBQX-ZFr3yxCzJxbKOP0zVjeWb", Data: []byte("../checksum/fLYGIMHgN1louE-JzITJZJo2SDniPu-IHBXubtvQWFO-hXnDVKNuscV7-zlyr5fU")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/oM-2pUlk-mOxK1t3aMWZer69UdOQlAXiAgMrpZ1476VoOqpYVP1aGFS9_HYy-D8_", Data: []byte("../checksum/fLYGIMHgN1louE-JzITJZJo2SDniPu-IHBXubtvQWFO-hXnDVKNuscV7-zlyr5fU")},
{Mode: fs.ModeDir | 0700, Path: "work"},
}, pkg.MustDecode("bqtn69RkV5E7V7GhhgCFjcvbxmaqrO8DywamM4Tyjf10F6EJBHjXiIa_tFRtF4iN"), nil},
}, pkg.MustDecode("L_0RFHpr9JUS4Zp14rz2dESSRvfLzpvqsLhR1-YjQt8hYlmEdVl7vI3_-v8UNPKs"), nil},
{"sample directory step simple", fstest.MapFS{
".": {Mode: fs.ModeDir | 0500},
@@ -208,8 +208,8 @@ func TestFlatten(t *testing.T) {
"checksum/cTw0h3AmYe7XudSoyEMByduYXqGi-N5ZkTZ0t9K5elsu3i_jNIVF5T08KR1roBFM/work": {Mode: fs.ModeDir | 0500},
"identifier": {Mode: fs.ModeDir | 0700},
"identifier/-P_1iw6yVq_letMHncqcExSE0bYcDhYI5OdY6b1wKASf-Corufvj__XTBUq2Qd2a": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/cTw0h3AmYe7XudSoyEMByduYXqGi-N5ZkTZ0t9K5elsu3i_jNIVF5T08KR1roBFM")},
"identifier/0_rRxIqbX9LK9L_KDbuafotFz6HFkonNgO9gXhK1asM_Y1Pxn0amg756vRTo6m74": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/cTw0h3AmYe7XudSoyEMByduYXqGi-N5ZkTZ0t9K5elsu3i_jNIVF5T08KR1roBFM")},
"identifier/W5S65DEhawz_WKaok5NjUKLmnD9dNl5RPauNJjcOVcB3VM4eGhSaLGmXbL8vZpiw": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/cTw0h3AmYe7XudSoyEMByduYXqGi-N5ZkTZ0t9K5elsu3i_jNIVF5T08KR1roBFM")},
"identifier/rg7F1D5hwv6o4xctjD5zDq4i5MD0mArTsUIWfhUbik8xC6Bsyt3mjXXOm3goojTz": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/cTw0h3AmYe7XudSoyEMByduYXqGi-N5ZkTZ0t9K5elsu3i_jNIVF5T08KR1roBFM")},
"temp": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
@@ -230,12 +230,12 @@ func TestFlatten(t *testing.T) {
{Mode: fs.ModeDir | 0500, Path: "checksum/cTw0h3AmYe7XudSoyEMByduYXqGi-N5ZkTZ0t9K5elsu3i_jNIVF5T08KR1roBFM/work"},
{Mode: fs.ModeDir | 0700, Path: "identifier"},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/-P_1iw6yVq_letMHncqcExSE0bYcDhYI5OdY6b1wKASf-Corufvj__XTBUq2Qd2a", Data: []byte("../checksum/cTw0h3AmYe7XudSoyEMByduYXqGi-N5ZkTZ0t9K5elsu3i_jNIVF5T08KR1roBFM")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/0_rRxIqbX9LK9L_KDbuafotFz6HFkonNgO9gXhK1asM_Y1Pxn0amg756vRTo6m74", Data: []byte("../checksum/cTw0h3AmYe7XudSoyEMByduYXqGi-N5ZkTZ0t9K5elsu3i_jNIVF5T08KR1roBFM")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/W5S65DEhawz_WKaok5NjUKLmnD9dNl5RPauNJjcOVcB3VM4eGhSaLGmXbL8vZpiw", Data: []byte("../checksum/cTw0h3AmYe7XudSoyEMByduYXqGi-N5ZkTZ0t9K5elsu3i_jNIVF5T08KR1roBFM")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/rg7F1D5hwv6o4xctjD5zDq4i5MD0mArTsUIWfhUbik8xC6Bsyt3mjXXOm3goojTz", Data: []byte("../checksum/cTw0h3AmYe7XudSoyEMByduYXqGi-N5ZkTZ0t9K5elsu3i_jNIVF5T08KR1roBFM")},
{Mode: fs.ModeDir | 0700, Path: "temp"},
{Mode: fs.ModeDir | 0700, Path: "work"},
}, pkg.MustDecode("sxbgyX-bPoezbha214n2lbQhiVfTUBkhZ0EX6zI7mmkMdrCdwuMwhMBJphLQsy94"), nil},
}, pkg.MustDecode("NQTlc466JmSVLIyWklm_u8_g95jEEb98PxJU-kjwxLpfdjwMWJq0G8ze9R4Vo1Vu"), nil},
{"sample tar expand step unpack", fstest.MapFS{
".": {Mode: fs.ModeDir | 0500},
@@ -255,8 +255,8 @@ func TestFlatten(t *testing.T) {
"checksum/CH3AiUrCCcVOjOYLaMKKK1Da78989JtfHeIsxMzWOQFiN4mrCLDYpoDxLWqJWCUN/libedac.so": {Mode: fs.ModeSymlink | 0777, Data: []byte("/proc/nonexistent/libedac.so")},
"identifier": {Mode: fs.ModeDir | 0700},
"identifier/-P_1iw6yVq_letMHncqcExSE0bYcDhYI5OdY6b1wKASf-Corufvj__XTBUq2Qd2a": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/CH3AiUrCCcVOjOYLaMKKK1Da78989JtfHeIsxMzWOQFiN4mrCLDYpoDxLWqJWCUN")},
"identifier/0_rRxIqbX9LK9L_KDbuafotFz6HFkonNgO9gXhK1asM_Y1Pxn0amg756vRTo6m74": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/CH3AiUrCCcVOjOYLaMKKK1Da78989JtfHeIsxMzWOQFiN4mrCLDYpoDxLWqJWCUN")},
"identifier/W5S65DEhawz_WKaok5NjUKLmnD9dNl5RPauNJjcOVcB3VM4eGhSaLGmXbL8vZpiw": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/CH3AiUrCCcVOjOYLaMKKK1Da78989JtfHeIsxMzWOQFiN4mrCLDYpoDxLWqJWCUN")},
"identifier/_v1blm2h-_KA-dVaawdpLas6MjHc6rbhhFS8JWwx8iJxZGUu8EBbRrhr5AaZ9PJL": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/CH3AiUrCCcVOjOYLaMKKK1Da78989JtfHeIsxMzWOQFiN4mrCLDYpoDxLWqJWCUN")},
"temp": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
@@ -268,12 +268,12 @@ func TestFlatten(t *testing.T) {
{Mode: fs.ModeSymlink | 0777, Path: "checksum/CH3AiUrCCcVOjOYLaMKKK1Da78989JtfHeIsxMzWOQFiN4mrCLDYpoDxLWqJWCUN/libedac.so", Data: []byte("/proc/nonexistent/libedac.so")},
{Mode: fs.ModeDir | 0700, Path: "identifier"},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/-P_1iw6yVq_letMHncqcExSE0bYcDhYI5OdY6b1wKASf-Corufvj__XTBUq2Qd2a", Data: []byte("../checksum/CH3AiUrCCcVOjOYLaMKKK1Da78989JtfHeIsxMzWOQFiN4mrCLDYpoDxLWqJWCUN")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/0_rRxIqbX9LK9L_KDbuafotFz6HFkonNgO9gXhK1asM_Y1Pxn0amg756vRTo6m74", Data: []byte("../checksum/CH3AiUrCCcVOjOYLaMKKK1Da78989JtfHeIsxMzWOQFiN4mrCLDYpoDxLWqJWCUN")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/W5S65DEhawz_WKaok5NjUKLmnD9dNl5RPauNJjcOVcB3VM4eGhSaLGmXbL8vZpiw", Data: []byte("../checksum/CH3AiUrCCcVOjOYLaMKKK1Da78989JtfHeIsxMzWOQFiN4mrCLDYpoDxLWqJWCUN")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/_v1blm2h-_KA-dVaawdpLas6MjHc6rbhhFS8JWwx8iJxZGUu8EBbRrhr5AaZ9PJL", Data: []byte("../checksum/CH3AiUrCCcVOjOYLaMKKK1Da78989JtfHeIsxMzWOQFiN4mrCLDYpoDxLWqJWCUN")},
{Mode: fs.ModeDir | 0700, Path: "temp"},
{Mode: fs.ModeDir | 0700, Path: "work"},
}, pkg.MustDecode("4I8wx_h7NSJTlG5lbuz-GGEXrOg0GYC3M_503LYEBhv5XGWXfNIdIY9Q3eVSYldX"), nil},
}, pkg.MustDecode("hSoSSgCYTNonX3Q8FjvjD1fBl-E-BQyA6OTXro2OadXqbST4tZ-akGXszdeqphRe"), nil},
{"testtool", fstest.MapFS{
".": {Mode: fs.ModeDir | 0500},
@@ -295,9 +295,9 @@ func TestFlatten(t *testing.T) {
"checksum/OLBgp1GsljhM2TJ-sbHjaiH9txEUvgdDTAzHv2P24donTt6_529l-9Ua0vFImLlb": {Mode: 0400, Data: []byte{}},
"identifier": {Mode: fs.ModeDir | 0700},
"identifier/U2cbgVgEtjfRuvHfE1cQnZ3t8yoexULQyo_VLgvxAVJSsobMcNaFIsuDWtmt7kzK": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
"identifier/_gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/OLBgp1GsljhM2TJ-sbHjaiH9txEUvgdDTAzHv2P24donTt6_529l-9Ua0vFImLlb")},
"identifier/nfeISfLeFDr1k-g3hpE1oZ440kTqDdfF8TDpoLdbTPqaMMIl95oiqcvqjRkMjubA": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"identifier/dztPS6jRjiZtCF4_p8AzfnxGp6obkhrgFVsxdodbKWUoAEVtDz3MykepJB4kI_ks": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
"identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"temp": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
@@ -311,13 +311,13 @@ func TestFlatten(t *testing.T) {
{Mode: 0400, Path: "checksum/OLBgp1GsljhM2TJ-sbHjaiH9txEUvgdDTAzHv2P24donTt6_529l-9Ua0vFImLlb", Data: []byte{}},
{Mode: fs.ModeDir | 0700, Path: "identifier"},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/U2cbgVgEtjfRuvHfE1cQnZ3t8yoexULQyo_VLgvxAVJSsobMcNaFIsuDWtmt7kzK", Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/_gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA", Data: []byte("../checksum/OLBgp1GsljhM2TJ-sbHjaiH9txEUvgdDTAzHv2P24donTt6_529l-9Ua0vFImLlb")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/nfeISfLeFDr1k-g3hpE1oZ440kTqDdfF8TDpoLdbTPqaMMIl95oiqcvqjRkMjubA", Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/dztPS6jRjiZtCF4_p8AzfnxGp6obkhrgFVsxdodbKWUoAEVtDz3MykepJB4kI_ks", Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK", Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
{Mode: fs.ModeDir | 0700, Path: "temp"},
{Mode: fs.ModeDir | 0700, Path: "work"},
}, pkg.MustDecode("UiV6kMz7KrTsc_yphiyQzFLqjRanHxUOwrBMtkKuWo4mOO6WgPFAcoUEeSp7eVIW"), nil},
}, pkg.MustDecode("Q5DluWQCAeohLoiGRImurwFp3vdz9IfQCoj7Fuhh73s4KQPRHpEQEnHTdNHmB8Fx"), nil},
{"testtool net", fstest.MapFS{
".": {Mode: fs.ModeDir | 0500},
@@ -339,9 +339,9 @@ func TestFlatten(t *testing.T) {
"checksum/a1F_i9PVQI4qMcoHgTQkORuyWLkC1GLIxOhDt2JpU1NGAxWc5VJzdlfRK-PYBh3W/check": {Mode: 0400, Data: []byte("net")},
"identifier": {Mode: fs.ModeDir | 0700},
"identifier/G8qPxD9puvvoOVV7lrT80eyDeIl3G_CCFoKw12c8mCjMdG1zF7NEPkwYpNubClK3": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/a1F_i9PVQI4qMcoHgTQkORuyWLkC1GLIxOhDt2JpU1NGAxWc5VJzdlfRK-PYBh3W")},
"identifier/_gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/OLBgp1GsljhM2TJ-sbHjaiH9txEUvgdDTAzHv2P24donTt6_529l-9Ua0vFImLlb")},
"identifier/QdsJhGgnk5N2xdUNGcndXQxFKifxf1V_2t9X8CQ-pDcg24x6mGJC_BiLfGbs6Qml": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/a1F_i9PVQI4qMcoHgTQkORuyWLkC1GLIxOhDt2JpU1NGAxWc5VJzdlfRK-PYBh3W")},
"identifier/nfeISfLeFDr1k-g3hpE1oZ440kTqDdfF8TDpoLdbTPqaMMIl95oiqcvqjRkMjubA": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"temp": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
@@ -355,13 +355,13 @@ func TestFlatten(t *testing.T) {
{Mode: 0400, Path: "checksum/a1F_i9PVQI4qMcoHgTQkORuyWLkC1GLIxOhDt2JpU1NGAxWc5VJzdlfRK-PYBh3W/check", Data: []byte("net")},
{Mode: fs.ModeDir | 0700, Path: "identifier"},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/QdsJhGgnk5N2xdUNGcndXQxFKifxf1V_2t9X8CQ-pDcg24x6mGJC_BiLfGbs6Qml", Data: []byte("../checksum/a1F_i9PVQI4qMcoHgTQkORuyWLkC1GLIxOhDt2JpU1NGAxWc5VJzdlfRK-PYBh3W")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/G8qPxD9puvvoOVV7lrT80eyDeIl3G_CCFoKw12c8mCjMdG1zF7NEPkwYpNubClK3", Data: []byte("../checksum/a1F_i9PVQI4qMcoHgTQkORuyWLkC1GLIxOhDt2JpU1NGAxWc5VJzdlfRK-PYBh3W")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/_gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA", Data: []byte("../checksum/OLBgp1GsljhM2TJ-sbHjaiH9txEUvgdDTAzHv2P24donTt6_529l-9Ua0vFImLlb")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/nfeISfLeFDr1k-g3hpE1oZ440kTqDdfF8TDpoLdbTPqaMMIl95oiqcvqjRkMjubA", Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK", Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
{Mode: fs.ModeDir | 0700, Path: "temp"},
{Mode: fs.ModeDir | 0700, Path: "work"},
}, pkg.MustDecode("ek4K-0d4iRSArkY2TCs3WK34DbiYeOmhE_4vsJTSu_6roY4ZF3YG6eKRooal-i1o"), nil},
}, pkg.MustDecode("bPYvvqxpfV7xcC1EptqyKNK1klLJgYHMDkzBcoOyK6j_Aj5hb0mXNPwTwPSK5F6Z"), nil},
{"sample exec container overlay root", fstest.MapFS{
".": {Mode: fs.ModeDir | 0700},
@@ -372,8 +372,8 @@ func TestFlatten(t *testing.T) {
"checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU": {Mode: fs.ModeDir | 0500},
"identifier": {Mode: fs.ModeDir | 0700},
"identifier/5ey2wpmMpj483YYa7ZZQciYLA2cx3_l167JCqWW4Pd-5DVp81dj9EsBtVTwYptF6": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
"identifier/nfeISfLeFDr1k-g3hpE1oZ440kTqDdfF8TDpoLdbTPqaMMIl95oiqcvqjRkMjubA": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"identifier/RdMA-mubnrHuu3Ky1wWyxauSYCO0ZH_zCPUj3uDHqkfwv5sGcByoF_g5PjlGiClb": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
"identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"temp": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
@@ -386,12 +386,12 @@ func TestFlatten(t *testing.T) {
{Mode: fs.ModeDir | 0500, Path: "checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU"},
{Mode: fs.ModeDir | 0700, Path: "identifier"},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/5ey2wpmMpj483YYa7ZZQciYLA2cx3_l167JCqWW4Pd-5DVp81dj9EsBtVTwYptF6", Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/nfeISfLeFDr1k-g3hpE1oZ440kTqDdfF8TDpoLdbTPqaMMIl95oiqcvqjRkMjubA", Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/RdMA-mubnrHuu3Ky1wWyxauSYCO0ZH_zCPUj3uDHqkfwv5sGcByoF_g5PjlGiClb", Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK", Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
{Mode: fs.ModeDir | 0700, Path: "temp"},
{Mode: fs.ModeDir | 0700, Path: "work"},
}, pkg.MustDecode("VIqqpf0ip9jcyw63i6E8lCMGUcLivQBe4Bevt3WusNac-1MSy5bzB647qGUBzl-W"), nil},
}, pkg.MustDecode("PO2DSSCa4yoSgEYRcCSZfQfwow1yRigL3Ry-hI0RDI4aGuFBha-EfXeSJnG_5_Rl"), nil},
{"sample exec container overlay work", fstest.MapFS{
".": {Mode: fs.ModeDir | 0700},
@@ -402,8 +402,8 @@ func TestFlatten(t *testing.T) {
"checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU": {Mode: fs.ModeDir | 0500},
"identifier": {Mode: fs.ModeDir | 0700},
"identifier/acaDzHZv40dZaz4cGAXayqbRMgbEOuiuiUijZL8IgDQvyeCNMFE3onBMYfny-kXA": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
"identifier/nfeISfLeFDr1k-g3hpE1oZ440kTqDdfF8TDpoLdbTPqaMMIl95oiqcvqjRkMjubA": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"identifier/5hlaukCirnXE4W_RSLJFOZN47Z5RiHnacXzdFp_70cLgiJUGR6cSb_HaFftkzi0-": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
"identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"temp": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
@@ -416,12 +416,12 @@ func TestFlatten(t *testing.T) {
{Mode: fs.ModeDir | 0500, Path: "checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU"},
{Mode: fs.ModeDir | 0700, Path: "identifier"},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/acaDzHZv40dZaz4cGAXayqbRMgbEOuiuiUijZL8IgDQvyeCNMFE3onBMYfny-kXA", Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/nfeISfLeFDr1k-g3hpE1oZ440kTqDdfF8TDpoLdbTPqaMMIl95oiqcvqjRkMjubA", Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/5hlaukCirnXE4W_RSLJFOZN47Z5RiHnacXzdFp_70cLgiJUGR6cSb_HaFftkzi0-", Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK", Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
{Mode: fs.ModeDir | 0700, Path: "temp"},
{Mode: fs.ModeDir | 0700, Path: "work"},
}, pkg.MustDecode("q8x2zQg4YZbKpPqKlEBj_uxXD9vOBaZ852qOuIsl9QdO73I_UMNpuUoPLtunxUYl"), nil},
}, pkg.MustDecode("iaRt6l_Wm2n-h5UsDewZxQkCmjZjyL8r7wv32QT2kyV55-Lx09Dq4gfg9BiwPnKs"), nil},
{"sample exec container multiple layers", fstest.MapFS{
".": {Mode: fs.ModeDir | 0700},
@@ -436,9 +436,9 @@ func TestFlatten(t *testing.T) {
"identifier": {Mode: fs.ModeDir | 0700},
"identifier/_gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/OLBgp1GsljhM2TJ-sbHjaiH9txEUvgdDTAzHv2P24donTt6_529l-9Ua0vFImLlb")},
"identifier/nfeISfLeFDr1k-g3hpE1oZ440kTqDdfF8TDpoLdbTPqaMMIl95oiqcvqjRkMjubA": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"identifier/rXLKjjYfGSyoWmuvEJooHkvGJIZaC0IAWnKGvtPZkM15gBxAgW7mIXcxRVNOXAr4": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/nY_CUdiaUM1OL4cPr5TS92FCJ3rCRV7Hm5oVTzAvMXwC03_QnTRfQ5PPs7mOU9fK")},
"identifier/tfjrsVuBuFgzWgwz-yPppFtylYuC1VFWnKhyBiHbWTGkyz8lt7Ee9QXWaIHPXs4x": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
"identifier/B-kc5iJMx8GtlCua4dz6BiJHnDAOUfPjgpbKq4e-QEn0_CZkSYs3fOA1ve06qMs2": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/nY_CUdiaUM1OL4cPr5TS92FCJ3rCRV7Hm5oVTzAvMXwC03_QnTRfQ5PPs7mOU9fK")},
"identifier/p1t_drXr34i-jZNuxDMLaMOdL6tZvQqhavNafGynGqxOZoXAUTSn7kqNh3Ovv3DT": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
"identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"temp": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
@@ -454,14 +454,14 @@ func TestFlatten(t *testing.T) {
{Mode: 0400, Path: "checksum/nY_CUdiaUM1OL4cPr5TS92FCJ3rCRV7Hm5oVTzAvMXwC03_QnTRfQ5PPs7mOU9fK/check", Data: []byte("layers")},
{Mode: fs.ModeDir | 0700, Path: "identifier"},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/B-kc5iJMx8GtlCua4dz6BiJHnDAOUfPjgpbKq4e-QEn0_CZkSYs3fOA1ve06qMs2", Data: []byte("../checksum/nY_CUdiaUM1OL4cPr5TS92FCJ3rCRV7Hm5oVTzAvMXwC03_QnTRfQ5PPs7mOU9fK")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/_gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA", Data: []byte("../checksum/OLBgp1GsljhM2TJ-sbHjaiH9txEUvgdDTAzHv2P24donTt6_529l-9Ua0vFImLlb")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/nfeISfLeFDr1k-g3hpE1oZ440kTqDdfF8TDpoLdbTPqaMMIl95oiqcvqjRkMjubA", Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/rXLKjjYfGSyoWmuvEJooHkvGJIZaC0IAWnKGvtPZkM15gBxAgW7mIXcxRVNOXAr4", Data: []byte("../checksum/nY_CUdiaUM1OL4cPr5TS92FCJ3rCRV7Hm5oVTzAvMXwC03_QnTRfQ5PPs7mOU9fK")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/tfjrsVuBuFgzWgwz-yPppFtylYuC1VFWnKhyBiHbWTGkyz8lt7Ee9QXWaIHPXs4x", Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/p1t_drXr34i-jZNuxDMLaMOdL6tZvQqhavNafGynGqxOZoXAUTSn7kqNh3Ovv3DT", Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK", Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
{Mode: fs.ModeDir | 0700, Path: "temp"},
{Mode: fs.ModeDir | 0700, Path: "work"},
}, pkg.MustDecode("SITnQ6PTV12PAQQjIuLUxkvsXQiC9Gq_HJQlcb4BPL5YnRHnx8lsW7PRM9YMLBsx"), nil},
}, pkg.MustDecode("O2YzyR7IUGU5J2CADy0hUZ3A5NkP_Vwzs4UadEdn2oMZZVWRtH0xZGJ3HXiimTnZ"), nil},
{"sample exec container layer promotion", fstest.MapFS{
".": {Mode: fs.ModeDir | 0700},
@@ -472,9 +472,9 @@ func TestFlatten(t *testing.T) {
"checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU": {Mode: fs.ModeDir | 0500},
"identifier": {Mode: fs.ModeDir | 0700},
"identifier/1tQZOGmVk_JkpyiG84AKW_BXmlK_MvHUbh5WtMuthGbHUq7i7nL1bvdF-LoJbqNh": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"identifier/O-6VjlIUxc4PYLf5v35uhIeL8kkYCbHYklqlmDjFPXe0m4j6GkUDg5qwTzBRESnf": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
"identifier/nfeISfLeFDr1k-g3hpE1oZ440kTqDdfF8TDpoLdbTPqaMMIl95oiqcvqjRkMjubA": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"identifier/kvJIqZo5DKFOxC2ZQ-8_nPaQzEAz9cIm3p6guO-uLqm-xaiPu7oRkSnsu411jd_U": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
"identifier/xXTIYcXmgJWNLC91c417RRrNM9cjELwEZHpGvf8Fk_GNP5agRJp_SicD0w9aMeLJ": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
"temp": {Mode: fs.ModeDir | 0700},
"work": {Mode: fs.ModeDir | 0700},
@@ -487,13 +487,13 @@ func TestFlatten(t *testing.T) {
{Mode: fs.ModeDir | 0500, Path: "checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU"},
{Mode: fs.ModeDir | 0700, Path: "identifier"},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/1tQZOGmVk_JkpyiG84AKW_BXmlK_MvHUbh5WtMuthGbHUq7i7nL1bvdF-LoJbqNh", Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/O-6VjlIUxc4PYLf5v35uhIeL8kkYCbHYklqlmDjFPXe0m4j6GkUDg5qwTzBRESnf", Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/nfeISfLeFDr1k-g3hpE1oZ440kTqDdfF8TDpoLdbTPqaMMIl95oiqcvqjRkMjubA", Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/kvJIqZo5DKFOxC2ZQ-8_nPaQzEAz9cIm3p6guO-uLqm-xaiPu7oRkSnsu411jd_U", Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/vjz1MHPcGBKV7sjcs8jQP3cqxJ1hgPTiQBMCEHP9BGXjGxd-tJmEmXKaStObo5gK", Data: []byte("../checksum/MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/xXTIYcXmgJWNLC91c417RRrNM9cjELwEZHpGvf8Fk_GNP5agRJp_SicD0w9aMeLJ", Data: []byte("../checksum/GPa4aBakdSJd7Tz7LYj_VJFoojzyZinmVcG3k6M5xI6CZ821J5sXLhLDDuS47gi9")},
{Mode: fs.ModeDir | 0700, Path: "temp"},
{Mode: fs.ModeDir | 0700, Path: "work"},
}, pkg.MustDecode("fuC20BhMKr86TYzNPP2A-9P7mGLvdcOiG10exlhRvZm8ySI7csf0LhW3im_26l1N"), nil},
}, pkg.MustDecode("3EaW6WibLi9gl03_UieiFPaFcPy5p4x3JPxrnLJxGaTI-bh3HU9DK9IMx7c3rrNm"), nil},
{"sample file short", fstest.MapFS{
".": {Mode: fs.ModeDir | 0700},
@@ -502,7 +502,7 @@ func TestFlatten(t *testing.T) {
"checksum/vsAhtPNo4waRNOASwrQwcIPTqb3SBuJOXw2G4T1mNmVZM-wrQTRllmgXqcIIoRcX": {Mode: 0400, Data: []byte{0}},
"identifier": {Mode: fs.ModeDir | 0700},
"identifier/lIx_W4M7tVOcQ8jh08EJOfXf4brRmkEEjvUa7c17vVUzlmtUxlhhrgqmc9aZhjbn": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/vsAhtPNo4waRNOASwrQwcIPTqb3SBuJOXw2G4T1mNmVZM-wrQTRllmgXqcIIoRcX")},
"identifier/3376ALA7hIUm2LbzH2fDvRezgzod1eTK_G6XjyOgbM2u-6swvkFaF0BOwSl_juBi": {Mode: fs.ModeSymlink | 0777, Data: []byte("../checksum/vsAhtPNo4waRNOASwrQwcIPTqb3SBuJOXw2G4T1mNmVZM-wrQTRllmgXqcIIoRcX")},
"work": {Mode: fs.ModeDir | 0700},
}, []pkg.FlatEntry{
@@ -511,10 +511,10 @@ func TestFlatten(t *testing.T) {
{Mode: 0400, Path: "checksum/vsAhtPNo4waRNOASwrQwcIPTqb3SBuJOXw2G4T1mNmVZM-wrQTRllmgXqcIIoRcX", Data: []byte{0}},
{Mode: fs.ModeDir | 0700, Path: "identifier"},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/lIx_W4M7tVOcQ8jh08EJOfXf4brRmkEEjvUa7c17vVUzlmtUxlhhrgqmc9aZhjbn", Data: []byte("../checksum/vsAhtPNo4waRNOASwrQwcIPTqb3SBuJOXw2G4T1mNmVZM-wrQTRllmgXqcIIoRcX")},
{Mode: fs.ModeSymlink | 0777, Path: "identifier/3376ALA7hIUm2LbzH2fDvRezgzod1eTK_G6XjyOgbM2u-6swvkFaF0BOwSl_juBi", Data: []byte("../checksum/vsAhtPNo4waRNOASwrQwcIPTqb3SBuJOXw2G4T1mNmVZM-wrQTRllmgXqcIIoRcX")},
{Mode: fs.ModeDir | 0700, Path: "work"},
}, pkg.MustDecode("hnrfmJtivNKcgtETsKnU9gP_OwPgpNY3DSUJnmxnmeOODSO-YBvEBiTgieY4AAd7"), nil},
}, pkg.MustDecode("iR6H5OIsyOW4EwEgtm9rGzGF6DVtyHLySEtwnFE8bnus9VJcoCbR4JIek7Lw-vwT"), nil},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {

View File

@@ -18,6 +18,7 @@ import (
"hakurei.app/container"
"hakurei.app/container/check"
"hakurei.app/container/fhs"
"hakurei.app/container/seccomp"
"hakurei.app/container/std"
"hakurei.app/message"
)
@@ -102,8 +103,7 @@ type execArtifact struct {
args []string
// Duration the initial process is allowed to run. The zero value is
// equivalent to execTimeoutDefault. This value is never encoded in Params
// because it cannot affect outcome.
// equivalent to [ExecTimeoutDefault].
timeout time.Duration
// Caller-supplied exclusivity value, returned as is by IsExclusive.
@@ -128,12 +128,6 @@ func (a *execNetArtifact) Checksum() Checksum { return a.checksum }
// Kind returns the hardcoded [Kind] constant.
func (*execNetArtifact) Kind() Kind { return KindExecNet }
// Params is [Checksum] concatenated with [KindExec] params.
func (a *execNetArtifact) Params(ctx *IContext) {
ctx.GetHash().Write(a.checksum[:])
a.execArtifact.Params(ctx)
}
// Cure cures the [Artifact] in the container described by the caller. The
// container retains host networking.
func (a *execNetArtifact) Cure(f *FContext) error {
@@ -197,38 +191,131 @@ func (*execArtifact) Kind() Kind { return KindExec }
// Params writes paths, executable pathname and args.
func (a *execArtifact) Params(ctx *IContext) {
h := ctx.GetHash()
ctx.WriteString(a.name)
_0, _1 := []byte{0}, []byte{1}
ctx.WriteUint32(uint32(len(a.paths)))
for _, p := range a.paths {
if p.W {
h.Write(_1)
} else {
h.Write(_0)
}
if p.P != nil {
h.Write([]byte(p.P.String()))
ctx.WriteString(p.P.String())
} else {
h.Write([]byte("invalid P\x00"))
ctx.WriteString("invalid P\x00")
}
h.Write(_0)
ctx.WriteUint32(uint32(len(p.A)))
for _, d := range p.A {
ctx.WriteIdent(d)
}
h.Write(_0)
if p.W {
ctx.WriteUint32(1)
} else {
ctx.WriteUint32(0)
}
}
h.Write(_0)
h.Write([]byte(a.dir.String()))
h.Write(_0)
ctx.WriteString(a.dir.String())
ctx.WriteUint32(uint32(len(a.env)))
for _, e := range a.env {
h.Write([]byte(e))
ctx.WriteString(e)
}
h.Write(_0)
h.Write([]byte(a.path.String()))
h.Write(_0)
ctx.WriteString(a.path.String())
ctx.WriteUint32(uint32(len(a.args)))
for _, arg := range a.args {
h.Write([]byte(arg))
ctx.WriteString(arg)
}
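// The 64-bit timeout below is encoded as two uint32 words, low word first;
// readExecArtifact reads them back in the same order.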
ctx.WriteUint32(uint32(a.timeout & 0xffffffff))
ctx.WriteUint32(uint32(a.timeout >> 32))
if a.exclusive {
ctx.WriteUint32(1)
} else {
ctx.WriteUint32(0)
}
}
// readExecArtifact interprets IR values and returns the address of execArtifact
// or execNetArtifact.
func readExecArtifact(r *IRReader, net bool) Artifact {
r.DiscardAll()
name := r.ReadString()
sz := r.ReadUint32()
if sz > irMaxDeps {
panic(ErrIRDepend)
}
paths := make([]ExecPath, sz)
for i := range paths {
paths[i].P = check.MustAbs(r.ReadString())
sz = r.ReadUint32()
if sz > irMaxDeps {
panic(ErrIRDepend)
}
paths[i].A = make([]Artifact, sz)
for j := range paths[i].A {
paths[i].A[j] = r.ReadIdent()
}
paths[i].W = r.ReadUint32() != 0
}
dir := check.MustAbs(r.ReadString())
sz = r.ReadUint32()
if sz > irMaxValues {
panic(ErrIRValues)
}
env := make([]string, sz)
for i := range env {
env[i] = r.ReadString()
}
pathname := check.MustAbs(r.ReadString())
sz = r.ReadUint32()
if sz > irMaxValues {
panic(ErrIRValues)
}
args := make([]string, sz)
for i := range args {
args[i] = r.ReadString()
}
timeout := time.Duration(r.ReadUint32())
timeout |= time.Duration(r.ReadUint32()) << 32
exclusive := r.ReadUint32() != 0
checksum, ok := r.Finalise()
var checksumP *Checksum
if net {
if !ok {
panic(ErrExpectedChecksum)
}
checksumVal := checksum.Value()
checksumP = &checksumVal
} else {
if ok {
panic(ErrUnexpectedChecksum)
}
}
return NewExec(
name, checksumP, timeout, exclusive, dir, env, pathname, args, paths...,
)
}
func init() {
register(KindExec,
func(r *IRReader) Artifact { return readExecArtifact(r, false) })
register(KindExecNet,
func(r *IRReader) Artifact { return readExecArtifact(r, true) })
}
// Dependencies returns a slice of all artifacts collected from caller-supplied
@@ -302,6 +389,7 @@ func (a *execArtifact) cure(f *FContext, hostNet bool) (err error) {
z := container.New(ctx, f.GetMessage())
z.WaitDelay = execWaitDelay
z.SeccompPresets |= std.PresetStrict & ^std.PresetDenyNS
z.SeccompFlags |= seccomp.AllowMultiarch
z.ParentPerm = 0700
z.HostNet = hostNet
z.Hostname = "cure"

View File

@@ -78,7 +78,7 @@ func TestExec(t *testing.T) {
), nil, pkg.Checksum{}, &pkg.DependencyCureError{
{
Ident: unique.Make(pkg.ID(pkg.MustDecode(
"CWEoJqnSBpWf8uryC2qnIe3O1a_FZWUWZGbiVPsQFGW7pvDHiSwoK3QCU9-uxN87",
"Sowo6oZRmG6xVtUaxB6bDWZhVsqAJsIJWUp0OPKlE103cY0lodx7dem8J-qQF0Z1",
))),
Err: stub.UniqueError(0xcafe),
},
@@ -109,7 +109,7 @@ func TestExec(t *testing.T) {
}
testtoolDestroy(t, base, c)
}, pkg.MustDecode("UiV6kMz7KrTsc_yphiyQzFLqjRanHxUOwrBMtkKuWo4mOO6WgPFAcoUEeSp7eVIW")},
}, pkg.MustDecode("Q5DluWQCAeohLoiGRImurwFp3vdz9IfQCoj7Fuhh73s4KQPRHpEQEnHTdNHmB8Fx")},
{"net", nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
c.SetStrict(true)
@@ -144,7 +144,7 @@ func TestExec(t *testing.T) {
})
testtoolDestroy(t, base, c)
}, pkg.MustDecode("ek4K-0d4iRSArkY2TCs3WK34DbiYeOmhE_4vsJTSu_6roY4ZF3YG6eKRooal-i1o")},
}, pkg.MustDecode("bPYvvqxpfV7xcC1EptqyKNK1klLJgYHMDkzBcoOyK6j_Aj5hb0mXNPwTwPSK5F6Z")},
{"overlay root", nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
c.SetStrict(true)
@@ -170,7 +170,7 @@ func TestExec(t *testing.T) {
})
testtoolDestroy(t, base, c)
}, pkg.MustDecode("VIqqpf0ip9jcyw63i6E8lCMGUcLivQBe4Bevt3WusNac-1MSy5bzB647qGUBzl-W")},
}, pkg.MustDecode("PO2DSSCa4yoSgEYRcCSZfQfwow1yRigL3Ry-hI0RDI4aGuFBha-EfXeSJnG_5_Rl")},
{"overlay work", nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
c.SetStrict(true)
@@ -201,7 +201,7 @@ func TestExec(t *testing.T) {
})
testtoolDestroy(t, base, c)
}, pkg.MustDecode("q8x2zQg4YZbKpPqKlEBj_uxXD9vOBaZ852qOuIsl9QdO73I_UMNpuUoPLtunxUYl")},
}, pkg.MustDecode("iaRt6l_Wm2n-h5UsDewZxQkCmjZjyL8r7wv32QT2kyV55-Lx09Dq4gfg9BiwPnKs")},
{"multiple layers", nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
c.SetStrict(true)
@@ -254,7 +254,7 @@ func TestExec(t *testing.T) {
})
testtoolDestroy(t, base, c)
}, pkg.MustDecode("SITnQ6PTV12PAQQjIuLUxkvsXQiC9Gq_HJQlcb4BPL5YnRHnx8lsW7PRM9YMLBsx")},
}, pkg.MustDecode("O2YzyR7IUGU5J2CADy0hUZ3A5NkP_Vwzs4UadEdn2oMZZVWRtH0xZGJ3HXiimTnZ")},
{"overlay layer promotion", nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
c.SetStrict(true)
@@ -286,7 +286,7 @@ func TestExec(t *testing.T) {
})
testtoolDestroy(t, base, c)
}, pkg.MustDecode("fuC20BhMKr86TYzNPP2A-9P7mGLvdcOiG10exlhRvZm8ySI7csf0LhW3im_26l1N")},
}, pkg.MustDecode("3EaW6WibLi9gl03_UieiFPaFcPy5p4x3JPxrnLJxGaTI-bh3HU9DK9IMx7c3rrNm")},
})
}


@@ -25,6 +25,12 @@ var _ KnownChecksum = new(fileArtifactNamed)
// String returns the caller-supplied reporting name.
func (a *fileArtifactNamed) String() string { return a.name }
// Params writes the caller-supplied reporting name and the file body.
func (a *fileArtifactNamed) Params(ctx *IContext) {
ctx.WriteString(a.name)
ctx.Write(a.fileArtifact)
}
// NewFile returns a [FileArtifact] that cures into a caller-supplied byte slice.
//
// Caller must not modify data after NewFile returns.
@@ -39,8 +45,22 @@ func NewFile(name string, data []byte) FileArtifact {
// Kind returns the hardcoded [Kind] constant.
func (*fileArtifact) Kind() Kind { return KindFile }
// Params writes the result of Cure.
func (a *fileArtifact) Params(ctx *IContext) { ctx.GetHash().Write(*a) }
// Params writes an empty string and the file body.
func (a *fileArtifact) Params(ctx *IContext) {
ctx.WriteString("")
ctx.Write(*a)
}
func init() {
register(KindFile, func(r *IRReader) Artifact {
name := r.ReadString()
data := r.ReadStringBytes()
if _, ok := r.Finalise(); !ok {
panic(ErrExpectedChecksum)
}
return NewFile(name, data)
})
}
// Dependencies returns a nil slice.
func (*fileArtifact) Dependencies() []Artifact { return nil }


@@ -17,13 +17,13 @@ func TestFile(t *testing.T) {
cureMany(t, c, []cureStep{
{"short", pkg.NewFile("null", []byte{0}), base.Append(
"identifier",
"lIx_W4M7tVOcQ8jh08EJOfXf4brRmkEEjvUa7c17vVUzlmtUxlhhrgqmc9aZhjbn",
"3376ALA7hIUm2LbzH2fDvRezgzod1eTK_G6XjyOgbM2u-6swvkFaF0BOwSl_juBi",
), pkg.MustDecode(
"vsAhtPNo4waRNOASwrQwcIPTqb3SBuJOXw2G4T1mNmVZM-wrQTRllmgXqcIIoRcX",
), nil},
})
}, pkg.MustDecode(
"hnrfmJtivNKcgtETsKnU9gP_OwPgpNY3DSUJnmxnmeOODSO-YBvEBiTgieY4AAd7",
"iR6H5OIsyOW4EwEgtm9rGzGF6DVtyHLySEtwnFE8bnus9VJcoCbR4JIek7Lw-vwT",
)},
})
}

762
internal/pkg/ir.go Normal file

@@ -0,0 +1,762 @@
package pkg
import (
"bufio"
"bytes"
"context"
"crypto/sha512"
"encoding/binary"
"errors"
"fmt"
"io"
"slices"
"strconv"
"syscall"
"unique"
"unsafe"
)
// wordSize is the boundary which binary segments are always aligned to.
const wordSize = 8
// alignSize returns the padded size for aligning sz.
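// For example, alignSize(5) == 8 while alignSize(8) remains 8.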
func alignSize(sz int) int {
return sz + (wordSize-(sz)%wordSize)%wordSize
}
// panicToError recovers from a panic and replaces a nil error with the panicked
// error value. If the value does not implement error, it is re-panicked.
func panicToError(errP *error) {
r := recover()
if r == nil {
return
}
if err, ok := r.(error); !ok {
panic(r)
} else if *errP == nil {
*errP = err
}
}
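// exampleGuard is a minimal, hypothetical sketch of the intended calling
// pattern: deferring panicToError converts a panicked error value into the
// named return without overwriting an error that was already set.
func exampleGuard() (err error) {
	defer panicToError(&err)
	panic(io.ErrUnexpectedEOF) // any panicked value implementing error is captured
}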
// IContext is passed to [Artifact.Params] and provides methods for writing
// values to the IR writer. It does not expose the underlying [io.Writer].
//
// IContext is valid until [Artifact.Params] returns.
type IContext struct {
// Address of underlying [Cache], should be zeroed or made unusable after
// [Artifact.Params] returns and must not be exposed directly.
cache *Cache
// Written to by various methods, should be zeroed after [Artifact.Params]
// returns and must not be exposed directly.
w io.Writer
}
// Unwrap returns the underlying [context.Context].
func (i *IContext) Unwrap() context.Context { return i.cache.ctx }
// irZero is a zero IR word.
var irZero [wordSize]byte
// IRValueKind denotes the kind of encoded value.
type IRValueKind uint32
const (
// IRKindEnd denotes the end of the current parameters stream. The ancillary
// value is interpreted as [IREndFlag].
IRKindEnd IRValueKind = iota
// IRKindIdent denotes the identifier of a dependency [Artifact]. The
// ancillary value is reserved for future use.
IRKindIdent
// IRKindUint32 denotes an inlined uint32 value.
IRKindUint32
// IRKindString denotes a string with its true length encoded in header
// ancillary data. Its wire length is always aligned to an 8-byte boundary.
IRKindString
irHeaderShift = 32
irHeaderMask = 0xffffffff
)
// String returns a user-facing name of k.
func (k IRValueKind) String() string {
switch k {
case IRKindEnd:
return "terminator"
case IRKindIdent:
return "ident"
case IRKindUint32:
return "uint32"
case IRKindString:
return "string"
default:
return "invalid kind " + strconv.Itoa(int(k))
}
}
// irValueHeader encodes [IRValueKind] and a 32-bit ancillary value.
type irValueHeader uint64
// encodeHeader returns irValueHeader encoding [IRValueKind] and ancillary data.
func (k IRValueKind) encodeHeader(v uint32) irValueHeader {
return irValueHeader(v)<<irHeaderShift | irValueHeader(k)
}
// put stores h in b[0:8].
func (h irValueHeader) put(b []byte) {
binary.LittleEndian.PutUint64(b[:], uint64(h))
}
// append appends the bytes of h to b and returns the appended slice.
func (h irValueHeader) append(b []byte) []byte {
return binary.LittleEndian.AppendUint64(b, uint64(h))
}
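// exampleStringWire is a minimal, hypothetical sketch of the wire layout of a
// single [IRKindString] value: an 8-byte header carrying the kind and the true
// length, followed by the payload padded with zero bytes to the next 8-byte
// boundary.
func exampleStringWire(s string) []byte {
	b := IRKindString.encodeHeader(uint32(len(s))).append(nil)
	b = append(b, s...)
	for len(b)%wordSize != 0 {
		b = append(b, 0)
	}
	return b
}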
// IREndFlag is ancillary data encoded in the header of an [IRKindEnd] value and
// specifies the presence of optional fields in the remaining [IRKindEnd] data.
// Order of present fields is the order of their corresponding constants defined
// below.
type IREndFlag uint32
const (
// IREndKnownChecksum denotes a [KnownChecksum] artifact. For an [IRKindEnd]
// value with this flag set, the remaining data contains the [Checksum].
IREndKnownChecksum IREndFlag = 1 << iota
)
// mustWrite writes to IContext.w and panics on error. The panic is recovered
// from by the caller and used as the return value.
func (i *IContext) mustWrite(p []byte) {
if _, err := i.w.Write(p); err != nil {
panic(err)
}
}
// WriteIdent writes the identifier of [Artifact] to the IR. The behaviour of
// WriteIdent is not defined for an [Artifact] not part of the slice returned by
// [Artifact.Dependencies].
func (i *IContext) WriteIdent(a Artifact) {
buf := i.cache.getIdentBuf()
defer i.cache.putIdentBuf(buf)
IRKindIdent.encodeHeader(0).put(buf[:])
*(*ID)(buf[wordSize:]) = i.cache.Ident(a).Value()
i.mustWrite(buf[:])
}
// WriteUint32 writes a uint32 value to the IR.
func (i *IContext) WriteUint32(v uint32) {
i.mustWrite(IRKindUint32.encodeHeader(v).append(nil))
}
// irMaxStringLength is the maximum acceptable wire size of [IRKindString].
const irMaxStringLength = 1 << 20
// IRStringError is a string value too big to encode in IR.
type IRStringError string
func (IRStringError) Error() string {
return "params value too big to encode in IR"
}
// Write writes p as a string value to the IR.
func (i *IContext) Write(p []byte) {
sz := alignSize(len(p))
if len(p) > irMaxStringLength || sz > irMaxStringLength {
panic(IRStringError(p))
}
i.mustWrite(IRKindString.encodeHeader(uint32(len(p))).append(nil))
i.mustWrite(p)
psz := sz - len(p)
if psz > 0 {
i.mustWrite(irZero[:psz])
}
}
// WriteString writes s as a string value to the IR.
func (i *IContext) WriteString(s string) {
p := unsafe.Slice(unsafe.StringData(s), len(s))
i.Write(p)
}
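// exampleParams is a minimal, hypothetical sketch of the calling convention
// expected from [Artifact.Params]: values are written in a fixed order through
// the typed helpers so the emitted IR, and therefore the identifier, stays
// deterministic. The parameters url, compression and source stand in for the
// fields of a real implementation.
func exampleParams(ctx *IContext, url string, compression uint32, source Artifact) {
	ctx.WriteString(url)         // string parameter
	ctx.WriteUint32(compression) // inlined uint32 parameter
	ctx.WriteIdent(source)       // dependency referenced by identifier
}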
// Encode writes a deterministic, efficient representation of a to w and returns
// the first non-nil error encountered while writing to w.
func (c *Cache) Encode(w io.Writer, a Artifact) (err error) {
deps := a.Dependencies()
idents := make([]*extIdent, len(deps))
for i, d := range deps {
dbuf, did := c.unsafeIdent(d, true)
if dbuf == nil {
dbuf = c.getIdentBuf()
binary.LittleEndian.PutUint64(dbuf[:], uint64(d.Kind()))
*(*ID)(dbuf[wordSize:]) = did.Value()
} else {
c.storeIdent(d, dbuf)
}
defer c.putIdentBuf(dbuf)
idents[i] = dbuf
}
slices.SortFunc(idents, func(a, b *extIdent) int {
return bytes.Compare(a[:], b[:])
})
idents = slices.CompactFunc(idents, func(a, b *extIdent) bool {
return *a == *b
})
// kind uint64 | deps_sz uint64
var buf [wordSize * 2]byte
binary.LittleEndian.PutUint64(buf[:], uint64(a.Kind()))
binary.LittleEndian.PutUint64(buf[wordSize:], uint64(len(idents)))
if _, err = w.Write(buf[:]); err != nil {
return
}
for _, dn := range idents {
// kind uint64 | ident ID
if _, err = w.Write(dn[:]); err != nil {
return
}
}
func() {
i := IContext{c, w}
defer panicToError(&err)
defer func() { i.cache, i.w = nil, nil }()
a.Params(&i)
}()
if err != nil {
return
}
var f IREndFlag
kcBuf := c.getIdentBuf()
sz := wordSize
if kc, ok := a.(KnownChecksum); ok {
f |= IREndKnownChecksum
*(*Checksum)(kcBuf[wordSize:]) = kc.Checksum()
sz += len(Checksum{})
}
IRKindEnd.encodeHeader(uint32(f)).put(kcBuf[:])
_, err = w.Write(kcBuf[:sz])
c.putIdentBuf(kcBuf)
return
}
// encodeAll implements EncodeAll by recursively encoding dependencies and
// performing deduplication by value via the encoded map.
func (c *Cache) encodeAll(
w io.Writer,
a Artifact,
encoded map[Artifact]struct{},
) (err error) {
if _, ok := encoded[a]; ok {
return
}
for _, d := range a.Dependencies() {
if err = c.encodeAll(w, d, encoded); err != nil {
return
}
}
encoded[a] = struct{}{}
return c.Encode(w, a)
}
// EncodeAll writes a self-describing IR stream of a to w and returns the first
// non-nil error encountered while writing to w.
//
// EncodeAll tries to avoid encoding the same [Artifact] more than once;
// however, it can only deduplicate artifacts that compare equal by value, as
// anything stronger would require buffering and greatly reduce performance.
// It is therefore up to the caller to avoid representing equivalent
// dependencies in a way where they do not compare equal. While an IR stream with
// repeated artifacts is valid, it is somewhat inefficient, and the reference
// [IRDecoder] implementation produces a warning for it.
//
// Note that while EncodeAll makes use of the ident free list, it does not use
// the ident cache, nor does it contribute identifiers it computes back to the
// ident cache. Because of this, multiple invocations of EncodeAll will have
// similar cost and do not amortise when combined with a call to Cure.
func (c *Cache) EncodeAll(w io.Writer, a Artifact) error {
return c.encodeAll(w, a, make(map[Artifact]struct{}))
}
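// exampleRoundtrip is a minimal, hypothetical sketch of the encode/decode
// cycle: the stream written by EncodeAll is self-describing, so feeding it
// back into a decoder yields an [Artifact] equivalent to the original.
func exampleRoundtrip(c *Cache, a Artifact) (Artifact, error) {
	var buf bytes.Buffer
	if err := c.EncodeAll(&buf, a); err != nil {
		return nil, err
	}
	return c.NewDecoder(&buf).Decode()
}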
// ErrRemainingIR is returned for an [IRReadFunc] that fails to call
// [IRReader.Finalise] before returning.
var ErrRemainingIR = errors.New("implementation did not consume final value")
// DanglingIdentError is an identifier in an [IRKindIdent] value that was never
// described in the IR stream before it was encountered.
type DanglingIdentError unique.Handle[ID]
func (e DanglingIdentError) Error() string {
return "artifact " + Encode(unique.Handle[ID](e).Value()) +
" was never described"
}
type (
// IRDecoder decodes [Artifact] from an IR stream. The stream is read to
// EOF and the final [Artifact] is returned. Previous artifacts may be
// looked up by their identifier.
//
// An [Artifact] may appear more than once in the same IR stream. A
// repeating [Artifact] generates a warning via [Cache], shown only if
// verbose logging is enabled. Artifacts may only depend on artifacts
// previously described in the IR stream.
//
// Methods of IRDecoder are not safe for concurrent use.
IRDecoder struct {
// Address of underlying [Cache], must not be exposed directly.
c *Cache
// Underlying IR reader. Methods of [IRReader] must not use this as it
// bypasses ident measurement.
r io.Reader
// Artifacts already seen in the IR stream.
ident map[unique.Handle[ID]]Artifact
// Whether Decode returned, and the entire IR stream was decoded.
done, ok bool
}
// IRReader provides methods to decode the IR wire format and read values
// from the reader embedded in the underlying [IRDecoder]. It is
// deliberately impossible to obtain the [IRValueKind] of the next value,
// and callers must never recover from panics in any read method.
//
// It is the responsibility of the caller to call Finalise after all IR
// values have been read. Failure to call Finalise causes the resulting
// [Artifact] to be rejected with [ErrRemainingIR].
//
// For an [Artifact] expected to have dependencies, the caller must consume
// all dependencies by calling Next until all dependencies are depleted, or
// call DiscardAll to explicitly discard them and rely on values encoded as
// [IRKindIdent] instead. Failure to consume all unstructured dependencies
// causes the resulting [Artifact] to be rejected with [MissedDependencyError].
//
// Requesting the value of an unstructured dependency not yet described in
// the IR stream via Next, or reading an [IRKindIdent] value not part of
// unstructured dependencies via ReadIdent may cause the resulting
// [Artifact] to be rejected with [DanglingIdentError], however either
// method may return a non-nil [Artifact] implementation of unspecified
// value.
IRReader struct {
// Address of underlying [IRDecoder], should be zeroed or made unusable
// after finalisation and must not be exposed directly.
d *IRDecoder
// Common buffer for word-sized reads.
buf [wordSize]byte
// Dependencies sent before params, sorted by identifier. Resliced on
// each call to Next and checked to be depleted during Finalise.
deps []*extIdent
// Number of values already read, -1 denotes a finalised IRReader.
count int
// Header of value currently being read.
h irValueHeader
// Measured IR reader. All reads for the current [Artifact] must go
// through this to produce a correct ident.
r io.Reader
// Buffers measured writes. Flushed and returned to d during Finalise.
ibw *bufio.Writer
}
// IRReadFunc reads IR values written by [Artifact.Params] to produce an
// instance of [Artifact] identical to the one to produce these values.
IRReadFunc func(r *IRReader) Artifact
)
// kind returns the [IRValueKind] encoded in h.
func (h irValueHeader) kind() IRValueKind {
return IRValueKind(h & irHeaderMask)
}
// value returns ancillary data encoded in h.
func (h irValueHeader) value() uint32 {
return uint32(h >> irHeaderShift)
}
// irArtifact refers to artifact IR interpretation functions and must not be
// written to directly.
var irArtifact = make(map[Kind]IRReadFunc)
// InvalidKindError is an unregistered [Kind] value.
type InvalidKindError Kind
func (e InvalidKindError) Error() string {
return "invalid artifact kind " + strconv.Itoa(int(e))
}
// register records the [IRReadFunc] of an implementation of [Artifact] under
// the specified [Kind]. Expecting to be used only during initialization, it
// panics if the mapping between [Kind] and [IRReadFunc] is not a bijection.
//
// register is not safe for concurrent use. register must not be called after
// the first instance of [Cache] has been opened.
func register(k Kind, f IRReadFunc) {
if _, ok := irArtifact[k]; ok {
panic("attempting to register " + strconv.Itoa(int(k)) + " twice")
}
irArtifact[k] = f
}
// Register records the [IRReadFunc] of a custom implementation of [Artifact]
// under the specified [Kind]. Expecting to be used only during initialization,
// it panics if the mapping between [Kind] and [IRReadFunc] is not a bijection,
// or the specified [Kind] is below [KindCustomOffset].
//
// Register is not safe for concurrent use. Register must not be called after
// the first instance of [Cache] has been opened.
func Register(k Kind, f IRReadFunc) {
if k < KindCustomOffset {
panic("attempting to register within internal kind range")
}
register(k, f)
}
// NewDecoder returns a new [IRDecoder] that reads from the [io.Reader].
func (c *Cache) NewDecoder(r io.Reader) *IRDecoder {
return &IRDecoder{c, r, make(map[unique.Handle[ID]]Artifact), false, false}
}
const (
// irMaxValues is the arbitrary maximum number of values allowed to be
// written by [Artifact.Params] and subsequently read via [IRReader].
irMaxValues = 1 << 12
// irMaxDeps is the arbitrary maximum number of direct dependencies allowed
// to be returned by [Artifact.Dependencies] and subsequently decoded by
// [IRDecoder].
irMaxDeps = 1 << 10
)
var (
// ErrIRValues is returned for an [Artifact] with too many parameter values.
ErrIRValues = errors.New("artifact has too many IR parameter values")
// ErrIRDepend is returned for an [Artifact] with too many dependencies.
ErrIRDepend = errors.New("artifact has too many dependencies")
// ErrAlreadyFinalised is returned when attempting to use an [IRReader] that
// has already been finalised.
ErrAlreadyFinalised = errors.New("reader has already finalised")
)
// enterReader panics with an appropriate error for an out-of-bounds count and
// must be called at some point in any exported method.
func (ir *IRReader) enterReader(read bool) {
if ir.count < 0 {
panic(ErrAlreadyFinalised)
}
if ir.count >= irMaxValues {
panic(ErrIRValues)
}
if read {
ir.count++
}
}
// IRKindError describes an attempt to read an IR value of unexpected kind.
type IRKindError struct {
Got, Want IRValueKind
Ancillary uint32
}
func (e *IRKindError) Error() string {
return fmt.Sprintf(
"got %s IR value (%#x) instead of %s",
e.Got, e.Ancillary, e.Want,
)
}
// readFull reads until either p is filled or an error is encountered.
func (ir *IRReader) readFull(p []byte) (n int, err error) {
for n < len(p) && err == nil {
var nn int
nn, err = ir.r.Read(p[n:])
n += nn
}
return
}
// mustRead reads from the underlying measured reader and panics on error. If
// [io.EOF] is encountered and n != len(p), the error is promoted to
// [io.ErrUnexpectedEOF]; if n == len(p), the error is discarded instead.
func (ir *IRReader) mustRead(p []byte) {
n, err := ir.readFull(p)
if err == nil {
return
}
if errors.Is(err, io.EOF) {
if n == len(p) {
return
}
err = io.ErrUnexpectedEOF
}
panic(err)
}
// mustReadHeader reads the next header from the measured reader and checks its kind.
func (ir *IRReader) mustReadHeader(k IRValueKind) {
ir.mustRead(ir.buf[:])
ir.h = irValueHeader(binary.LittleEndian.Uint64(ir.buf[:]))
if wk := ir.h.kind(); wk != k {
panic(&IRKindError{wk, k, ir.h.value()})
}
}
// putAll returns all dependency buffers to the underlying [Cache].
func (ir *IRReader) putAll() {
for _, buf := range ir.deps {
ir.d.c.putIdentBuf(buf)
}
ir.deps = nil
}
// DiscardAll discards all unstructured dependencies. This is useful to
// implementations that encode dependencies as [IRKindIdent] values, which are
// read back via ReadIdent.
func (ir *IRReader) DiscardAll() {
if ir.deps == nil {
panic("attempting to discard dependencies twice")
}
ir.putAll()
}
// ErrDependencyDepleted is returned when attempting to advance to the next
// unstructured dependency when there are none left.
var ErrDependencyDepleted = errors.New("reading past end of dependencies")
// Next returns the next unstructured dependency.
func (ir *IRReader) Next() Artifact {
if len(ir.deps) == 0 {
panic(ErrDependencyDepleted)
}
id := unique.Make(ID(ir.deps[0][wordSize:]))
ir.d.c.putIdentBuf(ir.deps[0])
ir.deps = ir.deps[1:]
if a, ok := ir.d.ident[id]; !ok {
ir.putAll()
panic(DanglingIdentError(id))
} else {
return a
}
}
// MissedDependencyError is the number of unstructured dependencies remaining
// in [IRReader] that were never requested or explicitly discarded before
// finalisation.
type MissedDependencyError int
func (e MissedDependencyError) Error() string {
return "missed " + strconv.Itoa(int(e)) + " unstructured dependencies"
}
var (
// ErrUnexpectedChecksum is returned by an [IRReadFunc] that does not expect
// a checksum but received one in [IRKindEnd] anyway.
ErrUnexpectedChecksum = errors.New("checksum specified on unsupported artifact")
// ErrExpectedChecksum is returned by an [IRReadFunc] that expects a checksum
// but did not receive one in [IRKindEnd].
ErrExpectedChecksum = errors.New("checksum required but not specified")
)
// Finalise reads the final [IRKindEnd] value and marks r as finalised. Methods
// of r are invalid upon entry into Finalise. If a [Checksum] is available via
// [IREndKnownChecksum], its handle is returned and the caller must store its
// value in the resulting [Artifact].
func (ir *IRReader) Finalise() (checksum unique.Handle[Checksum], ok bool) {
ir.enterReader(true)
ir.count = -1
ir.mustReadHeader(IRKindEnd)
f := IREndFlag(ir.h.value())
if f&IREndKnownChecksum != 0 {
buf := ir.d.c.getIdentBuf()
defer ir.d.c.putIdentBuf(buf)
ir.mustRead(buf[wordSize:])
checksum = unique.Make(Checksum(buf[wordSize:]))
ok = true
}
if err := ir.ibw.Flush(); err != nil {
panic(err)
}
ir.r, ir.ibw = nil, nil
if len(ir.deps) != 0 {
panic(MissedDependencyError(len(ir.deps)))
}
return
}
// ReadIdent reads the next value as [IRKindIdent].
func (ir *IRReader) ReadIdent() Artifact {
ir.enterReader(true)
ir.mustReadHeader(IRKindIdent)
buf := ir.d.c.getIdentBuf()
defer ir.d.c.putIdentBuf(buf)
ir.mustRead(buf[wordSize:])
id := unique.Make(ID(buf[wordSize:]))
if a, ok := ir.d.ident[id]; !ok {
panic(DanglingIdentError(id))
} else {
return a
}
}
// ReadUint32 reads the next value as [IRKindUint32].
func (ir *IRReader) ReadUint32() uint32 {
ir.enterReader(true)
ir.mustReadHeader(IRKindUint32)
return ir.h.value()
}
// ReadStringBytes reads the next value as [IRKindString] but returns it as a
// byte slice instead.
func (ir *IRReader) ReadStringBytes() []byte {
ir.enterReader(true)
ir.mustReadHeader(IRKindString)
sz := int(ir.h.value())
szWire := alignSize(sz)
if szWire > irMaxStringLength {
panic(IRStringError("\x00"))
}
p := make([]byte, szWire)
ir.mustRead(p)
return p[:sz]
}
// ReadString reads the next value as [IRKindString].
func (ir *IRReader) ReadString() string {
p := ir.ReadStringBytes()
return unsafe.String(unsafe.SliceData(p), len(p))
}
// decode decodes the next [Artifact] in the IR stream and returns any buffer
// originating from [Cache] before returning. decode returns [io.EOF] if and
// only if the underlying [io.Reader] is already read to EOF.
func (d *IRDecoder) decode() (a Artifact, err error) {
defer panicToError(&err)
var ir IRReader
defer func() { ir.d = nil }()
ir.d = d
h := sha512.New384()
ir.ibw = d.c.getWriter(h)
defer d.c.putWriter(ir.ibw)
ir.r = io.TeeReader(d.r, ir.ibw)
if n, _err := ir.readFull(ir.buf[:]); _err != nil {
if errors.Is(_err, io.EOF) {
if n != 0 {
_err = io.ErrUnexpectedEOF
}
}
err = _err
return
}
ak := Kind(binary.LittleEndian.Uint64(ir.buf[:]))
f, ok := irArtifact[ak]
if !ok {
err = InvalidKindError(ak)
return
}
defer ir.putAll()
ir.mustRead(ir.buf[:])
sz := binary.LittleEndian.Uint64(ir.buf[:])
if sz > irMaxDeps {
err = ErrIRDepend
return
}
ir.deps = make([]*extIdent, sz)
for i := range ir.deps {
ir.deps[i] = d.c.getIdentBuf()
}
for _, buf := range ir.deps {
ir.mustRead(buf[:])
}
a = f(&ir)
if a == nil {
err = syscall.ENOTRECOVERABLE
return
}
if ir.count != -1 {
err = ErrRemainingIR
return
}
buf := d.c.getIdentBuf()
h.Sum(buf[wordSize:wordSize])
id := unique.Make(ID(buf[wordSize:]))
d.c.putIdentBuf(buf)
if _, ok = d.ident[id]; !ok {
d.ident[id] = a
} else {
d.c.msg.Verbosef(
"artifact %s appeared more than once in IR stream",
Encode(id.Value()),
)
}
return
}
// Decode consumes the IR stream to EOF and returns the final [Artifact]. After
// Decode returns, Lookup is available and Decode must not be called again.
func (d *IRDecoder) Decode() (a Artifact, err error) {
if d.done {
panic("attempting to decode an IR stream twice")
}
defer func() { d.done = true }()
var cur Artifact
next:
a, err = d.decode()
if err == nil {
cur = a
goto next
}
if errors.Is(err, io.EOF) {
a, err = cur, nil
d.ok = true
}
return
}
// Lookup looks up an [Artifact] described by the IR stream by its identifier.
func (d *IRDecoder) Lookup(id unique.Handle[ID]) (a Artifact, ok bool) {
if !d.ok {
panic("attempting to look up artifact without full IR stream")
}
a, ok = d.ident[id]
return
}
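// exampleLookup is a minimal, hypothetical sketch of retrieving an
// intermediate artifact by identifier: the stream must first be decoded to
// EOF, since Lookup panics unless the entire IR stream was consumed.
func exampleLookup(d *IRDecoder, id unique.Handle[ID]) (Artifact, bool) {
	if _, err := d.Decode(); err != nil {
		return nil, false
	}
	return d.Lookup(id)
}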

114
internal/pkg/ir_test.go Normal file

@@ -0,0 +1,114 @@
package pkg_test
import (
"bytes"
"io"
"reflect"
"testing"
"hakurei.app/container/check"
"hakurei.app/internal/pkg"
)
func TestIRRoundtrip(t *testing.T) {
t.Parallel()
testCases := []struct {
name string
a pkg.Artifact
}{
{"http get aligned", pkg.NewHTTPGet(
nil, "file:///testdata",
pkg.Checksum(bytes.Repeat([]byte{0xfd}, len(pkg.Checksum{}))),
)},
{"http get unaligned", pkg.NewHTTPGet(
nil, "https://hakurei.app",
pkg.Checksum(bytes.Repeat([]byte{0xfc}, len(pkg.Checksum{}))),
)},
{"http get tar", pkg.NewHTTPGetTar(
nil, "file:///testdata",
pkg.Checksum(bytes.Repeat([]byte{0xff}, len(pkg.Checksum{}))),
pkg.TarBzip2,
)},
{"http get tar unaligned", pkg.NewHTTPGetTar(
nil, "https://hakurei.app",
pkg.Checksum(bytes.Repeat([]byte{0xfe}, len(pkg.Checksum{}))),
pkg.TarUncompressed,
)},
{"exec offline", pkg.NewExec(
"exec-offline", nil, 0, false,
pkg.AbsWork,
[]string{"HAKUREI_TEST=1"},
check.MustAbs("/opt/bin/testtool"),
[]string{"testtool"},
pkg.MustPath("/file", false, pkg.NewFile("file", []byte(
"stub file",
))), pkg.MustPath("/.hakurei", false, pkg.NewHTTPGetTar(
nil, "file:///hakurei.tar",
pkg.Checksum(bytes.Repeat([]byte{0xfc}, len(pkg.Checksum{}))),
pkg.TarUncompressed,
)), pkg.MustPath("/opt", false, pkg.NewHTTPGetTar(
nil, "file:///testtool.tar.gz",
pkg.Checksum(bytes.Repeat([]byte{0xfc}, len(pkg.Checksum{}))),
pkg.TarGzip,
)),
)},
{"exec net", pkg.NewExec(
"exec-net",
(*pkg.Checksum)(bytes.Repeat([]byte{0xfc}, len(pkg.Checksum{}))),
0, false,
pkg.AbsWork,
[]string{"HAKUREI_TEST=1"},
check.MustAbs("/opt/bin/testtool"),
[]string{"testtool", "net"},
pkg.MustPath("/file", false, pkg.NewFile("file", []byte(
"stub file",
))), pkg.MustPath("/.hakurei", false, pkg.NewHTTPGetTar(
nil, "file:///hakurei.tar",
pkg.Checksum(bytes.Repeat([]byte{0xfc}, len(pkg.Checksum{}))),
pkg.TarUncompressed,
)), pkg.MustPath("/opt", false, pkg.NewHTTPGetTar(
nil, "file:///testtool.tar.gz",
pkg.Checksum(bytes.Repeat([]byte{0xfc}, len(pkg.Checksum{}))),
pkg.TarGzip,
)),
)},
{"file anonymous", pkg.NewFile("", []byte{0})},
{"file", pkg.NewFile("stub", []byte("stub"))},
}
testCasesCache := make([]cacheTestCase, len(testCases))
for i, tc := range testCases {
want := tc.a
testCasesCache[i] = cacheTestCase{tc.name, nil,
func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
r, w := io.Pipe()
done := make(chan error, 1)
go func() {
t.Helper()
done <- c.EncodeAll(w, want)
_ = w.Close()
}()
if got, err := c.NewDecoder(r).Decode(); err != nil {
t.Fatalf("Decode: error = %v", err)
} else if !reflect.DeepEqual(got, want) {
t.Fatalf("Decode: %#v, want %#v", got, want)
}
if err := <-done; err != nil {
t.Fatalf("EncodeAll: error = %v", err)
}
}, pkg.MustDecode(
"E4vEZKhCcL2gPZ2Tt59FS3lDng-d_2SKa2i5G_RbDfwGn6EemptFaGLPUDiOa94C",
),
}
}
checkWithCache(t, testCasesCache)
}


@@ -19,8 +19,8 @@ type httpArtifact struct {
// closing the [io.ReadCloser] returned by Cure.
checksum unique.Handle[Checksum]
// doFunc is the Do method of [http.Client] supplied by the caller.
doFunc func(req *http.Request) (*http.Response, error)
// client is the address of the caller-supplied [http.Client].
client *http.Client
}
var _ KnownChecksum = new(httpArtifact)
@@ -33,10 +33,7 @@ func NewHTTPGet(
url string,
checksum Checksum,
) FileArtifact {
if c == nil {
c = http.DefaultClient
}
return &httpArtifact{url: url, checksum: unique.Make(checksum), doFunc: c.Do}
return &httpArtifact{url: url, checksum: unique.Make(checksum), client: c}
}
// Kind returns the hardcoded [Kind] constant.
@@ -44,8 +41,17 @@ func (*httpArtifact) Kind() Kind { return KindHTTPGet }
// Params writes the backing url string. Client is not represented as it does
// not affect [Cache.Cure] outcome.
func (a *httpArtifact) Params(ctx *IContext) {
ctx.GetHash().Write([]byte(a.url))
func (a *httpArtifact) Params(ctx *IContext) { ctx.WriteString(a.url) }
func init() {
register(KindHTTPGet, func(r *IRReader) Artifact {
url := r.ReadString()
checksum, ok := r.Finalise()
if !ok {
panic(ErrExpectedChecksum)
}
return NewHTTPGet(nil, url, checksum.Value())
})
}
// Dependencies returns a nil slice.
@@ -78,9 +84,15 @@ func (a *httpArtifact) Cure(r *RContext) (rc io.ReadCloser, err error) {
if err != nil {
return
}
req.Header.Set("User-Agent", "Hakurei/1.1")
c := a.client
if c == nil {
c = http.DefaultClient
}
var resp *http.Response
if resp, err = a.doFunc(req); err != nil {
if resp, err = c.Do(req); err != nil {
return
}


@@ -109,7 +109,7 @@ func TestHTTPGet(t *testing.T) {
)
wantPathname := base.Append(
"identifier",
"NqVORkT6L9HX6Za7kT2zcibY10qFqBaxEjPiYFrBQX-ZFr3yxCzJxbKOP0zVjeWb",
"oM-2pUlk-mOxK1t3aMWZer69UdOQlAXiAgMrpZ1476VoOqpYVP1aGFS9_HYy-D8_",
)
if pathname, checksum, err := c.Cure(f); err != nil {
t.Fatalf("Cure: error = %v", err)
@@ -156,6 +156,6 @@ func TestHTTPGet(t *testing.T) {
if _, _, err := c.Cure(f); !reflect.DeepEqual(err, wantErrNotFound) {
t.Fatalf("Pathname: error = %#v, want %#v", err, wantErrNotFound)
}
}, pkg.MustDecode("bqtn69RkV5E7V7GhhgCFjcvbxmaqrO8DywamM4Tyjf10F6EJBHjXiIa_tFRtF4iN")},
}, pkg.MustDecode("L_0RFHpr9JUS4Zp14rz2dESSRvfLzpvqsLhR1-YjQt8hYlmEdVl7vI3_-v8UNPKs")},
})
}


@@ -65,35 +65,6 @@ func MustDecode(s string) (checksum Checksum) {
return
}
// IContext is passed to [Artifact.Params] and provides identifier information
// and the target [hash.Hash] for writing params into.
//
// Methods of IContext are safe for concurrent use. IContext is valid
// until [Artifact.Params] returns.
type IContext struct {
// Address of underlying [Cache], should be zeroed or made unusable after
// [Artifact.Params] returns and must not be exposed directly.
cache *Cache
// Made available for writing, should be zeroed after [Artifact.Params]
// returns. Internal state must not be inspected.
h hash.Hash
}
// Unwrap returns the underlying [context.Context].
func (i *IContext) Unwrap() context.Context { return i.cache.ctx }
// GetHash returns the underlying [hash.Hash] for writing. Callers must not
// attempt to inspect its internal state.
func (i *IContext) GetHash() hash.Hash { return i.h }
// WriteIdent writes the identifier of [Artifact] to the underlying [hash.Hash].
func (i *IContext) WriteIdent(a Artifact) {
buf := i.cache.getIdentBuf()
*(*ID)(buf[wordSize:]) = i.cache.Ident(a).Value()
i.h.Write(buf[wordSize:])
i.cache.putIdentBuf(buf)
}
// TContext is passed to [TrivialArtifact.Cure] and provides information and
// methods required for curing the [TrivialArtifact].
//
@@ -238,10 +209,12 @@ type Artifact interface {
// [Artifact] is allowed to return the same [Kind] value.
Kind() Kind
// Params writes opaque bytes that describes [Artifact]. Implementations
// Params writes deterministic values describing [Artifact]. Implementations
// must guarantee that these values are unique among differing instances
// of the same implementation with the same dependencies. Callers must not
// attempt to interpret these params.
// of the same implementation with identical dependencies and conveys enough
// information to create another instance of [Artifact] identical to the
// instance emitting these values. The new instance created via [IRReadFunc]
// from these values must then produce identical IR values.
//
// Result must remain identical across multiple invocations.
Params(ctx *IContext)
@@ -564,38 +537,13 @@ func (c *Cache) unsafeIdent(a Artifact, encodeKind bool) (
return
}
deps := a.Dependencies()
idents := make([]*extIdent, len(deps))
for i, d := range deps {
dbuf, did := c.unsafeIdent(d, true)
if dbuf == nil {
dbuf = c.getIdentBuf()
binary.LittleEndian.PutUint64(dbuf[:], uint64(d.Kind()))
*(*ID)(dbuf[wordSize:]) = did.Value()
} else {
c.storeIdent(d, dbuf)
}
defer c.putIdentBuf(dbuf)
idents[i] = dbuf
}
slices.SortFunc(idents, func(a, b *extIdent) int {
return bytes.Compare(a[:], b[:])
})
idents = slices.CompactFunc(idents, func(a, b *extIdent) bool {
return *a == *b
})
buf = c.getIdentBuf()
h := sha512.New384()
binary.LittleEndian.PutUint64(buf[:], uint64(a.Kind()))
h.Write(buf[:wordSize])
i := IContext{c, h}
a.Params(&i)
i.cache, i.h = nil, nil
for _, dn := range idents {
h.Write(dn[:])
if err := c.Encode(h, a); err != nil {
// unreachable
panic(err)
}
binary.LittleEndian.PutUint64(buf[:], uint64(a.Kind()))
h.Sum(buf[wordSize:wordSize])
return
}
@@ -991,8 +939,9 @@ func (c *Cache) openFile(f FileArtifact) (r io.ReadCloser, err error) {
return
}
// InvalidFileModeError describes an [Artifact.Cure] that did not result in
// a regular file or directory located at the work pathname.
// InvalidFileModeError describes a [FloodArtifact.Cure] or
// [TrivialArtifact.Cure] that did not result in a regular file or directory
// located at the work pathname.
type InvalidFileModeError fs.FileMode
// Error returns a constant string.
@@ -1000,8 +949,8 @@ func (e InvalidFileModeError) Error() string {
return "artifact did not produce a regular file or directory"
}
// NoOutputError describes an [Artifact.Cure] that did not populate its
// work pathname despite completing successfully.
// NoOutputError describes a [FloodArtifact.Cure] or [TrivialArtifact.Cure]
// that did not populate its work pathname despite completing successfully.
type NoOutputError struct{}
// Unwrap returns [os.ErrNotExist].
@@ -1179,39 +1128,38 @@ func (e *CureError) Error() string { return e.Err.Error() }
// A DependencyCureError wraps errors returned while curing dependencies.
type DependencyCureError []*CureError
// sort sorts underlying errors by their identifier.
func (e *DependencyCureError) sort() {
var identBuf [2]ID
slices.SortFunc(*e, func(a, b *CureError) int {
identBuf[0], identBuf[1] = a.Ident.Value(), b.Ident.Value()
return slices.Compare(identBuf[0][:], identBuf[1][:])
})
// unwrapM recursively expands underlying errors into a caller-supplied map.
func (e *DependencyCureError) unwrapM(me map[unique.Handle[ID]]*CureError) {
for _, err := range *e {
if _e, ok := err.Err.(*DependencyCureError); ok {
_e.unwrapM(me)
continue
}
me[err.Ident] = err
}
}
// unwrap recursively expands and deduplicates underlying errors.
func (e *DependencyCureError) unwrap() DependencyCureError {
errs := make(DependencyCureError, 0, len(*e))
for _, err := range *e {
if _e, ok := err.Err.(*DependencyCureError); ok {
errs = append(errs, _e.unwrap()...)
continue
}
errs = append(errs, err)
}
me := make(map[unique.Handle[ID]]*CureError, len(errs))
for _, err := range errs {
me[err.Ident] = err
}
return slices.AppendSeq(
me := make(map[unique.Handle[ID]]*CureError)
e.unwrapM(me)
errs := slices.AppendSeq(
make(DependencyCureError, 0, len(me)),
maps.Values(me),
)
var identBuf [2]ID
slices.SortFunc(errs, func(a, b *CureError) int {
identBuf[0], identBuf[1] = a.Ident.Value(), b.Ident.Value()
return slices.Compare(identBuf[0][:], identBuf[1][:])
})
return errs
}
// Unwrap returns a deduplicated slice of underlying errors.
func (e *DependencyCureError) Unwrap() []error {
errs := e.unwrap()
errs.sort()
_errs := make([]error, len(errs))
for i, err := range errs {
_errs[i] = err
@@ -1222,7 +1170,6 @@ func (e *DependencyCureError) Unwrap() []error {
// Error returns a user-facing multiline error message.
func (e *DependencyCureError) Error() string {
errs := e.unwrap()
errs.sort()
if len(errs) == 0 {
return "invalid dependency cure outcome"
}


@@ -15,6 +15,7 @@ import (
"os"
"path/filepath"
"reflect"
"strconv"
"syscall"
"testing"
"unique"
@@ -93,7 +94,7 @@ type stubArtifact struct {
}
func (a *stubArtifact) Kind() pkg.Kind { return a.kind }
func (a *stubArtifact) Params(ctx *pkg.IContext) { ctx.GetHash().Write(a.params) }
func (a *stubArtifact) Params(ctx *pkg.IContext) { ctx.Write(a.params) }
func (a *stubArtifact) Dependencies() []pkg.Artifact { return a.deps }
func (a *stubArtifact) Cure(t *pkg.TContext) error { return a.cure(t) }
func (*stubArtifact) IsExclusive() bool { return false }
@@ -109,7 +110,7 @@ type stubArtifactF struct {
}
func (a *stubArtifactF) Kind() pkg.Kind { return a.kind }
func (a *stubArtifactF) Params(ctx *pkg.IContext) { ctx.GetHash().Write(a.params) }
func (a *stubArtifactF) Params(ctx *pkg.IContext) { ctx.Write(a.params) }
func (a *stubArtifactF) Dependencies() []pkg.Artifact { return a.deps }
func (a *stubArtifactF) Cure(f *pkg.FContext) error { return a.cure(f) }
func (a *stubArtifactF) IsExclusive() bool { return a.excl }
@@ -218,7 +219,7 @@ func TestIdent(t *testing.T) {
},
nil,
}, unique.Make[pkg.ID](pkg.MustDecode(
"HnySzeLQvSBZuTUcvfmLEX_OmH4yJWWH788NxuLuv7kVn8_uPM6Ks4rqFWM2NZJY",
"WKErnjTOVbuH2P9a0gM4OcAAO4p-CoX2HQu7CbZrg8ZOzApvWoO3-ISzPw6av_rN",
))},
}
@@ -531,7 +532,7 @@ func TestCache(t *testing.T) {
kind: pkg.KindExec,
params: []byte("artifact overridden to be incomplete"),
}}, nil, pkg.Checksum{}, pkg.InvalidArtifactError(pkg.MustDecode(
"da4kLKa94g1wN2M0qcKflqgf2-Y2UL36iehhczqsIIW8G0LGvM7S8jjtnBc0ftB0",
"E__uZ1sLIvb84vzSm5Uezb03RogsiaeTt1nfIVv8TKnnf4LqwtSi-smdHhlkZrUJ",
))},
{"error passthrough", newStubFile(
@@ -953,6 +954,17 @@ func TestErrors(t *testing.T) {
{"NoOutputError", pkg.NoOutputError{
// empty struct
}, "artifact cured successfully but did not produce any output"},
{"IRKindError", &pkg.IRKindError{
Got: pkg.IRKindEnd,
Want: pkg.IRKindIdent,
Ancillary: 0xcafebabe,
}, "got terminator IR value (0xcafebabe) instead of ident"},
{"IRKindError invalid", &pkg.IRKindError{
Got: 0xbeef,
Want: pkg.IRKindIdent,
Ancillary: 0xcafe,
}, "got invalid kind 48879 IR value (0xcafe) instead of ident"},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
@@ -1137,6 +1149,40 @@ func TestDependencyCureError(t *testing.T) {
}
}
// earlyFailureF is a [FloodArtifact] with a large dependency graph resulting in
// a large [DependencyCureError].
type earlyFailureF int
func (earlyFailureF) Kind() pkg.Kind { return pkg.KindExec }
func (earlyFailureF) Params(*pkg.IContext) {}
func (earlyFailureF) IsExclusive() bool { return false }
func (a earlyFailureF) Dependencies() []pkg.Artifact {
deps := make([]pkg.Artifact, a)
for i := range deps {
deps[i] = a - 1
}
return deps
}
func (a earlyFailureF) Cure(*pkg.FContext) error {
if a != 0 {
panic("unexpected cure on " + strconv.Itoa(int(a)))
}
return stub.UniqueError(0xcafe)
}
func TestDependencyCureErrorEarly(t *testing.T) {
checkWithCache(t, []cacheTestCase{
{"early", nil, func(t *testing.T, _ *check.Absolute, c *pkg.Cache) {
_, _, err := c.Cure(earlyFailureF(8))
if !errors.Is(err, stub.UniqueError(0xcafe)) {
t.Fatalf("Cure: error = %v", err)
}
}, pkg.MustDecode("E4vEZKhCcL2gPZ2Tt59FS3lDng-d_2SKa2i5G_RbDfwGn6EemptFaGLPUDiOa94C")},
})
}
func TestNew(t *testing.T) {
t.Parallel()


@@ -4,7 +4,6 @@ import (
"archive/tar"
"compress/bzip2"
"compress/gzip"
"encoding/binary"
"errors"
"fmt"
"io"
@@ -29,7 +28,7 @@ type tarArtifact struct {
// Caller-supplied backing tarball.
f Artifact
// Compression on top of the tarball.
compression uint64
compression uint32
}
// tarArtifactNamed embeds tarArtifact for a [fmt.Stringer] tarball.
@@ -47,7 +46,7 @@ func (a *tarArtifactNamed) String() string { return a.name + "-unpack" }
// NewTar returns a new [Artifact] backed by the supplied [Artifact] and
// compression method. The source [Artifact] must be compatible with
// [TContext.Open].
func NewTar(a Artifact, compression uint64) Artifact {
func NewTar(a Artifact, compression uint32) Artifact {
ta := tarArtifact{a, compression}
if s, ok := a.(fmt.Stringer); ok {
if name := s.String(); name != "" {
@@ -62,7 +61,7 @@ func NewHTTPGetTar(
hc *http.Client,
url string,
checksum Checksum,
compression uint64,
compression uint32,
) Artifact {
return NewTar(NewHTTPGet(hc, url, checksum), compression)
}
@@ -71,8 +70,16 @@ func NewHTTPGetTar(
func (a *tarArtifact) Kind() Kind { return KindTar }
// Params writes compression encoded in little endian.
func (a *tarArtifact) Params(ctx *IContext) {
ctx.GetHash().Write(binary.LittleEndian.AppendUint64(nil, a.compression))
func (a *tarArtifact) Params(ctx *IContext) { ctx.WriteUint32(a.compression) }
func init() {
register(KindTar, func(r *IRReader) Artifact {
a := NewTar(r.Next(), r.ReadUint32())
if _, ok := r.Finalise(); ok {
panic(ErrUnexpectedChecksum)
}
return a
})
}
// Dependencies returns a slice containing the backing file.


@@ -40,7 +40,7 @@ func TestTar(t *testing.T) {
}, pkg.MustDecode(
"cTw0h3AmYe7XudSoyEMByduYXqGi-N5ZkTZ0t9K5elsu3i_jNIVF5T08KR1roBFM",
))
}, pkg.MustDecode("sxbgyX-bPoezbha214n2lbQhiVfTUBkhZ0EX6zI7mmkMdrCdwuMwhMBJphLQsy94")},
}, pkg.MustDecode("NQTlc466JmSVLIyWklm_u8_g95jEEb98PxJU-kjwxLpfdjwMWJq0G8ze9R4Vo1Vu")},
{"http expand", nil, func(t *testing.T, base *check.Absolute, c *pkg.Cache) {
checkTarHTTP(t, base, c, fstest.MapFS{
@@ -51,7 +51,7 @@ func TestTar(t *testing.T) {
}, pkg.MustDecode(
"CH3AiUrCCcVOjOYLaMKKK1Da78989JtfHeIsxMzWOQFiN4mrCLDYpoDxLWqJWCUN",
))
}, pkg.MustDecode("4I8wx_h7NSJTlG5lbuz-GGEXrOg0GYC3M_503LYEBhv5XGWXfNIdIY9Q3eVSYldX")},
}, pkg.MustDecode("hSoSSgCYTNonX3Q8FjvjD1fBl-E-BQyA6OTXro2OadXqbST4tZ-akGXszdeqphRe")},
})
}
@@ -98,14 +98,37 @@ func checkTarHTTP(
wantIdent := func() pkg.ID {
h := sha512.New384()
h.Write([]byte{byte(pkg.KindTar), 0, 0, 0, 0, 0, 0, 0})
h.Write([]byte{pkg.TarGzip, 0, 0, 0, 0, 0, 0, 0})
h.Write([]byte{byte(pkg.KindHTTPGet), 0, 0, 0, 0, 0, 0, 0})
// kind uint64
h.Write([]byte{byte(pkg.KindTar), 0, 0, 0, 0, 0, 0, 0})
// deps_sz uint64
h.Write([]byte{1, 0, 0, 0, 0, 0, 0, 0})
// kind uint64
h.Write([]byte{byte(pkg.KindHTTPGet), 0, 0, 0, 0, 0, 0, 0})
// ident ID
h0 := sha512.New384()
// kind uint64
h0.Write([]byte{byte(pkg.KindHTTPGet), 0, 0, 0, 0, 0, 0, 0})
// deps_sz uint64
h0.Write([]byte{0, 0, 0, 0, 0, 0, 0, 0})
// url string
h0.Write([]byte{byte(pkg.IRKindString), 0, 0, 0})
h0.Write([]byte{0x10, 0, 0, 0})
h0.Write([]byte("file:///testdata"))
// end(KnownChecksum)
h0.Write([]byte{byte(pkg.IRKindEnd), 0, 0, 0})
h0.Write([]byte{byte(pkg.IREndKnownChecksum), 0, 0, 0})
// checksum Checksum
h0.Write(testdataChecksum[:])
h.Write(h0.Sum(nil))
// compression uint32
h.Write([]byte{byte(pkg.IRKindUint32), 0, 0, 0})
h.Write([]byte{pkg.TarGzip, 0, 0, 0})
// end
h.Write([]byte{byte(pkg.IRKindEnd), 0, 0, 0})
h.Write([]byte{0, 0, 0, 0})
return pkg.ID(h.Sum(nil))
}()


@@ -142,12 +142,12 @@ func main() {
}
const checksumEmptyDir = "MGWmEfjut2QE2xPJwTsmUzpff4BN_FEnQ7T0j7gvUCCiugJQNwqt9m151fm9D1yU"
ident := "U2cbgVgEtjfRuvHfE1cQnZ3t8yoexULQyo_VLgvxAVJSsobMcNaFIsuDWtmt7kzK"
ident := "dztPS6jRjiZtCF4_p8AzfnxGp6obkhrgFVsxdodbKWUoAEVtDz3MykepJB4kI_ks"
log.Println(m)
next := func() { m = m.Next; log.Println(m) }
if overlayRoot {
ident = "5ey2wpmMpj483YYa7ZZQciYLA2cx3_l167JCqWW4Pd-5DVp81dj9EsBtVTwYptF6"
ident = "RdMA-mubnrHuu3Ky1wWyxauSYCO0ZH_zCPUj3uDHqkfwv5sGcByoF_g5PjlGiClb"
if m.Root != "/" || m.Target != "/" ||
m.Source != "overlay" || m.FsType != "overlay" {
@@ -165,7 +165,7 @@ func main() {
log.Fatal("unexpected artifact checksum")
}
} else {
ident = "tfjrsVuBuFgzWgwz-yPppFtylYuC1VFWnKhyBiHbWTGkyz8lt7Ee9QXWaIHPXs4x"
ident = "p1t_drXr34i-jZNuxDMLaMOdL6tZvQqhavNafGynGqxOZoXAUTSn7kqNh3Ovv3DT"
lowerdirsEscaped := strings.Split(lowerdir, ":")
lowerdirs := lowerdirsEscaped[:0]
@@ -194,7 +194,7 @@ func main() {
}
} else {
if hostNet {
ident = "QdsJhGgnk5N2xdUNGcndXQxFKifxf1V_2t9X8CQ-pDcg24x6mGJC_BiLfGbs6Qml"
ident = "G8qPxD9puvvoOVV7lrT80eyDeIl3G_CCFoKw12c8mCjMdG1zF7NEPkwYpNubClK3"
}
if m.Root != "/sysroot" || m.Target != "/" {
@@ -213,14 +213,14 @@ func main() {
}
if promote {
ident = "O-6VjlIUxc4PYLf5v35uhIeL8kkYCbHYklqlmDjFPXe0m4j6GkUDg5qwTzBRESnf"
ident = "xXTIYcXmgJWNLC91c417RRrNM9cjELwEZHpGvf8Fk_GNP5agRJp_SicD0w9aMeLJ"
}
next() // testtool artifact
next()
if overlayWork {
ident = "acaDzHZv40dZaz4cGAXayqbRMgbEOuiuiUijZL8IgDQvyeCNMFE3onBMYfny-kXA"
ident = "5hlaukCirnXE4W_RSLJFOZN47Z5RiHnacXzdFp_70cLgiJUGR6cSb_HaFftkzi0-"
if m.Root != "/" || m.Target != "/work" ||
m.Source != "overlay" || m.FsType != "overlay" {
log.Fatal("unexpected work mount entry")


@@ -7,23 +7,9 @@ func (t Toolchain) newAttr() pkg.Artifact {
version = "2.5.2"
checksum = "YWEphrz6vg1sUMmHHVr1CRo53pFXRhq_pjN-AlG8UgwZK1y6m7zuDhxqJhD0SV0l"
)
return t.New("attr-"+version, false, []pkg.Artifact{
t.Load(Make),
t.Load(Perl),
}, nil, nil, `
ln -s ../../system/bin/perl /usr/bin
cd "$(mktemp -d)"
/usr/src/attr/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}" \
--enable-static
make "-j$(nproc)" check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("attr"), true, t.NewPatchedSource(
return t.NewViaMake("attr", version, t.NewPatchedSource(
"attr", version, pkg.NewHTTPGetTar(
nil,
"https://download.savannah.nongnu.org/releases/attr/"+
nil, "https://download.savannah.nongnu.org/releases/attr/"+
"attr-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
@@ -66,7 +52,16 @@ index 6ce2f9b..e9bde92 100644
$ setfattr -n user. -v value f
> setfattr: f: Invalid argument
`},
)))
), &MakeAttr{
ScriptEarly: `
ln -s ../../system/bin/perl /usr/bin
`,
Configure: [][2]string{
{"enable-static"},
},
},
t.Load(Perl),
)
}
func init() { artifactsF[Attr] = Toolchain.newAttr }
@@ -75,24 +70,21 @@ func (t Toolchain) newACL() pkg.Artifact {
version = "2.3.2"
checksum = "-fY5nwH4K8ZHBCRXrzLdguPkqjKI6WIiGu4dBtrZ1o0t6AIU73w8wwJz_UyjIS0P"
)
return t.New("acl-"+version, false, []pkg.Artifact{
t.Load(Make),
t.Load(Attr),
}, nil, nil, `
cd "$(mktemp -d)"
/usr/src/acl/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}" \
--enable-static
make "-j$(nproc)"
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("acl"), true, pkg.NewHTTPGetTar(
return t.NewViaMake("acl", version, pkg.NewHTTPGetTar(
nil,
"https://download.savannah.nongnu.org/releases/acl/"+
"acl-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
), &MakeAttr{
Configure: [][2]string{
{"enable-static"},
},
// makes assumptions about uid_map/gid_map
SkipCheck: true,
},
t.Load(Attr),
)
}
func init() { artifactsF[ACL] = Toolchain.newACL }


@@ -13,40 +13,75 @@ const (
ACL PArtifact = iota
Attr
Autoconf
Automake
Bash
Busybox
Binutils
CMake
Coreutils
Curl
Diffutils
Findutils
Fuse
Gawk
GMP
Gettext
Git
Go
Gperf
Grep
Gzip
Hakurei
HakureiDist
IniConfig
KernelHeaders
LibXau
Libexpat
Libpsl
Libffi
Libgd
Libtool
Libseccomp
Libucontext
Libxml2
M4
MPC
MPFR
Make
Meson
Mksh
NSS
NSSCACert
Ninja
OpenSSL
Packaging
Patch
Perl
PkgConfig
Pluggy
PyTest
Pygments
Python
Rsync
Sed
Setuptools
Toybox
toyboxEarly
Unzip
utilMacros
Wayland
WaylandProtocols
XCB
XCBProto
Xproto
XZ
Zlib
buildcatrust
// gcc is a hacked-to-pieces GCC toolchain meant for use in intermediate
// stages only. This preset and its direct output must never be exposed.
gcc
// _presetEnd is the total number of presets and does not denote a preset.
_presetEnd
)
@@ -68,3 +103,73 @@ func (t Toolchain) Load(p PArtifact) pkg.Artifact {
})
return artifacts[t][p]
}
// ResolveName returns a [PArtifact] by name.
func ResolveName(name string) (p PArtifact, ok bool) {
p, ok = map[string]PArtifact{
"acl": ACL,
"attr": Attr,
"autoconf": Autoconf,
"automake": Automake,
"bash": Bash,
"binutils": Binutils,
"cmake": CMake,
"coreutils": Coreutils,
"curl": Curl,
"diffutils": Diffutils,
"findutils": Findutils,
"fuse": Fuse,
"gawk": Gawk,
"gmp": GMP,
"gettext": Gettext,
"git": Git,
"go": Go,
"gperf": Gperf,
"grep": Grep,
"gzip": Gzip,
"hakurei": Hakurei,
"hakurei-dist": HakureiDist,
"iniconfig": IniConfig,
"kernel-headers": KernelHeaders,
"libXau": LibXau,
"libexpat": Libexpat,
"libpsl": Libpsl,
"libseccomp": Libseccomp,
"libucontext": Libucontext,
"libxml2": Libxml2,
"libffi": Libffi,
"libgd": Libgd,
"libtool": Libtool,
"m4": M4,
"mpc": MPC,
"mpfr": MPFR,
"make": Make,
"meson": Meson,
"mksh": Mksh,
"nss": NSS,
"nss-cacert": NSSCACert,
"ninja": Ninja,
"openssl": OpenSSL,
"packaging": Packaging,
"patch": Patch,
"perl": Perl,
"pkg-config": PkgConfig,
"pluggy": Pluggy,
"pytest": PyTest,
"pygments": Pygments,
"python": Python,
"rsync": Rsync,
"sed": Sed,
"setuptools": Setuptools,
"toybox": Toybox,
"unzip": Unzip,
"wayland": Wayland,
"wayland-protocols": WaylandProtocols,
"xcb": XCB,
"xcb-proto": XCBProto,
"xproto": Xproto,
"xz": XZ,
"zlib": Zlib,
}[name]
return
}
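// exampleLoadByName is a minimal, hypothetical sketch of turning a user-facing
// preset name into the corresponding preset artifact via ResolveName and
// [Toolchain.Load].
func exampleLoadByName(t Toolchain, name string) (pkg.Artifact, bool) {
	p, ok := ResolveName(name)
	if !ok {
		return nil, false
	}
	return t.Load(p), true
}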


@@ -5,8 +5,7 @@ import (
"io"
"net/http"
"os"
"slices"
"strings"
"runtime"
"time"
"hakurei.app/container/fhs"
@@ -33,6 +32,16 @@ func (a busyboxBin) Dependencies() []pkg.Artifact {
return []pkg.Artifact{a.bin}
}
func init() {
pkg.Register(kindBusyboxBin, func(r *pkg.IRReader) pkg.Artifact {
a := busyboxBin{r.Next().(pkg.FileArtifact)}
if _, ok := r.Finalise(); ok {
panic(pkg.ErrUnexpectedChecksum)
}
return a
})
}
// String returns the reporting name of the underlying file prefixed with expand.
func (a busyboxBin) String() string {
return "expand-" + a.bin.(fmt.Stringer).String()
@@ -78,10 +87,23 @@ func (a busyboxBin) Cure(t *pkg.TContext) (err error) {
// newBusyboxBin returns a [pkg.Artifact] containing a busybox installation from
// the https://busybox.net/downloads/binaries/ binary release.
func newBusyboxBin() pkg.Artifact {
const (
version = "1.35.0"
var version, url, checksum string
switch runtime.GOARCH {
case "amd64":
version = "1.35.0"
url = "https://busybox.net/downloads/binaries/" +
version + "-" + linuxArch() + "-linux-musl/busybox"
checksum = "L7OBIsPu9enNHn7FqpBT1kOg_mCLNmetSeNMA3i4Y60Z5jTgnlX3qX3zcQtLx5AB"
)
case "arm64":
version = "1.31.0"
url = "https://busybox.net/downloads/binaries/" +
version + "-defconfig-multiarch-musl/busybox-armv8l"
checksum = "npJjBO7iwhjW6Kx2aXeSxf8kXhVgTCDChOZTTsI8ZfFfa3tbsklxRiidZQdrVERg"
default:
panic("unsupported target " + runtime.GOARCH)
}
return pkg.NewExec(
"busybox-bin-"+version, nil, pkg.ExecTimeoutMax, false,
fhs.AbsRoot, []string{
@@ -96,264 +118,8 @@ func newBusyboxBin() pkg.Artifact {
&http.Client{Transport: &http.Transport{
// busybox website is really slow to respond
TLSHandshakeTimeout: 2 * time.Minute,
}},
"https://busybox.net/downloads/binaries/"+
version+"-"+linuxArch()+"-linux-musl/busybox",
}}, url,
mustDecode(checksum),
)}),
)
}
func (t Toolchain) newBusybox() pkg.Artifact {
const (
version = "1.37.0"
checksum = "Ial94Tnt7esJ_YEeb0AxunVL6MGYFyOw7Rtu2o87CXCi1TLrc6rlznVsN1rZk7it"
)
var env []string
if t == toolchainStage3 {
env = append(env, "EXTRA_LDFLAGS=-static")
}
return t.New("busybox-"+version, false, stage3Concat(t, []pkg.Artifact{},
t.Load(Make),
t.Load(KernelHeaders),
), nil, slices.Concat([]string{
"ROSA_BUSYBOX_ENABLE=" + strings.Join([]string{
"STATIC",
"PIE",
}, " "),
"ROSA_BUSYBOX_DISABLE=" + strings.Join([]string{
"FEATURE_IPV6",
"FEATURE_PREFER_IPV4_ADDRESS",
"FEATURE_HWIB",
"ARP",
"ARPING",
"BRCTL",
"FEATURE_BRCTL_FANCY",
"FEATURE_BRCTL_SHOW",
"DNSD",
"ETHER_WAKE",
"FTPD",
"FEATURE_FTPD_WRITE",
"FEATURE_FTPD_ACCEPT_BROKEN_LIST",
"FEATURE_FTPD_AUTHENTICATION",
"FTPGET",
"FTPPUT",
"FEATURE_FTPGETPUT_LONG_OPTIONS",
"HOSTNAME",
"DNSDOMAINNAME",
"HTTPD",
"FEATURE_HTTPD_PORT_DEFAULT",
"FEATURE_HTTPD_RANGES",
"FEATURE_HTTPD_SETUID",
"FEATURE_HTTPD_BASIC_AUTH",
"FEATURE_HTTPD_AUTH_MD5",
"FEATURE_HTTPD_CGI",
"FEATURE_HTTPD_CONFIG_WITH_SCRIPT_INTERPR",
"FEATURE_HTTPD_SET_REMOTE_PORT_TO_ENV",
"FEATURE_HTTPD_ENCODE_URL_STR",
"FEATURE_HTTPD_ERROR_PAGES",
"FEATURE_HTTPD_PROXY",
"FEATURE_HTTPD_GZIP",
"FEATURE_HTTPD_ETAG",
"FEATURE_HTTPD_LAST_MODIFIED",
"FEATURE_HTTPD_DATE",
"FEATURE_HTTPD_ACL_IP",
"IFCONFIG",
"FEATURE_IFCONFIG_STATUS",
"FEATURE_IFCONFIG_SLIP",
"FEATURE_IFCONFIG_MEMSTART_IOADDR_IRQ",
"FEATURE_IFCONFIG_HW",
"FEATURE_IFCONFIG_BROADCAST_PLUS",
"IFENSLAVE",
"IFPLUGD",
"IFUP",
"IFDOWN",
"IFUPDOWN_IFSTATE_PATH",
"FEATURE_IFUPDOWN_IP",
"FEATURE_IFUPDOWN_IPV4",
"FEATURE_IFUPDOWN_IPV6",
"FEATURE_IFUPDOWN_MAPPING",
"INETD",
"FEATURE_INETD_SUPPORT_BUILTIN_ECHO",
"FEATURE_INETD_SUPPORT_BUILTIN_DISCARD",
"FEATURE_INETD_SUPPORT_BUILTIN_TIME",
"FEATURE_INETD_SUPPORT_BUILTIN_DAYTIME",
"FEATURE_INETD_SUPPORT_BUILTIN_CHARGEN",
"IP",
"IPADDR",
"IPLINK",
"IPROUTE",
"IPTUNNEL",
"IPRULE",
"IPNEIGH",
"FEATURE_IP_ADDRESS",
"FEATURE_IP_LINK",
"FEATURE_IP_LINK_CAN",
"FEATURE_IP_ROUTE",
"FEATURE_IP_ROUTE_DIR",
"FEATURE_IP_TUNNEL",
"FEATURE_IP_RULE",
"FEATURE_IP_NEIGH",
"IPCALC",
"FEATURE_IPCALC_LONG_OPTIONS",
"FEATURE_IPCALC_FANCY",
"FAKEIDENTD",
"NAMEIF",
"FEATURE_NAMEIF_EXTENDED",
"NBDCLIENT",
"NC",
"NC_SERVER",
"NC_EXTRA",
"NC_110_COMPAT",
"NETSTAT",
"FEATURE_NETSTAT_WIDE",
"FEATURE_NETSTAT_PRG",
"NSLOOKUP",
"FEATURE_NSLOOKUP_BIG",
"FEATURE_NSLOOKUP_LONG_OPTIONS",
"NTPD",
"FEATURE_NTPD_SERVER",
"FEATURE_NTPD_CONF",
"FEATURE_NTP_AUTH",
"PING",
"PING6",
"FEATURE_FANCY_PING",
"PSCAN",
"ROUTE",
"SLATTACH",
"SSL_CLIENT",
"TC",
"FEATURE_TC_INGRESS",
"TCPSVD",
"UDPSVD",
"TELNET",
"FEATURE_TELNET_TTYPE",
"FEATURE_TELNET_AUTOLOGIN",
"FEATURE_TELNET_WIDTH",
"TELNETD",
"FEATURE_TELNETD_STANDALONE",
"FEATURE_TELNETD_PORT_DEFAULT",
"FEATURE_TELNETD_INETD_WAIT",
"TFTP",
"FEATURE_TFTP_PROGRESS_BAR",
"FEATURE_TFTP_HPA_COMPAT",
"TFTPD",
"FEATURE_TFTP_GET",
"FEATURE_TFTP_PUT",
"FEATURE_TFTP_BLOCKSIZE",
"TLS",
"TRACEROUTE",
"TRACEROUTE6",
"FEATURE_TRACEROUTE_VERBOSE",
"FEATURE_TRACEROUTE_USE_ICMP",
"TUNCTL",
"FEATURE_TUNCTL_UG",
"VCONFIG",
"WGET",
"FEATURE_WGET_LONG_OPTIONS",
"FEATURE_WGET_STATUSBAR",
"FEATURE_WGET_FTP",
"FEATURE_WGET_AUTHENTICATION",
"FEATURE_WGET_TIMEOUT",
"FEATURE_WGET_HTTPS",
"FEATURE_WGET_OPENSSL",
"WHOIS",
"ZCIP",
"UDHCPD",
"FEATURE_UDHCPD_BOOTP",
"FEATURE_UDHCPD_WRITE_LEASES_EARLY",
"DHCPD_LEASES_FILE",
"DUMPLEASES",
"DHCPRELAY",
"UDHCPC",
"FEATURE_UDHCPC_ARPING",
"FEATURE_UDHCPC_SANITIZEOPT",
"UDHCPC_DEFAULT_SCRIPT",
"UDHCPC6_DEFAULT_SCRIPT",
"UDHCPC6",
"FEATURE_UDHCPC6_RFC3646",
"FEATURE_UDHCPC6_RFC4704",
"FEATURE_UDHCPC6_RFC4833",
"FEATURE_UDHCPC6_RFC5970",
}, " "),
}, env), `
config_enable() {
for ent in "$@"; do
sed "s/^# CONFIG_${ent}.*/CONFIG_${ent}=y/" -i .config
shift
done
}
config_disable() {
for ent in "$@"; do
sed "s/^CONFIG_${ent}=y/# CONFIG_${ent} is not set/" -i .config
shift
done
}
cat > /bin/gcc << EOF
exec clang \
-Wno-ignored-optimization-argument \
${LDFLAGS} \
\$@
EOF
chmod +x /bin/gcc
cd /usr/src/busybox
chmod +w editors editors/awk.c
patch -p 1 < /usr/src/patches/awk-fix-literal-backslash.patch
cd "$(mktemp -d)"
make \
KBUILD_SRC=/usr/src/busybox \
-f /usr/src/busybox/Makefile \
defconfig
config_enable $ROSA_BUSYBOX_ENABLE
config_disable $ROSA_BUSYBOX_DISABLE
ln -s ../system/bin/pwd /bin/pwd || true
make CFLAGS_busybox="${LDFLAGS} ${EXTRA_LDFLAGS}" "-j$(nproc)"
mkdir -p /system/bin/ /work/bin/
cp busybox /system/bin/
mkdir -pv /work/system/bin/
busybox --install -s /work/system/bin/
cp -v busybox /work/system/bin/
ln -vs ../system/bin/hush /work/bin/sh
mkdir -vp /work/usr/bin/
ln -vs ../../system/bin/busybox /work/usr/bin/env
`, pkg.Path(AbsUsrSrc.Append("busybox"), true, pkg.NewHTTPGetTar(
&http.Client{Transport: &http.Transport{
// busybox website is really slow to respond
TLSHandshakeTimeout: 2 * time.Minute,
}},
"https://busybox.net/downloads/busybox-"+version+".tar.bz2",
mustDecode(checksum),
pkg.TarBzip2,
)), pkg.Path(
AbsUsrSrc.Append("patches", "awk-fix-literal-backslash.patch"), false,
pkg.NewFile("awk-fix-literal-backslash.patch", []byte(`diff --git a/editors/awk.c b/editors/awk.c
index 64e752f4b..40f5ba7f7 100644
--- a/editors/awk.c
+++ b/editors/awk.c
@@ -2636,8 +2636,13 @@ static int awk_sub(node *rn, const char *repl, int nm, var *src, var *dest /*,in
resbuf = qrealloc(resbuf, residx + replen + n, &resbufsize);
memcpy(resbuf + residx, sp + pmatch[j].rm_so - start_ofs, n);
residx += n;
- } else
+ } else {
+/* '\\' and '&' following a backslash keep its original meaning, any other
+ * occurrence of a '\\' should be treated as literal */
+ if (bslash && c != '\\' && c != '&')
+ resbuf[residx++] = '\\';
resbuf[residx++] = c;
+ }
bslash = 0;
}
}`)),
))
}
func init() { artifactsF[Busybox] = Toolchain.newBusybox }


@@ -13,13 +13,10 @@ func (t Toolchain) newCMake() pkg.Artifact {
version = "4.2.1"
checksum = "Y3OdbMsob6Xk2y1DCME6z4Fryb5_TkFD7knRT8dTNIRtSqbiCJyyDN9AxggN_I75"
)
return t.New("cmake-"+version, false, []pkg.Artifact{
return t.New("cmake-"+version, 0, []pkg.Artifact{
t.Load(Make),
t.Load(KernelHeaders),
}, nil, nil, `
# expected to be writable in the copy made during bootstrap
chmod -R +w /usr/src/cmake/Tests
cd "$(mktemp -d)"
/usr/src/cmake/bootstrap \
--prefix=/system \
@@ -28,25 +25,25 @@ cd "$(mktemp -d)"
-DCMAKE_USE_OPENSSL=OFF
make "-j$(nproc)"
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("cmake"), true,
pkg.NewHTTPGetTar(
`, pkg.Path(AbsUsrSrc.Append("cmake"), true, t.NewPatchedSource(
// expected to be writable in the copy made during bootstrap
"cmake", version, pkg.NewHTTPGetTar(
nil, "https://github.com/Kitware/CMake/releases/download/"+
"v"+version+"/cmake-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
), false,
)))
}
func init() { artifactsF[CMake] = Toolchain.newCMake }
// CMakeAttr holds the project-specific attributes that will be applied to a new
// [pkg.Artifact] compiled via CMake.
// [pkg.Artifact] compiled via [CMake].
type CMakeAttr struct {
// Path elements joined with source.
Append []string
// Use source tree as scratch space.
Writable bool
// Dependencies concatenated with the build system itself.
Extra []pkg.Artifact
// CMake CACHE entries.
Cache [][2]string
@@ -54,21 +51,26 @@ type CMakeAttr struct {
Env []string
// Runs before cmake.
ScriptEarly string
// Runs after cmake.
// Runs after cmake, replaces default.
ScriptConfigured string
// Runs after install.
Script string
// Override the default installation prefix [AbsSystem].
Prefix *check.Absolute
// Return an exclusive artifact.
Exclusive bool
// Passed through to [Toolchain.New].
Paths []pkg.ExecPath
// Passed through to [Toolchain.New].
Flag int
}
// NewViaCMake returns a [pkg.Artifact] for compiling and installing via CMake.
// NewViaCMake returns a [pkg.Artifact] for compiling and installing via [CMake].
func (t Toolchain) NewViaCMake(
name, version, variant string,
source pkg.Artifact,
attr *CMakeAttr,
extra ...pkg.Artifact,
) pkg.Artifact {
if name == "" || version == "" || variant == "" {
panic("names must be non-empty")
@@ -84,11 +86,9 @@ func (t Toolchain) NewViaCMake(
panic("CACHE must be non-empty")
}
scriptEarly := attr.ScriptEarly
if attr.Writable {
scriptEarly = `
chmod -R +w "${ROSA_SOURCE}"
` + scriptEarly
scriptConfigured := "cmake --build .\ncmake --install .\n"
if attr.ScriptConfigured != "" {
scriptConfigured = attr.ScriptConfigured
}
prefix := attr.Prefix
@@ -97,14 +97,14 @@ chmod -R +w "${ROSA_SOURCE}"
}
sourcePath := AbsUsrSrc.Append(name)
return t.New(name+"-"+variant+"-"+version, attr.Exclusive, stage3Concat(t, attr.Extra,
return t.New(name+"-"+variant+"-"+version, attr.Flag, stage3Concat(t, extra,
t.Load(CMake),
t.Load(Ninja),
), nil, slices.Concat([]string{
"ROSA_SOURCE=" + sourcePath.String(),
"ROSA_CMAKE_SOURCE=" + sourcePath.Append(attr.Append...).String(),
"ROSA_INSTALL_PREFIX=/work" + prefix.String(),
}, attr.Env), scriptEarly+`
}, attr.Env), attr.ScriptEarly+`
mkdir /cure && cd /cure
cmake -G Ninja \
-DCMAKE_C_COMPILER_TARGET="${ROSA_TRIPLE}" \
@@ -119,7 +119,7 @@ cmake -G Ninja \
}), " \\\n\t")+` \
-DCMAKE_INSTALL_PREFIX="${ROSA_INSTALL_PREFIX}" \
"${ROSA_CMAKE_SOURCE}"
cmake --build .
cmake --install .
`+attr.Script, pkg.Path(sourcePath, attr.Writable, source))
`+scriptConfigured+attr.Script, slices.Concat([]pkg.ExecPath{
pkg.Path(sourcePath, attr.Writable, source),
}, attr.Paths)...)
}
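For orientation, a minimal sketch of how the reworked NewViaCMake is called after this change. The artifact name, version, cache entry and dependency below are hypothetical; the field names, the CACHE non-empty requirement, and the trailing variadic extras follow the CMakeAttr declaration and NewViaCMake signature above and the llvm.go usage later in this diff.

package rosa

import "hakurei.app/internal/pkg"

// newExampleViaCMake is a hypothetical artifact illustrating the new call
// shape; it is not part of the change above.
func (t Toolchain) newExampleViaCMake(source pkg.Artifact) pkg.Artifact {
	return t.NewViaCMake("example", "1.0.0", "static", source, &CMakeAttr{
		// Writable prepends the chmod -R +w "${ROSA_SOURCE}" prelude.
		Writable: true,
		// Cache must be non-empty; entries presumably end up as -D settings
		// on the cmake command line alongside the defaults.
		Cache: [][2]string{{"BUILD_SHARED_LIBS", "OFF"}},
		// ScriptConfigured replaces the default cmake --build . / --install .
		// pair; here a hypothetical test step runs in between.
		ScriptConfigured: "cmake --build .\nctest --output-on-failure\ncmake --install .\n",
		// Flag is passed through to Toolchain.New, e.g. TExclusive for
		// builds that saturate the machine.
		Flag: TExclusive,
	},
		// Dependencies now trail the attr struct as variadic extras.
		t.Load(KernelHeaders),
	)
}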

internal/rosa/curl.go (new file, 32 lines)

@@ -0,0 +1,32 @@
package rosa
import "hakurei.app/internal/pkg"
func (t Toolchain) newCurl() pkg.Artifact {
const (
version = "8.18.0"
checksum = "YpOolP_sx1DIrCEJ3elgVAu0wTLDS-EZMZFvOP0eha7FaLueZUlEpuMwDzJNyi7i"
)
return t.NewViaMake("curl", version, pkg.NewHTTPGetTar(
nil, "https://curl.se/download/curl-"+version+".tar.bz2",
mustDecode(checksum),
pkg.TarBzip2,
), &MakeAttr{
Env: []string{
"TFLAGS=-j256",
},
Configure: [][2]string{
{"with-openssl"},
{"with-ca-bundle", "/system/etc/ssl/certs/ca-bundle.crt"},
},
ScriptConfigured: `
make "-j$(nproc)"
`,
},
t.Load(Perl),
t.Load(Libpsl),
t.Load(OpenSSL),
)
}
func init() { artifactsF[Curl] = Toolchain.newCurl }


@@ -4,8 +4,10 @@ import (
"errors"
"io"
"os"
"sync"
"syscall"
"hakurei.app/container/fhs"
"hakurei.app/internal/pkg"
)
@@ -86,8 +88,27 @@ nobody:x:65534:
// Kind returns the hardcoded [pkg.Kind] value.
func (cureEtc) Kind() pkg.Kind { return kindEtc }
// Params is a noop.
func (cureEtc) Params(*pkg.IContext) {}
// Params writes whether iana-etc is populated.
func (a cureEtc) Params(ctx *pkg.IContext) {
if a.iana != nil {
ctx.WriteUint32(1)
} else {
ctx.WriteUint32(0)
}
}
func init() {
pkg.Register(kindEtc, func(r *pkg.IRReader) pkg.Artifact {
a := cureEtc{}
if r.ReadUint32() != 0 {
a.iana = r.Next()
}
if _, ok := r.Finalise(); ok {
panic(pkg.ErrUnexpectedChecksum)
}
return a
})
}
// IsExclusive returns false: Cure performs a few trivial filesystem writes.
func (cureEtc) IsExclusive() bool { return false }
@@ -121,3 +142,22 @@ func newIANAEtc() pkg.Artifact {
pkg.TarGzip,
)
}
var (
resolvconfPath pkg.ExecPath
resolvconfOnce sync.Once
)
// resolvconf returns a hardcoded /etc/resolv.conf file.
func resolvconf() pkg.ExecPath {
resolvconfOnce.Do(func() {
resolvconfPath = pkg.Path(
fhs.AbsEtc.Append("resolv.conf"), false,
pkg.NewFile("resolv.conf", []byte(`
nameserver 1.1.1.1
nameserver 1.0.0.1
`)),
)
})
return resolvconfPath
}
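The Params/Register pair added above is the general shape of the IR round trip: the writer encodes whether the optional dependency is present as a uint32 flag, and the registered reader must consume the stream in exactly the same order before Finalise. A minimal sketch of that symmetry, using only calls that appear in this diff; the artifact type is a placeholder, its kind is left zero-valued because kind allocation is not shown here, and any further methods the pkg.Artifact interface requires are omitted.

package rosa

import "hakurei.app/internal/pkg"

// kindExample stands in for a real pkg.Kind; treat it as a placeholder.
var kindExample pkg.Kind

// exampleArtifact mirrors the cureEtc pattern: one optional dependency whose
// presence is encoded as a uint32 flag.
type exampleArtifact struct{ dep pkg.Artifact }

// Kind returns the placeholder kind.
func (exampleArtifact) Kind() pkg.Kind { return kindExample }

// Params writes whether the optional dependency is populated.
func (a exampleArtifact) Params(ctx *pkg.IContext) {
	if a.dep != nil {
		ctx.WriteUint32(1)
	} else {
		ctx.WriteUint32(0)
	}
}

// IsExclusive returns false; the sketch performs no heavy work.
func (exampleArtifact) IsExclusive() bool { return false }

func init() {
	pkg.Register(kindExample, func(r *pkg.IRReader) pkg.Artifact {
		a := exampleArtifact{}
		// Reads must mirror the writes in Params, in the same order.
		if r.ReadUint32() != 0 {
			a.dep = r.Next()
		}
		// Finalise flags a stream that was not consumed as written.
		if _, ok := r.Finalise(); ok {
			panic(pkg.ErrUnexpectedChecksum)
		}
		return a
	})
}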

internal/rosa/fuse.go (new file, 45 lines)

@@ -0,0 +1,45 @@
package rosa
import "hakurei.app/internal/pkg"
func (t Toolchain) newFuse() pkg.Artifact {
const (
version = "3.18.1"
checksum = "COb-BgJRWXLbt9XUkNeuiroQizpMifXqxgieE1SlkMXhs_WGSyJStrmyewAw2hd6"
)
return t.New("fuse-"+version, 0, []pkg.Artifact{
t.Load(Python),
t.Load(Meson),
t.Load(Ninja),
t.Load(IniConfig),
t.Load(Packaging),
t.Load(Pluggy),
t.Load(Pygments),
t.Load(PyTest),
t.Load(KernelHeaders),
}, nil, nil, `
cd "$(mktemp -d)"
meson setup \
--reconfigure \
--buildtype=release \
--prefix=/system \
--prefer-static \
-Dtests=true \
-Duseroot=false \
-Dinitscriptdir=/system/init.d \
-Ddefault_library=both \
. /usr/src/fuse
meson compile
python3 -m pytest test/
meson install \
--destdir=/work
`, pkg.Path(AbsUsrSrc.Append("fuse"), false, pkg.NewHTTPGetTar(
nil, "https://github.com/libfuse/libfuse/releases/download/"+
"fuse-"+version+"/fuse-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
}
func init() { artifactsF[Fuse] = Toolchain.newFuse }


@@ -9,25 +9,93 @@ func (t Toolchain) newGit() pkg.Artifact {
version = "2.52.0"
checksum = "uH3J1HAN_c6PfGNJd2OBwW4zo36n71wmkdvityYnrh8Ak0D1IifiAvEWz9Vi9DmS"
)
return t.New("git-"+version, false, stage3Concat(t, []pkg.Artifact{},
t.Load(Make),
return t.NewViaMake("git", version, t.NewPatchedSource(
"git", version, pkg.NewHTTPGetTar(
nil, "https://www.kernel.org/pub/software/scm/git/"+
"git-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), false,
), &MakeAttr{
// uses source tree as scratch space
Writable: true,
InPlace: true,
// test suite in subdirectory
SkipCheck: true,
Make: []string{"all"},
ScriptEarly: `
cd /usr/src/git
make configure
`,
Script: `
ln -s ../../system/bin/perl /usr/bin/ || true
function disable_test {
local test=$1 pattern=$2
if [ $# -eq 1 ]; then
rm "t/${test}.sh"
else
sed -i "t/${test}.sh" \
-e "/^\s*test_expect_.*$pattern/,/^\s*' *\$/{s/^/: #/}"
fi
}
disable_test t5319-multi-pack-index
disable_test t1305-config-include
disable_test t3900-i18n-commit
disable_test t3507-cherry-pick-conflict
disable_test t4201-shortlog
disable_test t5303-pack-corruption-resilience
disable_test t4301-merge-tree-write-tree
disable_test t8005-blame-i18n
disable_test t9350-fast-export
disable_test t9300-fast-import
disable_test t0211-trace2-perf
disable_test t1517-outside-repo
disable_test t2200-add-update
make \
-C t \
GIT_PROVE_OPTS="--jobs 32 --failures" \
prove
`,
},
t.Load(Perl),
t.Load(Diffutils),
t.Load(M4),
t.Load(Autoconf),
t.Load(Gettext),
t.Load(Zlib),
), nil, nil, `
chmod -R +w /usr/src/git && cd /usr/src/git
make configure
./configure --prefix=/system
make "-j$(nproc)" all
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("git"), true, pkg.NewHTTPGetTar(
nil, "https://www.kernel.org/pub/software/scm/git/"+
"git-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
t.Load(Curl),
t.Load(OpenSSL),
t.Load(Libexpat),
)
}
func init() { artifactsF[Git] = Toolchain.newGit }
// NewViaGit returns a [pkg.Artifact] for cloning a git repository.
func (t Toolchain) NewViaGit(
name, url, rev string,
checksum pkg.Checksum,
) pkg.Artifact {
return t.New(name+"-"+rev, 0, []pkg.Artifact{
t.Load(NSSCACert),
t.Load(OpenSSL),
t.Load(Libpsl),
t.Load(Curl),
t.Load(Libexpat),
t.Load(Git),
}, &checksum, nil, `
git \
-c advice.detachedHead=false \
clone \
--revision=`+rev+` \
`+url+` \
/work
rm -rf /work/.git
`, resolvconf())
}
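A hypothetical caller of NewViaGit, for reference. The name, URL, revision and checksum are placeholders, and mustDecode is assumed to yield the pkg.Checksum the helper expects, as it does for the tarball fetchers elsewhere in this diff; because NewViaGit passes a non-nil checksum to t.New, that value pins the contents of the cloned /work tree in addition to the revision.

package rosa

import "hakurei.app/internal/pkg"

// newExampleClone is a hypothetical caller of NewViaGit; it is not part of the
// change above.
func (t Toolchain) newExampleClone() pkg.Artifact {
	const (
		// Full commit hash of the wanted revision (placeholder).
		rev = "0000000000000000000000000000000000000000"
		// Encoded checksum of the resulting /work tree (placeholder;
		// mustDecode would reject this value).
		checksum = "placeholder-checksum"
	)
	return t.NewViaGit(
		"example",
		"https://example.com/example.git",
		rev,
		mustDecode(checksum),
	)
}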


@@ -2,90 +2,147 @@ package rosa
import "hakurei.app/internal/pkg"
func (t Toolchain) newMake() pkg.Artifact {
const (
version = "4.4.1"
checksum = "YS_B07ZcAy9PbaK5_vKGj64SrxO2VMpnMKfc9I0Q9IC1rn0RwOH7802pJoj2Mq4a"
)
return t.New("make-"+version, false, nil, nil, nil, `
cd "$(mktemp -d)"
/usr/src/make/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}" \
--disable-dependency-tracking
./build.sh
./make DESTDIR=/work install check
`, pkg.Path(AbsUsrSrc.Append("make"), false, pkg.NewHTTPGetTar(
nil,
"https://ftp.gnu.org/gnu/make/make-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
}
func init() { artifactsF[Make] = Toolchain.newMake }
func (t Toolchain) newM4() pkg.Artifact {
const (
version = "1.4.20"
checksum = "RT0_L3m4Co86bVBY3lCFAEs040yI1WdeNmRylFpah8IZovTm6O4wI7qiHJN3qsW9"
)
return t.New("m4-"+version, false, []pkg.Artifact{
t.Load(Make),
}, nil, nil, `
cd /usr/src/m4
chmod +w tests/test-c32ispunct.sh && echo '#!/bin/sh' > tests/test-c32ispunct.sh
cd "$(mktemp -d)"
/usr/src/m4/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}"
make "-j$(nproc)" check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("m4"), true, pkg.NewHTTPGetTar(
nil,
"https://ftp.gnu.org/gnu/m4/m4-"+version+".tar.bz2",
return t.NewViaMake("m4", version, pkg.NewHTTPGetTar(
nil, "https://ftpmirror.gnu.org/gnu/m4/m4-"+version+".tar.bz2",
mustDecode(checksum),
pkg.TarBzip2,
)))
), &MakeAttr{
Writable: true,
ScriptEarly: `
cd /usr/src/m4
chmod +w tests/test-c32ispunct.sh && echo '#!/bin/sh' > tests/test-c32ispunct.sh
`,
},
t.Load(Diffutils),
)
}
func init() { artifactsF[M4] = Toolchain.newM4 }
func (t Toolchain) newSed() pkg.Artifact {
const (
version = "4.9"
checksum = "pe7HWH4PHNYrazOTlUoE1fXmhn2GOPFN_xE62i0llOr3kYGrH1g2_orDz0UtZ9Nt"
)
return t.NewViaMake("sed", version, pkg.NewHTTPGetTar(
nil, "https://ftpmirror.gnu.org/gnu/sed/sed-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), nil,
t.Load(Diffutils),
)
}
func init() { artifactsF[Sed] = Toolchain.newSed }
func (t Toolchain) newAutoconf() pkg.Artifact {
const (
version = "2.72"
checksum = "-c5blYkC-xLDer3TWEqJTyh1RLbOd1c5dnRLKsDnIrg_wWNOLBpaqMY8FvmUFJ33"
)
return t.New("autoconf-"+version, false, []pkg.Artifact{
t.Load(Make),
t.Load(M4),
t.Load(Perl),
}, nil, nil, `
cd "$(mktemp -d)"
/usr/src/autoconf/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}"
make \
"-j$(nproc)" \
TESTSUITEFLAGS="-j$(nproc)" \
check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("autoconf"), false, pkg.NewHTTPGetTar(
nil,
"https://ftp.gnu.org/gnu/autoconf/autoconf-"+version+".tar.gz",
return t.NewViaMake("autoconf", version, pkg.NewHTTPGetTar(
nil, "https://ftpmirror.gnu.org/gnu/autoconf/autoconf-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
), &MakeAttr{
Make: []string{
`TESTSUITEFLAGS="-j$(nproc)"`,
},
Flag: TExclusive,
},
t.Load(M4),
t.Load(Perl),
t.Load(Bash),
t.Load(Diffutils),
)
}
func init() { artifactsF[Autoconf] = Toolchain.newAutoconf }
func (t Toolchain) newAutomake() pkg.Artifact {
const (
version = "1.18.1"
checksum = "FjvLG_GdQP7cThTZJLDMxYpRcKdpAVG-YDs1Fj1yaHlSdh_Kx6nRGN14E0r_BjcG"
)
return t.NewViaMake("automake", version, pkg.NewHTTPGetTar(
nil, "https://ftpmirror.gnu.org/gnu/automake/automake-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), &MakeAttr{
Writable: true,
ScriptEarly: `
cd /usr/src/automake
test_disable() { chmod +w "$2" && echo "$1" > "$2"; }
test_disable '#!/bin/sh' t/objcxx-minidemo.sh
test_disable '#!/bin/sh' t/objcxx-deps.sh
test_disable '#!/bin/sh' t/dist-no-built-sources.sh
test_disable '#!/bin/sh' t/distname.sh
test_disable '#!/bin/sh' t/pr9.sh
`,
},
t.Load(M4),
t.Load(Perl),
t.Load(Grep),
t.Load(Gzip),
t.Load(Autoconf),
t.Load(Diffutils),
)
}
func init() { artifactsF[Automake] = Toolchain.newAutomake }
func (t Toolchain) newLibtool() pkg.Artifact {
const (
version = "2.5.4"
checksum = "pa6LSrQggh8mSJHQfwGjysAApmZlGJt8wif2cCLzqAAa2jpsTY0jZ-6stS3BWZ2Q"
)
return t.NewViaMake("libtool", version, pkg.NewHTTPGetTar(
nil, "https://ftpmirror.gnu.org/gnu/libtool/libtool-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), &MakeAttr{
Make: []string{
`TESTSUITEFLAGS=32`,
},
},
t.Load(M4),
t.Load(Diffutils),
)
}
func init() { artifactsF[Libtool] = Toolchain.newLibtool }
func (t Toolchain) newGzip() pkg.Artifact {
const (
version = "1.14"
checksum = "NWhjUavnNfTDFkZJyAUonL9aCOak8GVajWX2OMlzpFnuI0ErpBFyj88mz2xSjz0q"
)
return t.NewViaMake("gzip", version, pkg.NewHTTPGetTar(
nil, "https://ftpmirror.gnu.org/gnu/gzip/gzip-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), &MakeAttr{
// dependency loop
SkipCheck: true,
})
}
func init() { artifactsF[Gzip] = Toolchain.newGzip }
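The conversions above and below all follow the same pattern, so for reference here is a hypothetical NewViaMake call annotated with the steps it stands in for. The mapping is inferred from the before/after pairs in this diff (configure with --prefix=/system and --build="${ROSA_TRIPLE}", make "-j$(nproc)" check, make DESTDIR=/work install); the helper's implementation itself appears only partially at the end of this diff.

package rosa

import "hakurei.app/internal/pkg"

// newExampleViaMake is a hypothetical artifact illustrating how MakeAttr
// fields map onto the hand-written scripts being removed in this diff.
func (t Toolchain) newExampleViaMake(source pkg.Artifact) pkg.Artifact {
	return t.NewViaMake("example", "1.0", source, &MakeAttr{
		// Mount /usr/src/example writable so ScriptEarly can neuter tests.
		Writable: true,
		// Runs before configure; replaces the ad-hoc
		// chmod +w ... && echo '#!/bin/sh' > ... preludes.
		ScriptEarly: `
cd /usr/src/example
test_disable() { chmod +w "$2" && echo "$1" > "$2"; }
test_disable '#!/bin/sh' tests/flaky-test.sh
`,
		// Presumably rendered as --with-foo and --with-bar=/system/etc/bar on
		// the configure line, next to the default prefix and build triple.
		Configure: [][2]string{
			{"with-foo"},
			{"with-bar", "/system/etc/bar"},
		},
		// Skip the make check step, as done above where the test suite would
		// introduce a dependency loop.
		SkipCheck: true,
	},
		// Extra dependencies trail the attr struct.
		t.Load(Diffutils),
	)
}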
func (t Toolchain) newGettext() pkg.Artifact {
const (
version = "0.26"
checksum = "IMu7yDZX7xL5UO1ZxXc-iBMbY9LLEUlOroyuSlHMZwg9MKtxG7HIm8F2LheDua0y"
version = "1.0"
checksum = "3MasKeEdPeFEgWgzsBKk7JqWqql1wEMbgPmzAfs-mluyokoW0N8oQVxPQoOnSdgC"
)
return t.New("gettext-"+version, false, []pkg.Artifact{
t.Load(Make),
}, nil, nil, `
return t.NewViaMake("gettext", version, pkg.NewHTTPGetTar(
nil, "https://ftpmirror.gnu.org/gnu/gettext/gettext-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), &MakeAttr{
Writable: true,
ScriptEarly: `
cd /usr/src/gettext
test_disable() { chmod +w "$2" && echo "$1" > "$2"; }
@@ -99,18 +156,15 @@ test_disable '#!/bin/sh' gettext-tools/tests/format-c-5
test_disable '#!/bin/sh' gettext-tools/gnulib-tests/test-c32ispunct.sh
test_disable 'int main(){return 0;}' gettext-tools/gnulib-tests/test-stdcountof-h.c
cd "$(mktemp -d)"
/usr/src/gettext/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}"
make "-j$(nproc)" check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("gettext"), true, pkg.NewHTTPGetTar(
nil,
"https://ftp.gnu.org/pub/gnu/gettext/gettext-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
touch gettext-tools/autotools/archive.dir.tar
`,
},
t.Load(Diffutils),
t.Load(Gzip),
t.Load(Sed),
t.Load(KernelHeaders),
)
}
func init() { artifactsF[Gettext] = Toolchain.newGettext }
@@ -119,27 +173,22 @@ func (t Toolchain) newDiffutils() pkg.Artifact {
version = "3.12"
checksum = "9J5VAq5oA7eqwzS1Yvw-l3G5o-TccUrNQR3PvyB_lgdryOFAfxtvQfKfhdpquE44"
)
return t.New("diffutils-"+version, false, []pkg.Artifact{
t.Load(Make),
}, nil, nil, `
return t.NewViaMake("diffutils", version, pkg.NewHTTPGetTar(
nil, "https://ftpmirror.gnu.org/gnu/diffutils/diffutils-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), &MakeAttr{
Writable: true,
ScriptEarly: `
cd /usr/src/diffutils
test_disable() { chmod +w "$2" && echo "$1" > "$2"; }
test_disable '#!/bin/sh' gnulib-tests/test-c32ispunct.sh
test_disable 'int main(){return 0;}' gnulib-tests/test-c32ispunct.c
cd "$(mktemp -d)"
/usr/src/diffutils/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}"
make "-j$(nproc)" check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("diffutils"), true, pkg.NewHTTPGetTar(
nil,
"https://ftp.gnu.org/gnu/diffutils/diffutils-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
test_disable '#!/bin/sh' tests/cmp
`,
Flag: TEarly,
})
}
func init() { artifactsF[Diffutils] = Toolchain.newDiffutils }
@@ -148,27 +197,21 @@ func (t Toolchain) newPatch() pkg.Artifact {
version = "2.8"
checksum = "MA0BQc662i8QYBD-DdGgyyfTwaeALZ1K0yusV9rAmNiIsQdX-69YC4t9JEGXZkeR"
)
return t.New("patch-"+version, false, []pkg.Artifact{
t.Load(Make),
}, nil, nil, `
return t.NewViaMake("patch", version, pkg.NewHTTPGetTar(
nil, "https://ftpmirror.gnu.org/gnu/patch/patch-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), &MakeAttr{
Writable: true,
ScriptEarly: `
cd /usr/src/patch
test_disable() { chmod +w "$2" && echo "$1" > "$2"; }
test_disable '#!/bin/sh' tests/ed-style
test_disable '#!/bin/sh' tests/need-filename
cd "$(mktemp -d)"
/usr/src/patch/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}"
make "-j$(nproc)" check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("patch"), true, pkg.NewHTTPGetTar(
nil,
"https://ftp.gnu.org/gnu/patch/patch-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
`,
Flag: TEarly,
})
}
func init() { artifactsF[Patch] = Toolchain.newPatch }
@@ -177,22 +220,17 @@ func (t Toolchain) newBash() pkg.Artifact {
version = "5.3"
checksum = "4LQ_GRoB_ko-Ih8QPf_xRKA02xAm_TOxQgcJLmFDT6udUPxTAWrsj-ZNeuTusyDq"
)
return t.New("bash-"+version, false, []pkg.Artifact{
t.Load(Make),
}, nil, nil, `
cd "$(mktemp -d)"
/usr/src/bash/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}" \
--without-bash-malloc
make "-j$(nproc)" check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("bash"), true, pkg.NewHTTPGetTar(
nil,
"https://ftp.gnu.org/gnu/bash/bash-"+version+".tar.gz",
return t.NewViaMake("bash", version, pkg.NewHTTPGetTar(
nil, "https://ftpmirror.gnu.org/gnu/bash/bash-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
), &MakeAttr{
Script: "ln -s bash /work/system/bin/sh\n",
Configure: [][2]string{
{"without-bash-malloc"},
},
Flag: TEarly,
})
}
func init() { artifactsF[Bash] = Toolchain.newBash }
@@ -201,29 +239,30 @@ func (t Toolchain) newCoreutils() pkg.Artifact {
version = "9.9"
checksum = "B1_TaXj1j5aiVIcazLWu8Ix03wDV54uo2_iBry4qHG6Y-9bjDpUPlkNLmU_3Nvw6"
)
return t.New("coreutils-"+version, false, []pkg.Artifact{
t.Load(Make),
t.Load(Perl),
t.Load(KernelHeaders),
}, nil, nil, `
return t.NewViaMake("coreutils", version, pkg.NewHTTPGetTar(
nil, "https://ftpmirror.gnu.org/gnu/coreutils/coreutils-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), &MakeAttr{
Writable: true,
ScriptEarly: `
cd /usr/src/coreutils
test_disable() { chmod +w "$2" && echo "$1" > "$2"; }
test_disable '#!/bin/sh' gnulib-tests/test-c32ispunct.sh
test_disable '#!/bin/sh' tests/split/line-bytes.sh
test_disable '#!/bin/sh' tests/dd/no-allocate.sh
test_disable 'int main(){return 0;}' gnulib-tests/test-chown.c
test_disable 'int main(){return 0;}' gnulib-tests/test-fchownat.c
test_disable 'int main(){return 0;}' gnulib-tests/test-lchown.c
`,
Flag: TEarly,
},
t.Load(Perl),
t.Load(Bash),
cd "$(mktemp -d)"
/usr/src/coreutils/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}"
make "-j$(nproc)" check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("coreutils"), true, pkg.NewHTTPGetTar(
nil,
"https://ftp.gnu.org/gnu/coreutils/coreutils-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
t.Load(KernelHeaders),
)
}
func init() { artifactsF[Coreutils] = Toolchain.newCoreutils }
@@ -232,19 +271,353 @@ func (t Toolchain) newGperf() pkg.Artifact {
version = "3.3"
checksum = "RtIy9pPb_Bb8-31J2Nw-rRGso2JlS-lDlVhuNYhqR7Nt4xM_nObznxAlBMnarJv7"
)
return t.New("gperf-"+version, false, []pkg.Artifact{
t.Load(Make),
}, nil, nil, `
cd "$(mktemp -d)"
/usr/src/gperf/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}"
make "-j$(nproc)" check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("gperf"), true, pkg.NewHTTPGetTar(
nil, "https://ftp.gnu.org/pub/gnu/gperf/gperf-"+version+".tar.gz",
return t.NewViaMake("gperf", version, pkg.NewHTTPGetTar(
nil, "https://ftpmirror.gnu.org/gperf/gperf-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
), nil,
t.Load(Diffutils),
)
}
func init() { artifactsF[Gperf] = Toolchain.newGperf }
func (t Toolchain) newGawk() pkg.Artifact {
const (
version = "5.3.2"
checksum = "uIs0d14h_d2DgMGYwrPtegGNyt_bxzG3D6Fe-MmExx_pVoVkQaHzrtmiXVr6NHKk"
)
return t.NewViaMake("gawk", version, pkg.NewHTTPGetTar(
nil, "https://ftpmirror.gnu.org/gnu/gawk/gawk-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), &MakeAttr{
Flag: TEarly,
// dependency loop
SkipCheck: true,
})
}
func init() { artifactsF[Gawk] = Toolchain.newGawk }
func (t Toolchain) newGrep() pkg.Artifact {
const (
version = "3.12"
checksum = "qMB4RjaPNRRYsxix6YOrjE8gyAT1zVSTy4nW4wKW9fqa0CHYAuWgPwDTirENzm_1"
)
return t.NewViaMake("grep", version, pkg.NewHTTPGetTar(
nil, "https://ftpmirror.gnu.org/gnu/grep/grep-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), &MakeAttr{
Writable: true,
ScriptEarly: `
cd /usr/src/grep
test_disable() { chmod +w "$2" && echo "$1" > "$2"; }
test_disable '#!/bin/sh' gnulib-tests/test-c32ispunct.sh
test_disable 'int main(){return 0;}' gnulib-tests/test-c32ispunct.c
`,
},
t.Load(Diffutils),
)
}
func init() { artifactsF[Grep] = Toolchain.newGrep }
func (t Toolchain) newFindutils() pkg.Artifact {
const (
version = "4.10.0"
checksum = "ZXABdNBQXL7QjTygynRRTdXYWxQKZ0Wn5eMd3NUnxR0xaS0u0VfcKoTlbo50zxv6"
)
return t.NewViaMake("findutils", version, pkg.NewHTTPGet(
nil, "https://ftpmirror.gnu.org/gnu/findutils/findutils-"+version+".tar.xz",
mustDecode(checksum),
), &MakeAttr{
SourceSuffix: ".tar.xz",
ScriptEarly: `
cd /usr/src/
tar xf findutils.tar.xz
mv findutils-` + version + ` findutils
cd findutils
echo '#!/bin/sh' > gnulib-tests/test-c32ispunct.sh
echo 'int main(){return 0;}' > tests/xargs/test-sigusr.c
`,
},
t.Load(Diffutils),
t.Load(XZ),
t.Load(Sed),
)
}
func init() { artifactsF[Findutils] = Toolchain.newFindutils }
func (t Toolchain) newBinutils() pkg.Artifact {
const (
version = "2.45"
checksum = "hlLtqqHDmzAT2OQVHaKEd_io2DGFvJkaeS-igBuK8bRRir7LUKGHgHYNkDVKaHTT"
)
return t.NewViaMake("binutils", version, pkg.NewHTTPGetTar(
nil, "https://ftpmirror.gnu.org/gnu/binutils/binutils-"+version+".tar.bz2",
mustDecode(checksum),
pkg.TarBzip2,
), &MakeAttr{
ScriptConfigured: `
make "-j$(nproc)"
`,
},
t.Load(Bash),
)
}
func init() { artifactsF[Binutils] = Toolchain.newBinutils }
func (t Toolchain) newGMP() pkg.Artifact {
const (
version = "6.3.0"
checksum = "yrgbgEDWKDdMWVHh7gPbVl56-sRtVVhfvv0M_LX7xMUUk_mvZ1QOJEAnt7g4i3k5"
)
return t.NewViaMake("gmp", version, pkg.NewHTTPGetTar(
nil, "https://gcc.gnu.org/pub/gcc/infrastructure/"+
"gmp-"+version+".tar.bz2",
mustDecode(checksum),
pkg.TarBzip2,
), &MakeAttr{
ScriptConfigured: `
make "-j$(nproc)"
`,
},
t.Load(M4),
)
}
func init() { artifactsF[GMP] = Toolchain.newGMP }
func (t Toolchain) newMPFR() pkg.Artifact {
const (
version = "4.2.2"
checksum = "wN3gx0zfIuCn9r3VAn_9bmfvAYILwrRfgBjYSD1IjLqyLrLojNN5vKyQuTE9kA-B"
)
return t.NewViaMake("mpfr", version, pkg.NewHTTPGetTar(
nil, "https://gcc.gnu.org/pub/gcc/infrastructure/"+
"mpfr-"+version+".tar.bz2",
mustDecode(checksum),
pkg.TarBzip2,
), nil,
t.Load(GMP),
)
}
func init() { artifactsF[MPFR] = Toolchain.newMPFR }
func (t Toolchain) newMPC() pkg.Artifact {
const (
version = "1.3.1"
checksum = "o8r8K9R4x7PuRx0-JE3-bC5jZQrtxGV2nkB773aqJ3uaxOiBDCID1gKjPaaDxX4V"
)
return t.NewViaMake("mpc", version, pkg.NewHTTPGetTar(
nil, "https://gcc.gnu.org/pub/gcc/infrastructure/"+
"mpc-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), nil,
t.Load(GMP),
t.Load(MPFR),
)
}
func init() { artifactsF[MPC] = Toolchain.newMPC }
func (t Toolchain) newGCC() pkg.Artifact {
const (
version = "15.2.0"
checksum = "TXJ5WrbXlGLzy1swghQTr4qxgDCyIZFgJry51XEPTBZ8QYbVmFeB4lZbSMtPJ-a1"
)
return t.NewViaMake("gcc", version, t.NewPatchedSource(
"gcc", version,
pkg.NewHTTPGetTar(
nil, "https://ftp.tsukuba.wide.ad.jp/software/gcc/releases/"+
"gcc-"+version+"/gcc-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), true, [2]string{"musl-off64_t-loff_t", `diff --git a/libgo/sysinfo.c b/libgo/sysinfo.c
index 180f5c31d74..44d7ea73f7d 100644
--- a/libgo/sysinfo.c
+++ b/libgo/sysinfo.c
@@ -365,11 +365,7 @@ enum {
typedef loff_t libgo_loff_t_type;
#endif
-#if defined(HAVE_OFF64_T)
-typedef off64_t libgo_off_t_type;
-#else
typedef off_t libgo_off_t_type;
-#endif
// The following section introduces explicit references to types and
// constants of interest to support bootstrapping libgo using a
`}, [2]string{"musl-legacy-lfs", `diff --git a/libgo/go/internal/syscall/unix/at_largefile.go b/libgo/go/internal/syscall/unix/at_largefile.go
index 82e0dcfd074..16151ecad1b 100644
--- a/libgo/go/internal/syscall/unix/at_largefile.go
+++ b/libgo/go/internal/syscall/unix/at_largefile.go
@@ -10,5 +10,5 @@ import (
"syscall"
)
-//extern fstatat64
+//extern fstatat
func fstatat(int32, *byte, *syscall.Stat_t, int32) int32
diff --git a/libgo/go/os/dir_largefile.go b/libgo/go/os/dir_largefile.go
index 1fc5ee0771f..0c6dffe1a75 100644
--- a/libgo/go/os/dir_largefile.go
+++ b/libgo/go/os/dir_largefile.go
@@ -11,5 +11,5 @@ package os
import "syscall"
-//extern readdir64
+//extern readdir
func libc_readdir(*syscall.DIR) *syscall.Dirent
diff --git a/libgo/go/syscall/libcall_glibc.go b/libgo/go/syscall/libcall_glibc.go
index 5c1ec483c75..5a1245ed44b 100644
--- a/libgo/go/syscall/libcall_glibc.go
+++ b/libgo/go/syscall/libcall_glibc.go
@@ -114,7 +114,7 @@ func Pipe2(p []int, flags int) (err error) {
}
//sys sendfile(outfd int, infd int, offset *Offset_t, count int) (written int, err error)
-//sendfile64(outfd _C_int, infd _C_int, offset *Offset_t, count Size_t) Ssize_t
+//sendfile(outfd _C_int, infd _C_int, offset *Offset_t, count Size_t) Ssize_t
func Sendfile(outfd int, infd int, offset *int64, count int) (written int, err error) {
if race.Enabled {
diff --git a/libgo/go/syscall/libcall_linux.go b/libgo/go/syscall/libcall_linux.go
index 03ca7261b59..ad21fd0b3ac 100644
--- a/libgo/go/syscall/libcall_linux.go
+++ b/libgo/go/syscall/libcall_linux.go
@@ -158,7 +158,7 @@ func Reboot(cmd int) (err error) {
//adjtimex(buf *Timex) _C_int
//sys Fstatfs(fd int, buf *Statfs_t) (err error)
-//fstatfs64(fd _C_int, buf *Statfs_t) _C_int
+//fstatfs(fd _C_int, buf *Statfs_t) _C_int
func Gettid() (tid int) {
r1, _, _ := Syscall(SYS_GETTID, 0, 0, 0)
@@ -245,7 +245,7 @@ func Splice(rfd int, roff *int64, wfd int, woff *int64, len int, flags int) (n i
}
//sys Statfs(path string, buf *Statfs_t) (err error)
-//statfs64(path *byte, buf *Statfs_t) _C_int
+//statfs(path *byte, buf *Statfs_t) _C_int
//sysnb Sysinfo(info *Sysinfo_t) (err error)
//sysinfo(info *Sysinfo_t) _C_int
diff --git a/libgo/go/syscall/libcall_posix_largefile.go b/libgo/go/syscall/libcall_posix_largefile.go
index f90055bb29a..334212f0af1 100644
--- a/libgo/go/syscall/libcall_posix_largefile.go
+++ b/libgo/go/syscall/libcall_posix_largefile.go
@@ -10,40 +10,40 @@
package syscall
//sys Creat(path string, mode uint32) (fd int, err error)
-//creat64(path *byte, mode Mode_t) _C_int
+//creat(path *byte, mode Mode_t) _C_int
//sys Fstat(fd int, stat *Stat_t) (err error)
-//fstat64(fd _C_int, stat *Stat_t) _C_int
+//fstat(fd _C_int, stat *Stat_t) _C_int
//sys Ftruncate(fd int, length int64) (err error)
-//ftruncate64(fd _C_int, length Offset_t) _C_int
+//ftruncate(fd _C_int, length Offset_t) _C_int
//sysnb Getrlimit(resource int, rlim *Rlimit) (err error)
-//getrlimit64(resource _C_int, rlim *Rlimit) _C_int
+//getrlimit(resource _C_int, rlim *Rlimit) _C_int
//sys Lstat(path string, stat *Stat_t) (err error)
-//lstat64(path *byte, stat *Stat_t) _C_int
+//lstat(path *byte, stat *Stat_t) _C_int
//sys mmap(addr uintptr, length uintptr, prot int, flags int, fd int, offset int64) (xaddr uintptr, err error)
-//mmap64(addr *byte, length Size_t, prot _C_int, flags _C_int, fd _C_int, offset Offset_t) *byte
+//mmap(addr *byte, length Size_t, prot _C_int, flags _C_int, fd _C_int, offset Offset_t) *byte
//sys Open(path string, mode int, perm uint32) (fd int, err error)
-//__go_open64(path *byte, mode _C_int, perm Mode_t) _C_int
+//__go_open(path *byte, mode _C_int, perm Mode_t) _C_int
//sys Pread(fd int, p []byte, offset int64) (n int, err error)
-//pread64(fd _C_int, buf *byte, count Size_t, offset Offset_t) Ssize_t
+//pread(fd _C_int, buf *byte, count Size_t, offset Offset_t) Ssize_t
//sys Pwrite(fd int, p []byte, offset int64) (n int, err error)
-//pwrite64(fd _C_int, buf *byte, count Size_t, offset Offset_t) Ssize_t
+//pwrite(fd _C_int, buf *byte, count Size_t, offset Offset_t) Ssize_t
//sys Seek(fd int, offset int64, whence int) (off int64, err error)
-//lseek64(fd _C_int, offset Offset_t, whence _C_int) Offset_t
+//lseek(fd _C_int, offset Offset_t, whence _C_int) Offset_t
//sysnb Setrlimit(resource int, rlim *Rlimit) (err error)
-//setrlimit64(resource int, rlim *Rlimit) _C_int
+//setrlimit(resource int, rlim *Rlimit) _C_int
//sys Stat(path string, stat *Stat_t) (err error)
-//stat64(path *byte, stat *Stat_t) _C_int
+//stat(path *byte, stat *Stat_t) _C_int
//sys Truncate(path string, length int64) (err error)
-//truncate64(path *byte, length Offset_t) _C_int
+//truncate(path *byte, length Offset_t) _C_int
diff --git a/libgo/runtime/go-varargs.c b/libgo/runtime/go-varargs.c
index f84860891e6..7efc9615985 100644
--- a/libgo/runtime/go-varargs.c
+++ b/libgo/runtime/go-varargs.c
@@ -84,7 +84,7 @@ __go_ioctl_ptr (int d, int request, void *arg)
int
__go_open64 (char *path, int mode, mode_t perm)
{
- return open64 (path, mode, perm);
+ return open (path, mode, perm);
}
#endif
`}), &MakeAttr{
ScriptEarly: `
ln -s system/lib /
ln -s system/lib /work/
`,
Configure: [][2]string{
{"disable-multilib"},
{"with-multilib-list", `""`},
{"enable-default-pie"},
{"disable-nls"},
{"with-gnu-as"},
{"with-gnu-ld"},
{"with-system-zlib"},
{"enable-languages", "c,c++,go"},
{"with-native-system-header-dir", "/system/include"},
},
Make: []string{
"BOOT_CFLAGS='-O2 -g'",
"bootstrap",
},
// This toolchain is hacked to pieces; it is not expected to ever work
// well in its current state. That does not matter as long as the
// toolchain it produces passes its own test suite.
SkipCheck: true,
// GCC spends most of its time in its many configure scripts; however,
// it also saturates the CPU for a consequential amount of time.
Flag: TExclusive,
},
t.Load(Binutils),
t.Load(GMP),
t.Load(MPFR),
t.Load(MPC),
t.Load(Zlib),
t.Load(Libucontext),
t.Load(KernelHeaders),
)
}
func init() { artifactsF[gcc] = Toolchain.newGCC }


@@ -10,32 +10,17 @@ import (
// newGoBootstrap returns the Go bootstrap toolchain.
func (t Toolchain) newGoBootstrap() pkg.Artifact {
const checksum = "8o9JL_ToiQKadCTb04nvBDkp8O1xiWOolAxVEqaTGodieNe4lOFEjlOxN3bwwe23"
return t.New("go1.4-bootstrap", false, []pkg.Artifact{
return t.New("go1.4-bootstrap", 0, []pkg.Artifact{
t.Load(Bash),
}, nil, []string{
"CGO_ENABLED=0",
}, `
mkdir -p /var/tmp
cp -r /usr/src/go /work
cd /work/go/src
mkdir -p /var/tmp/ /work/system/
cp -r /usr/src/go /work/system/
cd /work/system/go/src
chmod -R +w ..
ln -s ../system/bin/busybox /bin/pwd
cat << EOF > /bin/hostname
#!/bin/sh
echo cure
EOF
chmod +x /bin/hostname
rm \
cmd/objdump/objdump_test.go \
syscall/creds_test.go \
net/multicast_test.go
./all.bash
cd /work/
mkdir system/
mv go/ system/
./make.bash
`, pkg.Path(AbsUsrSrc.Append("go"), false, pkg.NewHTTPGetTar(
nil, "https://dl.google.com/go/go1.4-bootstrap-20171003.tar.gz",
mustDecode(checksum),
@@ -50,7 +35,7 @@ func (t Toolchain) newGo(
script string,
extra ...pkg.Artifact,
) pkg.Artifact {
return t.New("go"+version, false, slices.Concat([]pkg.Artifact{
return t.New("go"+version, 0, slices.Concat([]pkg.Artifact{
t.Load(Bash),
}, extra), nil, slices.Concat([]string{
"CC=cc",
@@ -78,14 +63,50 @@ ln -s \
}
func (t Toolchain) newGoLatest() pkg.Artifact {
var (
bootstrapEnv []string
bootstrapExtra []pkg.Artifact
)
switch runtime.GOARCH {
case "amd64":
bootstrapExtra = append(bootstrapExtra, t.newGoBootstrap())
case "arm64":
bootstrapEnv = append(bootstrapEnv,
"GOROOT_BOOTSTRAP=/system",
)
bootstrapExtra = append(bootstrapExtra,
t.Load(Binutils),
t.Load(GMP),
t.Load(MPFR),
t.Load(MPC),
t.Load(Zlib),
t.Load(Libucontext),
t.Load(gcc),
)
default:
panic("unsupported target " + runtime.GOARCH)
}
go119 := t.newGo(
"1.19",
"9_e0aFHsIkVxWVGsp9T2RvvjOc3p4n9o9S8tkNe9Cvgzk_zI2FhRQB7ioQkeAAro",
[]string{"CGO_ENABLED=0"}, `
append(bootstrapEnv, "CGO_ENABLED=0"), `
rm \
crypto/tls/handshake_client_test.go
`, t.newGoBootstrap(),
)
crypto/tls/handshake_client_test.go \
cmd/pprof/pprof_test.go \
os/os_unix_test.go
sed -i \
's/os\.Getenv("GCCGO")$/"nonexistent"/' \
go/internal/gccgoimporter/importer_test.go
echo \
'type syscallDescriptor = int' >> \
os/rawconn_test.go
`, bootstrapExtra...)
go121 := t.newGo(
"1.21.13",
@@ -97,7 +118,11 @@ sed -i \
rm \
crypto/tls/handshake_client_test.go \
crypto/tls/handshake_server_test.go
crypto/tls/handshake_server_test.go \
os/os_unix_test.go
echo \
'type syscallDescriptor = int' >> \
os/rawconn_test.go
`, go119,
)
@@ -118,6 +143,9 @@ sed -i \
sed -i \
's,/lib/ld-musl-`+linuxArch()+`.so.1,/system/bin/linker,' \
cmd/link/internal/`+runtime.GOARCH+`/obj.go
rm \
os/root_unix_test.go
`, go123,
)


@@ -4,19 +4,22 @@ import (
"hakurei.app/internal/pkg"
)
func (t Toolchain) newHakurei() pkg.Artifact {
func (t Toolchain) newHakurei(suffix, script string) pkg.Artifact {
const (
version = "0.3.3"
checksum = "iMN9qzDB000noZ6dOHh_aSdrhRZPopjyWHd0KFVjxjQLQstAOvLYZEZ74btlL0pu"
version = "0.3.4"
checksum = "wVwSLo75a2OnH5tgxNWXR_YhiOJUFnYM_9-sJtxAEOKhcPE0BJafs6PU8o5JzyCT"
)
return t.New("hakurei-"+version, false, []pkg.Artifact{
return t.New("hakurei"+suffix+"-"+version, 0, []pkg.Artifact{
t.Load(Go),
t.Load(Gzip),
t.Load(PkgConfig),
t.Load(KernelHeaders),
t.Load(Libseccomp),
t.Load(ACL),
t.Load(Attr),
t.Load(Fuse),
t.Load(Xproto),
t.Load(LibXau),
@@ -29,6 +32,7 @@ func (t Toolchain) newHakurei() pkg.Artifact {
t.Load(Wayland),
t.Load(WaylandProtocols),
}, nil, []string{
"CGO_ENABLED=1",
"GOCACHE=/tmp/gocache",
"CC=clang -O3 -Werror",
}, `
@@ -38,32 +42,202 @@ echo
chmod -R +w /usr/src/hakurei
cd /usr/src/hakurei
mkdir -p /work/system/{bin,libexec/hakurei}
echo '# Building hakurei.'
go generate -v ./...
go build -trimpath -v -o /work/system/libexec/hakurei -ldflags="-s -w
-buildid=
-extldflags=-static
-X hakurei.app/internal/info.buildVersion='v`+version+`'
-X hakurei.app/internal/info.hakureiPath=/system/bin/hakurei
-X hakurei.app/internal/info.hsuPath=/system/bin/hsu
-X main.hakureiPath=/system/bin/hakurei" ./...
echo
echo '# Testing hakurei.'
go test -ldflags='-buildid= -extldflags=-static' ./...
echo
mv \
/work/system/libexec/hakurei/{hakurei,hpkg} \
/work/system/bin
`, pkg.Path(AbsUsrSrc.Append("hakurei"), true, pkg.NewHTTPGetTar(
HAKUREI_VERSION='v`+version+`'
`+script, pkg.Path(AbsUsrSrc.Append("hakurei"), true, t.NewPatchedSource("hakurei", version, pkg.NewHTTPGetTar(
nil, "https://git.gensokyo.uk/security/hakurei/archive/"+
"v"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)), pkg.Path(AbsUsrSrc.Append("hostname", "main.go"), false, pkg.NewFile(
), true, [2]string{"dist", `From 67e453f5c4de915de23ecbe5980e595758f0f2fb Mon Sep 17 00:00:00 2001
From: Ophestra <cat@gensokyo.uk>
Date: Tue, 27 Jan 2026 06:49:48 +0900
Subject: [PATCH] dist: run tests
This used to be impossible due to nix jank which has been addressed.
Signed-off-by: Ophestra <cat@gensokyo.uk>
---
dist/release.sh | 21 ++++++++++++++++-----
flake.nix | 32 ++++++++++++++++++++------------
internal/acl/acl_test.go | 2 +-
package.nix | 2 +-
4 files changed, 38 insertions(+), 19 deletions(-)
diff --git a/dist/release.sh b/dist/release.sh
index 4dcb278..0ba9104 100755
--- a/dist/release.sh
+++ b/dist/release.sh
@@ -2,19 +2,30 @@
cd "$(dirname -- "$0")/.."
VERSION="${HAKUREI_VERSION:-untagged}"
pname="hakurei-${VERSION}"
-out="dist/${pname}"
+out="${DESTDIR:-dist}/${pname}"
+echo '# Preparing distribution files.'
mkdir -p "${out}"
cp -v "README.md" "dist/hsurc.default" "dist/install.sh" "${out}"
cp -rv "dist/comp" "${out}"
+echo
+echo '# Building hakurei.'
go generate ./...
-go build -trimpath -v -o "${out}/bin/" -ldflags "-s -w -buildid= -extldflags '-static'
+go build -trimpath -v -o "${out}/bin/" -ldflags "-s -w
+ -buildid= -extldflags '-static'
-X hakurei.app/internal/info.buildVersion=${VERSION}
-X hakurei.app/internal/info.hakureiPath=/usr/bin/hakurei
-X hakurei.app/internal/info.hsuPath=/usr/bin/hsu
-X main.hakureiPath=/usr/bin/hakurei" ./...
+echo
-rm -f "./${out}.tar.gz" && tar -C dist -czf "${out}.tar.gz" "${pname}"
-rm -rf "./${out}"
-(cd dist && sha512sum "${pname}.tar.gz" > "${pname}.tar.gz.sha512")
+echo '# Testing hakurei.'
+go test -ldflags='-buildid= -extldflags=-static' ./...
+echo
+
+echo '# Creating distribution.'
+rm -f "${out}.tar.gz" && tar -C "${out}/.." -vczf "${out}.tar.gz" "${pname}"
+rm -rf "${out}"
+(cd "${out}/.." && sha512sum "${pname}.tar.gz" > "${pname}.tar.gz.sha512")
+echo
diff --git a/flake.nix b/flake.nix
index 9e09c61..2340b92 100644
--- a/flake.nix
+++ b/flake.nix
@@ -143,19 +143,27 @@
"bin/mount.fuse.sharefs" = "${hakurei}/libexec/sharefs";
};
- dist = pkgs.runCommand "${hakurei.name}-dist" { buildInputs = hakurei.targetPkgs ++ [ pkgs.pkgsStatic.musl ]; } ''
- # go requires XDG_CACHE_HOME for the build cache
- export XDG_CACHE_HOME="$(mktemp -d)"
+ dist =
+ pkgs.runCommand "${hakurei.name}-dist"
+ {
+ buildInputs = hakurei.targetPkgs ++ [
+ pkgs.pkgsStatic.musl
+ ];
+ }
+ ''
+ cd $(mktemp -d) \
+ && cp -r ${hakurei.src}/. . \
+ && chmod +w cmd && cp -r ${hsu.src}/. cmd/hsu/ \
+ && chmod -R +w .
- # get a different workdir as go does not like /build
- cd $(mktemp -d) \
- && cp -r ${hakurei.src}/. . \
- && chmod +w cmd && cp -r ${hsu.src}/. cmd/hsu/ \
- && chmod -R +w .
-
- export HAKUREI_VERSION="v${hakurei.version}"
- CC="clang -O3 -Werror" ./dist/release.sh && mkdir $out && cp -v "dist/hakurei-$HAKUREI_VERSION.tar.gz"* $out
- '';
+ CC="musl-clang -O3 -Werror -Qunused-arguments" \
+ GOCACHE="$(mktemp -d)" \
+ HAKUREI_TEST_SKIP_ACL=1 \
+ PATH="${pkgs.pkgsStatic.musl.bin}/bin:$PATH" \
+ DESTDIR="$out" \
+ HAKUREI_VERSION="v${hakurei.version}" \
+ ./dist/release.sh
+ '';
}
);
diff --git a/internal/acl/acl_test.go b/internal/acl/acl_test.go
index af6da55..19ce45a 100644
--- a/internal/acl/acl_test.go
+++ b/internal/acl/acl_test.go
@@ -24,7 +24,7 @@ var (
)
func TestUpdate(t *testing.T) {
- if os.Getenv("GO_TEST_SKIP_ACL") == "1" {
+ if os.Getenv("HAKUREI_TEST_SKIP_ACL") == "1" {
t.Skip("acl test skipped")
}
diff --git a/package.nix b/package.nix
index 00c4401..2eaa2ec 100644
--- a/package.nix
+++ b/package.nix
@@ -89,7 +89,7 @@ buildGoModule rec {
CC = "clang -O3 -Werror";
# nix build environment does not allow acls
- GO_TEST_SKIP_ACL = 1;
+ HAKUREI_TEST_SKIP_ACL = 1;
};
buildInputs = [`}, [2]string{"container-tests", `From bf14a412e47344fff2681f4b24d1ecc7415bfcb0 Mon Sep 17 00:00:00 2001
From: Ophestra <cat@gensokyo.uk>
Date: Sat, 31 Jan 2026 10:59:56 +0900
Subject: [PATCH] container: fix host-dependent test cases
These are not fully controlled by hakurei and may change depending on host configuration.
Signed-off-by: Ophestra <cat@gensokyo.uk>
---
container/container_test.go | 27 +++++++++++++++------------
1 file changed, 15 insertions(+), 12 deletions(-)
diff --git a/container/container_test.go b/container/container_test.go
index d737a18..98713cb 100644
--- a/container/container_test.go
+++ b/container/container_test.go
@@ -275,12 +275,12 @@ var containerTestCases = []struct {
),
earlyMnt(
ent("/", "/dev", "ro,nosuid,nodev,relatime", "tmpfs", "devtmpfs", ignore),
- ent("/null", "/dev/null", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
- ent("/zero", "/dev/zero", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
- ent("/full", "/dev/full", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
- ent("/random", "/dev/random", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
- ent("/urandom", "/dev/urandom", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
- ent("/tty", "/dev/tty", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
+ ent("/null", "/dev/null", ignore, "devtmpfs", "devtmpfs", ignore),
+ ent("/zero", "/dev/zero", ignore, "devtmpfs", "devtmpfs", ignore),
+ ent("/full", "/dev/full", ignore, "devtmpfs", "devtmpfs", ignore),
+ ent("/random", "/dev/random", ignore, "devtmpfs", "devtmpfs", ignore),
+ ent("/urandom", "/dev/urandom", ignore, "devtmpfs", "devtmpfs", ignore),
+ ent("/tty", "/dev/tty", ignore, "devtmpfs", "devtmpfs", ignore),
ent("/", "/dev/pts", "rw,nosuid,noexec,relatime", "devpts", "devpts", "rw,mode=620,ptmxmode=666"),
ent("/", "/dev/mqueue", "rw,nosuid,nodev,noexec,relatime", "mqueue", "mqueue", "rw"),
ent("/", "/dev/shm", "rw,nosuid,nodev,relatime", "tmpfs", "tmpfs", ignore),
@@ -293,12 +293,12 @@ var containerTestCases = []struct {
),
earlyMnt(
ent("/", "/dev", "ro,nosuid,nodev,relatime", "tmpfs", "devtmpfs", ignore),
- ent("/null", "/dev/null", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
- ent("/zero", "/dev/zero", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
- ent("/full", "/dev/full", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
- ent("/random", "/dev/random", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
- ent("/urandom", "/dev/urandom", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
- ent("/tty", "/dev/tty", "rw,nosuid", "devtmpfs", "devtmpfs", ignore),
+ ent("/null", "/dev/null", ignore, "devtmpfs", "devtmpfs", ignore),
+ ent("/zero", "/dev/zero", ignore, "devtmpfs", "devtmpfs", ignore),
+ ent("/full", "/dev/full", ignore, "devtmpfs", "devtmpfs", ignore),
+ ent("/random", "/dev/random", ignore, "devtmpfs", "devtmpfs", ignore),
+ ent("/urandom", "/dev/urandom", ignore, "devtmpfs", "devtmpfs", ignore),
+ ent("/tty", "/dev/tty", ignore, "devtmpfs", "devtmpfs", ignore),
ent("/", "/dev/pts", "rw,nosuid,noexec,relatime", "devpts", "devpts", "rw,mode=620,ptmxmode=666"),
ent("/", "/dev/shm", "rw,nosuid,nodev,relatime", "tmpfs", "tmpfs", ignore),
),
@@ -696,6 +696,9 @@ func init() {
mnt[i].VfsOptstr = strings.TrimSuffix(mnt[i].VfsOptstr, ",relatime")
mnt[i].VfsOptstr = strings.TrimSuffix(mnt[i].VfsOptstr, ",noatime")
+ cur.FsOptstr = strings.Replace(cur.FsOptstr, ",seclabel", "", 1)
+ mnt[i].FsOptstr = strings.Replace(mnt[i].FsOptstr, ",seclabel", "", 1)
+
if !cur.EqualWithIgnore(mnt[i], "\x00") {
fail = true
log.Printf("[FAIL] %s", cur)`}),
), pkg.Path(AbsUsrSrc.Append("hostname", "main.go"), false, pkg.NewFile(
"hostname.go",
[]byte(`
package main
@@ -80,4 +254,37 @@ func main() {
`),
)))
}
func init() { artifactsF[Hakurei] = Toolchain.newHakurei }
func init() {
artifactsF[Hakurei] = func(t Toolchain) pkg.Artifact {
return t.newHakurei("", `
mkdir -p /work/system/libexec/hakurei/
echo '# Building hakurei.'
go generate -v ./...
go build -trimpath -v -o /work/system/libexec/hakurei -ldflags="-s -w
-buildid=
-extldflags=-static
-X hakurei.app/internal/info.buildVersion="$HAKUREI_VERSION"
-X hakurei.app/internal/info.hakureiPath=/system/bin/hakurei
-X hakurei.app/internal/info.hsuPath=/system/bin/hsu
-X main.hakureiPath=/system/bin/hakurei" ./...
echo
echo '# Testing hakurei.'
go test -ldflags='-buildid= -extldflags=-static' ./...
echo
mkdir -p /work/system/bin/
(cd /work/system/libexec/hakurei && mv \
hakurei \
sharefs \
../../bin/)
`)
}
artifactsF[HakureiDist] = func(t Toolchain) pkg.Artifact {
return t.newHakurei("-dist", `
export HAKUREI_VERSION
DESTDIR=/work /usr/src/hakurei/dist/release.sh
`)
}
}


@@ -8,7 +8,7 @@ import (
// newKernel is a helper for interacting with Kbuild.
func (t Toolchain) newKernel(
exclusive bool,
flag int,
patches [][2]string,
script string,
extra ...pkg.Artifact,
@@ -17,7 +17,7 @@ func (t Toolchain) newKernel(
version = "6.18.5"
checksum = "-V1e1WWl7HuePkmm84sSKF7nLuHfUs494uNMzMqXEyxcNE_PUE0FICL0oGWn44mM"
)
return t.New("kernel-"+version, exclusive, slices.Concat([]pkg.Artifact{
return t.New("kernel-"+version, flag, slices.Concat([]pkg.Artifact{
t.Load(Make),
}, extra), nil, nil, `
export LLVM=1
@@ -35,7 +35,7 @@ cd /usr/src/linux
}
func (t Toolchain) newKernelHeaders() pkg.Artifact {
return t.newKernel(false, nil, `
return t.newKernel(TEarly, nil, `
make "-j$(nproc)" \
INSTALL_HDR_PATH=/work/system \
headers_install


@@ -11,23 +11,18 @@ func (t Toolchain) newLibexpat() pkg.Artifact {
version = "2.7.3"
checksum = "GmkoD23nRi9cMT0cgG1XRMrZWD82UcOMzkkvP1gkwSFWCBgeSXMuoLpa8-v8kxW-"
)
return t.New("libexpat-"+version, false, []pkg.Artifact{
t.Load(Make),
t.Load(Bash),
}, nil, nil, `
cd "$(mktemp -d)"
/usr/src/libexpat/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}" \
--enable-static
make "-j$(nproc)" check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("libexpat"), false, pkg.NewHTTPGetTar(
return t.NewViaMake("libexpat", version, pkg.NewHTTPGetTar(
nil, "https://github.com/libexpat/libexpat/releases/download/"+
"R_"+strings.ReplaceAll(version, ".", "_")+"/"+
"expat-"+version+".tar.bz2",
mustDecode(checksum),
pkg.TarBzip2,
)))
), &MakeAttr{
Configure: [][2]string{
{"enable-static"},
},
},
t.Load(Bash),
)
}
func init() { artifactsF[Libexpat] = Toolchain.newLibexpat }


@@ -7,23 +7,17 @@ func (t Toolchain) newLibffi() pkg.Artifact {
version = "3.4.5"
checksum = "apIJzypF4rDudeRoI_n3K7N-zCeBLTbQlHRn9NSAZqdLAWA80mR0gXPTpHsL7oMl"
)
return t.New("libffi-"+version, false, []pkg.Artifact{
t.Load(Make),
t.Load(KernelHeaders),
}, nil, nil, `
cd "$(mktemp -d)"
/usr/src/libffi/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}" \
--enable-static
make "-j$(nproc)" check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("libffi"), false, pkg.NewHTTPGetTar(
nil,
"https://github.com/libffi/libffi/releases/download/"+
return t.NewViaMake("libffi", version, pkg.NewHTTPGetTar(
nil, "https://github.com/libffi/libffi/releases/download/"+
"v"+version+"/libffi-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
), &MakeAttr{
Configure: [][2]string{
{"enable-static"},
},
},
t.Load(KernelHeaders),
)
}
func init() { artifactsF[Libffi] = Toolchain.newLibffi }


@@ -7,27 +7,24 @@ func (t Toolchain) newLibgd() pkg.Artifact {
version = "2.3.3"
checksum = "8T-sh1_FJT9K9aajgxzh8ot6vWIF-xxjcKAHvTak9MgGUcsFfzP8cAvvv44u2r36"
)
return t.New("libgd-"+version, false, []pkg.Artifact{
t.Load(Make),
t.Load(Zlib),
}, nil, []string{
"TMPDIR=/dev/shm/gd",
}, `
mkdir /dev/shm/gd
cd "$(mktemp -d)"
/usr/src/libgd/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}" \
--enable-static
make "-j$(nproc)" check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("libgd"), true, pkg.NewHTTPGetTar(
return t.NewViaMake("libgd", version, pkg.NewHTTPGetTar(
nil, "https://github.com/libgd/libgd/releases/download/"+
"gd-"+version+"/libgd-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
), &MakeAttr{
OmitDefaults: true,
Env: []string{
"TMPDIR=/dev/shm/gd",
},
ScriptEarly: `
mkdir /dev/shm/gd
`,
Configure: [][2]string{
{"enable-static"},
},
},
t.Load(Zlib),
)
}
func init() { artifactsF[Libgd] = Toolchain.newLibgd }

internal/rosa/libpsl.go (new file, 28 lines)

@@ -0,0 +1,28 @@
package rosa
import "hakurei.app/internal/pkg"
func (t Toolchain) newLibpsl() pkg.Artifact {
const (
version = "0.21.5"
checksum = "XjfxSzh7peG2Vg4vJlL8z4JZJLcXqbuP6pLWkrGCmRxlnYUFTKNBqWGHCxEOlCad"
)
return t.NewViaMake("libpsl", version, pkg.NewHTTPGetTar(
nil, "https://github.com/rockdaboot/libpsl/releases/download/"+
version+"/libpsl-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), &MakeAttr{
Writable: true,
ScriptEarly: `
cd /usr/src/libpsl
test_disable() { chmod +w "$2" && echo "$1" > "$2"; }
test_disable 'int main(){return 0;}' tests/test-is-public-builtin.c
`,
},
t.Load(Python),
)
}
func init() { artifactsF[Libpsl] = Toolchain.newLibpsl }


@@ -9,28 +9,25 @@ func (t Toolchain) newLibseccomp() pkg.Artifact {
version = "2.6.0"
checksum = "mMu-iR71guPjFbb31u-YexBaanKE_nYPjPux-vuBiPfS_0kbwJdfCGlkofaUm-EY"
)
return t.New("libseccomp-"+version, false, []pkg.Artifact{
t.Load(Make),
t.Load(Bash),
t.Load(Gperf),
t.Load(KernelHeaders),
}, nil, nil, `
ln -s ../system/bin/bash /bin/bash
cd "$(mktemp -d)"
/usr/src/libseccomp/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}" \
--enable-static
make "-j$(nproc)" check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("libseccomp"), false, pkg.NewHTTPGetTar(
return t.NewViaMake("libseccomp", version, pkg.NewHTTPGetTar(
nil,
"https://github.com/seccomp/libseccomp/releases/download/"+
"v"+version+"/libseccomp-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
), &MakeAttr{
ScriptEarly: `
ln -s ../system/bin/bash /bin/
`,
Configure: [][2]string{
{"enable-static"},
},
},
t.Load(Bash),
t.Load(Diffutils),
t.Load(Gperf),
t.Load(KernelHeaders),
)
}
func init() { artifactsF[Libseccomp] = Toolchain.newLibseccomp }


@@ -0,0 +1,40 @@
package rosa
import "hakurei.app/internal/pkg"
func (t Toolchain) newLibucontext() pkg.Artifact {
const (
version = "1.5"
checksum = "Ggk7FMmDNBdCx1Z9PcNWWW6LSpjGYssn2vU0GK5BLXJYw7ZxZbA2m_eSgT9TFnIG"
)
return t.New("libucontext", 0, []pkg.Artifact{
t.Load(Make),
}, nil, []string{
"ARCH=" + linuxArch(),
}, `
cd /usr/src/libucontext
make check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("libucontext"), true,
t.NewPatchedSource("libucontext", version, pkg.NewHTTPGetTar(
nil, "https://github.com/kaniini/libucontext/archive/refs/tags/"+
"libucontext-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), true, [2]string{"rosa-prefix", `diff --git a/Makefile b/Makefile
index c80e574..4a8c1d3 100644
--- a/Makefile
+++ b/Makefile
@@ -17,7 +17,7 @@ ifeq ($(ARCH),$(filter $(ARCH),arm64))
override ARCH = aarch64
endif
-prefix = /usr
+prefix = /system
libdir = ${prefix}/lib
shared_libdir = ${libdir}
static_libdir = ${libdir}
`}),
))
}
func init() { artifactsF[Libucontext] = Toolchain.newLibucontext }


@@ -11,25 +11,24 @@ func (t Toolchain) newLibxml2() pkg.Artifact {
version = "2.15.1"
checksum = "pYzAR3cNrEHezhEMirgiq7jbboLzwMj5GD7SQp0jhSIMdgoU4G9oU9Gxun3zzUIU"
)
return t.New("libxml2-"+version, false, []pkg.Artifact{
t.Load(Make),
}, nil, nil, `
cd /usr/src/
tar xf libxml2.tar.xz
mv libxml2-`+version+` libxml2
cd "$(mktemp -d)"
/usr/src/libxml2/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}" \
--enable-static
make "-j$(nproc)" check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("libxml2.tar.xz"), false, pkg.NewHTTPGet(
return t.NewViaMake("libxml2", version, pkg.NewHTTPGet(
nil, "https://download.gnome.org/sources/libxml2/"+
strings.Join(strings.Split(version, ".")[:2], ".")+
"/libxml2-"+version+".tar.xz",
mustDecode(checksum),
)))
), &MakeAttr{
ScriptEarly: `
cd /usr/src/
tar xf libxml2.tar.xz
mv libxml2-` + version + ` libxml2
`,
Configure: [][2]string{
{"enable-static"},
},
SourceSuffix: ".tar.xz",
},
t.Load(Diffutils),
t.Load(XZ),
)
}
func init() { artifactsF[Libxml2] = Toolchain.newLibxml2 }


@@ -22,12 +22,18 @@ type llvmAttr struct {
cmake [][2]string
// Override CMakeAttr.Append.
append []string
// Concatenated with default dependencies for CMakeAttr.Extra.
// Concatenated with default dependencies for Toolchain.NewViaCMake.
extra []pkg.Artifact
// Passed through to CMakeAttr.Paths.
paths []pkg.ExecPath
// Passed through to CMakeAttr.ScriptConfigured.
scriptConfigured string
// Concatenated with default fixup for CMakeAttr.Script.
script string
// Passed through to CMakeAttr.Prefix.
prefix *check.Absolute
// Passed through to CMakeAttr.Writable.
writable bool
// Patch name and body pairs.
patches [][2]string
@@ -180,26 +186,31 @@ cp -r /system/include /usr/include && rm -rf /system/include
), &CMakeAttr{
Cache: slices.Concat(cache, attr.cmake),
Append: cmakeAppend,
Extra: stage3Concat(t, attr.extra,
t.Load(Libffi),
t.Load(Python),
t.Load(Perl),
t.Load(Diffutils),
t.Load(Bash),
t.Load(Coreutils),
t.Load(KernelHeaders),
),
Prefix: attr.prefix,
Env: slices.Concat([]string{
"ROSA_LLVM_PROJECTS=" + strings.Join(projects, ";"),
"ROSA_LLVM_RUNTIMES=" + strings.Join(runtimes, ";"),
}, attr.env),
ScriptEarly: scriptEarly, Script: script + attr.script,
ScriptEarly: scriptEarly,
ScriptConfigured: attr.scriptConfigured,
Script: script + attr.script,
Writable: attr.writable,
Exclusive: true,
})
Paths: attr.paths,
Flag: TExclusive,
}, stage3Concat(t, attr.extra,
t.Load(Libffi),
t.Load(Python),
t.Load(Perl),
t.Load(Diffutils),
t.Load(Bash),
t.Load(Gawk),
t.Load(Coreutils),
t.Load(Findutils),
t.Load(KernelHeaders),
)...)
}
// newLLVM returns LLVM toolchain across multiple [pkg.Artifact].
@@ -208,6 +219,8 @@ func (t Toolchain) newLLVM() (musl, compilerRT, runtimes, clang pkg.Artifact) {
switch runtime.GOARCH {
case "386", "amd64":
target = "X86"
case "arm64":
target = "AArch64"
default:
panic("unsupported target " + runtime.GOARCH)
@@ -229,6 +242,7 @@ func (t Toolchain) newLLVM() (musl, compilerRT, runtimes, clang pkg.Artifact) {
{"COMPILER_RT_BUILD_BUILTINS", "ON"},
{"COMPILER_RT_DEFAULT_TARGET_ONLY", "ON"},
{"COMPILER_RT_SANITIZERS_TO_BUILD", "asan"},
{"LLVM_ENABLE_PER_TARGET_RUNTIME_DIR", "ON"},
// does not work without libunwind
@@ -236,7 +250,6 @@ func (t Toolchain) newLLVM() (musl, compilerRT, runtimes, clang pkg.Artifact) {
{"COMPILER_RT_BUILD_LIBFUZZER", "OFF"},
{"COMPILER_RT_BUILD_MEMPROF", "OFF"},
{"COMPILER_RT_BUILD_PROFILE", "OFF"},
{"COMPILER_RT_BUILD_SANITIZERS", "OFF"},
{"COMPILER_RT_BUILD_XRAY", "OFF"},
},
append: []string{"compiler-rt"},
@@ -253,10 +266,10 @@ ln -s \
"${ROSA_INSTALL_PREFIX}/lib/clang/21/lib/"
ln -s \
"clang_rt.crtbegin-$(uname -m).o" \
"clang_rt.crtbegin-` + linuxArch() + `.o" \
"${ROSA_INSTALL_PREFIX}/lib/${ROSA_TRIPLE}/crtbeginS.o"
ln -s \
"clang_rt.crtend-$(uname -m).o" \
"clang_rt.crtend-` + linuxArch() + `.o" \
"${ROSA_INSTALL_PREFIX}/lib/${ROSA_TRIPLE}/crtendS.o"
`,
})
@@ -309,7 +322,6 @@ ln -s \
musl,
compilerRT,
runtimes,
t.Load(Git),
},
script: `
ln -s clang /work/system/bin/cc
@@ -319,6 +331,40 @@ ninja check-all
`,
patches: [][2]string{
{"add-rosa-vendor", `diff --git a/llvm/include/llvm/TargetParser/Triple.h b/llvm/include/llvm/TargetParser/Triple.h
index 657f4230379e..12c305756184 100644
--- a/llvm/include/llvm/TargetParser/Triple.h
+++ b/llvm/include/llvm/TargetParser/Triple.h
@@ -185,6 +185,7 @@ public:
Apple,
PC,
+ Rosa,
SCEI,
Freescale,
IBM,
diff --git a/llvm/lib/TargetParser/Triple.cpp b/llvm/lib/TargetParser/Triple.cpp
index 0584c941d2e6..e4d6ef963cc7 100644
--- a/llvm/lib/TargetParser/Triple.cpp
+++ b/llvm/lib/TargetParser/Triple.cpp
@@ -269,6 +269,7 @@ StringRef Triple::getVendorTypeName(VendorType Kind) {
case NVIDIA: return "nvidia";
case OpenEmbedded: return "oe";
case PC: return "pc";
+ case Rosa: return "rosa";
case SCEI: return "scei";
case SUSE: return "suse";
}
@@ -669,6 +670,7 @@ static Triple::VendorType parseVendor(StringRef VendorName) {
.Case("suse", Triple::SUSE)
.Case("oe", Triple::OpenEmbedded)
.Case("intel", Triple::Intel)
+ .Case("rosa", Triple::Rosa)
.Default(Triple::UnknownVendor);
}
`},
{"xfail-broken-tests", `diff --git a/clang/test/Modules/timestamps.c b/clang/test/Modules/timestamps.c
index 50fdce630255..4b4465a75617 100644
--- a/clang/test/Modules/timestamps.c
@@ -358,66 +404,85 @@ index cdbf21fb9026..dd052858700d 100644
addSystemInclude(DriverArgs, CC1Args, ResourceDirInclude);
`},
{"path-system-libraries", `diff --git a/clang/lib/Driver/ToolChains/CommonArgs.cpp b/clang/lib/Driver/ToolChains/CommonArgs.cpp
index 8d3775de9be5..1e126e2d6f24 100644
--- a/clang/lib/Driver/ToolChains/CommonArgs.cpp
+++ b/clang/lib/Driver/ToolChains/CommonArgs.cpp
@@ -463,6 +463,15 @@ void tools::AddLinkerInputs(const ToolChain &TC, const InputInfoList &Inputs,
if (!TC.isCrossCompiling())
addDirectoryList(Args, CmdArgs, "-L", "LIBRARY_PATH");
+ const std::string RosaSuffix = "-rosa-linux-musl";
+ if (TC.getTripleString().size() > RosaSuffix.size() &&
+ std::equal(RosaSuffix.rbegin(), RosaSuffix.rend(), TC.getTripleString().rbegin())) {
+ CmdArgs.push_back("-rpath");
+ CmdArgs.push_back("/system/lib");
+ CmdArgs.push_back("-rpath");
+ CmdArgs.push_back(("/system/lib/" + TC.getTripleString()).c_str());
+ }
+
for (const auto &II : Inputs) {
// If the current tool chain refers to an OpenMP offloading host, we
// should ignore inputs that refer to OpenMP offloading devices -
diff --git a/clang/lib/Driver/ToolChains/Linux.cpp b/clang/lib/Driver/ToolChains/Linux.cpp
index 8ac8d4eb9181..795995bb53cb 100644
{"path-system-libraries", `diff --git a/clang/lib/Driver/ToolChains/Linux.cpp b/clang/lib/Driver/ToolChains/Linux.cpp
index 8ac8d4eb9181..f4d1347ab64d 100644
--- a/clang/lib/Driver/ToolChains/Linux.cpp
+++ b/clang/lib/Driver/ToolChains/Linux.cpp
@@ -324,6 +324,7 @@ Linux::Linux(const Driver &D, const llvm::Triple &Triple, const ArgList &Args)
@@ -282,6 +282,7 @@ Linux::Linux(const Driver &D, const llvm::Triple &Triple, const ArgList &Args)
const bool IsHexagon = Arch == llvm::Triple::hexagon;
const bool IsRISCV = Triple.isRISCV();
const bool IsCSKY = Triple.isCSKY();
+ const bool IsRosa = Triple.getVendor() == llvm::Triple::Rosa;
if (IsCSKY && !SelectedMultilibs.empty())
SysRoot = SysRoot + SelectedMultilibs.back().osSuffix();
@@ -318,12 +319,23 @@ Linux::Linux(const Driver &D, const llvm::Triple &Triple, const ArgList &Args)
const std::string OSLibDir = std::string(getOSLibDir(Triple, Args));
const std::string MultiarchTriple = getMultiarchTriple(D, Triple, SysRoot);
+ if (IsRosa) {
+ ExtraOpts.push_back("-rpath");
+ ExtraOpts.push_back("/system/lib");
+ ExtraOpts.push_back("-rpath");
+ ExtraOpts.push_back(concat("/system/lib", MultiarchTriple));
+ }
+
// mips32: Debian multilib, we use /libo32, while in other case, /lib is
// used. We need add both libo32 and /lib.
if (Arch == llvm::Triple::mips || Arch == llvm::Triple::mipsel) {
Generic_GCC::AddMultilibPaths(D, SysRoot, "libo32", MultiarchTriple, Paths);
addPathIfExists(D, concat(SysRoot, "/libo32"), Paths);
addPathIfExists(D, concat(SysRoot, "/usr/libo32"), Paths);
+ addPathIfExists(D, concat(SysRoot, "/system/libo32"), Paths);
- addPathIfExists(D, concat(SysRoot, "/libo32"), Paths);
- addPathIfExists(D, concat(SysRoot, "/usr/libo32"), Paths);
+ if (!IsRosa) {
+ addPathIfExists(D, concat(SysRoot, "/libo32"), Paths);
+ addPathIfExists(D, concat(SysRoot, "/usr/libo32"), Paths);
+ } else {
+ addPathIfExists(D, concat(SysRoot, "/system/libo32"), Paths);
+ }
}
Generic_GCC::AddMultilibPaths(D, SysRoot, OSLibDir, MultiarchTriple, Paths);
@@ -343,16 +344,20 @@ Linux::Linux(const Driver &D, const llvm::Triple &Triple, const ArgList &Args)
@@ -341,18 +353,30 @@ Linux::Linux(const Driver &D, const llvm::Triple &Triple, const ArgList &Args)
Paths);
}
addPathIfExists(D, concat(SysRoot, "/usr/lib", MultiarchTriple), Paths);
addPathIfExists(D, concat(SysRoot, "/usr", OSLibDir), Paths);
+ addPathIfExists(D, concat(SysRoot, "/system/lib", MultiarchTriple), Paths);
+ addPathIfExists(D, concat(SysRoot, "/system", OSLibDir), Paths);
- addPathIfExists(D, concat(SysRoot, "/usr/lib", MultiarchTriple), Paths);
- addPathIfExists(D, concat(SysRoot, "/usr", OSLibDir), Paths);
+ if (!IsRosa) {
+ addPathIfExists(D, concat(SysRoot, "/usr/lib", MultiarchTriple), Paths);
+ addPathIfExists(D, concat(SysRoot, "/usr", OSLibDir), Paths);
+ } else {
+ addPathIfExists(D, concat(SysRoot, "/system/lib", MultiarchTriple), Paths);
+ addPathIfExists(D, concat(SysRoot, "/system", OSLibDir), Paths);
+ }
if (IsRISCV) {
StringRef ABIName = tools::riscv::getRISCVABI(Args, Triple);
addPathIfExists(D, concat(SysRoot, "/", OSLibDir, ABIName), Paths);
addPathIfExists(D, concat(SysRoot, "/usr", OSLibDir, ABIName), Paths);
+ addPathIfExists(D, concat(SysRoot, "/system", OSLibDir, ABIName), Paths);
- addPathIfExists(D, concat(SysRoot, "/usr", OSLibDir, ABIName), Paths);
+ if (!IsRosa)
+ addPathIfExists(D, concat(SysRoot, "/usr", OSLibDir, ABIName), Paths);
+ else
+ addPathIfExists(D, concat(SysRoot, "/system", OSLibDir, ABIName), Paths);
}
Generic_GCC::AddMultiarchPaths(D, SysRoot, OSLibDir, Paths);
addPathIfExists(D, concat(SysRoot, "/lib"), Paths);
addPathIfExists(D, concat(SysRoot, "/usr/lib"), Paths);
+ addPathIfExists(D, concat(SysRoot, "/system/lib"), Paths);
- addPathIfExists(D, concat(SysRoot, "/lib"), Paths);
- addPathIfExists(D, concat(SysRoot, "/usr/lib"), Paths);
+ if (!IsRosa) {
+ addPathIfExists(D, concat(SysRoot, "/lib"), Paths);
+ addPathIfExists(D, concat(SysRoot, "/usr/lib"), Paths);
+ } else {
+ addPathIfExists(D, concat(SysRoot, "/system/lib"), Paths);
+ }
}
ToolChain::RuntimeLibType Linux::GetDefaultRuntimeLibType() const {
@@ -457,6 +462,11 @@ std::string Linux::getDynamicLinker(const ArgList &Args) const {
@@ -457,6 +481,9 @@ std::string Linux::getDynamicLinker(const ArgList &Args) const {
return Triple.isArch64Bit() ? "/system/bin/linker64" : "/system/bin/linker";
}
if (Triple.isMusl()) {
+ const std::string RosaSuffix = "-rosa-linux-musl";
+ if (Triple.str().size() > RosaSuffix.size() &&
+ std::equal(RosaSuffix.rbegin(), RosaSuffix.rend(), Triple.str().rbegin()))
+ if (Triple.getVendor() == llvm::Triple::Rosa)
+ return "/system/bin/linker";
+
std::string ArchName;

internal/rosa/make.go Normal file

@@ -0,0 +1,162 @@
package rosa
import (
"slices"
"strings"
"hakurei.app/internal/pkg"
)
func (t Toolchain) newMake() pkg.Artifact {
const (
version = "4.4.1"
checksum = "YS_B07ZcAy9PbaK5_vKGj64SrxO2VMpnMKfc9I0Q9IC1rn0RwOH7802pJoj2Mq4a"
)
return t.New("make-"+version, TEarly, nil, nil, nil, `
cd "$(mktemp -d)"
/usr/src/make/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}" \
--disable-dependency-tracking
./build.sh
./make DESTDIR=/work install check
`, pkg.Path(AbsUsrSrc.Append("make"), false, pkg.NewHTTPGetTar(
nil, "https://ftpmirror.gnu.org/gnu/make/make-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
}
func init() { artifactsF[Make] = Toolchain.newMake }
// MakeAttr holds the project-specific attributes that will be applied to a new
// [pkg.Artifact] compiled via [Make].
type MakeAttr struct {
// Mount the source tree writable.
Writable bool
// Do not include default extras.
OmitDefaults bool
// Dependencies not provided by stage3.
NonStage3 []pkg.Artifact
// Additional environment variables.
Env []string
// Runs before configure.
ScriptEarly string
// Runs after configure.
ScriptConfigured string
// Runs after install.
Script string
// Remain in working directory set up during ScriptEarly.
InPlace bool
// Flags passed to the configure script.
Configure [][2]string
// Extra make targets.
Make []string
// Build triple passed to configure via --build, zero value is equivalent to the Rosa OS triple.
Build string
// Whether to skip the check target.
SkipCheck bool
// Name of the check target, zero value is equivalent to "check".
CheckName string
// Suffix appended to the source pathname.
SourceSuffix string
// Passed through to [Toolchain.New].
Flag int
}
// NewViaMake returns a [pkg.Artifact] for compiling and installing via [Make].
func (t Toolchain) NewViaMake(
name, version string,
source pkg.Artifact,
attr *MakeAttr,
extra ...pkg.Artifact,
) pkg.Artifact {
if name == "" || version == "" {
panic("names must be non-empty")
}
if attr == nil {
attr = new(MakeAttr)
}
build := `"${ROSA_TRIPLE}"`
if attr.Build != "" {
build = attr.Build
}
var configureFlags string
if len(attr.Configure) > 0 {
const sep = " \\\n\t"
configureFlags += sep + strings.Join(
slices.Collect(func(yield func(string) bool) {
for _, v := range attr.Configure {
s := v[0]
if v[1] == "" || (v[0] != "" &&
v[0][0] >= 'a' &&
v[0][0] <= 'z') {
s = "--" + s
}
if v[1] != "" {
s += "=" + v[1]
}
if !yield(s) {
return
}
}
}),
sep,
)
}
var buildFlag string
if attr.Build != `""` {
buildFlag = ` \
--build=` + build
}
makeTargets := make([]string, 1, 2+len(attr.Make))
if !attr.SkipCheck {
if attr.CheckName == "" {
makeTargets = append(makeTargets, "check")
} else {
makeTargets = append(makeTargets, attr.CheckName)
}
}
makeTargets = append(makeTargets, attr.Make...)
if len(makeTargets) == 1 {
makeTargets = nil
}
finalExtra := []pkg.Artifact{
t.Load(Make),
}
if !attr.OmitDefaults && attr.Flag&TEarly == 0 {
finalExtra = append(finalExtra,
t.Load(Gawk),
t.Load(Coreutils),
)
}
finalExtra = append(finalExtra, extra...)
scriptEarly := attr.ScriptEarly
if !attr.InPlace {
scriptEarly += "\ncd \"$(mktemp -d)\""
} else if scriptEarly == "" {
panic("cannot remain in root")
}
return t.New(name+"-"+version, attr.Flag, stage3Concat(t,
attr.NonStage3,
finalExtra...,
), nil, attr.Env, scriptEarly+`
/usr/src/`+name+`/configure \
--prefix=/system`+buildFlag+configureFlags+attr.ScriptConfigured+`
make "-j$(nproc)"`+strings.Join(makeTargets, " ")+`
make DESTDIR=/work install
`+attr.Script, pkg.Path(AbsUsrSrc.Append(
name+attr.SourceSuffix,
), attr.Writable, source))
}
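The Configure entries are rendered mechanically: a pair whose value is empty, or whose key starts with a lowercase letter, gains a leading "--", and a non-empty value is appended after "="; uppercase keys such as CFLAGS therefore pass through as plain variable assignments. A minimal sketch of a caller, using only helpers already shown in this change set (the project name, URL and checksum below are hypothetical placeholders):

func (t Toolchain) newExample() pkg.Artifact {
	const version = "1.0.0"
	return t.NewViaMake("example", version, pkg.NewHTTPGetTar(
		nil, "https://example.invalid/example-"+version+".tar.gz",
		mustDecode("placeholder-checksum"), // hypothetical digest
		pkg.TarGzip,
	), &MakeAttr{
		Configure: [][2]string{
			{"disable-shared"},  // rendered as --disable-shared
			{"CFLAGS", "'-O2'"}, // rendered as CFLAGS='-O2'
		},
		Flag: TEarly, // forwarded to Toolchain.New via attr.Flag
	})
}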


@@ -7,7 +7,7 @@ func (t Toolchain) newMeson() pkg.Artifact {
version = "1.10.1"
checksum = "w895BXF_icncnXatT_OLCFe2PYEtg4KrKooMgUYdN-nQVvbFX3PvYWHGEpogsHtd"
)
return t.New("meson-"+version, false, []pkg.Artifact{
return t.New("meson-"+version, 0, []pkg.Artifact{
t.Load(Python),
t.Load(Setuptools),
}, nil, nil, `

internal/rosa/mksh.go Normal file

@@ -0,0 +1,36 @@
package rosa
import "hakurei.app/internal/pkg"
func (t Toolchain) newMksh() pkg.Artifact {
const (
version = "59c"
checksum = "0Zj-k4nXEu3IuJY4lvwD2OrC2t27GdZj8SPy4DoaeuBRH1padWb7oREpYgwY8JNq"
)
return t.New("mksh-"+version, 0, stage3Concat(t, []pkg.Artifact{},
t.Load(Perl),
t.Load(Coreutils),
), nil, []string{
"LDSTATIC=-static",
"CPPFLAGS=-DMKSH_DEFAULT_PROFILEDIR=\\\"/system/etc\\\"",
}, `
cd "$(mktemp -d)"
sh /usr/src/mksh/Build.sh -r
CPPFLAGS="${CPPFLAGS} -DMKSH_BINSHPOSIX -DMKSH_BINSHREDUCED" \
sh /usr/src/mksh/Build.sh -r -L
./test.sh -C regress:no-ctty
mkdir -p /work/system/bin/
cp -v mksh /work/system/bin/
cp -v lksh /work/system/bin/sh
mkdir -p /work/bin/
ln -vs ../system/bin/sh /work/bin/
`, pkg.Path(AbsUsrSrc.Append("mksh"), false, pkg.NewHTTPGetTar(
nil,
"https://mbsd.evolvis.org/MirOS/dist/mir/mksh/mksh-R"+version+".tgz",
mustDecode(checksum),
pkg.TarGzip,
)))
}
func init() { artifactsF[Mksh] = Toolchain.newMksh }


@@ -30,7 +30,7 @@ func (t Toolchain) NewMusl(attr *MuslAttr) pkg.Artifact {
target := "install"
script := `
mkdir -p /work/system/bin
COMPAT_LINKER_NAME="ld-musl-$(uname -m).so.1"
COMPAT_LINKER_NAME="ld-musl-` + linuxArch() + `.so.1"
ln -vs ../lib/libc.so /work/system/bin/linker
ln -vs ../lib/libc.so /work/system/bin/ldd
ln -vs libc.so "/work/system/lib/${COMPAT_LINKER_NAME}"
@@ -42,23 +42,23 @@ rmdir -v /work/lib
script = ""
}
return t.New("musl-"+version, false, stage3Concat(t, attr.Extra,
return t.New("musl-"+version, 0, stage3Concat(t, attr.Extra,
t.Load(Make),
t.Load(Coreutils),
), nil, slices.Concat([]string{
"ROSA_MUSL_TARGET=" + target,
}, attr.Env), `
# expected to be writable in copies
chmod -R +w /usr/src/musl/
cd "$(mktemp -d)"
/usr/src/musl/configure \
--prefix=/system \
--target="${ROSA_TRIPLE}"
make "-j$(nproc)" DESTDIR=/work "${ROSA_MUSL_TARGET}"
`+script, pkg.Path(AbsUsrSrc.Append("musl"), true, pkg.NewHTTPGetTar(
nil,
"https://musl.libc.org/releases/musl-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
`+script, pkg.Path(AbsUsrSrc.Append("musl"), false, t.NewPatchedSource(
// expected to be writable in copies
"musl", version, pkg.NewHTTPGetTar(
nil, "https://musl.libc.org/releases/musl-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), false,
)))
}


@@ -7,29 +7,33 @@ func (t Toolchain) newNinja() pkg.Artifact {
version = "1.13.2"
checksum = "ygKWMa0YV2lWKiFro5hnL-vcKbc_-RACZuPu0Io8qDvgQlZ0dxv7hPNSFkt4214v"
)
return t.New("ninja-"+version, false, []pkg.Artifact{
return t.New("ninja-"+version, 0, []pkg.Artifact{
t.Load(CMake),
t.Load(Python),
t.Load(Bash),
}, nil, nil, `
chmod -R +w /usr/src/ninja/
mkdir -p /work/system/bin/ && cd /work/system/bin/
cd "$(mktemp -d)"
python3 /usr/src/ninja/configure.py \
--bootstrap \
--gtest-source-dir=/usr/src/googletest
./ninja all
./ninja_test
mkdir -p /work/system/bin/
cp ninja /work/system/bin/
`, pkg.Path(AbsUsrSrc.Append("googletest"), false,
pkg.NewHTTPGetTar(
nil, "https://github.com/google/googletest/releases/download/"+
"v1.16.0/googletest-1.16.0.tar.gz",
mustDecode("NjLGvSbgPy_B-y-o1hdanlzEzaYeStFcvFGxpYV3KYlhrWWFRcugYhM3ZMzOA9B_"),
pkg.TarGzip,
)), pkg.Path(AbsUsrSrc.Append("ninja"), true,
pkg.NewHTTPGetTar(
)), pkg.Path(AbsUsrSrc.Append("ninja"), true, t.NewPatchedSource(
"ninja", version, pkg.NewHTTPGetTar(
nil, "https://github.com/ninja-build/ninja/archive/refs/tags/"+
"v"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
), false,
)))
}
func init() { artifactsF[Ninja] = Toolchain.newNinja }

internal/rosa/openssl.go Normal file

@@ -0,0 +1,36 @@
package rosa
import "hakurei.app/internal/pkg"
func (t Toolchain) newOpenSSL() pkg.Artifact {
const (
version = "3.5.5"
checksum = "I2Hp1LxcTR8j4G6LFEQMVy6EJH-Na1byI9Ti-ThBot6EMLNRnjGXGq-WXrim3Fkz"
)
return t.New("openssl-"+version, 0, []pkg.Artifact{
t.Load(Perl),
t.Load(Make),
t.Load(Zlib),
t.Load(KernelHeaders),
}, nil, []string{
"CC=cc",
}, `
cd "$(mktemp -d)"
/usr/src/openssl/Configure \
--prefix=/system \
--libdir=lib \
--openssldir=etc/ssl
make \
"-j$(nproc)" \
HARNESS_JOBS=256 \
test
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("openssl"), false, pkg.NewHTTPGetTar(
nil, "https://github.com/openssl/openssl/releases/download/"+
"openssl-"+version+"/openssl-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
}
func init() { artifactsF[OpenSSL] = Toolchain.newOpenSSL }


@@ -7,12 +7,13 @@ func (t Toolchain) newPerl() pkg.Artifact {
version = "5.42.0"
checksum = "2KR7Jbpk-ZVn1a30LQRwbgUvg2AXlPQZfzrqCr31qD5-yEsTwVQ_W76eZH-EdxM9"
)
return t.New("perl-"+version, false, []pkg.Artifact{
return t.New("perl-"+version, TEarly, []pkg.Artifact{
t.Load(Make),
}, nil, nil, `
chmod -R +w /usr/src/perl && cd /usr/src/perl
cd /usr/src/perl
echo 'print STDOUT "1..0 # Skip broken test\n";' > ext/Pod-Html/t/htmldir3.t
rm -f /system/bin/ps # perl does not like toybox ps
./Configure \
-des \
@@ -27,11 +28,12 @@ make \
TEST_JOBS=256 \
test_harness
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("perl"), true, pkg.NewHTTPGetTar(
nil,
"https://www.cpan.org/src/5.0/perl-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
`, pkg.Path(AbsUsrSrc.Append("perl"), true, t.NewPatchedSource(
"perl", version, pkg.NewHTTPGetTar(
nil, "https://www.cpan.org/src/5.0/perl-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), false,
)))
}
func init() { artifactsF[Perl] = Toolchain.newPerl }


@@ -7,23 +7,17 @@ func (t Toolchain) newPkgConfig() pkg.Artifact {
version = "0.29.2"
checksum = "gi7yAvkwo20Inys1tHbeYZ3Wjdm5VPkrnO0Q6_QZPCAwa1zrA8F4a63cdZDd-717"
)
return t.New("pkg-config-"+version, false, []pkg.Artifact{
t.Load(Make),
}, nil, nil, `
cd "$(mktemp -d)"
/usr/src/pkg-config/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}" \
CFLAGS='-Wno-int-conversion' \
--with-internal-glib
make "-j$(nproc)" check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("pkg-config"), true, pkg.NewHTTPGetTar(
return t.NewViaMake("pkg-config", version, pkg.NewHTTPGetTar(
nil,
"https://pkgconfig.freedesktop.org/releases/"+
"pkg-config-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
), &MakeAttr{
Configure: [][2]string{
{"CFLAGS", "'-Wno-int-conversion'"},
{"with-internal-glib"},
},
})
}
func init() { artifactsF[PkgConfig] = Toolchain.newPkgConfig }


@@ -1,6 +1,7 @@
package rosa
import (
"slices"
"strings"
"hakurei.app/internal/pkg"
@@ -11,65 +12,81 @@ func (t Toolchain) newPython() pkg.Artifact {
version = "3.14.2"
checksum = "7nZunVMGj0viB-CnxpcRego2C90X5wFsMTgsoewd5z-KSZY2zLuqaBwG-14zmKys"
)
skipTests := []string{
// requires internet access (http://www.pythontest.net/)
"test_asyncio",
"test_socket",
"test_urllib2",
"test_urllibnet",
"test_urllib2net",
return t.NewViaMake("python", version, t.NewPatchedSource("python", version, pkg.NewHTTPGetTar(
nil, "https://www.python.org/ftp/python/"+version+
"/Python-"+version+".tgz",
mustDecode(checksum),
pkg.TarGzip,
), false), &MakeAttr{
// test_synopsis_sourceless assumes this is writable and checks __pycache__
Writable: true,
// makes assumptions about uid_map/gid_map
"test_os",
"test_subprocess",
Env: []string{
"EXTRATESTOPTS=-j0 -x " + strings.Join([]string{
// requires internet access (http://www.pythontest.net/)
"test_asyncio",
"test_socket",
"test_urllib2",
"test_urllibnet",
"test_urllib2net",
// somehow picks up mtime of source code
"test_zipfile",
// makes assumptions about uid_map/gid_map
"test_os",
"test_subprocess",
// requires gcc
"test_ctypes",
// somehow picks up mtime of source code
"test_zipfile",
// breaks on llvm
"test_dbm_gnu",
}
return t.New("python-"+version, false, []pkg.Artifact{
t.Load(Make),
// requires gcc
"test_ctypes",
// breaks on llvm
"test_dbm_gnu",
}, " -x "),
// _ctypes appears to infer something from the linker name
"LDFLAGS=-Wl,--dynamic-linker=/system/lib/" +
"ld-musl-" + linuxArch() + ".so.1",
},
ScriptEarly: `
export HOME="$(mktemp -d)"
`,
CheckName: "test",
},
t.Load(Zlib),
t.Load(Libffi),
}, nil, []string{
"EXTRATESTOPTS=-j0 -x " + strings.Join(skipTests, " -x "),
// _ctypes appears to infer something from the linker name
"LDFLAGS=-Wl,--dynamic-linker=/system/lib/" +
"ld-musl-" + linuxArch() + ".so.1",
}, `
# test_synopsis_sourceless assumes this is writable and checks __pycache__
chmod -R +w /usr/src/python/
export HOME="$(mktemp -d)"
cd "$(mktemp -d)"
/usr/src/python/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}"
make "-j$(nproc)"
make test
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("python"), true,
pkg.NewHTTPGetTar(
nil, "https://www.python.org/ftp/python/"+version+
"/Python-"+version+".tgz",
mustDecode(checksum),
pkg.TarGzip,
)))
)
}
func init() { artifactsF[Python] = Toolchain.newPython }
// newViaPip is a helper for installing python dependencies via pip.
func (t Toolchain) newViaPip(
name, version, abi, platform, checksum, prefix string,
extra ...pkg.Artifact,
) pkg.Artifact {
wname := name + "-" + version + "-py3-" + abi + "-" + platform + ".whl"
return t.New(name+"-"+version, 0, slices.Concat([]pkg.Artifact{
t.Load(Python),
}, extra), nil, nil, `
pip3 install \
--no-index \
--prefix=/system \
--root=/work \
/usr/src/`+wname+`
`, pkg.Path(AbsUsrSrc.Append(wname), false, pkg.NewHTTPGet(
nil, prefix+wname,
mustDecode(checksum),
)))
}
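The wheel filename is reassembled from the arguments, so each registration only needs to supply a directory prefix; as a worked example matching the pygments entry below:

// wname for ("pygments", "2.19.2", "none", "any") works out to
wname := "pygments" + "-" + "2.19.2" + "-py3-" + "none" + "-" + "any" + ".whl"
// wname == "pygments-2.19.2-py3-none-any.whl"; the artifact downloads
// prefix+wname and installs it with pip3 install --no-index.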
func (t Toolchain) newSetuptools() pkg.Artifact {
const (
version = "80.10.1"
checksum = "p3rlwEmy1krcUH1KabprQz1TCYjJ8ZUjOQknQsWh3q-XEqLGEd3P4VrCc7ouHGXU"
)
return t.New("setuptools-"+version, false, []pkg.Artifact{
return t.New("setuptools-"+version, 0, []pkg.Artifact{
t.Load(Python),
}, nil, nil, `
pip3 install \
@@ -85,3 +102,50 @@ pip3 install \
)))
}
func init() { artifactsF[Setuptools] = Toolchain.newSetuptools }
func (t Toolchain) newPygments() pkg.Artifact {
return t.newViaPip("pygments", "2.19.2", "none", "any",
"ak_lwTalmSr7W4Mjy2XBZPG9I6a0gwSy2pS87N8x4QEuZYif0ie9z0OcfRfi9msd",
"https://files.pythonhosted.org/packages/"+
"c7/21/705964c7812476f378728bdf590ca4b771ec72385c533964653c68e86bdc/")
}
func init() { artifactsF[Pygments] = Toolchain.newPygments }
func (t Toolchain) newPluggy() pkg.Artifact {
return t.newViaPip("pluggy", "1.6.0", "none", "any",
"2HWYBaEwM66-y1hSUcWI1MyE7dVVuNNRW24XD6iJBey4YaUdAK8WeXdtFMQGC-4J",
"https://files.pythonhosted.org/packages/"+
"54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/")
}
func init() { artifactsF[Pluggy] = Toolchain.newPluggy }
func (t Toolchain) newPackaging() pkg.Artifact {
return t.newViaPip("packaging", "26.0", "none", "any",
"iVVXcqdwHDskPKoCFUlh2x8J0Gyq-bhO4ns9DvUJ7oJjeOegRYtSIvLV33Bki-pP",
"https://files.pythonhosted.org/packages/"+
"b7/b9/c538f279a4e237a006a2c98387d081e9eb060d203d8ed34467cc0f0b9b53/")
}
func init() { artifactsF[Packaging] = Toolchain.newPackaging }
func (t Toolchain) newIniConfig() pkg.Artifact {
const version = "2.3.0"
return t.newViaPip("iniconfig", version, "none", "any",
"SDgs4S5bXi77aVOeKTPv2TUrS3M9rduiK4DpU0hCmDsSBWqnZcWInq9lsx6INxut",
"https://github.com/pytest-dev/iniconfig/releases/download/"+
"v"+version+"/")
}
func init() { artifactsF[IniConfig] = Toolchain.newIniConfig }
func (t Toolchain) newPyTest() pkg.Artifact {
const version = "9.0.2"
return t.newViaPip("pytest", version, "none", "any",
"IM2wDbLke1EtZhF92zvAjUl_Hms1uKDtM7U8Dt4acOaChMnDg1pW7ib8U0wYGDLH",
"https://github.com/pytest-dev/pytest/releases/download/"+
version+"/",
t.Load(IniConfig),
t.Load(Packaging),
t.Load(Pluggy),
t.Load(Pygments),
)
}
func init() { artifactsF[PyTest] = Toolchain.newPyTest }


@@ -49,6 +49,8 @@ func linuxArch() string {
switch runtime.GOARCH {
case "amd64":
return "x86_64"
case "arm64":
return "aarch64"
default:
panic("unsupported target " + runtime.GOARCH)
@@ -173,10 +175,17 @@ func fixupEnviron(env, extras []string, paths ...string) []string {
// build script under.
var absCureScript = fhs.AbsUsrBin.Append(".cure-script")
const (
// TExclusive denotes an exclusive [pkg.Artifact].
TExclusive = 1 << iota
// TEarly hints for an early variant of [Toybox] to be used when available.
TEarly
)
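Both constants are plain bit flags: callers combine them with bitwise OR, and New tests individual bits with & (flag&TExclusive for pkg.NewExec, flag&TEarly to select the early toybox variant). A minimal sketch of a direct call, with a placeholder name and script that are not part of this change:

// an exclusive artifact built with the early toybox variant
a := t.New("example-1.0", TExclusive|TEarly, nil, nil, nil, `
true
`)
_ = a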
// New returns a [pkg.Artifact] compiled on this toolchain.
func (t Toolchain) New(
name string,
exclusive bool,
flag int,
extra []pkg.Artifact,
checksum *pkg.Checksum,
env []string,
@@ -187,36 +196,42 @@ func (t Toolchain) New(
const lcMessages = "LC_MESSAGES=C.UTF-8"
var (
path = AbsSystem.Append("bin", "busybox")
args = []string{"hush", absCureScript.String()}
path = AbsSystem.Append("bin", "sh")
args = []string{"sh", absCureScript.String()}
support []pkg.Artifact
)
switch t {
case toolchainBusybox:
name += "-early"
support = slices.Concat([]pkg.Artifact{newBusyboxBin()}, extra)
path = AbsSystem.Append("bin", "busybox")
args[0] = "hush"
env = fixupEnviron(env, nil, "/system/bin")
case toolchainStage3:
name += "-boot"
const (
version = "20260111T160052Z"
checksum = "c5_FwMnRN8RZpTdBLGYkL4RR8ampdaZN2JbkgrFLe8-QHQAVQy08APVvIL6eT7KW"
)
var seed string
switch runtime.GOARCH {
case "amd64":
seed = "c5_FwMnRN8RZpTdBLGYkL4RR8ampdaZN2JbkgrFLe8-QHQAVQy08APVvIL6eT7KW"
case "arm64":
seed = "79uRbRI44PyknQQ9RlFUQrwqplup7vImiIk6klefL8TN-fT42TXMS_v4XszwexCb"
default:
panic("unsupported target " + runtime.GOARCH)
}
path = fhs.AbsRoot.Append("bin", "bash")
args[0] = "bash"
support = slices.Concat([]pkg.Artifact{
cureEtc{},
toolchainBusybox.New("stage3-"+version, false, nil, nil, nil, `
toolchainBusybox.New("stage3", 0, nil, nil, nil, `
tar -C /work -xf /usr/src/stage3.tar.xz
rm -rf /work/dev/ /work/proc/
ln -vs ../usr/bin /work/bin
`, pkg.Path(AbsUsrSrc.Append("stage3.tar.xz"), false,
pkg.NewHTTPGet(
nil, "https://distfiles.gentoo.org/releases/"+
runtime.GOARCH+"/autobuilds/"+version+
"/stage3-"+runtime.GOARCH+"-musl-llvm-"+version+".tar.xz",
mustDecode(checksum),
nil, "https://basement.gensokyo.uk/seed/"+seed,
mustDecode(seed),
),
)),
}, extra)
@@ -236,13 +251,18 @@ ln -vs ../usr/bin /work/bin
boot := t - 1
musl, compilerRT, runtimes, clang := boot.NewLLVM()
toybox := Toybox
if flag&TEarly != 0 {
toybox = toyboxEarly
}
support = slices.Concat(extra, []pkg.Artifact{
cureEtc{newIANAEtc()},
musl,
compilerRT,
runtimes,
clang,
boot.Load(Busybox),
boot.Load(Mksh),
boot.Load(toybox),
})
env = fixupEnviron(env, []string{
EnvTriplet + "=" + triplet(),
@@ -259,7 +279,7 @@ ln -vs ../usr/bin /work/bin
}
return pkg.NewExec(
name, checksum, pkg.ExecTimeoutMax, exclusive,
name, checksum, pkg.ExecTimeoutMax, flag&TExclusive != 0,
fhs.AbsRoot, env,
path, args,
@@ -308,7 +328,7 @@ cat /usr/src/` + name + `-patches/* | \
`
aname += "-patched"
}
return t.New(aname, false, stage3Concat(t, []pkg.Artifact{},
return t.New(aname, 0, stage3Concat(t, []pkg.Artifact{},
t.Load(Patch),
), nil, nil, script, paths...)
}
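For reference, the call shape of NewPatchedSource as used by the musl, ninja and perl artifacts above and the nss and unzip artifacts below; a minimal sketch with placeholder name, URL and checksum:

src := t.NewPatchedSource("example", "1.0", pkg.NewHTTPGetTar(
	nil, "https://example.invalid/example-1.0.tar.gz",
	mustDecode("placeholder-checksum"), // hypothetical digest
	pkg.TarGzip,
), false)
// the patched tree is then mounted like any other source
p := pkg.Path(AbsUsrSrc.Append("example"), false, src)
_ = p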


@@ -7,23 +7,25 @@ func (t Toolchain) newRsync() pkg.Artifact {
version = "3.4.1"
checksum = "VBlTsBWd9z3r2-ex7GkWeWxkUc5OrlgDzikAC0pK7ufTjAJ0MbmC_N04oSVTGPiv"
)
return t.New("rsync-"+version, false, []pkg.Artifact{
t.Load(Make),
}, nil, nil, `
cd "$(mktemp -d)"
/usr/src/rsync/configure --prefix=/system \
--build="${ROSA_TRIPLE}" \
--disable-openssl \
--disable-xxhash \
--disable-zstd \
--disable-lz4
make "-j${nproc}"
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("rsync"), false, pkg.NewHTTPGetTar(
nil,
"https://download.samba.org/pub/rsync/src/rsync-"+version+".tar.gz",
return t.NewViaMake("rsync", version, pkg.NewHTTPGetTar(
nil, "https://download.samba.org/pub/rsync/src/"+
"rsync-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
), &MakeAttr{
Configure: [][2]string{
{"disable-openssl"},
{"disable-xxhash"},
{"disable-zstd"},
{"disable-lz4"},
},
// circular dependency
SkipCheck: true,
Flag: TEarly,
},
t.Load(Gawk),
)
}
func init() { artifactsF[Rsync] = Toolchain.newRsync }

internal/rosa/ssl.go Normal file

@@ -0,0 +1,84 @@
package rosa
import (
"hakurei.app/internal/pkg"
)
func (t Toolchain) newNSS() pkg.Artifact {
const (
version = "3_120"
checksum = "9M0SNMrj9BJp6RH2rQnMm6bZWtP0Kgj64D5JNPHF7Cxr2_8kfy3msubIcvEPwC35"
version0 = "4_38_2"
checksum0 = "25x2uJeQnOHIiq_zj17b4sYqKgeoU8-IsySUptoPcdHZ52PohFZfGuIisBreWzx0"
)
return t.New("nss-"+version, 0, []pkg.Artifact{
t.Load(Perl),
t.Load(Python),
t.Load(Unzip),
t.Load(Make),
t.Load(Gawk),
t.Load(Coreutils),
t.Load(Zlib),
t.Load(KernelHeaders),
}, nil, nil, `
unzip /usr/src/nspr.zip -d /usr/src
mv '/usr/src/nspr-NSPR_`+version0+`_RTM' /usr/src/nspr
cd /usr/src/nss
make \
"-j$(nproc)" \
CCC="clang++" \
NSDISTMODE=copy \
BUILD_OPT=1 \
USE_64=1 \
nss_build_all
mkdir -p /work/system/nss
cp -r \
/usr/src/dist/. \
lib/ckfw/builtins/certdata.txt \
/work/system/nss
`, pkg.Path(AbsUsrSrc.Append("nss"), true, t.NewPatchedSource(
"nss", version, pkg.NewHTTPGetTar(
nil, "https://github.com/nss-dev/nss/archive/refs/tags/"+
"NSS_"+version+"_RTM.tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), false,
)), pkg.Path(AbsUsrSrc.Append("nspr.zip"), false, pkg.NewHTTPGet(
nil, "https://hg-edge.mozilla.org/projects/nspr/archive/"+
"NSPR_"+version0+"_RTM.zip",
mustDecode(checksum0),
)))
}
func init() { artifactsF[NSS] = Toolchain.newNSS }
func (t Toolchain) newBuildCATrust() pkg.Artifact {
const version = "0.4.0"
return t.newViaPip("buildcatrust", version, "none", "any",
"k_FGzkRCLjbTWBkuBLzQJ1S8FPAz19neJZlMHm0t10F2Y0hElmvVwdSBRc03Rjo1",
"https://github.com/nix-community/buildcatrust/"+
"releases/download/v"+version+"/")
}
func init() { artifactsF[buildcatrust] = Toolchain.newBuildCATrust }
func (t Toolchain) newNSSCACert() pkg.Artifact {
return t.New("nss-cacert", 0, []pkg.Artifact{
t.Load(Bash),
t.Load(Python),
t.Load(NSS),
t.Load(buildcatrust),
}, nil, nil, `
mkdir -p /work/system/etc/ssl/{certs/unbundled,certs/hashed,trust-source}
buildcatrust \
--certdata_input /system/nss/certdata.txt \
--ca_bundle_output /work/system/etc/ssl/certs/ca-bundle.crt \
--ca_standard_bundle_output /work/system/etc/ssl/certs/ca-no-trust-rules-bundle.crt \
--ca_unpacked_output /work/system/etc/ssl/certs/unbundled \
--ca_hashed_unpacked_output /work/system/etc/ssl/certs/hashed \
--p11kit_output /work/system/etc/ssl/trust-source/ca-bundle.trust.p11-kit
`)
}
func init() { artifactsF[NSSCACert] = Toolchain.newNSSCACert }

internal/rosa/toybox.go Normal file

@@ -0,0 +1,64 @@
package rosa
import "hakurei.app/internal/pkg"
func (t Toolchain) newToybox(suffix, script string) pkg.Artifact {
const (
version = "0.8.13"
checksum = "rZ1V1ATDte2WeQZanxLVoiRGdfPXhMlEo5-exX-e-ml8cGn9qOv0ABEUVZpX3wTI"
)
return t.New("toybox-"+version+suffix, TEarly, stage3Concat(t, []pkg.Artifact{},
t.Load(Make),
t.Load(Bash),
t.Load(Gzip),
t.Load(KernelHeaders),
), nil, stage3Concat(t, []string{},
"ROSA_CHECK=make USER=cure tests",
), `
ln -s ../system/bin/bash /bin/ || true
cd /usr/src/toybox
chmod +w kconfig tests
rm \
tests/du.test \
tests/sed.test \
tests/tar.test \
tests/ls.test \
tests/taskset.test
make defconfig
sed -i \
's/^CONFIG_TOYBOX_ZHELP=y$/CONFIG_TOYBOX_ZHELP=0/' \
.config
`+script+`
make \
"-j$(nproc)" \
LDFLAGS="${LDFLAGS} -static"
${ROSA_CHECK}
PREFIX=/work/system/bin make install_flat
mkdir -p /work/usr/bin
ln -s ../../system/bin/env /work/usr/bin
`, pkg.Path(AbsUsrSrc.Append("toybox"), true, pkg.NewHTTPGetTar(
nil,
"https://landley.net/toybox/downloads/toybox-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
}
func init() {
artifactsF[Toybox] = func(t Toolchain) pkg.Artifact {
return t.newToybox("", "")
}
artifactsF[toyboxEarly] = func(t Toolchain) pkg.Artifact {
return t.newToybox("-early", `
echo '
CONFIG_EXPR=y
CONFIG_TR=y
CONFIG_AWK=y
CONFIG_DIFF=y
' >> .config
`)
}
}

internal/rosa/unzip.go Normal file

@@ -0,0 +1,34 @@
package rosa
import (
"strings"
"hakurei.app/internal/pkg"
)
func (t Toolchain) newUnzip() pkg.Artifact {
const (
version = "6.0"
checksum = "fcqjB1IOVRNJ16K5gTGEDt3zCJDVBc7EDSra9w3H93stqkNwH1vaPQs_QGOpQZu1"
)
return t.New("unzip-"+version, 0, []pkg.Artifact{
t.Load(Make),
t.Load(Coreutils),
}, nil, nil, `
cd /usr/src/unzip/
unix/configure
make -f unix/Makefile generic1
mkdir -p /work/system/bin/
mv unzip /work/system/bin/
`, pkg.Path(AbsUsrSrc.Append("unzip"), true, t.NewPatchedSource(
"unzip", version, pkg.NewHTTPGetTar(
nil, "https://downloads.sourceforge.net/project/infozip/"+
"UnZip%206.x%20%28latest%29/UnZip%20"+version+"/"+
"unzip"+strings.ReplaceAll(version, ".", "")+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), false,
)))
}
func init() { artifactsF[Unzip] = Toolchain.newUnzip }


@@ -7,12 +7,14 @@ func (t Toolchain) newWayland() pkg.Artifact {
version = "1.24.0"
checksum = "JxgLiFRRGw2D3uhVw8ZeDbs3V7K_d4z_ypDog2LBqiA_5y2vVbUAk5NT6D5ozm0m"
)
return t.New("wayland-"+version, false, []pkg.Artifact{
return t.New("wayland-"+version, 0, []pkg.Artifact{
t.Load(Python),
t.Load(Meson),
t.Load(PkgConfig),
t.Load(CMake),
t.Load(Ninja),
t.Load(Gawk),
t.Load(Diffutils),
t.Load(Libffi),
t.Load(Libexpat),
@@ -50,7 +52,7 @@ func (t Toolchain) newWaylandProtocols() pkg.Artifact {
version = "1.47"
checksum = "B_NodZ7AQfCstcx7kgbaVjpkYOzbAQq0a4NOk-SA8bQixAE20FY3p1-6gsbPgHn9"
)
return t.New("wayland-protocols-"+version, false, []pkg.Artifact{
return t.New("wayland-protocols-"+version, 0, []pkg.Artifact{
t.Load(Python),
t.Load(Meson),
t.Load(PkgConfig),


@@ -2,26 +2,47 @@ package rosa
import "hakurei.app/internal/pkg"
func (t Toolchain) newUtilMacros() pkg.Artifact {
const (
version = "1.17"
checksum = "vYPO4Qq3B_WGcsBjG0-lfwZ6DZ7ayyrOLqfDrVOgTDcyLChuMGOAAVAa_UXLu5tD"
)
return t.NewViaMake("util-macros", version, pkg.NewHTTPGetTar(
nil, "https://www.x.org/releases/X11R7.7/src/util/"+
"util-macros-"+version+".tar.bz2",
mustDecode(checksum),
pkg.TarBzip2,
), nil)
}
func init() { artifactsF[utilMacros] = Toolchain.newUtilMacros }
func (t Toolchain) newXproto() pkg.Artifact {
const (
version = "7.0.23"
checksum = "goxwWxV0jZ_3pNczXFltZWHAhq92x-aEreUGyp5Ns8dBOoOmgbpeNIu1nv0Zx07z"
)
return t.New("xproto-"+version, false, []pkg.Artifact{
t.Load(Make),
t.Load(PkgConfig),
}, nil, nil, `
cd "$(mktemp -d)"
/usr/src/xproto/configure \
--prefix=/system \
--enable-static
make "-j$(nproc)" check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("xproto"), true, pkg.NewHTTPGetTar(
nil, "https://www.x.org/releases/X11R7.7/src/proto/xproto-"+version+".tar.bz2",
return t.NewViaMake("xproto", version, pkg.NewHTTPGetTar(
nil, "https://www.x.org/releases/X11R7.7/src/proto/"+
"xproto-"+version+".tar.bz2",
mustDecode(checksum),
pkg.TarBzip2,
)))
), &MakeAttr{
Writable: true,
// ancient configure script
ScriptEarly: `
cd /usr/src/xproto
autoreconf -if
`,
},
t.Load(M4),
t.Load(Perl),
t.Load(Autoconf),
t.Load(Automake),
t.Load(PkgConfig),
t.Load(utilMacros),
)
}
func init() { artifactsF[Xproto] = Toolchain.newXproto }
@@ -30,23 +51,33 @@ func (t Toolchain) newLibXau() pkg.Artifact {
version = "1.0.7"
checksum = "bm768RoZZnHRe9VjNU1Dw3BhfE60DyS9D_bgSR-JLkEEyUWT_Hb_lQripxrXto8j"
)
return t.New("libXau-"+version, false, []pkg.Artifact{
t.Load(Make),
t.Load(PkgConfig),
t.Load(Xproto),
}, nil, nil, `
cd "$(mktemp -d)"
/usr/src/libXau/configure \
--prefix=/system \
--enable-static
make "-j$(nproc)" check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("libXau"), true, pkg.NewHTTPGetTar(
return t.NewViaMake("libXau", version, pkg.NewHTTPGetTar(
nil, "https://www.x.org/releases/X11R7.7/src/lib/"+
"libXau-"+version+".tar.bz2",
mustDecode(checksum),
pkg.TarBzip2,
)))
), &MakeAttr{
Writable: true,
// ancient configure script
ScriptEarly: `
cd /usr/src/libXau
autoreconf -if
`,
Configure: [][2]string{
{"enable-static"},
},
},
t.Load(M4),
t.Load(Perl),
t.Load(Autoconf),
t.Load(Automake),
t.Load(Libtool),
t.Load(PkgConfig),
t.Load(utilMacros),
t.Load(Xproto),
)
}
func init() { artifactsF[LibXau] = Toolchain.newLibXau }


@@ -7,22 +7,17 @@ func (t Toolchain) newXCBProto() pkg.Artifact {
version = "1.17.0"
checksum = "_NtbKaJ_iyT7XiJz25mXQ7y-niTzE8sHPvLXZPcqtNoV_-vTzqkezJ8Hp2U1enCv"
)
return t.New("xcb-proto-"+version, false, []pkg.Artifact{
t.Load(Make),
t.Load(Python),
}, nil, nil, `
cd "$(mktemp -d)"
/usr/src/xcb-proto/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}" \
--enable-static
make "-j$(nproc)" check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("xcb-proto"), true, pkg.NewHTTPGetTar(
return t.NewViaMake("xcb-proto", version, pkg.NewHTTPGetTar(
nil, "https://xcb.freedesktop.org/dist/xcb-proto-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
), &MakeAttr{
Configure: [][2]string{
{"enable-static"},
},
},
t.Load(Python),
)
}
func init() { artifactsF[XCBProto] = Toolchain.newXCBProto }
@@ -31,26 +26,21 @@ func (t Toolchain) newXCB() pkg.Artifact {
version = "1.17.0"
checksum = "hjjsc79LpWM_hZjNWbDDS6qRQUXREjjekS6UbUsDq-RR1_AjgNDxhRvZf-1_kzDd"
)
return t.New("xcb-"+version, false, []pkg.Artifact{
t.Load(Make),
return t.NewViaMake("xcb", version, pkg.NewHTTPGetTar(
nil, "https://xcb.freedesktop.org/dist/libxcb-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), &MakeAttr{
Configure: [][2]string{
{"enable-static"},
},
},
t.Load(Python),
t.Load(PkgConfig),
t.Load(XCBProto),
t.Load(Xproto),
t.Load(LibXau),
}, nil, nil, `
cd "$(mktemp -d)"
/usr/src/xcb/configure \
--prefix=/system \
--build="${ROSA_TRIPLE}" \
--enable-static
make "-j$(nproc)" check
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("xcb"), true, pkg.NewHTTPGetTar(
nil, "https://xcb.freedesktop.org/dist/libxcb-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
)
}
func init() { artifactsF[XCB] = Toolchain.newXCB }

internal/rosa/xz.go Normal file

@@ -0,0 +1,19 @@
package rosa
import "hakurei.app/internal/pkg"
func (t Toolchain) newXZ() pkg.Artifact {
const (
version = "5.8.2"
checksum = "rXT-XCp9R2q6cXqJ5qenp0cmGPfiENQiU3BWtUVeVgArfRmSsISeUJgvCR3zI0a0"
)
return t.NewViaMake("xz", version, pkg.NewHTTPGetTar(
nil, "https://github.com/tukaani-project/xz/releases/download/"+
"v"+version+"/xz-"+version+".tar.bz2",
mustDecode(checksum),
pkg.TarBzip2,
), nil,
t.Load(Diffutils),
)
}
func init() { artifactsF[XZ] = Toolchain.newXZ }


@@ -7,19 +7,16 @@ func (t Toolchain) newZlib() pkg.Artifact {
version = "1.3.1"
checksum = "E-eIpNzE8oJ5DsqH4UuA_0GDKuQF5csqI8ooDx2w7Vx-woJ2mb-YtSbEyIMN44mH"
)
return t.New("zlib-"+version, false, []pkg.Artifact{
t.Load(Make),
}, nil, nil, `
cd "$(mktemp -d)"
CC="clang -fPIC" /usr/src/zlib/configure \
--prefix /system
make "-j$(nproc)" test
make DESTDIR=/work install
`, pkg.Path(AbsUsrSrc.Append("zlib"), true,
pkg.NewHTTPGetTar(
nil, "https://zlib.net/zlib-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
)))
return t.NewViaMake("zlib", version, pkg.NewHTTPGetTar(
nil, "https://zlib.net/zlib-"+version+".tar.gz",
mustDecode(checksum),
pkg.TarGzip,
), &MakeAttr{
OmitDefaults: true,
Env: []string{
"CC=clang -fPIC",
},
Build: `""`,
})
}
func init() { artifactsF[Zlib] = Toolchain.newZlib }


@@ -89,7 +89,7 @@ buildGoModule rec {
CC = "clang -O3 -Werror";
# nix build environment does not allow acls
GO_TEST_SKIP_ACL = 1;
HAKUREI_TEST_SKIP_ACL = 1;
};
buildInputs = [