forked from rosa/hakurei

54 Commits

Author SHA1 Message Date
520d95bc07 internal/rosa/libxslt: fetch source tarball
This does not have submodules, so the overhead of git is unnecessary.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-28 18:31:44 +09:00
451df3f4e7 internal/rosa/libxml2: fetch source tarball
This does not have submodules, so the overhead of git is unnecessary.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-28 18:31:28 +09:00
011fac15ed internal/rosa/git: 2.53.0 to 2.54.0
This release broke httpd detection and job control on mksh.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-28 18:23:20 +09:00
347682ad0b internal/rosa/kernel: 6.12.83 to 6.12.84
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-28 17:44:20 +09:00
1a2b979add internal/rosa/rsync: 3.4.1 to 3.4.2
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-28 16:37:47 +09:00
b1c90cc380 internal/rosa/libexpat: 2.7.5 to 2.8.0
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-28 16:37:16 +09:00
3a66b8143a internal/rosa/nss: 3.123 to 3.123.1
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-28 16:15:14 +09:00
64bbd3aabd internal/rosa/mesa: libdrm 2.4.131 to 2.4.133
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-28 16:03:49 +09:00
08799a13d0 internal/rosa/glslang: spirv-tools check stable versions
This hides release candidates.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-28 16:03:29 +09:00
1aef9c3bbb internal/rosa/python: pathspec 1.0.4 to 1.1.1
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-28 16:02:19 +09:00
1f38303747 internal/rosa/python: packaging 26.1 to 26.2
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-28 16:01:56 +09:00
640777b00c internal/rosa/gnu: parallel 20260322 to 20260422
This pulls in bash with a nonstandard hardcoded path.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-28 15:58:59 +09:00
1d657193cf internal/rosa/kernel: disable md
This is entirely unused and is a somewhat large attack surface, so disable it.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-28 15:48:20 +09:00
bab5406295 internal/rosa/go: require popcnt for x86
This backports https://go.dev/cl/746640.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-28 14:36:59 +09:00
725ae7d64d nix: remove all explicit timeouts
These were useful during development because timing out is often the only indication of failure, due to the terrible design of the NixOS VM test harness. They have since become a nuisance, however, especially when the system is under load, so remove the explicit values and fall back to the ludicrously high default.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-23 13:07:22 +09:00
37a0c3967e internal/rosa/gnu: mpc fetch source tarball
This does not have submodules, so the overhead of git is unnecessary.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-23 12:57:11 +09:00
ea0692548f internal/rosa/gnu: coreutils 9.10 to 9.11
Test regression was fixed, dropping patch.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-23 12:30:46 +09:00
48ea23e648 internal/rosa/gnu: sed 4.9 to 4.10
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-23 12:30:06 +09:00
40320e4920 internal/rosa/meson: 1.11.0 to 1.11.1
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-23 12:29:17 +09:00
3ca0f61632 internal/rosa/llvm: 22.1.3 to 22.1.4
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-23 12:28:55 +09:00
6ffaac96e3 internal/rosa/cmake: 4.3.1 to 4.3.2
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-23 12:28:34 +09:00
13c7713d0c internal/rosa/kernel: 6.12.82 to 6.12.83
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-23 12:28:14 +09:00
42389f7ec5 internal/rosa/qemu: 10.2.2 to 11.0.0
This pulls in some python packages.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-23 01:15:13 +09:00
30f130c691 internal/rosa/python: wheel artifact
No idea why this ended up as a package.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-23 01:07:14 +09:00
ceb4d26087 internal/pkg: record cache variant on-disk
This makes custom artifacts much less error-prone to use.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-23 00:53:21 +09:00
852f3a9b3d internal/rosa/kernel: 6.12.81 to 6.12.82
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-20 22:11:13 +09:00
5e02dbdb0d internal/rosa/python: remove pypi helpers
PyPI is disallowed by policy, so these helpers are no longer useful.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-20 02:37:10 +09:00
6a3248d472 internal/rosa/python: install pyyaml from source
Required by mesa.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-20 02:35:30 +09:00
67404c98d9 internal/rosa/nss: install buildcatrust from source
Dependencies are now available, so this no longer has to rely on the release.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-20 02:09:24 +09:00
b9bf69cfce internal/rosa/python: install mako from source
Required by mesa.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-20 01:55:23 +09:00
4648f98272 internal/rosa/python: run tests via helper
Despite the lack of standards, pytest seems somewhat widely agreed upon.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-20 01:50:57 +09:00
11d99439ac internal/rosa/python: install markupsafe from source
Required by mesa.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-20 01:26:11 +09:00
39e4c5b8ac internal/rosa/python: optionally install before check
Some test suites require the package to be installed globally.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-20 01:25:43 +09:00
e8f6db38b6 internal/rosa/python: install pytest from source
Used by many python packages.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 23:17:38 +09:00
20d5b71575 internal/rosa/python: install iniconfig from source
This also required the setuptools-scm hack.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 22:53:32 +09:00
e903e7f542 internal/rosa/python: install pygments from source
This finally has its dependencies.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 22:40:43 +09:00
1caa051f4d internal/rosa/python: hatchling artifact
Required by many python packages.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 22:35:18 +09:00
dcdc6f7f6d internal/rosa/python: trove-classifiers artifact
Required by hatchling.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 22:32:12 +09:00
5ad6f26b46 internal/rosa/python: install packaging from source
This is required by many packages.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 22:12:49 +09:00
7ba75a79f4 internal/rosa/python: install pluggy from source
This finally has all its dependencies.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 21:55:55 +09:00
9ef84d3904 internal/rosa/python: setuptools-scm artifact
Awful hack required by many packages.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 21:38:44 +09:00
3b7b6e51fb internal/rosa/python: pass build dependencies separately
This is cleaner with less duplicate code.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 20:26:41 +09:00
b1b4debb82 internal/rosa/python: pathspec artifact
Required by hatchling, which is required by many python packages.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 20:13:26 +09:00
021cbbc2a8 cmd/mbf: default daemon socket in cache
This location makes more sense than the current directory.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 19:50:54 +09:00
a4a54a4a4d cmd/mbf: remove pointless recover
This used to scrub the cache, and was not fully removed when that became nonviable.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 19:49:01 +09:00
04a344aac6 internal/rosa/python: flit_core artifact
A build system required by a dependency of another build system, which is required by yet another build system.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 19:25:04 +09:00
6b98156a3d internal/rosa/python: change insane strict_timestamps default
There is no scenario where this is useful, and it breaks builds.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 18:56:22 +09:00
753432cf09 cmd/mbf: optionally wait for cancel
Synchronisation is not needed here during interactive use.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 18:24:11 +09:00
f8902e3679 internal/rosa/python: append to source path
This gets around messy projects with multiple packages.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 17:51:00 +09:00
8ee53a5164 internal/rosa: use builtin for checksum warning
This avoids having to configure the logger early.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 17:50:12 +09:00
3981d44757 internal/rosa/python: migrate setuptools to wrapper
Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 15:36:43 +09:00
9fd67e47b4 internal/rosa/python: wrap python package
Metadata for this is somewhat boilerplate-heavy, so wrap it to create metadata in one call.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 15:22:18 +09:00
4dcec40156 cmd/mbf: close on cancel completion
Like the previous change, this enables synchronisation on the client side via epoll.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 15:03:52 +09:00
9a274c78a3 cmd/mbf: close on abort completion
This enables synchronisation on the client side via epoll.

Signed-off-by: Ophestra <cat@gensokyo.uk>
2026-04-19 14:53:28 +09:00
30 changed files with 962 additions and 388 deletions

View File

@@ -31,9 +31,6 @@ func (cache *cache) open() (err error) {
return os.ErrInvalid
}
if cache.base == "" {
cache.base = "cache"
}
var base *check.Absolute
if cache.base, err = filepath.Abs(cache.base); err != nil {
return

View File

@@ -99,10 +99,9 @@ func cancelIdent(
var ident pkg.ID
if _, err := io.ReadFull(conn, ident[:]); err != nil {
return nil, false, errors.Join(err, conn.Close())
} else if err = conn.Close(); err != nil {
return nil, false, err
}
return &ident, cache.Cancel(unique.Make(ident)), nil
ok := cache.Cancel(unique.Make(ident))
return &ident, ok, conn.Close()
}
// serve serves connections from a [net.UnixListener].
@@ -194,11 +193,11 @@ func serve(
}
case specialAbort:
log.Println("aborting all pending cures")
cm.c.Abort()
if _err := conn.Close(); _err != nil {
log.Println(_err)
}
log.Println("aborting all pending cures")
cm.c.Abort()
}
return
@@ -306,6 +305,7 @@ func cancelRemote(
ctx context.Context,
addr *net.UnixAddr,
a pkg.Artifact,
wait bool,
) error {
done, conn, err := dial(ctx, addr)
if err != nil {
@@ -324,13 +324,19 @@ func cancelRemote(
} else if n != len(id) {
return errors.Join(io.ErrShortWrite, conn.Close())
}
return conn.Close()
if wait {
if _, err = conn.Read(make([]byte, 1)); err == io.EOF {
err = nil
}
}
return errors.Join(err, conn.Close())
}
// abortRemote aborts all [pkg.Artifact] curing on a daemon.
func abortRemote(
ctx context.Context,
addr *net.UnixAddr,
wait bool,
) error {
done, conn, err := dial(ctx, addr)
if err != nil {
@@ -339,5 +345,10 @@ func abortRemote(
defer close(done)
err = writeSpecialHeader(conn, specialAbort)
if wait && err == nil {
if _, err = conn.Read(make([]byte, 1)); err == io.EOF {
err = nil
}
}
return errors.Join(err, conn.Close())
}

View File

@@ -106,11 +106,11 @@ func TestDaemon(t *testing.T) {
}
}()
if err = cancelRemote(ctx, &addr, pkg.NewFile("nonexistent", nil)); err != nil {
if err = cancelRemote(ctx, &addr, pkg.NewFile("nonexistent", nil), true); err != nil {
t.Fatalf("cancelRemote: error = %v", err)
}
if err = abortRemote(ctx, &addr); err != nil {
if err = abortRemote(ctx, &addr, true); err != nil {
t.Fatalf("abortRemote: error = %v", err)
}

View File

@@ -59,14 +59,7 @@ func main() {
defer stop()
var cm cache
defer func() {
cm.Close()
if r := recover(); r != nil {
fmt.Println(r)
log.Fatal("consider scrubbing the on-disk cache")
}
}()
defer func() { cm.Close() }()
var (
flagQuiet bool
@@ -77,11 +70,14 @@ func main() {
msg.SwapVerbose(!flagQuiet)
cm.ctx, cm.msg = ctx, msg
cm.base = os.ExpandEnv(cm.base)
if cm.base == "" {
cm.base = "cache"
}
addr.Net = "unix"
addr.Name = os.ExpandEnv(addr.Name)
if addr.Name == "" {
addr.Name = "daemon"
addr.Name = filepath.Join(cm.base, "daemon")
}
return nil
@@ -489,7 +485,7 @@ func main() {
cc, cancel := context.WithDeadline(context.Background(), daemonDeadline())
defer cancel()
if _err := cancelRemote(cc, &addr, a); _err != nil {
if _err := cancelRemote(cc, &addr, a, false); _err != nil {
log.Println(err)
}
}
@@ -523,7 +519,7 @@ func main() {
c.NewCommand(
"abort",
"Abort all pending cures on the daemon",
func([]string) error { return abortRemote(ctx, &addr) },
func([]string) error { return abortRemote(ctx, &addr, false) },
)
{

View File

@@ -432,6 +432,12 @@ func (e InvalidKindError) Error() string {
// register is not safe for concurrent use. register must not be called after
// the first instance of [Cache] has been opened.
func register(k Kind, f IRReadFunc) {
openMu.Lock()
defer openMu.Unlock()
if opened {
panic("attempting to register after open")
}
if _, ok := irArtifact[k]; ok {
panic("attempting to register " + strconv.Itoa(int(k)) + " twice")
}

View File

@@ -18,6 +18,7 @@ import (
"path/filepath"
"runtime"
"slices"
"strconv"
"strings"
"sync"
"sync/atomic"
@@ -70,6 +71,64 @@ func MustDecode(s string) (checksum Checksum) {
return
}
var (
// extension is a string uniquely identifying a set of custom [Artifact]
// implementations registered by calling [Register].
extension string
// openMu synchronises access to global state for initialisation.
openMu sync.Mutex
// opened is false if [Open] was never called.
opened bool
)
// Extension returns a string uniquely identifying the currently registered set
// of custom [Artifact], or the zero value if none was registered.
func Extension() string { return extension }
// ValidExtension returns whether s is valid for use in a call to SetExtension.
func ValidExtension(s string) bool {
if l := len(s); l == 0 || l > 128 {
return false
}
for _, v := range s {
if v < 'a' || v > 'z' {
return false
}
}
return true
}
// ErrInvalidExtension is returned for a variant identification string for which
// [ValidExtension] returns false.
var ErrInvalidExtension = errors.New("invalid extension variant identification string")
// SetExtension sets the extension variant identification string. SetExtension
// must be called before [Open] if custom [Artifact] implementations had been
// registered by calling [Register].
//
// The variant identification string must be between 1 and 128 bytes long and
// consist only of bytes between 'a' and 'z'.
//
// SetExtension is not safe for concurrent use. SetExtension must be called at
// most once and must not be called after the first instance of [Cache] has been opened.
func SetExtension(s string) {
openMu.Lock()
defer openMu.Unlock()
if opened {
panic("attempting to set extension after open")
}
if extension != "" {
panic("attempting to set extension twice")
}
if !ValidExtension(s) {
panic(ErrInvalidExtension)
}
extension = s
statusHeader = makeStatusHeader(s)
}
// common holds elements and receives methods shared between different contexts.
type common struct {
// Context specific to this [Artifact]. The toplevel context in [Cache] must
@@ -102,19 +161,27 @@ type TContext struct {
common
}
// statusHeader is the header written to all status files in dirStatus.
var statusHeader = func() string {
// makeStatusHeader creates the header written to every status file. This should
// not be called directly, its result is stored in statusHeader and will not
// change after the first [Cache] is opened.
func makeStatusHeader(extension string) string {
s := programName
if v := info.Version(); v != info.FallbackVersion {
s += " " + v
}
if extension != "" {
s += " with " + extension + " extensions"
}
s += " (" + runtime.GOARCH + ")"
if name, err := os.Hostname(); err == nil {
s += " on " + name
}
s += "\n\n"
return s
}()
}
// statusHeader is the header written to all status files in dirStatus.
var statusHeader = makeStatusHeader("")
// prepareStatus initialises the status file once.
func (t *TContext) prepareStatus() error {
@@ -427,6 +494,9 @@ const (
// KindFile is the kind of [Artifact] returned by [NewFile].
KindFile
// _kindEnd is the total number of kinds and does not denote a kind.
_kindEnd
// KindCustomOffset is the first [Kind] value reserved for implementations
// not from this package.
KindCustomOffset = 1 << 31
@@ -441,6 +511,9 @@ const (
// fileLock is the file name appended to Cache.base for guaranteeing
// exclusive access to the cache directory.
fileLock = "lock"
// fileVariant is the file name appended to Cache.base holding the variant
// identification string set by a prior call to [SetExtension].
fileVariant = "variant"
// dirIdentifier is the directory name appended to Cache.base for storing
// artifacts named after their [ID].
@@ -540,6 +613,10 @@ const (
// impurity due to [KindExecNet] being [KnownChecksum]. This flag exists
// to support kernels without Landlock LSM enabled.
CHostAbstract
// CPromoteVariant allows [pkg.Open] to promote an unextended on-disk cache
// to the current extension variant. This is a one-way operation.
CPromoteVariant
)
// toplevel holds [context.WithCancel] over caller-supplied context, where all
@@ -1930,6 +2007,20 @@ func (c *Cache) Close() {
c.unlock()
}
// UnsupportedVariantError describes an on-disk cache with an extension variant
// identification string that differs from the value returned by [Extension].
type UnsupportedVariantError string
func (e UnsupportedVariantError) Error() string {
return "unsupported variant " + strconv.Quote(string(e))
}
var (
// ErrWouldPromote is returned by [Open] if the [CPromoteVariant] bit is not
// set and the on-disk cache requires variant promotion.
ErrWouldPromote = errors.New("operation would promote unextended cache")
)
// Open returns the address of a newly opened instance of [Cache].
//
// Concurrent cures of a [FloodArtifact] dependency graph are limited to the
@@ -1961,6 +2052,14 @@ func open(
base *check.Absolute,
lock bool,
) (*Cache, error) {
openMu.Lock()
defer openMu.Unlock()
opened = true
if extension == "" && len(irArtifact) != int(_kindEnd) {
panic("attempting to open cache with incomplete variant setup")
}
if cures < 1 {
cures = runtime.NumCPU()
}
@@ -1974,8 +2073,10 @@ func open(
dirStatus,
dirWork,
} {
if err := os.MkdirAll(base.Append(name).String(), 0700); err != nil &&
!errors.Is(err, os.ErrExist) {
if err := os.MkdirAll(
base.Append(name).String(),
0700,
); err != nil && !errors.Is(err, os.ErrExist) {
return nil, err
}
}
@@ -2013,6 +2114,45 @@ func open(
c.unlock = func() {}
}
variantPath := base.Append(fileVariant).String()
if p, err := os.ReadFile(variantPath); err != nil {
if !errors.Is(err, os.ErrNotExist) {
c.unlock()
return nil, err
}
// nonexistence implies newly created cache, or a cache predating
// variant identification strings, in which case it is silently promoted
if err = os.WriteFile(
variantPath,
[]byte(extension),
0400,
); err != nil {
c.unlock()
return nil, err
}
} else if s := string(p); s == "" {
if extension != "" {
if flags&CPromoteVariant == 0 {
c.unlock()
return nil, ErrWouldPromote
}
if err = os.WriteFile(
variantPath,
[]byte(extension),
0400,
); err != nil {
c.unlock()
return nil, err
}
}
} else if !ValidExtension(s) {
c.unlock()
return nil, ErrInvalidExtension
} else if s != extension {
c.unlock()
return nil, UnsupportedVariantError(s)
}
return &c, nil
}

View File

@@ -41,6 +41,25 @@ func unsafeOpen(
lock bool,
) (*pkg.Cache, error)
var (
// extension is a string uniquely identifying a set of custom [Artifact]
// implementations registered by calling [Register].
//
//go:linkname extension hakurei.app/internal/pkg.extension
extension string
// opened is false if [Open] was never called.
//
//go:linkname opened hakurei.app/internal/pkg.opened
opened bool
// irArtifact refers to artifact IR interpretation functions and must not be
// written to directly.
//
//go:linkname irArtifact hakurei.app/internal/pkg.irArtifact
irArtifact map[pkg.Kind]pkg.IRReadFunc
)
// newRContext returns the address of a new [pkg.RContext] unsafely created for
// the specified [testing.TB].
func newRContext(tb testing.TB, c *pkg.Cache) *pkg.RContext {
@@ -342,9 +361,20 @@ func checkWithCache(t *testing.T, testCases []cacheTestCase) {
restoreTemp = true
}
// destroy lock file to avoid changing cache checksums
if err := os.Remove(base.Append("lock").String()); err != nil {
t.Fatal(err)
// destroy lock and variant file to avoid changing cache checksums
for _, s := range []string{
"lock",
"variant",
} {
pathname := base.Append(s)
if p, err := os.ReadFile(pathname.String()); err != nil {
t.Fatal(err)
} else if len(p) != 0 {
t.Fatalf("file %q: %q", s, string(p))
}
if err := os.Remove(pathname.String()); err != nil {
t.Fatal(err)
}
}
// destroy non-deterministic status files
@@ -1101,6 +1131,10 @@ func TestErrors(t *testing.T) {
Want: pkg.IRKindIdent,
Ancillary: 0xcafe,
}, "got invalid kind 48879 IR value (0xcafe) instead of ident"},
{"UnsupportedVariantError", pkg.UnsupportedVariantError(
"rosa",
), `unsupported variant "rosa"`},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
@@ -1309,6 +1343,8 @@ func (a earlyFailureF) Cure(*pkg.FContext) error {
}
func TestDependencyCureErrorEarly(t *testing.T) {
t.Parallel()
checkWithCache(t, []cacheTestCase{
{"early", 0, nil, func(t *testing.T, _ *check.Absolute, c *pkg.Cache) {
_, _, err := c.Cure(earlyFailureF(8))
@@ -1319,7 +1355,7 @@ func TestDependencyCureErrorEarly(t *testing.T) {
})
}
func TestNew(t *testing.T) {
func TestOpen(t *testing.T) {
t.Parallel()
t.Run("nonexistent", func(t *testing.T) {
@@ -1367,3 +1403,219 @@ func TestNew(t *testing.T) {
}
})
}
func TestExtensionRegister(t *testing.T) {
extensionOld := extension
openedOld := opened
t.Cleanup(func() { extension = extensionOld; opened = openedOld })
extension = ""
opened = false
t.Run("set", func(t *testing.T) {
t.Cleanup(func() { extension = "" })
const want = "rosa"
pkg.SetExtension(want)
if got := pkg.Extension(); got != want {
t.Fatalf("Extension: %q, want %q", got, want)
}
})
t.Run("twice", func(t *testing.T) {
t.Cleanup(func() { extension = "" })
defer func() {
const wantPanic = "attempting to set extension twice"
if r := recover(); r != wantPanic {
t.Errorf("panic: %#v, want %q", r, wantPanic)
}
}()
pkg.SetExtension("rosa")
pkg.SetExtension("rosa")
})
t.Run("invalid", func(t *testing.T) {
defer func() {
var wantPanic = pkg.ErrInvalidExtension
if r := recover(); r != wantPanic {
t.Errorf("panic: %#v, want %#v", r, wantPanic)
}
}()
pkg.SetExtension(" ")
})
t.Run("opened", func(t *testing.T) {
t.Cleanup(func() { opened = false })
if _, err := pkg.Open(
t.Context(),
message.New(log.Default()),
0, 0, 0,
check.MustAbs(container.Nonexistent),
); !errors.Is(err, os.ErrNotExist) {
t.Fatalf("Open: error = %v", err)
}
t.Run("variant", func(t *testing.T) {
defer func() {
const wantPanic = "attempting to set extension after open"
if r := recover(); r != wantPanic {
t.Errorf("panic: %#v, want %q", r, wantPanic)
}
}()
pkg.SetExtension("rosa")
})
t.Run("register", func(t *testing.T) {
defer func() {
const wantPanic = "attempting to register after open"
if r := recover(); r != wantPanic {
t.Errorf("panic: %#v, want %q", r, wantPanic)
}
}()
pkg.Register(pkg.KindCustomOffset, nil)
})
})
t.Run("incomplete", func(t *testing.T) {
t.Cleanup(func() { delete(irArtifact, pkg.KindCustomOffset) })
defer func() {
const wantPanic = "attempting to open cache with incomplete variant setup"
if r := recover(); r != wantPanic {
t.Errorf("panic: %#v, want %q", r, wantPanic)
}
}()
pkg.Register(pkg.KindCustomOffset, nil)
t.Cleanup(func() { opened = false })
_, _ = pkg.Open(nil, nil, 0, 0, 0, nil)
panic("unreachable")
})
t.Run("create", func(t *testing.T) {
t.Cleanup(func() { extension = "" })
const want = "rosa"
pkg.SetExtension(want)
base := check.MustAbs(t.TempDir())
t.Cleanup(func() { opened = false })
if c, err := pkg.Open(
t.Context(), nil,
0, 0, 0,
base,
); err != nil {
t.Fatal(err)
} else {
c.Close()
}
if got, err := os.ReadFile(base.Append("variant").String()); err != nil {
t.Fatal(err)
} else if string(got) != want {
t.Fatalf("variant: %q", string(got))
}
})
t.Run("access", func(t *testing.T) {
base := check.MustAbs(t.TempDir())
t.Cleanup(func() { opened = false })
if err := os.WriteFile(base.Append("variant").String(), nil, 0); err != nil {
t.Fatal(err)
}
wantErr := &os.PathError{
Op: "open",
Path: base.Append("variant").String(),
Err: syscall.EACCES,
}
if _, err := pkg.Open(
t.Context(), nil,
0, 0, 0,
base,
); !reflect.DeepEqual(err, wantErr) {
t.Fatalf("Open: error = %v, want %v", err, wantErr)
}
})
t.Run("promote", func(t *testing.T) {
t.Cleanup(func() { extension = "" })
const want = "rosa"
pkg.SetExtension(want)
base := check.MustAbs(t.TempDir())
t.Cleanup(func() { opened = false })
variantPath := base.Append("variant")
if err := os.WriteFile(variantPath.String(), nil, 0600); err != nil {
t.Fatal(err)
}
if _, err := pkg.Open(
t.Context(), nil,
0, 0, 0,
base,
); !reflect.DeepEqual(err, pkg.ErrWouldPromote) {
t.Fatalf("Open: error = %v", err)
}
if p, err := os.ReadFile(variantPath.String()); err != nil {
t.Fatal(err)
} else if len(p) != 0 {
t.Fatalf("variant: %q", string(p))
}
if c, err := pkg.Open(
t.Context(), nil,
pkg.CPromoteVariant, 0, 0,
base,
); err != nil {
t.Fatalf("Open: error = %v", err)
} else {
c.Close()
}
if p, err := os.ReadFile(variantPath.String()); err != nil {
t.Fatal(err)
} else if string(p) != want {
t.Fatalf("variant: %q, want %q", string(p), want)
}
})
t.Run("open invalid", func(t *testing.T) {
base := check.MustAbs(t.TempDir())
t.Cleanup(func() { opened = false })
variantPath := base.Append("variant")
if err := os.WriteFile(variantPath.String(), make([]byte, 129), 0400); err != nil {
t.Fatal(err)
}
if _, err := pkg.Open(
t.Context(), nil,
0, 0, 0,
base,
); !reflect.DeepEqual(err, pkg.ErrInvalidExtension) {
t.Fatalf("Open: error = %v", err)
}
})
t.Run("unsupported", func(t *testing.T) {
base := check.MustAbs(t.TempDir())
t.Cleanup(func() { opened = false })
variantPath := base.Append("variant")
if err := os.WriteFile(variantPath.String(), []byte("rosa"), 0400); err != nil {
t.Fatal(err)
}
if _, err := pkg.Open(
t.Context(), nil,
0, 0, 0,
base,
); !reflect.DeepEqual(err, pkg.UnsupportedVariantError("rosa")) {
t.Fatalf("Open: error = %v", err)
}
})
}

View File

@@ -129,20 +129,27 @@ const (
PkgConfig
Procps
Python
PythonFlitCore
PythonHatchling
PythonIniConfig
PythonMako
PythonMarkupSafe
PythonPackaging
PythonPathspec
PythonPluggy
PythonPyTest
PythonPyYAML
PythonPygments
PythonSetuptools
PythonSetuptoolsSCM
PythonTroveClassifiers
PythonVCSVersioning
PythonWheel
QEMU
Rdfind
Readline
Rsync
Sed
Setuptools
SPIRVHeaders
SPIRVTools
SquashfsTools

View File

@@ -10,8 +10,8 @@ import (
func (t Toolchain) newCMake() (pkg.Artifact, string) {
const (
version = "4.3.1"
checksum = "RHpzZiM1kJ5bwLjo9CpXSeHJJg3hTtV9QxBYpQoYwKFtRh5YhGWpShrqZCSOzQN6"
version = "4.3.2"
checksum = "6QylwRVKletndTSkZTV2YBRwgd_9rUVgav_QW23HpjUgV21AVYZOUOal8tdBDmO7"
)
return t.NewPackage("cmake", version, newFromGitHubRelease(
"Kitware/CMake",

View File

@@ -9,8 +9,8 @@ import (
func (t Toolchain) newGit() (pkg.Artifact, string) {
const (
version = "2.53.0"
checksum = "rlqSTeNgSeVKJA7nvzGqddFH8q3eFEPB4qRZft-4zth8wTHnbTbm7J90kp_obHGm"
version = "2.54.0"
checksum = "7vGKtFOJGqY8DO4e8UMRax7dLgImXKQz5MMalec6MlgYrsarffSJjgOughwRFpSH"
)
return t.NewPackage("git", version, newTar(
"https://www.kernel.org/pub/software/scm/git/"+
@@ -20,6 +20,9 @@ func (t Toolchain) newGit() (pkg.Artifact, string) {
), &PackageAttr{
ScriptEarly: `
ln -s ../../system/bin/perl /usr/bin/ || true
# test suite assumes apache
rm -f /system/bin/httpd
`,
// uses source tree as scratch space
@@ -38,6 +41,7 @@ function disable_test {
fi
}
disable_test t1800-hook
disable_test t5319-multi-pack-index
disable_test t1305-config-include
disable_test t3900-i18n-commit
@@ -63,6 +67,9 @@ disable_test t2200-add-update
NO_INSTALL_HARDLINKS=1 \
install`,
},
// test suite hangs on mksh
Bash,
Diffutils,
Autoconf,
Gettext,

View File

@@ -86,6 +86,8 @@ func init() {
},
ID: 14894,
latest: (*Versions).getStable,
}
}

View File

@@ -67,8 +67,8 @@ func init() {
func (t Toolchain) newSed() (pkg.Artifact, string) {
const (
version = "4.9"
checksum = "pe7HWH4PHNYrazOTlUoE1fXmhn2GOPFN_xE62i0llOr3kYGrH1g2_orDz0UtZ9Nt"
version = "4.10"
checksum = "TXTRFQJCyflb-bpBRI2S5Y1DpplwvT7-KfXtpqN4AdZgZ5OtI6yStn1-bkhDKx51"
)
return t.NewPackage("sed", version, newTar(
"https://ftpmirror.gnu.org/gnu/sed/sed-"+version+".tar.gz",
@@ -76,6 +76,8 @@ func (t Toolchain) newSed() (pkg.Artifact, string) {
pkg.TarGzip,
), nil, (*MakeHelper)(nil),
Diffutils,
KernelHeaders,
), version
}
func init() {
@@ -374,8 +376,8 @@ func init() {
func (t Toolchain) newCoreutils() (pkg.Artifact, string) {
const (
version = "9.10"
checksum = "o-B9wssRnZySzJUI1ZJAgw-bZtj1RC67R9po2AcM2OjjS8FQIl16IRHpC6IwO30i"
version = "9.11"
checksum = "t8UMed5wpFEoC56aa42_yidfOAaRGzOfj7MRtQkkqgGbpXiskNA8bd-EmVSQkZie"
)
return t.NewPackage("coreutils", version, newTar(
"https://ftpmirror.gnu.org/gnu/coreutils/coreutils-"+version+".tar.gz",
@@ -387,106 +389,13 @@ func (t Toolchain) newCoreutils() (pkg.Artifact, string) {
test_disable() { chmod +w "$2" && echo "$1" > "$2"; }
test_disable '#!/bin/sh' gnulib-tests/test-c32ispunct.sh
test_disable '#!/bin/sh' tests/split/line-bytes.sh
test_disable '#!/bin/sh' tests/ls/hyperlink.sh
test_disable '#!/bin/sh' tests/misc/user.sh
test_disable 'int main(){return 0;}' gnulib-tests/test-chown.c
test_disable 'int main(){return 0;}' gnulib-tests/test-fchownat.c
test_disable 'int main(){return 0;}' gnulib-tests/test-lchown.c
`,
Patches: []KV{
{"tests-fix-job-control", `From 21d287324aa43aa3a31f39619ade0deac7fd6013 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?P=C3=A1draig=20Brady?= <P@draigBrady.com>
Date: Tue, 24 Feb 2026 15:44:41 +0000
Subject: [PATCH] tests: fix job control triggering test termination
This avoids the test harness being terminated like:
make[1]: *** [Makefile:24419: check-recursive] Hangup
make[3]: *** [Makefile:24668: check-TESTS] Hangup
make: *** [Makefile:24922: check] Hangup
make[2]: *** [Makefile:24920: check-am] Hangup
make[4]: *** [Makefile:24685: tests/misc/usage_vs_refs.log] Error 129
...
This happened sometimes when the tests were being run non interactively.
For example when run like:
setsid make TESTS="tests/timeout/timeout.sh \
tests/tail/overlay-headers.sh" SUBDIRS=. -j2 check
Note the race window can be made bigger by adding a sleep
after tail is stopped in overlay-headers.sh
The race can trigger the kernel to induce its job control
mechanism to prevent stuck processes.
I.e. where it sends SIGHUP + SIGCONT to a process group
when it determines that group may become orphaned,
and there are stopped processes in that group.
* tests/tail/overlay-headers.sh: Use setsid(1) to keep the stopped
tail process in a separate process group, thus avoiding any kernel
job control protection mechanism.
* tests/timeout/timeout.sh: Use setsid(1) to avoid the kernel
checking the main process group when sleep(1) is reparented.
Fixes https://bugs.gnu.org/80477
---
tests/tail/overlay-headers.sh | 8 +++++++-
tests/timeout/timeout.sh | 11 ++++++++---
2 files changed, 15 insertions(+), 4 deletions(-)
diff --git a/tests/tail/overlay-headers.sh b/tests/tail/overlay-headers.sh
index be9b6a7df..1e6da0a3f 100755
--- a/tests/tail/overlay-headers.sh
+++ b/tests/tail/overlay-headers.sh
@@ -20,6 +20,8 @@
. "${srcdir=.}/tests/init.sh"; path_prepend_ ./src
print_ver_ tail sleep
+setsid true || skip_ 'setsid required to control groups'
+
# Function to count number of lines from tail
# while ignoring transient errors due to resource limits
countlines_ ()
@@ -54,7 +56,11 @@ echo start > file2 || framework_failure_
env sleep 60 & sleep=$!
# Note don't use timeout(1) here as it currently
-# does not propagate SIGCONT
+# does not propagate SIGCONT.
+# Note use setsid here to ensure we're in a separate process group
+# as we're going to STOP this tail process, and this can trigger
+# the kernel to send SIGHUP to a group if other tests have
+# processes that are reparented. (See tests/timeout/timeout.sh).
tail $fastpoll --pid=$sleep -f file1 file2 > out & pid=$!
# Ensure tail is running
diff --git a/tests/timeout/timeout.sh b/tests/timeout/timeout.sh
index 9a395416b..fbb043312 100755
--- a/tests/timeout/timeout.sh
+++ b/tests/timeout/timeout.sh
@@ -56,9 +56,14 @@ returns_ 124 timeout --foreground -s0 -k1 .1 sleep 10 && fail=1
) || fail=1
# Don't be confused when starting off with a child (Bug#9098).
-out=$(sleep .1 & exec timeout .5 sh -c 'sleep 2; echo foo')
-status=$?
-test "$out" = "" && test $status = 124 || fail=1
+# Use setsid to avoid sleep being in the test's process group, as
+# upon reparenting it can trigger an orphaned process group SIGHUP
+# (if there were stopped processes in other tests).
+if setsid true; then
+ out=$(setsid sleep .1 & exec timeout .5 sh -c 'sleep 2; echo foo')
+ status=$?
+ test "$out" = "" && test $status = 124 || fail=1
+fi
# Verify --verbose output
cat > exp <<\EOF
--
2.53.0
`},
},
Flag: TEarly,
}, &MakeHelper{
Configure: []KV{
@@ -757,15 +666,20 @@ func init() {
func (t Toolchain) newParallel() (pkg.Artifact, string) {
const (
version = "20260322"
checksum = "gHoPmFkOO62ev4xW59HqyMlodhjp8LvTsBOwsVKHUUdfrt7KwB8koXmSVqQ4VOrB"
version = "20260422"
checksum = "eTsepxgqhXpMEhPd55qh-W5y4vjKn0x9TD2mzbJCNZYtFf4lT4Wzoqr74HGJYBEH"
)
return t.NewPackage("parallel", version, newTar(
"https://ftpmirror.gnu.org/gnu/parallel/parallel-"+version+".tar.bz2",
checksum,
pkg.TarBzip2,
), nil, (*MakeHelper)(nil),
), &PackageAttr{
ScriptEarly: `
ln -s ../system/bin/bash /bin/
`,
}, (*MakeHelper)(nil),
Perl,
Bash,
), version
}
func init() {
@@ -1143,10 +1057,11 @@ func init() {
func (t Toolchain) newMPC() (pkg.Artifact, string) {
const (
version = "1.4.1"
checksum = "wdXAhplnS89FjVp20m2nC2CmLFQeyQqLpQAfViTy4vPxFdv2WYOTtfBKeIk5_Rec"
checksum = "ZffaZyWkvIw0iPvRe5EJ7O-VvHtSkbbb3K_7SgPtK810NvGan7nbF0T5-6tozjQN"
)
return t.NewPackage("mpc", version, t.newTagRemote(
"https://gitlab.inria.fr/mpc/mpc.git",
return t.NewPackage("mpc", version, newFromGitLab(
"gitlab.inria.fr",
"mpc/mpc",
version, checksum,
), &PackageAttr{
// does not find mpc-impl.h otherwise

View File

@@ -151,6 +151,9 @@ rm \
sed -i \
's,/lib/ld-musl-`+linuxArch()+`.so.1,/system/bin/linker,' \
cmd/link/internal/`+runtime.GOARCH+`/obj.go
sed -i \
's/cpu.X86.HasAVX512VBMI/& \&\& cpu.X86.HasPOPCNT/' \
internal/runtime/gc/scan/scan_amd64.go
rm \
os/root_unix_test.go

View File

@@ -2,12 +2,12 @@ package rosa
import "hakurei.app/internal/pkg"
const kernelVersion = "6.12.81"
const kernelVersion = "6.12.84"
var kernelSource = newTar(
"https://git.kernel.org/pub/scm/linux/kernel/git/stable/linux.git/"+
"snapshot/linux-"+kernelVersion+".tar.gz",
"fBkNwf82DQXh74in6gaF2Jot7Vg-Vlcp9BUtCEipL9mvcM1EXLVFdV7FcrO20Eve",
"GJLUEu68r3DpLYoTcMl4wA_ThMBs_Zwc0gZsp82ii_3AOfcVxpI639IKfq2jAAY2",
pkg.TarGzip,
)

View File

@@ -1,16 +1,16 @@
#
# Automatically generated file; DO NOT EDIT.
# Linux/x86 6.12.80 Kernel Configuration
# Linux/x86 6.12.84 Kernel Configuration
#
CONFIG_CC_VERSION_TEXT="clang version 22.1.2"
CONFIG_CC_VERSION_TEXT="clang version 22.1.4"
CONFIG_GCC_VERSION=0
CONFIG_CC_IS_CLANG=y
CONFIG_CLANG_VERSION=220102
CONFIG_CLANG_VERSION=220104
CONFIG_AS_IS_LLVM=y
CONFIG_AS_VERSION=220102
CONFIG_AS_VERSION=220104
CONFIG_LD_VERSION=0
CONFIG_LD_IS_LLD=y
CONFIG_LLD_VERSION=220102
CONFIG_LLD_VERSION=220104
CONFIG_RUSTC_VERSION=0
CONFIG_RUSTC_LLVM_VERSION=0
CONFIG_CC_HAS_ASM_GOTO_OUTPUT=y
@@ -3175,14 +3175,8 @@ CONFIG_PATA_ACPI=y
CONFIG_ATA_GENERIC=y
CONFIG_PATA_LEGACY=m
CONFIG_MD=y
CONFIG_BLK_DEV_MD=m
# CONFIG_BLK_DEV_MD is not set
CONFIG_MD_BITMAP_FILE=y
CONFIG_MD_LINEAR=m
CONFIG_MD_RAID0=m
CONFIG_MD_RAID1=m
CONFIG_MD_RAID10=m
CONFIG_MD_RAID456=m
CONFIG_MD_CLUSTER=m
CONFIG_BCACHE=m
# CONFIG_BCACHE_DEBUG is not set
# CONFIG_BCACHE_ASYNC_REGISTRATION is not set
@@ -3205,7 +3199,7 @@ CONFIG_DM_ERA=m
CONFIG_DM_CLONE=m
CONFIG_DM_MIRROR=m
CONFIG_DM_LOG_USERSPACE=m
CONFIG_DM_RAID=m
# CONFIG_DM_RAID is not set
CONFIG_DM_ZERO=m
CONFIG_DM_MULTIPATH=m
CONFIG_DM_MULTIPATH_QL=m
@@ -11636,10 +11630,7 @@ CONFIG_RANDSTRUCT_NONE=y
CONFIG_XOR_BLOCKS=m
CONFIG_ASYNC_CORE=m
CONFIG_ASYNC_MEMCPY=m
CONFIG_ASYNC_XOR=m
CONFIG_ASYNC_PQ=m
CONFIG_ASYNC_RAID6_RECOV=m
CONFIG_CRYPTO=y
#
@@ -11925,8 +11916,6 @@ CONFIG_BINARY_PRINTF=y
#
# Library routines
#
CONFIG_RAID6_PQ=m
CONFIG_RAID6_PQ_BENCHMARK=y
CONFIG_LINEAR_RANGES=y
CONFIG_PACKING=y
CONFIG_BITREVERSE=y
@@ -12471,7 +12460,6 @@ CONFIG_RUNTIME_TESTING_MENU=y
# CONFIG_INTERVAL_TREE_TEST is not set
# CONFIG_PERCPU_TEST is not set
# CONFIG_ATOMIC64_SELFTEST is not set
# CONFIG_ASYNC_RAID6_TEST is not set
# CONFIG_TEST_HEXDUMP is not set
# CONFIG_TEST_KSTRTOX is not set
# CONFIG_TEST_PRINTF is not set

View File

@@ -1,16 +1,16 @@
#
# Automatically generated file; DO NOT EDIT.
# Linux/arm64 6.12.80 Kernel Configuration
# Linux/arm64 6.12.83 Kernel Configuration
#
CONFIG_CC_VERSION_TEXT="clang version 21.1.8"
CONFIG_CC_VERSION_TEXT="clang version 22.1.4"
CONFIG_GCC_VERSION=0
CONFIG_CC_IS_CLANG=y
CONFIG_CLANG_VERSION=210108
CONFIG_CLANG_VERSION=220104
CONFIG_AS_IS_LLVM=y
CONFIG_AS_VERSION=210108
CONFIG_AS_VERSION=220104
CONFIG_LD_VERSION=0
CONFIG_LD_IS_LLD=y
CONFIG_LLD_VERSION=210108
CONFIG_LLD_VERSION=220104
CONFIG_RUSTC_VERSION=0
CONFIG_RUSTC_LLVM_VERSION=0
CONFIG_CC_HAS_ASM_GOTO_OUTPUT=y
@@ -3253,14 +3253,8 @@ CONFIG_PATA_ACPI=y
CONFIG_ATA_GENERIC=y
CONFIG_PATA_LEGACY=m
CONFIG_MD=y
CONFIG_BLK_DEV_MD=m
# CONFIG_BLK_DEV_MD is not set
CONFIG_MD_BITMAP_FILE=y
CONFIG_MD_LINEAR=m
CONFIG_MD_RAID0=m
CONFIG_MD_RAID1=m
CONFIG_MD_RAID10=m
CONFIG_MD_RAID456=m
CONFIG_MD_CLUSTER=m
CONFIG_BCACHE=m
# CONFIG_BCACHE_DEBUG is not set
# CONFIG_BCACHE_ASYNC_REGISTRATION is not set
@@ -3283,7 +3277,7 @@ CONFIG_DM_ERA=m
CONFIG_DM_CLONE=m
CONFIG_DM_MIRROR=m
CONFIG_DM_LOG_USERSPACE=m
CONFIG_DM_RAID=m
# CONFIG_DM_RAID is not set
CONFIG_DM_ZERO=m
CONFIG_DM_MULTIPATH=m
CONFIG_DM_MULTIPATH_QL=m
@@ -10300,7 +10294,6 @@ CONFIG_ALTERA_MSGDMA=m
# CONFIG_AMBA_PL08X is not set
CONFIG_APPLE_ADMAC=m
CONFIG_AXI_DMAC=m
CONFIG_BCM_SBA_RAID=m
CONFIG_DMA_BCM2835=m
CONFIG_DMA_SUN6I=m
CONFIG_DW_AXI_DMAC=m
@@ -13292,12 +13285,7 @@ CONFIG_RANDSTRUCT_NONE=y
CONFIG_XOR_BLOCKS=m
CONFIG_ASYNC_CORE=m
CONFIG_ASYNC_MEMCPY=m
CONFIG_ASYNC_XOR=m
CONFIG_ASYNC_PQ=m
CONFIG_ASYNC_RAID6_RECOV=m
CONFIG_ASYNC_TX_DISABLE_PQ_VAL_DMA=y
CONFIG_ASYNC_TX_DISABLE_XOR_VAL_DMA=y
CONFIG_CRYPTO=y
#
@@ -13640,8 +13628,6 @@ CONFIG_BINARY_PRINTF=y
#
# Library routines
#
CONFIG_RAID6_PQ=m
CONFIG_RAID6_PQ_BENCHMARK=y
CONFIG_LINEAR_RANGES=y
CONFIG_PACKING=y
CONFIG_BITREVERSE=y
@@ -14172,7 +14158,6 @@ CONFIG_RUNTIME_TESTING_MENU=y
# CONFIG_INTERVAL_TREE_TEST is not set
# CONFIG_PERCPU_TEST is not set
# CONFIG_ATOMIC64_SELFTEST is not set
# CONFIG_ASYNC_RAID6_TEST is not set
# CONFIG_TEST_HEXDUMP is not set
# CONFIG_TEST_KSTRTOX is not set
# CONFIG_TEST_PRINTF is not set

View File

@@ -1,6 +1,6 @@
#
# Automatically generated file; DO NOT EDIT.
# Linux/riscv 6.12.77 Kernel Configuration
# Linux/riscv 6.12.80 Kernel Configuration
#
CONFIG_CC_VERSION_TEXT="clang version 22.1.2"
CONFIG_GCC_VERSION=0
@@ -37,11 +37,6 @@ CONFIG_BUILD_SALT=""
CONFIG_HAVE_KERNEL_GZIP=y
CONFIG_HAVE_KERNEL_ZSTD=y
# CONFIG_KERNEL_GZIP is not set
# CONFIG_KERNEL_BZIP2 is not set
# CONFIG_KERNEL_LZMA is not set
# CONFIG_KERNEL_XZ is not set
# CONFIG_KERNEL_LZO is not set
# CONFIG_KERNEL_LZ4 is not set
CONFIG_KERNEL_ZSTD=y
CONFIG_DEFAULT_INIT=""
CONFIG_DEFAULT_HOSTNAME="rosa-early"
@@ -2848,14 +2843,8 @@ CONFIG_PATA_ACPI=y
CONFIG_ATA_GENERIC=y
CONFIG_PATA_LEGACY=m
CONFIG_MD=y
CONFIG_BLK_DEV_MD=m
# CONFIG_BLK_DEV_MD is not set
CONFIG_MD_BITMAP_FILE=y
CONFIG_MD_LINEAR=m
CONFIG_MD_RAID0=m
CONFIG_MD_RAID1=m
CONFIG_MD_RAID10=m
CONFIG_MD_RAID456=m
CONFIG_MD_CLUSTER=m
CONFIG_BCACHE=m
# CONFIG_BCACHE_DEBUG is not set
# CONFIG_BCACHE_ASYNC_REGISTRATION is not set
@@ -2878,7 +2867,7 @@ CONFIG_DM_ERA=m
CONFIG_DM_CLONE=m
CONFIG_DM_MIRROR=m
CONFIG_DM_LOG_USERSPACE=m
CONFIG_DM_RAID=m
# CONFIG_DM_RAID is not set
CONFIG_DM_ZERO=m
CONFIG_DM_MULTIPATH=m
CONFIG_DM_MULTIPATH_QL=m
@@ -10655,10 +10644,7 @@ CONFIG_RANDSTRUCT_NONE=y
CONFIG_XOR_BLOCKS=m
CONFIG_ASYNC_CORE=m
CONFIG_ASYNC_MEMCPY=m
CONFIG_ASYNC_XOR=m
CONFIG_ASYNC_PQ=m
CONFIG_ASYNC_RAID6_RECOV=m
CONFIG_CRYPTO=y
#
@@ -10918,8 +10904,6 @@ CONFIG_BINARY_PRINTF=y
#
# Library routines
#
CONFIG_RAID6_PQ=m
CONFIG_RAID6_PQ_BENCHMARK=y
CONFIG_LINEAR_RANGES=y
CONFIG_PACKING=y
CONFIG_BITREVERSE=y
@@ -11408,7 +11392,6 @@ CONFIG_RUNTIME_TESTING_MENU=y
# CONFIG_INTERVAL_TREE_TEST is not set
# CONFIG_PERCPU_TEST is not set
# CONFIG_ATOMIC64_SELFTEST is not set
# CONFIG_ASYNC_RAID6_TEST is not set
# CONFIG_TEST_HEXDUMP is not set
# CONFIG_TEST_KSTRTOX is not set
# CONFIG_TEST_PRINTF is not set

View File

@@ -8,8 +8,8 @@ import (
func (t Toolchain) newLibexpat() (pkg.Artifact, string) {
const (
version = "2.7.5"
checksum = "vTRUjjg-qbHSXUBYKXgzVHkUO7UNyuhrkSYrE7ikApQm0g-OvQ8tspw4w55M-1Tp"
version = "2.8.0"
checksum = "pnwZ_JSif-OfoWIwk2JYXWHagOWMA3Sh-Ea0p-4Rz9U9mDEeAebhyvnfD7OYOMCk"
)
return t.NewPackage("libexpat", version, newFromGitHubRelease(
"libexpat/libexpat",

View File

@@ -5,10 +5,11 @@ import "hakurei.app/internal/pkg"
func (t Toolchain) newLibxml2() (pkg.Artifact, string) {
const (
version = "2.15.3"
checksum = "oWkNe53c3d4Lt4OzrXPHBcOLHJ3TWqpa0x7B7bh_DyZ-uIMiplpdZjQRgRWVal2h"
checksum = "oJy74htGlEpf70KPvpW18fYJo0RQQkCXZRwqUz6NoXborS3HCq3Nm4gsyaSeNmUH"
)
return t.NewPackage("libxml2", version, t.newTagRemote(
"https://gitlab.gnome.org/GNOME/libxml2.git",
return t.NewPackage("libxml2", version, newFromGitLab(
"gitlab.gnome.org",
"GNOME/libxml2",
"v"+version, checksum,
), &PackageAttr{
// can't create shell.out: Read-only file system

View File

@@ -5,10 +5,11 @@ import "hakurei.app/internal/pkg"
func (t Toolchain) newLibxslt() (pkg.Artifact, string) {
const (
version = "1.1.45"
checksum = "MZc_dyUWpHChkWDKa5iycrECxBsRd4ZMbYfL4VojTbung593mlH2tHGmxYB6NFYT"
checksum = "67ks7v8od2oWaEGf23Sst_Xbn_8brQyolQjqxPoO-lK35k_WJhi2Px5JJgbk-nfn"
)
return t.NewPackage("libxslt", version, t.newTagRemote(
"https://gitlab.gnome.org/GNOME/libxslt.git",
return t.NewPackage("libxslt", version, newFromGitLab(
"gitlab.gnome.org",
"GNOME/libxslt",
"v"+version, checksum,
), nil, &MakeHelper{
Generate: "NOCONFIGURE=1 ./autogen.sh",

View File

@@ -3,7 +3,7 @@ package rosa
// latest version of LLVM, conditional to temporarily avoid broken new releases
const (
llvmVersionMajor = "22"
llvmVersion = llvmVersionMajor + ".1.3"
llvmVersion = llvmVersionMajor + ".1.4"
llvmChecksum = "CUwnpzua_y28HZ9oI0NmcKL2wClsSjFpgY9do5-7cCZJHI5KNF64vfwGvY0TYyR3"
llvmChecksum = "Bk3t-tV5sD5T0bqefFMcLeFuAwXnhFipywZmqst5hAZs97QQWGKB_5XyAFjj5tDB"
)

View File

@@ -30,8 +30,8 @@ func init() {
func (t Toolchain) newLibdrm() (pkg.Artifact, string) {
const (
version = "2.4.131"
checksum = "riHPSpvTnvCPbR-iT4jt7_X-z4rpwm6oNh9ZN2zP6RBFkFVxBRKmedG4eEXSADIh"
version = "2.4.133"
checksum = "bfj296NcR9DndO11hqDbSRFPqaweSLMqRk3dlCPZpM6FONX1WZ9J4JdbTDMUd1rU"
)
return t.NewPackage("libdrm", version, newFromGitLab(
"gitlab.freedesktop.org",

View File

@@ -9,8 +9,8 @@ import (
func (t Toolchain) newMeson() (pkg.Artifact, string) {
const (
version = "1.11.0"
checksum = "QJolMPzypTiS65GReSNPPlkUjHI6b1EDpZ-avIk3n6b6TQ93KfUM57DVUpY97Hf7"
version = "1.11.1"
checksum = "uvILRxdopwc6Dy17UbIeClcQr0qHqyTaqyk1M9OqWKN9PwB9N6UVAiyN8kSSz3r2"
)
return t.NewPackage("meson", version, newFromGitHub(
"mesonbuild/meson",
@@ -22,7 +22,7 @@ func (t Toolchain) newMeson() (pkg.Artifact, string) {
},
}, &PipHelper{
EnterSource: true,
Script: `
Check: `
cd 'test cases'
rm -rf \
'common/32 has header' \
@@ -44,7 +44,7 @@ python3 ./run_project_tests.py \
--backend=ninja
`,
},
Setuptools,
PythonSetuptools,
PkgConfig,
CMake,
Ninja,

View File

@@ -8,8 +8,8 @@ import (
func (t Toolchain) newNSS() (pkg.Artifact, string) {
const (
version = "3.123"
checksum = "pwBz0FO8jmhejPblfzNQLGsqBBGT0DwAw-z9yBJH3V3hVJBMKSc1l0R8GC0_BnzF"
version = "3.123.1"
checksum = "g811Z_fc74ssg-s6BeXRG-ipSfJggD6hrxjVJxrOBIz98CE7piv0OLwzIRLMQpwR"
version0 = "4_38_2"
checksum0 = "25x2uJeQnOHIiq_zj17b4sYqKgeoU8-IsySUptoPcdHZ52PohFZfGuIisBreWzx0"
@@ -82,30 +82,24 @@ func init() {
}
}
func (t Toolchain) newBuildCATrust() (pkg.Artifact, string) {
func init() {
const (
version = "0.5.1"
checksum = "g9AqIksz-hvCUceSR7ZKwfqf8Y_UsJU_3_zLUIdc4IkxFVkgdv9kKVvhFjE4s1-7"
checksum = "oxjnuIrPVMPvD6x8VFLqB7EdbfuhouGQdtPuHDpEHGzoyH5nkxqtYN9UthMY9noA"
)
artifactsM[buildcatrust] = newPythonPackage(
"buildcatrust", 233988,
"transform certificate stores between formats",
"https://github.com/nix-community/buildcatrust",
version, newFromGitHub(
"nix-community/buildcatrust",
"v"+version, checksum,
), &PackageAttr{
ScriptEarly: `
rm buildcatrust/tests/test_nonhermetic.py
`,
}, nil, P{PythonFlitCore},
)
return t.newViaPip("buildcatrust", version,
"https://github.com/nix-community/buildcatrust/releases/"+
"download/v"+version+"/buildcatrust-"+version+"-py3-none-any.whl",
checksum), version
}
func init() {
artifactsM[buildcatrust] = Metadata{
f: Toolchain.newBuildCATrust,
Name: "buildcatrust",
Description: "transform certificate stores between formats",
Website: "https://github.com/nix-community/buildcatrust",
Dependencies: P{
Python,
},
ID: 233988,
}
}
func (t Toolchain) newNSSCACert() (pkg.Artifact, string) {

View File

@@ -1,7 +1,6 @@
package rosa
import (
"path"
"slices"
"strings"
@@ -23,6 +22,32 @@ func (t Toolchain) newPython() (pkg.Artifact, string) {
Writable: true,
Chmod: true,
Patches: []KV{
{"zipfile-no-default-strict_timestamps", `diff --git a/Lib/zipfile/__init__.py b/Lib/zipfile/__init__.py
index 19aea290b58..51603ba9510 100644
--- a/Lib/zipfile/__init__.py
+++ b/Lib/zipfile/__init__.py
@@ -617,7 +617,7 @@ def _decodeExtra(self, filename_crc):
extra = extra[ln+4:]
@classmethod
- def from_file(cls, filename, arcname=None, *, strict_timestamps=True):
+ def from_file(cls, filename, arcname=None, *, strict_timestamps=False):
"""Construct an appropriate ZipInfo for a file on the filesystem.
filename should be the path to a file or directory on the filesystem.
@@ -1412,7 +1412,7 @@ class ZipFile:
_windows_illegal_name_trans_table = None
def __init__(self, file, mode="r", compression=ZIP_STORED, allowZip64=True,
- compresslevel=None, *, strict_timestamps=True, metadata_encoding=None):
+ compresslevel=None, *, strict_timestamps=False, metadata_encoding=None):
"""Open the ZIP file with mode read 'r', write 'w', exclusive create 'x',
or append 'a'."""
if mode not in ('r', 'w', 'x', 'a'):
`},
},
Env: []string{
"EXTRATESTOPTS=-j0 -x " + strings.Join([]string{
// requires internet access (http://www.pythontest.net/)
@@ -36,7 +61,7 @@ func (t Toolchain) newPython() (pkg.Artifact, string) {
"test_os",
"test_subprocess",
// somehow picks up mtime of source code
// patched out insane strict_timestamps default
"test_zipfile",
// requires gcc
@@ -83,18 +108,31 @@ func init() {
// PipHelper is the [Python] pip packaging helper.
type PipHelper struct {
// Path elements joined with source.
Append []string
// Whether to omit --no-build-isolation.
BuildIsolation bool
// Whether to enter source after install.
EnterSource bool
// Whether to install to build environment after install.
Install bool
// Whether to skip running tests.
SkipCheck bool
// Replaces pytest if non-empty.
Check string
// Runs after install.
Script string
}
var _ Helper = new(PipHelper)
// extra returns python.
func (*PipHelper) extra(int) P { return P{Python} }
// extra returns python, or pytest if defaults are assumed.
func (attr *PipHelper) extra(int) P {
if attr == nil || (!attr.SkipCheck && attr.Check == "") {
return P{PythonPyTest}
}
return P{Python}
}
// wantsChmod returns true.
func (*PipHelper) wantsChmod() bool { return true }
@@ -116,6 +154,7 @@ func (attr *PipHelper) script(name string) string {
if attr == nil {
attr = new(PipHelper)
}
sourcePath := AbsUsrSrc.Append(name).Append(attr.Append...)
var extra string
if !attr.BuildIsolation {
@@ -123,61 +162,56 @@ func (attr *PipHelper) script(name string) string {
--no-build-isolation \`
}
script := attr.Script
if attr.EnterSource {
script = "cd '/usr/src/" + name + "'\n" + script
var script string
if attr.Install {
script += `pip3 install \
--no-index \
--prefix=/system \
--no-build-isolation \
'` + sourcePath.String() + `'
`
}
if attr.EnterSource {
script += "cd '/usr/src/" + name + "'\n"
}
if !attr.SkipCheck {
if attr.Check == "" {
// some test suites fall apart when run out-of-tree
script += "(cd '" + sourcePath.String() + "' && pytest)\n"
} else {
script += attr.Check
}
}
script += attr.Script
return `
pip3 install \
--no-index \
--prefix=/system \
--root=/work \` + extra + `
'/usr/src/` + name + `'
'` + sourcePath.String() + `'
` + script
}
// newViaPip installs a pip wheel from a url.
func (t Toolchain) newViaPip(
name, version, url, checksum string,
extra ...PArtifact,
) pkg.Artifact {
return t.New(name+"-"+version, 0, t.AppendPresets(nil,
slices.Concat(P{Python}, extra)...,
), nil, nil, `
pip3 install \
--no-index \
--prefix=/system \
--root=/work \
'/usr/src/`+path.Base(url)+`'
`, pkg.Path(AbsUsrSrc.Append(path.Base(url)), false, pkg.NewHTTPGet(
nil, url,
mustDecode(checksum),
)))
}
// newPypi creates [Metadata] for a [pypi] package.
//
// [pypi]: https://pypi.org/
func newPypi(
name string, id int,
description, version, interpreter, abi, platform, checksum string,
extra ...PArtifact,
// newPythonPackage creates [Metadata] for a [Python] package.
func newPythonPackage(
name string, id int, description, website, version string,
source pkg.Artifact, attrP *PackageAttr, attr *PipHelper,
build P, extra ...PArtifact,
) Metadata {
name = "python-" + name
return Metadata{
f: func(t Toolchain) (pkg.Artifact, string) {
return t.newViaPip(name, version, "https://files.pythonhosted.org/"+path.Join(
"packages",
interpreter,
string(name[0]),
name,
name+"-"+version+"-"+interpreter+"-"+abi+"-"+platform+".whl",
), checksum, extra...), version
return t.NewPackage(name, version, source, attrP, attr, slices.Concat(
P{Python},
extra,
build,
)...), version
},
Name: "python-" + name,
Name: name,
Description: description,
Website: "https://pypi.org/project/" + name + "/",
Website: website,
Dependencies: slices.Concat(P{Python}, extra),
@@ -185,100 +219,345 @@ func newPypi(
}
}
func (t Toolchain) newSetuptools() (pkg.Artifact, string) {
func init() {
const (
version = "0.47.0"
checksum = "HZ-MvkUP8mbbx2YmsRNswj_bbOCIiXckuHqL5Qbvb5NxN5DYfWnqwkGNyS7OrId0"
)
artifactsM[PythonWheel] = newPythonPackage(
"wheel", 11428,
"the official binary distribution format for Python",
"https://peps.python.org/pep-0427/",
version, newFromGitHub(
"pypa/wheel",
version, checksum,
), nil, &PipHelper{
Install: true,
}, P{PythonFlitCore, PythonSetuptools},
)
}
func init() {
const (
version = "82.0.1"
checksum = "nznP46Tj539yqswtOrIM4nQgwLA1h-ApKX7z7ghazROCpyF5swtQGwsZoI93wkhc"
)
return t.NewPackage("setuptools", version, newFromGitHub(
"pypa/setuptools",
"v"+version, checksum,
), nil, &PipHelper{
BuildIsolation: true,
}), version
}
func init() {
artifactsM[Setuptools] = Metadata{
f: Toolchain.newSetuptools,
Name: "python-setuptools",
Description: "the autotools of the Python ecosystem",
Website: "https://pypi.org/project/setuptools/",
Dependencies: P{
Python,
},
ID: 4021,
}
artifactsM[PythonSetuptools] = newPythonPackage(
"setuptools", 4021,
"the autotools of the Python ecosystem",
"https://pypi.org/project/setuptools/",
version, newFromGitHub(
"pypa/setuptools",
"v"+version, checksum,
), nil, &PipHelper{
// error: invalid command 'dist_info'
BuildIsolation: true,
// pytest circular dependency
SkipCheck: true,
}, nil)
}
func init() {
artifactsM[PythonPygments] = newPypi(
"pygments", 3986,
" a syntax highlighting package written in Python",
"2.20.0", "py3", "none", "any",
"qlyqX2YSXcV0Z8XgGaPttc_gkq-xsu_nYs6NFOcYnk-CX7qmcj45gG-h6DpwPIcO",
const (
version = "1.1.1"
checksum = "rXZixTsZcRcIoUC1LvWrjySsiXSv5uhW6ng2P-yXZrbdj7FrSrDeJLCfC2b-ladV"
)
artifactsM[PythonPluggy] = newPypi(
"pluggy", 7500,
"the core framework used by the pytest, tox, and devpi projects",
"1.6.0", "py3", "none", "any",
"2HWYBaEwM66-y1hSUcWI1MyE7dVVuNNRW24XD6iJBey4YaUdAK8WeXdtFMQGC-4J",
artifactsM[PythonVCSVersioning] = newPythonPackage(
"vcs-versioning", 389421,
"core VCS versioning functionality extracted as a standalone library",
"https://setuptools-scm.readthedocs.io/en/latest/",
version, newFromGitHub(
"pypa/setuptools-scm",
"vcs-versioning-v"+version, checksum,
), &PackageAttr{
Env: []string{
"SETUPTOOLS_SCM_PRETEND_VERSION=" + version,
},
}, &PipHelper{
// upstream is a monorepo of two packages (setuptools-scm)
Append: []string{"vcs-versioning"},
// pytest circular dependency
SkipCheck: true,
}, nil,
PythonSetuptools,
PythonPackaging,
)
}
artifactsM[PythonPackaging] = newPypi(
func init() {
const (
version = "10.0.5"
checksum = "vTN_TPd-b4Wbsw5WmAcsWjrs-FNXXznOeVTDnb54NtXve9Oy-eb2HPy-RG3FzNqp"
)
artifactsM[PythonSetuptoolsSCM] = newPythonPackage(
"setuptools-scm", 7874,
"extracts Python package versions from Git or Mercurial metadata",
"https://setuptools-scm.readthedocs.io/en/latest/",
version, newFromGitHub(
"pypa/setuptools-scm",
"setuptools-scm-v"+version, checksum,
), &PackageAttr{
Env: []string{
"SETUPTOOLS_SCM_PRETEND_VERSION=" + version,
},
}, &PipHelper{
// upstream is a monorepo of two packages
Append: []string{"setuptools-scm"},
// pytest circular dependency
SkipCheck: true,
}, nil,
PythonSetuptools,
PythonVCSVersioning,
)
}
func init() {
const (
version = "3.12.0"
checksum = "VcTsiGiDU1aPLbjSPe38f9OjJDCLcxFz9loObJqUI1ZxDHXAaQMxBpNyLz_G1Rff"
)
artifactsM[PythonFlitCore] = newPythonPackage(
"flit-core", 44841,
"a PEP 517 build backend for packages using Flit",
"https://flit.pypa.io/",
version, newFromGitHub(
"pypa/flit",
version, checksum,
), nil, &PipHelper{
// upstream has other unused packages with many dependencies
Append: []string{"flit_core"},
// pytest circular dependency
SkipCheck: true,
}, nil,
)
}
func init() {
const (
version = "26.2"
checksum = "rdpGa2EkPFbj1mFtLKLnSwIX9gPfELcuneiICjRVDNw6By49szTFVoW8gtMMZ6ZS"
)
artifactsM[PythonPackaging] = newPythonPackage(
"packaging", 60461,
"reusable core utilities for various Python Packaging interoperability specifications",
"26.1", "py3", "none", "any",
"6WZjBJeRb0eZZavxM8cLPcgD-ch-1FblsHoCFKC_9VUC5XAmd397LwliVhsnQcSN",
"https://packaging.pypa.io/",
version, newFromGitHub(
"pypa/packaging",
version, checksum,
), nil, &PipHelper{
// pytest circular dependency
SkipCheck: true,
}, P{PythonFlitCore},
)
}
artifactsM[PythonIniConfig] = newPypi(
func init() {
const (
version = "1.1.1"
checksum = "1fVwoal6FoKXczoG3qRUi87TxSWESSGcgvnbEZDYuaOgsO25o36iF3SbAhwkr4Va"
)
artifactsM[PythonPathspec] = newPythonPackage(
"pathspec", 23424,
"utility library for gitignore style pattern matching of file paths",
"https://github.com/cpburnz/python-pathspec",
version, newFromGitHub(
"cpburnz/python-pathspec",
"v"+version, checksum,
), nil, &PipHelper{
// pytest circular dependency
SkipCheck: true,
}, P{PythonFlitCore},
)
}
func init() {
const (
version = "2026.1.14.14"
checksum = "ADfyOvGq7wfOhW10xTVvTmcuY7ZXw8Xlqhz7AM6KULOEm42rpk_45hfcO2PgxOIe"
)
artifactsM[PythonTroveClassifiers] = newPythonPackage(
"trove-classifiers", 88298,
"canonical source for classifiers on PyPI",
"https://pypi.org/p/trove-classifiers/",
version, newFromGitHub(
"pypa/trove-classifiers",
version, checksum,
), nil, &PipHelper{
// pytest circular dependency
SkipCheck: true,
}, P{PythonSetuptools},
)
}
func init() {
const (
version = "1.6.0"
checksum = "GiUgDkKjF8Xn1cmq6iMhTGXzcPIYeaJrvQpHBSAJapNVx4UyuiTXqd5eVlxSClJu"
)
artifactsM[PythonPluggy] = newPythonPackage(
"pluggy", 7500,
"the core framework used by the pytest, tox, and devpi projects",
"https://pluggy.readthedocs.io/en/latest/",
version, newFromGitHub(
"pytest-dev/pluggy",
version, checksum,
), &PackageAttr{
Env: []string{
"SETUPTOOLS_SCM_PRETEND_VERSION_FOR_PLUGGY=" + version,
},
}, &PipHelper{
// pytest circular dependency
SkipCheck: true,
}, P{PythonSetuptoolsSCM},
)
}
func init() {
const (
version = "1.16.5"
checksum = "V2eREtqZLZeV85yb4O-bfAJCUluHcQP76Qfs0QH5s7RF_Oc8xIP8jD0jl85qFyWk"
)
artifactsM[PythonHatchling] = newPythonPackage(
"hatchling", 16137,
"the extensible, standards compliant build backend used by Hatch",
"https://hatch.pypa.io/latest/",
version, newFromGitHub(
"pypa/hatch",
"hatch-v"+version, checksum,
), nil, &PipHelper{
// upstream has other unused packages with many dependencies
Append: []string{"backend"},
// pytest circular dependency
SkipCheck: true,
}, nil,
PythonPackaging,
PythonPathspec,
PythonTroveClassifiers,
PythonPluggy,
)
}
func init() {
const (
version = "2.20.0"
checksum = "L-2P6vn7c_CNZYliE5CJAWLxO1ziDQVVkf8bnZbHj8aSCQ43oWv11wC9KzU9MeCa"
)
artifactsM[PythonPygments] = newPythonPackage(
"pygments", 3986,
"a syntax highlighting package written in Python",
"https://pygments.org/",
version, newFromGitHub(
"pygments/pygments",
version, checksum,
), nil, &PipHelper{
// pytest circular dependency
SkipCheck: true,
}, P{PythonHatchling},
)
}
func init() {
const (
version = "2.3.0"
checksum = "mH7VBZaXcYatBPE3RQQZvSzz_Ay8IPPek60NpPHZulPq4ReAFUUsA4EPWfiyMknZ"
)
artifactsM[PythonIniConfig] = newPythonPackage(
"iniconfig", 114778,
"a small and simple INI-file parser module",
"2.3.0", "py3", "none", "any",
"SDgs4S5bXi77aVOeKTPv2TUrS3M9rduiK4DpU0hCmDsSBWqnZcWInq9lsx6INxut",
"https://github.com/pytest-dev/iniconfig",
version, newFromGitHub(
"pytest-dev/iniconfig",
"v"+version, checksum,
), &PackageAttr{
Env: []string{
"SETUPTOOLS_SCM_PRETEND_VERSION_FOR_INICONFIG=" + version,
},
}, &PipHelper{
// pytest circular dependency
SkipCheck: true,
}, P{PythonSetuptoolsSCM},
)
}
artifactsM[PythonPyTest] = newPypi(
func init() {
const (
version = "9.0.3"
checksum = "qfLL_znWhbJCDbNJvrx9H3-orJ86z4ifhaW0bIn21jl2sDP-FVoX_1yieOypArQe"
)
artifactsM[PythonPyTest] = newPythonPackage(
"pytest", 3765,
"the pytest framework",
"9.0.3", "py3", "none", "any",
"57WLrIVOfyoRDjt5qD6LGOaDcDCtzQnKDSTUb7GzHyJDtry_nGHHs4-0tW0tiIJr",
"https://pytest.org",
version, newFromGitHub(
"pytest-dev/pytest",
version, checksum,
), &PackageAttr{
Env: []string{
"SETUPTOOLS_SCM_PRETEND_VERSION_FOR_PYTEST=" + version,
},
}, &PipHelper{
// many dependencies
SkipCheck: true,
}, P{PythonSetuptoolsSCM},
PythonIniConfig,
PythonPackaging,
PythonPluggy,
PythonPygments,
)
}
artifactsM[PythonMarkupSafe] = newPypi(
func init() {
const (
version = "3.0.3"
checksum = "txRGYdWE3his1lHHRI-lZADw0-ILvUg2l5OGdFHtFXIb_QowGxwdxHCUSJIgmjQs"
)
artifactsM[PythonMarkupSafe] = newPythonPackage(
"markupsafe", 3918,
"implements a text object that escapes characters so it is safe to use in HTML and XML",
"3.0.3", "cp314", "cp314", "musllinux_1_2_"+linuxArch(),
perArch[string]{
"amd64": "E2mo9ig_FKgTpGon_8qqviSEULwhnmxTIqd9vfyNxNpK4yofVYM7eLW_VE-LKbtO",
"arm64": "iG_hqsncOs8fA7bCaAg0x9XenXWlo9sqblyPcSG7yA9sfGLvM9KZznCpwWfOCwFC",
"riscv64": "7DI7U0M3jvr7U4uZml25GLw3m3EvMubCtNukZmss1gkVJ_DVkhV5DgX3Wt_sztbv",
}.unwrap(),
)
artifactsM[PythonMako] = newPypi(
"mako", 3915,
"a template library written in Python",
"1.3.11", "py3", "none", "any",
"WJ_hxYI-nNiuDiM6QhfAG84uO5U-M2aneB0JS9AQ2J2Oi6YXAbBxIdOeOEng6CoS",
PythonMarkupSafe,
)
artifactsM[PythonPyYAML] = newPypi(
"pyyaml", 4123,
"a YAML parser and emitter for Python",
"6.0.3", "cp314", "cp314", "musllinux_1_2_"+linuxArch(),
perArch[string]{
"amd64": "4_jhCFpUNtyrFp2HOMqUisR005u90MHId53eS7rkUbcGXkoaJ7JRsY21dREHEfGN",
"arm64": "sQ818ZYSmC7Vj9prIPx3sEYqSDhZlWvLbgHV9w4GjxsfQ63ZSzappctKM7Lb0Whw",
}.unwrap(),
"https://markupsafe.palletsprojects.com/",
version, newFromGitHub(
"pallets/markupsafe",
version, checksum,
), nil, &PipHelper{
// ModuleNotFoundError: No module named 'markupsafe'
Install: true,
}, P{PythonSetuptools},
)
}
func init() {
const (
version = "1.3.11"
checksum = "HbNreFhqmvmplmIa6h5TVnkwYYlPnsxBzkTXEA-esyx04GzybjTb7OU8Go07Kfyz"
)
artifactsM[PythonMako] = newPythonPackage(
"mako", 3915,
"a template library written in Python",
"https://www.makotemplates.org/",
version, newFromGitHub(
"sqlalchemy/mako",
"rel_"+strings.Join(strings.SplitN(version, ".", 3), "_"),
checksum,
), nil, nil, P{PythonSetuptools},
PythonMarkupSafe,
)
}
func init() {
const (
version = "6.0.3"
checksum = "7wDv0RW9chBdu9l5Q4Hun5F2HHdo105ZSIixwdFPKbEYbftW9YxmsegfL-zafnbJ"
)
artifactsM[PythonPyYAML] = newPythonPackage(
"pyyaml", 4123,
"a YAML parser and emitter for Python",
"https://pyyaml.org/",
version, newFromGitHub(
"yaml/pyyaml",
version, checksum,
), nil, &PipHelper{
// ModuleNotFoundError: No module named 'yaml'
Install: true,
}, P{PythonSetuptools},
)
}

View File

@@ -4,8 +4,8 @@ import "hakurei.app/internal/pkg"
func (t Toolchain) newQEMU() (pkg.Artifact, string) {
const (
version = "10.2.2"
checksum = "uNzRxlrVoLWe-EmZmBp75SezymgE512iE5XN90Bl7wi6CjE_oQGQB-9ocs7E16QG"
version = "11.0.0"
checksum = "C64gdi_Tkdg2fTwD9ERxtWGcf8vNn_6UvczW0c-x0KW1NZtd3NbEOIrlDhYGn15n"
)
return t.NewPackage("qemu", version, newTar(
"https://download.qemu.org/qemu-"+version+".tar.bz2",
@@ -73,6 +73,8 @@ EOF
},
Bash,
Python,
PythonSetuptools,
PythonWheel,
Ninja,
PkgConfig,
Diffutils,

View File

@@ -3,7 +3,6 @@ package rosa
import (
"errors"
"log"
"path"
"runtime"
"slices"
@@ -15,6 +14,12 @@ import (
"hakurei.app/internal/pkg"
)
// Extension is the variant identification string of custom artifact
// implementations registered by package rosa.
const Extension = "rosa"
func init() { pkg.SetExtension(Extension) }
const (
// kindEtc is the kind of [pkg.Artifact] of cureEtc.
kindEtc = iota + pkg.KindCustomOffset
@@ -28,7 +33,7 @@ const (
func mustDecode(s string) pkg.Checksum {
var fallback = pkg.Checksum{}
if s == "" {
log.Println(
println(
"falling back to",
pkg.Encode(fallback),
"for unpopulated checksum",

View File

@@ -4,8 +4,8 @@ import "hakurei.app/internal/pkg"
func (t Toolchain) newRsync() (pkg.Artifact, string) {
const (
version = "3.4.1"
checksum = "VBlTsBWd9z3r2-ex7GkWeWxkUc5OrlgDzikAC0pK7ufTjAJ0MbmC_N04oSVTGPiv"
version = "3.4.2"
checksum = "t7PxS4WHXzefLMKKc_3hJgxUmlGG6KgHMZ8i4DZvCQAUAizxbclNKwfLyOHyq5BX"
)
return t.NewPackage("rsync", version, newTar(
"https://download.samba.org/pub/rsync/src/"+

View File

@@ -26,7 +26,7 @@ def swaymsg(command: str = "", succeed=True, type="command"):
def check_filter(check_offset, name, pname):
pid = int(machine.wait_until_succeeds(f"pgrep -U {10000+check_offset} -x {pname}", timeout=60))
pid = int(machine.wait_until_succeeds(f"pgrep -U {10000+check_offset} -x {pname}"))
hash = machine.succeed(f"sudo -u alice -i XDG_RUNTIME_DIR=/run/user/1000 WAYLAND_DISPLAY=wayland-1 check-sandbox-{name} hash")
print(machine.succeed(f"hakurei-test -s {hash} filter {pid}"))
@@ -47,7 +47,7 @@ check_filter(0, "pdlike", "cat")
# Check fd leak:
swaymsg("exec exec 127</proc/cmdline && hakurei -v exec sleep infinity")
pd_identity0_sleep_pid = int(machine.wait_until_succeeds("pgrep -U 10000 -x sleep", timeout=60))
pd_identity0_sleep_pid = int(machine.wait_until_succeeds("pgrep -U 10000 -x sleep"))
print(machine.succeed(f"hakurei-test fd {pd_identity0_sleep_pid}"))
machine.succeed(f"kill -INT {pd_identity0_sleep_pid}")
@@ -66,7 +66,7 @@ check_offset = 0
def check_sandbox(name):
global check_offset
swaymsg(f"exec script /dev/null -E always -qec check-sandbox-{name}")
machine.wait_for_file(f"/var/tmp/.hakurei-check-ok.{check_offset}", timeout=60)
machine.wait_for_file(f"/var/tmp/.hakurei-check-ok.{check_offset}")
check_filter(check_offset, name, "hakurei-test")
check_offset += 1
@@ -79,7 +79,7 @@ check_sandbox("device")
check_sandbox("pdlike")
# Exit Sway and verify process exit status 0:
machine.wait_until_fails("pgrep -x hakurei", timeout=5)
machine.wait_until_fails("pgrep -x hakurei")
swaymsg("exit", succeed=False)
machine.wait_for_file("/tmp/sway-exit-ok")

View File

@@ -119,7 +119,7 @@ def hakurei_identity(offset):
# Start hakurei permissive defaults outside Wayland session:
print(machine.succeed("sudo -u alice -i hakurei -v exec -a 0 touch /tmp/pd-bare-ok"))
machine.wait_for_file("/tmp/hakurei.0/tmpdir/0/pd-bare-ok", timeout=5)
machine.wait_for_file("/tmp/hakurei.0/tmpdir/0/pd-bare-ok")
# Verify silent output permissive defaults:
output = machine.succeed("sudo -u alice -i hakurei exec -a 0 true &>/dev/stdout")
@@ -132,11 +132,11 @@ def silent_output_interrupt(flags):
wait_for_window("alice@machine")
# identity 0 does not have home-manager
machine.send_chars(f"exec hakurei exec {flags}-a 0 sh -c 'export PATH=/run/current-system/sw/bin:$PATH && touch /tmp/pd-silent-ready && sleep infinity' &>/tmp/pd-silent\n")
-machine.wait_for_file("/tmp/hakurei.0/tmpdir/0/pd-silent-ready", timeout=15)
+machine.wait_for_file("/tmp/hakurei.0/tmpdir/0/pd-silent-ready")
machine.succeed("rm /tmp/hakurei.0/tmpdir/0/pd-silent-ready")
machine.send_key("ctrl-c")
-machine.wait_until_fails("pgrep foot", timeout=5)
-machine.wait_until_fails(f"pgrep -u alice -f 'hakurei exec {flags}-a 0 '", timeout=5)
+machine.wait_until_fails("pgrep foot")
+machine.wait_until_fails(f"pgrep -u alice -f 'hakurei exec {flags}-a 0 '")
output = machine.succeed("cat /tmp/pd-silent && rm /tmp/pd-silent")
if output != "":
raise Exception(f"unexpected output\n{output}")
@@ -151,7 +151,7 @@ print(machine.fail("sudo -u alice -i hakurei -v exec --wayland true"))
# Start hakurei permissive defaults within Wayland session:
hakurei('-v exec --wayland --dbus --dbus-log notify-send -a "NixOS Tests" "Test notification" "Notification from within sandbox." && touch /tmp/dbus-ok')
-machine.wait_for_file("/tmp/dbus-ok", timeout=15)
+machine.wait_for_file("/tmp/dbus-ok")
collect_state_ui("dbus_notify_exited")
# not in pid namespace, verify termination
machine.wait_until_fails("pgrep xdg-dbus-proxy")
@@ -165,11 +165,11 @@ hakurei("-v exec --wayland -X --dbus --pulse -u p1 foot && touch /tmp/p1-exit-ok
wait_for_window("p1@machine")
print(machine.succeed("getfacl --absolute-names --omit-header --numeric /tmp/hakurei.0/runtime | grep 10000"))
machine.send_chars("exit\n")
-machine.wait_for_file("/tmp/p1-exit-ok", timeout=15)
+machine.wait_for_file("/tmp/p1-exit-ok")
# Verify acl is kept alive:
print(machine.succeed("getfacl --absolute-names --omit-header --numeric /tmp/hakurei.0/runtime | grep 10000"))
machine.send_chars("exit\n")
-machine.wait_for_file("/tmp/p0-exit-ok", timeout=15)
+machine.wait_for_file("/tmp/p0-exit-ok")
machine.fail("getfacl --absolute-names --omit-header --numeric /tmp/hakurei.0/runtime | grep 10000")
# Check invalid identifier fd behaviour:
@@ -181,7 +181,7 @@ print(machine.succeed('grep "^hakurei: cannot write identifier: bad file descrip
swaymsg("exec sh -c 'ne-foot; echo -n $? > /tmp/monitor-exit-code'")
wait_for_window(f"u0_a{hakurei_identity(0)}@machine")
machine.succeed("pkill -INT -f 'hakurei -v run '")
-machine.wait_until_fails("pgrep foot", timeout=5)
+machine.wait_until_fails("pgrep foot")
machine.wait_for_file("/tmp/monitor-exit-code")
interrupt_exit_code = int(machine.succeed("cat /tmp/monitor-exit-code"))
if interrupt_exit_code != 230:
@@ -191,7 +191,7 @@ if interrupt_exit_code != 230:
swaymsg("exec sh -c 'ne-foot-immediate; echo -n $? > /tmp/monitor-exit-code'")
wait_for_window(f"u0_a{hakurei_identity(0)}@machine")
machine.succeed("pkill -INT -f 'hakurei -v run '")
-machine.wait_until_fails("pgrep foot", timeout=5)
+machine.wait_until_fails("pgrep foot")
machine.wait_for_file("/tmp/monitor-exit-code")
interrupt_exit_code = int(machine.succeed("cat /tmp/monitor-exit-code"))
if interrupt_exit_code != 254:
@@ -202,7 +202,7 @@ swaymsg("exec sh -c 'ne-foot &> /tmp/shim-cont-unexpected-pid'")
wait_for_window(f"u0_a{hakurei_identity(0)}@machine")
machine.succeed("pkill -CONT -f 'hakurei shim'")
machine.succeed("pkill -INT -f 'hakurei -v run '")
-machine.wait_until_fails("pgrep foot", timeout=5)
+machine.wait_until_fails("pgrep foot")
machine.wait_for_file("/tmp/shim-cont-unexpected-pid")
print(machine.succeed('grep "shim: got SIGCONT from unexpected process$" /tmp/shim-cont-unexpected-pid'))
@@ -221,26 +221,26 @@ if sched_rr != 2:
swaymsg("exec ne-foot")
wait_for_window(f"u0_a{hakurei_identity(0)}@machine")
machine.send_chars("clear; wayland-info && touch /var/tmp/client-ok\n")
-machine.wait_for_file("/var/tmp/client-ok", timeout=15)
+machine.wait_for_file("/var/tmp/client-ok")
collect_state_ui("foot_wayland")
check_state("ne-foot", {"wayland": True})
# Verify lack of acl on XDG_RUNTIME_DIR:
machine.fail(f"getfacl --absolute-names --omit-header --numeric /run/user/1000 | grep {hakurei_identity(0) + 10000}")
machine.send_chars("exit\n")
-machine.wait_until_fails("pgrep foot", timeout=5)
-machine.fail(f"getfacl --absolute-names --omit-header --numeric /run/user/1000 | grep {hakurei_identity(0) + 10000}", timeout=5)
+machine.wait_until_fails("pgrep foot")
+machine.fail(f"getfacl --absolute-names --omit-header --numeric /run/user/1000 | grep {hakurei_identity(0) + 10000}")
# Test pipewire-pulse:
swaymsg("exec pa-foot")
wait_for_window(f"u0_a{hakurei_identity(1)}@machine")
machine.send_chars("clear; pactl info && touch /var/tmp/pulse-ok\n")
-machine.wait_for_file("/var/tmp/pulse-ok", timeout=15)
+machine.wait_for_file("/var/tmp/pulse-ok")
collect_state_ui("pulse_wayland")
check_state("pa-foot", {"wayland": True, "pipewire": True})
machine.fail("find /tmp -maxdepth 1 -type d -name '.hakurei-shim-*' -print -exec false '{}' +")
machine.send_chars("exit\n")
-machine.wait_until_fails("pgrep foot", timeout=5)
-machine.wait_until_fails("pgrep -x hakurei", timeout=5)
+machine.wait_until_fails("pgrep foot")
+machine.wait_until_fails("pgrep -x hakurei")
machine.succeed("find /tmp -maxdepth 1 -type d -name '.hakurei-shim-*' -print -exec false '{}' +")
# Test PipeWire SecurityContext:
machine.succeed("sudo -u alice -i XDG_RUNTIME_DIR=/run/user/1000 hakurei -v exec --pulse pactl info")
@@ -253,25 +253,25 @@ machine.fail("sudo -u alice -i XDG_RUNTIME_DIR=/run/user/1000 hakurei -v exec --
swaymsg("exec x11-alacritty")
wait_for_window(f"u0_a{hakurei_identity(0)}@machine")
machine.send_chars("clear; glinfo && touch /var/tmp/x11-ok\n")
-machine.wait_for_file("/var/tmp/x11-ok", timeout=15)
+machine.wait_for_file("/var/tmp/x11-ok")
collect_state_ui("alacritty_x11")
check_state("x11-alacritty", {"x11": True})
machine.send_chars("exit\n")
-machine.wait_until_fails("pgrep alacritty", timeout=5)
+machine.wait_until_fails("pgrep alacritty")
# Start app (foot) with direct Wayland access:
swaymsg("exec da-foot")
wait_for_window(f"u0_a{hakurei_identity(3)}@machine")
machine.send_chars("clear; wayland-info && touch /var/tmp/direct-ok\n")
collect_state_ui("foot_direct")
-machine.wait_for_file("/var/tmp/direct-ok", timeout=15)
+machine.wait_for_file("/var/tmp/direct-ok")
check_state("da-foot", {"wayland": True})
# Verify acl on XDG_RUNTIME_DIR:
print(machine.succeed(f"getfacl --absolute-names --omit-header --numeric /run/user/1000 | grep {hakurei_identity(3) + 10000}"))
machine.send_chars("exit\n")
-machine.wait_until_fails("pgrep foot", timeout=5)
+machine.wait_until_fails("pgrep foot")
# Verify acl cleanup on XDG_RUNTIME_DIR:
-machine.wait_until_fails(f"getfacl --absolute-names --omit-header --numeric /run/user/1000 | grep {hakurei_identity(3) + 10000}", timeout=5)
+machine.wait_until_fails(f"getfacl --absolute-names --omit-header --numeric /run/user/1000 | grep {hakurei_identity(3) + 10000}")
# Test syscall filter:
print(machine.fail("sudo -u alice -i XDG_RUNTIME_DIR=/run/user/1000 strace-failure"))
@@ -280,20 +280,20 @@ print(machine.fail("sudo -u alice -i XDG_RUNTIME_DIR=/run/user/1000 strace-failu
swaymsg("exec foot $SHELL -c '(ne-foot) & disown && exec $SHELL'")
wait_for_window(f"u0_a{hakurei_identity(0)}@machine")
machine.send_chars("clear; wayland-info && touch /var/tmp/term-ok\n")
-machine.wait_for_file("/var/tmp/term-ok", timeout=15)
+machine.wait_for_file("/var/tmp/term-ok")
machine.send_key("alt-h")
machine.send_chars("clear; hakurei show $(hakurei ps --short) && touch /tmp/ps-show-ok && exec cat\n")
-machine.wait_for_file("/tmp/ps-show-ok", timeout=5)
+machine.wait_for_file("/tmp/ps-show-ok")
collect_state_ui("foot_wayland_term")
check_state("ne-foot", {"wayland": True})
machine.send_key("alt-l")
machine.send_chars("exit\n")
wait_for_window("alice@machine")
machine.send_key("ctrl-c")
-machine.wait_until_fails("pgrep foot", timeout=5)
+machine.wait_until_fails("pgrep foot")
# Exit Sway and verify process exit status 0:
-machine.wait_until_fails("pgrep -x hakurei", timeout=5)
+machine.wait_until_fails("pgrep -x hakurei")
swaymsg("exit", succeed=False)
machine.wait_for_file("/tmp/sway-exit-ok")
@@ -308,4 +308,4 @@ machine.succeed("find /tmp -maxdepth 1 -type d -name '.hakurei-shim-*' -print -e
# Verify go test status:
machine.wait_for_file("/tmp/hakurei-test-done")
print(machine.succeed("cat /tmp/hakurei-test.log"))
-machine.wait_for_file("/tmp/hakurei-test-ok", timeout=2)
+machine.wait_for_file("/tmp/hakurei-test-ok")
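
Every hunk above drops an explicit `timeout=` argument from the test driver's polling helpers, falling back to their default. As a minimal sketch of the semantics (an assumption for illustration, not the driver's actual source; `wait_until_succeeds` here is a stand-in modeled on the NixOS test driver's polling helpers, which default to a much longer timeout than the removed per-call values):

```python
import time

def wait_until_succeeds(check, timeout=900, interval=1.0):
    """Poll `check` until it returns a truthy value or `timeout` elapses.

    Hypothetical stand-in: the real driver runs a shell command on the VM;
    here `check` is just a callable, and the default timeout mirrors the
    driver's long default rather than a tight per-call bound.
    """
    deadline = time.monotonic() + timeout
    while True:
        result = check()
        if result:
            return result
        if time.monotonic() >= deadline:
            raise TimeoutError("condition not met within timeout")
        time.sleep(interval)

# With a condition that holds immediately, both forms behave identically;
# dropping the explicit bound only widens the worst-case wait.
print(wait_until_succeeds(lambda: "ok", timeout=60))
print(wait_until_succeeds(lambda: "ok"))
```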