Hello!

zimoun skribis:

> On Fri, 30 Aug 2019 at 01:18, Ludovic Courtès wrote:
>
>> Currently, only 25% of our packages are not fetched with ‘url-fetch’.
>> For the remaining 75%, this checker can only report whether the
>> tarball is missing (and apart from ftp.gnu.org and a few other
>> exceptions, it usually _is_ missing) and cannot actually save it.
>
> Maybe I am missing something, but for example guile-2.0 is not yet
> archived.  I am not able to find it with their search resources.  And
> `guix lint -c archival guile@2.0' reports "guile@2.0.14: source not
> archived on Software Heritage".

Yeah, most not-too-recent tarballs from ftp.gnu.org are archived, so I
don’t know why this one is missing.  We’d have to check with them.

> I agree with the words on #swh-devel by olasd (Nicolas Dandrimont)
> from SWH that the automatic "save" should be optional (even if the
> default is save=true).

Maybe we could have a flag somewhere to turn it off?  The good thing
about having it on (or opt-out) is that we increase the chances that
the code we care about is archived.  :-)

>> The second step will be to write a “lister” for Software Heritage
>> that grabs the list of source code URLs from our ‘packages.json’.
>> That code would run at SWH and it could potentially grab the
>> tarballs, not just the VCS checkouts.  Here are examples:
>>
>>   https://forge.softwareheritage.org/source/swh-lister/browse/master/swh/lister/packagist/lister.py
>>   https://forge.softwareheritage.org/source/swh-lister/browse/master/swh/lister/gnu/lister.py
>>
>> It should be quite easy for a Pythonista to write something similar
>> for our ‘packages.json’.  Any takers?  :-)
>
> I am not sure I understand it all, but I will take a look...  I am
> reading their GSoC about this topic [2].

Awesome, thank you!  Having a “guix” lister in place would be perfect.

Ludo’.
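
For illustration, here is a rough, standalone sketch of the core loop
such a “guix” lister might have.  It is only a sketch: the
packages.json URL and the field names (“source”, “type”, “git_url”,
“urls”) are assumptions about the schema, and a real lister would plug
into the swh-lister base classes (as the packagist and gnu listers
above do) instead of printing origins to stdout.

    import json
    from urllib.request import urlopen

    # Assumed location of the package metadata; the actual URL and
    # JSON layout may differ (this is an illustrative sketch).
    PACKAGES_JSON = "https://guix.gnu.org/packages.json"

    def guix_origins():
        """Yield one (visit type, URL) pair per package source."""
        with urlopen(PACKAGES_JSON) as response:
            packages = json.load(response)
        for package in packages:
            source = package.get("source")
            if not source:
                continue  # no source metadata for this package
            if source.get("type") == "git":
                # VCS origin: hand SWH the repository URL.
                yield "git", source["git_url"]  # field name assumed
            else:
                # url-fetch origin: plain tarball URLs (including
                # mirrors), which is how the tarballs themselves,
                # not just VCS checkouts, would get archived.
                for url in source.get("urls", []):
                    yield "tar", url

    if __name__ == "__main__":
        for visit_type, url in guix_origins():
            print(visit_type, url)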