
Re: Wrong backports listing on website

From: Daniel Shahaf <d.s_at_daniel.shahaf.name>
Date: Wed, 31 Jul 2019 16:49:34 +0000

Julian Foad wrote on Wed, Jul 31, 2019 at 10:53:17 +0100:
> Daniel Shahaf wrote:
> > - In the Puppet manifest, [1], for the actual commands to use.
>
> I sent a merge request to change it from 'cd ~/src/svn/1.11.x' to 'cd
> ~/src/svn/latest' and made a symlink 'latest -> 1.12.x' locally on svn-qavm3
> (and will do the same on the new svn-qavm once I can access it).
>
> -- So now that symlink is what needs to be updated for each minor release.
>
> > The immediate fix would be to patch both of these to say "12" rather
> > than "11". Then, we should either add this step to the new-minor-branch
> > checklist, or [...]
>
> http://svn.apache.org/r1864042 : Document a step for updating the 'upcoming
> changes' branch for a new minor release.

How do you feel about automating that? We could make the script figure
out the latest stable version easily enough:

[[[
#!/usr/bin/env python3

import re
import subprocess

DIST_RELEASE_URL = 'https://dist.apache.org/repos/dist/release/subversion'
SVN = 'svn'

# Get all current release filenames from the dist area
filenames = subprocess.check_output([SVN, 'list', DIST_RELEASE_URL]).decode().splitlines()

# Match the GA release tarball signatures only, excluding prereleases
matches = map(re.compile(r'^subversion-(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)\.tar\.gz\.asc$').match, filenames)

f = lambda matchobj, groupname: int(matchobj.group(groupname))
newest_stable_version = max([
    (f(m, 'major'), f(m, 'minor'), f(m, 'patch'))
    for m in matches if m
])
print(newest_stable_version)
]]]

(This prints «(1, 12, 2)» currently.)

Then we wouldn't need the cron job to 'cd', and wouldn't need the
'latest' symlink at all.
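
For completeness, a minimal sketch of deriving the working-copy path straight from that version tuple (the 'branch_workdir' helper name and the ~/src/svn/<major>.<minor>.x layout are just assumptions based on the manifest quoted above):

[[[
import os.path

# Hypothetical helper: map the version tuple computed above onto the
# assumed ~/src/svn/<major>.<minor>.x working-copy layout.
def branch_workdir(version, base=os.path.expanduser('~/src/svn')):
    major, minor, _patch = version
    return os.path.join(base, '%d.%d.x' % (major, minor))

print(branch_workdir((1, 12, 2)))  # prints .../src/svn/1.12.x
]]]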

Thanks for fixing this, Julian.

Daniel
Received on 2019-07-31 18:49:41 CEST
