On Mon, Dec 10, 2018 at 23:15 Nico Kadel-Garcia <nkadel_at_gmail.com> wrote:
> On Mon, Dec 10, 2018 at 9:10 PM Tom Browder <tom.browder_at_gmail.com> wrote:
> > On Mon, Dec 10, 2018 at 19:45 Nico Kadel-Garcia <nkadel_at_gmail.com> wrote:
> >> On Mon, Dec 10, 2018 at 5:56 AM Tom Browder <tom.browder_at_gmail.com> wrote:
> >> >
> >> > On Mon, Dec 10, 2018 at 12:10 AM Nico Kadel-Garcia <nkadel_at_gmail.com> wrote:
> >> > > On Sun, Dec 9, 2018 at 6:31 PM Tom Browder <tom.browder_at_gmail.com> wrote:
> >> > ...
> >> > > > Given that history will be lost, does anyone see any problems with my recovery plan?
> >> > ...
> >> > > If you have working copies and you don't care about history, why are
> >> > > you spending any cycles on doing anything with hotcopy? You've lost
> >> > > history anyway, why keep any of it?
> >> >
> >> > Cycles aren't important, but the size of the data is. Transferring the
> >> > working copy from scratch would take a LONG time, while the bulk of
> >> > the data are already there in the hotcopy.
> >> Under what possible conditions would importing a single snapshot of
> >> the current working copy, without history, take more time than working
> >> from a hotcopy to overlay the changes on top of that hotcopy?
> > I don’t know, Nico, I am a real novice at this. Your first answer didn’t help because I didn’t know the ramifications of what I was trying to do.
> > The original data, from just six months ago, was about 27 GB, which took a very long time to upload from my home computer to my remote server. Since the only hotcopy, made shortly after the repo was loaded, there has been very little change. So if I could start with the hotcopy and somehow sync my working copy without pushing 27 GB again, life would be better.
> ??? An import of a copy of the working data has no history. Is the
> *data* itself, with no .svn content, 27 GB? What in the devil are you
> putting in source control?
> I'm not objecting to your situation, just really confused by the
> content you are dealing with.
Sorry, Nico, I probably didn’t use the correct terms in my problem
description. Basically the subversion repos on my remote server were
current as of about six months ago when they were established there
and a hotcopy was made.
There have been few updates since, so is the hotcopy of value or not?
Anyway, my thought was to save some upload and download time if possible.
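Something like this is what I had in mind (a rough sketch only; all
paths and the URL below are made up):

```shell
# Check the restored hotcopy is internally consistent before using it:
svnadmin verify /srv/svn/myrepo

# Re-point the existing working copy at the restored repository, if the
# URL changed (the single-argument form needs svn 1.7 or later):
svn relocate svn+ssh://myserver.example.com/srv/svn/myrepo /path/to/wc

# Then commit the few changes accumulated since the hotcopy was taken:
svn status /path/to/wc
svn commit -m "changes since the hotcopy" /path/to/wc
```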
UPDATE: Problems in svn-repo land
I copied the two hotcopy backups to the original repo locations, and
my clients can see them (I'm using the SmartSVN GUI client on
Windows, and the command line on Linux).
I started an update on one of them and am getting this message (which
you warned me about):
Clean Up: Failed to run the WC DB work queue associated with
'C:\Users\Tom\Documents\0-mydocs-svn', work item 636 (file-install
Personal/TomB/sto/Misc/llftpar2.exe 1 0 1 1) Can't open file
svn-base': The system cannot find the file specified.
FWIW, the repos are on my remote Linux server (running Debian 9),
which I have full control over.
(Note the working copy is on Windows, and I do not have a wc of it on
my local Linux host.)
Is there anything I can do to fix something like that? Or do I have
to go through creating new repos and populating them from the original
repo files and dirs?
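If it comes to that, I assume the create-from-scratch route looks
roughly like this (names and paths invented for illustration):

```shell
# New, empty repository on the server:
svnadmin create /srv/svn/myrepo-new

# One-shot import of the current files; all history is lost:
svn import /path/to/current/files file:///srv/svn/myrepo-new/trunk \
    -m "initial import; history lost"

# Fresh working copy from the new repository:
svn checkout file:///srv/svn/myrepo-new/trunk /path/to/new-wc
```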
Thanks so much.
Received on 2018-12-13 14:19:04 CET