
Re: lftp problems ...



Kenneth Dombrowski wrote:

On 03-05-07 08:25 +0100, Dave Selby wrote:
Kenneth Dombrowski wrote:
On 03-05-06 20:25 +0100, David Selby wrote:

PS: to upload a complete site from local to the server, would I be right to
use 'mput *'?
I've been using the 'mirror' command for this, which has a bunch more
options for syncing files, permissions, link handling, etc.

Just a guess about the 'spare' directories: have you tried 'rm -rf'?

Yep, I've tried 'rm -rf file', 'mrm -rf *', etc., all to no avail. My guess
is that it's an NTL server problem; does this sound reasonable?

Well, 'possible' maybe. It's hard to imagine they're doing anything to
prevent you from deleting directories called 'spare' or 'spare2'. Did
you create those directories?
A glance at 'man lftp' does reveal this:

 ls params

 List remote files. You can redirect the output of this command to a file
 or via a pipe to an external command. By default, ls output is cached;
 to see a new listing use rels or cache flush.

so maybe try 'rels' &/or 'cache flush'

But 'rm -rf' should've taken care of that too. Unless, I guess, there's
something that appeared in there since your last cached listing which you
don't own.
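
Concretely, the cache-flush idea would look something like this inside an
lftp session (the hostname and directory names here are just placeholders,
obviously, substitute your own):

```
lftp user@ftp.example.com:/www> cache flush
lftp user@ftp.example.com:/www> rels              # re-list, bypassing the cache
lftp user@ftp.example.com:/www> rm -rf spare spare2
lftp user@ftp.example.com:/www> rels              # check they're really gone
```

If 'spare' still shows up after that, it's probably not a stale cache.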

But really, this is a wild guess too; I've never seen this with lftp. I
mostly wanted to point out the 'mirror' command in response to your
second question. While I have the man page open, here is the summary of its options:

 mirror [OPTS] [source [target]]

 Mirror specified source directory to local target directory. If target
 directory ends with a slash, the source base name is appended to target
 directory name. Source and/or target can be URLs pointing to directories.

      -c, --continue           continue a mirror job if possible
      -e, --delete             delete files not present at remote site
      -s, --allow-suid         set suid/sgid bits according to remote site
          --allow-chown        try to set owner and group on files
      -n, --only-newer         download only newer files (-c won't work)
      -r, --no-recursion       don't go to subdirectories
      -p, --no-perms           don't set file permissions
          --no-umask           don't apply umask to file modes
      -R, --reverse            reverse mirror (put files)
      -L, --dereference        download symbolic links as files
      -N, --newer-than FILE    download only files newer than the file
      -P, --parallel[=N]       download N files in parallel
      -i RX, --include RX      include matching files
      -x RX, --exclude RX      exclude matching files
      -I GP, --include-glob GP include matching files
      -X GP, --exclude-glob GP exclude matching files
      -v, --verbose[=level]    verbose operation
          --use-cache          use cached directory listings
          --Remove-source-files remove files after transfer (use with caution)
      -a                       same as --allow-chown --allow-suid --no-umask

 When using -R, the first directory is local and the second is remote.
 If the second directory is omitted, the base name of the first directory
 is used. If both directories are omitted, the current local and remote
 directories are used.

 RX is an extended regular expression, just like in egrep(1). GP is a
 glob pattern, e.g. `*.zip'.

 ... snip ...

As you can see, it offers a lot more control than 'mput -d'.
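
For instance (with a made-up hostname and directory names), uploading a
local site tree and pruning anything you've deleted locally could be as
simple as:

```
$ lftp -u username ftp.example.com
lftp username@ftp.example.com:~> mirror -R -e -v ./mysite public_html
lftp username@ftp.example.com:~> quit
```

Here -R puts files instead of getting them, -e deletes remote files that no
longer exist locally, and -v shows what it's doing. I'd try it with -v (and
without -e) first to see what it would transfer.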



Many thanks for mirror. I've tried it out with 'mirror -R' and it works a lot better than mput; it works recursively now. Tried rels & cache flush ... to no avail ... I will be using a "bought" server for hosting soon, and will see if the problems persist.

Many thanks for your help, mirror is a real find!

Dave



