Re: undo extraction
>>>>> Colin Watson writes:
cw> Rob Hudson <rob@eugene.net> wrote:
>> I wrote a little perl script [1] that gets the names of the files from
>> a tarball, then removes all the files and directories found inside the
>> tarball.
>>
>> It comes in real handy when a tar archive dumps into the current
>> directory and makes a big mess. Of course you can untar in a temp dir
>> or use the 't' option to look inside first, but sometimes I put too
>> much trust in where the tarballs are going to dump and get screwed.
>>
>> [1] http://www.cogit8.org/download/tarball-clean.txt
cw> I'd rather you used Perl's built-in unlink() and rmdir()
cw> functions; using system() might end up calling the shell,
cw> since filenames in tarballs can have shell metacharacters in
cw> them. They can also have spaces, which will confuse your
cw> current script. It'll be an awful lot faster if you don't have
cw> to fork either one or two new processes for each entry in the
cw> tarball, too.
I'd rather people didn't use a chainsaw where a butterknife will do
just as well:
tar tzf $1 | xargs rm
will work for those dumbass tarballs that dump their files straight
into the current directory instead of making their own. (Add -r to rm
if the archive contains directories.) xargs will get confused if there
are spaces in the filenames, in which case a perl one-liner might be
called for:
perl -e 'while (<>){chomp; printf "\"%s\"\n", $_;}'
Pipe it through that and all will be well. Other than that, rm -rf
works for me. :) TIMTOWTDI.
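The two pieces above combine like this ("mess.tar.gz" is a stand-in
name, and the first few lines just build a demo archive so the pipeline
has something to undo):

```shell
# Fixture: a tarball whose entry has a space in its name.
mkdir -p stage && echo hi > "stage/spacey name.txt"
tar -C stage -czf mess.tar.gz "spacey name.txt"
rm -rf stage
tar xzf mess.tar.gz      # the accident

# Undo it: list entries, double-quote each so xargs keeps them whole,
# and use rm -rf so directory entries go too.
tar tzf mess.tar.gz \
  | perl -e 'while (<>){chomp; printf "\"%s\"\n", $_;}' \
  | xargs rm -rf
```

Note this still breaks on entry names containing double quotes or
newlines; for those, the pure-Perl unlink()/rmdir() route is the safe
one.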
Cheers,
Chris
--
Got jag? http://www.tribsoft.com