Newsgroups: comp.unix.aix
Path: sparky!uunet!charon.amdahl.com!pacbell.com!sgiblab!zaphod.mps.ohio-state.edu!cs.utexas.edu!geraldo.cc.utexas.edu!portal.austin.ibm.com!awdprime.austin.ibm.com!chukran.austin.ibm.com!rudy
From: rudy@chukran.austin.ibm.com ()
Subject: Re: backup and the -p option
Sender: news@austin.ibm.com (News id)
Message-ID: <C1H3JB.2n7v@austin.ibm.com>
Date: Tue, 26 Jan 1993 18:04:23 GMT
Reply-To: chukran@austin.vnet.ibm.com (Rudy Chukran)
References: <1jp73lINN1un@meaddata.meaddata.com>
Organization: IBM Advanced Workstation Division
Lines: 29

In article <1jp73lINN1un@meaddata.meaddata.com>, marko@meaddata.com (Mark Osbourne) writes:
|> The file is compressible; compress squeezes the file from 70 Meg down
|> to about 7 Meg.
|>
|> How does backup decide whether to compress the file or not? Is it
|> based on available space on /tmp? Is it based on available memory or
|> swap space at the time?
backup -p uses Huffman encoding, which is not as efficient as the LZW
compression that compress uses.  But it is not this inefficient!
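
If you want a feel for the Huffman-vs-LZW gap on a particular file without
running a whole backup, one rough, untested sketch is to run pack(1) (Huffman)
and compress(1) (LZW) on copies of it and compare the results.  The file name
below is made up, and this only approximates what backup -p does internally:

    # "bigfile" is a hypothetical name; pack replaces its copy with
    # /tmp/try1.z and compress replaces its copy with /tmp/try2.Z.
    cp bigfile /tmp/try1 ; cp bigfile /tmp/try2
    pack /tmp/try1
    compress /tmp/try2
    ls -l /tmp/try1.z /tmp/try2.Z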

Some other quirks:
The algorithm seems to give up if it finds character run lengths > 2**24 = 16M.
If compress is getting a 10:1 compression factor, I wouldn't be surprised if
this file has exceeded that limit.
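
One quick, untested way to see whether a file even has runs that long is to
dump its bytes and track the longest run of a single value.  This assumes a
POSIX-ish od and awk, and again a made-up file name:

    # od -v prints every byte (no "*" duplicate suppression), -t u1 gives
    # unsigned decimal bytes, -An drops the offset column.
    od -An -v -t u1 bigfile | awk '
        { for (i = 1; i <= NF; i++) {
              if ($i == prev) run++; else run = 1
              prev = $i
              if (run > max) max = run
          } }
        END { printf "longest run of one byte value: %d\n", max }'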

Also (this looks like a bug), if the file begins with the two bytes 0x1f 0x1e,
compression is skipped because the algorithm decides the file "is already
compressed", which is ludicrous.  (0x1f 0x1e is the magic number pack(1) puts
at the front of its output, which is presumably why the check is there, but it
also misfires on any file that merely happens to start with those bytes.)
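
If you suspect a file is being skipped for this reason, just look at its first
two bytes.  A one-liner with dd and od (file name made up):

    # "1f 1e" in the output means the file trips the "already compressed" check.
    dd if=bigfile bs=2 count=1 2>/dev/null | od -An -t x1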

backup -p is very convenient, but non-portable.  I use it a lot myself.
But tar | compress is usually more space-efficient and portable.
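
For what it's worth, the two patterns look something like this (archive names
made up; check your local man pages for the exact backup/restore flags on your
level of AIX):

    # Convenient but AIX-only: packed backup-by-name archive.
    find . -print | backup -i -p -f /tmp/stuff.backup
    restore -x -f /tmp/stuff.backup        # read it back with AIX restore

    # Portable: any Unix with compress/zcat and tar can read this.
    tar -cf - . | compress > /tmp/stuff.tar.Z
    zcat /tmp/stuff.tar.Z | tar -xf -      # on whatever system you like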
--
*********************************************************************
Rudy Chukran                | EMAIL:
IBM AIX Technical Consulting| RSCS: CHUKRAN at AUSTIN
11400 Burnet Rd.            | AWDnet:rudy@chukran.austin.ibm.com
Internal ZIP 2830           | internet: chukran@austin.vnet.ibm.com
Austin, Texas 78758         |
*********************************************************************