More on Disk Images

Steven M. Schultz sms at moe.2bsd.com
Wed Mar 25 10:24:33 AEST 1998


Warren -

From: Warren Toomey <wkt at henry.cs.adfa.oz.au>

> Now, what I've currently got will cope with -b12 compressed files. Can
> someone tell me if it would be feasible to fit a gunzip into 64K?? Even
> if it could only cope with gzip -1 files.

	If my understanding of 'gzip' is right, the algorithm works over
	a 32 KB sliding window of data, and the '-N' level has little to
	do with memory consumption.  Rather, as the level increases from
	-1 to -9, the amount of work gzip puts into the compression
	increases (the difference between -6 and -9 is only a few percent
	in final output size, but the time taken is considerably higher).
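	As a minimal sketch (mine, not from the original mail): zlib
	implements the same DEFLATE algorithm as gzip, and its
	deflateInit2() call makes the split explicit - windowBits fixes
	the 32 KB sliding window (and hence the memory footprint), while
	the level only sets how hard that window is searched.  Assumes
	zlib is installed; link with -lz.

	    #include <stdio.h>
	    #include <string.h>
	    #include <zlib.h>

	    int main(void)
	    {
	        static const char msg[] = "hello, hello, hello, hello";
	        unsigned char out[128];
	        z_stream s;

	        memset(&s, 0, sizeof s);
	        /* Level 1 ("gzip -1"): least effort.  windowBits 15 = the
	         * full 32 KB sliding window; memLevel 8 is zlib's default
	         * working memory.  Per zlib's documentation, deflate needs
	         * roughly (1 << (windowBits+2)) + (1 << (memLevel+9)) bytes
	         * and inflate only (1 << windowBits) - raising the level
	         * to 9 changes neither figure, only the CPU time spent. */
	        if (deflateInit2(&s, 1, Z_DEFLATED, 15, 8,
	                         Z_DEFAULT_STRATEGY) != Z_OK)
	            return 1;

	        s.next_in   = (unsigned char *)msg;
	        s.avail_in  = sizeof msg;
	        s.next_out  = out;
	        s.avail_out = sizeof out;
	        if (deflate(&s, Z_FINISH) != Z_STREAM_END) {
	            deflateEnd(&s);
	            return 1;
	        }
	        printf("%u bytes in, %lu bytes out\n",
	               (unsigned)sizeof msg, s.total_out);
	        return deflateEnd(&s) == Z_OK ? 0 : 1;
	    }

	For the 64K question this cuts the right way: the decompression
	side needs only the 32 KB window plus a few KB of state, which is
	why a gunzip-only program in that space is plausible.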

	Of concern would be getting the gzip sources to compile with a
	non-ANSI compiler on a non-32-bit machine (sizeof(long) ==
	sizeof(int) is an endemic assumption, I wager).  Well, ok - there
	is also the worry that you will grow old waiting for it to
	compress anything ;-)  Gzip is a lot more CPU-intensive than
	compress.
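	To make that endemic assumption concrete, here is a small
	illustration (again mine, not from the mail) of what happens when
	32-bit-minded code meets a 16-bit int, as on the PDP-11:

	    #include <stdio.h>

	    int main(void)
	    {
	        long crc = 0x12345678L; /* a 32-bit quantity, e.g. a CRC */
	        int  bad = (int)crc;    /* the habit: assuming an int can
	                                 * hold anything a long can */

	        printf("long: %lx  int: %x\n",
	               (unsigned long)crc, (unsigned)bad);
	        /* On a 16-bit-int machine the second value is silently
	         * truncated to 0x5678; on a 32-bit machine, where int and
	         * long are the same size, the bug never shows itself. */
	        return 0;
	    }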

	Steven

