More on Disk Images -> Disk

Warren Toomey wkt at henry.cs.adfa.oz.au
Wed Mar 25 10:32:56 AEST 1998


In article by Steven M. Schultz:
> Warren -
> 
> > From: Warren Toomey <wkt at henry.cs.adfa.oz.au>
> 
> > Now, what I've currently got will cope with -b12 compressed files. Can
> > someone tell me if it would be feasible to fit a gunzip into 64K?? Even
> > if it could only cope with gzip -1 files.
> 
> 	If my understanding of 'gzip' is right then the algorithm works on
> 	32KB blocks of data and the '-N' level has little to do with the
> 	memory consumption.  Rather, as the -1, ... -9 level increases the
> 	amount of work that gzip puts into the compression increases (the
> 	difference between -6 and -9 is only a few percent in final output
> 	size but the length of time taken is quite a bit higher).
> 
> 	Of concern would be getting the gzip sources to compile with a non-ANSI
> 	compiler on a non-32-bit machine (sizeof(long) == sizeof(int) is an
> 	endemic assumption, I wager).  Well, ok - there is the worry that
> 	you will grow old waiting for it to compress something ;-)  Gzip is a
> 	lot more CPU-intensive than compress.
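
Regarding that sizeof(long) == sizeof(int) worry: it bites hard on the
PDP-11, where int is 16 bits and long is 32. A tiny sketch of the kind of
code that breaks (not taken from the gzip sources, just an illustration):

	#include <stdio.h>

	int main(void)
	{
		long len = 100000L;	/* e.g. an uncompressed file length */
		int  n   = (int)len;	/* quietly truncated where int is 16 bits */

		/* Both values print as 100000 on a 32-bit machine; on the
		   PDP-11 the int copy no longer holds the real length. */
		printf("sizeof(int)=%d sizeof(long)=%d\n",
		       (int)sizeof(int), (int)sizeof(long));
		printf("long=%ld int=%d\n", len, n);
		return 0;
	}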

I'm only thinking of implementing gunzip on the PDP-11. I've got
uncompress -b12 running standalone right now, but gunzip would be a big
win: you'd run gzip -9 on a 32-bit system (for higher compression) and
gunzip on the PDP-11.

I just don't know if gunzip would fit. Isn't there a gunzip for MS-DOS?
Surely we could leverage something from it?
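
For what it's worth, decompression ought to be much less memory-hungry than
compression: as far as I can tell, inflate mainly needs the 32KB sliding
window plus the Huffman decode tables. A back-of-the-envelope sketch of the
data space (the buffer sizes other than the window are my guesses, not
numbers from the gzip sources):

	#include <stdio.h>

	#define WSIZE	32768	/* DEFLATE sliding window (history buffer) */
	#define INBUF	 4096	/* input buffer; size is arbitrary */
	#define TABLES	 8192	/* rough allowance for Huffman decode tables */

	static unsigned char window[WSIZE];
	static unsigned char inbuf[INBUF];

	int main(void)
	{
		long total = (long)sizeof(window) + (long)sizeof(inbuf) + TABLES;

		printf("approx. data space for inflate: %ld bytes\n", total);
		return 0;
	}

That works out to roughly 44KB of data, which is tight but not obviously
hopeless in 64K, particularly on a split I/D machine where the code gets
its own 64K.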

	Warren
