Path: sparky!uunet!usc!zaphod.mps.ohio-state.edu!pacific.mps.ohio-state.edu!linac!att!ucbvax!ucdavis!flauta.engr.ucdavis.edu!cklarson
From: cklarson@flauta.engr.ucdavis.edu (Christopher Klaus Larson)
Newsgroups: comp.sys.mac.programmer
Subject: Re: Recommended way to deal with large memory buffers?
Message-ID: <21770@ucdavis.ucdavis.edu>
Date: 25 Jan 93 18:56:21 GMT
References: <9301221923.aa23160@Paris.ics.uci.edu> <1jqeefINNcio@darkstar.UCSC.EDU>
Sender: usenet@ucdavis.ucdavis.edu
Organization: College of Engineering - University of California - Davis
Lines: 20

In article <1jqeefINNcio@darkstar.UCSC.EDU> speth@cats.ucsc.edu (James Gustave) writes:
>
> I'm working on an encryption application, and I need to go through a file,
>encrypt it, and write it to a new file. The encryption routine takes a Ptr
>to a buffer, and a count of the number of bytes to process. For small files,
>it makes sense to read the whole file into the buffer and do the encryption.
>However, this means that the maximum size of the file to be encrypted is
>determined by the amount of memory the program can allocate. On the other
>hand, if I did the encryption incrementally, writing the parts to a temp file,
>then there might be a lot of unnecessary disk access which would slow down
>the process.

You should figure out how much memory would be required to process the entire
file at once and then check whether a block of temporary memory of that size
is available (TempNewHandle returns NULL if it isn't). If so, go ahead and use
it. If not, you will have to process the file in pieces, using a buffer
allocated in your application heap. See Inside Macintosh: Memory for details.
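In sketch form, the fallback strategy looks something like the following. This
is only an illustration, not the poster's actual code: standard C stdio and
malloc stand in for the Mac File Manager and Memory Manager/temporary-memory
calls, the chunk size is arbitrary, and EncryptBuffer is a hypothetical
stand-in (a trivial XOR) for the real encryption routine.

```c
#include <stdio.h>
#include <stdlib.h>

#define CHUNK_SIZE (32 * 1024L)  /* fallback buffer size; arbitrary */

/* Hypothetical stand-in for the real encryption routine:
   XOR each byte with a fixed key. */
static void EncryptBuffer(char *buf, long count)
{
    long i;
    for (i = 0; i < count; i++)
        buf[i] ^= 0x5A;
}

/* Encrypt src into dst. First try one buffer big enough for the
   whole file; if that allocation fails, fall back to a small
   buffer and process the file in chunks. On a real Mac you would
   probe temporary memory (TempNewHandle) for the big block and
   fall back to the application heap, but malloc stands in here. */
static int EncryptFile(FILE *src, FILE *dst, long fileSize)
{
    long bufSize = fileSize;
    char *buf = malloc((size_t)bufSize);

    if (buf == NULL) {               /* no big block: go chunked */
        bufSize = CHUNK_SIZE;
        buf = malloc((size_t)bufSize);
        if (buf == NULL)
            return -1;               /* not even a chunk fits */
    }

    for (;;) {
        size_t got = fread(buf, 1, (size_t)bufSize, src);
        if (got == 0)
            break;                   /* end of file (or error) */
        EncryptBuffer(buf, (long)got);
        if (fwrite(buf, 1, got, dst) != got) {
            free(buf);
            return -1;
        }
    }

    free(buf);
    return 0;
}
```

Either way the loop body is identical; only the buffer size differs, so the
whole-file case simply makes one pass through the loop.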

--Chris
cklarson@engr.ucdavis.edu