Path: sparky!uunet!ferkel.ucsb.edu!taco!rock!concert!uvaarpa!darwin.sura.net!spool.mu.edu!sgiblab!nec-gw!nec-tyo!wnoc-tyo-news!scslwide!wsgw!headgw!cvgw3!tshiono
From: tshiono@cv.sony.co.jp (Toru SHIONO)
Newsgroups: comp.lang.perl
Subject: Out of memory
Message-ID: <TSHIONO.93Jan28205613@cxn21.cv.sony.co.jp>
Date: 28 Jan 93 11:56:13 GMT
Sender: news@cv.sony.co.jp (Usenet News System)
Distribution: comp
Organization: Consumer Video Group, SONY, Osaki New city, Tokyo, JAPAN.
Lines: 34
Nntp-Posting-Host: cxn21

I'm trying to process a very large file (1,000,000 lines) with
a simple Perl script.
The processing is done line by line and it doesn't need to keep
any previous lines, so I presume it should work no matter how big
the file is.
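
(For reference, the line-by-line style I mean is just the usual
read-a-line, print-a-line loop, roughly as below; the actual per-line
work doesn't matter here:)

#!/usr/bin/perl
# Read one line at a time; nothing but the current line is kept around.
while ($line = <>) {
    # ... per-line processing would go here ...
    print $line;
}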

However, Perl complains "Out of memory!"

To isolate the problem, I tested these two scripts:

#!/usr/bin/perl
foreach (1 .. 1000000) {
    $line = <>;
    print $line;
}

dies quickly, whereas

#!/usr/bin/perl
for ($i = 0; $i < 1000000; $i++) {
    $line = <>;
    print $line;
}

works fine!

What makes the difference?
I am afraid the former loop is causing a memory leak or
something like that.
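
If it is the list (1 .. 1000000) itself that eats the memory rather than
the reading, I suppose even a loop that does no I/O at all would blow up
the same way (just a guess; I haven't checked the internals):

#!/usr/bin/perl
# No reading or printing at all; if foreach builds the whole
# million-element list up front, this alone should need as much memory.
foreach (1 .. 1000000) {
    # empty body
}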

Any suggestions?
--
Toru Shiono (tshiono@cv.sony.co.jp) Sony Corporation, JAPAN