metacard digest, Vol 1 #728 - 3 msgs
Craig Spooner
cspooner at lamar.colostate.edu
Thu Jul 31 18:04:26 EDT 2003
Greg,
Couldn't this slowdown be due to a repeat loop you're using, rather than
to the array itself? Just a thought...
Craig
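
A minimal sketch of the kind of loop Craig may have in mind, assuming the
record IDs are collected from a large text variable; the names tRecords
and tIDList are illustrative only, not from the thread:

  -- counted loop: "line i of" rescans tRecords from the top on every
  -- pass, so it gets progressively slower as the data grows
  repeat with i = 1 to the number of lines of tRecords
    put line i of tRecords & return after tIDList
  end repeat

  -- "repeat for each" walks the data in a single pass and stays fast
  repeat for each line tRecord in tRecords
    put tRecord & return after tIDList
  end repeat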
>Message: 1
>Date: Wed, 30 Jul 2003 13:09:58 -0400
>From: Gregory Lypny <gregory.lypny at videotron.ca>
>Subject: Re: Limits on array dimensions
>To: metacard at lists.runrev.com
>Reply-To: metacard at lists.runrev.com
>
>Thanks. I'm not a programmer, so please pardon my incorrect
>terminology. I find arrays invaluable, especially for building
>utilities that index large (300 MB and up) flat-file databases, which
>is what I'm doing. However, once the number of array elements exceeds
>about 150,000 (for example, when keeping track of record IDs while
>indexing), processing gets progressively slower and eventually stalls,
>even if the contents of the elements are small. This does not appear
>to be a memory constraint. My fix is to dump the critical array
>variables to a file from time to time, delete each variable to free
>space, and recreate the array with new observations. The difference in
>speed is remarkable.
>
> Greg
>
> Gregory Lypny
> Associate Professor
> Concordia University
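
A rough MetaTalk-style sketch of the flush-and-recreate workaround Greg
describes (syntax follows later Revolution/LiveCode and is untested against
MetaCard 2.x); the handler name flushIndexToFile and the parameter names
pIndexArray and pFilePath are illustrative only:

  on flushIndexToFile @pIndexArray, pFilePath
    -- append each key/value pair to the index file as a tab-delimited line
    open file pFilePath for append
    repeat for each line tKey in the keys of pIndexArray
      write tKey & tab & pIndexArray[tKey] & return to file pFilePath
    end repeat
    close file pFilePath
    -- empty the array so it starts small again; Greg deletes the
    -- variable instead, which frees the memory the same way
    put empty into pIndexArray
  end flushIndexToFile

Opening the file for append means each flush writes only the new entries
rather than re-reading the growing index file.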