Recursive Directory Walker for really huge numbers of files
Sivakatirswami
katir at hindu.org
Thu May 9 01:45:00 EDT 2002
The scripts given as examples so far, which take a directory and recursively
dig through all its subdirectories (to return a list of all files, with a full
path for each one), do work. But as soon as the number of subdirectories
rises and the file counts go up, things slow to an unworkable crawl.
The context: a web site mirror on a local server on the LAN, with nearly
6,000 folders and files. I need to get a path listing for all those files
and then filter it with "*.html". Pretty simple, really...
(BBEdit has batch functions, but there are serious bugs in its search and
replace if you use grep for the find. If you try to replace really big
chunks, crazy things happen... but the same operation works just fine in
MC.)
I am using Ken Ray's script, and I changed the repeat to use the "for each
line" form to speed it up, but it still takes ages to get a path list. Does
anyone know a way to make this really fast?
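For reference, the change I mean was roughly this (my paraphrase, not Ken's
original lines):

  -- old, indexed form: "line i of temp" rescans temp from the top each pass
  repeat with i = 1 to the number of lines of temp
    put line i of temp & return after tOutput
  end repeat

  -- new, streaming form: each line is handed over in sequence
  repeat for each line x in temp
    put x & return after tOutput
  end repeat

The full script as I am running it now: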
global gHierList, gMainFolder, gBaseLevels

on mouseUp
  put "" into gHierList
  answer folder "Pick a folder you want to walk:"
  if it = "" then exit mouseUp
  set the itemDel to "/"
  put it into gMainFolder
  put the number of items of gMainFolder into gBaseLevels -- set but not used below
  directoryWalk gMainFolder
  put gHierList into field "result"
end mouseUp
on directoryWalk whatFolder
  set the itemDel to "/"
  set the directory to whatFolder
  put the files into temp
  filter temp with "*.html" -- needs the wildcard; plain ".html" matches nothing
  sort temp
  repeat for each line x in temp
    -- this re-counts the lines of gHierList on every pass
    put whatFolder & "/" & x into line (the number of lines of gHierList) + 1 of gHierList
  end repeat
  put the folders into tDirList
  sort tDirList
  delete line 1 of tDirList -- ".." sorts to the top; drop it
  repeat for each line x in tDirList
    directoryWalk (whatFolder & "/" & x)
  end repeat
end directoryWalk
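One thing I have not tried yet, which might be the real fix: appending with
"after" instead of "into line N+1 of". The "into line" form has to re-count
and re-scan the whole growing list for every file, so the walk goes quadratic
as gHierList grows; "after" just appends. A minimal sketch (the handler name
fastDirectoryWalk and the explicit ".." test are mine, not Ken's):

global gHierList

on fastDirectoryWalk whatFolder
  set the directory to whatFolder
  put the files into tFiles
  filter tFiles with "*.html"
  repeat for each line tFile in tFiles
    -- "after" appends without re-scanning the list
    put whatFolder & "/" & tFile & return after gHierList
  end repeat
  put the folders into tDirList
  repeat for each line tDir in tDirList
    if tDir is ".." then next repeat -- skip the parent-directory entry
    fastDirectoryWalk whatFolder & "/" & tDir
  end repeat
end fastDirectoryWalk

Sorting could then happen once at the end with "sort lines of gHierList"
instead of once per folder, and the trailing return trimmed after the walk
finishes.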
Hinduism Today
Sivakatirswami
Editor's Assistant/Production Manager
katir at hindu.org
www.hinduismtoday.com
www.himalayanacademy.com