Bash Scripting...
11 years ago
Well, something I've been doing lately is recompressing anything I download (like my games) with zopfli to lower its size, primarily PNGs and zip files. The savings per file are usually fairly small, but the total space saved has been at least a CD's worth.
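For anyone curious what a single pass looks like, here's a minimal example (assuming the AdvanceCOMP tools advzip and advpng are installed; the filenames are just placeholders, and -4 is the strongest setting, which uses zopfli in recent versions):
[code]
# recompress a zip in place at the strongest (zopfli) setting
advzip -z -4 somegame.zip

# same idea for a PNG
advpng -z -4 screenshot.png
[/code]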
Anyways, Windows doesn't let you modify how it handles this sort of thing. It would be nice to set a default priority for programs you know will chew through huge amounts of processing and take a while, but everything defaults to 'normal', so running a dozen background jobs is slow as hell.
So I pulled out my book of bash power tools and scripting and managed to build a script that not only divides the job into individual processes, so it takes advantage of multiple cores, but also runs them at a low priority so they won't slow down anything I'm working on. It's curious how a little work and lookup, using a couple lines of code, makes such a difference.
Anyways, here's the script. If you have questions, let me know. BTW, I'm using the AdvanceMAME project's recompression tools (AdvanceCOMP) for this.
[code]
#!/bin/bash
# Recompress every file passed on the command line, one background job per file.
for x
do
    # Check the file size first; past roughly 800MB there's a decent chance
    # advzip will fail due to memory issues.
    # If there's a better option for checking the size, du/sed should be
    # replaced. It's slow but it works.
    if [ "$(du -BM "$x" | sed 's/M.*//')" -lt 800 ]
    then
        # Low priority, and each file gets its own background process
        # so the batch takes advantage of multiple cores.
        nice -n 15 advzip -4 -z "$x" &
        # ls "$x"
    fi
done
[/code]
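Two tweaks I've been meaning to try (an untested sketch, not what I'm actually running yet): swapping the du/sed check for stat, and capping how many jobs run at once so a huge batch doesn't spawn more processes than there are cores. This assumes the GNU versions of stat and nproc are available.
[code]
#!/bin/bash
# Variant sketch: stat-based size check plus a cap on simultaneous jobs.
max_jobs=$(nproc)    # one job per core
for x
do
    # stat -c %s prints the size in bytes (GNU coreutils)
    size=$(stat -c %s "$x")
    if [ "$size" -lt $((800 * 1024 * 1024)) ]
    then
        # Wait until a slot frees up before launching another job.
        while [ "$(jobs -rp | wc -l)" -ge "$max_jobs" ]
        do
            sleep 1
        done
        nice -n 15 advzip -4 -z "$x" &
    fi
done
wait    # let the last batch finish before the script exits
[/code]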

We'll see how it works; I might post the results here soon.