Your project sounds distinctly commercial. That's 200 TB of input images, without even allowing any storage space for output images. It's also 173 solid days of 24 hr/day processing, assuming you can do 1 image per second - which I doubt.

You may want to speak to Fred Weinhaus about his Retinex script (search for "hazy" on that page), which does a rather wonderful job of haze removal.

© Fred Weinhaus - Fred's ImageMagick scripts

If/when you get a script that does what you want, I would suggest using GNU Parallel to get decent performance. I would also think you may want to consider porting, or having ported, Fred's algorithm to C++ or Python to run with OpenCV rather than ImageMagick.

So, say you have a 24-core MacPro and a bash script called `ProcessOne` that takes the name of a Sony ARW image as its parameter, you could run:

```bash
find . -iname \*.arw -print0 | parallel --progress -0 ProcessOne
```

and that will recurse through the current directory, find all the Sony ARW files and pass them to GNU Parallel, which will keep all 24 cores busy until the whole lot are done. You can specify fewer, or more, jobs in parallel with, say, `parallel -j 8`.

Note 1: You could also list the names of additional servers in your network and it will spread the load across them too. GNU Parallel is capable of transferring the images to remote servers along with the jobs, but I'd have to question whether that makes sense for this task - you would probably want to put a subset of the images on each server with its own local disk I/O and run the servers independently yourself, rather than distributing everything from a single point.

Note 2: You will want your disks well configured to handle multiple, parallel I/O streams.

Note 3: If you do write a script to process an image, write it so that it accepts multiple filenames as parameters; then you can run `parallel -X` and it will pass as many filenames per invocation as your sysctl parameter `kern.argmax` allows. That way you won't need a whole bash or OpenCV C/C++ process per image.
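The headline figures are easy to sanity-check; the per-frame size below is derived from the 200 TB and 173-day numbers quoted here, not a figure from the original question:

```bash
# One image per second for 173 days of round-the-clock processing:
images=$((173 * 24 * 3600))
echo "$images"    # -> 14947200 frames

# 200 TB of input spread across that many frames:
per_image=$((200 * 1000 * 1000 * 1000 * 1000 / images))
echo "$per_image" # -> 13380432 bytes, i.e. roughly 13.4 MB per ARW frame
```

Around 13 MB per frame is in the right ballpark for a compressed Sony RAW file, so the two figures are at least mutually consistent.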
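For concreteness, `ProcessOne` might look something like the sketch below. The body is an assumption, not part of the original answer: it stands in ImageMagick's `magick` command (which hands RAW decoding to a dcraw/libraw delegate) for whatever haze-removal processing you actually settle on.

```bash
#!/bin/bash
# Hypothetical ProcessOne: takes one Sony ARW filename and writes a JPEG
# next to it. Replace the `magick` call with your real processing step,
# e.g. an invocation of Fred's retinex script.
in="$1"
out="${in%.*}.jpg"   # DSC00001.arw -> DSC00001.jpg
magick "$in" "$out"
```

Keeping the script to "one file in, one file out" is what lets GNU Parallel schedule it freely across all the cores.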
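To make the multiple-filenames idea from Note 3 concrete, a hypothetical batch variant (script name and `magick` call are illustrative, not from the original) just loops over all of its arguments:

```bash
#!/bin/bash
# Hypothetical ProcessMany: same job as a per-file script, but handles
# every filename it is given, so one process serves a whole batch.
for f in "$@"; do
    magick "$f" "${f%.*}.jpg"
done
```

Invoked as `find . -iname \*.arw -print0 | parallel -X -0 ProcessMany`, GNU Parallel's `-X` packs as many filenames as the argument-length limit allows into each run, amortising the process start-up cost over the batch.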