http://www.speedbicycles.ch/showBike.php?enr=442
and I wanted to download the images, but there are about 72 of them and I didn't want to right-click each one and save it. So I thought I could use wget, but I didn't want to download the whole site. I just want the pictures of this sweet looking bike! So I took a look at the URL for the images and sure enough they all use a simple naming scheme (e.g. speed_001.jpg). What immediately came to mind was a bash script with a for loop calling wget to download all the images for me (because, you know, as programmers we are lazy).
#!/bin/bash
NUM=72
for ((i=1; i<=NUM; i++))
do
    if [ $i -lt 10 ]
    then
        ADD="http://www.speedbicycles.ch/bikes/442/bigPic/speed_00${i}.jpg"
        wget ${ADD}
    else
        ADD="http://www.speedbicycles.ch/bikes/442/bigPic/speed_0${i}.jpg"
        wget ${ADD}
    fi
done
I'm not saying that this is the only way to do this, it's just what came to mind when faced with the problem.
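For example, one alternative that avoids the manual zero-padding branch entirely is to let GNU seq format the numbers with %03g and hand the whole list to a single wget invocation via -i. This is just a sketch of the same idea (the base URL is the one from the post; the actual wget call is left commented so you can inspect urls.txt first):

```shell
# Build the list of 72 zero-padded image URLs with seq -f (GNU seq).
# %03g pads to three digits: speed_001.jpg ... speed_072.jpg
base="http://www.speedbicycles.ch/bikes/442/bigPic"
seq -f "${base}/speed_%03g.jpg" 1 72 > urls.txt

# Then a single wget process fetches everything:
#   wget -i urls.txt
echo "Generated $(wc -l < urls.txt) URLs"
```

curl users get this for free with its URL globbing, e.g. curl -O "${base}/speed_[001-072].jpg", which also zero-pads the range.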