Python 3 urllib urlretrieve: retry on download failure


I have a program that downloads files from a server, anywhere from 2 MB to 8 MB each. It runs through a loop and grabs a number of files per request. The problem is that the internet connection is unreliable out here in the middle of the freekin' desert. While it works beautifully most of the time, the connection sometimes drops during the urllib.request.urlretrieve call and freezes the program. I need a way for urllib to detect when the network has dropped, and to retry the file until the connection comes back. Any help is appreciated!!!

Here is an example of what I am doing:

try:
    numimgs = len(imgstoget)

    path1 = "level ii"  # highest folder
    self.fn = imgs.split('/')[-1]  # split out the file name
    path2 = self.fn[:4]  # split out kicx
    path3 = self.fn.split('_')[1]  # split out the date
    savepath = os.path.join(path1, path2, path3)  # level ii / radar / date path

    if not os.path.isdir(savepath):  # see if it exists
        os.makedirs(savepath)  # if not, make it

    filesavepath = os.path.join(path1, path2, path3, self.fn)

    if os.path.isfile(filesavepath):  # check to see if the image already exists
        self.time['text'] = self.fn + ' exists \n'
        continue

    # download progress
    def reporthook(blocknum, blocksize, totalsize):
        percent = 0
        readsofar = blocknum * blocksize
        if totalsize > 0:
            percent = readsofar * 1e2 / totalsize
            if percent >= 100:
                percent = 100

            s = "\r%5.1f%% %*d / %d" % (
                percent, len(str(totalsize)), readsofar, totalsize)

            self.time['text'] = 'downloading file: ' + str(curimg) + ' of ' + str(numimgs) + ' ' + self.fn + s

            if readsofar >= totalsize:  # near the end
                self.time['text'] = "saving file..."
        else:  # total size unknown
            self.time['text'] = "read %d\n" % readsofar

        # update the progress bar
        self.pb.config(mode="determinate")
        if percent > 0:
            self.dl_p = round(percent, 0)
            self.pb['value'] = self.dl_p
            self.pb.update()
        if percent > 100:
            self.pb['value'] = 0
            self.pb.update()

    urllib.request.urlretrieve(imgs, filesavepath, reporthook)

except urllib.error.HTTPError as err:  # catch 404 Not Found and continue
    if err.code == 404:
        self.time['text'] = ' not found'
        continue
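One reason the program freezes rather than fails is that urlretrieve has no timeout of its own, so a dead connection can block indefinitely. Setting a module-wide socket timeout makes the call raise an error instead, which a retry loop can then catch. Below is a minimal sketch; the `download` helper and its URL/destination parameters are hypothetical names, not part of the question's code:

```python
import socket
import urllib.error
import urllib.request

# Without a timeout, urlretrieve can block forever on a dropped connection.
# A default socket timeout makes it raise an OSError (e.g. a timeout)
# instead, so the caller can decide to retry.
socket.setdefaulttimeout(30)  # seconds; tune for your connection

def download(url, dest):
    """Hypothetical helper: True on success, False on a retryable network error."""
    try:
        urllib.request.urlretrieve(url, dest)
        return True
    except urllib.error.HTTPError:
        raise          # 404 and friends: retrying will not help
    except (urllib.error.URLError, OSError):
        return False   # timeout / dropped connection: worth retrying
```

HTTPError must be caught before URLError, since it is a subclass; re-raising it keeps permanent failures (like the 404 handled above) out of the retry path.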

Cheers,

David

You can place the code in a try/except block with a counter. Here is what I had done:

import time
from urllib.request import urlretrieve

remaining_download_tries = 15

while remaining_download_tries > 0:
    try:
        urlretrieve(caselawurl, file_path_and_name)
        print("successfully downloaded: " + caselawurl)
        time.sleep(0.1)
    except:
        print("error downloading " + caselawurl + " on trial no: " + str(16 - remaining_download_tries))
        remaining_download_tries = remaining_download_tries - 1
        continue
    else:
        break

I hope the code is self-explanatory. Regards
