
Posted

Hi,

I don't know why, but lately I've come across some caching problems when using InetGet and InetRead to download a text file from the Internet. Although I set the option flag to 1 (1 = Forces a reload from the remote site), the program still used the cached file, so it could not get the latest version. I could get the latest file using Firefox, though.
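Here is a minimal sketch of the kind of call I mean (the URL is just an example):

; Option 1 should force a reload from the remote site,
; yet a cached copy is still returned.
Local $sUrl = "http://www.example.com/version.txt"
Local $dData = InetRead($sUrl, 1) ; 1 = Forces a reload from the remote site
ConsoleWrite(BinaryToString($dData) & @CRLF)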

To tell the truth, I hadn't had this problem before, so I thought it was a new bug in the beta version. I tried downloading and installing the stable version again and ran the code again, but no luck.

So I thought something must be wrong with my laptop's settings, but I haven't changed any configuration for a long, long time. ;)

Has anyone ever had the same problem? What should I do now? While searching the forums, I came across this thread.

The problem can be solved by using the _FTP* functions, but they work really slowly. Is there any way to circumvent this problem?

Thanks a lot, everyone, and have a good day. :)

Posted

I'm not able to reproduce this behavior at the moment, although I had the same problem the other day. It went away on its own (which is frustrating in itself).


Posted (edited)

I was having a similar issue. I use an .ini file on my site with the versions of my programs. In my Check For Update command, I download the .ini file, read out the latest version, then call FileGetVersion on the program to compare. I found that the only way I could guarantee a fresh copy of the .ini file was to delete it after the user decided whether to update. If I left the copy of the .ini file there, for some reason it was not being overwritten.

Also, I never use the wait mode of InetGet. I call Sleep(250) in a Do loop while checking for progress, but with a maximum count to avoid a hang.

Snippet:

; $url, $d, $localFile and $maxCount are assumed to be set earlier in the script.
; Flags: data 1 = force reload; last 1 = download in the background.
$hDownload = InetGet($url & $d, $localFile, 1, 1)
$count = 0
Do
    $count += 1
    If $count > $maxCount Then ExitLoop ; give up after $maxCount polls
    Sleep(250)
Until InetGetInfo($hDownload, 2) ; index 2 = download complete
InetClose($hDownload) ; always release the handle
If $count <= $maxCount Then ; download finished in time, so do something with it
    ;;
EndIf
Edited by MilesAhead
Posted (edited)

I've just tried setting the download in the background, but it still doesn't work. It seems that if I use Firefox to get the file, I get the latest one at once, but if I use InetGet, I only get the updated file after it has been on the server for 15 minutes or more.

I can also use _FTP_FileGet to get the newest one, but that is not convenient: _FTP_FileGet requires _FTP_Connect first, which needs the password for the site, so if a user decompiles my .exe, my password will be exposed.
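Roughly, the FTP approach looks like this (server name and credentials are placeholders), which shows why the password ends up inside the script:

#include <FTPEx.au3>

; Placeholder server and credentials -- the plain-text password
; is exactly what gets exposed if someone decompiles the .exe.
Local $hOpen = _FTP_Open("MyFTP")
Local $hConn = _FTP_Connect($hOpen, "ftp.example.com", "username", "password")
_FTP_FileGet($hConn, "/version.txt", @ScriptDir & "\version.txt")
_FTP_Close($hConn)
_FTP_Close($hOpen)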

Does anyone know a way to work around this problem? It's driving me mad... ;)

Thanks everyone a lot.

Edited by eEniquEe
Posted

I notice Firefox gets the newest file from my site too. Do you make sure to delete the local file before the InetGet()? I don't know why you should have to, but that was the only way I could download the latest .ini file. Oddly, the zip file seems to be current: even though Chromium gives me an old zip and Firefox gives me the new one, InetGet serves the zip just fine. Quirky business.

Guess trial and error is the only way.
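Something like this, in other words (the URL is a placeholder):

; Delete any stale local copy first so the download
; cannot silently leave the old file in place.
Local $sLocalFile = @ScriptDir & "\FavesVersions.ini"
If FileExists($sLocalFile) Then FileDelete($sLocalFile)
InetGet("http://www.example.com/FavesVersions.ini", $sLocalFile, 1) ; 1 = force reload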

Posted


Well, yes. I think if the extension is not .txt, then everything is just fine.

I tried changing the extension to .emt (some random extension I came up with), and it did work.
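Roughly (the URL is just an example):

; Same text content, renamed on the server from version.txt
; to version.emt -- InetGet/InetRead now fetch a fresh copy.
Local $sData = BinaryToString(InetRead("http://www.example.com/version.emt", 1))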

Thanks a lot, Spammer.

By the way, does anyone know why these caching problems happen with .txt files but not with other extensions? It seems strange to me.

Posted

It also happens with .ini. I use a file FavesVersions.ini to store the version numbers and download names of the programs that have a Check For Update command in the tray menu. I'll get the old one if I don't delete FavesVersions.ini before the call. Maybe it's something to do with binary mode vs. text mode? Could be some "optimization" in text mode or something.

  • 1 year later...
Posted (edited)

I was able to resolve this issue by clearing the IE cache (InetGet uses WinINet, which shares IE's cache):

$ClearID = "8"
Run("RunDll32.exe InetCpl.cpl,ClearMyTracksByProcess " & $ClearID)


Edited by TEKBUSTER
