Showing results for tags 'InetGet'.
-
Hello, is there a way to use InetGet() to capture the content of a 404 error page returned by the web server?

$URL = "https://www.autoitscript.com/ThisPathDoesntExist"
$content = InetGet($URL, "c:\temp\xxx.html", 1 + 2)
ConsoleWrite('@@ Debug(' & @ScriptLineNumber & ') : $content = "' & $content & """" & @CRLF & "@Extended: """ & @extended & """" & @CRLF & '>Error code: ' & @error & @CRLF) ;### Debug Console

Console output:

>"C:\Program Files (x86)\AutoIt3\SciTE\..\AutoIt3.exe" "C:\Program Files (x86)\AutoIt3\SciTE\AutoIt3Wrapper\AutoIt3Wrapper.au3" /run /prod /ErrorStdOut /in "C:\temp\löschmich\xxx.au3" /UserParams
+>15:27:05 Starting AutoIt3Wrapper v.18.708.1148.0 SciTE v.4.1.0.0 Keyboard:00000407 OS:WIN_10/ CPU:X64 OS:X64 Environment(Language:0407) CodePage:0 utf8.auto.check:4
+> SciTEDir => C:\Program Files (x86)\AutoIt3\SciTE UserDir => C:\Users\admin.AD\AppData\Local\AutoIt v3\SciTE\AutoIt3Wrapper SCITE_USERHOME => C:\Users\admin.AD\AppData\Local\AutoIt v3\SciTE
>Running AU3Check (3.3.14.5) from:C:\Program Files (x86)\AutoIt3 input:C:\temp\löschmich\xxx.au3
+>15:27:05 AU3Check ended.rc:0
>Running:(3.3.14.5):C:\Program Files (x86)\AutoIt3\autoit3.exe "C:\temp\löschmich\xxx.au3"
--> Press Ctrl+Alt+Break to Restart or Ctrl+Break to Stop
@@ Debug(6) : $content = "0"
@Extended: "0"
>Error code: 13
+>15:27:05 AutoIt3.exe ended.rc:0
+>15:27:05 AutoIt3Wrapper Finished. >Exit code: 0 Time: 0.9361

The browser (I use Chrome) displays this 404 page, and that's what I'd like to capture:

Not Found
The requested URL /ThisPathDoesntExist was not found on this server.

HTML source (browser Ctrl+U):

<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>404 Not Found</title>
</head><body>
<h1>Not Found</h1>
<p>The requested URL /ThisPathDoesntExist was not found on this server.</p>
</body></html>

Wireshark capture of the 404 response packet:

Hypertext Transfer Protocol
HTTP/1.1 404 Not Found\r\n
Server: nginx\r\n
Date: Wed, 06 Apr 2022 13:34:26 GMT\r\n
Content-Type: text/html; charset=iso-8859-1\r\n
Content-Length: 217\r\n
Connection: keep-alive\r\n
\r\n
[HTTP response 1/1]
[Time since request: 0.056074000 seconds]
[Request in frame: 1476]
[Request URI: http://www.autoitscript.com/ThisPathDoesntExist]
File Data: 217 bytes
Line-based text data: text/html (7 lines)
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>404 Not Found</title>
</head><body>
<h1>Not Found</h1>
<p>The requested URL /ThisPathDoesntExist was not found on this server.</p>
</body></html>

Any suggestions appreciated.

<edit> I also tried _INetGetSource() and InetRead(). </edit>

Rudi
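A possible workaround, not from the original post: InetGet() and InetRead() abort when the server answers with an HTTP error status, so the 404 body never reaches the file. A COM request via the WinHttpRequest object returns both the status code and the response body even for error pages. A minimal sketch, reusing the same URL:

; Sketch: fetch the body of a 404 page with the WinHttpRequest COM object
Local $oHttp = ObjCreate("WinHttp.WinHttpRequest.5.1")
If Not IsObj($oHttp) Then Exit
$oHttp.Open("GET", "https://www.autoitscript.com/ThisPathDoesntExist", False)
$oHttp.Send()
ConsoleWrite("HTTP status: " & $oHttp.Status & @CRLF)   ; 404 for this URL
ConsoleWrite($oHttp.ResponseText & @CRLF)               ; the server's error-page HTML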
-
I have an AutoIt script that monitors 2 websites for content that applies to me and the services that I provide. One site is www.Freelancer.com, the other www.PeoplePerHour.com. Both sites publish new jobs hourly or so. My AutoIt app views those sites and presents new jobs to me in a grid that pops up on my screen. Lately, the app has stopped showing me any jobs from PeoplePerHour. For freelancer.com, InetGet returns the full HTML, but for peopleperhour it no longer returns anything.

Func _CheckPPH()
    Local Static $hTimer = 0
    Local Static $hDownload = 0
    Local $aTitlesandUrls = 0
    Local Static $sTempFile = ""
    If $hTimer = 0 Then $hTimer = TimerInit()
    If $hDownload = 0 Then
        $sTempFile = _WinAPI_GetTempFileName(@TempDir)
        ConsoleWrite("Checking PPH..." & @CRLF)
        ConsoleWrite(">Downloading..." & @CRLF)
        ;~ $hDownload = InetGet("http://www.peopleperhour.com/freelance-jobs", $sTempFile, $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND)
        $hDownload = InetGet("http://www.peopleperhour.com/freelance-jobs", $sTempFile, $INET_FORCERELOAD)
        ;~ Return 0
    EndIf
    ;~ Sleep(30)
    ;~ Local $isCompleted = InetGetInfo($hDownload, $INET_DOWNLOADCOMPLETE)
    ;~ Local $isError = InetGetInfo($hDownload, $INET_DOWNLOADERROR)
    ;~ Sleep(30)
    ;~ If TimerDiff($hTimer) > 3000 And $isError Then
    ;~     ConsoleWrite("!PPH Fail" & @CRLF)
    ;~     InetClose($hDownload)
    ;~     $hDownload = 0
    ;~     Return 0
    ;~ EndIf
    ;~ Sleep(30)
    Local $Show = 0
    ;~ If TimerDiff($hTimer) > 3000 And $isCompleted Then
    If $hDownload > 0 Then
        ConsoleWrite("+Downloaded..." & @CRLF)
        Local $sPPHHtml = FileRead($sTempFile)
        $aTitlesandUrls = _StringBetween($sPPHHtml, '"title">' & @LF, 'time>')
        ;~ _ArrayDisplay($aTitlesandUrls)
        Local $aPPH[0][4]
        Local $sTitle = ""
        Local $sUrl = ""
        Local $sID = ""
        Local $sDate = ""
        Local $iRet = 0
        Sleep(30)
        For $i = 0 To UBound($aTitlesandUrls) - 1
            $sTitle = _StringBetween($aTitlesandUrls[$i], '<a title="', '" class')
            $sUrl = _StringBetween($aTitlesandUrls[$i], 'href="', '">')
            $sDate = _GetDate($aTitlesandUrls[$i])
            If IsArray($sTitle) And IsArray($sUrl) Then
                $sID = _GetID($sUrl[0])
                ;~ _ArrayAdd($aPPH, $sDate & "|" & $sTitle[0] & "|" & $sUrl[0] & "|" & $sID)
                $iRet = _BuildPopupsPPH($sID, $sDate, "PPH: " & $sTitle[0], $sUrl[0])
                If $iRet Then $Show += 1
            EndIf
        Next
        Sleep(30)
        ;~ If $Show > 0 Then ShowLatestJobs()
        ;~ _ArrayDisplay($aPPH)
        FileDelete($sTempFile)
        InetClose($hDownload)
        $hDownload = 0
        $hTimer = 0
        Return $Show
    EndIf
    Sleep(30)
EndFunc   ;==>_CheckPPH
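Not part of the original post, but one cheap thing to test: some sites or their CDNs reject requests that do not look like a browser while still serving other clients, which would fit Freelancer working and PeoplePerHour returning nothing. InetGet() honours HttpSetUserAgent(), so sending a browser-like User-Agent is easy to try inside the existing function; the header string below is just an example:

; Sketch: set a browser-like User-Agent before the existing InetGet call
HttpSetUserAgent("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36")
$hDownload = InetGet("http://www.peopleperhour.com/freelance-jobs", $sTempFile, $INET_FORCERELOAD)
If @error Then ConsoleWrite("!PPH request failed, @error=" & @error & @CRLF)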
-
I am using this code to download a file, but I don't know the correct way to check for errors during the download:

Global $DownloadStatus = InetGet($url, $fileName, Default, $INET_DOWNLOADBACKGROUND)
While 1
    $ArrayOFDownloadStatuts = InetGetInfo($DownloadStatus)
    If Not $ArrayOFDownloadStatuts[4] = 0 Then
        MsgBox(0, "Error", "Error During Downloading")
        ExitLoop
    EndIf
    If $ArrayOFDownloadStatuts[2] Then
        MsgBox(0, "", "Completed")
        ExitLoop
    EndIf
WEnd

I am using that code but I don't know why it reports an error during the download. Is this the wrong way of checking for errors, or is there a network error on my PC? Thank you.
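For reference, here is one way I would write that polling loop, as a sketch that reuses the $url and $fileName variables from the post and the named index constants from InetConstants.au3. If index 4 still comes back non-zero with this version, the transfer itself is genuinely failing (URL, proxy, firewall), not the check:

#include <InetConstants.au3>

Local $hDownload = InetGet($url, $fileName, Default, $INET_DOWNLOADBACKGROUND)
Local $aInfo
While 1
    $aInfo = InetGetInfo($hDownload)              ; full status array for this handle
    If $aInfo[$INET_DOWNLOADERROR] <> 0 Then      ; index 4: non-zero once an error has occurred
        MsgBox(0, "Error", "Error during download, code " & $aInfo[$INET_DOWNLOADERROR])
        ExitLoop
    ElseIf $aInfo[$INET_DOWNLOADCOMPLETE] Then    ; index 2: True once the transfer has finished
        MsgBox(0, "", "Completed")
        ExitLoop
    EndIf
    Sleep(250)
WEnd
InetClose($hDownload)                             ; release the handle when done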
4 replies. Tagged with: inetget, downloading (and 2 more)
-
Hello there, I am using the InetGet example from the help file on a website and getting error 13. I searched the help file and the forum for a list of error codes to consult, to no avail. I suspect this is a 4xx server-reply-based error, but it would be nice to get more information about it. Please help, thanks. L
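Not an answer from the thread, just a diagnostic sketch I would run: InetGet's @error values are not documented as a list, so comparing InetGetSize() and InetGet() against the same URL at least separates request-side problems (DNS, proxy, HTTP status) from local write problems. The URL below is a placeholder:

#include <InetConstants.au3>

Local $sURL = "https://www.example.com/somefile.zip"   ; placeholder, substitute the URL that fails
Local $iSize = InetGetSize($sURL, $INET_FORCERELOAD)
ConsoleWrite("InetGetSize: " & $iSize & "  @error=" & @error & @CRLF)
Local $iBytes = InetGet($sURL, @TempDir & "\somefile.zip", $INET_FORCERELOAD)
ConsoleWrite("InetGet: " & $iBytes & " bytes  @error=" & @error & "  @extended=" & @extended & @CRLF)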
-
Hi Guys, I need help. I searched the forum before posting and couldn't find anything. The code below works fine when downloading files from "http" sites, but when trying to download from "https" sites, no files are downloaded. I tried different sites and I experience the same problem everywhere. Is there something I'm missing or doing wrong? Please note that I'm not a programmer and I'm new to this; I'm just using logic wherever I can to get things done. Your help will be highly appreciated.

#include <InetConstants.au3>
#include <MsgBoxConstants.au3>
#include <WinAPIFiles.au3>

; Download a file in the background.
; Wait for the download to complete.
Example()

Func Example()
    ; Save the downloaded file to the temporary folder.
    Local $sFilePath = "d:\"

    ; Download the file in the background with the selected option of 'force a reload from the remote site.'
    Local $hDownload = InetGet("https://en.wikipedia.org/wiki/HTTPS#/media/File:Internet2.jpg", $sFilePath & "Internet2.jpg", $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND)

    ; Wait for the download to complete by monitoring when the 2nd index value of InetGetInfo returns True.
    Do
        Sleep(250)
    Until InetGetInfo($hDownload, $INET_DOWNLOADCOMPLETE)

    ; Retrieve the number of total bytes received and the filesize.
    Local $iBytesSize = InetGetInfo($hDownload, $INET_DOWNLOADREAD)
    Local $iFileSize = FileGetSize($sFilePath & "Internet2.jpg")

    ; Close the handle returned by InetGet.
    InetClose($hDownload)

    ; Display details about the total number of bytes read and the filesize.
    MsgBox($MB_SYSTEMMODAL, "", "The total download size: " & $iBytesSize & @CRLF & _
            "The total filesize: " & $iFileSize)

    ; Delete the file.
    ;FileDelete($sFilePath)
EndFunc   ;==>Example
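One observation of mine, not from the post: the URL above points at Wikipedia's media-viewer page rather than at the image file itself; everything after the "#" is a fragment that is never sent to the server, so the server has no file to return there. Pointing InetGet at the direct file URL (the "Original file" link on that media page, usually on upload.wikimedia.org) behaves differently. A sketch with a placeholder direct URL:

#include <InetConstants.au3>

Local $sDirectURL = "https://upload.wikimedia.org/<path-to>/Internet2.jpg"   ; placeholder, use the image's direct file URL
Local $hDownload = InetGet($sDirectURL, "d:\Internet2.jpg", $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND)
Do
    Sleep(250)
Until InetGetInfo($hDownload, $INET_DOWNLOADCOMPLETE)
ConsoleWrite("Bytes read: " & InetGetInfo($hDownload, $INET_DOWNLOADREAD) & _
        ", error value: " & InetGetInfo($hDownload, $INET_DOWNLOADERROR) & @CRLF)
InetClose($hDownload)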
-
First script here. Thanks for taking the time. I want to download a file from my Dropbox or another cloud file host, and I want AutoIt to read the file and proceed. Here are the references I've gone through; it's just that I'm not familiar yet with AutoIt, so I'm looking for advice: https://www.autoitscript.com/autoit3/docs/functions/InetGet.htm https://www.autoitscript.com/autoit3/docs/functions/FileRead.htm How would I start out downloading a text file from Dropbox? If the file contains a 1, the rest of the script should proceed; if it contains a 0, or if the file cannot be downloaded, I want it to just end. Thank you for taking the time to read this, and I apologize in advance if this seems very trivial for some, but this is my first script and I'm hoping this is the correct place to ask this question.
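A minimal sketch of that logic (my suggestion, not the poster's code). The share link is hypothetical; for Dropbox share links, the ?dl=1 suffix asks for the raw file instead of the preview page:

#include <InetConstants.au3>

Local $sURL = "https://www.dropbox.com/s/xxxxxxxx/flag.txt?dl=1"   ; hypothetical share link
Local $dData = InetRead($sURL, $INET_FORCERELOAD)
If @error Or @extended = 0 Then Exit                     ; could not download: just end
Local $sFlag = StringStripWS(BinaryToString($dData), 3)  ; 3 = strip leading and trailing whitespace
If $sFlag <> "1" Then Exit                               ; "0" or anything else: end
; ...the rest of the script continues here...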
-
Ever since I upgraded to Windows 10, scripts using InetGet and InetRead no longer seem to recognize Internet Explorer cookies. This makes them useless for websites that require you to be logged in. Has there been some sort of change to IE cookies because of the upgrade and the new Edge browser? Is there a way around this?
-
Hey, I'm trying to use the InetGet function to download multiple images from a website. Some pages have three images, some have 4, some have more. I wrote the code below, and I am having an issue: when AutoIt hits a URL that is not available, the script just stops without any error. I want to save all the images that are available on the page; if no more links are left, the script should move on, but instead it just stops. Here is the complete scenario: I have many web pages, each hosting a random number of images. In the code below, InetGet downloads the first three files easily, and when it reaches the 4th link, which is missing from that page, the script just stops. I want AutoIt to download only the images whose links are available and skip the rest automatically for that page; if on the next page a 4th image link is available, the script should download that 4th one as well. Furthermore, please help me download the files with their original names instead of names I make up, because with InetGet I have to give the file some name instead of the original name it has online. Please help me. Here is my code:

$File6 = @ScriptDir & "\images_source.txt"
$txt6 = FileRead($File6)
$target_source7 = _StringBetween($txt6, 'src="', '"')
If Not @error Then
    InetGet($target_source7[0], @ScriptDir & '\Description\Image1.jpg') ; 1st image downloads because the link is available
    InetGet($target_source7[1], @ScriptDir & '\Description\Image2.jpg') ; 2nd image downloads because the link is available
    InetGet($target_source7[2], @ScriptDir & '\Description\Image3.jpg') ; 3rd image downloads because the link is available
    InetGet($target_source7[3], @ScriptDir & '\Description\Image4.jpg') ; 4th image cannot download because the link is not available, and the script just stops
EndIf
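A note and a sketch from me, not in the original post: when the page has only three images, $target_source7[3] is an out-of-range subscript, and AutoIt terminates the script with an array-subscript error, which is why it appears to just stop. Looping with UBound() over whatever the array actually contains sidesteps that, and the original file name can be taken from the tail of each URL:

#include <String.au3>

Local $sHtml = FileRead(@ScriptDir & "\images_source.txt")
Local $aSrc = _StringBetween($sHtml, 'src="', '"')
If Not @error Then
    For $i = 0 To UBound($aSrc) - 1
        Local $sName = StringRegExpReplace($aSrc[$i], '^.*/', '')   ; original file name = last URL segment
        InetGet($aSrc[$i], @ScriptDir & '\Description\' & $sName)
        If @error Then ContinueLoop                                 ; skip links that fail instead of stopping
    Next
EndIf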
-
Hi Guys, since I'm able to get a Dell equipment warranty status thanks to my API key, I'm using a UDF to extract data from an XML file and get the end date. The thing is, when using InetGet, the original file is in JSON format and the UDF no longer works, even if I download the file with the xml extension. However, when I manually download the page with Chrome, I get a proper XML file where the UDF works fine. Here's my code: I even tried to convert the JSON to XML > https://www.autoitscript.com/forum/topic/185717-js-json-to-xml/ I took a look here https://www.autoitscript.com/forum/topic/104150-json-udf-library-fully-rfc4627-compliant/ but I don't understand anything :/ The XML read UDF is just perfect for my needs but I'm stuck here... Thanks for any help you can provide -31290- 3MTXM12.json 3MTXM12.xml
-
Hi everyone, my script uses IE11 on Win7 to log in to a site and enter data into a couple of forms. Upon clicking a link, this data is used by the site to generate a PDF report. With my current set-up, if I do this manually the PDF opens in a new IE tab and I can download or print it. If I right-click the link that creates the PDF and choose Save Target As, the PDF is generated and the Open/Save As dialogue at the bottom of the screen opens. All good. However, I would like the script to automatically download the PDF, close IE and then exit. Closing IE (_IEQuit) and exiting the script are easy enough, but I'm struggling to get the script to download the PDF. The link that generates the PDF contains a unique number each time the page with the link is reached, so it's not static. The link position, however, is predictable: using _IELinkGetCollection I can tell the link that generates the PDF is always the 10th one from the end of the page, so using $iNumLinks - 10 I am able to click it. What I believe I need to use is InetGet; the problem I've been facing is that the link isn't static and I haven't worked out a way to access the link by index - is this possible? Here is the website HTML for the section containing the link. I don't think it's of much use, but it at least shows the format of the link (I can't post a working link as it's a password-protected area)...

<div class="rmButton right"><a title="Generates a PDF version of the market report in a new window." href="/rmplus/generatePdf?mr_id=60991" target="_blank">print/save as pdf</a></div>

The full link https://www.rightmove.co.uk/rmplus/generatePdf?mr_id=60991, just for completeness - visiting it gives an HTTP 500 unless logged in. And here is the code that clicks this link, opening the generated PDF in a new tab...

$oLinks = _IELinkGetCollection($oIE)
$iNumLinks = @extended
$PrintPDF = _IELinkClickByIndex($oIE, ($iNumLinks - 10))

So, how can I use InetGet to visit that link? Or is there a way to Save As the newly opened tab? I've tried _IEAction($oIE, "saveas") but it seems not to work in a tab containing only a PDF.
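A sketch of one approach (mine, not tested against that site): instead of clicking the link, take its href straight from the link collection and hand that to InetGet(), reusing the $oIE object from the code above. InetGet goes through WinINet, which normally shares IE's cookie store, so the logged-in session may carry over; if it does not, the downloaded file will be the HTTP 500 page instead of the PDF.

#include <IE.au3>

Local $oLinks = _IELinkGetCollection($oIE)
Local $iNumLinks = @extended
Local $oPdfLink = _IELinkGetCollection($oIE, $iNumLinks - 10)   ; same 10th-from-last link as before
Local $sPdfUrl = $oPdfLink.href                                 ; IE returns the absolute URL, e.g. .../rmplus/generatePdf?mr_id=60991
Local $iBytes = InetGet($sPdfUrl, @ScriptDir & "\report.pdf", 1)   ; 1 = force reload
ConsoleWrite("Downloaded " & $iBytes & " bytes, @error=" & @error & @CRLF)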
-
Hi, I want to run UrlDownloadEx in the background so the GUI doesn't hang. It should give a response during the download of any file with this UDF. Please help me. Thank you.
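I can't speak for the UrlDownloadEx UDF itself, but the built-in InetGet() already covers this pattern: started with $INET_DOWNLOADBACKGROUND it returns immediately, and the GUI loop can poll InetGetInfo() for progress. A minimal sketch with a hypothetical URL:

#include <InetConstants.au3>

Local $hDL = InetGet("https://www.example.com/big.zip", @TempDir & "\big.zip", $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND)
While Not InetGetInfo($hDL, $INET_DOWNLOADCOMPLETE)
    ; The GUI stays responsive; update a progress control here from InetGetInfo($hDL, $INET_DOWNLOADREAD)
    Sleep(100)
WEnd
InetClose($hDL)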
2 replies. Tagged with: urldownloadex, net (and 2 more)
-
Hello. What does it mean when InetGet() returns @error = 13? This script worked exactly once to download a single JPG image from a GANZ webcam, model "ZN-MD221M". Any later tries failed. The URL (see script below) doesn't work in FF, IE or Chrome either. In FF it tries to download the image for maybe 5 minutes, then FF gives up, displaying this message:

Die Grafik "http://ADMIN:1234@10.27.20.211/cgi-bin/still-image.cgi?duration=1~60" kann nicht angezeigt werden, weil sie Fehler enthält.

In English, roughly: The image <url> can't be displayed because it contains errors.

A Wireshark trace shows simply no "answer" to the "GET" for the image file. This is my script:

; GANZ Cam im Bunker
#include <inet.au3>
HttpSetProxy(1)
$URL = "http://ADMIN:1234@10.27.20.211/cgi-bin/still-image.cgi?duration=1~60"
ConsoleWrite('@@ Debug(' & @ScriptLineNumber & ') : $URL = ' & $URL & @CRLF & '>Error code: ' & @error & @CRLF) ;### Debug Console
$file = "C:\temp\bunker\bunker.jpg"
FileDelete($file)
ConsoleWrite('@@ Debug(' & @ScriptLineNumber & ') : $File = ' & $File & @CRLF & '>Error code: ' & @error & @CRLF) ;### Debug Console
$result = InetGet($URL, $File, 1 + 2)
ConsoleWrite('@@ Debug(' & @ScriptLineNumber & ') : $result = ' & $result & @CRLF & '>Error code: ' & @error & @CRLF) ;### Debug Console

The console output is this:

>"C:\Program Files (x86)\AutoIt3\SciTE\..\AutoIt3.exe" "C:\Program Files (x86)\AutoIt3\SciTE\AutoIt3Wrapper\AutoIt3Wrapper.au3" /run /prod /ErrorStdOut /in "H:\DATEN\PUBLIC\Bunker\Script\Mittags-ein-Schuss.au3" /UserParams
+>17:18:39 Starting AutoIt3Wrapper v.15.920.938.0 SciTE v.3.6.0.0 Keyboard:00000407 OS:WIN_7/Service Pack 1 CPU:X64 OS:X64 Environment(Language:0407)
+> SciTEDir => C:\Program Files (x86)\AutoIt3\SciTE UserDir => C:\Users\admin\AppData\Local\AutoIt v3\SciTE\AutoIt3Wrapper SCITE_USERHOME => C:\Users\admin\AppData\Local\AutoIt v3\SciTE
>Running AU3Check (3.3.14.2) from:C:\Program Files (x86)\AutoIt3 input:H:\DATEN\PUBLIC\Bunker\Script\Mittags-ein-Schuss.au3
+>17:18:39 AU3Check ended.rc:0
>Running:(3.3.14.2):C:\Program Files (x86)\AutoIt3\autoit3.exe "H:\DATEN\PUBLIC\Bunker\Script\Mittags-ein-Schuss.au3"
--> Press Ctrl+Alt+Break to Restart or Ctrl+Break to Stop
@@ Debug(8) : $URL = http://ADMIN:1234@10.27.20.211/cgi-bin/still-image.cgi?duration=1~60
>Error code: 0
@@ Debug(15) : $File = C:\temp\bunker\bunker.jpg
>Error code: 0
@@ Debug(20) : $result = 0
#######################
>Error code: 13
#######################
+>17:19:09 AutoIt3.exe ended.rc:0
+>17:19:09 AutoIt3Wrapper Finished. >Exit code: 0 Time: 30.6

So InetGet() returns "size of download = 0 bytes" and @error = 13. What does this error code mean? Regards, Rudi.
-
Hi guys, I have the last version of Autoit and Windows 7, if I execute this testing script: #include <InetConstants.au3> ConsoleWrite("WAIT TEST" & @CRLF & @CRLF) $name = @TempDir & "\filetransfer" & Random(100, 999, 1) $hInet = InetGet("http://www.google.com", $name, $INET_FORCERELOAD, $INET_DOWNLOADWAIT) $nSize = FileGetSize($name) ConsoleWrite("SUCCESS" & @CRLF & _ @TAB & "Number of bytes: " & $hInet & @CRLF & _ @TAB & "Size of the file: " & $nSize & @CRLF) FileDelete($name) $name = @TempDir & "\filetransfer" & Random(100, 999, 1) $hInet = InetGet("http://www.google.xxx", $name, $INET_FORCERELOAD, $INET_DOWNLOADWAIT) ;wrong URL $nSize = FileGetSize($name) ConsoleWrite("FAILURE" & @CRLF & _ @TAB & "Number of bytes: " & $hInet & @CRLF & _ @TAB & "Size of the file: " & $nSize & @CRLF) FileDelete($name) ConsoleWrite(@CRLF & "BACKGROUND TEST" & @CRLF & @CRLF) $name = @TempDir & "\filetransfer" & Random(100, 999, 1) $hInet = InetGet("http://www.google.com", $name, $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND) $sAnwser = IsHWnd($hInet) ? "Yes" : "No" Sleep(5000) $nSize = FileGetSize($name) ConsoleWrite('SUCCESS with sleep' & @CRLF & _ @TAB & "Handle: " & $hInet & @CRLF & _ @TAB & "Is really an handle? " & $sAnwser & @CRLF & _ @TAB & "Size of the file: " & $nSize & @CRLF) FileDelete($name) $name = @TempDir & "\filetransfer" & Random(100, 999, 1) $hInet = InetGet("http://www.google.com", $name, $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND) $sAnwser = IsHWnd($hInet) ? "Yes" : "No" $nSize = FileGetSize($name) ConsoleWrite('SUCCESS with no sleep' & @CRLF & _ @TAB & "Handle: " & $hInet & @CRLF & _ @TAB & "Is really an handle? " & $sAnwser & @CRLF & _ @TAB & "Size of the file: " & $nSize & @CRLF) FileDelete($name) $name = @TempDir & "\filetransfer" & Random(100, 999, 1) $hInet = InetGet("http://www.google.xxx", $name, $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND) ;wrong url $sAnwser = IsHWnd($hInet) ? "Yes" : "No" $nSize = FileGetSize($name) ConsoleWrite('FAILURE' & @CRLF & _ @TAB & "Handle: " & $hInet & @CRLF & _ @TAB & "Is really an handle? " & $sAnwser & @CRLF & _ @TAB & "Size of the file: " & $nSize & @CRLF) FileDelete($name) I obtain this output: WAIT TEST SUCCESS Number of bytes: 19163 Size of the file: 19163 FAILURE Number of bytes: 0 Size of the file: 0 BACKGROUND TEST SUCCESS with sleep Handle: 3 Is really an handle? No Size of the file: 19163 SUCCESS with no sleep Handle: 4 Is really an handle? No Size of the file: 0 FAILURE Handle: 5 Is really an handle? No Size of the file: 0 According to Help File, using InetGet with $INET_DOWNLOADBACKGROUND flag, I should get an handle for InetGetInfo function... but instead of it I get a number which seems the current number of downloads (the same that I get if I call InetGetInfo with no arguments). Does it happens with your operative system and Autoit version too?
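For what it's worth, a note of mine rather than part of the post: the value returned in background mode is an internet-download handle, not a window handle, so IsHWnd() returning False is expected. The test that matters is whether InetGetInfo() and InetClose() accept the value, which they do. A small sketch:

#include <InetConstants.au3>

Local $hInet = InetGet("http://www.google.com", @TempDir & "\test.html", $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND)
Do
    Sleep(100)
Until InetGetInfo($hInet, $INET_DOWNLOADCOMPLETE)   ; the returned value works here even though IsHWnd() says False
ConsoleWrite("Bytes read: " & InetGetInfo($hInet, $INET_DOWNLOADREAD) & @CRLF)
InetClose($hInet)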
-
Yosh all, a simple question: how do I get the speed at which a file is being downloaded with InetGet? °-° Just like Firefox/Edge etc. show the download speed. Thanks in advance °-°.
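One way, sketched below, is the same idea the _InetGetGUI() example further down this page uses: start the download in background mode and diff the bytes-read counter about once per second. The URL is a placeholder:

#include <InetConstants.au3>

Local $hDL = InetGet("https://www.example.com/big.zip", @TempDir & "\big.zip", $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND)
Local $iLast = 0, $hTimer = TimerInit()
While Not InetGetInfo($hDL, $INET_DOWNLOADCOMPLETE)
    If TimerDiff($hTimer) >= 1000 Then
        Local $iRead = InetGetInfo($hDL, $INET_DOWNLOADREAD)
        ConsoleWrite("Speed: " & Round(($iRead - $iLast) / 1024, 1) & " KB/s" & @CRLF)
        $iLast = $iRead
        $hTimer = TimerInit()
    EndIf
    Sleep(50)
WEnd
InetClose($hDL)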
-
I am getting a corrupt Excel file when I use InetGet().

$sFilePath = @ScriptDir & "\List.xls"
$sDownloadURL = "https://alachua.lienexpress.net/certificates/list.xls?q=%7B%7D%0A"
$hDownload = InetGet($sDownloadURL, $sFilePath)
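A diagnostic idea from me, not from the post: when a server answers with an HTML error or login page instead of the spreadsheet, Excel reports the saved file as corrupt. Peeking at the start of the downloaded file shows which case this is. Note too that the trailing %0A in the query string is an encoded newline, which may or may not be intended.

; Sketch: check whether the downloaded "xls" is actually HTML
Local $sHead = FileRead($sFilePath, 200)
If StringInStr($sHead, "<html") Or StringInStr($sHead, "<!DOCTYPE") Then
    ConsoleWrite("Server returned HTML, not a spreadsheet:" & @CRLF & $sHead & @CRLF)
EndIf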
-
This is just a 2014 update of >_InetGetGUI() and >_InetGetProgress(). I have purposely left the original examples untouched just for nostalgic purposes. _InetGetGUI() Example and Function: #AutoIt3Wrapper_Au3Check_Parameters=-d -w 1 -w 2 -w 3 -w- 4 -w 5 -w 6 -w 7 #include <GUIConstantsEx.au3> #include <InetConstants.au3> #include <MsgBoxConstants.au3> #include <StringConstants.au3> Example() Func Example() Local $hGUI = GUICreate('_InetGetGUI()', 370, 90) Local $iLabel = GUICtrlCreateLabel('Welcome to the simple downloader', 5, 5, 270, 40) Local $iStartClose = GUICtrlCreateButton('&Download', 275, 2.5, 90, 25) Local $iProgressBar = GUICtrlCreateProgress(5, 60, 360, 20) GUISetState(@SW_SHOW, $hGUI) Local $sFilePath = '', $sFilePathURL = 'http://ipv4.download.thinkbroadband.com/5MB.zip' While 1 Switch GUIGetMsg() Case $GUI_EVENT_CLOSE ExitLoop Case $iStartClose $sFilePath = _InetGetGUI($sFilePathURL, $iLabel, $iProgressBar, $iStartClose, @TempDir) Switch @error ; Check what the actual error was. Case 1 ; $INETGET_ERROR_1 MsgBox($MB_SYSTEMMODAL, 'Error', 'Check the URL or your Internet connection is working.') Case 2 ; $INETGET_ERROR_2 MsgBox($MB_SYSTEMMODAL, 'Fail', 'It appears the user interrupted the download.') Case Else MsgBox($MB_SYSTEMMODAL, 'Success', 'Successfully downloaded "' & $sFilePath & '"') EndSwitch EndSwitch WEnd GUIDelete($hGUI) EndFunc ;==>Example ; #FUNCTION# ==================================================================================================================== ; Name ..........: _InetGetGUI ; Description ...: Download a file updating a GUICtrlCreateProgress() ; Syntax ........: _InetGetGUI($sURL, $iLabel, $iProgress, $iButton[, $sDirectory = @ScriptDir]) ; Parameters ....: $sURL - A valid URL that contains the filename too ; $iLabel - ControlID of a GUICtrlCreateLabel comtrol. ; $iProgress - ControlID of a GUICtrlCreateProgress control. ; $iButton - ControlID of a GUICtrlCreateButton control. ; $sDirectory - [optional] Directory of where to download. Default is @ScriptDir. ; Return values .: Success - Downloaded filepath. 
; Failure - Blank string & sets @error to non-zero ; Author ........: guinness ; Example .......: Yes ; =============================================================================================================================== Func _InetGetGUI($sURL, $iLabel, $iProgress, $iButton, $sDirectory = @ScriptDir) Local Enum $INETGET_ERROR_0, $INETGET_ERROR_1, $INETGET_ERROR_2 Local $sFilePath = StringRegExpReplace($sURL, '^.*/', '') If StringStripWS($sFilePath, $STR_STRIPALL) == '' Then Return SetError($INETGET_ERROR_1, 0, $sFilePath) EndIf $sFilePath = StringRegExpReplace($sDirectory, '[\\/]+$', '') & '\' & $sFilePath Local $iFileSize = InetGetSize($sURL, $INET_FORCERELOAD) Local $hDownload = InetGet($sURL, $sFilePath, $INET_LOCALCACHE, $INET_DOWNLOADBACKGROUND) Local Const $iRound = 0 Local $iBytesRead = 0, $iPercentage = 0, $iSpeed = 0, _ $sProgressText = '', $sSpeed = 'Current speed: ' & _ByteSuffix($iBytesRead - $iSpeed) & '/s' Local $hTimer = TimerInit() Local $iError = $INETGET_ERROR_0, _ $sRead = GUICtrlRead($iButton) GUICtrlSetData($iButton, '&Cancel') While Not InetGetInfo($hDownload, $INET_DOWNLOADCOMPLETE) Switch GUIGetMsg() Case $GUI_EVENT_CLOSE, $iButton GUICtrlSetData($iLabel, 'Download cancelled.') $iError = $INETGET_ERROR_2 ExitLoop EndSwitch $iBytesRead = InetGetInfo($hDownload, $INET_DOWNLOADREAD) $iPercentage = $iBytesRead * 100 / $iFileSize $sProgressText = 'Downloading ' & _ByteSuffix($iBytesRead, $iRound) & ' of ' & _ByteSuffix($iFileSize, $iRound) & @CRLF & $sSpeed GUICtrlSetData($iLabel, $sProgressText) GUICtrlSetData($iProgress, $iPercentage) If TimerDiff($hTimer) >= 1000 Then $sSpeed = 'Current speed: ' & _ByteSuffix($iBytesRead - $iSpeed) & '/s' $iSpeed = $iBytesRead $hTimer = TimerInit() EndIf Sleep(20) WEnd InetClose($hDownload) GUICtrlSetData($iButton, $sRead) If $iError > $INETGET_ERROR_0 Then FileDelete($sFilePath) $sFilePath = '' EndIf Return SetError($iError, 0, $sFilePath) EndFunc ;==>_InetGetGUI Func _ByteSuffix($iBytes, $iRound = 2) ; By Spiff59 Local Const $aArray[9] = [' bytes', ' KB', ' MB', ' GB', ' TB', ' PB', ' EB', ' ZB', ' YB'] Local $iIndex = 0 While $iBytes > 1023 $iIndex += 1 $iBytes /= 1024 WEnd Return Round($iBytes, $iRound) & $aArray[$iIndex] EndFunc ;==>_ByteSuffix _InetGetProgress() Example and Function: #AutoIt3Wrapper_Au3Check_Parameters=-d -w 1 -w 2 -w 3 -w- 4 -w 5 -w 6 -w 7 #include <InetConstants.au3> #include <MsgBoxConstants.au3> #include <StringConstants.au3> Example() Func Example() Local $sFilePathURL = 'http://ipv4.download.thinkbroadband.com/5MB.zip' Local $sFilePath = _InetGetProgress($sFilePathURL, @TempDir) If @error Then MsgBox($MB_SYSTEMMODAL, 'Error', 'Check the URL or your Internet connection is working.') Else MsgBox($MB_SYSTEMMODAL, 'Success', 'Successfully downloaded "' & $sFilePath & '"') EndIf EndFunc ;==>Example ; #FUNCTION# ==================================================================================================================== ; Name ..........: _InetGetProgress ; Description ...: Download a file showing a progress bar using ProgressOn. ; Syntax ........: _InetGetProgress($sURL[, $sDirectory = @ScriptDir]) ; Parameters ....: $sURL - A valid URL that contains the filename too. ; $sDirectory - [optional] Directory of where to download. Default is @ScriptDir. ; Return values .: Success - Downloaded filepath. 
; Failure - Blank string & sets @error to non-zero ; Author ........: guinness ; Example .......: Yes ; =============================================================================================================================== Func _InetGetProgress($sURL, $sDirectory = @ScriptDir) Local Const $INETGET_ERROR_1 = 1 Local $sFilePath = StringRegExpReplace($sURL, '^.*/', '') If StringStripWS($sFilePath, $STR_STRIPALL) == '' Then Return SetError($INETGET_ERROR_1, 0, $sFilePath) EndIf $sFilePath = StringRegExpReplace($sDirectory, '[\\/]+$', '') & '\' & $sFilePath Local $iFileSize = InetGetSize($sURL, $INET_FORCERELOAD) Local $hDownload = InetGet($sURL, $sFilePath, $INET_LOCALCACHE, $INET_DOWNLOADBACKGROUND) Local Const $iRound = 0 Local $iBytesRead = 0, $iPercentage = 0, $iSpeed = 0, _ $sProgressText = '', $sSpeed = 'Current speed: ' & _ByteSuffix($iBytesRead - $iSpeed) & '/s' Local $hTimer = TimerInit() ProgressOn('', '') While Not InetGetInfo($hDownload, $INET_DOWNLOADCOMPLETE) $iBytesRead = InetGetInfo($hDownload, $INET_DOWNLOADREAD) $iPercentage = $iBytesRead * 100 / $iFileSize $sProgressText = 'Downloading ' & _ByteSuffix($iBytesRead, $iRound) & ' of ' & _ByteSuffix($iFileSize, $iRound) & @CRLF & $sSpeed ProgressSet(Round($iPercentage, $iRound), $sProgressText, 'Downloading: ' & $sFilePath) If TimerDiff($hTimer) >= 1000 Then $sSpeed = 'Current speed: ' & _ByteSuffix($iBytesRead - $iSpeed) & '/s' $iSpeed = $iBytesRead $hTimer = TimerInit() EndIf Sleep(20) WEnd InetClose($hDownload) ProgressOff() Return $sFilePath EndFunc ;==>_InetGetProgress Func _ByteSuffix($iBytes, $iRound = 2) ; By Spiff59 Local Const $aArray[9] = [' bytes', ' KB', ' MB', ' GB', ' TB', ' PB', ' EB', ' ZB', ' YB'] Local $iIndex = 0 While $iBytes > 1023 $iIndex += 1 $iBytes /= 1024 WEnd Return Round($iBytes, $iRound) & $aArray[$iIndex] EndFunc ;==>_ByteSuffix
-
Prior to posting this question, I experimented with the FileExists, InetGetInfo and InetGet/InetClose functions to test downloading files into my application folder in case a file did not already exist. In my case, I was able to point at a very specific web address, e.g. https://dropbox.com, etc., and then check/download the file. Now my question is, instead of checking against a hardcoded URL, is it possible to:

1. Do a check against the IP address of a remote computer, e.g. my friend's or colleague's desktop computer.
2. After checking that the IP address is valid and can be connected to, access that computer's hard drive location, e.g. that machine's C:\specifiedFolder\specifiedFile.txt.
3. Download a copy of that file to the local computer.

I did read an earlier topic on this, but that covered what I had already accomplished, so now I am not sure, if I were to change the method of file retrieval, what changes would be necessary?
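One interpretation of the question, sketched with built-ins only: check that the remote host answers, then copy the file over a UNC path. The host address, share name and file path below are hypothetical, and the folder on the remote machine has to be shared (or reachable via an administrative share) for this to work.

Local $sHost = "192.168.1.50"                                          ; hypothetical remote IP
If Ping($sHost, 1000) > 0 Then                                         ; host answers ICMP within 1 second
    Local $sRemote = "\\" & $sHost & "\SharedFolder\specifiedFile.txt" ; hypothetical share and file
    If FileExists($sRemote) Then
        FileCopy($sRemote, @ScriptDir & "\specifiedFile.txt", 1)       ; 1 = overwrite the local copy
    EndIf
EndIf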
5 replies. Tagged with: FileExists, DllCall (and 4 more)
-
Hi Guys, I'm trying to create a program that'll download .torrent files for me automatically and place them in a folder so uTorrent starts downloading them. The whole script works flawless (for now) except for the most important part: Downloading the .torrent file. It works, it downloads the .torrent file perfectly, but for some reason uTorrent gives me the error that 'the torrent file was not correctly encoded'. For some reason downloading the torrent with InetGet instead of my browser, fucks it up. The size of the torrent is exactly the same as that of the one I download with my browser, still the files are different. This is my script: #cs ---------------------------------------------------------------------------- AutoIt Version: 3.3.8.1 Author: myName Script Function: Template AutoIt script. #ce ---------------------------------------------------------------------------- ; Script Start - Add your code below here #include <INet.au3> #include <Array.au3> $downloadfolder = 'C:\Users\Ludo\Downloads\torrents' $searchquiry = StringReplace('the hobbit desolation of smaug', ' ', '+') $preferedsite = 'kickmirror' $link = 'http://torrentz.eu/search?f='&$searchquiry $source = _INetGetSource($link, True) $S1 = StringSplit($source, '<a rel="nofollow" href="/searchA?f='&$searchquiry&'"> date </a> | <a rel="nofollow" href="/searchS?f='&$searchquiry&'"> size </a> | <b> peers </b></h3>', 1) $S2 = StringSplit($S1[2], '<dl><dt style="text-align: right">', 1) $S3 = StringSplit($S2[1], @LF, 1) global $torrents[$S3[0]+1][7] $torrents[0][0] = $S3[0]-1 ; Form of $torrents[a][b] for b: ; $torrents[a][0] = total string ; $torrents[a][1] = torrent url ; $torrents[a][2] = torrent title ; $torrents[a][3] = torrent size ; $torrents[a][4] = torrent seeders ; $torrents[a][5] = torrent peers ; $torrents[a][6] = torrent type For $i = 1 to $torrents[0][0] ;MsgBox(0, '', _StringBetw($S3[$i], '<a href="', '">')) $torrents[$i][0] = $S3[$i] $torrents[$i][1] = _StringBetw($S3[$i], '<a href="', '">') $torrents[$i][2] = _StringStrip(_StringBetw($S3[$i], '<a href="'&$torrents[$i][1]&'">', '</a>')) $temp1 = StringSplit($S3[$i], '</a> » ', 1) $temp2 = StringSplit($temp1[2], '</dt><dd>', 1) $temp3 = StringSplit($S3[$i], '</span></span><span class="s">', 1) $temp4 = StringSplit($temp3[2], '</span> <span class="u">', 1) $temp5 = StringSplit($temp4[2], '</span><span class="d">', 1) $temp6 = StringSplit($temp5[2], '</span>', 1) $torrents[$i][3] = $temp4[1] $torrents[$i][4] = $temp5[1] $torrents[$i][5] = $temp6[1] $torrents[$i][6] = $temp2[1] Next ;_ArrayDisplay($torrents) ;ClipPut($torrents[1][1]&@CRLF&@CRLF&$torrents[$torrents[0][0]][2]) $source2 = _INetGetSource('http://torrentz.eu/'&$torrents[1][1]) $A1 = StringSplit($source2, ' torrent download locations</h2><dl><dt>', 1) $A2 = StringSplit($A1[1], '</span> ', 1) $A3 = StringSplit($A1[2], '<a href="', 1) $locations = $A2[$A2[0]] global $tors[$locations+1] $n = 0 For $i = 2 to $locations $A4 = StringSplit($A3[$i], '" ', 1) $tors[$i] = $A4[1] If StringInstr($tors[$i], $preferedsite) Then $n = $i EndIf Next If $n = 0 Then Msgbox(32, 'Too bad', 'No kickmirror torrent links found..') Exit EndIf ;_ArrayDisplay($tors) $source3 = _INetGetSource($tors[$n], True) ;$B1 = _StringBetw($source3, '<a title="Magnet link" href="', '"') ;ShellExecute($B1) $B1 = _StringBetw($source3, '<a rel="nofollow" title="Download verified torrent file" href="', '"') $B2 = StringSplit($B1, '.torrent?title=', 1) $finallink = $B2[1]&'.torrent' InetGet($finallink,$downloadfolder&'\'&$B2[2]&'.torrent', 4) MsgBox(32, 
'Succes', 'Torrent started downloading!') Func _StringBetw($string, $start, $end) $pa = StringSplit($string, $start, 1) If $pa[0] < 2 Then Return 0 $pb = StringSplit($pa[2], $end, 1) Return $pb[1] EndFunc Func _StringStrip($string) $s = StringReplace($string, '<b>', '') $s1 = StringReplace($s, '</b>', '') Return $s1 EndFunc Please try it out, then try to run the torrent with utorrent or some other torrent downloader. If somebody knows what the problem is, I'd be very happy if you'd help me here! Thnx in advance, Ludo
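A diagnostic idea from me, not from the thread above: a valid .torrent file is bencoded, so its first byte is "d" (most begin with "d8:announce"), while a blocked or interstitial page starts with "<". Checking the first bytes of what InetGet saved usually shows whether the mirror served a torrent at all. It also costs nothing to drop option 4 (ASCII transfer, documented for FTP) from the final InetGet call and stay with the default binary mode when fetching binary data.

; Sketch: peek at the file the script just saved (reuses $downloadfolder and $B2 from the code above)
Local $sSaved = $downloadfolder & '\' & $B2[2] & '.torrent'
Local $sHead = FileRead($sSaved, 16)
ConsoleWrite("First bytes: " & $sHead & @CRLF)   ; expect something like d8:announce... for a real torrent
If StringLeft($sHead, 1) <> "d" Then ConsoleWrite("! This is probably not a bencoded torrent file" & @CRLF)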
-
Hello everyone. I'm having an issue with the InetGet function and I was hoping someone could provide some assistance. I'm writing a script that monitors several servers and sends information about performance characteristics, via a web service, to a server. I then get a dashboard of the health of all my servers. For this I've written a simple update program. The idea is that the program should check online for a new version and, if one is found, download and install it. In testing it all works, but in practice I've run into issues. One was that the updater program (a separate script from the main script) was crashing when trying to unzip the downloaded zip file (which contains the new version of the scripts and the configuration for the script). This happened on about 60% of the servers. On closer inspection, it seems that the InetGet function doesn't download the entire file; it downloads only about 140 KB of it, and therefore unzipping the file crashes the script. To avoid crashing the script, I added a check of the file size before and after the download. That works, of course, but it doesn't solve the problem. I guess it has to do with firewalls, antivirus software or something like that. The problem is that these servers are on infrastructure that is out of my control. Downloading the file manually through Internet Explorer seems to work, though. Any help would be appreciated. Here is a part of my code:

$dlsize = InetGetSize($dl4, 1)
FileWriteLine(@ScriptDir & "\verbose.log", @YEAR & "-" & @MON & "-" & @MDAY & " " & @HOUR & ":" & @MIN & ":" & @SEC & " - Updater - Downloading file: " & $dl4)
FileWriteLine(@ScriptDir & "\verbose.log", @YEAR & "-" & @MON & "-" & @MDAY & " " & @HOUR & ":" & @MIN & ":" & @SEC & " - Updater - File is: " & $dlsize & " bytes.")
InetGet($dl4, @TempDir & "\system.zip", 1)
Sleep(2000)
$retry = 10
While 1
    If FileExists(@TempDir & "\system.zip") Then
        If $dlsize = FileGetSize(@TempDir & "\system.zip") Then
            FileWriteLine(@ScriptDir & "\verbose.log", @YEAR & "-" & @MON & "-" & @MDAY & " " & @HOUR & ":" & @MIN & ":" & @SEC & " - Updater - Download OK")
            ExitLoop
        EndIf
    Else
        FileWrite(@TempDir & "\system.zip", InetRead($dl4, 1))
    EndIf
    Sleep(1000)
    If $retry = 0 Then
        FileWriteLine(@ScriptDir & "\verbose.log", @YEAR & "-" & @MON & "-" & @MDAY & " " & @HOUR & ":" & @MIN & ":" & @SEC & " - Updater - Download Failed")
        If $silent = 0 Then MsgBox(0, "ERROR", "Could not download file. Check Antivirus and firewalls.")
        Exit
    EndIf
    $retry = $retry - 1
WEnd
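For what it's worth, a pattern I would try here (my sketch, not a verified fix): run the download in background mode and wait on InetGetInfo() instead of a fixed Sleep(2000), then retry the whole InetGet a few times if the success flag or the size check says it stopped short. The function below reuses nothing from the script above except the idea of comparing against InetGetSize().

#include <InetConstants.au3>

Func _DownloadWithRetry($sURL, $sDest, $iTries = 3, $iTimeoutMs = 60000)
    Local $iSize = InetGetSize($sURL, $INET_FORCERELOAD)   ; only meaningful if the server reports a size
    Local $hDL, $hTimer, $bOk
    For $i = 1 To $iTries
        $hDL = InetGet($sURL, $sDest, $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND)
        $hTimer = TimerInit()
        While Not InetGetInfo($hDL, $INET_DOWNLOADCOMPLETE) And TimerDiff($hTimer) < $iTimeoutMs
            Sleep(250)
        WEnd
        $bOk = InetGetInfo($hDL, $INET_DOWNLOADSUCCESS) And FileGetSize($sDest) = $iSize
        InetClose($hDL)
        If $bOk Then Return True
        FileDelete($sDest)    ; throw away the partial file before retrying
    Next
    Return False
EndFunc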
-
So I'm attempting to download a file that may have a new name that changes every day. I can parse the file name, but for some reason my script just hangs or I never get the downloaded file. If I paste my download link, from the ClipPut, into a browser, it works perfectly. Below is what I have so far; can anyone see what nuance I'm missing? Thanks, -Mike

#include <IE.au3>

Global $SequenceNumber

GetFileName()
Download()
Exit

Func GetFileName()
    Local $oIE = _IECreate("http://www.symantec.com/security_response/definitions/download/detail.jsp?gid=sonar", 0, 0)
    Local $sString = _IEBodyReadHTML($oIE)
    Local $iPosSeqStart = StringInStr($sString, 'http://definitions.symantec.com/defs/sonar/')
    Local $iPosSeqEnd = StringInStr($sString, '">', 0, 1, $iPosSeqStart)
    $SequenceNumber = (StringMid($sString, $iPosSeqStart + 43, $iPosSeqEnd - $iPosSeqStart - 43) & @CRLF)
    _IEQuit($oIE)
    MsgBox(0, "Download Information:", "DefFile: " & $SequenceNumber & @CRLF & "DownloadPath: " & "http://definitions.symantec.com/defs/sonar/" & $SequenceNumber)
    ClipPut("http://definitions.symantec.com/defs/sonar/" & $SequenceNumber)
EndFunc   ;==>GetFileName

Func Download()
    ; Advanced example - downloading in the background
    Local $hDownload = InetGet("http://definitions.symantec.com/defs/sonar/" & $SequenceNumber, @ScriptDir & "\" & $SequenceNumber, 1, 1)
    Do
        Sleep(250)
    Until InetGetInfo($hDownload, 2) ; Check if the download is complete.
    Local $nBytes = InetGetInfo($hDownload, 0)
    InetClose($hDownload) ; Close the handle to release resources.
    MsgBox(0, "", "Bytes read: " & $nBytes)
EndFunc   ;==>Download
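One thing that stands out to me (not confirmed by the poster): $SequenceNumber has " & @CRLF" appended when it is built, so both the download URL and the local file name end with a carriage return/line feed. The browser test still works because an address bar trims that whitespace. Stripping it before building the URL is the first thing I would try, inside GetFileName():

; Drop the trailing @CRLF instead of appending it (add line breaks only when printing, if needed)
$SequenceNumber = StringStripWS(StringMid($sString, $iPosSeqStart + 43, $iPosSeqEnd - $iPosSeqStart - 43), 3)   ; 3 = strip leading and trailing whitespace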
-
This is unrelated to the topic I posted earlier today. I'm using InetGet to retrieve a file online. Sometimes this works; sometimes it doesn't work but InetGetInfo gives a non-zero error number, so at least I have an idea about what went wrong in that case. However, the most confusing case of all is that I sometimes get no error number yet the file does not appear in my target directory. My code is pasted below:

$i = 99999
$keyword = "amazon"
$hDownload = InetGet("http://www.google.com/trends/fetchComponent?q=" & $keyword & "&geo=US&cid=TIMESERIES_GRAPH_0&export=3&date=today%203-m&cmpt=q", @DesktopDir & "data_" & $i & ".txt", 0, 1)
$inetgettime = 0
Do
    Sleep(250)
    $inetgettime = $inetgettime + 0.25
    If $inetgettime > 10 Then
        $timedout = 1
        InetClose($hDownload)
        ConsoleWrite("timed out waiting for download " & $inetgettime & @CRLF)
        $timedout = 1
        ExitLoop
    EndIf
Until InetGetInfo($hDownload, 2) ; Check if the download is complete.
$err = InetGetInfo($hDownload)
ConsoleWrite("internet error = " & $err[4] & @CRLF)

When this strange non-saving-with-$err-equal-to-0 behaviour occurs, it is too fast for my 10-second timeout loop: the script takes about 0.9 seconds to run when the behaviour described above shows up. Does anybody have an idea as to the cause of this?
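A detail worth checking (my observation; it may not explain why the behaviour is intermittent): the target path concatenates @DesktopDir and "data_" without a backslash, and @DesktopDir has no trailing separator, so the file is written into the Desktop folder's parent directory under a name like "Desktopdata_99999.txt". That would produce exactly the "no error but no file in my target directory" symptom. Adding the separator:

; Add the missing path separator so the file really lands on the Desktop
$hDownload = InetGet("http://www.google.com/trends/fetchComponent?q=" & $keyword & _
        "&geo=US&cid=TIMESERIES_GRAPH_0&export=3&date=today%203-m&cmpt=q", _
        @DesktopDir & "\data_" & $i & ".txt", 0, 1)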
-
I'm using InetGet to download some files but get error "13" from InetGet after the 4th download. This occurs while using the Hidemyass Pro VPN. If I can understand what error 13 actually means, I'll have a better chance of debugging. Does anybody know what error 13 means? I've looked at the source in the Include folder for "IE" and "Inet" but can't see the source for InetGet in either. Any help is greatly appreciated.
-
Right now the fastest way I can mine someone's database is by making hundreds of individual executables that each do their own InetGet TCP request; obviously this takes up a lot of processing and RAM resources. Does anyone know a way I can make more requests for pages faster or more efficiently? The scripts that I run look something like this...

; A separate script makes a txt file with part of a URL to go to
#include <File.au3>
#include <Array.au3>

$htmlstore = @DesktopCommonDir & "\HTMLstore\"
$FileList = _FileListToArray($htmlstore)
For $count = 2 To $FileList[0] + 1
    If FileExists($htmlstore & $count & ".txt") = 1 Then
        FileMove($htmlstore & $count & ".txt", $htmlstore & @AutoItPID & @ComputerName & ".txt")
        $file = FileOpen($htmlstore & @AutoItPID & @ComputerName & ".txt")
        $PID = FileRead($file)
        FileClose($file)
        $hDownload = InetGet("http://www.ocpafl.org/Searches/ParcelSearch.aspx/PID/" & $PID, $htmlstore & $PID & ".html", 1)
        InetClose($hDownload) ; Close the handle to release resources.
        FileDelete($htmlstore & @AutoItPID & @ComputerName & ".txt")
        Exit
    EndIf
Next

If you need more info let me know. Any recommendations would be appreciated.
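One process can run many InetGet() transfers at the same time when they are started in background mode, which avoids spawning hundreds of executables. A sketch of the idea, with placeholder parcel IDs and the same target folder as the script above:

#include <InetConstants.au3>

Local $htmlstore = @DesktopCommonDir & "\HTMLstore\"
Local $aPID[3] = ["012345", "012346", "012347"]    ; placeholder parcel IDs
Local $aHandle[UBound($aPID)]
For $i = 0 To UBound($aPID) - 1                    ; start all downloads without waiting
    $aHandle[$i] = InetGet("http://www.ocpafl.org/Searches/ParcelSearch.aspx/PID/" & $aPID[$i], _
            $htmlstore & $aPID[$i] & ".html", $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND)
Next
For $i = 0 To UBound($aHandle) - 1                 ; then wait for each transfer to finish
    Do
        Sleep(50)
    Until InetGetInfo($aHandle[$i], $INET_DOWNLOADCOMPLETE)
    InetClose($aHandle[$i])
Next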
-
Hi guys, I have a question about using InetGet. I have made this script:

SplashTextOn("", "Downloading...", 200, 40, -1, -1, 1, "", 10, "")
Local $hDownload = InetGet("https://github.com/test/test/zipball/master", @WorkingDir & "\Test.zip", 1, 1)
Do
    Sleep(250)
Until InetGetInfo($hDownload, 2)
InetClose($hDownload)
SplashOff()
MsgBox(0, "", "Completed")

The script works, but I made up the file name (Test.zip). So I have these questions: 1) Can the script automatically use the original file name? 2) Can the script automatically set the extension? (optional, I think this is not possible) Thanks
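As far as I know, InetGet always writes to whatever name you give it; the server-suggested name (for a GitHub zipball it arrives in the Content-Disposition response header) has to be fetched separately. A sketch of that idea using the WinHttpRequest COM object, assuming the server answers a HEAD request with the same headers as the download itself; the header parsing is kept deliberately simple:

Local $oHttp = ObjCreate("WinHttp.WinHttpRequest.5.1")
If Not IsObj($oHttp) Then Exit
$oHttp.Open("HEAD", "https://github.com/test/test/zipball/master", False)
$oHttp.Send()
Local $sHeaders = $oHttp.GetAllResponseHeaders()
Local $aName = StringRegExp($sHeaders, '(?i)filename=([^;\r\n]+)', 1)   ; pull the name out of Content-Disposition, if present
If Not @error Then ConsoleWrite("Server-suggested name: " & $aName[0] & @CRLF)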