
RapidQueuer 2.4.4


Datenshi

Datenshi,

Hmm... Perhaps Collector accounts don't have this "logged IP" problem!

I am very pleased to have been of some assistance :D . It was a challenge for me (I don't understand TCP), and that equates to fun. It's also a way of showing my appreciation to you for sharing your script. I attempted to do this same thing a year ago, but failed when confronted with submitting the form data in order to progress to the next page.

My tests with my personal files indicate that the correct file size is returned by the URL for the first page, but I like your idea of using one of the mirrors, provided that does not cause the same issue with RapidPoints being awarded.

I can't run any more tests for at least another 19 hours without the risk of exceeding the RS daily IP limit (I don't know the precise limit), so I'll try to discover what's causing this "logged IP" issue with Premium accounts at that time, and will let you know what I find.

Cheers,

Pinchly



Datenshi,

It's a good thing that I patiently waited 19 hours for the day to change on the RS server, because otherwise the IP count against my daily limit would have increased when I performed several tests! Here's what I found regarding the file-size and logged-IP issues:

1. My tests indicate that when InetGetSize($DownloadLinkList[$x]) is used to obtain the file size, the current IP is recorded in the premium account log, and that is very, very bad! However, it is the only method that results in a RapidPoint being awarded.

2. When InetGetSize($FastestMirror) is used to obtain the file size, nothing bad happens in the log file (the IP is not recorded), but as you already know, no RapidPoint is awarded.

3. When one of the mirror links (other than $FastestMirror) is used to obtain the file size, nothing bad happens in the log file (the IP is not recorded). Unfortunately, no RapidPoint is awarded.

To isolate the logged-IP problem, I made a test script and tried InetGetSize using a "first page" URL. That alone caused the current IP to be recorded in my premium RS log. This occurs even with Firefox cookies off and when I am logged out of my RS account (which deletes the RS cookie). I'm guessing, but InetGetSize is probably Internet Explorer based, and RS likely sees that cookie, which is required in order to use a download manager with premium accounts.
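A minimal version of that test script looks something like this (untested sketch; the URL is just a placeholder for any RS "first page" link):

; Ask for the size of a "first page" link and show the result plus the error code
Local $iSize = InetGetSize("http://rapidshare.com/files/987654321/Diamond5.zip")
Local $iError = @error
MsgBox(0, "InetGetSize test", "Reported size: " & $iSize & " bytes (@error = " & $iError & ")")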

That's unfortunate, because my tests indicate that the correct file size is returned when the "first page" URL is used, and, as I've stated, that appears to be the only method that awards RapidPoints. But this is absolutely disastrous for premium users :D !

I currently have only one idea, but I haven't tried it. What would happen if one loop were performed just to get the file sizes from any mirror, and a second loop were performed to do everything else? Since RS issues new URLs each time the "first page" URL is requested, I think this would work.
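Very roughly, something like this (untested outline; _GetMirrorURL() is a made-up helper standing in for however RQ picks a mirror from the first page, $FileSizeList is a hypothetical array, and I'm assuming $DownloadLinkList keeps its count in element 0):

; Pass 1: sizes only, taken from a mirror so the premium IP is not logged
For $x = 1 To $DownloadLinkList[0]
    $FileSizeList[$x] = InetGetSize(_GetMirrorURL($DownloadLinkList[$x]))
Next

; Pass 2: the downloads themselves, requesting fresh "first page" URLs as usual
For $x = 1 To $DownloadLinkList[0]
    ; ...the existing download logic for $DownloadLinkList[$x]...
Next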

Cheers,

Pinchly


I might have to write my own InetGet function for this to work properly; I've tried a couple of things today. Once you've established a connection and started to download the file from Rapidshare, any InetGetSize request will return the size of the "your IP is already in use" page. If you send an InetGetSize request before the download, no point will be added, and sending it after the download would of course be useless. I suspect the opening and closing of connections is the culprit here.

POST request sent when you click the Download button on the last page of RS:

POST /files/353535353/35533535/test.rar HTTP/1.1
Host: rs744tl2.rapidshare.com
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.11) Gecko/2009060215 Firefox/3.0.11 (.NET CLR 3.5.30729)
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Referer: http://rs744.rapidshare.com/files/35353535/test.rar
Content-Type: application/x-www-form-urlencoded
Content-Length: 19
mirror=on&x=65&y=68

The "mirror=on&x=65&y=68" part is important, but I can't find where the x and y values are taken from. They seem to be random.

Header received back

HTTP/1.x 200 OK
Date: Tue, 16 Jun 2009 17:18:33 GMT
Connection: close
Content-Type: application/octet-stream
Accept-Ranges: bytes
Content-Disposition: Attachment; filename=test.rar
Content-Length: 5650015

Content-Length: 5650015 = filesize in bytes
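For reference, that request can be assembled as a single string for TCPSend() along these lines (untested; headers trimmed down to the essentials, and the host, path and body are the placeholder values from the capture above):

Local $sHost = "rs744tl2.rapidshare.com"
Local $sPath = "/files/353535353/35533535/test.rar"
Local $sBody = "mirror=on&x=65&y=68"
Local $sRequest = "POST " & $sPath & " HTTP/1.1" & @CRLF & _
        "Host: " & $sHost & @CRLF & _
        "Content-Type: application/x-www-form-urlencoded" & @CRLF & _
        "Content-Length: " & StringLen($sBody) & @CRLF & @CRLF & _
        $sBody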


POST request sent when you click the Download button on the last page of RS.

I tried this, although I really don't know what I'm doing, and probably did something wrong. Instead of returning the "Content-Length:", TCPRecv actually downloaded the file. I paused the script immediately after the TCPRecv, and before InetGet, and watched the file download on the Windows Task Manager Networking tab.

To any RS premium users attempting to verify their RS log, while the log often updates soon after or during a download, it can take 5 to 10 minutes for the log to add new IPs. Don't be tricked!

Pinchly


I tried this, although I really don't know what I'm doing, and probably did something wrong. Instead of returning the "Content-Length:", TCPRecv actually downloaded the file. I paused the script immediately after the TCPRecv, and before InetGet, and watched the file download on the Windows Task Manager Networking tab.

To any RS premium users attempting to verify their RS log, while the log often updates soon after or during a download, it can take 5 to 10 minutes for the log to add new IPs. Don't be tricked!

Pinchly

Actually, the server probably sent you the backheader I posted, with Content-Length, but the server keeps feeding the file afterwards for TCPRecv to get it.

For every loop of the TCPRecv I matched with "StringInStr($sRecv, @CRLF & @CRLF)", then TCPCloseSocket and ExitLoop on a successful match. @CRLF & @CRLF is the last line of the header sent from the server; data after that belongs to the file, I think. I wanted to escape the connection before the server started feeding me data so that I wouldn't trigger any "IP in use" or "15 min wait". But it seems to me that for a successful point reward it has to be a one-shot download. This has proven to be more complicated than I expected :\
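In other words, the receive loop looks roughly like this (untested sketch; $sHost and $sRequest stand for whatever host and request string have already been built):

TCPStartup()
Local $iSocket = TCPConnect(TCPNameToIP($sHost), 80)
TCPSend($iSocket, $sRequest)
Local $sRecv = ""
While 1
    $sRecv &= TCPRecv($iSocket, 2048)
    If @error Then ExitLoop                        ; connection dropped by the server
    If StringInStr($sRecv, @CRLF & @CRLF) Then     ; blank line = end of the header
        TCPCloseSocket($iSocket)                   ; bail out before the file body arrives
        ExitLoop
    EndIf
WEnd
TCPShutdown()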

Btw, I've set up another collector account for testing purposes; this way I don't have to worry about getting my real account disabled.


Actually, the server probably sent you the backheader I posted, with Content-Length, but the server keeps feeding the file afterwards for TCPRecv to get it.

For every loop of the TCPRecv I matched with "StringInStr($sRecv, @CRLF & @CRLF)", then TCPCloseSocket and ExitLoop on a successful match. @CRLF & @CRLF is the last line of the header sent from the server; data after that belongs to the file, I think. I wanted to escape the connection before the server started feeding me data so that I wouldn't trigger any "IP in use" or "15 min wait". But it seems to me that for a successful point reward it has to be a one-shot download. This has proven to be more complicated than I expected :\

I'll have to play with this when I get a chance, but right now I have some real downloads in progress.

Btw, I've set up another collector account for testing purposes; this way I don't have to worry about getting my real account disabled.

Good thinking, a very wise move!

Pinchly


Datenshi,

I found a way to get the approximate file size using the data already being obtained by RQ. It's hiding on the countdown page, after the mirror URLs, and is given in KB.

Example data (5000 KB is the file size):

<p class="downloadlink">http://rapidshare.com/files/987654321/Diamond5.zip <font style="color:#8E908F;">| 5000 KB</font></p>

Extract the file size and convert to bytes as follows:

#include <String.au3>   ; for _StringBetween()

; Grab everything between "downloadlink" and the first "KB", then isolate the number after "|"
$aSize = _StringBetween($sRecv, "downloadlink", "KB")
$aSize = _StringBetween($aSize[0] & "KB", "|", "KB")
$iSize = StringStripWS($aSize[0], 8) * 1024   ; strip all whitespace and convert KB to bytes

Again, this file size is not exact, but should be close enough for the progress calculations.

The same data is on the first page!

Pinchly


Yepp, I'm aware; I must have deleted the post where I mentioned it. Anyway, it's a solution that will of course work, but as you said it isn't 100% accurate. Seeing as the margin of error is most likely a fraction of a KB at most, it's not that big of a deal. RQ also uses the size for the disk space check, so I would probably have to add 1 KB to the value to be sure the file will fit. Such a small miscalculation would probably not be noticeable on the progress bar. On the other hand, this value can't be trusted to always be right (Rapidshare might screw something up), so I want to test every possible solution for getting the size in bytes straight from the source.

The $Size value is currently used for:

File Overwrite rules
Expected filesize match after DL
DiskSpaceChk
Progress bar+text % statistics

I'm probably gonna give up soon on the bytes value, and just go with the KB value on the first page. :D How nice would it be if InetGet returned the backheader?


Datenshi,

In case you haven't noticed, AutoIt v3.3.1.0 beta changes InetGet and _StringBetween (see below, or the help file). The good news is that InetGetInfo(handle, 1) returns the accurate file size (although the help file says the file size may not always be present), and a RapidPoint is awarded. The bad news is that InetGetInfo requires the download to be started with InetGet, which is too late for the File Overwrite Rules and DiskSpaceChk.

The return value of InetGet() has changed. It is important to read and understand the changes because it is possible to leak resources if InetGet() is used incorrectly.

InetGet("abort"), @InetGetActive and @InetGetBytesRead are now deprecated. The following list shows the new functions used to access the old behavior:

InetGet("abort") - Calling the new InetClose() function with a handle returned from InetGet() will abort a download. InetGet() handles must be closed or resources will leak.

@InetGetActive - Calling the new InetGetInfo() function with no parameters returns a count of active downloads.

@InetGetBytesRead - Calling the new InetGetInfo() function with a handle returned from InetGet() will return the bytes read (and more) for a download.

The FtpBinaryMode option set with AutoItSetOption() has been removed. Now InetGet() takes a flag to specify the transfer mode.

The last optional parameter for _StringBetween() has been removed.
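Putting those changes together, the new pattern looks roughly like this (untested; the URL and target file are just placeholders):

; Start a background download and get a handle back (new in the 3.3.1.x betas)
Local $hDownload = InetGet("http://rapidshare.com/files/987654321/Diamond5.zip", @TempDir & "\Diamond5.zip", 1, 1)
Local $iSize = InetGetInfo($hDownload, 1)   ; index 1 = file size; may be 0 if the server has not reported it yet
Do
    Sleep(250)
Until InetGetInfo($hDownload, 2)            ; index 2 = True once the download is complete
InetClose($hDownload)                       ; handles must be closed or resources leak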

I find that InetClose and InetGetInfo(handle, 2), and perhaps others, throw the error "Unknown function name" when a beta v3.3.1.1 compiled script is executed. It's okay if the script is run uncompiled. I made a bug report.

Pinchly


Thanks for the tip, sounds like interesting changes and probably a huge improvement.

I find that InetClose and InetGetInfo(handle, 2), and perhaps others, throw the error "Unknown function name" when a beta v3.3.1.1 compiled script is executed. It's okay if the script is run uncompiled. I made a bug report.

Did you compile with the beta compiler?

Did you compile with the beta compiler?

Yes, I used the beta compiler. Valik already fixed it and said version 3.3.1.2 beta will contain the fix. If you play with v3.3.1.1 beta, just run the script without compiling.

Pinchly


To fix some issues in Vista/Win7, simply add #RequireAdmin to the head of your script...

I was getting an error saying that LinkList.dat didn't have any RS links... x.x

I figured it was a permission issue, and sure enough I tagged that #RA to the head, re-compiled, and it works... :)
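In case it helps anyone, it really is just a one-line directive at the top of the script:

#RequireAdmin   ; relaunches the script with admin rights (UAC prompt on Vista/Win7)
; ...the rest of the script follows unchanged...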



  • 2 weeks later...

Datenshi,

Great script! I have customized your script to add tray icons to display the status. The summary of code modifications and source of icons is below. Maybe an option for future versions?

; Prior to displaying the form, verify whether the icons are installed
Global $UseIcons = False
If FileExists(@ScriptDir & "\down.ico") Then $UseIcons = True

; Following each current TraySetToolTip(), add the line below with the appropriate icon
If $UseIcons Then TraySetIcon("next.ico")

; Icon Source:
; http://www.iconarchive.com/category/system/aesthetica-2-icons-by-dryicons.html

; Suggested icons:
; NEXT.ICO      => passing pages, skipping files
; DOWN.ICO      => downloading file
; REPEAT.ICO    => seeking fastest mirror, DHCP refresh
; PAUSE.ICO     => waiting
; ERROR.ICO     => errors

Been busy lately, but I think I found a way around the RS point bug; actually a pretty good solution, using the new Rapidshare API.

http://api.rapidshare.com/cgi-bin/rsapi.cgi?sub=checkfiles_v1&files=value&filenames=value

Return format example = 244887530,test.rar,5650015,744,1,tg,0 (the reply fields are described in the API doc excerpt below).


subroutine=checkfiles_v1
Description:    Gets status details about a list of given files. (files parameter limited to 10000 bytes. filenames parameter limited to 100000 bytes.)
Parameters: files=comma separated list of file ids
        filenames=comma separated list of the respective filename. Example: files=50444381,50444382 filenames=test1.rar,test2.rar
        incmd5=if set to 1, field 7 is the hex-md5 of the file. This will double your points! If not given, all md5 values will be 0
Reply fields:   1:File ID
        2:Filename
        3:Size (in bytes. If size is 0, this file does not exist.)
        4:Server ID
        5:Status integer, which can have the following values:
            0=File not found
            1=File OK (Downloading possible without any logging)
            2=File OK (TrafficShare direct download without any logging)
            3=Server down
            4=File marked as illegal
            5=Anonymous file locked, because it has more than 10 downloads already
            6=File OK (TrafficShare direct download with enabled logging. Read our privacy policy to see what is logged.)
        6:Short host (Use the short host to get the best download mirror: http://rs$serverid$shorthost.rapidshare.com/files/$fileid/$filename)
        7:md5 (See parameter incmd5 in parameter description above.)
Reply format:   integer,string,integer,integer,integer,string,string

This actually means I can get a bunch of good information about the file before waiting for anything, so RapidQueuer will be much faster. I need to read further about the "points" system they have in place to combat server overload, so that there's no risk of getting IP blocked if too many API requests are made too fast (edit: read about it, no way it'll happen unless you're intentionally trying to flood the API server). I'm also deciding whether or not to request the MD5 value of the file, which could be compared against a file that already exists on disk.
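Roughly, a single lookup boils down to something like this (untested sketch, not RQ's actual code; the file ID and name are the placeholder values from the example above):

#include <Inet.au3>   ; for _INetGetSource()

Local $sFileID = "244887530", $sFileName = "test.rar"
Local $sQuery = "http://api.rapidshare.com/cgi-bin/rsapi.cgi?sub=checkfiles_v1" & _
        "&files=" & $sFileID & "&filenames=" & $sFileName
Local $sReply = _INetGetSource($sQuery)        ; e.g. "244887530,test.rar,5650015,744,1,tg,0"
Local $aField = StringSplit($sReply, ",")
If $aField[0] >= 7 Then
    Local $iSize   = Number($aField[3])        ; size in bytes (0 = file does not exist)
    Local $iStatus = Number($aField[5])        ; 1 or 2 = downloadable
    Local $sMirror = "http://rs" & $aField[4] & $aField[6] & ".rapidshare.com/files/" & $aField[1] & "/" & $aField[2]
EndIf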

Datenshi,

Great script! I have customized your script to add tray icons to display the status. The summary of code modifications and source of icons is below. Maybe an option for future versions?

; Prior to displaying the form, verify whether the icons are installed
Global $UseIcons = False
If FileExists(@ScriptDir & "\down.ico") Then $UseIcons = True

; Following each current TraySetToolTip(), add the line below with the appropriate icon
If $UseIcons Then TraySetIcon("next.ico")

; Icon Source:
; http://www.iconarchive.com/category/system/aesthetica-2-icons-by-dryicons.html

; Suggested icons:
; NEXT.ICO      => passing pages, skipping files
; DOWN.ICO      => downloading file
; REPEAT.ICO    => seeking fastest mirror, DHCP refresh
; PAUSE.ICO     => waiting
; ERROR.ICO     => errors

Aight man, I'll check it out, thx.

Been busy lately, but I think I found a way around the RS point bug; actually a pretty good solution, using the new Rapidshare API.

http://api.rapidshare.com/cgi-bin/rsapi.cgi?sub=checkfiles_v1&files=value&filenames=value

Return Format example = 244887530,test.rar,5650015,744,1,tg,0

Nice work Datenshi! I just ran 7 tests, using files in my Premium account and files from others, and the RS API worked perfectly. It returned the true file sizes, and you will be happy to know that this method does NOT charge the IPs that I used against my Premium account :) !

Since multiple files can be queried at the same time, your script could gather data about a bunch of files, perhaps all of them, long before it attempts to download them. That would keep the number of API calls to a minimum. Some files may become unavailable by the time RQ finally gets to them, but I think RQ already handles that condition.
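Batching would just mean building comma-separated lists for the files and filenames parameters, something like this (the arrays here are hypothetical; in RQ they would come from the parsed link list):

#include <Array.au3>   ; for _ArrayToString()

Local $aFileID[3]   = ["50444381", "50444382", "50444383"]
Local $aFileName[3] = ["test1.rar", "test2.rar", "test3.rar"]
Local $sQuery = "http://api.rapidshare.com/cgi-bin/rsapi.cgi?sub=checkfiles_v1" & _
        "&files=" & _ArrayToString($aFileID, ",") & _
        "&filenames=" & _ArrayToString($aFileName, ",")
; The reply should come back as one line per file, ready to split on @LF and then on ","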

I'm glad you unraveled the RS API because if you had not I would not have bothered trying to understand it.

For those interested, the RS API "manual" is at

http://images.rapidshare.com/apidoc.txt

Pinchly


I've changed the code to use the Rapidshare API, and added small things like a tray menu item with "Open DL folder", and changed the "servername" info so it shows the proper name of the mirror provider, such as "Telia Sonera", instead of the hostname. I'm looking into changing the download speed calculations; it feels to me like the value fluctuates a bit too much. I'm thinking of an array[10] filled with speeds sampled at some interval (500 ms to 1 s), then calculating the average of the array, roughly as sketched below. But then again I'm not sure if I'll implement it for v2.3. Would be cool with some more feature requests. I'm kind of satisfied with the current RQ for my own needs, so I'm having a hard time coming up with ideas.
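Something along these lines (just a sketch; $hDownload stands for the handle of the running background InetGet, and this would live inside the existing progress loop):

Local $aSample[10] = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
Local $iIndex = 0, $iLastBytes = 0, $iAvgSpeed = 0, $iBytes = 0, $iSum = 0
Do
    Sleep(1000)                              ; sample once per second
    $iBytes = InetGetInfo($hDownload, 0)     ; index 0 = bytes read so far
    $aSample[$iIndex] = $iBytes - $iLastBytes ; bytes transferred during this sample
    $iLastBytes = $iBytes
    $iIndex = Mod($iIndex + 1, 10)
    $iSum = 0
    For $i = 0 To 9
        $iSum += $aSample[$i]
    Next
    $iAvgSpeed = $iSum / 10                  ; smoothed bytes/s for the progress text
Until InetGetInfo($hDownload, 2)             ; index 2 = download complete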


Would be cool with some more feature requests..

I have an idea. I've been to sites with Rapidshare links that are hidden behind other links, so you cannot just copy them into RapidQueuer. Maybe you could make it so that we can paste a web address and RapidQueuer will grab the source of that page and pull all of the Rapidshare links off of it; perhaps this could come with a dialog box to deselect undesired links.
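Roughly what I'm imagining (untested, and the page URL is just a placeholder):

#include <Inet.au3>   ; for _INetGetSource()

Local $sSource = _INetGetSource("http://example.com/page-full-of-links.html")
Local $aLinks = StringRegExp($sSource, "http://rapidshare\.com/files/\d+/[^""'<\s]+", 3)
If Not @error Then
    ; $aLinks now holds every matched RS link, ready for a select/deselect dialog
EndIf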

MUHAHAHAHAHA

