RapidQueuer 2.4.4


Datenshi

This project of yours keeps getting better and better. Thank you greatly for putting so much time and effort into it.

One small question....

On/Off the Progress Window with tray, default is OFF

I tried changing your code from $ProgressOn = 0 to 1, thinking that this would change the default to ON, but it didn't work.

I would personally like the Progress Window to default to ON. How can I do this?

Thanks again.


Fmen, search the source for " For $x = 0 to Ubound($DownloadLinkList) -1 "

Paste this BEFORE:

ProgressOn("Progress Window","Waiting for next status update..","",1,1,18)
TrayItemSetState($TrayProgressOn,1); checked
$ProgressOn = 1

Edit: sorry, I wrote AFTER instead of BEFORE :)

Edited by Datenshi

Excellent!!

Now it is perfect.....for me.

Thank you,


  • 2 weeks later...

I have no idea if this is of any value to RapidQueuer or not, but RS released their API to the public today. See what they say at the following (which has a hyperlink to the API documentation):

http://rapidshare.com/news.html
I read it, and my first thought was "why did they release this?" It turns out the API is all about uploading to Rapidshare, with a couple of modifications available to premium accounts.

As of now, RapidQueuer doesn't do uploading and probably never will. :) Anyway, thanks for informing me.


This is great, man, but if it supported easyshare and hotfile too, it would be a great downloader!

Sorry, those two sites will most likely never be supported. If I choose to support any other site, it'll be Megaupload. Too bad they still have their annoying captcha, which changes every now and then. I would say Rapidshare and Megaupload are the two biggest file hosting sites right now, and since I'm the only developer of RapidQueuer, it'll take quite some time and effort to add support for Megaupload, especially with their captcha.


OK, I'm having problems for the first time.

Here's what's happening....

When I place valid links in the linklist.dat, RapidQueuer cannot find them.

Monitoring the clipboard for links results in the same problem.

Am I the only one with this problem? Thanks


Hm, I have not had that problem. Does this happen with all links? It could be an unusual filename that doesn't match the regexp string. Could you PM me the link(s) you're having problems with, and I'll look into it.
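For context, Rapidshare links follow the http://rapidshare.com/files/&lt;id&gt;/&lt;filename&gt; shape seen elsewhere in this thread. A rough Python sketch of such a link matcher (the actual regexp string in RapidQueuer's source may be stricter, which is exactly why an unusual filename could slip through):

```python
import re

# Hypothetical approximation of RapidQueuer's link matcher; the real
# pattern in the AutoIt source may differ.
RS_LINK = re.compile(r"https?://(?:www\.)?rapidshare\.com/files/(\d+)/(\S+)")

def find_rs_links(text):
    """Return (file_id, filename) tuples for every RS link found in text."""
    return RS_LINK.findall(text)

links = find_rs_links("see http://rapidshare.com/files/204450618/Thanks.rar here")
print(links)  # [('204450618', 'Thanks.rar')]
```

A filename containing characters outside `\S` (or a pattern that demands a specific extension) would fail to match, which matches the "unusual filename" theory above.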

I guess it is my own personal problem, then.

Yes, it happens with each and every link. The links are definitely good.

I have tried using earlier versions of RapidQueuer and on different computers... the same thing happens.

I will try and decode the problem. Thanks for the feedback.


  • 3 weeks later...

Thanks for sharing your script; it's very nice and I hope to learn from it. I used it to download several files and it worked quite well after modifying the function DHCP_IPRefresh to RunWait another script in my collection, which resets my particular router to obtain a new IP.

Here are my observations which I hope you find useful:

1. When the script begins the first download, it should check to see if a change in IP is required to avoid the 15 minute waiting period. Currently it simply obeys whatever waiting period RS imposes.

2. If I'm correct, the "Donate Download" option is currently ineffective in generating a RS point for you in appreciation for using your script. Obviously, I cannot see your tally, so I tried it with one of my 5.1 MB files and no point was awarded. I thought this might be because I have a premium membership, so I tried it on a friends computer, who is not a premium member, and had the same negative result.

Thanks again for sharing your really nice script.

Pinchly


Glad you like it Pinchly :D

1. When the script begins the first download, it should check to see if a change in IP is required to avoid the 15 minute waiting period. Currently it simply obeys whatever waiting period RS imposes.

I'll look into this; sounds like something I did intentionally.

2. If I'm correct, the "Donate Download" option is currently ineffective in generating a RS point for you in appreciation for using your script. Obviously, I cannot see your tally, so I tried it with one of my 5.1 MB files and no point was awarded. I thought this might be because I have a premium membership, so I tried it on a friends computer, who is not a premium member, and had the same negative result.

Correct, the current file isn't generating points for me, so the "Donate Download" option can be ignored until it's fixed. Rapidshare tricked me by giving me 3 points earlier, so I thought it worked. The size of Thanks.rar is 6.4 MB, btw. I'm thinking there's something wrong with Rapidshare's system.

"Only files stored in the Collector's Zone (here) can score points. Once they are downloaded by a free-user, each of them generates a RapidPoint. Subject to the condition is that the file is at least 5 Megabyte in size and the downloader has generated not more than 3 RapidPoints in the last hours. Premium users can generate for you up to 255 points per day." // rapidshare

Edited by Datenshi

I'm thinking there's something wrong with Rapidshare's system.

I know for certain that one of your competitors' RS downloaders does generate a point for the uploader, for Collector's or Premium users. I just double-checked that program, and it gave me a point. I also know that program is written in AutoIt3, so there must be a way. I sent you a message with the name of that program.

"Only files stored in the Collector's Zone (here) can score points. Once they are downloaded by a free-user, each of them generates a RapidPoint. Subject to the condition is that the file is at least 5 Megabyte in size and the downloader has generated not more than 3 RapidPoints in the last hours. Premium users can generate for you up to 255 points per day.

This also applies to free users downloading from a premium account. See http://rapidshare.com/faq.html which states the following under "What are Rapid Points":

Each time a Free Users downloads your files, you gain points. You get a RapidPoint for each download, provided that the file is bigger than 5 megabyte and the user has not generated more than 3 RapidPoints in the last hour. Premium users can generate for you up to 255 points per day. Your files have to be stored in the Collector's Zone or the Premium Zone so that we can assign the download to your account

I wish you luck in figuring it out as you deserve some donations!

Pinchly


Thanks Pinchly. There was some guy advertising that elephant software when I had just started this thread, but I don't really see it as a competitor. That software tries to build a network to generate points for its users; I don't know how successful it is in that particular area, but I tried the software and it felt so bloated that I was scared to even open it. There was advertising and all that crap in the software, and the coding felt really inefficient. Their project is closed-source while mine is open, and I'm a bit paranoid when it comes to bloated software with a closed source. Part of the charm of RapidQueuer is the fact that it's open-source, and people on here usually have a bit of AutoIt knowledge; it allows them to customize it to their needs and remove things they don't find useful.

Further, I'm by no means trying to make money off RapidQueuer. The point of the "Donate Download" is first of all to generate enough points that I could obtain a premium account, but also to get a rough idea of RapidQueuer's usage; I imagine people send it to friends and family themselves, so the DL counter on here isn't accurate when it comes to usage. That is also why I left it as an option.

But as I see it, RapidQueuer is mostly a project meant for learning. I kind of started writing it because I needed it for a particular site that hosts a lot of files on Rapidshare, which works wonders as a test site as well :D

I also take happiness in the fact that other people find it useful and, in some cases, take the time to thank me or suggest features that further my AutoIt knowledge and the RapidQueuer project. I had hoped that other people would chime in and code features which could be included, but most people are perfectly satisfied with a downloader that's easy and just gets them the file, without the extra stuff.

Edited by Datenshi

Datenshi,

Below is a bare-minimal script, based upon your RapidQueuer script, whose only purpose is to demonstrate that it will result in one RS Point being awarded to a Collector's or Premium account. It has very minimal error checking and requires a fresh IP before it is used (otherwise it will fail unless it has been longer than 15 minutes since the last RS download as a free user). The URL is hard-coded to your Thanks.rar file.

I spent my time getting the RS Point to be awarded, and I do not know what is preventing it in your RQ script. Hopefully my script will assist you in resolving that issue.

To test, do the following:

1. Refresh your RS log-in page and take note of your current Point tally as well as the download count for each of your downloads.

2. Obtain a new IP.

3. Run the script.

4. When the script completes, refresh your RS log-in page and examine the new Point tally. You should be one point richer, and only your Thanks.rar file should reflect it was downloaded one more time! If others have increased, your point tally should also reflect them.

The downloaded file is deposited into the same directory as the script.

CODE
;~ _Test-Getting-RS-Point.au3
;~ By pinchly, 2009/06/14
;~ Based upon original coding by Datenshi
;~ Reference: AutoIt Forum thread: http://www.autoitscript.com/forum/index.ph...=29631&st=0
;~
;~ This script is stripped down to the bare minimum, with minimal error
;~ checking, no support for proxies, and assumes a fresh IP is current,
;~ which avoids the long wait period between RS downloads. The intent is
;~ only to show that an RS point will be awarded to premium users and collectors
;~ when one of their 5,000,000 byte files is downloaded by a free user using
;~ this script.
;~
;~ The URL is hard coded to the Thanks.rar belonging to Datenshi.

;; Specify the test url here.
$sStartURL = "http://rapidshare.com/files/204450618/Thanks.rar"

$iReply = MsgBox(4097, "Test-Get-RS-Point", "Test acquiring one RS point using the following URL:" & @LF & @LF & $sStartURL & @LF & @LF & "Please ensure a new IP has been obtained" & @LF & "before clicking OK")
If $iReply = 2 Then Exit

#include <String.au3>

$iPort = "80"
TCPStartup()

;; Step 1:
;; Get the 1st web page from RS.
;; The following obtains the RS web page containing the "Free user" and
;; "Premium user" buttons for the file specified in $sStartURL.
SplashTextOn("Obtaining data", "Obtaining free user download data for " & @LF & $sStartURL, 500, 60, -1, -1, 16)
$sHost = StringMid($sStartURL, StringInStr($sStartURL, "/", 0, 2) + 1)
$sHost = StringLeft($sHost, StringInStr($sHost, "/", 0, 1) - 1)
$sPage = StringMid($sStartURL, StringInStr($sStartURL, "/files", 0, 1))
$KeepAliveType = "Keep-Alive: 300" & @CRLF & "Connection: keep-alive" & @CRLF
$sPost = "dl.start=Free"
$sHeader = "POST " & $sPage & " HTTP/1.1" & @CRLF & _
        "Host: " & $sHost & @CRLF & _
        "User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.5) Gecko/2008120122 Firefox/3.0.5 (.NET CLR 3.5.30729)" & @CRLF & _
        "Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8" & @CRLF & _
        "Accept-Language: en-us,en;q=0.5" & @CRLF & _
        "Accept-Encoding: " & @CRLF & _
        "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7" & @CRLF & _
        $KeepAliveType & _
        "Referer: " & $sStartURL & @CRLF & _
        "Content-Length: " & StringLen($sPost) & @CRLF & @CRLF & _
        $sPost
$iIP = TCPNameToIP($sHost)
$iSocket = TCPConnect($iIP, $iPort)
$iSend = TCPSend($iSocket, $sHeader)

;; Receive data back from the server after the request sent above.
While 1
    $sRecv = TCPRecv($iSocket, 4096)
    $iErr = @error
    If $sRecv <> '' Then
        While 1
            $sRecv &= TCPRecv($iSocket, 4096)
            If @error Then ExitLoop 2
        WEnd
    EndIf
WEnd
SplashOff()

;; Step 2:
;; Extract the URL for the "Free user" button from the received data in step 1.
$aSubmitServer = _StringBetween($sRecv, 'action="', '" method')
$sFreeUserURL = $aSubmitServer[0]

;; Step 3:
;; Submit the URL from step 2 in order to receive the countdown page
;; and the actual URL for the download.
$sHost = StringMid($sFreeUserURL, StringInStr($sFreeUserURL, "/", 0, 2) + 1)
$sHost = StringLeft($sHost, StringInStr($sHost, "/", 0, 1) - 1)
$sPage = StringMid($sFreeUserURL, StringInStr($sFreeUserURL, "/files", 0, 1))
$KeepAliveType = "Keep-Alive: 300" & @CRLF & "Connection: keep-alive" & @CRLF
$sPost = "dl.start=Free"
$sHeader = "POST " & $sPage & " HTTP/1.1" & @CRLF & _
        "Host: " & $sHost & @CRLF & _
        "User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.5) Gecko/2008120122 Firefox/3.0.5 (.NET CLR 3.5.30729)" & @CRLF & _
        "Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8" & @CRLF & _
        "Accept-Language: en-us,en;q=0.5" & @CRLF & _
        "Accept-Encoding: " & @CRLF & _
        "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7" & @CRLF & _
        $KeepAliveType & _
        "Referer: " & $sFreeUserURL & @CRLF & _
        "Content-Length: " & StringLen($sPost) & @CRLF & @CRLF & _
        $sPost
$iIP = TCPNameToIP($sHost)
$iSocket = TCPConnect($iIP, $iPort)
$iSend = TCPSend($iSocket, $sHeader)

;; Receive data back from the server after the request sent above.
While 1
    $sRecv = TCPRecv($iSocket, 4096)
    $iErr = @error
    If $sRecv <> '' Then
        While 1
            $sRecv &= TCPRecv($iSocket, 4096)
            If @error Then ExitLoop 2
        WEnd
    EndIf
WEnd

;; Step 4:
;; The 2nd page is the countdown page. Extract the countdown duration in
;; order to wait the proper amount of time for the countdown to reach zero.
$aSecondsToWait = _StringBetween($sRecv, "var c=", ";")
$iSecondsToWait = $aSecondsToWait[0] + 1

;; Step 5:
;; The 2nd page also contains the actual URL to be downloaded.
;; Extract it and pause for the countdown duration.
$aSubmitServer = _StringBetween($sRecv, 'action="', '" method')
$sDownloadURL = $aSubmitServer[0]
SplashOff()
ProgressOn("Waiting to download...", "Waiting " & $iSecondsToWait & " seconds to begin download...", $iSecondsToWait & " seconds to go..." & @LF & @LF & "(RS mandated Free User wait period)")
For $i = 1 To 100 Step 100 / $iSecondsToWait
    Sleep(1000)
    ProgressSet($i, Int($iSecondsToWait - ($i / 100 * $iSecondsToWait)) & " seconds to go..." & @LF & @LF & "(RS mandated Free User wait period)")
Next
ProgressSet(100, "0 seconds to go!" & @LF & "Beginning download")
Sleep(500)
ProgressOff()

;; Step 6:
;; Perform the actual download.
$sDownloadFilename = @ScriptDir & "\" & StringMid($sDownloadURL, StringInStr($sDownloadURL, "/", 0, -1) + 1)
SplashTextOn("Downloading", "Downloading " & @LF & $sDownloadURL & @LF & "to " & $sDownloadFilename, 500, 85, -1, -1, 16)
InetGet($sDownloadURL, $sDownloadFilename)
SplashOff()

;; Cleanup and exit.
TCPCloseSocket($iSocket)
TCPShutdown()
MsgBox(4096, "Download complete", "Download complete" & @LF & "Please check your RS point total")
Exit
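The heart of the script above is the two _StringBetween extractions: the form's action URL and the "var c=" countdown value. For readers who don't write AutoIt, the same scraping steps can be sketched in Python; the HTML fragment below is invented to mirror the shape of the pages the script parses, not a real Rapidshare response:

```python
import re

def string_between(haystack, start, end):
    """Rough Python analogue of AutoIt's _StringBetween (first match only)."""
    m = re.search(re.escape(start) + r"(.*?)" + re.escape(end), haystack, re.S)
    return m.group(1) if m else None

# Made-up fragment shaped like the countdown page the script expects.
page = ('<form action="http://rs123.example/files/1/x.rar" method="post">'
        '<script>var c=47;</script>')

free_user_url = string_between(page, 'action="', '" method')
seconds_to_wait = int(string_between(page, "var c=", ";")) + 1  # +1 as in Step 4

print(free_user_url)     # http://rs123.example/files/1/x.rar
print(seconds_to_wait)   # 48
```

The non-greedy `(.*?)` mirrors _StringBetween's "shortest span between the two markers" behaviour.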

BTW, no points are currently being awarded to any uploader for any downloads performed by RQ. That is unfortunate for them, and another reason to get RQ fixed.

Cheers,

Pinchly

Edited by pinchly

Thanks man. I didn't think of the fact that it doesn't award points for other downloads. You're right, this should be fixed :D I'll use your script to find out what's going on.

Just checked with your script; it generated a point as you said. Hm, RQ does count towards the number of total downloads but doesn't generate the point. Pretty weird.

Alright, I've been trying to squash this tonight. It seems to be related to the InetGetSize request before the download.

Edited by Datenshi

Datenshi,

You're on the right track! I just returned to tell you what I found. It's really simple, and will take you less than a minute to fix.

Replace the following line in two locations:

$iSize = InetGetSize($FastestMirror)

with

$iSize = InetGetSize($DownloadLinkList[$x])

I don't know why, but when InetGetSize is used with the actual download link (the same one used by InetGet), regardless of whether it's used before or after InetGet, something mysterious happens and the point is not awarded. I suspect RS intercepts two requests for the same URL and does whatever they are doing.

Cheers,

Pinchly
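The workaround above avoids issuing a second request for the same URL. Another way to learn a file's size without a second full GET would be to read the Content-Length header off a response. As a sketch of the parsing side only (not from the thread, and using a made-up response; header names are case-insensitive per the HTTP spec):

```python
def content_length(raw_response):
    """Extract the Content-Length value from a raw HTTP response, or None.

    Splits the headers off at the blank line, then scans each header
    line; names are compared case-insensitively.
    """
    head = raw_response.split("\r\n\r\n", 1)[0]
    for line in head.split("\r\n")[1:]:  # skip the status line
        name, _, value = line.partition(":")
        if name.strip().lower() == "content-length":
            return int(value.strip())
    return None

# Invented sample; 6710886 bytes is roughly the 6.4 MB mentioned for Thanks.rar.
sample = ("HTTP/1.1 200 OK\r\n"
          "Content-Type: application/octet-stream\r\n"
          "Content-Length: 6710886\r\n"
          "\r\n")
print(content_length(sample))  # 6710886
```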


Datenshi,

CAUTION!!!

I just viewed my RS logs and found a number of recent entries. While they are all for my IPs, none should have been recorded, because all were from using RQ since I revised it as stated above and performed several tests to make sure it was working.

I recommend checking your log after each download to make sure this is not happening to you, as RS will cancel your password if you have too many IPs for the day.

Cheers,

Pinchly


Alright, I'm on a Collector's account, so I don't think I can check any logs; at least I can't seem to find one. But I haven't been downloading my file that much, split over 3 IPs. I pretty much compared your header string to mine and then went ahead and commented out every line communicating with the RS server until I got to the InetGetSize. In the end the bug wasn't THAT evasive; luck, maybe? :D It sure was weird though, but Rapidshare has a complicated system of servers, and while coding RQ I've come across a few weird behaviors. Before 2.2 I had a bug that I tried to track down for a week; in the end it magically disappeared, and I'm still not sure what squashed it.

You deserve a big thank you for the help :D And I'll add you to the credits.

btw, $iSize = InetGetSize($DownloadLinkList[$x]) would return the size of the first RS page, I think, so it'll screw up calculations and size matches; I just need a way around it to get the size. I'm thinking it'll work by sending the InetGetSize to a different hosting server, since they store the same files across a couple of servers. It'll be simple if it works, but I'll have to try it tomorrow; the clock says 4 AM here ;)
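The different-server idea amounts to rewriting the host of the download URL before the size request, so the size check and the actual download never hit the same URL. A Python sketch of that URL rewrite; the mirror host names here are entirely hypothetical, since the real mirror naming scheme isn't given in the thread:

```python
from urllib.parse import urlsplit, urlunsplit

def swap_mirror(download_url, alternate_host):
    """Point a download URL at a different (hypothetical) mirror host.

    Query the size on one mirror while InetGet downloads from another,
    so the server never sees two requests for the same URL.
    """
    parts = urlsplit(download_url)
    # Keep scheme, path, query, and fragment; replace only the host.
    return urlunsplit((parts.scheme, alternate_host) + tuple(parts[2:]))

url = "http://rs42.rapidshare.com/files/204450618/Thanks.rar"
print(swap_mirror(url, "rs42b.rapidshare.com"))
# http://rs42b.rapidshare.com/files/204450618/Thanks.rar
```

Whether the mirrors actually serve identical bytes is the open question Datenshi planned to test.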

Edited by Datenshi