JonnyQuy Posted January 17, 2018
How do I delete/filter duplicate lines in the same txt file? For example:

http://www.link1.com
http://www.link2.com
http://www.link3.com
http://www.link4.com
http://www.link1.com
http://www.link2.com
http://www.link3.com
http://www.link4.com
Jos (Developers) Posted January 17, 2018
Cookie with that?
Earthshine Posted January 17, 2018
lol, there are active threads dealing with this right now. Nobody ever reads anymore.
benners Posted January 17, 2018
- Read the file to an array using FileReadToArray
- Remove the duplicates with _ArrayUnique
- Then process the array returned by _ArrayUnique to do what you want with the links
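A minimal sketch of that approach, assuming the links sit one per line in link.txt (the input and output file names here are just placeholders):

#include <Array.au3>
#include <File.au3>

; read every line of the link file into an array
Local $aLinks = FileReadToArray(@ScriptDir & '\link.txt')
; drop the duplicate entries
Local $aUnique = _ArrayUnique($aLinks)
; write the de-duplicated links back out, skipping the count held in element 0
_FileWriteFromArray(@ScriptDir & '\link_unique.txt', $aUnique, 1)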
AspirinJunkie Posted January 18, 2018
Maybe with StringRegExpReplace: https://regex101.com/r/z6BWJD/1
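The pattern behind the regex101 link is the one benners applies two posts further down; as a rough standalone illustration of the same lookahead idea (the sample links and this exact pattern are only assumptions, with one link per line):

; delete a link line when the identical line appears again later in the string,
; so only the last copy of each duplicate survives
Local $sLinks = "http://www.link1.com" & @LF & _
        "http://www.link2.com" & @LF & _
        "http://www.link1.com" & @LF
Local $sUnique = StringRegExpReplace($sLinks, '(?mi)^(http:\/\/www\.[^\r\n]+)\R(?=[\s\S]*^\1$)', '')
ConsoleWrite($sUnique) ; link2 and link1 remain, each once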
JonnyQuy Posted January 18, 2018 Author Share Posted January 18, 2018 thank you very much i have done it: D Link to comment Share on other sites More sharing options...
JonnyQuy Posted January 18, 2018
How do I save it?
Nunos Posted January 18, 2018
What format is it in now? Array? String? Perhaps if you post your code someone can help you better.
benners Posted January 18, 2018
Using the StringRegExpReplace method. Credit to AspirinJunkie and Mikell.

#include <File.au3>

; open the file that the results will be written to
Local $h_File = FileOpen(@ScriptDir & '\done.txt', $FO_OVERWRITE + $FO_CREATEPATH)

; write the text with the duplicates removed
FileWrite($h_File, _
        StringRegExpReplace( _ ; start the string replacement
        StringRegExpReplace(FileRead(@ScriptDir & '\links.txt'), '(?msx)(?i)(^http:\/\/www\. .+$)(?= .* \1)', ''), _ ; remove the duplicate links, credit to AspirinJunkie
        "(?m)^\s*$\R?", "")) ; remove the blank lines, credit to Mikell https://www.autoitscript.com/forum/topic/191985-how-to-delete-blank-lines-in-txt-file-with-stringregexpreplace-method/?do=findComment&comment=1377392

If using an array then use _FileWriteFromArray()
JonnyQuy Posted January 19, 2018
I did it like this, but it does not save:

$link = FileReadToArray(@ScriptDir & '\link.txt')
$unique = _ArrayUnique($link)
FileWrite(@ScriptDir & '\chayngay.txt', $unique)
_ArrayDisplay($unique)
JonnyQuy Posted January 19, 2018
18 hours ago, benners said: [quotes the StringRegExpReplace method from the post above]
I used this method to filter, and I re-filtered it again, but it does not save the file.
benners Posted January 19, 2018
I have tried these two methods and they both save a file.

Method 1

#include <File.au3>

; open the file that the results will be written to
Local $h_File = FileOpen(@ScriptDir & '\chayngay.txt', $FO_OVERWRITE + $FO_CREATEPATH)

; write the text with the duplicates removed
FileWrite($h_File, _
        StringRegExpReplace( _ ; start the string replacement
        StringRegExpReplace(FileRead(@ScriptDir & '\link.txt'), '(?msx)(?i)(^http:\/\/www\. .+$)(?= .* \1)', ''), _ ; remove the duplicate links, credit to AspirinJunkie
        "(?m)^\s*$\R?", "")) ; remove the blank lines, credit to Mikell https://www.autoitscript.com/forum/topic/191985-how-to-delete-blank-lines-in-txt-file-with-stringregexpreplace-method/?do=findComment&comment=1377392

FileClose($h_File) ; close the file

Method 2, using the code in post #10. FileWrite has been changed to _FileWriteFromArray()

#include <Array.au3>
#include <File.au3>

$link = FileReadToArray(@ScriptDir & '\link.txt')
$unique = _ArrayUnique($link)
_FileWriteFromArray(@ScriptDir & '\chayngay.txt', $unique, 1)
_ArrayDisplay($unique)