RTFC

MVPs
  • Posts

    1,039
  • Joined

  • Last visited

  • Days Won

    17

RTFC last won the day on May 11 2022

RTFC had the most liked content!

3 Followers

About RTFC

Profile Information

  • Member Title
    Seeker


RTFC's Achievements

  1. Good day to you too. My day was crap, thank you for asking. A1: first read this. Given you're running in a 64-bit environment, call _WinAPI_Wow64EnableWow64FsRedirection with parameter True if you wish to use the legacy 32-bit Windows system DLLs, and False (or don't call it at all) if you know you can use the newer 64-bit versions that are native to x64 Windows. Basically, if your own code relies on the old 32-bit Windows DLLs, you'd want to set this; otherwise leave it be. A2: the #include keyword followed by a filename (in quotation marks, or between angled brackets if it's a native AutoIt #include) does exactly that (assuming the file is found, that is): it takes the contents of that file and inserts them as one giant copy-paste into the master script that will be compiled/executed. Most includes are function libraries whose functions can thereafter be called by other parts of your code, but you could use it to insert the complete works of Shakespeare if you like (albeit as a (rather large) comment section, otherwise the compiler will complain, because it hates Shakespeare (as it doesn't really understand it)). Note that the included text (whatever it is) is inserted at the exact point where the #include statement was in your original script, which can be important (if you're #including raw lines of code, for example). So don't place an #include in the middle of another function unless you know what you're doing. You can run Au3Stripper (either at the cmdline or from (the full) SciTE4AutoIt3) to produce a single script with all required parts of the #includes added (and the #include statements removed). Compare and contrast with the original source for hours of fun.
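A minimal sketch of toggling that redirection around legacy calls; this is my own illustration (not from the post), assuming _WinAPI_Wow64EnableWow64FsRedirection lives in WinAPIFiles.au3 and that you're running 32-bit AutoIt on 64-bit Windows (the only situation where redirection applies):

```autoit
#include <WinAPIFiles.au3> ; assumed home of _WinAPI_Wow64EnableWow64FsRedirection

; Redirection only matters for a 32-bit process on 64-bit Windows
If @OSArch = 'X64' And Not @AutoItX64 Then
    _WinAPI_Wow64EnableWow64FsRedirection(False) ; see the real (64-bit) System32
    ConsoleWrite(FileExists(@WindowsDir & '\System32\notepad.exe') & @CRLF)
    _WinAPI_Wow64EnableWow64FsRedirection(True)  ; restore default mapping (System32 -> SysWOW64)
EndIf
```

Restoring the default state immediately afterwards is deliberate: leaving redirection disabled can break unrelated library code that expects the standard 32-bit view.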
  2. I think you may have misinterpreted/overestimated the (very limited) "command" functionality its editor implements; these are mostly startup cmdline options that you can set/change at runtime, and stuff like taking screenshots at some specific resolution. To do the things you want to do in a development environment (i.e., an uncooked, unpacked application), by far the easiest way is to create a new, or edit an existing, blueprint. Blueprints were explicitly designed for people who are uncomfortable writing code (e.g., most game designers). It's incredibly simple to learn: just connect event/function nodes with their Exec pins to form a flow diagram in the BP, and stick variables into your inputs (or read them from node outputs) where required/desired. Example: you wrote that you need JSON interaction. So open your project, Top Menu->Edit->Plugins, search "JSON", then enable (tick) "JSON Blueprint Utilities" (available for free), and confirm. Restart the Engine, open any blueprint (or create a new one), right-click anywhere on the background, type json in the new window's search bar, and select any of these functions: There are literally hundreds of video tutorials online to get you up to speed in a matter of hours. And as long as your tasks are not frame-critical, you don't need to engage with the underlying C++ code at all.
  3. I have trouble understanding why you would wish to automate Unreal Editor; that's like wanting to automate SciTE to write an AutoIt script. If you have access to the uncooked, unpackaged project, it makes more sense to directly edit the blueprints (or add your own BPs to run alongside, or edit/add to the project source code) to achieve your goals:
     • JSON: several JSON plugins are available out of the (UE) box to interact with JSON data (Edit->Plugins->Search "JSON"), and several more on the marketplace
     • start/stop: just run the editor headless from the cmdline with the appropriate settings (e.g., launching the editor with -game acts the same as choosing the standalone launch option in the editor)
     • output log: this is already written to file (in <YourProjectFolder>\Saved\Logs), or in the Output Log window, click Settings->Open in External Editor.
     Moreover, automating the UE editor seems a terrible idea to me anyway, because its GUI layout/menu organisation tends to change (sometimes drastically) with almost every engine update (of which there are several a year).
  4. It helps if you open the dll first.

     #include "zlib_udf.au3"

     Local $sData = 'hahahahahahahahahahahehe'
     Local $bData = StringToBinary($sData)
     Local $tBuffer = DllStructCreate('byte Data[' & BinaryLen($bData) & ']')
     $tBuffer.Data = $bData

     _Zlib_Startup()
     Local $result = _Zlib_CalculateAdler32(DllStructGetPtr($tBuffer), DllStructGetSize($tBuffer))
     ConsoleWrite($result & @CRLF)
     _Zlib_Shutdown()
  5. Diffie-Hellman is definitely the way to go; don't bother with _Crypt_*. I use this C++ library myself, but @jchd has provided this AutoIt implementation.
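To illustrate why Diffie-Hellman works at all, here is a toy sketch of my own (not jchd's implementation, which you should use in practice): both parties exchange only public values, yet derive the same shared secret. The numbers are deliberately tiny; real DH requires big-integer math far beyond AutoIt's native 64-bit range.

```autoit
; Toy Diffie-Hellman, for illustration ONLY (tiny numbers, no big-int math)
Local $iP = 23, $iG = 5                ; public: prime modulus and generator
Local $iA = 6, $iB = 15                ; private keys of Alice and Bob
Local $iPubA = _ModPow($iG, $iA, $iP)  ; Alice sends this in the clear
Local $iPubB = _ModPow($iG, $iB, $iP)  ; Bob sends this in the clear
Local $iSecretA = _ModPow($iPubB, $iA, $iP)
Local $iSecretB = _ModPow($iPubA, $iB, $iP)
ConsoleWrite($iSecretA & ' = ' & $iSecretB & @CRLF) ; identical shared secret

Func _ModPow($iBase, $iExp, $iMod) ; modular exponentiation by square-and-multiply
    Local $iResult = 1
    $iBase = Mod($iBase, $iMod)
    While $iExp > 0
        If BitAND($iExp, 1) Then $iResult = Mod($iResult * $iBase, $iMod)
        $iExp = BitShift($iExp, 1)          ; positive count = shift right
        $iBase = Mod($iBase * $iBase, $iMod)
    WEnd
    Return $iResult
EndFunc
```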
  6. And if we're going to do a proper comparison then one should include the efforts of @timmy2 + @UEZ with enhancements by @Beege, @MrCreatoR, and @Malkey (cobbled together by yours truly).
  7. There are two separate issues here: 1. struct element retrieval, and 2. hex representation.
     Ad 1: struct dot notation won't tell you this, but if you check the help page for DllStructGetData, you'll see:
     Ad 2: reading up on the Hex function in the Help, you'll find:
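A small illustration of my own (not the quoted help-file text) showing how both issues interact; the struct layout and values are made up for the example:

```autoit
; hypothetical example: a struct with one 32-bit unsigned element
Local $tStruct = DllStructCreate('dword Value')
$tStruct.Value = 255

; 1. dot notation and DllStructGetData retrieve the same (numeric) data
ConsoleWrite($tStruct.Value & @CRLF)
ConsoleWrite(DllStructGetData($tStruct, 'Value') & @CRLF)

; 2. Hex() output depends on the variant's type and the optional length parameter
ConsoleWrite(Hex($tStruct.Value) & @CRLF)    ; zero-padded to the int width
ConsoleWrite(Hex($tStruct.Value, 2) & @CRLF) ; FF (explicit 2-digit length)
```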
  8. Yes it does, in my opinion, provided that your code checks any pointer for null before actually using it.
  9. More often than not, garbage collection (GC) is not instantaneous, and calling Delete/Dispose/Close functions merely marks the passed pointer as no longer needed (as if placing it in a bin beside the curb, but in theory still accessible until the garbage truck comes to empty it), to be physically removed (as in memory being freed) whenever the GC comes round to it. So although it's extremely unsafe to still reference the pointer after calling the disposal function, it's perfectly possible that it's still valid. Therefore it makes good sense to actively null it out yourself if there is any chance that any code might still reference it in the interim.
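The dispose-then-null pattern can be sketched in AutoIt with a COM object (my own example; the Scripting.Dictionary is just a stand-in for any disposable resource):

```autoit
; create a disposable resource (any COM object will do for illustration)
Local $oDict = ObjCreate('Scripting.Dictionary')
$oDict.Add('key', 'value')
ConsoleWrite($oDict.Item('key') & @CRLF)

; done with it: null the variable so no later code can
; accidentally use a stale reference while awaiting collection
$oDict = 0
If Not IsObj($oDict) Then ConsoleWrite('reference cleared' & @CRLF)
```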
  10. In path searches through a subset of points more generally (for simulated annealing or any other time-consuming algorithm), it may be of significant advantage (speedwise, that is) to compute all point-to-point distances in advance, and store these in a single look-up reference (table or matrix). Here's a simple, generic example of doing just that using matrices: This example is, however, far from optimised. One improvement would be to get rid of the final column vector copy to the output matrix, and instead repeatedly re-map a column in the final output as our vector to directly collect results in. This is faster, but still evaluates each point individually. A more efficient approach would be to create separate matrices for each coordinate dimension, and perform each math operation once per dimension only, collecting final results in the first dimension's container. Note that this much faster solution requires more memory.
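As a hedged illustration (using plain 2D arrays, not the E4A matrix version this post refers to), the idea of the look-up table is: pay the O(n²) distance computation once up front, then every leg evaluation during the search becomes a single array read. Points and coordinates below are made up:

```autoit
; Precompute all pairwise Euclidean distances into a symmetric matrix
Local Const $iN = 4
Local $aPts[$iN][2] = [[0, 0], [3, 4], [6, 8], [9, 12]] ; x,y coordinates (example data)
Local $aDist[$iN][$iN]

For $i = 0 To $iN - 1
    For $j = $i To $iN - 1
        Local $fD = Sqrt(($aPts[$i][0] - $aPts[$j][0]) ^ 2 + ($aPts[$i][1] - $aPts[$j][1]) ^ 2)
        $aDist[$i][$j] = $fD ; distance is symmetric,
        $aDist[$j][$i] = $fD ; so fill both triangles
    Next
Next

; during the search, a leg's cost is now a cheap look-up:
ConsoleWrite('dist(0,1) = ' & $aDist[0][1] & @CRLF) ; 5
```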
  11. @Acanis: As they say in German: "Da gibt's noch Luft nach oben." You're making things faaaaaar too complicated (I'm not even going to go into how you're setting up/evaluating your 3D/2D grid...). Was this ChatGPT's idea? But you earn some points for at least annotating your code. Or was that ChatGPT as well? I edited the TSP example on the assumption that the desired number of intermediate points is fixed, and point duplication is not allowed. You can play with $tempfactor (and $maxStepsWithoutImprovement) to adjust how much exploration of the solution space is allowed. If $verbose = true (default false now), the best sequence of journey legs so far is ArrayDisplayed every time a better solution is found (press <Esc> (every time) to continue); press <Space> to terminate prematurely. This is all I'm going to write for you here; visualisation and any changes you'll have to implement yourself. Hope it helps.
  12. @Nine: You're wasting your breath, mate; that's a bot you're shouting at.
  13. WinAPI provides this:

      #include <WinAPIFiles.au3>

      Local $aData = _WinAPI_GetVolumeInformation() ; you can pass a root dir (as string, with trailing backslash) here
      ConsoleWrite('Serial number: ' & $aData[1] & @CRLF)
  14. E4A Version 5.5 is released. This is a minor update, mainly providing more support for binary operations on matrices of integer type, notably:
      • file I/O
      • bitwise & logical operations on a single matrix cell value
      • CwiseScalarOps with logical operators (+ passed value)
      • reversing the bit order in all cells of a matrix (part) with new CwiseUnaryOp operator #37 ("reverseBits")
      • Rowwise Pack/Unpack functions for converting integer cells to/from 32 individual bits, 4 bytes, or 2 words.
      A new test script (#32: BitAndByteOps.au3 in the .\EigenTest subdirectory) illustrates various features. Full details can be found in the History page of the online Help (note that as of this version, the .chm Helpfile is no longer supported/present; just download the latest online Help if you need an offline version). Hope it helps.
  15. Maximum TCP packet size is 65,535 bytes in theory, but in practice it's restricted by your network's Maximum Transmission Unit (MTU, the Ethernet frame size, commonly 1500 bytes; do not confuse this with a UDP datagram, which is typically much smaller on the wire). Both TCP (reliable, slower) and UDP (unreliable = packets may be dropped, faster) can be used for streaming.