JohnClaw Posted May 2, 2024

Hi. I created a minimal AutoIt script that lets you chat with local AI models such as this one: microsoft/Phi-3-mini-4k-instruct · Hugging Face

It uses the Dllama library, which provides functions for communicating with offline AI models: tinyBigGAMES/Dllama: Local LLM inference Library (github.com)

Before running the script, create the folder C:\LLM\gguf and place all required files in it: the AutoIt script, models.json, Dllama.dll and the .gguf model itself, which can be downloaded from here: https://huggingface.co/microsoft/Phi-3-mini-4k-instruct-gguf/resolve/main/Phi-3-mini-4k-instruct-q4.gguf?download=true

Script's code:

#include <GUIConstantsEx.au3>
#include <MsgBoxConstants.au3>
#include <EditConstants.au3>
#include <WindowsConstants.au3>
#include <StringConstants.au3>

; Make the process DPI aware (-2 = DPI_AWARENESS_CONTEXT_SYSTEM_AWARE).
DllCall("User32.dll", "bool", "SetProcessDpiAwarenessContext", "HWND", -2)
Opt("GUIOnEventMode", 1)

; Build the GUI: a question input, an answer box and a button.
Local $hMainGUI = GUICreate("Dllama_AutoIt_GUI", 925, 950)
GUISetOnEvent($GUI_EVENT_CLOSE, "terminate_app")
Local $start_thinking = GUICtrlCreateButton("Think and answer, AI!", 350, 890, 250, 50)
Local $place_for_question = GUICtrlCreateInput("Type here your question for AI.", 10, 15, 900, 40)
Local $place_for_answer = GUICtrlCreateEdit("Here AI will type answer for your question", 10, 80, 900, 800, $ES_MULTILINE)
GUICtrlSetOnEvent($start_thinking, "get_answer")
GUICtrlSetFont($place_for_question, 14)
GUICtrlSetFont($place_for_answer, 14)
GUICtrlSetFont($start_thinking, 14)
GUISetState(@SW_SHOW, $hMainGUI)

While 1
    Sleep(100)
WEnd

Func get_answer()
    Local $question = GUICtrlRead($place_for_question)
    Local $hDLL = DllOpen("Dllama.dll")
    ; Ask the model and display the reply; the boolean parameter toggles hardware acceleration.
    Local $answer = DllCall($hDLL, "str:cdecl", "Dllama_Simple_Inference", "str", "C:\LLM\gguf\", "str", "models.json", "str", "phi3:4B:Q4", "boolean", False, "uint", 1024, "int", 27, "str", $question)
    GUICtrlSetData($place_for_answer, $answer[0])
    DllClose($hDLL)
EndFunc

Func terminate_app()
    Exit
EndFunc

Attachments: Dllama_GUI_AutoIt.au3, models.json, Dllama.dll
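The folder layout described above can also be sanity-checked before launching the GUI. Here is a minimal pre-flight sketch; it assumes the .gguf file keeps the name from the download link above, so adjust the array if you saved the model under a different name:

; Sketch: verify the working folder and required files exist before running the GUI.
#include <MsgBoxConstants.au3>

Local Const $sBase = "C:\LLM\gguf\"
Local $aRequired[3] = ["models.json", "Dllama.dll", "Phi-3-mini-4k-instruct-q4.gguf"]

If Not FileExists($sBase) Then DirCreate($sBase) ; create the folder if it is missing

For $i = 0 To UBound($aRequired) - 1
    If Not FileExists($sBase & $aRequired[$i]) Then
        MsgBox($MB_ICONERROR, "Missing file", $aRequired[$i] & " was not found in " & $sBase)
        Exit 1
    EndIf
Next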
Jos (Developers) Posted May 2, 2024 (edited)

Moved to the appropriate AutoIt General Help and Support forum, as the Developer General Discussion forum very clearly states:

Quote: General development and scripting discussions. Do not create AutoIt-related topics here, use the AutoIt General Help and Support or AutoIt Technical Discussion forums.

Moderation Team

PS: Welcome to our forums. We understand you are new around here, so tell us why we should trust your attached DLL. To my surprise, I was the first one to check it on VirusTotal: https://www.virustotal.com/gui/file/bfe4de94fcd7b27a80f099a7fe504f4f85d22fdc03536c00a637de1ed1a16fac/detection

Edited May 2, 2024 by Jos
Andreik Posted May 2, 2024 (edited)

@Jos I think this is meant as an example GUI for an LLM; maybe it fits better in the Examples section of the forum.

In case anyone wonders why the script doesn't work: run it with the x64 version of AutoIt, since the DLL is compiled as a 64-bit library.

#AutoIt3Wrapper_UseX64=y

As for me, I can't think of anywhere I would use this anyway, because it is very slow. It takes around 30 seconds to answer even a basic question.

Edited May 2, 2024 by Andreik
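As an alternative (or complement) to the wrapper directive mentioned above, a runtime guard can catch the wrong interpreter early. This is only a sketch and the message text is illustrative:

; Sketch: refuse to run under the 32-bit interpreter, since Dllama.dll is a 64-bit library.
#include <MsgBoxConstants.au3>

If Not @AutoItX64 Then
    MsgBox($MB_ICONERROR, "Wrong interpreter", "Run this script with the x64 version of AutoIt (or keep #AutoIt3Wrapper_UseX64=y).")
    Exit 1
EndIf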
JohnClaw (Author) Posted May 3, 2024

Sorry, guys, I was tired and forgot to mention hardware acceleration. If you have a GPU that supports the Vulkan backend, you can get answers from the AI much faster. To enable the GPU, change False to True in this line of the script:

Local $answer = DllCall($hDLL, "str:cdecl", "Dllama_Simple_Inference", "str", "C:\LLM\gguf\", "str", "models.json", "str", "phi3:4B:Q4", "boolean", False, "uint", 1024, "int", 27, "str", $question)

By the way, please download the latest DLL version from the official GitHub repo; it has been updated since I posted this thread: https://github.com/tinyBigGAMES/Dllama/blob/main/bin/Dllama.dll

The VirusTotal check can be found here: https://github.com/tinyBigGAMES/Dllama/blob/main/docs/VIRUSTOTAL.txt
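If you toggle acceleration often, the flag can be pulled out into a variable so it is easy to find. This is only a sketch of the get_answer() function from the script above; the variable name is illustrative and all parameter values are the ones from the original call:

; Sketch: get_answer() with the acceleration flag as a variable (drop-in for the function above).
Func get_answer()
    Local $bUseGPU = True ; True = Vulkan GPU backend (if supported), False = CPU only
    Local $question = GUICtrlRead($place_for_question)
    Local $hDLL = DllOpen("Dllama.dll")
    ; Same parameters as the original call: folder, models.json, model name,
    ; acceleration flag, max tokens, 27 (as posted) and the question text.
    Local $answer = DllCall($hDLL, "str:cdecl", "Dllama_Simple_Inference", _
            "str", "C:\LLM\gguf\", "str", "models.json", "str", "phi3:4B:Q4", _
            "boolean", $bUseGPU, "uint", 1024, "int", 27, "str", $question)
    If Not @error Then GUICtrlSetData($place_for_answer, $answer[0])
    DllClose($hDLL)
EndFunc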
rsn Posted July 8, 2024 (edited)

@JohnClaw It looks like the link for Dllama.dll is broken. I'll guess that you removed that project from GitHub and replaced it with LMEngine? From my bit of fiddling with it, it doesn't seem to be a drop-in replacement for the old DLL. Also, when flipping False to True for GPU acceleration, the output is gobbledygook.

Edited July 10, 2024 by rsn