Have you tried Matrix?
LLMs are statistical word association machines (token association, more accurately). So if you tell one not to make mistakes, it'll likely weight the output toward including validation, checks, etc. It might still produce silly output claiming no mistakes were made despite having bugs or logic errors. But LLMs are just a tool! So use them for what they're good at and can actually do, not what they themselves claim they can do lol.
OpenWebUI connected to TabbyAPI's OpenAI-compatible endpoint. I will try reducing the temperature and see whether that makes it more accurate.
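For reference, a minimal sketch of what lowering the temperature looks like in a raw request to an OpenAI-compatible endpoint (the URL and model name here are placeholders, not my actual setup):

```python
import json
import urllib.request

# Hypothetical endpoint; substitute your own TabbyAPI host and port.
ENDPOINT = "http://localhost:5000/v1/chat/completions"

payload = {
    "model": "Qwen2.5-14B-exl2",  # placeholder model name
    "messages": [{"role": "user", "content": "Summarize this log file."}],
    "temperature": 0.3,  # lower = less random token sampling
    "max_tokens": 512,
}

request = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(request) would actually send it; left out here
# since this is only meant to show the payload shape.
```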
Context was set to anywhere between 8k and 16k. It was responding in English properly, and then about halfway to three-quarters of the way through a response, it would start outputting tokens in either a foreign language (Russian/Chinese in the case of Qwen 2.5) or things that don't make sense (random code snippets, improperly formatted text). Sometimes the text was repeating as well. But I thought that might have been a template problem, because it seemed to be answering the question twice.
Otherwise, all settings are the defaults.
I tried it with both Qwen 14b and Llama 3.1. Both were exl2 quants produced by bartowski.
Perplexica works. It supports both Ollama and custom OpenAI-compatible providers.
Super useful guide. However, after playing around with TabbyAPI, the responses from models quickly become gibberish, usually halfway through or towards the end. I'm using exl2 models off Hugging Face, with Q4, Q6, and FP16 cache. Any tips? Also, how do I control context length on a per-model basis? max_seq_len in config.json?
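For anyone else looking: this is the rough shape of the relevant section of TabbyAPI's config, assuming key names from memory, so double-check against the sample config that ships with TabbyAPI:

```yaml
# Sketch of TabbyAPI's config.yml model section (verify key names
# against the bundled sample config before using):
model:
  model_name: Qwen2.5-14B-exl2  # placeholder folder name under models/
  max_seq_len: 16384            # context length to load the model with
  cache_mode: Q6                # cache quantization, e.g. Q4 / Q6 / FP16
```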
Seems to be the only necessary thing in my case! Thanks.
Yeah I definitely have the default GTK chooser. Guess I have some config playing to do later.
Can you explain a bit more about this and how to configure it? When I use Firefox on GNOME, the save dialog just looks like the other dialogs?
Not necessarily. While of course in many, many cases open source is a volunteer effort, there's usually some implicit transaction going on: improving the software for yourself and passing that on to others, being a business that improves a library it depends on to generate revenue, or even a straight-up commercial transaction.
But in all these cases, the open source project can be taken by you (or others) and you can do whatever you want with it. In the case of Winamp here, you cannot do any of that. It would be different if they were paying for contributions. But they’re not, so.
A lot of times, immigrants to Iceland in low-paying jobs like this do not understand their rights. It wouldn't surprise me if this guy has gotten away with it before. Possibly more than once.
Iceland isn’t perfect. If a business wants to get rid of someone, they’ll find a way to do it. But it is illegal to prevent someone from joining a union, or issue threats like this. Companies over a certain size (50+ I think?) are actually required to have a union representative.
Somebody is going to get steamrolled by Icelandic labor laws. And it’s not going to be the employee.
Edit: like this is seriously illegal in Iceland. Also, if you’re going to be a corrupt and immoral business owner (evil really in this case), the number one thing you DON’T do is broadcast your nefarious intentions over a recordable medium.
They basically want free labor.
That is exactly the plan.
You can right-click the URL bar for sites that support the OpenSearch XML standard, which I guess is what they wanted to replace the button with. But I don't really know why they moved the button behind an about:config setting. It could at least be a checkbox or something to enable.
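For context, an OpenSearch description is just a small XML file the site serves, roughly like this (example.com and the names are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
  <ShortName>Example Search</ShortName>
  <Description>Search example.com</Description>
  <Url type="text/html"
       template="https://example.com/search?q={searchTerms}"/>
</OpenSearchDescription>
```

Firefox picks this up when the page links to it, which is what makes the right-click option appear.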
Returns the "add custom search engine" button, which for some reason has been hidden by default.
Anyone have any suggestions for bulk options in the Netherlands?
Is it possible yet to use Ollama or an arbitrary OpenAI-compatible endpoint with the chatbot feature? Or only the cloud providers?
https://agnos.is/posts/tech-recruitment-is-out-of-control.html
This was my experience at the beginning of 2024. It was bad enough that I had to write a blog post about it.