• 0 Posts
  • 25 Comments
Joined 10 months ago
Cake day: December 28th, 2023

  • One of the worst parts of this LLM boom is that the models can “invade” online spaces and control a narrative. For example, just go on Twitter and scroll to the comments on any tagesschau (German news site) post: it’s all right-wing bots and crap. LLMs do have uses, but the big problem is that a bad actor can basically control any narrative with the sheer amount of crap they can output. And OpenAI does nothing, even though they are the biggest provider. It earns them money, after all.

    I also can’t really think of a good way to combat this. If you verify people with an ID, you basically nuke all semblance of online anonymity. If you use some sort of captcha, it will probably be easily bypassed; it doesn’t even need to be tricked, you can just pay a human in a country with extremely cheap labour to solve it for your bot. It really sucks.

  • Ok, fair, but do consider the context that the models are open weight: you can download them and use them for free.

    There is a slight catch though, which I’m very annoyed at: it’s not actually Apache. It’s this weird license where you can use the model commercially only until you hit 700M monthly active users, at which point you have to request a custom license from Meta. Ok, I kind of understand them not wanting companies like ByteDance or Google using their models just like that, but Mistral releases their models as open weights under Apache-2.0, so that choice should definitely be reconsidered, especially for Llama 3.

    It’s kind of a thing right now: publishers don’t want models trained on their books “because it breaks copyright”, even though the model doesn’t actually remember copyrighted passages from the book. Many arguments hinge on the publishers being mad that you can prompt the model to repeat a copyrighted passage, which it can do. IMO this is a bullshit reason.

    Anyway, it will be an interesting two years as (hopefully) copyright gets turned inside out :)