Most people learn a new language to make headway in their career, to be able to move abroad, or simply to speak with the people of that country and consume their media. For people who learn for these reasons, will advances in AI and LLMs make language learning obsolete? Are there actually fewer people picking up a foreign language since LLMs were opened to the public? And what about the “human connection” that machine translators won’t be able to replicate?

I guess we’re still far off from delay-free real-time translation in every kind of situation, especially since making sense of a sentence in many languages is heavily dependent on context, or on some word at the end of the sentence that changes the meaning of the first few words spoken.

I see learning a language not only as a way to communicate with different people, but also as a way to learn a different way of seeing the world. That’s also partly why I’m against a global language replacing all others: a language is intrinsically linked to the culture of the people who speak it. Wiping out a language means wiping out the culture. People don’t think the same in English as they do in Mongolian. Even the concept of “time” can differ, depending on how it’s expressed in another language. Machine translators at the moment can’t capture all these nuances and differences, even if they sometimes succeed.

  • Jeena@piefed.jeena.net · 13 hours ago

    I’m using local open-source LLMs like DeepSeek, Gemma, and Phi for translation, and they are very similar to ChatGPT.

    • Photuris@lemmy.ml · 8 hours ago

      What does your hardware setup look like, if you don’t mind me asking?

      I’m thinking of building something, but I don’t want to spend a fortune if I can help it. I run Llama on a Mac Mini, which works fine, but I’m not able to run the bigger models on it.

      • ikt@aussie.zone · 11 hours ago

        It’s very easy to get started. Make sure you have a graphics card with ideally more than 6 GB of VRAM (the more the better), grab LM Studio (https://lmstudio.ai/), then under the Discover section you can download a local model.

        These models run entirely locally on your PC and never touch the internet. The more VRAM you have, the faster they run and the larger the models you can load.
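        Once a model is loaded, LM Studio can also serve it over an OpenAI-compatible API on localhost (port 1234 by default), so you can script translations against it. A minimal stdlib-only sketch; the model name, port, and prompt wording here are assumptions about one possible local setup, not part of LM Studio itself:

        ```python
        import json
        import urllib.request

        def build_request(text, target_lang, model="gemma-2-9b-it"):
            """Build an OpenAI-style chat-completion payload asking for a translation."""
            return {
                "model": model,  # whatever model you loaded in LM Studio (name is an assumption)
                "messages": [
                    {"role": "system",
                     "content": f"Translate the user's message into {target_lang}. "
                                "Reply with the translation only."},
                    {"role": "user", "content": text},
                ],
                "temperature": 0.2,  # low temperature keeps translations stable
            }

        def translate(text, target_lang, url="http://localhost:1234/v1/chat/completions"):
            """Send the payload to the local LM Studio server and return the translation text."""
            req = urllib.request.Request(
                url,
                data=json.dumps(build_request(text, target_lang)).encode("utf-8"),
                headers={"Content-Type": "application/json"},
            )
            with urllib.request.urlopen(req) as resp:
                body = json.load(resp)
            return body["choices"][0]["message"]["content"]
        ```

        Since everything stays on localhost, nothing you translate ever leaves your machine, which is the whole point of running these locally.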

        • pasdechance@jlai.lu · 9 hours ago

          I have a 10-year-old laptop with integrated graphics running Debian Stable, so I don’t think I’ll be using a local LLM any time soon, haha. I tell my students I don’t want them to use any of these tools, so I don’t use them either.

            • pasdechance@jlai.lu · 7 hours ago

              It’s what I’ve been using since the early 2000s: whatever laptop I can get for free, running boring Linux. I teach all my classes across multiple establishments with it. The battery still lasts over 9 hours. Beats the Raspberry Pi I used as my main computer for 6 months!

              I do have a colleague who installed one of the LLMs on their computer to play around with translation and live subtitles, and another who claims ChatGPT taught him French. Maybe there is something to it, but I draw the line at using AI because, as I said, I forbid it in my classes.