• Gamma@beehaw.org · 7 days ago

    The program is statistically an average white guy who knows about a lot of things but doesn’t understand any of it, so I’m not even sure what point you thought you had.

    • nesc@lemmy.cafe · 7 days ago

      A chat bot will impersonate whoever you tell it to impersonate (as stated in the article). My point is pretty simple: people don’t need a guide telling them how to treat and interact with a chat bot.

      I get it, that was just perfunctory self-deprecation, with the intended audience being other first-worlders.
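      For what it’s worth, the “impersonation” is nothing deeper than an instruction in the prompt. Rough sketch below; generate() is just a placeholder for whatever chat-completion call a client library actually provides, and the persona text is made up:

      ```python
      # "Impersonation" is just an instruction in the prompt. generate() is a
      # placeholder for whatever chat-completion call your client provides.
      def generate(messages: list[dict]) -> str:
          """Placeholder: swap in a real chat-completion call here."""
          return "(model reply goes here)"

      messages = [
          {"role": "system", "content": "You are a gruff 19th-century sea captain."},
          {"role": "user", "content": "Explain how tides work."},
      ]
      reply = generate(messages)  # the model answers in character
      ```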

      • SaltSong@startrek.website · 7 days ago

        people don’t need a guide telling them how to treat and interact with a chat bot.

        Then why are people always surprised to find out that chat bots will make shit up to answer their questions?

        People absolutely need a guide for using a chat bot, because people are idiots.

        • chicken@lemmy.dbzer0.com · 7 days ago

          Not even just because people are idiots, but also because an LLM is going to have quirks you need to work around or exploit to get the best results out of it. For example, it’s often better to edit your question to clarify a misunderstanding and regenerate the response than to reply with the correction, because replying carries more risk of the model getting stuck on its mistake. Or, if the interface allows it, it can be useful to manually edit part of the LLM’s output to be more in line with what you want it to say before generating the rest.
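          Rough sketch of both tricks against a generic message-list API. generate() is a placeholder, not any particular library’s call, and the paper/author names are made up for illustration:

          ```python
          # Both quirks from the comment above, against a generic message-list API.
          # generate() is a placeholder for whatever chat-completion call you use.
          def generate(messages: list[dict]) -> str:
              """Placeholder: swap in a real chat-completion call here."""
              return "(model reply goes here)"

          # Quirk 1: edit the question and regenerate instead of replying with a
          # correction. Appending a correction keeps the mistaken reply in context,
          # where the model can keep anchoring on it.
          history = [{"role": "user", "content": "Summarise the 2019 paper on topic X."}]
          bad_reply = generate(history)        # model misunderstands the question
          history[0]["content"] = "Summarise the 2019 paper on topic X by Smith, not Jones."
          better_reply = generate(history)     # regenerate with no bad reply left in context

          # Quirk 2: pre-seed part of the model's answer (where the interface lets you
          # edit the assistant turn) and let it continue from there.
          history.append({"role": "assistant", "content": "Here is a three-bullet summary:\n- "})
          continuation = generate(history)     # model continues the seeded text
          ```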