• 0 Posts
  • 25 Comments
Joined 1 year ago
Cake day: May 1st, 2024


  • I agree with the overall sentiment, but I’d like to add two points:

    1. Everyone starts off as a code editor, and through a combination of (self-)education and experience can become a software engineer.

    2. On the point of code editors having to worry about LLMs taking their jobs, I agree, but I don’t think it will be as over the top as people literally being replaced by “AI agents”. Rather, I think it will be a combination of two things: code editors becoming more productive through the use of LLMs, which decreases demand for them, and lay people (i.e. with almost no coding skills) being able to do more through LLMs applied in the right places, as some website builders are doing now.


  • stormeuh@lemmy.world to Not The Onion@lemmy.world · *Permanently Deleted* · 2 months ago

    Yeah, here I think it’s more a case of “hey, this guy looks kind of like my son”. In this instance I think it led to a miscarriage of justice, but in other cases that kind of thinking could protect against excessively harsh punishments. In the end it comes down to inequality: greater inequality shrinks the pool of people judges can intuitively relate to, which in turn makes judgements more unequal.




  • Even though I haven’t run anything Debian-based as a daily driver in about a decade, I still recommend Debian-based distros to beginners. With Ubuntu being so widespread it just makes sense: whenever you search for “how do I install xyz on Linux”, it’s going to be a guide for Ubuntu 99% of the time, and that will usually work on other Debian-based distros too.


  • I agree that it’s editorialized compared to the very neutral way the survey puts it. That said, I think you also have to take into account how AI has been marketed by the industry.

    They have been claiming AGI is right around the corner pretty much since ChatGPT first came to market. It’s often implied (e.g. “you’ll be able to replace workers with this”) or left vague on timeline (e.g. OpenAI saying they believe their research will eventually lead to AGI).

    With that context I think it’s fair to editorialize this as a dead end, because even with billions of dollars being poured in, they won’t be able to deliver AGI on the timeline they are promising.