Here’s what he said in a post on his Telegram channel:

🤫 A story shared by Jack Dorsey, the founder of Twitter, uncovered that the current leaders of Signal, an allegedly “secure” messaging app, are activists used by the US state department for regime change abroad 🥷

🥸 The US government spent $3M to build Signal’s encryption, and today the exact same encryption is implemented in WhatsApp, Facebook Messenger, Google Messages and even Skype. It looks almost as if big tech in the US is not allowed to build its own encryption protocols that would be independent of government interference 🐕‍🦺

🕵️‍♂️ An alarming number of important people I’ve spoken to remarked that their “private” Signal messages had been exploited against them in US courts or media. But whenever somebody raises doubt about their encryption, Signal’s typical response is “we are open source so anyone can verify that everything is all right”. That, however, is a trick 🤡

🕵️‍♂️ Unlike Telegram, Signal doesn’t allow researchers to make sure that their GitHub code is the same code that is used in the Signal app run on users’ iPhones. Signal refused to add reproducible builds for iOS, closing a GitHub request from the community. And WhatsApp doesn’t even publish the code of its apps, so all their talk about “privacy” is an even more obvious circus trick 💤

🛡 Telegram is the only massively popular messaging service that allows everyone to make sure that all of its apps indeed use the same open source code that is published on Github. For the past ten years, Telegram Secret Chats have remained the only popular method of communication that is verifiably private 💪

Original post: https://t.me/durov/274

  • rivvvver@lemmy.dbzer0.com · 6 months ago

    Aren't Telegram chats unencrypted by default?

    An alarming number of important people I’ve spoken to remarked that their “private” Signal messages had been exploited against them in US courts or media

    Source?? (I bet this ends up being a “they had full access to my unlocked phone” situation again.)

    Also, the whole thing about US-funded encryption is the same bullshit argument people use against Tor all the time. It doesn't mean shit.

    This just reads like someone desperately trying to gain market share by spreading FUD.

    • VeganCheesecake@lemmy.blahaj.zone · 6 months ago

      https://www.spiegel.de/netzwelt/apps/telegram-gibt-nutzerdaten-an-das-bundeskriminalamt-a-0e4d3fcb-8081-4b87-b062-db412bbc294b

      Well, Telegram seems to be giving user data to the German Federal Criminal Police Office, and if they’re cooperating with the German authorities, I don’t see why I’d presume they aren’t cooperating with others as well.

      All of this is actually documented, unlike those nebulous “important people”.

      • UnfortunateShort@lemmy.world · 6 months ago

        Tbf, they held a user vote in Germany (supposedly; the app did ask me to vote) on whether to work with the authorities or risk having to cease service there. Iirc the background was extremist (terrorist?) groups operating on the platform.

    • rdri@lemmy.world · 6 months ago

      Aren't Telegram chats unencrypted by default?

      Encryption is always there. The problem is, some people refer to anything that is not e2e encrypted as “unencrypted” for some reason.

      • Fushuan [he/him]@lemm.ee · 6 months ago

        And it infuriates me to no end. It’s one thing to trust them and their servers; it’s another thing altogether to send actual plaintext data around the net. That’s crazy, and it’s what people are implying.

        For the record, until WhatsApp implemented e2e their messages were indeed fucking plaintext, and it took a while before they were pressured into e2e. It helps them that their platform is very mobile-based, whereas Telegram’s service is more server-based. Telegram did have enough time to implement a server-based, zero-knowledge e2e encryption protocol though; it’s not really rocket science at this point.

        • rdri@lemmy.world · 6 months ago

          Telegram did have enough time to implement a server-based, zero-knowledge e2e encryption protocol though; it’s not really rocket science at this point.

          What do you mean by server-based e2e? From what I get, most people’s complaint is that Telegram doesn’t support e2e in group chats, and that is the part that seems close to rocket science in my opinion. Also, Telegram is historically full of ever-growing group chats, which from what I understand has quite serious implications for server requirements.

          • Fushuan [he/him]@lemm.ee · 6 months ago

            Telegram stores all the conversations on their servers, since you don’t need to be connected on the phone, or even have the phone switched on, to chat on the PC or on another phone. This means that the authority is the server. WhatsApp is not like that: if you delete a shared photo, after a while it will be cached out and you will lose access to it, meaning they don’t store that stuff. The same thing happens with WhatsApp desktop or web: they sit on an infinite loading icon until you switch on the phone, or sometimes even unlock it.

            This means that whatever Telegram develops must not only keep the group chat encrypted on the server, but also let any valid client of a user decipher the content, so every client must somehow have the key to unlock it. One way of doing it would be for every client of a single user to generate keys (which I’m sure they already do) and perform a key exchange between them, so they share a single shared key, which is what identifies your account. Then you could use that shared key to decipher the group chat’s shared key, which Telegram can store on their server, or do whatever is done in those cases; I’m not that well versed.
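
            Something like this rough sketch, just to make the idea concrete (illustrative Python using the cryptography package’s X25519 and AES-GCM primitives; the key names and the wrapping scheme are my own assumptions, not anything Telegram actually implements):

                # Illustrative only: two clients of the same account derive a shared key
                # via an X25519 exchange, then use it to unwrap a group-chat key.
                import os
                from cryptography.hazmat.primitives import hashes
                from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
                from cryptography.hazmat.primitives.ciphers.aead import AESGCM
                from cryptography.hazmat.primitives.kdf.hkdf import HKDF

                # Each client generates its own key pair (the public halves get published).
                client_a = X25519PrivateKey.generate()
                client_b = X25519PrivateKey.generate()

                def derive_account_key(own_private, peer_public):
                    # Both sides compute the same secret and stretch it into an AES key.
                    secret = own_private.exchange(peer_public)
                    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                                info=b"account-shared-key").derive(secret)

                key_a = derive_account_key(client_a, client_b.public_key())
                key_b = derive_account_key(client_b, client_a.public_key())
                assert key_a == key_b  # both clients now hold the same account key

                # The server stores the group-chat key only in wrapped form: it never
                # sees the plaintext group key, but every client can recover it.
                group_key = AESGCM.generate_key(bit_length=256)
                nonce = os.urandom(12)
                wrapped = AESGCM(key_a).encrypt(nonce, group_key, None)  # stored server-side
                assert AESGCM(key_b).decrypt(nonce, wrapped, None) == group_key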

            The problem here lies in what happens when you delete and/or log out of all the clients. Currently you can just log in again, because Telegram has all the info required, but if they store the “shared key” server-side then the whole thing is moot. I guess they could instead store a user-identifying key pair, with the private key encrypted with a password, so that it can be accessed from anywhere. They should, as always, offer MFA and passkey alternatives so you can prove it’s you every time you log in on a new client without needing the password, and so on.
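
            For the recovery part, the “private key encrypted with a password” bit could look roughly like this (again an illustrative sketch with the cryptography package; the scrypt parameters and the example password are made up):

                # Illustrative only: wrap a client private key with a password-derived key
                # so the server can store it for recovery without being able to read it.
                import os
                from cryptography.hazmat.primitives import serialization
                from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
                from cryptography.hazmat.primitives.ciphers.aead import AESGCM
                from cryptography.hazmat.primitives.kdf.scrypt import Scrypt

                password = b"correct horse battery staple"  # example only
                private_key = X25519PrivateKey.generate()
                raw = private_key.private_bytes(
                    encoding=serialization.Encoding.Raw,
                    format=serialization.PrivateFormat.Raw,
                    encryption_algorithm=serialization.NoEncryption(),
                )

                # Derive a wrapping key from the password; the server keeps only
                # (salt, nonce, ciphertext) and cannot recover the private key itself.
                salt = os.urandom(16)
                wrap_key = Scrypt(salt=salt, length=32, n=2**15, r=8, p=1).derive(password)
                nonce = os.urandom(12)
                ciphertext = AESGCM(wrap_key).encrypt(nonce, raw, None)

                # On a fresh login, the same password re-derives the key and unwraps it.
                unwrap_key = Scrypt(salt=salt, length=32, n=2**15, r=8, p=1).derive(password)
                restored = X25519PrivateKey.from_private_bytes(
                    AESGCM(unwrap_key).decrypt(nonce, ciphertext, None))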

            This is a roughly designed idea I just had that should theoretically work, but I’m sure there are more elegant ways to go about it.

            It’s a lot of work for sure to implement all of this securely, given that you have to somehow migrate everything that already exists into the new encryption model, make everyone create a password and yada yada, while keeping it as seamless as possible for users. However, I feel like it’s been quite a while, and if they haven’t done it already, they just won’t. We either trust them with our data or look for an alternative, and sadly there’s no alternative that has all the buzz right now.

            • rdri@lemmy.world · 6 months ago

              Sorry, I have a hard time understanding the gist of your text. I don’t think it makes sense to be upset about what happens with access that was already acquired previously, because that very fact already poses a bigger threat (which might have more to do with the nature of the conversations than with how the platform works).

              • Fushuan [he/him]@lemm.ee · 6 months ago

                I wasn’t talking about situations with compromised accounts. I was talking about legitimate accounts that were created in the typical way being converted to a zero-knowledge encryption scheme, and acknowledging that the conversion is hard when a user might have several clients logged in (2 phones, 6 computers…).

                My point was that if they haven’t put any effort into the transition yet, they never will, because the bigger the userbase, the harder the transition is to manage. I find that sad, because they should have invested more effort in that instead of all the features we keep getting, but whatever.

                If you found the technical terms confusing: public/private keys are a sort of asymmetric “password” pair used in cryptography to secure messages, and shared keys would be symmetric passwords. The theory behind key exchanges and the protocols around them is taught in introductory cryptography courses in bachelor’s and master’s programs. I’m sorry to say I don’t have the energy to explain more, but feel free to read up on the terms if you feel like it.
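
                If it helps, here is a tiny sketch of that difference (plain illustrative Python with the cryptography package; nothing specific to Telegram):

                    # Symmetric ("shared key"): the same key both encrypts and decrypts,
                    # so anyone who holds it can read and write messages.
                    import os
                    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
                    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

                    shared_key = AESGCM.generate_key(bit_length=256)
                    nonce = os.urandom(12)
                    ciphertext = AESGCM(shared_key).encrypt(nonce, b"hello", None)
                    assert AESGCM(shared_key).decrypt(nonce, ciphertext, None) == b"hello"

                    # Asymmetric ("public/private key pair"): the private key signs,
                    # and anyone with the public key can verify, but not forge, the signature.
                    private_key = Ed25519PrivateKey.generate()
                    signature = private_key.sign(b"hello")
                    private_key.public_key().verify(signature, b"hello")  # raises if tampered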

                If you however found it confusing because I write like crap, I’m sorry for potentially offending you with the above paragraph and I’ll blame my phone keyboard about it :)

                • rdri@lemmy.world · 6 months ago

                  No, that’s not what I didn’t understand. The problem as you described it seems like either a non-issue or something very few people (who have already been using Telegram for some time) would care about. I don’t understand the scenario that would pose a problem for the user. The moment some account legitimately gains access to some chat is probably what should trouble you instead.