The head of Telegram, Pavel Durov, has been charged by the French judiciary with allegedly allowing criminal activity on the messaging app, but avoided jail after posting €5m bail.

The Russian-born multi-billionaire, who has French citizenship, was granted release on condition that he report to a police station twice a week and remain in France, Paris prosecutor Laure Beccuau said in a statement.

The charges against Durov include complicity in the spread of sexual images of children and a litany of other alleged violations on the messaging app.

His surprise arrest has put a spotlight on the criminal liability of Telegram, the popular app with around 1 billion users, and has sparked debate over free speech and government censorship.

  • kungen@feddit.nu · 4 months ago

    > certain public features of Telegram that do allow you to report illegal materials have been used to spread them.

    I don’t understand, what do you mean? Does clicking “report” on a message not simply send a report to moderators only?

    • graphene@lemm.ee · 4 months ago

      I’m saying that Telegram’s moderators are not moderating stuff they should be moderating and that they have admitted they should be moderating. I know that it’s not their fault, it’s the small size of the team compared to almost a billion monthly active users, but still.

      • GnuLinuxDude@lemmy.ml · 4 months ago

        > I know that it’s not their fault, it’s the small size of the team

        This part is directly Telegram’s fault. If they cannot keep up with their moderation queue, then they need a bigger moderation team. Preferably a properly remunerated one. There are news reports about how Facebook’s sub-contracted moderators work for extremely shitty companies that track them by how many reviews per minute they do, which causes extreme psychological damage to the workers, both from the extreme content they have to see as part of their jobs and from the bad working conditions they must put up with.

        • graphene@lemm.ee · 4 months ago

          Yes, basically every corporate social media site needs more moderators. A single person can barely moderate 200K users (cohost), so a platform with 900 million should probably have a trust and safety team much larger than the reported 30 or 60 people (a figure Durov never confirmed).