It’s clear that companies are currently unable to make chatbots like ChatGPT comply with EU law when processing data about individuals. If a system cannot produce accurate and transparent results, it cannot be used to generate data about individuals. The technology has to follow the legal requirements, not the other way around.

  • Xhieron@lemmy.world · 7 months ago

    The technology has to follow the legal requirements, not the other way around.

    This is something that really needs to be taught better, at least in the US.

    GDPR doesn’t mean that LLMs are forbidden in the EU, but it does mean that the companies that create them may be liable for damages. That said, the damages must be real. Actual damages are somewhat cut and dried (e.g., ChatGPT publishes defamatory information about you, and someone relies on it to your detriment), but GDPR also contemplates damages for distress (e.g., emotional).

    If that’s true then the legal requirements will have to be changed …

    I think this position needs to be rejected in the strongest possible terms. Our response to any emerging technology should not be “It’s too good not to have, so who cares if people lose their rights?” The right to privacy, and with it the right to control one’s likeness, name, and personal data, is a much easier right to conceptually trade away than, say, the right to bodily integrity, but I think we’ve seen enough dystopian sci-fi at this point to understand where the intersections might lie between other rights and correspondingly miraculous technologies. [And after all, without the combustion engine we probably wouldn’t be staring down the barrel of climate change right now.]

    Should we, for instance, do away with the right to bodily integrity if it means everyone gets chipped shortly after birth? [The analogy to circumcision is unintentional but not lost on me.] After all, the chips mean that we can locate missing and abducted children easily and at trivial cost. They also mean that we no longer need to carry money or proxies for money. Crime is at an all-time low. Worth it, right? After all, the procedure is “minimally invasive.”

    The point is, rights have to be sacrosanct. They need to be the first consideration, and they need to be non-negotiable. If a technology needs those rights to bend or give way in order to exist, then it should not exist. If it’s of sufficient benefit to society, then it can be made to exist in a way that preserves those rights, and those who are unwilling to create it in such a way should suffer the sanction of law.