Basically a deer with a human face. Despite probably being some sort of magical nature spirit, his interests are primarily in technology and politics and science fiction.

Spent many years on Reddit and then some time on kbin.social.

  • 0 Posts
  • 487 Comments
Joined 4 months ago
Cake day: March 3rd, 2024


  • I suspect it’s probably too late for the Democrats to switch horses now, but I hope that this finally pile-drives some smidgen of a lesson into them: they need to actually try to win elections rather than just assume that, since they’re not crazy or evil like the Republican candidates, people will naturally prefer them. They need to pick candidates that the people like and put in the effort to run those candidates well.

    There’s still five months until the election, there’s plenty that Biden can do between now and then to show people that he’s a good choice. And then maybe this time once it’s all over the Democratic Party can do a major rethink of what exactly they’re doing with all this. Like they should have the last time they ran a candidate against Trump that they thought simply “deserved” to win.

  • Didn’t take companies long to stop pretending like they care.

    Of course they care. They care about what their customers think, because that’s where their money comes from. This is just how corporations work; if their customer base wanted those goals of theirs, they’d be behaving the opposite way.

    If you want corporations to change then convince them that they’ll make more money that way, by whatever means. Through customer preferences, regulations, etc. Don’t expect a corporation to “do what’s right because it’s right,” any more than you should expect a shark to “do what’s right.” It’s not designed that way.


  • There actually isn’t a downside to de-duplicating data sets; overfitting is simply a flaw. Generative models aren’t supposed to “memorize” stuff - if you really want a copy of an existing picture, there are far easier and more reliable ways to get one than giant GPU server farms. These models don’t derive any benefit from drilling on the same subset of data over and over. It makes them less creative.

    I want to normalize the notion that copyright isn’t an all-powerful fundamental law of physics like so many people seem to assume these days, and if I can get big companies like Meta to throw their resources behind me in that argument then all the better.


  • Remember when piracy communities thought that the media companies were wrong to sue switch manufacturers because of that?

    It baffles me that there’s such an anti-AI sentiment going around that it would cause even folks here to go “you know, maybe those litigious copyright cartels had the right idea after all.”

    We should be cheering that we’ve got Meta on the side of fair use for once.

    “look up sample recovery attacks.”

    Look up “overfitting.” It’s a flaw in generative AI training that modern AI trainers have done a great deal to resolve, and even in cases of overfitting it’s not all of the training data that gets “memorized” - only the stuff that got hammered into the AI thousands of times in error.