4 Comments
Apr 9, 2023·edited Apr 9, 2023

Dude, the blog is great and all, but I think this post is a bit too backwards-looking. Right now, sure, there aren't a lot of thick normal women at normal resorts captured with standard or older photography methods. But that is a business/legal/"moral" limitation, not a technical one. In other words, probably more images of that thick normal woman type exist privately than on the public net.

All the stuff like that that IS online, even the pro nudes, is being avoided/censored by Midjourney and even the Stable Diffusion devs in the newer main models bc they're PC pussies. But the modelling is getting better (soon), and so is contextual inference on images (I disagree that AI/GPT won't, or even NOW can't, tell me that the thick woman is likely at a cheap resort with her BF, etc., TECHNICALLY speaking. It can't do this now, again, bc these companies are fucking PUSSIES and handicap it from doing so).

Basically, I think what happens next is a bunch of the best AI code gets leaked, and people start doing way more open-source stuff with pooled hardware (or some illegal overseas servers are made open to the public and they turn a blind eye to usage). Then something like "the fappening reloaded" happens, a shit ton of people get their phones/accounts hacked, and billions of normal women's images get leaked to be used as training data. Hell, maybe some company even finds a way to promote female users by encouraging them to take more amateur-style photos not at home and submit them to be trained on, or even submit older ones they took years ago in college, etc. Also, not sure where you get the idea that it's that rare to begin with. One of the best genres of porn is normal women nude in public. I have no idea how much exists or is needed to train on, but tbf, how do you?

Photoshop, not sure about that, I can't speak to how their tools work, but they will probably fall behind HARD. My guess is they implement this AI stuff but maintain some PC BS about "standards" and begin monitoring tooling work done by customers (forcing all software online 24/7), and insta-blur or ban any deepfakes using their own counter "real woman detection" defensive AI software. I honestly think AI changes everything. I get the dislike for Valloids aka Silly Valley, but idk. I have used GPT-4 and Midjourney etc. and I fucking love the shit. It only gets better from here imo.

author

I don't think you got the central part of why "AI" is bad at creating certain outputs. And no.. having a larger and more diverse dataset does not fix the problem. Let me try to explain it again.

Consider nude professional models.. there are maybe fewer than 5 dozen classical poses. Most pro models have a very specific body type + makeup, and they are professionally photographed. They also have a very restricted range of facial features and proportions. Such data sets form a series of tight clusters. It is therefore very easy, with such a training set, to produce high-quality and photo-realistic output which is similar to the input.

So why does this not work well with much more diverse data sets? Because all current "AI" uses statistical models (with various degrees of iteration) to both process input and generate output. In contrast to the previous example, very diverse sets create much looser clusters, to the point where the output made using "AI" is of poor to mediocre quality and contains tons of visible issues.
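
Here is a crude toy sketch of what I mean (my own illustration, with made-up numbers, a single Gaussian standing in for the "statistical model", and random vectors standing in for image features):

```python
# Toy illustration: fit the same simple statistical model to a tight cluster
# vs. a diverse, spread-out set and compare how close its samples land to any
# real example. All data here is synthetic and only stands in for the idea.
import numpy as np

rng = np.random.default_rng(0)

def mean_distance_to_nearest_real(data, n_fake=1000):
    """Fit one Gaussian to `data`, sample from it, and measure how far each
    generated sample lands from the nearest real point (smaller = more believable)."""
    mean = data.mean(axis=0)
    cov = np.cov(data, rowvar=False)
    fake = rng.multivariate_normal(mean, cov, size=n_fake)
    dists = np.linalg.norm(fake[:, None, :] - data[None, :, :], axis=-1)
    return dists.min(axis=1).mean()

dim = 16  # pretend these are image features

# "Pro model" set: one tight cluster (similar poses, bodies, lighting)
tight = rng.normal(0.0, 0.3, size=(500, dim))

# "Everyday" set: many loose clusters (diverse bodies, poses, settings, phones)
centers = rng.normal(0.0, 5.0, size=(10, dim))
loose = np.concatenate([rng.normal(c, 1.5, size=(50, dim)) for c in centers])

print("tight set  :", mean_distance_to_nearest_real(tight))  # small: output resembles the input
print("diverse set:", mean_distance_to_nearest_real(loose))  # large: output falls between clusters
```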

But, you might say.. wouldn't training it further, and on ever-larger sets, fix the problem? The answer is NO, and here is why: training further on an already diverse data set will make the model ignore or lose some info in the loose clusters. So, for example, it might get a bit better at creating images of thin young brunette chicks taking mirror selfies with an iPhone. But it will no longer do even a mediocre job with busty or slightly thick blondes or redheads holding Android phones. If you tweak and train it another way, it will do something else a bit better at the cost of doing yet another thing more poorly.
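
Another crude toy sketch of that trade-off (again my own illustration; the cluster names and numbers are made up, and a tiny 2-component mixture stands in for a capacity-limited model):

```python
# Toy illustration: a capacity-limited model trained harder on the "popular"
# cluster scores better there and worse on the rarer clusters.
# Requires numpy and scikit-learn.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Three "types" of images reduced to 2-D features, one common and two rare
popular = rng.normal([0.0, 0.0], 0.5, size=(400, 2))   # mirror-selfie brunettes
rare_a  = rng.normal([8.0, 8.0], 0.5, size=(100, 2))   # busty redheads
rare_b  = rng.normal([-8.0, 6.0], 0.5, size=(100, 2))  # slightly thick blondes

def scores(extra_popular_copies):
    """Fit a 2-component mixture (too little capacity for 3 clusters) after
    oversampling the popular cluster, then report the average log-likelihood
    the model assigns to each cluster (higher = represented better)."""
    train = np.concatenate([popular] * (1 + extra_popular_copies) + [rare_a, rare_b])
    gm = GaussianMixture(n_components=2, random_state=0).fit(train)
    return {name: round(gm.score(x), 2)
            for name, x in [("popular", popular), ("rare_a", rare_a), ("rare_b", rare_b)]}

print("balanced training:      ", scores(0))
print("more 'popular' training:", scores(5))  # popular improves, the rare clusters get worse
```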

Now contrast this to something like Photoshop using much smaller AI-based plugins and filters. Let us say you want to create an image of a slightly thick and busty redhead taking nude selfies in the bathroom mirror. Well.. you could easily find multiple template images online with some of the features you want. Then the much smaller AI filter could just focus on generating a slightly or somewhat different face, making the boobs look a bit different, etc., and with a bit more post-processing you could generate a very believable and semi-original image.
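
An off-the-shelf inpainting pipeline already gives the flavor of such a "small AI filter" (a rough sketch, not an actual Photoshop feature; the checkpoint name, file paths, and prompt are placeholders I made up):

```python
# Rough sketch: regenerate only a masked region of a template image while
# leaving the rest of the photo untouched, which is the "local filter" part
# of the argument. Requires: pip install diffusers transformers torch pillow
import torch
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-inpainting",
    torch_dtype=torch.float16,
).to("cuda")

# Template photo, plus a mask that is white only over the region to redraw
# (e.g. the face) and black everywhere else.
template = Image.open("template.jpg").convert("RGB").resize((512, 512))
mask = Image.open("face_mask.png").convert("L").resize((512, 512))

result = pipe(
    prompt="a slightly different face, natural indoor lighting, realistic photo",
    image=template,
    mask_image=mask,
    num_inference_steps=30,
).images[0]

result.save("edited_template.png")
```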

That is how "AI" will actually "revolutionize" generic image processing- by making it easier to use Photoshop and similar programs. My point is that it won't be a revolution as much as a modest incremental advance.


Lots of stories that AI will kill humans. What are your thoughts?

author
Apr 6, 2023·edited Apr 6, 2023

Will address that dumb belief in the next part of the series. But the very short version is that "AI" will kill people only if silly valley morons use it in mission-critical applications.

For example- using "AI" to diagnose or treat patients will definitely kill a significant percentage, as will using it to make decisions about complex supply chain logistics.
