Generative AI: How Does it Work and Is It Bad?
One of the things that is new in the world of computing and the world generally in 2025 is generative AI, commonly just called AI, in the parlance of our times. Some people love it, some use it with some reservation and/or caution, and some decry it loudly. I'm in the middle camp.
"Any sufficiently advanced technology is indistinguishable from magic." - Arthur C. Clarke
If you have used AI at all, then you know that it can feel like magic. While I cannot speak for you, I find the notion of magical things deeply freaky. So, here is how it works in general. Developers of these systems feed them large amounts of data. This will be things like articles and books, images, code, and discussions about code. And in most cases, they will use information that they can lay their hands on, and not necessarily information that they legally own or license.
This itself raises interesting and problematic issues. The systems "read" and analyse all of this for patterns, but they are not just spitting out the consumed information on request. Instead, they build up a neural network of connections between elements, be they text, sound, or images. This is similar to what we do when we consume information and develop an understanding. We read and view all sorts of things that we don't have a legal right to reproduce, but it shapes our understanding and knowledge. We synthesise information, and we call the result understanding and knowledge.
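To make the "patterns, not copies" idea concrete, here's a toy sketch. It is nothing like a real neural network - it just counts which word tends to follow which in a tiny made-up corpus, then generates new text from those learned connections rather than replaying the original sentence:

```python
import random
from collections import defaultdict

# A tiny "training corpus" standing in for the mountains of text
# real systems consume.
corpus = "the cat sat on the mat and the cat slept on the mat"

# Learn the connections: for each word, record what followed it.
follows = defaultdict(list)
words = corpus.split()
for current, nxt in zip(words, words[1:]):
    follows[current].append(nxt)

def generate(start, length=6):
    """Produce new text by sampling from the learned patterns."""
    word = start
    out = [word]
    for _ in range(length):
        if word not in follows:
            break
        word = random.choice(follows[word])  # pick a word seen after this one
        out.append(word)
    return " ".join(out)

print(generate("the"))
```

The output is new (and often nonsense), but every transition in it was learned from the source text - which is the basic shape of the copyright question, scaled down to a dozen words.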
Writers read a lot of books, musicians listen to a lot of music, filmmakers watch a shit load of films, and coders read a lot of other people's code. One key difference is scale. What is a lot to a computer is much more than a lot is to a person. Much, much more.
Another difference - and I don't think this is a small thing - is that there are limits on how humans can deploy and monetise this knowledge. An AI that has been trained can be widely deployed at a massive scale. But I think that it is also worth keeping in mind that many people - especially in the technology sectors - have become obscenely rich by synthesising information and using it to generate things.
AI code assistants - like Cursor, which I use - are so powerful because they work on two levels. They are trained on information about how languages work and learn from the most common descriptions of how specific elements work. If you write code in a way that conflicts with these descriptions, they will help you bring your code in line with these ideals. The second thing these models do is observe high-level problem-solving concepts and incorporate them into their suggestions. The result is unnervingly close to magic a lot of the time. Even when you are coding something they haven't seen before, they can still apply the general concepts of coding.
AI image generation is even harder to comprehend, at least for me. The model captures the key concepts from our input and tries to map the connections between them. Then it creates what is essentially an image that is just noise and compares it to the model of what it should be. Then it refines the image (and I don't understand at all how it does this) before evaluating it against the model it has of what it should represent. Then it repeats this process again and again. Once again, it is using the patterns from millions of images that it doesn't own. The patterns tell it that the sky is normally above the ground, that people have two legs, and - for some reason - that the number of human fingers is highly variable.
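That refine-and-compare loop can be sketched in a few lines. This is a huge simplification - real diffusion models use a trained neural network to predict and subtract noise, whereas here the "model of what it should be" is just a hard-coded list of numbers - but the shape of the process is the same: start from noise, nudge toward the target, repeat:

```python
import random

# Stand-in for the learned concept (e.g. "sky above, ground below").
target = [0.1, 0.5, 0.9, 0.5, 0.1]

# Start from pure noise, as described above.
image = [random.random() for _ in target]

# Refine again and again: each step moves every value a little
# closer to what the model "thinks" it should be.
for step in range(50):
    image = [
        pixel + 0.2 * (goal - pixel)  # small nudge toward the target
        for pixel, goal in zip(image, target)
    ]

# Evaluate how close the refined image is to the target concept.
error = sum(abs(p - g) for p, g in zip(image, target)) / len(target)
print(f"average error after refinement: {error:.4f}")
```

After fifty small nudges, the noise has essentially converged to the target. The magic in the real thing is that the "nudge" direction comes from a network trained on those millions of images, not from a stored answer.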
But it basically does what we do. It observes relationships and creates things using those relationships.
So, is it good or bad?
Copyright is an interesting issue. These things are trained on text and images that the companies in question do not have the right to reproduce, for the most part. However, copyright protects the actual creation, not the patterns evident in the creation. You could write a blog post using the same grammar and basic ideas as this post without infringing on my work legally and - I would argue - morally. In fact, I came up with this text through my experience with products and by reading what others have written, but I don't have the legal right to reproduce any of their work verbatim. Yet the scale of a machine doing it might make it morally different.
Is it deskilling people? Perhaps. I would argue that we augment people's skills all the time. I almost never screw things together using a screwdriver, choosing to use a power drill instead. Likewise, I use power saws, and I drive a car where I don't have to change gears. I don't choose the path that my emails take. I can build a simple circuit and program with an Arduino, but I couldn't build a computer from sand and metal to save my life. We become skilled in new things and lose skills we used to need. Again, I am not sure.
Is it going to put everyone out of work? A hundred years ago, many people worked on farms, but large farm machinery has dramatically reduced the need for farmhands. There are machines that shake olive trees so that the harvest is done in a day by a handful of people. My mother-in-law used to walk messages around the city for the company she worked for. Some of this same work is now done by couriers using bikes, but a lot of it is email or other electronic systems. A generation ago, people developed film and printed photos, but those jobs are gone. It's alarming as fuck, but jobs come and go. I can see a time when an AI might mean we need fewer teachers. Students might have individual lessons, based on what they can show they currently know and understand. And the teacher's role might be to help them along this individual path.
I made this image using AI. Is it taking a task away from a human artist? Not in this case, because this is too trivial for me to ask a person to take time to do it. But could AI be used to replace some human artistic work? Certainly, and that's tough on artists. When I code using AI, I can code twice as fast. Could this drive down the cost of getting things made? This is good news if you want to pay someone, and bad news if you want to get paid.
Does it use a lot of power? Yes, far too much. And, in turn, water to cool down the computers. Much like crypto, it's a massive consumer of energy.
So, at the end of all of this, I hope you know a little more. I've no doubt missed things, and perhaps you disagree. I'd love to hear about it in the comments.
Post Script: My brother-in-law, Darryn, pointed out this article, which is much more in-depth and thoughtful, if you want to read more.
10:05 am, January 16, 2025