Awash in mediocre content
Mediocre content might not be such a bad thing if you use it to your advantage.
One of the things I've been thinking about with the latest generative AI (LLMs) is that they are very good at being mediocre, and that has some interesting consequences.
When you play around with ChatGPT and the like, you notice that they do a remarkable job at producing content in general. When you try to get them to produce content on a subject you know well, however, they become less impressive. They can still do it, but the output is generally just okay and often has many flaws. You notice this in your area of expertise because you know the things that separate average work from great work.
Of course, generative AI will only get better over time, but for now this means that, in most subjects, you can assume the content it produces is mediocre. Hence, if you imagine millions of people using these generative AIs, we'll be awash in mediocre content soon enough, if we aren't already.
For many people, this might sound like a bad thing. And sure, there are definite downsides to easy, cheap, mediocre content being everywhere, but another way to look at it is that it raises the floor for anyone who is below mediocre in a particular subject.
For example, let's say you are a marketer at a company, you've been playing around with ChatGPT and the like, and you find that it just doesn't cut it for your marketing work, so you write it off as not being helpful for your job. However, as a marketer, your skills in SQL are probably not classified as “above average.” In fact, I'd guess most marketers hear the phrase SQL, take off in the opposite direction, and make a note to themselves never to talk to that person again.
But there's a lot you might be able to do better in marketing if you knew even average SQL. One indication that this might be the case is to ask yourself how many times the marketing department has to ask the Business Intelligence department questions about data. If you are asking the BI team for something, you are essentially saying that SQL is valuable to know. This is especially true since, unless you are asking about something the business executives are also curious about, I bet you can expect a response from the BI team somewhere between a couple of months from now and never.
And this is why I think mediocre content generation is going to be a huge benefit. It can give someone like our hypothetical marketer the ability to write average SQL and therefore get answers to their questions in less time than it would take to get a response back from the BI team telling them to beat it.
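To make the idea concrete, here's a minimal sketch of the kind of "average SQL" our hypothetical marketer might produce with an AI assistant's help. The table name, columns, and data are entirely invented for illustration; it uses Python's built-in sqlite3 so it runs anywhere without a real database.

```python
import sqlite3

# A hypothetical signups table -- the schema and rows are invented
# purely to illustrate the kind of question a marketer might have.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE signups (campaign TEXT, signup_date TEXT, plan TEXT);
    INSERT INTO signups VALUES
        ('spring_email', '2023-03-01', 'free'),
        ('spring_email', '2023-03-02', 'paid'),
        ('search_ads',   '2023-03-01', 'paid'),
        ('search_ads',   '2023-03-03', 'paid');
""")

# The sort of question that might otherwise go to the BI team:
# how many signups, and how many paid signups, per campaign?
query = """
    SELECT campaign,
           COUNT(*) AS total_signups,
           SUM(CASE WHEN plan = 'paid' THEN 1 ELSE 0 END) AS paid_signups
    FROM signups
    GROUP BY campaign
"""
for campaign, total, paid in conn.execute(query):
    print(campaign, total, paid)
```

Nothing here is advanced: a GROUP BY, a COUNT, and a conditional SUM. That's exactly the "mediocre" level of SQL that turns a months-long BI ticket into a five-minute answer.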
Multiply this general concept out across all industries and jobs and you can start to see that mediocre content might not be all that bad after all. Another interesting thing, if you play this out a bit, is that it almost eliminates “beginners.” When you are just starting to learn a new subject, you don’t know most things in that subject, which makes it tough to do much with it. You kind of have to grind away until you get to a “good enough” (aka mediocre ;) point before you can do much with it.
I remember when I first taught myself to program, it took about two years before I could actually get a job doing it, and even then at a mediocre level at best. Now, with generative AI, you can pick up a subject and be mediocre right from the start. We’re still in very early days for what this might mean, but it’s pretty exciting to think that we can all be pretty good in a lot of subjects we might otherwise have looked at as insurmountable.
Most people are specialists; therefore, most people are really good at some things and really bad at most other things. You now have the capability of being mediocre in a bunch of other things, which, ironically, might make you even better at the things you specialize in. And remember, these systems will only get better, so what is mediocre today is likely to be above average in the future. Plan accordingly.
Eventually, I’d like to explore the consequences of what happens when everyone uses AI both to generate content and to summarize the content that is generated. There’s something interesting there. But for this post, I just wanted to explore something else.
There’s actually a similar concept called the Gell-Mann Amnesia effect: you notice the flaws in a newspaper article written about your area of expertise and dismiss it as uninformative, but as soon as you turn to another article on a subject you don’t have any expertise in, you forget this and take that article as something that could be informative.
It has shown the ability to produce expert level answers in some contexts but I’m still slightly skeptical on these since they are mostly done on exam-style tests. Tests are great and all, but they aren’t a true reflection of reality and the things you deal with as an expert in a particular area. That said, I’m sure these things will only get better and better as time goes on.