A new era of ethics

I recently wrote about one aspect of the advent of technology like ChatGPT, specifically in regard to the blurry legality of how it “learns” by scraping anything and everything online.

But I’ve been thinking recently about the ethics of using something like ChatGPT for various other purposes. I just wanted to touch on a few here that I’ve been noodling on.

Is it unethical, or otherwise acceptable, for a student to get ChatGPT to produce an essay instead of writing it themselves?
I can appreciate a few angles on this. First off, I wouldn’t call it unethical, per se. Against the spirit of what schooling is trying to accomplish, yes. But unethical? I wouldn’t say so.

Is it acceptable for a student to do this? I’m leaning toward no — that whole “against the spirit” viewpoint — but that question does get into a little more of a grey area for me. The purpose of asking a student to write an essay is for the teacher to gauge how well the student grasps, and can articulate, the topic that has been taught. That has for decades, perhaps hundreds of years, been a written process: Get yourself to a library to research written material and, for more advanced education, talk to whoever can offer insights on the subject; then, over days or weeks or months, write a convincing argument for a stance on the subject.

But now ChatGPT will do that for a student with just some keystrokes. You can literally ask it to write an essay on any subject, of whatever length you want, for whatever grade or education level you wish, and within seconds it will start producing that content. And it will keep going, whether for mere seconds or for minutes, until it has fulfilled your request.

Will it be perfect out of the gate? No. It will absolutely require going over and tweaking before it can be submitted. But 99% of the work, and potentially 99.999% of the student’s time, will have been saved by getting this computer system — free for anyone to use at any time — to do the vast bulk of the research and writing.

And make no mistake: It is doing research to write the content. At least in a manner of speaking. Going back to that initial link at the top of this post, I explain how ChatGPT (and other, similar “A.I.” technology) is scraping the internet for any and all written content it can find. That includes untold numbers of entire books. So if you type in a command for it to write a paper on the influence of the Italian Renaissance on modern artwork, it will literally gather information about the subject from written material. Just like a student would, only in fractions of a second.

All of this of course is antithetical to proving that the student has learned anything about the subject. A low-mark-earning fifth grader could use ChatGPT to supply you with an extremely advanced, and verifiably accurate, paper on quantum mechanics, just by typing in a string of criteria for ChatGPT to go with. The material handed in by an individual no longer means that the individual knows even the first thing about the topic. And you of course can’t award marks, let alone a degree, to someone who was supposed to have produced their own valid and convincing work in a field when they just had a computer do most of the work for them. But without also asking detailed questions about the process they undertook to write, say, a doctoral thesis, and seeing hard evidence of their rough work and note-taking, how do you know if the student has actually done any of the work at all?

The short answer is, other than making an educated guess — like being pretty certain that the low-mark-earning fifth grader handing in Einstein-level material about quantum mechanics hasn’t actually reached that level of understanding — you can’t know. Part of that is due to ChatGPT’s design, which is such that the same command criteria put in more than once, even by the same person, will produce completely different results. Gone now are the days when a teacher could check parts of an essay a student submitted against known websites offering essays-for-hire, where the student paid someone to send them a pre-made essay on any of myriad topics for myriad grade levels. Now teachers are unable to reproduce the results they’re seeing in essays, even if they know precisely the wording the student used on ChatGPT, because it will produce a completely different result every time it’s asked to write about the same criteria.

The only reasonable solution I’ve heard so far for getting around this problem is for schools to stop relying so heavily on written material produced outside of the school environment. Perhaps, as I’ve heard suggested, go back to more in-class, or at least in-person, oral exams. But even that only goes so far. If a doctoral candidate has traditionally had to develop and write up a thesis of their own, oral exams of course won’t cut it.

Suffice it to say, schooling is going to have to come to grips with, and learn to live with, the new reality of anyone being able to produce any written content they want on a whim. ChatGPT is a tool. And like any other tool, it can be used for good or bad (though it does have some boundaries in place, where it will try to avoid supplying output on known illegal activities).

Is it unethical, or otherwise acceptable, for someone to sell something ChatGPT produces?
This is another grey area for me, but here I’d lean more toward yes, it’s acceptable. But with the caveat that I don’t know for sure which industries will particularly care.

Here’s the thing: Part of the recent WGA strike was about writers wanting some assurance that their jobs won’t be at risk in favour of using systems like ChatGPT to produce written material. There were reports going around during the strike that some smaller companies made no bones about not even needing writers any more, because they could get what they need with the likes of ChatGPT. However, one thing such systems can’t do is produce long-term series content. ChatGPT can produce sitcom, medical drama, or police procedural pilot scripts all day long. But what it can’t do is take those same characters it just created and put them in episodes 2 through 20, or however long your season is, and evolve certain characters in certain ways. ChatGPT excels, again in part by design, at producing original one-shots of content. But many TV shows in particular tend to carry some storylines on from one episode to another. Characters fall in love, or one character slowly spins toward self-destruction as they contend with a progressing disease, etc. That’s not something ChatGPT can do.
Not yet, at least.
That is the kind of thing that TV companies (thankfully) still need writers for.

But that leads me to ask: If ChatGPT could successfully write original and ongoing TV series material, would people care? I don’t mean the people (like yours truly) who support the writers, don’t want to see their jobs go away, and would actively avoid supporting companies that killed those jobs to save some money. But aside from those people with any real interest in the well-being of the writers themselves, would the vast majority of people care how their content was made, provided they still got their entertaining shows? I don’t know that they would.

And that gets into other industries, as well.

If I sent a publisher a pitch for a completed book, summarizing the storyline, they’ve never before had reason to ask whether I was the one who wrote it. Now they do have that angle to consider. Because yes, ChatGPT will write a book of whatever length and on whatever subject you want, and ways have recently emerged to have it learn and mimic your own writing style, so the result seems more like it was written by you personally. But what I’m curious to see is whether they will ask that question. Because on the one hand, again, you can’t practically prove that someone didn’t write it. And on the other hand, companies would run the risk of offending anyone who had actually written the material on their own and was then confronted with, “Hold it, there, buddy… how do we know you actually wrote this?”

And again, I wonder if publishing companies will even really care. If I pitch them on a great idea for a book that’s already written, and it’s exactly the kind of thing they’re looking for now, and they ask to read it and with some light editing it’s basically good to go… are they going to balk over not knowing if it was written by a person or a machine? Or are they going to set that question aside — since, again, the answer is impossible to verify anyway — and just agree to publish it?

Ditto with the tabletop game industry. It’s worth an estimated $24 billion annually, and is projected to double that inside six years. That’s huge.

Here’s a fun fact: My best friend was playing around with ChatGPT and asked it to create a simple tabletop game and, further still, to run through some potential playtesting scenarios with it. And, just as with a term paper, it did so, starting within moments and finishing shortly afterward.

Hearing his results, I tried a similar thing: I went onto ChatGPT and asked it to create a tabletop game, this time using elements of the hugely popular games Settlers of Catan, Ticket to Ride, and Dominion. Not only did it do so effortlessly, but the detail it provided included a multi-step breakdown of each player’s turn: which actions a player took, and in what order, before their turn was completed. And then of course what the object of the game was, how points were accumulated, and what the goal and endgame were.

In seconds.

Understand, I’ve dabbled in some tabletop game design for years, including with the friend mentioned above. And it’s not hours or days but weeks and months of mulling over options in your head, then hammering through how any change would work and what it would alter about the operation and flow of the game, then trying out that change to see if it was better or worse than the version you’d come up with previously. And ChatGPT can now do all that work, to make a game the world has never seen before, faster than the time it takes you to fix yourself a snack.

An acquaintance who had made and successfully sold several tabletop games told me years ago, when I asked, that if he were to break down how much money he sold them for compared to how much time he’d spent making them, it probably wasn’t worth the effort. It was a labour of love for him more than income.

But what happens when someone can now sell a company a game that was produced entirely by ChatGPT and then maybe tweaked a bit by the person selling it? As with book publishers, would they care about who (or what) made the game? Or would they only be interested, as any business understandably would be, in whether or not it’s something they want to put their stamp on and make some money from? Further, what’s to stop any game publishing company from just going to ChatGPT themselves and making their own games, instead of relying on the efforts of people, of all things, to offer them new game ideas?

Now here’s the thing: I, for one, couldn’t in good conscience do any of the above. I couldn’t hand in a thesis I hadn’t written, or pitch a book or movie I hadn’t written, or a game I hadn’t designed, and take credit for any of it. I’ve got plenty of impostor syndrome happening in my life already. I don’t need the added stress of claiming that yes, I’d done the thing I was selling, when I actually hadn’t. It’s not quite stealing an idea or product from someone and claiming it as my own, but I would be claiming credit for work I hadn’t put in. It feels like a short-cutting cheat I’d be pulling on others.

But there are plenty of people out there who couldn’t care less about that part of it. Crank out a book on ChatGPT, edit it, then shop it around as something they wrote. Crank out a game, maybe massage a bit of the gameplay here and there so it works in real life, and offer it to game publishers. Crank out an action movie script that only needs some quick edits before it can be pitched to production companies, who are so often hungry for more and more content. There are, at the risk of sounding pessimistic but honestly just being realistic, more than enough people who would take advantage of this type of thing and would (perhaps will) laugh all the way to the bank, wondering why other people aren’t doing the same. People who don’t even consider caring about it. Hell yeah, of course it’s mine. You buying, or should I offer it to the next guy?

Bear in mind that publishing and producing is in large part a numbers game. The more people you approach about your book/movie/game, the more likely you are to find someone who will buy it. And the more often you can do that, the more frequently you’ll sell. It can take months or even years to write a book or movie script. But it would take ChatGPT minutes. And the result may not be brilliant, but it doesn’t need to be. It just needs to be good enough for someone somewhere to want it. Even at a relatively small buying price, when you’re working in bulk and just cranking this stuff out to sell again and again and again, that can add up to a lot of money, with zero effort spent writing and only time spent pitching it around. And between writers writing the material themselves, and users getting ChatGPT to crank another one out while they go make a sandwich… who’s got the edge in that numbers game?

And here’s the catch: If the book and game publishers also don’t care — and understand, I’m not saying they don’t or wouldn’t, just entertaining the possibility that they don’t — then why should the people selling them that kind of material? Why not sell them something they’ll readily buy? In which case, who wouldn’t be tempted to do the same thing and just sell them whatever they can?

I realize that my ethics in that regard may very well be hamstringing me financially. In working toward voiceover work, I already know there are certain companies, and of course certain individuals and political parties, etc., I would never work for. And I know that means less money at the end of the day. But while I’d definitely like more money, I need to be able to live with myself while making it. And that means sticking to my own personal code of ethics. I’d far rather scrape by but feel good about how I’ve done things than make a fortune doing something in a way I can’t respect myself for.

Having said all that, the WGA agreement, reached when the two sides settled their dispute a couple of weeks ago, does allow writers to use systems like ChatGPT to help with material they’re working on. That, it seems, was agreed upon by all parties: it’s a tool that, like all tools, can be used to make work faster and more efficient. So who knows? Maybe the new reality is that it’ll be totally acceptable to one and all to utilize content-creating systems to help get your written project from point A to point Z.

Depending on how we get accustomed to harnessing new technology like this, the wide grey area of using it ethically could at least get lighter.

That may be, like in so many aspects of life, the best we can hope for.