Concerned About the Use of Generative AI in Games?

Are you concerned about the use of generative AI by game developers?

  • Yes, it will decrease the quality of games - Votes: 10 (19.2%)
  • Yes, it will impact people's jobs / shift the industry - Votes: 13 (25.0%)
  • Yes, but it is too early to say whether the net result will be negative - Votes: 21 (40.4%)
  • No, it will increase the quality of games - Votes: 8 (15.4%)
  • No, it will improve people's jobs - Votes: 3 (5.8%)
  • No, it won't substantially change anything - Votes: 3 (5.8%)
  • I don't have any opinion for or against it / don't know enough about generative AI to say - Votes: 6 (11.5%)
  • Total voters: 52
Ok, think about the whole 2.5 centuries of industrialisation and take your pick. That example is the first that came to mind, but it's the same across hundreds of others.
Humans lost their jobs to so many new technologies. Have you been to any modern factory?
We used to need hundreds of workers. I went to one that had fewer than 20, and everything else had been automated. This was over a decade ago.
In the 16th century everything was done by hand. Then machines came along and took people's jobs, but you wouldn't have a computer to write on RPGWatch with if not for all these advances.
You really don't see the difference? There's no creativity in fabrication. It's a rote job, done as accurately and quickly and safely as possible, over and over. There's no art, and barely any craft. Machines could replace humans in those jobs because machines can do things more accurately and quickly, nobody cares about their safety (just their maintenance), and machines don't get bored or take smoke breaks.

With AI, we're talking about using non-thinking, non-creative machines to take the place of craftsmen and artists. We're also talking about those computers doing their work with raw material that is not steel and aluminum and silicon, but the creative work done by the very people the computers are replacing.
 
Joined
Aug 31, 2013
Messages
4,936
Location
Portland, OR
The AI is parroting at the moment because it's new, and even then, it's no longer fully parroting the way it was a couple of years ago.
Wait a decade or two and things will change yet again.
AI will always just parrot, unless it becomes actual artificial intelligence, sentient machines, and at that point there will be an entirely new host of problems.
 
Joined
Aug 31, 2013
Messages
4,936
Location
Portland, OR
Ok, think about the whole 2.5 centuries of industrialisation and take your pick. That example is the first that came to mind, but it's the same across hundreds of others.
To add to what @JFarrell71 just wrote, I also wonder if there's any improvement in the result, or for the artist, that justifies this change. It seems the only incentive is for the user to save money, but the result will be average and without any spark of creativity. But I'm not excluding prior evolutions of technology that followed the same pattern, of course.

The AI is parroting at the moment because it's new, and even then, it's no longer fully parroting the way it was a couple of years ago.
Wait a decade or two and things will change yet again.
We'll cross that bridge when we get there. Right now, we only have a system that learns and copies or mixes what it has previously seen.

In a way, it's funny: it's also what we humans do, since our brains learn by imitating. But our creative process involves years of experience across many domains, not only the art itself (painting, drawing, singing, acting, ...). The creative process also goes further: good artists don't spend years watching others' work only to limit themselves to imitating what they've seen. They probably start like that, but they develop their own tricks, their own style, their own themes. The result comes from all of that, drawing from years of personal history (pun not intended).

It would require a tremendous amount of work to train a machine to do the same, so even if it succeeded, would anyone spend the money on it, just for themselves? The only way they could afford it is by doing what people do now: training one system for everyone - and it already costs several million dollars for the current, limited systems we know. So we're circling back to a single source producing art for all the projects using that technology, with everything looking the same, as if a single artist were employed on every project, by everyone.
 
Joined
Aug 29, 2020
Messages
10,391
Location
Good old Europe
You really don't see the difference? There's no creativity in fabrication. It's a rote job, done as accurately and quickly and safely as possible, over and over. There's no art, and barely any craft. Machines could replace humans in those jobs because machines can do things more accurately and quickly, nobody cares about their safety (just their maintenance), and machines don't get bored or take smoke breaks.

With AI, we're talking about using non-thinking, non creative machines to take the place of craftsmen and artists. We're also talking about those computers doing their work with the raw material not of steel and aluminum and silicon, but the creative work done by the people the computers are replacing.
The artisans making furniture were all craftsmen.

They had their trade guilds, and they were all seen as artists. Most of those things still exist, just in tiny quantities, and they cost a fortune. 90% of what was made was not of amazing quality, which is why machines could replace the makers: the products were derivative works of previously existing ones.

The super-high quality work will remain in the hands of humans, which will be mostly the design work. The rote work will disappear, because if an AI can do it, then just like factories, it just means they are derivative pieces of work in the end.
 
Joined
Nov 13, 2006
Messages
9,195
Location
Manchester, United Kingdom
AI will always just parrot, unless it becomes actual artificial intelligence, sentient machines, and at that point there will be an entirely new host of problems.
Yes, and no. Most humans' 'creative' work is derivative of prior works. Most human 'anything' is derivative of something else. We didn't build today's computers from scratch; they are the result of slight evolutionary improvements over decades.

There are only a select few artists, scientists, ... anything in the world that really create anything new.

AI will take 10bn source points and create something new from them. Is that parroting? Well, just as much as painters today taking their painting techniques from the grand masters of Flanders and Italy.
 
Joined
Nov 13, 2006
Messages
9,195
Location
Manchester, United Kingdom
To add to what @JFarrell71 just wrote, I also wonder if there's any improvement in the result, or for the artist, that justifies this change. It seems the only incentive is for the user to save money, but the result will be average and without any spark of creativity. But I'm not excluding prior evolutions of technology that followed the same pattern, of course.
What spark of creativity?
Do you think that most 'creative' people actually create new things?

How many TV shows have you watched, or books read, where you knew exactly what would happen next? That's because most are not creative. They use the same storytelling techniques that have been used for millennia. There are maybe one or two shows a year that truly offer some unique insight into something.

We'll cross that bridge when we get there. Right now, we only have a system that learns and copies or mixes what it has previously seen.

In a way, it's funny: it's also what we humans do, since our brains learn by imitating. But our creative process involves years of experience across many domains, not only the art itself (painting, drawing, singing, acting, ...). The creative process also goes further: good artists don't spend years watching others' work only to limit themselves to imitating what they've seen. They probably start like that, but they develop their own tricks, their own style, their own themes. The result comes from all of that, drawing from years of personal history (pun not intended).

It would require a tremendous amount of work to train a machine to do the same, so even if it succeeded, would anyone spend the money on it, just for themselves? The only way they could afford it is by doing what people do now: training one system for everyone - and it already costs several million dollars for the current, limited systems we know. So we're circling back to a single source producing art for all the projects using that technology, with everything looking the same, as if a single artist were employed on every project, by everyone.
The AI training methods are completely new compared to our multi-thousand-year history as humans, and in a decade or two we have gone from barely being able to write a program that can beat someone at chess to a tool that can be used in thousands of facets of life, including image generation and image detection.

Had you asked someone about this in 1990, they would have said you watch too many movies.

Obviously the cost of these things is humongous, but so was the cost of the first computers.

(attached image: 1710711347934.png)

$400k in 1945 would be about $7M today. You can now buy a laptop that outperforms this computer over a trillion-fold for less than $1k.

Obviously, new tech costs a lot of money.

Solar energy costs dropped by about 80% over the past decade or so.
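As a rough sanity check of the inflation claim above, the $400k-to-$7M conversion corresponds to a cumulative inflation multiplier of about 17.5x from 1945 to today; that multiplier is an assumed, illustrative figure chosen to match the quoted $7M, not an official statistic:

```python
# Rough sanity check of the cost comparison above.
eniac_cost_1945 = 400_000          # reported 1945 build cost, in 1945 dollars
cpi_multiplier = 17.5              # assumed cumulative inflation factor, 1945 -> today
adjusted_cost = eniac_cost_1945 * cpi_multiplier

print(f"${adjusted_cost:,.0f}")    # -> $7,000,000
```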
 
Joined
Nov 13, 2006
Messages
9,195
Location
Manchester, United Kingdom
What spark of creativity?
Do you think that most 'creative' people actually create new things?

How many TV shows have you watched, or books read, where you knew exactly what would happen next? That's because most are not creative. They use the same storytelling techniques that have been used for millennia. There are maybe one or two shows a year that truly offer some unique insight into something.
Not everything is original, I'll grant you that, especially when we see so many remakes and remasters. :D But at least some of it has some originality. That's why I like indies, and why I fear the use of tools that could kill any originality.

The AI training methods are completely new compared to our multi-thousand-year history as humans, and in a decade or two we have gone from barely being able to write a program that can beat someone at chess to a tool that can be used in thousands of facets of life, including image generation and image detection.
You've got a good point; technology will likely evolve enough that this process will be less costly in decades. Maybe we can even extrapolate from what it currently takes to train an AI to estimate when.

How do you give that system a unique experience, though? It seems like a lot of effort; why would they bother when there's something available today, right now, that people are already using in its current state? But maybe, just maybe, we'll have a decade or two filled with more and more plagiarism and an uncomfortable shift in art jobs, followed by a renaissance of sorts when AI gets much better.

Or we'll simply have adapted to the fast-food equivalent of the craft industry, like we did with mass-produced furniture, glassware, and crockery.

Either way, culture will have removed humans from the process. Doesn't that sound a little odd? But I'm sure we'll find something else interesting to do. ;)
 
Joined
Aug 29, 2020
Messages
10,391
Location
Good old Europe
@Pladio ENIAC wasn't the first public computer. That's a common misconception. The two "professors" actually stole ideas from John Atanasoff at Iowa State University, who was more interested in physics calculations than in having any business sense about what he was actually doing. At least one of the ENIAC profs visited Atanasoff and took ideas.
 
Joined
Dec 29, 2023
Messages
180
Location
United States
@Pladio ENIAC wasn't the first public computer. That's a common misconception. The two "professors" actually stole ideas from John Atanasoff at Iowa State University, who was more interested in physics calculations than in having any business sense about what he was actually doing. At least one of the ENIAC profs visited Atanasoff and took ideas.
Point stands.
New tech costs a lot....
 
Joined
Nov 13, 2006
Messages
9,195
Location
Manchester, United Kingdom
Not everything is original, I'll grant you that, especially when we see so many remakes and remasters. :D But at least some of it has some originality. That's why I like indies, and why I fear the use of tools that could kill any originality.


You've got a good point; technology will likely evolve enough that this process will be less costly in decades. Maybe we can even extrapolate from what it currently takes to train an AI to estimate when.

How do you give that system a unique experience, though? It seems like a lot of effort; why would they bother when there's something available today, right now, that people are already using in its current state? But maybe, just maybe, we'll have a decade or two filled with more and more plagiarism and an uncomfortable shift in art jobs, followed by a renaissance of sorts when AI gets much better.

Or we'll simply have adapted to the fast-food equivalent of the craft industry, like we did with mass-produced furniture, glassware, and crockery.

Either way, culture will have removed humans from the process. Doesn't that sound a little odd? But I'm sure we'll find something else interesting to do. ;)
100%, this is going to hurt lots of people and make life harder for many.
I'm saying that in the long run, decades from now, my belief is that AI will simply be a tool like many others that have also displaced people's jobs.

New things will come, new problems, new requirements for the world.
 
Joined
Nov 13, 2006
Messages
9,195
Location
Manchester, United Kingdom
100%, this is going to hurt lots of people and make life harder for many.
I'm saying that in the long run, decades from now, my belief is that AI will simply be a tool like many others that have also displaced people's jobs.
I'm only looking at the present because I don't think the future will unfold gently on its own. As they say, hope for the best, be prepared for the worst (and amazed by everything in between).

This topic raises interesting questions. Something I completely overlooked is the use of generative AI to help people, like @Qayto reported earlier. So there's definitely something worthwhile in that technology. And, as with all technologies, there are bad uses too.
 
Joined
Aug 29, 2020
Messages
10,391
Location
Good old Europe
I see that the EC is already addressing the issue along with other AI-related problems, thankfully.

The European AI Office, established in February 2024 within the Commission, oversees the AI Act’s enforcement and implementation with the member states. It aims to create an environment where AI technologies respect human dignity, rights, and trust. It also fosters collaboration, innovation, and research in AI among various stakeholders. Moreover, it engages in international dialogue and cooperation on AI issues, acknowledging the need for global alignment on AI governance. Through these efforts, the European AI Office strives to position Europe as a leader in the ethical and sustainable development of AI technologies.

The current topic would belong to the 'limited risk' category:
Limited risk refers to the risks associated with lack of transparency in AI usage. The AI Act introduces specific transparency obligations to ensure that humans are informed when necessary, fostering trust. For instance, when using AI systems such as chatbots, humans should be made aware that they are interacting with a machine so they can take an informed decision to continue or step back. Providers will also have to ensure that AI-generated content is identifiable. Besides, AI-generated text published with the purpose to inform the public on matters of public interest must be labelled as artificially generated. This also applies to audio and video content constituting deep fakes.
 
Joined
Aug 29, 2020
Messages
10,391
Location
Good old Europe
I'm only looking at the present because I don't think the future will unfold gently on its own. As they say, hope for the best, be prepared for the worst (and amazed by everything in between).

This topic raises interesting questions. Something I completely overlooked is the use of generative AI to help people, like @Qayto reported earlier. So there's definitely something worthwhile in that technology. And, as with all technologies, there are bad uses too.
I have a friend who works in AI and uses it for almost everything now, both in his professional life and personal life.
For example, his boss asked him to prepare a report - so he asked ChatGPT to do it for him and didn't need to spend 5 hours doing something manually.
He gets ChatGPT to draft code for him and then he only has to review it and make minor edits to get it the way he needs it to be.

In his personal life, he uses it to simplify daily things too. He uses it as a tool that helps him do everything he needs.
 
Joined
Nov 13, 2006
Messages
9,195
Location
Manchester, United Kingdom
I have a friend who works in AI and uses it for almost everything now, both in his professional life and personal life.
For example, his boss asked him to prepare a report - so he asked ChatGPT to do it for him and didn't need to spend 5 hours doing something manually.
He gets ChatGPT to draft code for him and then he only has to review it and make minor edits to get it the way he needs it to be.

In his personal life, he uses it to simplify daily things too. He uses it as a tool that helps him do everything he needs.
Good for him if that makes him more productive. More and more people seem to embrace AI as a software development tool. The StackOverflow 2023 survey added AI questions, and the results showed 44% were already using it and 25% were planning to use it soon. Only 29% were neither using it nor planning to use it (out of 89k responses). 77% were favourable to using AI in the scope of software development, which is a lot.

I was surprised that so many people were using it; I think a good part of it is hype. Personally, I have no ethical objection to it; I even think it could be useful for detecting problems (pointing out simple mistakes, for example), especially with technology becoming more and more complicated. But I don't trust it, and I wouldn't like it to write code for me. Maybe part of that is pride, but another part is that finding the right algorithm and writing it is something I like to do and that brings me satisfaction.

I did a few tests, though, and found out it was still far from reliable (and not very helpful). Note that the same survey showed that less than 3% highly trusted AI and 39% somewhat trusted it.

Some companies forbid the use of AI tools. They even refuse to let their developers use IDEs that have AI plugins, which backfired on JetBrains recently because the AI plugin was installed by default and could only be disabled, not removed entirely. They quickly backtracked, and the next release will make it entirely optional. Not sure if it's just a vocal minority or something more serious.

Funny how people are divided about it.
 
Joined
Aug 29, 2020
Messages
10,391
Location
Good old Europe
The point my friend was making is that the design is done by him; he essentially tells the AI to put the code together based on pseudo-code. Most of the decisions are made by him.
Ah OK, so he gives some general idea and lets the AI fill in the details? That's interesting. What I tested, and what I've usually seen, was using a chat window to tell the AI what to write or explain, not transforming pseudo-code into real code. Maybe I'll try it just to see what it does.
 
Joined
Aug 29, 2020
Messages
10,391
Location
Good old Europe
Ah OK, so he gives some general idea and lets the AI fill in the details? That's interesting. What I tested, and what I've usually seen, was using a chat window to tell the AI what to write or explain, not transforming pseudo-code into real code. Maybe I'll try it just to see what it does.
He uses the GPT-4 API, though, not the standard chat window. It has a lot more going for it.
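For readers wondering what "using the API" looks like in practice, here is a minimal sketch of how a pseudo-code-to-code request might be assembled, following the OpenAI chat-completions request schema; the model name, prompt wording, and pseudo-code are illustrative assumptions, not details from the post, and actually sending the request would require an API key and an HTTP client (or the official SDK):

```python
import json

# Hypothetical pseudo-code the developer might hand to the model.
pseudocode = """\
for each order in orders:
    if the order total exceeds the threshold, flag it
return the flagged orders
"""

# Assemble an OpenAI-style chat-completions request body.
# The system message pins down the task; the user message carries the input.
request_body = {
    "model": "gpt-4",  # illustrative model name
    "messages": [
        {"role": "system",
         "content": "Translate the user's pseudo-code into idiomatic Python. "
                    "Return only code."},
        {"role": "user", "content": pseudocode},
    ],
    "temperature": 0.2,  # low temperature keeps generated code more deterministic
}

# In a real call, this body would be POSTed to the chat-completions endpoint.
print(json.dumps(request_body, indent=2))
```

The practical difference from the chat window is exactly what the post describes: the system prompt, temperature, and input format are under the developer's control, so the design decisions stay with the human.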
 
Joined
Nov 13, 2006
Messages
9,195
Location
Manchester, United Kingdom
After watching a few videos, experimenting, and reading a little about it, I think the general idea is:
  • Do not use AI assistants to generate source code. You'll spend more time looking for bugs, debugging, and refactoring the code than if you had written it yourself. It's also harder to maintain, erodes the developer's skills, and raises legal issues (without even mentioning the ethical ones). See this article for some of those concerns, for example.
  • Never use AI assistants to generate test code. It's unreliable, doesn't integrate well into existing testing frameworks, and it's just irresponsible for something so critical.
  • Examining code, detecting problems, and possibly generating comments is OK. Small auto-completions are fine, too.
For fun, I tried both Copilot and JetBrains on real code I'm working on. JetBrains is still very bad; I'm amazed that they could even make people pay for something like that. Copilot was a little better but still unstable in VSCode (maybe it's just a problem with VSCode). The code it generated was easier to obtain than with JetBrains' AI assistant, and it even compiled and gave the correct results, but when I asked Copilot to do some refactoring, it broke the logic and introduced bugs. The code it initially wrote was quite poor, rookie level. Yet Copilot is recognized as one of the leading AI tools...

I suppose the AI agents/copilots behind those tools still need a lot of work before they're suited to production use, on real code and not just little demos. There's a gap similar to the difference between speed chess and chess, as Andrej Karpathy explained in his video (which I posted in another thread a few days ago).

This problem is arguably much more serious in the long term than lower-quality art used as game assets! Still, it'll be exciting to see where it goes. The issue, right now, is understanding how to use it.
 
Last edited:
Joined
Aug 29, 2020
Messages
10,391
Location
Good old Europe
@Redglyph

Hey, just for the record, I'm not into that Paul Wattson guy. I agree, he's annoying as fuck and talks shit all the time. I just keep getting linked to his vids cos I clicked one 10 years ago and the algorithm sucks! xD
 
Joined
Jul 10, 2007
Messages
3,006
Location
Australia