
Talk:Wikimedia Commons AI


Disadvantages, downsides and difficulties


I would like to know what might be the cons of Wikimedia Commons AI. What do you guys think? I would love to hear your feedback! S. Perquin (talk) 22:47, 19 January 2024 (UTC)

It's nearly impossible to approve a new project


It's nearly impossible to approve a new project, and doing so requires both godlike patience and overwhelming support. It took seven years to develop and approve the latest project, Wikifunctions, and three years to launch it. Even the most popular Wikiproject proposals have been in "development hell" for a decade or more. —Written by Dronebogus on the content page

The fact that it might take a long time or require substantial support before a new project is approved shouldn't be an excuse not to initiate one. It's better to start now than to postpone, or never start at all. After all, it does provide a solution to the ongoing debate about AI-generated media files within the Wikimedia Foundation. S. Perquin (talk) 08:50, 20 January 2024 (UTC)
I agree with Dronebogus. There are two current proposals (Wikispore and WikiJournal) which have had overwhelming support since 2019. Both are still open proposals, pending the development of a sister-project approval process. You are looking at almost a decade-long process from start to finish. OhanaUnited Talk page 19:58, 9 July 2024 (UTC)

Failing to fill a completely unique niche


This is wholly and obviously redundant to Commons, failing the most basic principle of new projects to fill a completely unique niche. —Written by Dronebogus on the content page

Why would this be unnecessary? It's a unique project, focusing on AI-generated media files. Currently, it's taken for granted that these are placed on Wikimedia Commons, but I actually find this quite odd. After all, they are a different kind of media from those made by humans. S. Perquin (talk) 08:50, 20 January 2024 (UTC)

Creating a content fork is a bad reason to start a new project


Creating a truly gargantuan w:wp:content fork is an extremely bad reason to start a new project. —Written by Dronebogus on the content page

What do you mean by "content fork"? Media generated by AI would be placed on Wikimedia Commons AI, and media created by humans on Wikimedia Commons. It's not a mirror site or anything. It's just for distinguishing between the two types of media. S. Perquin (talk) 08:50, 20 January 2024 (UTC)

A project based on a zeitgeisty new technology is not going to stand the test of time


A project based on a zeitgeisty new technology that you admitted may be demolished by legal issues in coming years is not going to stand the test of time— even if AI art is here to stay and survives both the populist loathing of Wikimedians, artists, and the general public and a hypothetical tsunami of copyright violation cases, it would then likely cease to be controversial or even just become a dead medium like 8-tracks or laserdisc. —Written by Dronebogus on the content page

I believe it's important to look ahead and anticipate the future. Of course, you can never know exactly how developments will unfold, but Wikimedia Commons AI should consider how to handle certain uploaded media. A database for media files generated by AI doesn't mean that everything is permissible to post. And if Wikimedia Commons AI doesn't come to fruition, then that's just how it will be. Perhaps Wikipedia won't exist forever either, as AI becomes increasingly popular for information retrieval. S. Perquin (talk) 08:50, 20 January 2024 (UTC)

Legal objections to AI

There are legal objections to AI. —Written by Groempdebeer on the content page

Nobody knows how long this will last, so we can already start thinking about the future of AI in Wikimedia projects. We could at least structure the data better; see also my essay. S. Perquin (talk) 08:55, 20 January 2024 (UTC)
There are already multiple categories, so structure isn't the problem yet. I don't think it would be nice to put years of hard work into this project only to end up with the result "forbidden by law". Governments must make better policy for AI; otherwise it's uncertain whether this project will survive. Groempdebeer (talk) 10:47, 20 January 2024 (UTC)
Perhaps one day there will be a law stating that only media files in the public domain may be used by AI for generating images, sounds and videos. Personally, I don't expect AI-generated media to become completely illegal, because there will always be people, like me, who don't mind if their works are used by AI. Nevertheless, I see a bright future for Wikimedia Commons AI. I will formulate core values in my essay about what Wikimedia Commons AI should represent! S. Perquin (talk) 11:00, 20 January 2024 (UTC)

Waste of resources and donation money


It's a waste of resources and donation money. —Written by Natuur12 on the content page

Why do you think that? What would be so bad about creating a separate database for AI-generated media? S. Perquin (talk) 19:36, 20 January 2024 (UTC)

Redundancy


It's interesting, but I agree with Dronebogus above regarding redundancy. The proposal is somewhat redundant to the well-working category system of WMC, where all media made using AI should already be located under Commons:AI-generated media. AI media can thus easily be excluded, or explored and maintained separately from other content there. It's also possible to have a tag for images made using AI, or maybe even to require all such images to sit in AI-specific subcategories, so that it's already clear from the category page, the thumbnail, or the file title that it's not a manually made image or video. A separate project would make such content much less findable, even when it could be very useful or the AI aspect is small – for example, when just a small part of an image was made or modified using AI, or when videos were redubbed with an AI-generated voice from reviewed machine-translation transcripts. I think a better approach would be something like a WikiProject for Wikimedia Commons or Wikimedia; I'd be interested in that, or maybe something else like it, but not "a separate Wikimedia project for AI media" in particular. —Written by Prototyperspective on the content page
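
As a rough illustration of the category-based approach described above, the sketch below asks the MediaWiki Action API for a file's categories and looks for an AI marker. The category name "AI-generated images" and the substring heuristic are assumptions made for this example, not an established Commons convention.

    # Minimal sketch (not an established tool): list a Commons file's
    # categories via the MediaWiki Action API and look for an AI marker.
    # "AI-generated images" is an assumed category name for this example.
    import requests

    API = "https://commons.wikimedia.org/w/api.php"

    def file_categories(title):
        """Return the category titles of a file page such as 'File:Example.jpg'."""
        params = {
            "action": "query",
            "titles": title,
            "prop": "categories",
            "cllimit": "max",
            "format": "json",
        }
        data = requests.get(API, params=params, timeout=10).json()
        page = next(iter(data["query"]["pages"].values()))
        return [c["title"] for c in page.get("categories", [])]

    def looks_ai_generated(title):
        # Heuristic only: a robust check would walk the category tree under
        # Commons:AI-generated media instead of matching a name substring.
        return any("AI-generated" in c for c in file_categories(title))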

My philosophy is that AI-generated expressions should be strictly separated from human-created expressions, to make the distinction between humans and AI clearer. Humans are always very competitive and want everything to be better and more perfect. Someday there will be some kind of battle over who is better at making art, and AI will win. AI art will become superior to human art, and man-made art will no longer stand out among all the more "perfect" AI-generated art. At least, that's what I fear, which is why I think human art should be protected. S. Perquin (talk) 14:01, 21 January 2024 (UTC)
They can be separated via categories and the proposed tags (there already is a template, albeit an additional one could be useful, and this info is usually prominently included in the file descriptions).
AI art is human art to some degree as well, for two reasons: it draws upon human art through its training (it's like summarizing combined elements of human art according to the prompt), and it's directed via skillful prompting by a human.
That AI art will be better in most cases, or overall, is speculation and irrelevant. There is no clear separation, for the reasons provided, and even if there were, that's not a good reason for putting these media on an isolated project where they're just buried, hard to organize, and barely findable.
On top of all of this, there can be conventionally manual art where just some relatively small segments of the workflow are done by AI, or where just one part of the image is done by the AI.
Human art is just being transformed into something more efficient and productive, in terms of not just workflows but also which subjects and activities humans spend their time on (e.g. ideation, post-generation editing, sketching, drawing specific parts of an image, drawing specific subjects, etc.). Lastly, I don't think having AI art on a separate project "protects" traditional fully-manual art in any way. Prototyperspective (talk) 17:20, 21 January 2024 (UTC)
You do have a point there. However, my perspective is still that pure human creativity and art, as it existed before the era of AI, should remain separate from content or artistic expressions produced using artificial intelligence or automated processes. This is partly because AI-generated art can indeed be seen as a sort of summary or blend of the original media.
The question indeed remains, whether or not Wikimedia Commons AI ever happens, what the definition of "AI-generated media" is. Is it media that was generated entirely from a prompt? Does it matter how much time went into writing the prompt? And what about media that was edited afterward, or media that was made almost entirely by hand but had an effect overlaid by AI?
What makes AI art different from non-AI art? Does it have to do with it being fake? Photoshop and filters also create fake images. Does it have to do with how little time and energy was put into it? Some artists throw a bucket of paint over an empty canvas. Or does it have to do with a lack of creativity? What counts as creativity depends, once again, on each individual.
And so we could probably go on philosophizing for hours, days, weeks, months, years... I fear there won't ever be a dividing line between AI-generated media and human-made media, though I try to get people to realize how important this line may be. People are becoming less and less natural; pure nature is no more. I personally think this is a great sin, because human beings are naturally beautiful. The next step is when humans become cyborgs and eventually immortal robots. What, then, is a human being anyway? S. Perquin (talk) 18:24, 21 January 2024 (UTC)
But the point is, one can separate human art from AI art on Commons. One could put a banner on everything in the category "AI art" clarifying its origin. So I believe Commons is already the place where one can do what you aim for. Functionally, Commons AI would simply be a copy of regular Commons but with different images, right? Dajasj (talk) 12:15, 22 January 2024 (UTC)
You could indeed separate them within the same platform, but then I am afraid that human-created media will no longer stand out among the growing amount of AI-generated media. You could make AI-generated media less noticeable by creating a certain kind of system, but that would be unfair to AI-generated media, because they can be just as good and functional as human-created media. S. Perquin (talk) 20:43, 22 January 2024 (UTC)

It's not easier to identify


No, it does not make it easier to identify. If an image is used on Wikipedia, I still see no difference. And if a difference can be seen, that does not require a separate platform. —Written by Romaine on the content page

By "easier to identify" I meant that it is easier for users to tell whether large groups of images under a certain category (for example "dog" or "building") are generated by AI or created by humans. I would personally like to know whether something is made by AI or by a human. S. Perquin (talk) 20:43, 22 January 2024 (UTC) Reply

You don't have to build a whole new platform to store data


If the data (that a file is AI-generated) is properly stored, it can easily be retrieved without having to build a whole new platform. If this point is a problem, simpler solutions can be thought of. —Written by Romaine on the content page

My concern is not necessarily with being able to retrieve AI-generated media, but with distinguishing between media made by AI and media made by humans. For example, as I mentioned above, when people are looking for images in a certain category and they are not looking for AI-generated images. Right now, you cannot filter AI-generated media out from human-made media. S. Perquin (talk) 20:43, 22 January 2024 (UTC)
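
For context, Commons' CirrusSearch does support excluding files by category with the incategory: keyword, so a filter of the kind asked for here works exactly to the extent that AI media are consistently categorized. A minimal sketch, again assuming the hypothetical "Category:AI-generated images":

    # Sketch: search Commons for dog images while excluding files placed in an
    # assumed "Category:AI-generated images". Coverage depends entirely on how
    # consistently uploaders categorize AI media.
    import requests

    API = "https://commons.wikimedia.org/w/api.php"

    params = {
        "action": "query",
        "list": "search",
        "srsearch": 'dog -incategory:"AI-generated images"',
        "srnamespace": 6,  # the File: namespace
        "srlimit": 10,
        "format": "json",
    }
    results = requests.get(API, params=params, timeout=10).json()
    for hit in results["query"]["search"]:
        print(hit["title"])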

It doesn't reduce the risk of getting overloaded


Having a separate platform doesn't reduce the risk of getting overloaded (it's still the same amount of work); it only moves the work and creates a lot of extra work on top. Because of the duplication, a lot of extra effort would have to be put into duplicating the full Wikimedia Commons infrastructure! —Written by Romaine on the content page

I didn't mean "overloaded" in the sense that the data centers can no longer handle the amount of data, but that finding human-made media in the future will probably be like looking for a needle in a haystack. S. Perquin (talk) 20:43, 22 January 2024 (UTC)

Improperly chosen domain names


The domain names are not properly chosen: commons.wikimedia.ai.org would mean that the organisation "ai" has a section for Wikimedia, which in turn has a section "commons". Wikimedia is not in a position to create a new organisation regarding AI. —Written by Romaine on the content page

I didn't know that. I'll change it right away! S. Perquin (talk) 20:43, 22 January 2024 (UTC)

Future laws


I am just gonna bring this up for y’all regarding AI-generated media on Wikimedia.

In March of this year, Tennessee passed the ELVIS Act, making it a crime to copy a musician’s voice without permission. Also there is this.

Yes, I know this is all very recent, but I do think Wikimedia should be cautious, because it seems new laws could harm Wikipedia’s use of AI-generated media. CycoMa1 (talk) 19:07, 14 April 2024 (UTC)

That is why I still think it is better to separate "real" and AI-generated media from each other through two different platforms. It just makes things a little more organized. Kind regards, S. Perquin (talk) 20:29, 26 April 2024 (UTC)
