That is true. Take, for example, movies. Cinema studios with big budgets are usually very risk-averse, simply because the cost of failure is so high. So they have to make sure they can turn a profit. But how can you make sure any given thing will be profitable? Well, that is a prediction, and to predict anything, you need data to base that prediction on. Predictions are based on past events. And so they make sequel after sequel; they make things that have been proven to work. New things, by virtue of being new, don't have tons of data (past examples) to base good predictions on, so they avoid new things. This results in the homogenization of art: homogenization induced by Capital, as Capital only sees value in profit, and thus, for Capital, only predictably profitable art is given the resources to flourish.
Machine-learning-generated images are the epitome of this. All output is based on previous input: the model is explicitly trained not to deviate too much from its training data (that is what the loss function penalizes), and so it struggles with anything it has little data on, like original ideas.
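To make that concrete, here is a minimal toy sketch in PyTorch (an autoencoder-style model; the names and numbers are purely illustrative, and real image generators are far more elaborate) showing that the training objective literally measures distance from past examples:

```python
# Hypothetical toy example: training minimizes the distance between the
# model's output and examples it has already seen. Nothing else.
import torch
import torch.nn as nn

# Pretend these are flattened training images the model gets to "learn" from.
training_data = torch.rand(64, 784)

# A tiny autoencoder-style network: compress, then reconstruct.
model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 784))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()  # the "loss function": penalizes deviation from the data

for step in range(500):
    output = model(training_data)
    loss = loss_fn(output, training_data)  # low loss == close to past examples
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Nothing in this objective rewards producing something *unlike* the data;
# "originality" is, by construction, just higher loss.
```

The point is in the objective itself: the only thing the machine is ever told to do is reproduce what it has seen, so novelty is mathematically indistinguishable from error.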
I think that what we’re likely to see are parallel worlds of art. The first and biggest being the homogeneous, public and commercial one which we’re seeing now, but with more of it produced by machines; the other a more intimate, private and personal one that we discover by tuning back into our real lives and recognising art that has been made by others who are doing the same.
That's kind of already a thing, just without the AI. As in the example above, Capital wants predictable profit, so only the most widely appealing, proven-to-be-profitable art gets significant budgets. Creative and unique ideas are just too risky, and are therefore relegated to the indie space, where, should any ever become successful, Capital is willing to help... under the condition that it gets most of the money (think, for example, of how Spotify takes most of the revenue made by the songs it distributes).
By "Capital" I mean those who own things necessary to produce value.
Not if you are part of the AI-bros club. There is a reason marketing agencies insist on using the term Artificial Intelligence.
Unfortunately, this is not common knowledge, as experts and marketing agencies explain Machine Learning to the masses by saying that "it looks at the data and learns from it, like a human would", which, combined with the name Artificial Intelligence and other terms like Neural Networks and Machine Learning, can make someone think these things are actually intelligent.
Furthermore, we humans can see humanity where there is none. We can see faces where there are no faces; we can empathize with things that aren't even alive. So, when this thing shows up that is capable of producing somewhat coherent text, people are quick to anthropomorphize the machine. On top of this, we are also very language-focused: if someone is really good with the language they speak, they are usually seen as more intelligent.
And finally, never underestimate tech illiteracy.