Lol. This comment sent me down a rabbit hole. I still don't know if it's logically correct from a non-physicalist POV, but I did come to the conclusion that I lean toward eliminative materialism and illusionism. Now I don't have to think about consciousness anymore, because it's just a trick our brains play on us (consciousness always seemed poorly defined to me anyway).
I guess when AI appears sufficiently human- or animal-like in its cognitive abilities and emotions, I'll start worrying about its suffering.
The company would need violence. There's no reason for workers to accept less money than their goods sell for, and no reason for the company to pay them more than the goods sell for. Without violence, the workers could simply produce and sell the goods themselves and cut the company out.