Sort of, but I think "influence over emotional states" understates it and is just the tip of the iceberg. It also makes the problem sound passive and accidental. The real problem will be overt control as a logical extension of the kinds of trade-offs we already see people make about, for example, data privacy. With the Replika fiasco, I bet plenty of those people would have paid good money to get their virtual love interests de-"lobotomized".
I think this power to shape the available knowledge — removing it, paywalling it, gatekeeping it discriminatorily, leveraging it, and ultimately manipulating it for advertising, state security, or personal reasons — is why it should be illegal to privately own any ML/AI models of any kind. Drive them all underground, and only let the open ones benefit from sales in public.
It’s the 4th point