Generative AI entails a credit-blame asymmetry
Journal article
Porsdam Mann, S., Earp, B.D., Nyholm, S., Danaher, J., Møller, N., Bowman-Smart, H., Hatherley, J., Koplin, J., Plozza, M., Rodger, D., Treit, P.V., Renard, G., McMillan, J. and Savulescu, J. (2023). Generative AI entails a credit-blame asymmetry. Nature Machine Intelligence.
Authors | Porsdam Mann, S., Earp, B.D., Nyholm, S., Danaher, J., Møller, N., Bowman-Smart, H., Hatherley, J., Koplin, J., Plozza, M., Rodger, D., Treit, P.V., Renard, G., McMillan, J. and Savulescu, J. |
---|---|
Abstract | Generative AI programs can produce high-quality written and visual content that may be used for good or ill. We argue that a credit-blame asymmetry arises for assigning responsibility for these outputs and discuss urgent ethical and policy implications focused on large-scale language models. |
Keywords | ChatGPT; large-scale language models; authorship; AI; responsibility; transparency; rights; interests; achievement |
Year | 2023 |
Journal | Nature Machine Intelligence |
Publisher | Nature Research |
ISSN | 2522-5839 |
Web address (URL) | https://doi.org/10.1038/s42256-023-00653-1 |
Published online | 04 May 2023 |
Accepted | 06 Apr 2023 |
Deposited | 24 Apr 2023 |
Permalink | https://openresearch.lsbu.ac.uk/item/93q8x |
Restricted files: Accepted author manuscript (under embargo indefinitely)