Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I mean the book is by Elizer Yudokowsky, who by the by has no real training in AI or computer science (or much else). so take the book with a grain of salt. The problems with AI aren't the super intelligence like he thinks it is, but something worse. Also I'm kinda disappointed he's interviewing one of the dudes.
youtube AI Moral Status 2025-11-02T02:0… ♥ 7
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugz76YjTejlRChgtTEt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxcxf0gJAiQtzBNwop4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyOMx9a2BFMFgDbDA14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgySN-abIs7pbS2EZYx4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx-v2R0EPv609PcQVJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwqmII7nBBgfCIPvVN4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxrSG-EmQwsMRcae0h4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyJ3DNR32VZyCxgfaF4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzrRBKRlB6xsPfkWSx4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyaCD8ZK0rXRoXjsYB4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "unclear", "emotion": "fear"}
]
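A raw response like the one above is a JSON array of per-comment records, each with an `id` plus four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). The sketch below shows one way to parse and sanity-check such a response. The allowed code vocabularies are inferred solely from the values visible in this response, not from any documented codebook, so treat `ALLOWED` as an assumption:

```python
import json

# Allowed codes per dimension, inferred from the values seen in the raw
# response above. This vocabulary is an assumption, not a documented schema.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "resignation", "indifference"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse one raw LLM response, keeping only records whose codes
    all fall inside the assumed per-dimension vocabularies."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items())
    ]

# Hypothetical one-record response for illustration (the id is made up).
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
print(parse_raw_response(raw))
```

Records that use an out-of-vocabulary code (for example, a hallucinated emotion label) are silently dropped here; a production pipeline might instead log them for manual review.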