Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
People forget we trained a consensus group of pigeons to identify cancer with 99% accuracy. We don't need the AI to be smart to do jobs we assume would require smarts. The real danger is when we start Rube Goldberg machine chaining idiot AIs.
YouTube · AI Moral Status · 2023-07-06T04:2…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzI74UtgkSGovn4Zt94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxoXRAssENL1SBrfKJ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyX5mq2JRqRdk8aCmp4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwRViyy9MZYU9RN8a94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwcq2faTTGVKV2BUNt4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugz22MCYCYjQ9-0XVnZ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwZ6ENtNMGFirzfyvt4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxHMtnEcd7kLc1BvYJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwotLq43wgKwpHEYQF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwZsVkKgsygEqrtLw14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
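The raw response is a JSON array with one object per coded comment, keyed by comment id, with the four coding dimensions as string fields. A minimal Python sketch of how such a batch can be parsed and sanity-checked follows; the two records are copied verbatim from the response above, and the allowed-value sets are only the values observed in this batch, not necessarily the full codebook.

```python
import json

# Two records copied verbatim from the raw response above; the full batch has ten.
raw = (
    '[{"id":"ytc_UgxoXRAssENL1SBrfKJ4AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"},'
    '{"id":"ytc_UgzI74UtgkSGovn4Zt94AaABAg","responsibility":"none",'
    '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]'
)

# Dimension values observed in this batch (assumption: the real codebook may have more).
OBSERVED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear"},
    "emotion": {"indifference", "fear", "approval", "outrage"},
}

def validate(record):
    """Return (dimension, value) pairs that fall outside the observed sets."""
    return [(dim, record[dim]) for dim in OBSERVED if record[dim] not in OBSERVED[dim]]

records = json.loads(raw)
by_id = {r["id"]: r for r in records}

# Look up the coding for the comment shown above.
coding = by_id["ytc_UgxoXRAssENL1SBrfKJ4AaABAg"]
print(coding["emotion"])   # fear
print(validate(coding))    # []
```

Indexing by id makes it easy to join a coding back to the comment card it belongs to, and the validation step catches any value the model invented outside the expected label set.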