Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There's nothing wrong with the term, the problem is that it gets arbitrarily redefined, such as Sir Penrose is doing. Artificial Intelligence simply emulates faculties of the human mind. Computation is a faculty of the human mind, the computers prove that it doesn't require consciousness to perform. While AI can always become more analogous to the mind, it can't ever be an equivalent.
Source: YouTube, "AI Moral Status", 2025-06-29T17:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugysi2EwE_rcQbi6IdB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxMTKzcOJGpKeaJtq54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxVk_giEJ-TnMDPAsN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy6Pc3UXGX7u3cZUTZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzuqMwbO5eI94imHxl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyOMdaciM2cpg94mJd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwGl8_-oE5eLXMGtpV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyYaPP4-s3ev46P1ot4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwKmpUpAsL4a-UWzrd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzfWq_KgyCxAqs_l3B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
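A raw response like the one above can be parsed and sanity-checked before the per-comment values are stored. The sketch below is a minimal, hypothetical validator: the allowed category values are inferred from the responses shown on this page, not from a documented schema, so treat them as assumptions.

```python
import json
from collections import Counter

# Assumed schema: category sets inferred from the raw responses shown above,
# not from an official codebook.
ALLOWED = {
    "responsibility": {"none", "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "outrage", "indifference", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose dimension
    values fall inside the assumed category sets."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Usage with a two-record example (ids are placeholders):
raw = (
    '[{"id":"ytc_a","responsibility":"none","reasoning":"mixed",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"ytc_b","responsibility":"robot","reasoning":"mixed",'
    '"policy":"none","emotion":"approval"}]'
)
codings = parse_codings(raw)           # second record dropped: "robot" is out of schema
print(Counter(r["emotion"] for r in codings))
```

The out-of-schema record is silently dropped here; a real pipeline might instead log it for manual recoding.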