Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- rdc_nk493lx: That’s why they’re working on combining AI (as the “brain”) with robots (as the …
- ytc_UgyqhPup6…: Imma be honest it’s some stuff AI just can’t do and will cost way to Much to mai…
- ytc_Ugzh-uJtn…: There is no such thing as bad children, only bad parents. Look who is creating t…
- ytc_UgzX9kson…: Stopping development just short of where AI deserves rights runs into the proble…
- ytc_Ugw36ht6A…: Think about possessions and realise that AI is without a soul and how easy it wo…
- ytr_UgxeLVw_P…: It is *NO* longer human is the correct way to phrase that and the highest power …
- ytc_UgyTd9Rcm…: Driver less cars. My phonne goes out of signal. My telly gets the wheel of doom…
- ytc_UgxbxTvPg…: The AI hate is so funny I can guarantee that there are hundreds of things made b…
Comment
It seems to me that very few people realize that HAL *was* following orders, just misinterpreted orders from someone higher ranking than anyone on the ship. Even in the movie 2010, where the relevant information was revealed, the person who figured out the issue just said that HAL "went crazy" instead of recognizing that it actually figured out a horrible way how to follow the order. This thing HAL did of figuring out how to follow the literal command without following the intent of the people giving it is relevant to actual AI I think, though the modern version is machine learning systems like neural networks figuring out how to score highest in the training without learning what they were supposed to.
Platform: youtube
Posted: 2025-10-05T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
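The four dimensions above take values from a small closed set. As a minimal sketch, the following codebook is inferred only from the values visible on this page (the actual coding scheme may define additional categories), with a helper that flags out-of-codebook values in a coded comment:

```python
# Hypothetical codebook inferred from the dimension values visible on this
# page; the project's real codebook may contain more categories.
CODEBOOK = {
    "responsibility": {"none", "developer", "company", "user", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "liability", "ban", "industry_self"},
    "emotion": {"indifference", "outrage", "fear", "resignation"},
}

def validate(coding: dict) -> list[str]:
    """Return a list of problems with one coded comment (empty if valid)."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        value = coding.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems
```

A check like this is useful as a guard between the raw model output and the database, since an LLM coder can occasionally emit a label outside the schema.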
Raw LLM Response
```json
[
  {"id":"ytc_UgyHQdLBGQnbG9XrN254AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw2-U8V_1q-TWjUPq94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzc_kbYPP3J64STzy94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzEVCLRlTMLKUUwh214AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxYcdap3hipnwL8NOB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzZya8GCYlGmIy1E-t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxt30yf0E4kLC9Kltp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgzPLHzLfnfp8BFo8Vx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzLoSrixqg5Oi_3bb54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyNcoeJd3YSuEbyqYl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"resignation"}
]
```
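Because the raw response is a JSON array of objects keyed by comment ID, a lookup like the "Look up by comment ID" box above reduces to parsing the text and indexing by `id`. A minimal sketch, assuming the response is well-formed JSON (using two of the entries shown above):

```python
import json

# Two entries copied from the raw LLM response shown above.
raw_response = '''[
 {"id":"ytc_UgyHQdLBGQnbG9XrN254AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugxt30yf0E4kLC9Kltp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}
]'''

# Index the codings by comment ID for O(1) lookup.
by_id = {entry["id"]: entry for entry in json.loads(raw_response)}

coding = by_id["ytc_Ugxt30yf0E4kLC9Kltp4AaABAg"]
print(coding["responsibility"])  # -> developer
```

In practice the parse step is also where batch failures surface: if the model wraps the array in prose or truncates it, `json.loads` raises `json.JSONDecodeError`, which is a natural point to retry or re-prompt.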