Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytc_UgzGAOsQA… — "Rich Sutton wrote few years ago interesting post titled 'The Bitter Lesson'. It …"
- ytc_UgxoLP_mD… — "I like how script is so chatgpt-based it insane lol. Not X, but Y - etc…"
- ytc_UgxzXdvNo… — "If AI Robots are as reliable as EV cars we have nothing to fear except fires.…"
- ytc_UgzpKeJ69… — "Pushing ChatGPT the way ppl are should be illegal like false advertising. Also …"
- ytc_UgzIhSiSv… — "Scattered around the world, Africa, India, the middle east and more, there are a…"
- ytc_UgwMn6qqq… — "I'm wondering if these AI creator, are pretty much asking these children to comm…"
- ytc_UgxXvxmDc… — "The real goal is self driving semis, what could possibly go wrong?? Anyone care …"
- ytc_Ugxo4W39R… — "just like in movies these people must be somehow think creating an ai as a weapo…"
Comment

> 1:05:15 - it is incorrect to compare AI to just a nuclear bomb. The correct comparison here is AI to nuclear fission/fusion. This technology can be used to power the entire world or destroy the entire world, unfortunately humanity's first thought was to use it destroy people that don't believe the same things as them. And there is no reason to think, it's going to be any different with AI.

youtube · AI Moral Status · 2026-03-31T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
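The coded dimensions in the table above can be mirrored as a small record type. This is only a sketch: the field names are taken from the table headers and the example values from this page's data, not from any actual schema the tool defines.

```python
from dataclasses import dataclass


@dataclass
class CommentCoding:
    """One coded comment, mirroring the four dimensions shown in the table.

    Field names are assumed from the table headers; the allowed values
    listed in the comments are those observed in the raw response below.
    """
    responsibility: str  # e.g. "distributed", "ai_itself", "none"
    reasoning: str       # e.g. "consequentialist", "deontological", "mixed", "unclear"
    policy: str          # e.g. "regulate", "none", "unclear"
    emotion: str         # e.g. "fear", "approval", "mixed", "indifference"
    coded_at: str        # ISO 8601 timestamp


# The coding result from the table above, expressed as a record.
coding = CommentCoding(
    responsibility="distributed",
    reasoning="consequentialist",
    policy="regulate",
    emotion="fear",
    coded_at="2026-04-27T06:24:53.388235",
)
print(coding.policy)  # → regulate
```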
Raw LLM Response
```json
[
  {"id":"ytc_UgyDXgjUydV4Ksr4rJh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzQQiY191IV6KqWqI54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzidg-NTBS0mBo2gK14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw4ziQU8EPVHXSOtLV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyg6jx7tm3vOgH5x3t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwai1gzjCMXJ4T9PI94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxlXAWuZn5VaeQowTx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwk16BQd4JNTTYFGCl4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxanvgn8ZnejWL0tUt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwzaOmZNqa_H_a-2aJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
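Looking up the coding for one comment in a raw batch response like the one above amounts to parsing the JSON array and indexing it by `id`. A minimal sketch, assuming only the response shape shown here (two of the entries above are reproduced for brevity):

```python
import json

# Raw LLM response in the shape shown above, truncated to two entries.
raw_response = """
[
  {"id":"ytc_UgyDXgjUydV4Ksr4rJh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzidg-NTBS0mBo2gK14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

# Parse the batch and build an id -> record index for O(1) lookup.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

# Look up the coding for a single comment by its ID.
coding = by_id["ytc_Ugzidg-NTBS0mBo2gK14AaABAg"]
print(coding["policy"])  # → regulate
```

In a real pipeline the model output may carry extra text around the JSON array, so a production version would first extract the bracketed span and validate each record's fields before indexing.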