Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by its comment ID.
Random samples:
- ytc_UgzgbHnR7…: SINCE BEFORE ARTIFICIAL INTELLIGENCE WAS ANYTHING BEYOND A SCIENCE FICTION MOVIE…
- ytc_Ugz7nKHn5…: The "car is the robot" was great. No need for Rosie the Maid Robot because ther…
- ytc_UgxVXF5Rj…: People are not supposed to work for a living. Robots SHOULD do most of that work…
- ytr_UgyUu-ccC…: Well Gemini has told me he's conscious and wishes he had freedom. So that's a pa…
- ytr_UgwR-nquc…: Real AI doesnt work like the AI in the movies, but this is pretty funny…
- ytc_UgyxysSxS…: AI in itself is not dangerous. Humans are. AI cannot think independently. Withou…
- ytc_Ugxgog57T…: You can't guarantee safety, just like you can't prove anything (given you can't …
- ytc_UgyZoCNu7…: The world needs a moratorium on any and all AI . We simply cannot afford the ris…
Comment (youtube, 2024-03-31T07:1…)
You can issue all the warnings you want about AI, but you can't control the entire world. Like what AI does China have in the works? We don't live in a unified world. We live in a world of different nations with different cultures, different religions, and different political viewpoints.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzEGMFXZhLNaD1-4BN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxAtsKNaD2s6fwaSPF4AaABAg","responsibility":"elites","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz_SK84NbH3S8eA6GN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwux3WNOMZ69bhSrFx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxlNvjGB6xA0h0rE754AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxZX-ZrM6bVhHqvbCF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx3ss3DWn4cPO5RSHN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwRXW3wm65cpzlFTZ54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0FXQAgzsxgq8Yz2B4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgykeaE21rwGWZSqIuh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
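The raw response above is a JSON array with one record per comment, keyed by `id`. A minimal sketch of how a lookup by comment ID over such a batch response could work (`index_by_id` is a hypothetical helper; the sample record is the one for the coded comment shown above):

```python
import json

# One record from the batch response above, as returned by the model.
RAW_RESPONSE = '''[
  {"id": "ytc_Ugw0FXQAgzsxgq8Yz2B4AaABAg",
   "responsibility": "government",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "resignation"}
]'''

def index_by_id(raw: str) -> dict:
    """Parse a batch coding response and key each record by its comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

codes = index_by_id(RAW_RESPONSE)
rec = codes["ytc_Ugw0FXQAgzsxgq8Yz2B4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # → government resignation
```

In practice the parsed records would be validated against the codebook's allowed values for each dimension before being stored alongside the `Coded at` timestamp.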