Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "This is what really WOULD HAVE HAPPENED IF REAL JET PLANES WOULD HAVE CRASHED IN…" (ytc_UgxDyQuLg…)
- "What is essentially a robot can remember as much as you want it to remember. A h…" (ytc_UgyAafK1E…)
- "I'm not sure that super intelligent AI turning on us or eliminating us somehow i…" (ytc_UgxUX0Yfg…)
- "@Mouseline I can imagine a lift up of the workload on them through AI, but I do …" (ytr_UgyJWiQCc…)
- "It will go faster than that. I say that at somewhere around 2040 we are fighting…" (ytc_UgxlRsHNN…)
- "yeah but if you poison the well enough that growth goes into the direction that …" (ytr_UgzjfXXKL…)
- "There is no such an ai safety there is a performance to make and rhat is why ai …" (ytc_Ugx33Bjbd…)
- "That’s horrible!!!! The only people that would give a 💩 about this is a very wea…" (ytc_Ugxx7_50m…)
Comment

> 10:30 Asking an AI if they would on us. Ai Telling us if a machine turns on us it happens because we told it to so and that it is very likely to happen. It saying humans are so stupid that humans will tell AI to turn on humans without realising it.

Source: youtube, 2025-11-06T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[{"id":"ytc_Ugz_UwipnXUoOkKGbTV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxmMWzGB1Wq9FEADEd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgzDqzYLCyXkpwWNVYl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugyq3Q67ibrEpBAKGx14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgykXpjuSnHHMake91Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgzEhBlzIUWXvRqAXuZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgwSTuPST7yTbqD7mwR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgyCWL3LZXz17d9Akr54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ugxy7U696rFraK85Q994AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ugw_-wl11uSmUmSMK5d4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"approval"}]
```
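Looking up a coded comment by its ID, as described above, amounts to parsing the raw batch response and keying each record on its `id` field. A minimal sketch, assuming the JSON array format shown in the raw response (the comment IDs below are shortened, illustrative placeholders, not real ones):

```python
import json

# A two-record batch in the same shape as the raw LLM response above.
# IDs here are illustrative placeholders, not actual YouTube comment IDs.
raw_response = """[
  {"id": "ytc_example1", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_example2", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

def index_by_comment_id(payload: str) -> dict[str, dict]:
    """Parse a raw batch response and key each coded record by its comment ID."""
    records = json.loads(payload)
    return {record["id"]: record for record in records}

coded = index_by_comment_id(raw_response)
print(coded["ytc_example2"]["policy"])  # → regulate
```

With the batch indexed this way, rendering a "Coding Result" panel for any comment is a single dictionary lookup rather than a scan of the whole response.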