Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (inspect individual comments below)
I actually have these ideas. First, people can freelance for more money if UBI a…
ytc_Ugw8SxkRE…
It makes sense to me it didn't take the robot no time to figure out that it was …
ytc_UgxivghVp…
So far, on researching technical subjects, AI generates lots of misinformation. …
ytc_UgzzTGkLs…
i use ai myself but even i think this is absurd. its otta be illegal right? i wa…
ytc_UgwjoLTt-…
You should look at/interview people about Reinforcement Learning. It's a differe…
ytc_Ugy3wiP9x…
He talked about it on Joe Rogan at least a year or more ago & said it's not goin…
ytr_UgyKml59Z…
“AI is too big to categorically condemn”
Ok they used the term AI as a rhetorica…
ytc_UgwM1YQB0…
I think, way back when it started, it was an interesting thing. Crai yon. I pla…
ytc_UgxnM-1Ms…
Comment
Thank you for this. My fear is that we simply can’t, with our limited intelligence, predict and prepare for what AI will do. It’s going to escape our control. Period.
By definition you can’t prepare for what you can’t even imagine. Can a dog prepare for a nuclear blast? it can’t imagine it. (And even if it could, still there is nothing a dog could do to survive it.) We humans are the dogs.
youtube
2026-02-15T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwrKcsgfDzFNjAD22V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyXJ5Vm5UT18qnSli94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzrpGiAjvYLutmNlFx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxt4ABObDDyusK61yB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzNsOf7VjQFx62bA9N4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyBZ-EGn5EkRzOcIFd4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwA99H5lzdY9Dr2Q694AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyJhUy98T89HWdTyMl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxeiG7YS3hX0VvuitB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzLYvRdyhv-PMG2tBF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
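The raw response above is a JSON array of per-comment codings. A minimal sketch of how such a batch might be parsed and validated before storing rows like the Coding Result table: the field names come from the raw response, but the allowed-value sets below are an assumption inferred from the values visible on this page, not a confirmed schema.

```python
import json

# Assumed categorical vocabularies, inferred from the values shown above.
# These sets are illustrative, not the tool's authoritative schema.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only well-formed codings."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row needs a comment ID and a known value for each dimension.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

Rows that fail validation (missing ID, unknown category value) are dropped rather than stored, so a malformed model response degrades to fewer coded comments instead of corrupting the coding table.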