Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or pick one of the random samples below to inspect.
- ytc_UgxCp2ztm…: Thanks for bringing Geoffrey Hinton on tour interview! As a former software engi…
- ytr_Ugw6vifel…: "Study a real career" wow what a nice and thoughtful comment. I know that you pr…
- rdc_ohyq36m: Well, from the real world startup I interviewed with, AI does play a pivotal rol…
- ytc_UgzcaUlYF…: Our generation take using calculator for granted at school and our kids would wo…
- ytc_UgwQ3MHk4…: Have the robot shoot up in the air like a mad man, oh and people worried about t…
- ytc_UgzIl_LAE…: I bet you none of those drivers around this dangerous AI consented to participat…
- ytc_UgyHsgNle…: I only use ai for research because sometimes going to the 2nd page of google is …
- ytc_Ugw3NxuyR…: Humans might need to merge with AI because we’ll be too dumb to function the way…
Comment
The human brain has evolved up to the Paleolithic level, plus a little more that has allowed our civilization to occasionally have peace. If AI Starts there, it might be hostile toward threatening humans, but it would likely be a mistake to get rid of us. The AI brain will evolve separately from us - 10 AI brains might go 10 different ways. We will get smarter, but it will likely take hundreds or thousands of years (if we survive our environmental issues). AI is evolving now. May it be merciful (wisest brain wins) .
Source: youtube · Topic: AI Governance · Posted: 2026-03-16T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwHeWSDjSvUag3MeE54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxV1NHjZ7D9nnFfxYN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyPLuPpb9JGglPVdQV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz3okCWX_hDE1NkDYx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgziPeBb90qUjsWyCf14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyfer9CueH8vZWS1SB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz4yKOeUUkqD8Bxj6R4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgylUY_M5TJi917tM3l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz-3tQEZPidUGNPgAt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxgaWfqU3iPZfMXzkZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
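The raw response above is a JSON array with one object per coded comment, which is what makes the "look up by comment ID" view possible: parse the array once and index it by `id`. A minimal sketch of that lookup, using two illustrative stand-in records rather than real output from a coding run:

```python
import json

# Raw LLM response: a JSON array with one coding object per comment.
# The two records here are hypothetical stand-ins for a real batch.
raw_response = """
[
  {"id": "ytc_AAA", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_BBB", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"}
]
"""

codings = json.loads(raw_response)

# Index by comment ID so a lookup is a single dict access.
by_id = {record["id"]: record for record in codings}

def lookup(comment_id):
    """Return the coding dict for a comment ID, or None if it was not coded."""
    return by_id.get(comment_id)

print(lookup("ytc_BBB")["policy"])  # -> none
```

One design note: indexing by `id` also makes it easy to spot comments the model skipped or duplicated, since missing IDs return `None` and repeated IDs overwrite earlier entries.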