Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I'm 27 minutes in and I'm terrified. He's talking about Terminator territory. Why would ASI need to keep humans around once it can "think" far better & faster than all humans and it has robots that can not only do everything a human can do, and do it better, but they can design and build new, even better robots? Why would it even keep humans as pets or zoo animals when it could just make exact or better robot copies? But by then it would have advanced so much that it would be like keeping ants as pets, or even amœba. We have created our successors and doomed ourselves to extinction.
| Field | Value |
|---|---|
| Platform | youtube |
| Category | AI Governance |
| Posted | 2026-02-23T06:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugy-YihoVKxHXci6i_x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgygZ32J5Ed8AOtfC_x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzav688TRk85uRiWYh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwL60bcHPnh2CfT0Tp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxDDnERQ2d63ZN6a_94AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy-48385_EmdietAKt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzjhzR6T_46trQBlr54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzAu_emzeYcQp_vCGp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwcNRq4NORh0N6j5CV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxspZ6c759vhuasMoR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}]
```
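For downstream analysis, a batch response like the one above can be parsed and turned into a comment-ID lookup. Below is a minimal sketch in Python; the helper name `index_by_id` and the expected key set are illustrative assumptions, not part of the tooling shown here, and a truncated or malformed array will surface as a `json.JSONDecodeError` before indexing.

```python
import json

# Two records copied verbatim from the raw response above.
raw = '''[{"id":"ytc_Ugy-YihoVKxHXci6i_x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzjhzR6T_46trQBlr54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'''

# The four coding dimensions plus the comment ID (assumed schema).
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw_response: str) -> dict:
    """Parse a raw batch response and build a comment-ID -> codes lookup.

    Raises json.JSONDecodeError if the model emitted a malformed array,
    and ValueError if a record is missing one of the coding dimensions.
    """
    records = json.loads(raw_response)
    lookup = {}
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} missing keys: {missing}")
        lookup[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return lookup

codes = index_by_id(raw)
print(codes["ytc_UgzjhzR6T_46trQBlr54AaABAg"]["policy"])  # regulate
```

Validating the key set up front makes a partially-coded record fail loudly instead of silently producing gaps in the lookup.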