Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below:
- `ytr_UgzCRNufc…` — "We joke about it, but I’ve found AI to be more useful at those type of things, “…"
- `ytc_UgyZCIe9M…` — "Nah i think the Google people are messing with something very alien like . And t…"
- `ytr_UgxUDRAJp…` — "AI is not cheap, they have spent $ billions on it, they want their money back.…"
- `ytc_UgxBS36At…` — "This will never happen. Because to be self aware you need consciousness and cons…"
- `ytc_UgwsNW3Ns…` — "Ai = Google on top of Google. Stop using word salad machines. Dessert before m…"
- `ytr_UgycytbZi…` — "@IKnowThisIsStupidButso correct me if I am wrong, basically in your analogy you…"
- `ytr_Ugi0tzyav…` — "HAL 9000 is literally not the villain, how do you people not understand this, fo…"
- `ytc_UgwmZ-Zbm…` — "It’s gonna be the end for humans. Most people are not taking this seriously, but…"
Comment
Technology has always disrupted education — from calculators to computers. The real question isn’t whether AI is tested enough, but whether educators are trained enough to guide its use. Instead of fear, let’s focus on curriculum redesign, digital ethics, and teacher training. Students deserve to learn how to use AI wisely, not be left behind because adults are afraid to adapt. Why blame Trump?
youtube · 2025-12-02T23:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxuqwrgh7WojQteL5x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwEb2So3obuDkQKS_F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzVxWpyhZfiAiboWDh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwyUpT6YAT8-fGhX6F4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzvFFbgC755BlAyV514AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx8Ic_zMOJPF4I9nM94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzDOUMZRB0x1-zGL7B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugz4q_KMKH1jISDH_z94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxk8k9a00RpAySPXVB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzekj6jQtbbSTwjsFB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
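The raw response is a JSON array with one coding object per comment, and the displayed Coding Result is simply the array entry whose `id` matches the selected comment (here `ytc_UgzDOUMZRB0x1-zGL7B4AaABAg`). A minimal sketch of how such a response could be parsed and indexed for the by-ID lookup shown above; the function name is illustrative, not the tool's actual API, and only two entries from the response are reproduced:

```python
import json

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding objects)
    and index the coding dicts by comment ID."""
    codings = json.loads(raw_response)
    return {item["id"]: item for item in codings}

# Two entries copied from the raw response above.
raw = """[
  {"id":"ytc_UgzDOUMZRB0x1-zGL7B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugx8Ic_zMOJPF4I9nM94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}
]"""

by_id = index_codings(raw)
print(by_id["ytc_UgzDOUMZRB0x1-zGL7B4AaABAg"]["policy"])  # → industry_self
```

The values recovered for `ytc_UgzDOUMZRB0x1-zGL7B4AaABAg` match the Coding Result table (responsibility `none`, reasoning `consequentialist`, policy `industry_self`, emotion `approval`).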