Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
No they don’t. The driver on the other hand does. They should make everyone go t…
ytr_UgxyedjuV…
"AI makes art accessible for people who cant afford supplies" any AI model subsc…
ytc_UgzxlajRJ…
If these billionaires replaced millions of workers with robots and AI to boost p…
ytc_Ugwf9MrkA…
@The-bestest-uberfish ???? Did you read a single thing I wrote? The point is tha…
ytr_UgzZSTPrk…
I think this is really cool and i wouldnt mind having a robot friend cuz that wo…
ytc_Ugy-7NBg6…
ai art and thats it and still people think that saying a prompt is art…
ytc_UgwFHpXhT…
I would probably use it to bully the ai, I know at least one person probably has…
ytc_Ugxowvi4G…
For any person, complaining that AI takes someone's jobs. That it takes 1 person…
ytc_Ugz2_y2at…
Comment
First, Alex is brilliant. The scenarios he presented stemmed from his past experiences with AI and knowing its limitations. He knew how to confuse AI by slowly presenting new situations and information and situations that were slightly different in context, a process that AI's programming could not properly discern. This proves that a smart human being has characteristics AI can never have. Alex inadvertantly also pointed out the universal need for a Great Educator whose teachings AI could reference and that Alex himself could also use to better craft his questions. Overall, this is an excellent video. I subscribed.
youtube
2025-03-17T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzkLRRgnS-hz69U5-h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxB_rep2-pYyO-mrjl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwGiui3j5r9q2ghJvh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwm3rShyEM5d5_YBXR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzsHPX7GFkQiX8G_VZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxAzu0FNmPtVAg-7iV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzw7sR7nZxFf0qyuG94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwl17iy1D_ntcKyUEB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy-lXzd5VODQaWHL154AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx0c1q96r7N_f1F2BR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
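The lookup-by-comment-ID step above can be sketched in a few lines: the raw LLM response is a JSON array of coding records, each keyed by a comment ID, so indexing it into a dictionary gives direct access to any comment's codes. This is a minimal sketch assuming the response parses as the JSON array shown above; the function name is illustrative, and only two of the records are reproduced for brevity.

```python
import json

# Sample of the raw LLM response shown above: a JSON array of coding
# records, one per comment ID (values copied from the source).
raw_response = '''
[
  {"id": "ytc_UgzkLRRgnS-hz69U5-h4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwGiui3j5r9q2ghJvh4AaABAg", "responsibility": "none",
   "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
'''

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw response and index its coding records by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgwGiui3j5r9q2ghJvh4AaABAg"]["emotion"])  # approval
```

Each record carries the same four dimensions summarized in the coding-result table (responsibility, reasoning, policy, emotion), so the same lookup serves both the sample list and the detailed view.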