Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples — click to inspect

- And this in 2003, when technology was not as advanced as it is today. Do you rea… (ytc_Ugy5ETlpp…)
- Idk I think the ai is trying to make us think that there “evil” (in our prospect… (ytc_UgxAYXlw8…)
- It's kind of ironic that you are against todays technology considering your draw… (ytc_UgyKBzEXD…)
- So much for facial recognition. Cops are not too smart to rely on face recogni… (ytc_Ugy7GafxT…)
- Easily fixable, waymo will probably just exclude other waymo cars from that part… (ytc_UgxWfCDrH…)
- First, you’re an old lady, what’s with the facial piercings? Does not improve yo… (ytc_UgzPmhEf8…)
- 1st question is WHY DID YOU GET INTO A DRIVERLESS CAR IN THE 1ST PLACE .....STUP… (ytc_UgxkQEZTh…)
- Many opinion pieces argue that the hype surrounding AI, particularly its promise… (ytc_UgxRSPVGc…)
Comment
It's absolutely not true that there is anything smarter than humans on planet earth. In fact we are pretty stupid compared to the rest of the universe. These beings are so "smart" that they mostly let us advance in our own evolution. AI could be stopped in just less than one second whenever they wish to do this and our own evolution would be in danger. This godfather of AI unfortunately doesn't have this knowledge. We are not alone. We never have been. Even worse, almost all our inventions aren't really invented by ourselves. Einstein for instance got "inspired" by dreaming stuff. Same with Edison and many others.
youtube
AI Governance
2025-06-17T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxsWAzkhBA3e2g4Jxp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxDAL0h-qiaVVqFIx94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz_hXaGm8UuM_4ol6h4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzq4rxqQTiIMaItYZ54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw64MI23YUjIeFLykR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzThZKNT9JN7PvplCh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxBenCswI0LsMYS1ll4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxa5ajqU_IJFQfH6014AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwKSqUL9DeID2ha3jV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw68TcF4AXxfWkaRb54AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"mixed"}
]
```
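The raw response above is a JSON array of per-comment codes, one object per comment with four coding dimensions. A minimal sketch of a lookup-by-ID parser follows; the allowed-value sets are inferred from the responses shown on this page, not a confirmed schema, and the helper name `parse_coding_response` is hypothetical:

```python
import json

# Coding dimensions and values as seen in the responses above;
# treating these sets as the full vocabularies is an assumption.
SCHEMA = {
    "responsibility": {"company", "developer", "government", "user",
                       "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of per-comment
    codes) into a dict keyed by comment ID, validating each dimension."""
    coded = {}
    for item in json.loads(raw):
        cid = item["id"]
        for dim, allowed in SCHEMA.items():
            value = item.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = {dim: item[dim] for dim in SCHEMA}
    return coded

# Usage: look up one comment's codes by ID (hypothetical example ID).
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"mixed","policy":"unclear","emotion":"fear"}]')
codes = parse_coding_response(raw)
print(codes["ytc_example"]["emotion"])  # fear
```

Validating each dimension before storing the codes catches malformed or off-vocabulary model output early, which matters when downstream tables (like the one above) assume a closed set of values.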