Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The future date in the first Terminator movie was 2029! That movie came out in 1…" (ytc_UgwzGt0rC…)
- "I suppose robots (or AI) will need some sort of rights that will give them oppor…" (ytc_Uggd7HuqJ…)
- "I am worried about AI getting out of control and turning against us. I feel the…" (ytc_UgxNVOH9G…)
- "All the worries , concerns, and examples of what AI can and should not do is all…" (ytc_UgxyxhZv1…)
- "Arent all those text model ai's just an overlty compicated and imperfect chinese…" (ytc_Ugwzpgl_G…)
- "The thing is… I go back, look at what happened, and then build something togethe…" (ytc_UgxWtrGTS…)
- "i hate ai art, ai doesn't have imagination bc it's not a human, ai just copies w…" (ytc_UgyJDvpLu…)
- "At least this time the AI wasn't directly telling him to do it. Unlike some of t…" (ytc_Ugzekvc7U…)
Comment (youtube · Viral AI Reaction · 2024-07-30T22:5…)

> The thing is it's advanced so fast that it's kinda ripping away some artists commissions in the first place. A robot can do alot more in terms of computers and whatnot, but it shouldn't take away the joy of what art really is. Digital hardware has always been evolving and evolves faster than humans. We're fucked if this keeps up
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgyQzw4LhP-yx_dbedp4AaABAg.A6WyVY6nYpHA6XUA3N8_Rp","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugyjo67cOo5a7aPXMVZ4AaABAg.A6Wv5nNdV1kA6Yhl_zzn_W","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_UgwiVLubqIIRQKjjYAZ4AaABAg.A6Wdo0A_37QA6ZxPvp85fX","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwiVLubqIIRQKjjYAZ4AaABAg.A6Wdo0A_37QA6fqpvsqgNq","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgztNgKijyQLEgImFUd4AaABAg.A6WaeiqwY-1A6XbysTUlEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugzxs2Ul9jQehziHYI54AaABAg.A6WO2x28qgKA6nAsA78fQp","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugw7-tZMHDfRFlW5kA14AaABAg.A6WGCZmN5jeA6Wg6foIECm","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgzUKXAWPgiQRbOH3z14AaABAg.A6VYwUL7JYsA6ZqOI_pTlW","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugxiw77-LcTSF5Zumh14AaABAg.A6VYYDGfhhaA6WlxlWeDus","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxHmc8GYPEDejVRcPd4AaABAg.A6VO4056KZKA6VPl0ZG2rS","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
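The raw response is a plain JSON array, one object per comment, so looking a coding up by comment ID is just a parse-and-index step. A minimal sketch of that lookup (the `lookup_coding` helper is a hypothetical illustration, not the tool's actual API; the field names and the example ID come from the response above):

```python
import json

# A one-entry excerpt of the raw batch response shown above.
# Field names ("responsibility", "reasoning", "policy", "emotion")
# match the coding-result table for this comment.
raw_response = """[
  {"id": "ytr_UgztNgKijyQLEgImFUd4AaABAg.A6WaeiqwY-1A6XbysTUlEC",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding dict for one ID."""
    codings = {row["id"]: row for row in json.loads(raw)}
    return codings.get(comment_id)  # None if the ID was not coded

coding = lookup_coding(
    raw_response,
    "ytr_UgztNgKijyQLEgImFUd4AaABAg.A6WaeiqwY-1A6XbysTUlEC",
)
print(coding["responsibility"], coding["policy"])  # ai_itself regulate
```

Indexing the whole array once into a dict keeps repeated ID lookups O(1), which matters when a batch response codes many comments at a time.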