Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Thing is, when something goes wrong with humans, it generally becomes obvious and gets corrected before becoming a catastrophe. Computers produce total gibberish with the same breathtaking speed with which they produce quality products. Two AI’s communicating with eachother could crash the whole economy in less than 24 hours and us humans would be out here being all “Why bank card no worky?”. 🥴

Platform: youtube · Video: Viral AI Reaction · Posted: 2024-12-26T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugw3IDyt37nVlxqo7P54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugxhc8_fGYz8WwKONfV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyNg5Mv-7KTDLjhI-N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgygLOKbHnz7wFOfImV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgynqffQb3g1LitN5194AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugza2LDq1JdeL4qlhzV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzUBdmHqAp1ZNEKqEh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwtQLb57LmiS7-6WhV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxp4wtB1rcuPrzf-0p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyS4VzzEfnUkmnmSV54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
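Looking up a single comment's coding inside a raw batch response like the one above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: it assumes the raw response is a JSON array of records keyed by `id`, and the helper name `index_by_id` is hypothetical. The two records in `raw_response` are copied from the array shown above.

```python
import json

# A raw LLM batch response: one JSON array of coded comments
# (two records copied from the response shown above).
raw_response = """[
  {"id":"ytc_Ugw3IDyt37nVlxqo7P54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgygLOKbHnz7wFOfImV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

def index_by_id(response_text):
    """Parse a raw batch response and index each coded record by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

coded = index_by_id(raw_response)
record = coded["ytc_UgygLOKbHnz7wFOfImV4AaABAg"]
print(record["policy"], record["emotion"])  # regulate fear
```

Keeping the raw text alongside the parsed dictionary makes it easy to audit exactly what the model emitted for any comment ID, which is the point of this view.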