Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "You're thinking too much like a capitalist. You need to put yourself in the min…" (`ytc_Ugy0XAEG_…`)
- "Maybe so, bust what do you think employers would want, someone they have to pay …" (`ytr_UgwEWKEg3…`)
- "@DonaldWWitt you know what? If they only used AI trained on works that were _giv…" (`ytr_UgzRbPsAJ…`)
- "AI will for sure take jobs. Ai might not do the job end to end yet, but 1 person…" (`ytc_UgyZi-XN-…`)
- "This isn't real, right. It can't be. They didn't actually hand a robot A machine…" (`ytc_Ugx2aZvq5…`)
- "It's NOT AI that is afraid of us or we need to be afraid of, it's about APPLE, C…" (`ytc_Ugzyw2Ui3…`)
- "When you ask Max how many lives he would end in order to save AI, he's clearly w…" (`ytc_Ugzrq0pzh…`)
- "What's the essential difference between hiring a person to create artwork for yo…" (`ytc_UgzCqX3B8…`)
Comment

> If I was a judge and was given a legal argument drafted by an AI, I'd immediately give the winnings to the opposite side. AI are unreliable at this time and the technology is constantly being misused and mismanaged

youtube · Cross-Cultural · 2025-08-22T16:1… · ♥ 25
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
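The dimension values in the table appear to come from a closed code book. A minimal validation sketch, assuming the categories visible on this page are the allowed ones (`CODE_BOOK` and `validate_code` are illustrative names, not the tool's actual schema, and the real code book may contain more categories):

```python
# Allowed values per dimension, inferred from the codes shown on this page.
CODE_BOOK = {
    "responsibility": {"none", "company", "developer", "user", "ai_itself"},
    "reasoning": {"none", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def validate_code(code: dict) -> list:
    """Return a list of problems with one coded row; an empty list means valid."""
    problems = []
    for dimension, allowed in CODE_BOOK.items():
        value = code.get(dimension)
        if value not in allowed:
            problems.append(f"{dimension}: unexpected value {value!r}")
    return problems

# The row from the Coding Result table above:
row = {"responsibility": "ai_itself", "reasoning": "deontological",
       "policy": "liability", "emotion": "fear"}
print(validate_code(row))  # → []
```

Running every coded row through a check like this before analysis catches cases where the model drifts outside the code book.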
Raw LLM Response
```json
[{"id":"ytc_Ugz0a-SL7Uk8KB9pJKV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyiBmlMl1yicYDHkDt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzQTXn1Xgu08dYI0ix4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwkVtnLyTZm9t-5eBl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxrWgR4H6Fy3riBzZd4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyvz064NuvnTG4RWJd4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
{"id":"ytc_UgzwYs46RsphvW-gDOp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyDKwWgJKL6dg9vu5R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxy3sV5qCCEm96yifZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwkynGypu27o8khtb14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"}]
```
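The look-up-by-comment-ID view above maps each comment ID to its coded dimensions. A minimal sketch of that lookup, assuming only the JSON shape shown in the raw response (the field names `id`, `responsibility`, `reasoning`, `policy`, `emotion` come from the response; `index_by_comment_id` is a hypothetical helper):

```python
import json

# Two rows copied from the raw LLM response above, as a small sample.
RAW = '''[
  {"id": "ytc_UgwkVtnLyTZm9t-5eBl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzwYs46RsphvW-gDOp4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

def index_by_comment_id(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded rows) and
    index each row's coded dimensions by its comment ID."""
    return {row["id"]: row for row in json.loads(raw_response)}

codes = index_by_comment_id(RAW)
print(codes["ytc_UgwkVtnLyTZm9t-5eBl4AaABAg"]["emotion"])  # → fear
```

Indexing once and then looking up by ID is what makes the "Look up by comment ID" inspection cheap, even across many batched responses.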