Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "We're not taxing companies as much, with the excuse of it hurts companies and jo…" (ytc_UgyyIhjOT…)
- "Humans tend to do what they can for self-preservation. Does that make humans too…" (ytc_UgwvEQiYc…)
- "The whole concept of exchanging human labor for wages will have to be renegotiat…" (ytc_UgzqabT7Y…)
- "@yendyjc the people i see on the road daily wouldnt last a second without automa…" (ytr_Ugx3wlxbe…)
- "If the AI singularity ever happens, the T-800 is knocking on JimBob’s door first…" (ytc_Ugz66b51Q…)
- "i would like to see the links he's talking about with the ai realizing its being…" (ytc_UgxXugQ5O…)
- "They're always saying how safe these driverless cars are. Yet, there aren't tha…" (ytc_UgwqFEkfj…)
- "I'm sympathetic to people who are opposed to AI in some sense but I think there'…" (ytc_UgwEvm8w2…)
Comment

> Isn’t fair use only valid if the use in question isn’t competing with the source?
> Like reviewing a videogame still gives a person a reason to play a videogame. But why would someone buy a TIMES magazine if AI can just tell them what’s in it anyway.

youtube · AI Responsibility · 2026-04-11T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw679e2QgZFrF-dWYd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxfndZJWYaFOu1rs2N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgxfuZBgppz2VIiTc4N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyD5tP27ZkcIRq1v7l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwUGxckbTE7FQlhBr54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxU6H6Z9-DofUGXpkR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwh13APZyjDXASgMf54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyZD8m1wTIqd_qIsF54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyjyJR7HPuHuIodk5x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzwN2Jy13y1qDwIkJ94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
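A batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the allowed category values are exactly those observed in this sample (the full codebook may permit more), and that comment IDs begin with `ytc_` or `ytr_` as shown in the sample list.

```python
import json

# Category values observed in the sample response above; the real
# codebook may allow additional values (assumption).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself", "government"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"outrage", "approval", "indifference", "fear", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        # IDs in the samples above start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgwUGxckbTE7FQlhBr54AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
coded = validate_batch(raw)
print(len(coded))  # 1
```

Failing loudly on an unknown value catches the common failure mode where the model invents a label outside the codebook, rather than silently storing it.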