Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click an entry to inspect its full coding):

- "Couldn’t you also say the same thing for AI art? Because they’re practically ste…" (ytr_UgyVE5UgR…)
- "@aaacolorcodedlyrics It's all connected tho, you can't advance in one field with…" (ytr_Ugzkpi5Z6…)
- "@terrytibs3365 All the travel agents, secretaries, printers, etc...all said the …" (ytr_Ugz9afUw-…)
- "It really would not matter a dog's dick how statistically safe AI cars are per d…" (ytc_UgzEJLQTr…)
- "@SineEyed Hate to say it but a lot of artists (even the ones with a deal) use sit…" (ytr_UgzAGwxJy…)
- "The thing is in my experience bots are replacing jobs and saving companies money…" (ytc_UgwvV7Edv…)
- "So let me get this straight. This man owns an AI Safety company, and is preachin…" (ytc_UgxPadi9M…)
- "45:28 an example I'm aware of that fits this topic is number theory in algebra. …" (ytc_UgxJbud1i…)
Comment

> I sometimes wonder if the amount of media out there on the topic of the existential threat of AI feeds into its own "learning" that it SHOULD be an existential threat? Wouldn't that be ironic?

Source: youtube
Topic: AI Harm Incident
Posted: 2025-07-29T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgytM3bNtAn0uRffuKV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxF1DhBtgMCsZm0KnZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyg4JUd8S2hKyUXk_R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxLAkxk_zly2jXdp0l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw5jN-oxePRqx1OCsZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzDcEuQjsefxdOjaOx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyFUfJt_BmKHyKzZfJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwiPMTek86cZ7PEoGh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgymC3oOEd3F3yGGZg54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzbtRkn6ZSf3nL4y1F4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
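A batch like the one above can be sanity-checked before it is stored: parse the JSON, then verify that every row has an `id` and that each dimension carries a permitted label. A minimal sketch in Python, assuming the label sets are exactly the values observed on this page (the real codebook may define more):

```python
import json

# Allowed labels per coding dimension, inferred from the responses shown
# above; the actual codebook may include additional labels (assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or off-codebook rows."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError(f"missing id in {row}")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={value!r}")
    return rows

# Example: a well-formed single-row batch passes through unchanged.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"virtue","policy":"unclear","emotion":"fear"}]')
print(len(validate_batch(raw)))  # 1
```

Rejecting a whole batch on the first bad row is deliberate here: a label outside the codebook usually means the model drifted from the prompt, so the safer move is to re-run the batch rather than silently drop rows.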