# Raw LLM Responses

Inspect the exact model output for any coded comment. Comments can be looked up by ID, or browsed from the random samples below.

## Random samples
- `ytc_UgwG-HloJ…`: "It's clearly Ai, just the stiff head and moving mouth in a weird way shows it's …"
- `rdc_oag574h`: "Even before AI, it was a big problem with people hiring professionals to intervi…"
- `ytc_Ugw9AwMVZ…`: "Good comparison. Neutral and informative ! Tesla 1st day robotaxi...will improve…"
- `ytc_Ugy4kViug…`: "So this video shows that AI will likely result in a scary future. But it shows t…"
- `ytc_Ugw3pjEen…`: "Are we going to ignore the fact AI needs a lot of energy to even function, and h…"
- `ytr_UgiyiP9uz…`: "Well Mr. Smartypants, since you are so smart we will make you the Standard. May…"
- `ytc_Ugw9RhgIs…`: ""How would a robot know the difference between what's legal and what's right?" …"
- `ytr_UgzTOSMVE…`: "You should really listen to the whole testimony. He openly explains that lawmake…"
## Comment

> Ai is a plague and needs to be eradicated. At no point should anyone be talking to it to have a conversation, and asking questions, what happened to independent fucking research?? Ai is sloppy garbage and needs to go. If its not being used for scientific purposes like engineering, math, physics, advanced modeling, etc it should be illegal.

youtube · AI Moral Status · 2025-12-16T07:1…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
## Raw LLM Response
```json
[
{"id":"ytc_Ugy_EsRwWhiHz5m_GPl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyq-o_mbQLSnC20AjF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxPmX5XJO4ENh8QJpt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy3XJnMjeu7eYVAhPB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-ImwdEeQmxa99MKR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy4-7LE6AY4Gbe36pZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwHfg8wjoo7hh_83PN4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwkSY4TA5RCvMaHVbB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz1PDBCHiliNYw9F2F4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwa3tlM-fVklrrDAsN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
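A raw response like the one above can be checked before it is accepted into the coding table. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the codings visible on this page (the real codebook may define more categories), and `parse_batch` is an illustrative helper name, not part of any actual pipeline.

```python
import json

# Category values observed in the sample responses on this page.
# Assumption: these sets are illustrative, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "company",
                       "developer", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of codings) into a
    mapping from comment ID to coded dimensions, raising on
    malformed rows or out-of-vocabulary values."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row[dim]!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-row batch for demonstration.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]')
print(parse_batch(raw)["ytc_example"]["policy"])  # ban
```

Rejecting a whole batch on one bad value is deliberately strict; a softer variant could instead code the offending row as `unclear` and log it for review.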