Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I'd always thought the "rogue AI" threat was far fetched, but this video does a …" (`ytc_UgwOHWOKk…`)
- "He actually said that the government should control AI so states with high stand…" (`ytc_UgwgOPD3v…`)
- "You have done nothing to stop the impending revolution. 😊 People will continue t…" (`ytc_UgzQp3Ydn…`)
- "If a lawyer faked someone saying in a 10 sec video" I will do that" and that bad…" (`ytc_UgwHSCo8C…`)
- "AI is only as good as the humans who designed it and that is a huge problem…" (`ytc_UgysJeGWm…`)
- "1 human year = 7 dog years / 100 human years = … 7 AI years. / We should really star…" (`ytc_UgzoaNik1…`)
- "Time to study irobot we are doomed, those in charge don’t have the right moral c…" (`ytc_UgxFkmUU9…`)
- "Without J.R.R. Tolkien. (Human to you kids), there is no Lord of the rings (book…" (`ytr_Ugzs2KGMT…`)
Comment

> I’m not AI, but I would say you need to police more in the high crime areas.

Source: youtube · Posted: 2024-03-08T19:3… · ♥ 142
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[{"id":"ytc_Ugzcpl2Uoewzzf5AGol4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugz3qkot4zRXKgr4FfF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxCiP_vu0A4vCAZ-JV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugx-iio22FEF4fjHbWd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugx-oJ_C4EmkRixKO9p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwzYm5BnJlKMZ7b8vp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
 {"id":"ytc_Ugz3BkzmQLRsrts4_6J4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_UgwPtyDx2ql0cmFpQ4d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgxnQhOX6DGkAzOS9iF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgzL4AF8kJyEaa9GkCV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
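The raw response is a JSON array with one object per comment, keyed by comment ID, with the four coding dimensions shown in the table above. A minimal sketch of parsing and validating such a response — note the allowed category values below are only those inferred from the samples on this page; the actual codebook may define more:

```python
import json

# Category vocabularies inferred from the sample responses shown above
# (assumption: the real codebook may include additional values).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "industry_self", "unclear"},
    "emotion": {"approval", "outrage", "indifference", "mixed",
                "resignation", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}},
    rejecting unknown dimensions or out-of-vocabulary values."""
    results = {}
    for item in json.loads(raw):
        comment_id = item.pop("id")
        for dim, value in item.items():
            if dim not in ALLOWED:
                raise ValueError(f"unknown dimension {dim!r} on {comment_id}")
            if value not in ALLOWED[dim]:
                raise ValueError(f"unexpected {dim} value {value!r} on {comment_id}")
        results[comment_id] = item
    return results

# Usage with a single hypothetical entry in the same shape as above:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
codes = parse_coding_response(raw)
print(codes["ytc_example"]["emotion"])  # approval
```

Validating against a fixed vocabulary at parse time catches model drift (e.g. the LLM inventing a new emotion label) before it silently enters the coded dataset.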