Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "@Clyde-2055 Right on POINT... I had my Driver's license when I was 15. If you c…" (ID: ytr_UgwaIYnWS…)
- "this is a bias article. You didn't mention that before you were allowed to use t…" (ID: ytc_UgzTKozoi…)
- "I believe when at some point AI realises itself as a living being, it will value…" (ID: ytc_UgyXDHgqG…)
- "1:40 and the only people I see using AI as "art" are failed artists or people wh…" (ID: ytc_UgyIpO378…)
- "They have nukes for the past ten years. We are way past the issue of stopping NK…" (ID: rdc_dl04gsn)
- "@goldenbanana7250 well the electricity cost would be minimal. Like less than a d…" (ID: ytr_UgxaPgBqt…)
- "Part of how Tesla claims minimal crashes is because they internally define a "cr…" (ID: ytc_UgwoBcvqY…)
- "I would imagine a law-specific AI model could be very useful but still require h…" (ID: ytc_Ugwk2s8Vx…)
Comment
AI is simulated human beings. And they should be controlled by all means. Modules, programs, softwares should control them and put a limit on what they can do.
Platform: youtube · Topic: AI Governance · Posted: 2025-05-28T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxgKfOaHDdN_rcNqd94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwU6jUHAwTtkwX2Af54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzsMY8cOXXACkmDZ0p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyUBASy2QNqZQdPjTx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw1NCsO0rfG5cF3MUl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyeNcbn6-8d7eBt-YR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_Ugx3Ni37noR36ZwlllZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugxi0LEFrF4YxnhSde14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxMPZJ7fEqviJQs5Sp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyYSmgK1SIqJBjVM_l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
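The lookup-by-comment-ID flow described above can be sketched in Python. This is a minimal illustration, not the tool's actual implementation: the variable names are hypothetical, and the only assumption taken from the source is the JSON shape of the raw LLM response (an array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys).

```python
import json

# Two records copied from the raw LLM response above; a real run would
# load the full response string returned by the model.
raw_response = """[
  {"id": "ytc_UgzsMY8cOXXACkmDZ0p4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugx3Ni37noR36ZwlllZ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "industry_self",
   "emotion": "indifference"}
]"""

# Index every coded record by comment ID so lookups are O(1).
coded = {record["id"]: record for record in json.loads(raw_response)}

result = coded["ytc_UgzsMY8cOXXACkmDZ0p4AaABAg"]
print(result["policy"])   # regulate
print(result["emotion"])  # fear
```

In practice the model output may also need validation (e.g. checking that every dimension value falls in the allowed code set) before it is stored alongside the coding result shown in the table.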