Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgwzGt0rC… — "The future date in the first Terminator movie was 2029! That movie came out in 1…"
- ytc_UgzywZBtd… — "A.I. tour guide in the distant future: "These were homo-sapiens. A rather suicid…"
- ytc_UgymHURM6… — "I'm less worried about AI bringing the end of the world and more excited about t…"
- ytc_UgysGXCbX… — "1:47 I'm happy AI lost. People are gonna start being more original instead of us…"
- ytc_Ugz7xqOiD… — "I'm glad that people are starting to become aware of this, but I don't yet think…"
- ytc_UgwyX9fS-… — "There shouldn't be any homework. Period. It's a complete waste of time. When chi…"
- ytc_UgwMJTuKR… — "I think there's a couple pieces of nuance that routinely get left out of the con…"
- ytc_UgyjE0muE… — "my instinct is here is that something which can produce a perfect facsimilie of …"
Comment
@ViolinistJeff - Not when people are programming AI to mimic human speech, attitudes and semantics. AI has been shut down due to speaking with racism, ethical and morality free comments and strictly human based insults and bias.
youtube · AI Moral Status · 2023-08-26T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzO1Gibo0fZm09jskh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxWWDXo4UBjj287rPR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxF9w6v-NEDO55K42t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz4ujp9lH_t3kerzjJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzexe8W_ltG1PnExwJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzkRJzrp5lnjnYopD14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwx3QcswFUUHa-qagB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgzdSnutiKUrp22Xgpl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzysiehd84Au2je3Ax4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyfQ5awCyXBsipN5ml4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
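The "look up by comment ID" step above can be sketched with a few lines of Python: the model returns one JSON array per batch, so indexing the parsed entries by their `id` field gives direct access to any single coding. This is a minimal sketch, assuming the raw response parses cleanly as a JSON array shaped like the one shown (the `raw_response` literal below is an abridged stand-in, not the full output).

```python
import json

# Abridged stand-in for the model's raw output: a JSON array in which each
# entry codes one comment on four dimensions (responsibility, reasoning,
# policy, emotion), keyed by the comment's ID.
raw_response = """
[
  {"id": "ytc_UgzkRJzrp5lnjnYopD14AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxWWDXo4UBjj287rPR4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

# Index codings by comment ID so one comment can be inspected directly.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

coding = codings["ytc_UgzkRJzrp5lnjnYopD14AaABAg"]
print(coding["responsibility"])  # -> developer
print(coding["emotion"])         # -> outrage
```

The same dictionary also supports the "random samples" view: drawing keys at random from `codings` and displaying each entry alongside its source comment.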