Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Honestly, Make multiple AI so that if one goes off the deep end another will fix…" (ytc_UgybClLuo…)
- "In my opinion the flaw within the foundations of western civilization are the pr…" (ytc_UgxHRscAk…)
- "The scary part is....the government has been doing this the whole time.... who n…" (ytc_UgyBw8cbk…)
- "I dont understand how this man can be considered one of the fathers of AI when h…" (ytc_UgyG02ECK…)
- "I need my deck replaced, new plumbing and electrical work. Someone to cook and …" (ytc_UgxCvmTmf…)
- "@ryderwashington4199 Guess what, "AI" exists way longer than the hype right now.…" (ytr_UgwD54SpD…)
- "@Waysed7330 90% of everything is going to be garbage. Total AI generated slop ev…" (ytr_UgywOVb4B…)
- "It won't take my job I'd like to see Ai install carpet in anything and robots at…" (ytc_UgwVjwgld…)
Comment

> There are two types of algorithms, the one that predicts user’s intentions, and the other that guides it. I am fearful of the second one that it has already struck a lot of people already as if it’s forming a religion.

Source: youtube
Video: AI Moral Status
Published: 2025-07-09T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
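The coded dimensions above appear to come from a small closed vocabulary per dimension. A minimal validator sketch; the allowed-value sets below are inferred from the samples visible on this page, not a complete or official schema:

```python
# Hypothetical allowed values per coding dimension, inferred from the
# coded rows shown on this page (assumption, not an official schema).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed"},
}

def validate(code: dict) -> list[str]:
    """Return a list of problems with one coded row; an empty list means it passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        if code.get(dim) not in allowed:
            problems.append(f"{dim}={code.get(dim)!r} not in allowed set")
    return problems

# The Coding Result row above passes; an unknown emotion value does not.
print(validate({"responsibility": "company", "reasoning": "deontological",
                "policy": "regulate", "emotion": "fear"}))  # []
```

A check like this is useful as a guard between the raw LLM response and the database, since model output can drift outside the expected vocabulary.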
Raw LLM Response
```json
[
{"id":"ytc_UgybOHvncLweRqC5WCB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzirYfteVPPBPArt6J4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwknrEx-rhAglHtS-t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwxRbxg2kxwCj_UoHF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxRamO3CG81KqbCRmx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyrFfJFMLPbgmtojY94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9RbU0wgJGryRC7094AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgxGi03h5wGlU_4v1WF4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz8CAjkvWT0V0jl84h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwX4yYiWQ-d3xKFklN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"resignation"}
]
```
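The raw LLM response is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of parsing it into a lookup table keyed by ID, using the standard `json` module; the `codes_json` string below is truncated to two entries from the response above:

```python
import json

# First two rows of the raw LLM response shown above (truncated sample).
codes_json = """[
  {"id": "ytc_UgybOHvncLweRqC5WCB4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzirYfteVPPBPArt6J4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]"""

# Index the coded dimensions by comment ID so any coded comment
# can be looked up directly, as the page's lookup widget does.
codes_by_id = {row["id"]: row for row in json.loads(codes_json)}

row = codes_by_id["ytc_UgybOHvncLweRqC5WCB4AaABAg"]
print(row["emotion"])  # fear
```

Keying by `id` rather than array position matters here because the model returns codes in batch order, which need not match the order comments were stored.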