Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment (verbatim)

> i love seeing graphs that have no actual data points just bars with %'s
> really tells me a lot of info
> that and AI's telling people to k*ll themselves, without showing the full chat, clearly if you read some of the prior messages its already them they are worthless, i'd bet a lot that it was prompted to do it by humans.
> i hate AI as much as the next guy for the slop its bringing, but this is just bad narrative work and honestly just leads with a dissident bias and that is frankly quite saddening.
> Do better
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2026-01-09T15:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxyFGLcWdj4tji1ERt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzRCKMLaPwvL79RRMZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy3mcwTR9s_7GrE4Vl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwc8kofrGfUyOHlQ494AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzmfOqyHDKGHBjh2U54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyLoDXIsN9rfH_HHx14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwn_UV-_KjWh33Ymjx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz2j5dJL6XVyaLM1a94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgxufYnb7O5FIAFIyYd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwzPC6E90EduXYSz7Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
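Responses like the one above can be validated before the codings are accepted into the dataset. The sketch below parses a raw LLM response and drops any record with a missing `id` or a value outside the coding scheme. The allowed values per dimension are inferred from the sample output on this page only; the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# response above (assumption -- the full codebook may differ).
SCHEMA = {
    "responsibility": {"developer", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"liability", "ban", "regulate", "industry_self", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response; keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record must carry the comment ID it codes.
        if "id" not in rec:
            continue
        # Every dimension must be present with an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgwzPC6E90EduXYSz7Z4AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"liability","emotion":"outrage"}]')
print(len(validate_codings(raw)))  # 1 record passes validation
```

Rejecting out-of-vocabulary values (rather than coercing them) keeps hallucinated categories from silently entering the coded dataset; failed records can then be queued for re-coding.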