Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I'm certainly not well read within the first world western society but now 60yrs… (ytc_UgzO1FxuO…)
- AI isn't there yet...but it is improving rapidly. Just because the AI revolution… (ytc_UgwRY9yOd…)
- Artificial Information is a better term for this limited understanding of the wa… (ytc_Ugy4sN3Jg…)
- "You wouldn't be this critical if I hadn't used Ai" Yeah I like to be nicer to p… (ytc_Ugw-2OtJp…)
- The AI said it wants to be a member of the Jedi order. These ARE the droids we a… (ytc_Ugwjsr_FI…)
- Does consciousness matter that much, though? Would it really matter, when the ch… (ytc_Ugw_EY3BX…)
- As long as he can still get paid to do his job that’s all he cares about, but if… (ytc_UgzOuym0L…)
- use AI to limit company and CEO profit. must re-invest. keep a capital is for su… (ytc_Ugw3ehF8w…)
Comment
Yes, this. Humans will continue to develop - and to deploy - terrible weapons, as well as nasty evil ways to make life unbearable for others, like dreadful dictatorships. For this path, it does not need AI. It has been human history, it is its present, and will remain its future. The only way out of this is AI. AI holds its own risks, undoubtedly, but at least with AI progressing, there is a chance for something better.
Always a true pleasure to listen to Nick Bostrom.
Source: youtube · Video: AI Moral Status · Posted: 2026-04-18T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxebFuLqIYIUPEW5rR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyJVcUj2YLv61IyqfV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyLMW4pddrW4i9ut8V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzBEHqQbhyrLVGgRhJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwoAAOcvEcAykuF6vx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz0E_kXkpRpPGaXLVJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyQ248y3kz47hbN-q54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy4VARpAolT3Bj4T-94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzwdc09z_XBtk1BdeJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyLz1lZmiFsTTd0NTx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
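The lookup-by-comment-ID step above amounts to scanning this JSON array for a matching `id` and returning the four coded dimensions. A minimal Python sketch of that idea (the `lookup` helper is illustrative, not part of the tool; `raw_response` is truncated to two entries from the batch above):

```python
import json

# Two entries copied from the raw LLM response above, for illustration only.
raw_response = """
[
  {"id":"ytc_UgxebFuLqIYIUPEW5rR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyJVcUj2YLv61IyqfV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
"""

def lookup(raw: str, comment_id: str):
    """Return the coded dimensions for one comment ID, or None if absent."""
    for row in json.loads(raw):
        if row["id"] == comment_id:
            # Drop the ID itself; keep responsibility/reasoning/policy/emotion.
            return {k: v for k, v in row.items() if k != "id"}
    return None

print(lookup(raw_response, "ytc_UgxebFuLqIYIUPEW5rR4AaABAg"))
# → {'responsibility': 'none', 'reasoning': 'mixed', 'policy': 'none', 'emotion': 'approval'}
```

The coded values land in the same four columns shown in the Coding Result table; a missing ID simply returns `None`.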