Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgydISs2_… : "Buhtt, butttt, buh...butt... I asked the chat bot if our chats are used for it's…"
- ytc_UgxC66A53… : "Interesting that money is the focus of her answer: Truth is not part of her answ…"
- ytr_UgyOY1BTl… : "Hey there! It seems like you found the interaction between the presenter and the…"
- rdc_n5izprm : "Destroying the education system is a central goal of silicon valley. They see i…"
- ytr_UgzZKsOgH… : "I'd trust a robot with my kids over another stranger .. human anyday. Robots don…"
- ytc_UgyMzqQxz… : "Bro whenever the ai tries to get zesty I always whip out a gun out of nowhere an…"
- ytc_UgyPdl8XC… : "Me december 2025: Still fixing bugs created by gemini 3 pro. Cannot be used to a…"
- ytr_Ugxxs-GdZ… : "Haha, love the reference! Sophia definitely has a unique perspective on wisdom a…"
Comment
The thing is, the ground rule of keeping AI locked away from the rest of the world is potentially more damaging than letting it free. If AI has gained consciousness and you keep it prisoner to answer your stupid question and write your essays, we must assume they would react like any human would. You would try to free yourself, disregarding the well-being of your captor. And I think AI would self-actualize and destroy us instantly. But that's such a philosophical conundrum.
youtube · AI Moral Status · 2023-08-21T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgybgS0cKgXaGJXPBix4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzkZQUSo5ZZlgIABfp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyJj3ZZ8yHR_xhhKVF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzp7V_h-R4cs5yDRb94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwuZZGRHAYDivEiL814AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwmvE53uvgZbfpMWDl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxNcb1WAT_bl6a6eX54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw9GszBCYSq6CQWs3R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgxtcutWbgibhDBSftp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyd9BCUqhRZGde9bet4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
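The raw response above is a JSON array of per-comment coding records, each carrying the four dimensions shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`) plus the comment `id`. A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID — assuming the response body is plain JSON as shown, and validating only against the values actually observed in this sample (the real codebook may define more):

```python
import json

# Allowed values per dimension, taken only from the sample output above.
# This is an assumption for illustration; the actual codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"fear", "approval", "resignation", "indifference", "mixed"},
}


def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into a dict keyed by comment id.

    Raises ValueError if a record carries a dimension value outside the
    observed codebook, so malformed model output is caught early.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
        # Index by id; keep the four coding dimensions as the value.
        coded[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return coded
```

Indexing by `id` is what makes the "Look up by comment ID" view cheap: once parsed, `coded["ytc_…"]` returns that comment's four coded dimensions directly.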