Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgwD4K0xl…`: "Platooning" isn't about improving drivers. It is about training AI, and ensurin…
- `ytc_UgzVN1_1R…`: This is following many people suddenly being billed for the software that allows…
- `ytc_UgwRLUyFN…`: do you really think we're absolute idiots? even an elementary student could tell…
- `ytc_UgxFLqDXz…`: I asked ChatGPT how it would like me to talk to it. Its response was its pointle…
- `ytc_Ugznty6Om…`: I can't help but be polite to AI even if I look ridiculous, if I don't, I'll pro…
- `ytr_Ugxw-Jnoq…`: Its no different than fast food for the mind really, depends on how much you dep…
- `ytr_UgxmD_2Jb…`: @IvellScarlett yeah but how do you wanna make an AI if it cant learn.... put som…
- `ytc_UgyY5SS8O…`: One tiny bitty problem, how's this AI going to configure a plan to build weapons…
Comment

> I don’t think superintelligence is the main goal of AI companies. They want the fruits of superintelligence without the awareness. But in order to build that, they may have to do gain-of-function tests to create real superintelligence under controlled conditions to understand how NOT to create it in the end product, and that's where the danger lies.

Source: youtube | Video: AI Moral Status | Posted: 2025-10-31T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzoYZLwz1hvNcmWdih4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwqEcV4Qs5OkZ4AFgN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzqRekSJOzVfIBImfh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxzeFkkpaR4Jdj5J5J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxMQgb3wFL9aJnLrj54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9NqqZ5u5z9bOVc754AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw4lYL_D-jVZDsPA9B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwhN7AlDS6bIJ4PAGh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgydiU7eVhVJv35V0xF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgweoqkAkh4nIO_Iwwl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
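The lookup-by-comment-ID flow above can be sketched in Python: parse the raw LLM response (a JSON array of per-comment objects) into a dictionary keyed by comment ID, keeping the four coding dimensions shown in the table. This is a minimal sketch, not the tool's actual implementation; the function name `parse_raw_response` and the "unclear" fallback for missing dimensions are assumptions.

```python
import json

# The four coding dimensions used in the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of objects, one per
    comment) into a lookup table keyed by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        # Keep only the known dimensions; default missing ones to "unclear"
        # (an assumed fallback, mirroring the codebook's own "unclear" value).
        coded[row["id"]] = {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

# Example: look up the coded comment shown above by its ID.
raw = ('[{"id":"ytc_UgydiU7eVhVJv35V0xF4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
coded = parse_raw_response(raw)
print(coded["ytc_UgydiU7eVhVJv35V0xF4AaABAg"]["policy"])  # liability
```

Keying on the ID is what lets the interface jump from a truncated preview like `ytc_UgydiU7eVh…` straight to the full coded record.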