Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Tracing your ref is not the same as copying. Legally, if you change the image 20… (ytc_Ugz42zvWc…)
- @Authent12 we are in the dark ages now. Until we know everything we are in the d… (ytr_UgzB8feL7…)
- I think AI will replace many of our Jobs in time, but will have to adapt to thos… (ytc_UgxZITOas…)
- Within a decade or so, the hardware required for training of AI models will be "… (ytc_Ugydrgoes…)
- you can give an A.I knowledge, but you can't give it wisdom. What is wisdom? Som… (ytc_UgwOteGG9…)
- and it's probably thanks to people like you that software sucked before AI. not … (ytr_UgxKZ0jMu…)
- So the only way to make AI more intelligent than humans is to make humans less i… (ytc_Ugw5GTAGz…)
- Both. Phone camera has AI applied to some degree. So this question gets way more… (rdc_oi1ucxc)
Comment

> The worst thing about Sam Altman is that he knows exactly how dangerous his AI could be, but then he thinks to himself, "but if i do more reasonable and safe AI research we won't make our investors happy, then someone else will make all the money, BuT I WaNt To MaKe AlL tHe MoNeY's!"

Source: youtube | AI Moral Status | 2025-12-11T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgyHEL9aXmkwse6sd014AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzmoqPtnSiECpc-lAF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzChfszO4tCIHgnIpt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxOX6fRm2A7EkdaKEt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwo0XSsUlR2C8Qgk8x4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz07uwBM8eB8-Eexdt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzyLBd0bJVGnjLJGh54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugz2-j2Dnfmii6r1WgB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwkIo45-OPvrWRp9xt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxw91ytLwB2WDdvxGt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]
```
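A raw response like the one above is a JSON array of per-comment codes, one object per comment ID, with the four coding dimensions from the table. A minimal Python sketch for parsing such a response and tallying each dimension (variable names are hypothetical; only two records are inlined here for brevity, and the array must be valid JSON):

```python
import json
from collections import Counter

# Two example records in the shape the model returns (abridged from the
# full response above); in the real pipeline this string would be the
# model's raw output.
raw = '''[
{"id":"ytc_UgyHEL9aXmkwse6sd014AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxOX6fRm2A7EkdaKEt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''

codes = json.loads(raw)

# Index records by comment ID for lookup, mirroring "Look up by comment ID".
by_id = {rec["id"]: rec for rec in codes}

# Tally each coding dimension across records; missing fields fall back
# to "unclear", matching the table's default.
dimensions = ("responsibility", "reasoning", "policy", "emotion")
tallies = {dim: Counter(rec.get(dim, "unclear") for rec in codes)
           for dim in dimensions}

for dim, counts in tallies.items():
    print(dim, dict(counts))
```

The same parse-then-index step also supports the comment-ID lookup shown at the top of the page: `by_id["ytc_UgxOX6fRm2A7EkdaKEt4AaABAg"]` returns that comment's full code record.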