Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by picking one of the random samples below.
Random samples

- "Who cares if it was AI. The point is when anyone agitates, impedes or doesn't pr…" (ytc_UgwsD9odt…)
- "Total hype. AI will replace all jobs but to say all jobs will be 'wiped out' is…" (ytc_Ugx-4amem…)
- "You should have been downloading everything that you thought was valuable over t…" (ytc_UgyAq6TKz…)
- "The brain processes information from the environment and executes the calculated…" (rdc_du46h2b)
- "Artists can still do it for themselves / Tho yeah, they won't be able to earn mone…" (ytr_Ugx8foFZV…)
- "Driverless cars should never be a thing 🤦 some day it's going to be a kid not a …" (ytc_UgywyB7MN…)
- "Fr tho, i only use it for tramatising the ai but sometimes it tries to "punish" …" (ytc_UgzwWydQG…)
- "Fr I missed when I hear ai I think of sci fi movies about ai going crazy and wan…" (ytr_Ugx5ZYn-4…)
Comment
A robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey orders given it by human beings except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law. Issac Asimov. This is all very very frightening folks!
youtube · AI Moral Status · 2023-09-07T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
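Each coded record carries the four dimensions shown in the table. As a minimal sketch, the values can be checked against the codes that actually appear on this page; note the allowed-value sets below are inferred only from the responses shown here, and the project's full codebook may define additional codes.

```python
# Allowed values per coding dimension, inferred from the coding results
# visible on this page (assumption: the real codebook may include more).
DIMENSIONS = {
    "responsibility": {"ai_itself", "developer", "company", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate(record: dict) -> list:
    """Return a list of problems with one coded record (empty if valid)."""
    problems = []
    for dim, allowed in DIMENSIONS.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

# The record from the table above passes; a misspelled code does not.
good = {"responsibility": "ai_itself", "reasoning": "deontological",
        "policy": "unclear", "emotion": "fear"}
bad = {"responsibility": "ai_itself", "reasoning": "deontologicl",
       "policy": "unclear", "emotion": "fear"}
print(validate(good))  # []
print(len(validate(bad)))  # 1
```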
Raw LLM Response
[
{"id":"ytc_UgxYYfWJ8NTgmDZ1HGp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQWRV0ctLYY9YVfAl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzgiXurQDVWlaE7YQ14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzBPsqlobQXqnagfD94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwuegNWGl5oYUDzz0x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyfvgqXYIqTt-oJubx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz-oNAotAVpv1ht9bB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyGbAL4zxk782Om8LV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxUPBuJIl8ayYq45rt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxF8aNpOnAIHjqjCD54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
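The raw response is a JSON array with one record per comment, so the "look up by comment ID" view can be reproduced by parsing the array and indexing it by `id`. A minimal sketch, using two records copied from the response above (the full response would be loaded the same way):

```python
import json

# Two records copied verbatim from the raw LLM response shown above.
raw = '''
[
  {"id":"ytc_UgxUPBuJIl8ayYq45rt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxF8aNpOnAIHjqjCD54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
'''

# Index the coded records by comment ID for O(1) lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw)}

coding = by_id["ytc_UgxUPBuJIl8ayYq45rt4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

This is the record whose coding result (ai_itself / deontological / unclear / fear) is displayed in the table above.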