Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Why not live in harmony? All w have to do is make a ai art genarator that only h…" (ytc_UgwuCUN4M…)
- "Its leaked footage from a robot test site, the robots will rise against humans a…" (ytr_Ugy33HLtY…)
- "It seems like you're not a fan of the conversation! The interaction between Soph…" (ytr_Ugz3r90pZ…)
- "You just play this off like it cannot potentially become a REALLY big deal in th…" (ytr_UgzfjzFsK…)
- "I'm going to ask chatgpt whether I should breath oxygen. If it says "no" I'll po…" (ytc_UgzxZGgs6…)
- "The fact this person decided to comment repeatedly is kinda sad. And can we not …" (ytc_Ugyj5Gls2…)
- "write the script for a video essay in the style of the youtuber exurb1a on wha…" (ytc_UgzvicKY-…)
- "Well how foolish this people are they don't use a.i to improve their drawing but…" (ytr_UgzNUcJIM…)
Comment
> Ok, I'm only 5 minutes in, and have not read the book. But here's my opinion-
> These AI companies would not build a device that could take away THEIR control
> They're only building these things to control us. And that's why AI fear has spread across only the internet in such a short time. If we just stop using these devices they lose all control (except maybe the weather, which is a real Ahole thing to do)
Source: youtube | Video: AI Moral Status | Posted: 2026-01-04T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwo1P8kisYu_1IAwe54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwAERHzdC0QhPBUAPd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzU38CVeCSuHrUQ_jt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyMflifZsFXoXafBa54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyXfHEwu88GP9Htddp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyHHAIpRBNdQfiV78d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxnuNPn12og6DD9ZMR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwbhKyWyRViJUoFgwF4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz9nrrKluo20eoRQxp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgwTCO29C3Xm7_404-V4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"}
]
```
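A response like the one above has to be parsed and sanity-checked before the codings are stored, since an LLM can return malformed rows or values outside the codebook. The sketch below shows one minimal way to do that in Python. The allowed value sets are inferred from the sample output only and are an assumption; the real codebook may define more codes, and the function name `validate_codings` is hypothetical.

```python
import json

# Allowed codes per dimension, inferred from the sample response above.
# Assumption: the actual codebook may contain additional values.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list:
    """Parse a raw LLM response and keep only well-formed coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Each row must be an object with a comment id and a known
        # code for every dimension; anything else is dropped.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical one-row response, shaped like the output above.
raw = '[{"id":"ytc_example","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"}]'
print(len(validate_codings(raw)))  # prints 1
```

Dropping bad rows rather than raising keeps a single malformed coding from discarding an entire batch; the dropped IDs can then be re-queued for recoding.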