Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> This is a very bad idea. Just because you come out with a new gadget does not give you the right to change humanity. This is a very narcissistic view. How do you know people even want this? AI will destroy the future. We already don’t know what’s real from fake on our news platforms. America/the world needs to vote against this. This will quickly turn into dictatorship.

youtube · AI Moral Status · 2026-03-09T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
{"id":"ytc_UgwzRXaiBFPsafAy5wl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy4IUiiANsxzFnNe014AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzUelVzf8q8vj_jXQN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugybk_KDnwMoNLObjPB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNbvrbJVYnjC91P_N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxb7vBAKobsSqQfw1p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyKRKTjFGM1qY2y1WB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyWi7gtOXMgf-5dD7l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwsTJwocI2-JaHKE_x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxcXGH4qLv9KQBqVDV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
```
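A batch response in this shape can be parsed and sanity-checked before use. The sketch below is a minimal Python example, not the tool's actual code; the allowed-value sets are inferred only from the values visible in this response, and the raw string is abbreviated to two entries for brevity.

```python
import json

# Abbreviated copy of the batch response above (two of the ten entries).
raw = """[
{"id":"ytc_UgzUelVzf8q8vj_jXQN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwzRXaiBFPsafAy5wl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

# Allowed values per coding dimension, inferred from this response (an assumption,
# not the project's authoritative codebook).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "approval", "outrage", "fear", "resignation", "mixed"},
}

def validate(codings):
    """Index codings by comment ID, rejecting any out-of-vocabulary value."""
    by_id = {}
    for row in codings:
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row[dim]!r}")
        by_id[row["id"]] = row
    return by_id

coded = validate(json.loads(raw))
print(coded["ytc_UgzUelVzf8q8vj_jXQN4AaABAg"]["policy"])  # ban
```

Validating against a fixed vocabulary catches the common failure mode where the model invents a label outside the codebook, and indexing by ID supports the "look up by comment ID" workflow this page offers.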