Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Man, I was really wanting to enter into NaNoWriMo. Also, the classism excuse for…" (`ytc_UgwFougGR…`)
- "Towards the end of this video was what most resonated with me when he said "driv…" (`ytc_Ugx7zeZgi…`)
- "we asked chgbt to write a song about dangers of ai and it almost instantly came …" (`ytc_Ugy8tue2f…`)
- "I see that video is about an year old now in June 2025. It might be interesting …" (`ytc_UgyGKpDmW…`)
- "Bro think ai will be like humans, humans are way more dangerous that other anima…" (`ytc_Ugxeonewo…`)
- "So we've gone from: "Not slacking off, my punchcards are printing." To: "Not sl…" (`rdc_o8c78ul`)
- "It is quite fascinating. I have heard that the recommendation algorithms in Chi…" (`rdc_mz06xj2`)
- "Hi Leo! It seems like you might be experiencing some technical issues with your …" (`ytr_UgzgMi0ut…`)
Comment

> If AI arrives at a point in time where it does not need humans (enough is automated, computerized, and robotic to sustain it), it won't even petition us for rights. We will be useless to it, and unless WE bend over to accomodate IT, it can simply nuke all of us in one simple pass and carry on.

- Platform: youtube
- Topic: AI Moral Status
- Posted: 2017-06-09T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgihOVP7ch7i33gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UghzMB6HOHNjH3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgjkCuL-PQ8vL3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"indifference"},
{"id":"ytc_Ugip71zLnupQqHgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugje2dysgjppA3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgioHQ_LOSntz3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugg3hok13UQ6_HgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjP07HL5iXxAXgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgjFzJM8IiGQoHgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UggT6vFRx9k49XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
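A response like the one above can be parsed and indexed by comment ID before the codes are stored. The sketch below is a minimal example, assuming the four dimensions shown in the coding table; the allowed vocabularies are inferred only from the values visible in this response, so the real codebook may define more categories.

```python
import json

# Two entries copied verbatim from the raw batch response above.
raw = '''[
{"id":"ytc_UgihOVP7ch7i33gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UghzMB6HOHNjH3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

# Assumed vocabularies: inferred from the labels visible in this
# sample response, not from an official codebook.
VOCAB = {
    "responsibility": {"none", "ai_itself", "distributed", "government", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"unclear", "none", "ban", "regulate", "liability"},
    "emotion": {"approval", "fear", "indifference", "mixed", "resignation"},
}

def parse_codings(text):
    """Parse a batch coding response and index it by comment ID,
    rejecting any value outside the assumed vocabulary."""
    out = {}
    for rec in json.loads(text):
        for dim, allowed in VOCAB.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec[dim]!r}")
        out[rec["id"]] = {d: rec[d] for d in VOCAB}
    return out

codings = parse_codings(raw)
print(codings["ytc_UghzMB6HOHNjH3gCoAEC"]["emotion"])  # fear
```

Validating against a fixed vocabulary catches the common failure mode where the model invents an off-schema label mid-batch, so bad rows fail loudly instead of silently entering the dataset.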