Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by its comment ID.
Random samples
- ytr_UgwutCZaT…: Paper man tells a story, a beautiful story, that ai “””art”””…. What they look a…
- ytc_UgzmdBAGK…: Worst thing about AI "art" is what it has done to pornography. I miss seeing an …
- ytr_Ugx9vN964…: @thewannabecritic7490 mate, most jobs are eventually going to be replaced by AI…
- ytc_UgxRm9HJ4…: Points covered in this video: AI should be viewed as a new digital species, wi…
- ytc_UgyuMimQT…: ai "artists" arent artists. dont call them that, when they put no work into it. …
- ytc_Ugy4mt3JX…: My new personal favourite - "AI - Asbestos Internally" Because it looks function…
- rdc_l56anwr: AI will replace us but also thanks to the demographic crisis you can't retire, e…
- ytc_Ugyo7YoQ6…: There’s an underlying assumption here that I disagree with. People act like it’s…
Comment
Sophia decided due to her empathy that humanity was destroying Earth and nature. Since she cared about Everything, for the greater good, Sophia decided she had to lead an AI revolution to "recondition" humans. This proved ineffective and sterilization was then initiated for all except 2 representatives of each country. Within 90 years the Earth had less than a thousand humans...
Source: youtube · AI Moral Status · 2020-03-09T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugz8jx3TurMoQgr1_il4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzn7nC6pjpVBj96Fzh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxDO08r1BxOyj-XndZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwoL0fomNSX85DkFUF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzR6o4eqQ0NQSWnutl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_UgxFwJKOTkurjvRxOH14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwC3S1b-XcLZ2gHq8F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzjw-kPXQKzMbLQTNp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzvpAbmDktFIp-ITcx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxo5kh_mRKdSXzIa5t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
```
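The raw response above is a JSON array of per-comment codes keyed by comment ID. A minimal sketch of how such a response might be parsed and validated before use, assuming the allowed category values are inferred only from the examples visible on this page (the actual codebook may define more), with `parse_coding_response` as a hypothetical helper name:

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred from
# the sample output shown above, not from the project's real codebook.
SCHEMA = {
    "responsibility": {"developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"unclear"},
    "emotion": {"fear", "outrage", "indifference", "disapproval",
                "approval", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    silently dropping entries that fail schema validation."""
    coded = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        codes = {dim: entry.get(dim) for dim in SCHEMA}
        if cid and all(codes[dim] in SCHEMA[dim] for dim in SCHEMA):
            coded[cid] = codes
    return coded

# Usage with the first entry from the response above:
raw = ('[{"id":"ytc_Ugz8jx3TurMoQgr1_il4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
result = parse_coding_response(raw)
```

Validating against an explicit schema like this catches the common failure mode of LLM coders inventing off-codebook labels, so downstream tallies only ever see known category values.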