Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
• I am glad that AI, with its unbelievable craziness, does exist.
• But I am so …
ytc_UgyVfg6yQ…
Regulation always only ever helps the bigger firms within an industry by destroy…
ytc_UgxhKfcv6…
Another thing is that there's no difference in AI art styles, they all look the …
ytc_UgxBT2-91…
And people buy, create and buy their own extinction; how much money does it cost and how muc…
ytc_UgzTb_fv1…
I'm going to get started on my anti robot weapons...looks like a good stock to i…
ytc_UgynwSwZo…
I spent a few nights arguing with the substack support chatbot. I'm a total robo…
ytc_UgwlMKUca…
this is the end result of Capitalist Realism. everything, from the way you think…
rdc_oe4b3n1
I've done two papers on A.I. and each time I was loaded down with anxiety from a…
ytc_UgyakAxa3…
Comment
I feel like there are so many comments that get lost in robots acting like people and eventually disliking us (or other extremes).
There's a big difference between thinking and making decisions for one's self, and actually having feelings. Why give something equal rights if it can only understand how to solve a problem it's given? The moment an AI technology independently goes looking for challenges to overcome, then it will have proven itself as more than just a human's tool.
youtube
2013-09-17T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw8p-R6bjwHYAKgkVB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxbYcfJHJzjEcgoxHZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyveGROYaH0S1oBFBp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxV8t0h1d_SURm03et4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwTGcAfD8jkC_yxBXV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxR9vw3fBh2aBAHuip4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxCATKoPqN_lrmgVgR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwxKs1kmSHzdUWhuuV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxm8Det6C2RgozuVap4AaABAg","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxxPP9ryIoPM7XHN5N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
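The "look up by comment ID" operation amounts to parsing a response like the one above and indexing it by `id`. A minimal sketch, assuming the raw LLM response is a JSON array with exactly the fields shown (the IDs below are copied from the response above):

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per
# comment, keyed by the dimensions responsibility/reasoning/policy/emotion.
raw_response = """
[
  {"id":"ytc_Ugw8p-R6bjwHYAKgkVB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxbYcfJHJzjEcgoxHZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
"""

# Build an index from comment ID to its coded row.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coding by its ID.
row = codes_by_id["ytc_Ugw8p-R6bjwHYAKgkVB4AaABAg"]
print(row["reasoning"], row["emotion"])  # deontological indifference
```

This is one dictionary pass per response, so lookups afterwards are O(1) per comment ID.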