Raw LLM Responses
Inspect the exact model output for any coded comment: look one up by comment ID, or pick from the random samples below.
Random samples (click to inspect):

- @MrPretzel6000 i'd argue its more like a studio work. the database is made with … (`ytr_UgwaaFw_H…`)
- We can't even accept other human beings. I am not afraid of AI. I am afraid for … (`ytc_UgwkuWWV6…`)
- Me Sanders, your proposals are adequate. They leave out, however, the mean issue… (`ytc_UgxqBa5OA…`)
- A man who never thought AI would be used for evil saying Elon Musk has no moral … (`ytc_UgxrPRPw3…`)
- There is the catch, everything that it is "essential" to maintain things runing… (`ytr_UgzwgByM3…`)
- AI is every dystopian novel coming true in real time. I am rooting for an astero… (`ytc_UgzFbObIS…`)
- 29:02 I am just amazed by how stupid she is to say that waymo cannot create a ro… (`ytc_Ugzg6ohZY…`)
- +jog10210 lol kid, this kid has master degrees in psychology and science... do … (`ytr_Ugh8bif9X…`)
Comment
I believe Ai will kill many of us (billion speaking) by all possible means mostly indirectly Ex: Matthew effect going exponential. Its going to be far worse than hell for some time. but it will bring eternal peace for those who remain and generation after them. in this world $$ would be useless, every one would have a fix number of Joule to ''spend'' doing whatever they want each day.
Source: youtube
Posted: 2018-04-16T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzq4Q_khAOQr_8ku3J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxQncBw-CN965L8N894AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx1QkCrhPsZfbBPWwt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzDPEKrbLqUafCfBaR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz1akl15VFUobJauOl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyYcak1jeRRrbt89xF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzIJ7IaCFjD0W9YyZV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw-RpLOAS9Y8VxL0KR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyfnBJ2M1YJEY2TRXt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwqsAaYQNKgOhBYURV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"}
]
```
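A raw batch response like the one above can be parsed into per-comment codings and sanity-checked before it feeds the results table. The sketch below is a minimal, hypothetical validator: the `ALLOWED` value sets are inferred only from the values visible on this page and may be incomplete relative to the project's actual codebook.

```python
import json

# Allowed values per coding dimension. NOTE: inferred from the sample
# responses shown above; the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"approval", "fear", "indifference", "resignation", "outrage"},
}


def parse_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding}.

    Raises ValueError if a record carries a value outside the allowed
    set for any dimension, so malformed model output fails loudly.
    """
    out = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec[dim]!r}")
        out[cid] = {dim: rec[dim] for dim in ALLOWED}
    return out
```

For example, `parse_response(raw)["ytc_Ugz1akl15VFUobJauOl4AaABAg"]["emotion"]` would return `"resignation"` for the response above; an unrecognized emotion such as `"joy"` would raise instead of being silently stored.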