Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I would rather die than ever admit what I’ve said or who I’ve spoken to on that …" (`ytc_Ugy2Eoeo6…`)
- "Proliferate monopolistic health insurance companies that do not cover healthcare…" (`rdc_m6z6rw2`)
- "Some day soon someone is going to hack into the Waymo system and then we will se…" (`ytc_Ugzx3utB6…`)
- "Relax. There is zero evidence that Artificial General Intelligence (which is a s…" (`ytc_UgxvztazT…`)
- ""AI screams to stay on", because those who programmed it told it to do so. ...it…" (`ytc_UgzkzHcQB…`)
- "I could draw a stick figure and it would 1,000X better than any ai art ya’ll wan…" (`ytc_Ugz8KJA4o…`)
- "Funny things was getting the interview interrupted by some random fellow in a ad…" (`ytc_UgxtJlWxp…`)
- "ummmm and why would we need SLAVERY ENFORCEMENT, if AI would replace slaves? You…" (`ytr_UgzQQv3-N…`)
Comment
AI is such a scary but amazing thing to talk about.
The part that scares me is that I don't see what people would do in the very far future. Every task/job that I can think of could (theoretically) be done by a robot... except people sports. Robot sports are "different".
So question: What would our "jobs" be in the very far future? (500+ years from now)
Source: youtube (posted 2013-07-04T21:3…)
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxlymtF7AZIsF8hs8t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy5ELZrEi8odmijOFZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyQK9QtiCZymnsajzB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"unclear"},
{"id":"ytc_UgxCf61kfFLddphnoc94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"unclear"},
{"id":"ytc_UgziRsg2tmEBgr54Rhx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzN2SDs4-ukmmWP7K14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgytIx0dEEb_9BWBLxt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy_8PwpPqJMkTVv31h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzLetMJJYZK8PPsqkt4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwpQdgG9z2dYci7a3d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"unclear"}
]
```
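A response like the one above can be checked programmatically before the codes are stored. Below is a minimal Python sketch that parses the raw JSON and flags unexpected values. Note the allowed value sets here are only the ones *observed* in this sample batch, not the project's full codebook; the function name and the value sets are assumptions for illustration.

```python
import json

# ASSUMPTION: these sets contain only the values visible in the sample
# response above; the actual coding scheme may define more categories.
OBSERVED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "unclear", "approval"},
}

def parse_codes(raw):
    """Parse a raw LLM coding response and flag unexpected field values."""
    records = json.loads(raw)
    for rec in records:
        assert "id" in rec, "every record needs a comment ID"
        for dim, allowed in OBSERVED.items():
            value = rec.get(dim)
            if value not in allowed:
                print(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

# Usage with a two-record excerpt of the response shown above:
raw = '''[
{"id":"ytc_UgxlymtF7AZIsF8hs8t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzN2SDs4-ukmmWP7K14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]'''
codes = parse_codes(raw)
print(len(codes))  # 2
```

Because every code in the sample batch falls inside the observed sets, the sketch prints nothing except the record count; any off-schema value from a future batch would be reported with its comment ID.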