Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- Old 'Its not about quantity, its about quality' adage is more relevant than ever… (ytc_Ugw4FwHN1…)
- not disabled but i have Aphantasia and cant rlly 'imagine' anything or see image… (ytc_UgwHLYZcf…)
- I have used the AI GPT... its a rather stupid Program. It's a talking Wikipedia … (ytc_UgxcdhZ4W…)
- "They" already don't have control. The background scroll of voting totals showed… (ytc_UgyjUWTzp…)
- Yeah that’s what it really comes down to. It’s late stage American Dream-ism tha… (rdc_oe5pat7)
- I work in research at a major university. AI is *not* going to replace the major… (ytc_UgwE1GUGA…)
- If (when) you have AI sentient program on you PC and you chose to uninstall will… (ytc_UgyDlqyAL…)
- @WarpedCatWHY THE FUCK CANT I SEE MY OTHER COMMENT / OH FOR FUCK'S SAKE / anyway / i… (ytr_Ugz0Wmn6b…)
Comment
I agree that AI isn't going to take over things because it's still a computer at the end of the day. However, if you're trying to convince someone who is scared of AI that all you have to do is turn off a data center, if AI gets smart enough that it starts making autonomous decisions like firing nuclear missiles, don't you think it would be smart enough to escape the data center it's in as well? I'm not sure how you would convince someone who is that scared of AI that it's not scary, but the "turn off the data center" answer is overly simplistic.
youtube · AI Governance · 2024-11-10T22:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyA7Ln12hEVOvdyiG94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw8phYbtHgGr7NE51h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy0qLt8y2vaAZ3jbkp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxVAr4F2ObnU2iY7S94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxg5oH9ya7z0aJZN-V4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwtsXnfFT6q9ixTNNZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz9-8HfNOyHtiquQb94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyJvr6J0jfHk6ytR8F4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxnWzlKrMLuZkC6OFh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw1exbf0vVFapr40QV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
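A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: it assumes the value sets visible in this page (e.g. `ai_itself`, `consequentialist`, `ban`) are the full coding schema, which may not be the case, and the `SCHEMA` and `validate_codes` names are illustrative, not part of any real pipeline.

```python
import json

# Allowed values per coding dimension — ASSUMED from the values
# visible on this page; the real codebook may define more.
SCHEMA = {
    "responsibility": {"ai_itself", "user", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"fear", "approval", "resignation", "outrage", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values are in-schema."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

example = '[{"id":"ytc_x","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
print(len(validate_codes(example)))  # 1
```

Rows with out-of-schema values (or missing dimensions) are silently dropped here; a production version would more likely log them for re-coding rather than discard them.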