Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
1:47 I wanna note that this tweet was most likely made ironically. Gooseworx is…
ytc_UgzxiS-p3…
Some of the best paying jobs will soon be gone to A.I. any engineering position …
ytc_UgxLOqUyd…
That’s not even an AI robot, that’s from the game becoming human and it has Been…
ytc_Ugzl_Nnzb…
What I got from this is that moderators lack maidens, and we really need to stop…
ytc_UgzogcBYh…
I love AI, it's fun to generate but I NEVER claim it as MY art and want it to be…
ytc_Ugwz64QYN…
@ It's more of showing who doesn't consent to their art being used to train.
If…
ytr_Ugx0rgxBx…
Let AI have it. Trucking is not a good life. The pay is not worth it. You are go…
ytc_UgyNFkF-1…
Carpocalypse.
The takover of ai
The ai equivalent of the little left to right st…
ytc_Ugzc7moyo…
Comment
You just know that the question on everyone's lips when they are told about the big tech companies signing the document about their concerns over AI's extestential risk would be 'if it's so bad then why are they doing it?'
It seems this question is never addressed, it should be addressed, as just because people don't ask it doesn't mean they don't think it.
youtube
AI Governance
2024-09-24T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
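A coding result like the one above can be sanity-checked against controlled vocabularies for each dimension. This is a minimal sketch; the allowed value sets below are assumptions inferred from values that appear on this page, not the project's definitive codebook.

```python
# Assumed vocabularies, inferred from values visible on this page only.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "approval", "mixed"},
}

def validate_coding(coding: dict) -> list[str]:
    """Return the dimensions whose value falls outside the vocabulary."""
    return [dim for dim, allowed in ALLOWED.items()
            if coding.get(dim) not in allowed]

# The coding shown in the table above passes validation.
coding = {"responsibility": "company", "reasoning": "deontological",
          "policy": "regulate", "emotion": "outrage"}
print(validate_coding(coding))  # [] means every dimension is valid
```

An empty coding fails on all four dimensions, which makes the check useful for catching truncated or malformed model output before it is stored.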
Raw LLM Response
```json
[
{"id":"ytc_UgwiLXYUzt_krrC3CSF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxKrKiWKHhLp15lLAx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxmpVaJAhJ2sJrChsl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy_SsFcuE-PO3CNDqJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzElq51atImiBIFduh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxZ1h6jIA69jpnLpo14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw6YkbNtTueIn-rzMl4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzsUyQF1jYeG0RGq7x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwTNPzHHbIGFGMeTPl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwdsAZrcVt6UXwXSad4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
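The raw response is a JSON array with one coding object per comment, keyed by `id`. The "look up by comment ID" feature can be sketched by parsing the array and indexing it into a dictionary; the two sample objects below are taken from the response above, and everything else is illustrative.

```python
import json

# Two codings copied from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgxmpVaJAhJ2sJrChsl4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwiLXYUzt_krrC3CSF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
"""

# Index the array by comment ID so any single coding can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgxmpVaJAhJ2sJrChsl4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company outrage
```

Because each batch response carries the comment IDs inline, a lookup table like this also makes it easy to detect comments the model skipped: any ID sent in the prompt but absent from `codings` was not coded.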