Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Repetitive tasks. There are lots of jobs that can be replaced, but most aren't r… (`rdc_ekbus9p`)
- 5 years max before AI powered machines are taking over mundane jobs. What then,… (`ytc_UgwxPlX41…`)
- The created Intelligence will always imitate its creator. The problem with a cop… (`ytc_UgxR_f9t3…`)
- Or maybe the exchange is the other way, and Anthropic is providing user data bac… (`rdc_oi45is1`)
- Overhead Crane Operator. I believe to a degree, AI “could” do my job. There ar… (`ytc_Ugy0AMxw_…`)
- It's ironic if those people tell artists to just use AI so they can do something… (`ytc_Ugz-MBw7R…`)
- Nope. Idiots who think cutting corners with AI is the better decision than actua… (`ytc_UgycEXYvq…`)
- @Subarune and that's why companies are racing to create a robotaxi. A car comin… (`ytr_UgxDD-11U…`)
Comment

> It's the same as every country having nuclear weapons. That makes no sense cuz no matter who is the first one to pull that lever and and get that new that nuclear weapon towards we're all going to suffer. I mean it's just like they can do an AI really fast if they want to. But we're all going to suffer. Is it nuclear war? I mean think about it. If Russia blows everybody up with they're going to die too. They're going to suffer too. LOL

youtube · AI Governance · 2025-12-25T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxhPPKZY2KTp88jVPN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy8sfFtHfTKqK-gO4h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzs1TP6JhqHwmMW4gN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzNobJa9Q2YPODre954AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxKSwSo7KesN6UouIJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzYzV7CHoJr0Vq34YR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy10c9rFGpEIWyZn994AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyJZoZKzRL647WxmZB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyQkMsvgOPuFd9HUR14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyxwntR-cQyxP8esnR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
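A batch response like the one above can be parsed and validated with a short sketch. The allowed category values below are inferred from the codes visible on this page (`responsibility`, `reasoning`, `policy`, `emotion`); the real codebook may contain additional categories, and the function name `parse_batch` is hypothetical, not part of the actual pipeline.

```python
import json

# Allowed values per dimension, inferred from codes visible on this page.
# Assumption: the production codebook may define more categories.
ALLOWED = {
    "responsibility": {"government", "company", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only rows whose codes
    all fall within the allowed dimension values."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" in row and all(
            row.get(dim) in values for dim, values in ALLOWED.items()
        ):
            valid.append(row)
    return valid
```

Rows with an unknown code (or a missing `id`) are dropped rather than stored, so a malformed model response never reaches the coded dataset silently.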