Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgykbrtEv…`: 🤖RobotCop:: "i smell you are smoking pott, pls refrain and show me iD." 🤠no t…
- `ytr_Ugz3pU260…`: That's a fascinating perspective! It's amazing to think about how far technology…
- `ytc_Ugwd9v7hd…`: It probably costs money because the time and skill that it takes to make good AI…
- `ytc_UgwyMNf8g…`: But then, in the same tone, once you increase the gap between the rich and poor,…
- `rdc_j42rb1p`: There was an Oxford study a few years back about jobs that are likely to be repl…
- `ytc_UgzHLQ93n…`: “If anyone builds it, everyone dies” - yudokowsky, the founder of the alignment…
- `ytc_UgzgZJkeb…`: A simple solution, ask law enforcement if this was evaluated for ai? Plus 3 yea…
- `rdc_n0p32id`: I hate ai chatting as much as anyone But op you really need to learn how to tal…
Comment
I work in Ai, I have made apps that has taken jobs in the past. I now make stuff that wont take jobs at all but makes things better for humans. Its not easy creating something new that doesnt step on someones toes.
youtube · AI Governance · 2025-06-16T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
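The four coding dimensions above each draw from a small closed vocabulary. The full codebook isn't shown on this page, but the value sets can be inferred from the coded samples below; under that assumption, a minimal sketch of a validator for a single coding record might look like this (`CODEBOOK` and `validate_coding` are hypothetical names, and the real codebook may define more categories):

```python
# Allowed values per dimension, inferred from the coded samples on this
# page (hypothetical -- the real codebook may include more categories).
CODEBOOK = {
    "responsibility": {"developer", "company", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"virtue", "deontological", "consequentialist",
                  "mixed", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"resignation", "indifference", "fear",
                "approval", "outrage", "mixed"},
}

def validate_coding(coding: dict) -> list[str]:
    """Return a list of problems; an empty list means the coding is well-formed."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        value = coding.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The coding shown in the table above passes cleanly.
example = {"responsibility": "developer", "reasoning": "virtue",
           "policy": "none", "emotion": "resignation"}
print(validate_coding(example))  # → []
```

A check like this is useful because the values come straight from model output, which can drift outside the expected label set.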
Raw LLM Response
```json
[
  {"id":"ytc_UgyXIgo0M5PlQq_WCrV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxBFlTWiQ_Ud_c7UbJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwmx2qCHzj_E3Md6Xd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyVxOcQ6kpbLmsYLgx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugyl1Q7qoquIG9qvXXt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugyr5XmBCtmMURmP9AJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzAO2AE4j54yMHS6A54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzR3vTH4Efkr-I3ydJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgztnJFEsZQ2Tkt-Wq54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxOHhxogJNkvU8oWoJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```