Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Elon, thank you for having such a LOVELY compassion for humanity, you are trul… (ytc_UgzxjQ24l…)
- best coment I ever saw with the ai take with the "you have more time for other t… (ytc_Ugw3c-wRn…)
- Your artstyle makes me so happy to look at. Gen AI has never given me that feeli… (ytc_UgxJdJSEa…)
- Tesla Full Self Driving has come a long way. I was driving at the Tesla recogniz… (ytc_Ugxhjb7hY…)
- ask, how much ai is used for journalism. when you're done answering the head cou… (ytc_UgwmgIi8H…)
- The first main deterrent that would stop artificial intelligence from "violently… (ytc_Ugy1nDsZf…)
- I feel like if I reply to this post, then AI will just use it against me. So ha … (ytc_UgyoWbedd…)
- Do driverless trucks pay income taxes? Contribute to pension funds and health ca… (ytc_UgwTr3sR2…)
Comment

> We need I Robot rules for this IN the heart of the intelligence. We have this but some are warped in motivation and others are warped in logic. Either makes us dangerous, needing laws and police. These systems need that, but knowing our warps, the risk us inherent.

youtube · AI Governance · 2025-06-17T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgybCbJ-5-yFLbz0W794AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxeJFPMNnGxHVP9XBh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugynkze7Qgc8wm1qPRt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwnMTzxAAyKM2pJ-Gp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzaTEMKZEy9lZh1YlN4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwicizNCBTwLnGXvA54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgypjwF9Ggy-turA80V4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxy6_6Lp5jEuT0WaNZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugzn8Ve9HEmRkDZj0rN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwpsf9-RX34yDWT2TR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
```
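A raw batch response like the one above can be parsed and checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed values for each dimension are inferred only from the categories visible in this sample (the real codebook may define more), and `validate_codes` is not part of any actual pipeline shown here.

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Hypothetical: the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check each coded comment
    against the schema, raising on malformed IDs or unknown values."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Example: validate a one-element batch, then index it by comment ID
# so a single coded comment can be looked up for inspection.
raw = ('[{"id":"ytc_UgybCbJ-5-yFLbz0W794AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codes = validate_codes(raw)
by_id = {rec["id"]: rec for rec in codes}
```

Indexing by `id` mirrors the "look up by comment ID" view: the stored raw response is the ground truth, and the per-comment table is just one record rendered as rows.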