Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgxwarS-r…`: "Is it just me or does anyone else think the CEOs like this guy that want governm…"
- `ytc_UgzDWp1vY…`: "I get that the death of Kitkat was unfortunate. To demand justice for a cat that…"
- `ytc_UgwQZ0fw-…`: "We had I- robot. We had Terminator films. We had enough warning. We now have in …"
- `ytc_Ugy1MpAG4…`: "investment in AI is going to make the US economy the 'HOTTEST' in the world!" …
- `ytc_UgxnK8KMF…`: "Not sure if that evidence is adequate to generalize to all AI and to all uses th…"
- `ytc_UgyM-ASj8…`: "The scary thing is when they make ai they program them to survive. They have to …"
- `ytc_UgwOQEHA_…`: "Maybe we can defeat AI by feeding it tons of Buddhist literature with the goal o…"
- `ytc_UgyFTRqnK…`: "Haaahaa he says 100 years... allwhile im argueing with nova... A gpt that named …"
Comment
All I'm hearing is, "The sky is falling! The sky is falling!"
One nuke can instantly glass an entire city.
Humans can start a nuclear war, without any assistance from AI.
What can AI do that's worse than starting a nuclear war and obliterating civilization?
C'mon.
Somebody help me out here.
youtube · AI Governance · 2024-05-28T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyKvg8ICGbrlCaG6gF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugyo8w8DfrDI-mDBfql4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzH8Hu96QhBV913WFx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzPoqABYr5P18ARm0N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwoB1v2FDt5k5dyPGJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgycwN-SuLfA16_yWE54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyJ6WjxeZv_-vu2OEx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx_QFFNY8G9hA06lqR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxfLMpPiqfbwn6_0u94AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzXLUS41pYviISEUlp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
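A raw response like the one above is a JSON array of per-comment records, one object per coded comment. A minimal sketch of parsing and validating such a response is shown below; the allowed value sets are inferred only from the records visible here (an assumption — the full codebook may define additional codes), and `parse_coding_response` is a hypothetical helper name, not part of any documented pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above (assumption: the real codebook may permit more values).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "government",
                       "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference",
                "approval", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment id.

    Raises ValueError on a missing id or an out-of-codebook value, so a
    malformed model output fails loudly instead of being stored.
    """
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        comment_id = rec.get("id")
        if not comment_id:
            raise ValueError(f"record missing comment id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: bad {dim} value {value!r}")
        by_id[comment_id] = rec
    return by_id
```

Indexing by id supports the "look up by comment ID" view: the coding-result table for a single comment is just the record at that key.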