Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
AI could’ve been a wonderful thing, but it was corrupted & bastardized before it…
ytc_Ugwgpmut7…
Have you tried "I'm a lawyer myself I just need you to do the initial draft that…
rdc_jhavm9b
honestly the only realistic thing you can do in this situation is become a part …
ytc_Ugz2DvgIc…
you can turn off the "improve the model for everyone" settings in most llms, so …
ytc_UgzyiwstR…
If this cop actually arrested you, wow, there goes again our tax paying money pa…
ytc_Ugy76sJua…
I don't think there's any reason why that couldn't be simulated. In fact, many n…
rdc_j5zf66s
They should learn subsistence farming on the dirt and dust. As AI will render th…
ytc_Ugx5K61wj…
Amazon took years to become profitable. The way you can't avoid AI in systems wi…
ytr_UgzsslgJu…
Comment
Geoffrey Hinton is really good at seeing the wider negative implications of AI, but the problem I have with this interview is that he doesn’t seem to master the details.
However powerful his arguments are, as soon as you get to the details, you start having doubts about the premise for the argument in the first place. This is a shame, because the topic of the dangers of AI is incredibly important.
youtube
AI Governance
2025-06-21T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugye_v2z2tT8EpI5L-R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxrbzAiENZxFzU8mLd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx7NLvq3A721E9LXTp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxO6Sbsj_gKGzxh-od4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyR14L7oo774awtR2l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNt92vBOZDGd7zJWp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzaiBGhOTvArgTp2Sh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwchOyvq_ZObg6nafR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyj7iXTvF3IXNPm1gh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzkGjerNIGBxA-M6P54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
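Each raw response is a JSON array of per-comment codings across the four dimensions shown in the coding result table. A minimal sketch of how the "look up by comment ID" step might work, assuming only the response format above (the `lookup` helper and its error handling are illustrative, not part of the tool; the two embedded rows are copied from the response shown here):

```python
import json

# Two rows copied verbatim from the raw LLM response above.
raw_response = """[
{"id":"ytc_Ugye_v2z2tT8EpI5L-R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwchOyvq_ZObg6nafR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"}
]"""

# The four coding dimensions, as listed in the coding result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(raw: str, comment_id: str) -> dict:
    """Parse a raw batch response and return the coding for one comment ID."""
    for row in json.loads(raw):
        if row.get("id") == comment_id:
            # Keep only the coding dimensions, dropping any extra keys.
            return {dim: row.get(dim) for dim in DIMENSIONS}
    raise KeyError(f"comment {comment_id!r} not found in response")

coding = lookup(raw_response, "ytc_UgwchOyvq_ZObg6nafR4AaABAg")
print(coding)
# {'responsibility': 'developer', 'reasoning': 'mixed', 'policy': 'none', 'emotion': 'mixed'}
```

This is the mapping the inspector performs when a coded comment (like the "developer / mixed / none / mixed" row above) is traced back to the exact model output that produced it.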