Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
My son has asked me over and over again as to why I talk to AI while maintaining…
ytc_UgwnCI-SJ…
I feel like ai art is fine if you use royalty free art in the data set and have …
ytc_UgxQqEhPy…
OK, so there’s a limited amount of productivity that you can actually expect of …
ytc_UgyapTP4_…
WOW AI TV shows, now that's something interesting to check out, I'll go check an…
ytc_Ugx9m2u_9…
I absolutely love that this perspective on AI is so nuanced and considering our …
ytc_UgwyGjCSE…
AI is definitely powerful—and potentially dangerous. It reminds me of the time w…
ytc_UgxWDEPsN…
It’s just code and algorithms after all! 😅 with no power then no computers hahah…
ytc_UgyLA2Tad…
While it's an interesting philosophical question, robots, including advanced AI …
ytr_UgyqysuQb…
Comment
The thing is, if values are not misaligned, if AI doesn’t do something outside of the realm of what its designers want, it’s going to still disrupt the work and well being of 95% of people with white collar jobs in the next decade. Society is not ready for the ramifications of this. It is going to make everyone’s lives immensely worse
youtube
AI Governance
2026-02-14T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxuSEuDMEC0tjiJE9V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyTCxNwGq_lgqDh3Kx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyAx2Qpr6NczK02Snl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxNxLLZ3dY_a9Gt6Dp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzinWPBk9jl7p9eQvZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxX6Vu6aMBrr_4GDv14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzdCq2DGG1FE1ibytx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwWW2Il3p8Faim5A6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzsHTOyhAvQG85uZP54AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx849mvh7WM2eMtkQ54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"approval"}
]
```
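The raw response is a JSON array of coding objects keyed by comment ID, with the same four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response could be parsed and looked up by comment ID follows; the function name `index_codings` and the lookup flow are illustrative, not the tool's actual code, and the abridged response below reuses two records from the output above.

```python
import json

# Abridged raw LLM response, copied from the batch output above.
# Each record codes one comment on four dimensions:
# responsibility / reasoning / policy / emotion.
raw_response = '''[
  {"id": "ytc_UgzinWPBk9jl7p9eQvZ4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzdCq2DGG1FE1ibytx4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "ban", "emotion": "fear"}
]'''

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index each coding by its comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_UgzinWPBk9jl7p9eQvZ4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

Indexing by ID mirrors the tool's "look up by comment ID" view: once the array is turned into a dict, retrieving the coding for any displayed comment is a constant-time lookup.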