Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- He gambling the human future "why not stop here?" he said "it's good for buch ar… (ytc_UgxzCb5f8…)
- That might be because India has said time and time again, it has no interest in … (rdc_luc2v3l)
- Sci-fi here we come. AI, Quantum Computing. Put AI and Quantum Computing togethe… (ytc_UgyYSrBNM…)
- The most disturbing thing about how AI will get out of our control and fast. 10:… (ytc_UgzfhngUP…)
- AI isn’t the future anymore… it’s already here. The real question is who’s using… (ytc_Ugwuae1mb…)
- Every time I see one of you AI stans in the comments, you always prove the thesi… (ytr_Ugy8V9gZT…)
- Its the law that the Head of America's intelligence committees are informed abo… (ytc_UgziO8r2d…)
- Whatever those people said makes no sense to me… There’s an obvious difference w… (ytc_UgzskWuMg…)
Comment
This is a nightmare which is of course of our own making. There will be nowhere to turn for the children in our world. We must implore society to reject and restrain AI beyond a certain intellectual level, but I fear the horse has already bolted. As humans, we learnt to not put our hand in the fire and to protect ourselves from getting burnt - so we must treat AI as the most dangerous of fires in human history.

Source: youtube
Category: AI Governance
Posted: 2025-06-23T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyCp1kHr7A2YqSbZZ14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyL_Gm1CWjmA85CBrl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyvtAyvzQPnp-GmmWJ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwGfWrYn-bGE5Af5MB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy8qR6MSGGcrWdOtF14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxq6DVD3VYn2U3IZhJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgygF6skF75OBd9StaZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxB-No573K9z-qulE54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwd5YykFTXV4NkDZEN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyspePuFS6XRUeTk2V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
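The raw LLM response is a JSON array of per-comment codings, so the "look up by comment ID" operation above reduces to parsing the array and indexing it by `id`. A minimal sketch (the `raw_response` string is shortened to two entries from the batch above; variable names are illustrative, not part of the tool):

```python
import json

# Two entries copied from the raw LLM response batch shown above.
raw_response = """[
  {"id": "ytc_UgyvtAyvzQPnp-GmmWJ4AaABAg", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwGfWrYn-bGE5Af5MB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]"""

# Index the batch by comment ID so one coding can be fetched directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coded = codings["ytc_UgyvtAyvzQPnp-GmmWJ4AaABAg"]
print(coded["policy"], coded["emotion"])  # regulate fear
```

The dictionary comprehension makes the lookup O(1) per comment, which matters when the same batch response is inspected for many comment IDs.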