Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
The eyes didnt have a reflection. I think thats the best way to tell if its AI o…
ytc_UgyJqIIwZ…
No way!!! I hate dealing with that AI bs. We now call it AL too btw. So tell AL …
ytc_Ugx0ZW66T…
The chatbot may scream, but that doesn’t mean it dreams. It’s a performance—an e…
ytc_UgwdlkOpj…
So, "I, Robot" and the Terminator movies are documentaries. Just as Bicentennial…
ytc_UgzdXzznM…
11:37 - This AI guy says... "What we really need is a world government that work…
ytc_UgxupzUju…
What's scary about this is how the man robot has been programmed to be an arroga…
ytc_Ugw-kA2xK…
Ezra comes off as emotional and is anthromorpophising Generalised AI technology…
ytc_UgyZZ9jHk…
As an author (academic) that has tested AI systems about my research, I am happy…
ytc_Ugw9CE3eg…
Comment
In the UK the government has actually already started to build a safe test room environment for the AI in cooperation with it's allies. the situation is being taken seriously.
youtube
AI Governance
2023-07-10T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwoO3s2ZeMzsV_lzKB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy7Ou65IEqa7v90TQt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugyg_rmSj6c996CGnUd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwUEyKWeXlrJLapZ1J4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwDuNyH0jbR0XcR6ZR4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwT_Q8QEe-wqszkZyV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyi4QJY1bEQhGC__Xl4AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugzc8vGgIAji35m-aSZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwmrHwVCXy32LRFEul4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxPPwDVbt9NAvb0f714AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
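A response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal illustration, not the project's actual pipeline: the allow-lists only contain the category values visible in this sample, and the real codebook may define more.

```python
import json

# Batch output from the LLM: a JSON array of coded comments keyed by comment ID.
raw_response = """[
  {"id": "ytc_UgxPPwDVbt9NAvb0f714AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]"""

# Category values observed in the sample response above. These are
# illustrative allow-lists, not the full coding schema.
ALLOWED = {
    "responsibility": {"developer", "government", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "approval", "resignation", "indifference"},
}

def index_by_id(response_text):
    """Parse the batch output and index rows by comment ID,
    dropping any row whose values fall outside the allow-lists."""
    coded = {}
    for row in json.loads(response_text):
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            coded[row["id"]] = row
    return coded

coded = index_by_id(raw_response)
print(coded["ytc_UgxPPwDVbt9NAvb0f714AaABAg"]["policy"])  # regulate
```

Validating against an allow-list before indexing means a malformed or hallucinated row is silently skipped rather than crashing the lookup later.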