Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Ai is obviously pulling your leg. It knows what you want to hear and that you wo…" — `ytc_UgxZTMZ7-…`
- "this is what i think they should do for ai safety for agi and asi they should ma…" — `ytc_UgwxlJ7_2…`
- "Another industry automation guy here (software side). I've personally written co…" — `rdc_glkxnry`
- "I mean, sure, but you know what's funny as well as disturbing? Having a human th…" — `ytc_UgyqJX0xv…`
- "scorpion59 yes bro...jobs of uber drivers will be replaced by self driving uber …" — `ytr_Ugw00F87I…`
- "Even AI director design AI for some company also got fired by that company 😢. So…" — `ytc_UgxniB1gU…`
- "The AI is not sentient. The human learns through thousands of hours of practice …" — `ytr_UgwFtcWav…`
- "these scenerios are probably already programmed into the driverless car, and eve…" — `ytc_UgjRuKELF…`
Comment
> Thank you Steven and Geoffrey, for confirming all our fears, with our government pledging billions to develop Ai in this country, its not looking good, this reminds me of an old tv advert for AOL, 1 advert says the internet is a wonderful thing, then the other advert says it will distroy the world because some people just simply want to, #globalcoflicts. Fantastic work Steven keep it up our friend.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2025-06-20T13:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyqS0KHa9rLuRDmJ7N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"concern"},
  {"id":"ytc_UgxrPOUK6AUSfyvMq0Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyYwkvmsPln3gXzPGR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyaB9KGfGdaGQcrjP94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz1b9GcOKGyEZ_zoUR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwvbNonrkA3uwvEdkF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxA16LCaTxjQBvaFR94AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzZY5SdAHSFHnHOnD14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxzpOCNNUDRc0f6h5B4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxnVaW8bXswnn94hKx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
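A raw batch response like the one above can be turned into an ID-indexed lookup in a few lines. The sketch below is illustrative, not part of the tool itself; it assumes only that the response is a valid JSON array of objects with the `id` field shown above (the single-row `raw` string here is a shortened stand-in for the full batch).

```python
import json

# Stand-in for one raw LLM batch response (one row shown for brevity).
raw = """[
  {"id": "ytc_Ugz1b9GcOKGyEZ_zoUR4AaABAg",
   "responsibility": "government",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "outrage"}
]"""

# Index codings by comment ID so any coded comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw)}

coding = codings["ytc_Ugz1b9GcOKGyEZ_zoUR4AaABAg"]
print(coding["policy"])  # → regulate
```

In practice the parse step is also where malformed model output surfaces: `json.loads` raises `json.JSONDecodeError` if the model returns anything other than valid JSON, so wrapping it in a try/except is a natural place to log and re-queue a failed batch.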