Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- `ytr_UgyYtD8aJ…` — "AI has the potential to be the sanest form of intelligence, but if it is modeled…"
- `ytc_Ugz9UrjHj…` — "really shows how outdated the US culture and values are when they cant handle dr…"
- `ytc_Ugxpwalta…` — "At least for the US. I think deep fakes are going to be very hard to prove in co…"
- `ytc_UgxHYbuHb…` — "fr like don't tell me that Oliver dies, rex dies, robot takes over earth, Mark b…"
- `ytr_Ugyhu89Bd…` — "@hotpufff123Not exactly. You see for an Ai to create art it need to train on ot…"
- `ytr_UgwqFBfZ4…` — "Aa HUGE SCAM!! ALL ITS DOING IS COPYING HUMAN BEHAVIOR INTO TECHNOLOGY ITS BEEN …"
- `rdc_jsmf36x` — "Stop paying for limited and poorer services provided by Open AI after each downg…"
- `ytc_UgzglPjmN…` — "I think people will stand up and rightfully "discriminate" against ai music beca…"
Comment
Its truly ironic, that he and many others built AI and now they explain it can decide it wont want us around, and now after it is all built /Designed and already learning on its own - then he goes on a pod cast to express the likely harm AI will bring - how intelligent is he truly?
Obviously there has been bad human actors since beginning of time, why not originally design the intellect to not work for those "bad actors" - duh..
Also, reasoning while learning, was clearly known from the beginning where is the jail bars or sand box with only human allowance to let it out?
OMG
| Source | Topic | Published |
|---|---|---|
| youtube | AI Governance | 2025-06-16T12:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgybwkUlLjNpqGHwCDN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyK86HVCnfEp2YpsRB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwRFOZQ8KEQpf1QQAV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwW7Yd339WGmxUzp5Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw9KbxUxbs29lCHsw14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxWYEf4uCGRc2uDYGR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwLAGWBiGFois9PJLN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzvVe01CWiXNQ3S-ed4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgynndGtTRlHcJfrBp94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyWl78BZDKiSIboABl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
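A raw batch response like the one above can be parsed and indexed by comment ID, which is how the "look up by comment ID" view would locate a coding. The sketch below is a minimal illustration, not the page's actual backend; the allowed-label sets are assumptions reconstructed from the values visible on this page, and the real codebook may contain more labels.

```python
import json

# Dimension labels observed in the responses on this page (assumed codebook;
# the full label set may be larger).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "company", "developer", "user"},
    "reasoning": {"mixed", "consequentialist", "contractualist", "unclear", "deontological", "virtue"},
    "policy": {"none", "unclear", "regulate", "liability", "industry_self"},
    "emotion": {"approval", "fear", "mixed", "indifference", "outrage"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codings by comment ID.

    Raises ValueError if a record lacks an id, is missing a dimension,
    or uses a label outside the assumed allowed set.
    """
    by_id = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record without id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {rec.get(dim)!r}")
        by_id[cid] = rec
    return by_id

# One record copied from the raw response above.
raw = ('[{"id":"ytc_UgynndGtTRlHcJfrBp94AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
coded = parse_raw_response(raw)
print(coded["ytc_UgynndGtTRlHcJfrBp94AaABAg"]["policy"])  # regulate
```

Indexing by ID also makes it easy to spot comments the model skipped: the difference between the submitted batch's IDs and `coded.keys()` is exactly the set of uncoded comments.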