Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Everyone sees how AI can/will go wrong for humans even if it works as intended for peaceful purposes (most everyone loses useful occupation), yet because bad actors will develop weapons with it we must move forward with it's development. It works out badly for humans either way. Unless something changes it's a lose-lose technology. (Great for creating Cybertron from Earth though.)

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted at | 2025-07-07T20:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgyxtFViUJPI52aKYzN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxAr-_N4H4YGGT41mB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyM5AhSmAAoeF8Rbix4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyJbTvKL4_V04-ocQl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxfjAUXfjf4sOBdPXN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_Ugy232ZY_msh8TC27Ox4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxUnRUg1WHUO6PJvZN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzUwFD6yko2e37b4al4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx-hMF0TER8eXZtwAp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzZ1SlhHqItNXs2dgN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
```
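Looking up a single coding inside a raw batch response like the one above amounts to parsing the JSON array and indexing it by comment ID. A minimal sketch (the variable names and the truncated sample payload are illustrative; the ID shown is taken from the response above):

```python
import json

# Illustrative one-element sample of a raw LLM batch response,
# shaped like the JSON array shown above.
raw_response = """
[
  {"id": "ytc_UgzUwFD6yko2e37b4al4AaABAg",
   "responsibility": "developer",
   "reasoning": "consequentialist",
   "policy": "liability",
   "emotion": "fear"}
]
"""

# Index the batch by comment ID so any single coding can be inspected.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgzUwFD6yko2e37b4al4AaABAg"]
print(coding["policy"])
```

Because each element carries its own `id`, the same dictionary works for random-sample inspection and for direct lookup by comment ID.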