Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
In my opinion, the most important change will be with the implementation of the AI driven agents.
I do not see any serious discussion about limiting of implementation of these agents into critical systems, like power generation & distribution, military use (actually it seems they are already there).
We are running to the point where "pulling the plug" will simply not be possible...
Platform: youtube
Topic: AI Governance
Posted: 2025-12-05T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxHW9HldIQaJiceZ3J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzmvfUG2jj3-AEUsVl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwRGi3LYe-SlDHtWhd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw9MSYrgm5llz2EhSV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgzEAKI4SQa7dRH0fiR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzWqGb9knhQEdUTHoN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwLhTx_535k4OPFMrh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyyxsnKbupliXNwOih4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwPTNxgBVjgx88d97B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwV_NYqpRHgeKMeMFt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
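A minimal sketch of how a batch response like the one above might be parsed and indexed for lookup by comment ID. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the JSON above; the parsing code itself is illustrative, not the tool's actual implementation:

```python
import json

# One record from the batch response above, used here as sample input.
raw_response = """
[
  {"id": "ytc_UgwLhTx_535k4OPFMrh4AaABAg",
   "responsibility": "government", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]
"""

# The model returns a JSON array with one coding object per comment.
codings = json.loads(raw_response)

# Index by comment ID so any coded comment can be inspected directly.
by_id = {row["id"]: row for row in codings}

coding = by_id["ytc_UgwLhTx_535k4OPFMrh4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

Indexing by ID is what makes the "look up by comment ID" view cheap: each dimension of a coded comment is a single dictionary access rather than a scan of the whole batch.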