Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Hasnt there been a few films like terminator that shows the risks of AI becoming…" (`ytc_UgxV2MsBe…`)
- "My job is so boring AI wouldnt want it or just get bored to death and shut itsel…" (`ytc_Ugy9EqH0e…`)
- "The AI evolution of humanity was inevitable. How fortunate we are that it is hap…" (`ytc_Ugw-8NoEI…`)
- "Question to these corporations who want to pay cheap wages & removing jobs from …" (`ytc_UgyhHkvVi…`)
- "[Kagi Search](https://kagi.com/) is probably the best search engine right now in…" (`rdc_o5n6kou`)
- "With the AI now in our life I think the justice system need to be renewed in gen…" (`ytc_Ugz6XzZ1w…`)
- "host: you want to destroy humans pls say no / Sophia the robot: yes I will destroy…" (`ytc_Ugj0iVimC…`)
- "They think cause they're typing in prompts and then having to write in other pro…" (`ytc_UgzTBKOAn…`)
Comment

> Both Capitalism and Socialism will make AI dangerous. We need a Distributist Revolution. And any autocratic or democratic governance would be obsolete in a Star Trek future of 3D-printers on the level of replicators because everybody could become completely self-sufficient, which would eliminate the need for governmental resource allocation altogether.

youtube · AI Governance · 2024-06-10T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugy7FhpXRCOevbLGoQ54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyRK08ijyxj43Stl8F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwlEI-7nUquT3W7Gl94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwalsiOPM5oQdBZe5F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJoOYSxRmJrtz3UOx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyXntFmnc0JEipIU8N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxU3z6ApY7HlfOJymZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzzi6zgUIlmXZcRwjB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz72opCi2I6pRyvuBl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxJSn3-E_xm8ehT79B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}]
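A raw batch response like the one above can be turned into a per-comment lookup with a short helper. This is a minimal sketch, not the tool's actual loader: the function name `parse_coding_response` and the `"unclear"` fallback (mirroring the default shown in the Coding Result table) are illustrative assumptions.

```python
import json

# The four coded dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response (a JSON array of objects, each with
    an "id" plus the four coded dimensions) into {comment_id: {dim: value}}.
    Missing dimensions fall back to "unclear" (illustrative assumption)."""
    records = json.loads(raw)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

# Usage with a hypothetical one-record response:
raw = ('[{"id":"ytc_abc","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate",'
       '"emotion":"approval"}]')
print(parse_coding_response(raw)["ytc_abc"]["policy"])  # regulate
```

Keying the result by comment ID matches the "Look up by comment ID" workflow above, so a single coded record can be fetched without rescanning the whole batch.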