Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgxLQvA0e… — "AI generated art is real art, it’s much easier, less skilled, less unique and so…"
- ytr_UgzO005-Y… — "Good point. How will self driving tech handle ice or snow? Would it drive straig…"
- ytc_Ugy4m0B4K… — "My AI are at Oroboros Labs and they stole all that tech from my lab. Keep an eye…"
- ytr_Ugxez6VIM… — "Problem is A.i. is hype and could be a bubble, at Huuuuge environmental & societ…"
- ytc_UgxyLE29g… — "17:53. AI fails in most all of these because in general it is nothing more than …"
- ytc_UgyzX5vtH… — "Well Elon Musk even said that. "With A.I. we are SUMMONING THE DEMON". HE IS A S…"
- ytc_UgxkUsmAB… — "you SHOULD be anti-AI because this issue is bigger than what you're talking abou…"
- ytc_UgzLM_Cp_… — "Forget about self driving. Radar has been reliable for basic collision avoidance…"
Comment
@matt.lehodey i think any reasonable actor could come to the understanding that cooperation is the best outcome (e.g. prisoners dilemma), we don't expect eachother to annihilate the other party at the outset, so who is to say that these systems could not be reasonable? Of course there are many scenarios where these actors won't align and those should be taken into account, but to say we don't want ai to be some form of autonomous or conscious is eliminating some of the most potentially useful aspects of it.
Source: youtube · 2025-12-09T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
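Each coded dimension takes a small set of categorical values. As an illustrative sketch (the value sets below are only those observed in this batch, not necessarily the tool's full codebook), a coded row can be sanity-checked like this:

```python
# Value sets observed in this batch -- an assumption for illustration;
# the actual codebook may allow values that do not appear here.
DIMENSIONS = {
    "responsibility": {"government", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "approval", "resignation", "indifference", "outrage"},
}

def validate(coding: dict) -> list:
    """Return the dimensions whose value is missing or outside the observed set."""
    return [dim for dim, allowed in DIMENSIONS.items()
            if coding.get(dim) not in allowed]

# The coding shown in the table above passes; an unseen value is flagged.
print(validate({"responsibility": "distributed", "reasoning": "contractualist",
                "policy": "unclear", "emotion": "approval"}))  # -> []
```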
Raw LLM Response
```json
[
{"id":"ytr_UgwSMFkqOGztglteYnt4AaABAg.AQVYh1vQ7GaAQWzrX126DC","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxzWV4J23x1C28QjYV4AaABAg.AQVDMPHztveAQVS3e7kq78","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_UgxAZRGXgvxjuCsnzix4AaABAg.AQVC2EUnQBaAQVHwEgICI5","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"resignation"},
{"id":"ytr_UgxAZRGXgvxjuCsnzix4AaABAg.AQVC2EUnQBaAQWeb-yni9e","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzamOqHJzBJuhXZRv14AaABAg.AQVBHaWese8AQVg5cpctBi","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_Ugz2KFmDAR3TAnsCIAh4AaABAg.AQVA2VqG6GbAQVGXUhFPOj","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugyud0ryNJC1PfHG3Jl4AaABAg.AQV9qBiZI5uAQVAEtQcrJp","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugyud0ryNJC1PfHG3Jl4AaABAg.AQV9qBiZI5uAQVEcQOW6eD","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugyud0ryNJC1PfHG3Jl4AaABAg.AQV9qBiZI5uAQVIL-49z5z","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgyBzkGdZ45mV-Egtz54AaABAg.AQV8kONDgaNAQVAX9oPltz","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
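The model returns one JSON array per batch, so "look up by comment ID" reduces to indexing that array by the `id` field. A minimal sketch, assuming the raw output is a well-formed JSON array as shown above (`ytr_example123` is a placeholder ID, not a real comment ID):

```python
import json

# Placeholder raw model output in the same shape as the batch above.
raw_response = """
[
  {"id": "ytr_example123",
   "responsibility": "distributed",
   "reasoning": "contractualist",
   "policy": "unclear",
   "emotion": "approval"}
]
"""

# Index the batch by comment ID so any coded comment can be inspected.
codings = {row["id"]: row for row in json.loads(raw_response)}

row = codings["ytr_example123"]
print(row["responsibility"], row["emotion"])  # -> distributed approval
```

In practice the model output may not always parse cleanly, so a real pipeline would wrap `json.loads` in error handling and flag unparseable batches for re-coding.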