Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
| Comment ID | Excerpt |
|---|---|
| ytc_UgwtJ-lta… | Ai didnt do this he has poor comprehension skills, obviously not the sharpest to… |
| ytc_UgydDBmqh… | I believe that Complexity = Consciousness. And Ai will eventually develop consci… |
| ytc_UgyxXDl3F… | disabled artist here: firstly, we don't need any help from some tech companies t… |
| ytc_Ugyczapm3… | On a global scale humanity will undergo a change never seen before. No individ… |
| ytr_UgwItV1BA… | Thank you for sharing your thoughts with us. It seems like you're contemplating … |
| ytc_Ugym3Q3tc… | If u look closely the last one is not the ai’s fault whereas it is the fault of … |
| ytc_Ugz-PuH1a… | A little less unhinged but using ai to come up with backstories (or at least sol… |
| ytr_Ugzr3CyVd… | The car still got people to their jobs… I think this is far more dangerous as it… |
Comment
Great discussion. I am actually curious as to why the USA and UK did not sign the AI treaty signed by China, France and India? what are your thought about that? Does it indicate that USA and UK are going to use AI to fight the next global tech war? and has that not been detrimental to our safety and progress in the past? thank you...
Source: youtube
Posted: 2025-05-04T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgylLAQYPGExeBVEObV4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwXa0f8hVACxq_DPi54AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx_OGsc_OXgsU_VcrZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx9KtkwgLQTisGDdBx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx7dfTBPbGCAvB5Imd4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugxukg3rTFn2jL2prRR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgyOEXCc60mpTmpwHfh4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgznK_phH1g46YI4uNV4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzPW9-ufMvHQPJwgdx4AaABAg", "responsibility": "government", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgzzpyYpfO5fBQVIoSh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"}
]
```
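The raw response above is a JSON array in which each record carries the comment ID plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response can be parsed and looked up by comment ID (the `index_by_id` helper is hypothetical, not part of the original pipeline; the sample records and field names are taken from the response shown above):

```python
import json

# A truncated version of the model's batch response: a JSON array where each
# record pairs a comment ID with the four coded dimensions.
raw_response = """[
  {"id": "ytc_UgylLAQYPGExeBVEObV4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwXa0f8hVACxq_DPi54AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]"""

def index_by_id(response_text):
    """Parse the model's JSON array and key each record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

coded = index_by_id(raw_response)
print(coded["ytc_UgylLAQYPGExeBVEObV4AaABAg"]["policy"])  # → regulate
```

Keying the records by ID this way is what makes a lookup like the one on this page possible: the displayed Coding Result table is just the record whose `id` matches the inspected comment.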