Raw LLM Responses
Inspect the exact model output for any coded comment, or look a response up directly by comment ID.
Random samples
- "Today I was watched one video from Tectone in what he reacted to video abt Ether…" (ytc_UgyZCXmwD…)
- "If I need to choose top 3 AI coins, my picks are FET, INJ and NEX (Nexulon), not…" (ytc_UgyRrIXC7…)
- "He didn't, he just confided in chatgpt once and it proceeded to then use that on…" (ytr_UgxYGCKtS…)
- "They are trying to save you from yourself. The terminator wont take orders from …" (ytc_UgzoEmwn6…)
- "Thanks for the love! ❤️ We're glad you're enjoying the interaction with Sophia. …" (ytr_UgydqQOE-…)
- "The other day I encountered someone who strongly believes ChatGPT is sentient be…" (rdc_jg7ggdd)
- "Someone should invent AI before we need to worry abt it. Things what they call A…" (ytc_UgwKAkYBY…)
- "ai well never replace lawyers NEVER lawyers needs a mindful human brain and emot…" (ytr_UgyVmrWL-…)
Comment
I remember participating in a UN model event back in school (a bunch of 9th graders from different schools play as representatives for two days to create a resolution or some other document together).
The topic was regulation of autonomous weapons. I represented Iran.
This was long before ChatGPT and the war in Ukraine.
Not even a decade ago I participated in that event, played terribly and wanted to forget that embarrassment, but now I see that there's not even a legal definition for autonomous weapons (I doubt it'd matter if there was one).
I guess that only means the real UN is more useless than that fictional one
Source: youtube · Posted: 2024-08-15T12:5… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyiEc_Z5yeFiHnBPm14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwEFMvNiNA_yYWlER94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyeagHhyDOb6sVMCuh4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw9bwQD-UYVpnKaCbx4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "liability", "emotion": "unclear"},
  {"id": "ytc_UgwWqWWG2sWLSPgMXo54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy7IGoqkxjR85vbs854AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw_J9_imzqVmizV8cF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxaiqbzSqF9Fu8DodF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyLmtHs7-HELunC66p4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugwinm8NJ_D7fFyd2Yt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
```
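A minimal sketch of how a raw response like the one above can be parsed and indexed for the lookup-by-ID view. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the output shown; the `index_by_id` helper and the required-key check are illustrative assumptions, not the tool's actual implementation, and the two records in `raw` are a truncated excerpt of the full response.

```python
import json

# Truncated excerpt of the raw LLM response shown above (two of ten records).
raw = '''[
  {"id": "ytc_UgyiEc_Z5yeFiHnBPm14AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy7IGoqkxjR85vbs854AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# The four coding dimensions plus the comment ID, as seen in the sample output.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw_json: str) -> dict:
    """Parse a raw response and index each coding record by its comment ID,
    rejecting records that are missing any expected field."""
    index = {}
    for rec in json.loads(raw_json):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing {missing}")
        index[rec["id"]] = rec
    return index

codings = index_by_id(raw)
print(codings["ytc_Ugy7IGoqkxjR85vbs854AaABAg"]["policy"])  # → regulate
```

With the response indexed this way, rendering the "Coding Result" table for a given comment is a dictionary lookup rather than a scan of the raw array.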