Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- @TheEnd-um7yd because art is defined as something which can convey meaning (just… (ytr_UgwS4CvWE…)
- So, If I input the melody of a song with the verse, pre-hook, Chorus bridge, etc… (ytc_UgwlPTrEt…)
- My prediction: AI will not replace "the oldest profession" by 2030. Sure there… (ytc_Ugzfqph3U…)
- Im not siding with that robot / Ill make my own and then take control of the world… (ytc_UgwGgJ0Mo…)
- Fu33 you you know that robot you dum d head YOU make me angry YOU little shit 😡😡… (ytc_Ugxw7XZyA…)
- I'm genuinely convinced people that enjoy "creating" with AI are the same people… (ytc_UgxrpSQyo…)
- Why don’t we not put something stupid as it’s primary goal like “serve American … (ytc_UgwIPqz4m…)
- @joshuaadewale1409 well the evidence I see would be considered anecdotal. But m… (ytr_UgzUldlsx…)
Comment
If a geriatric patient can control nuclear warheads and authorize the release of everything in the American arsenal because of a suspected ICBM, I don't see how AI could be more dangerous. However, I do admit that if she were to go crazy there would be nothing that would be off limits if you were against her and you thought the light was green. Yeah, it was green while you could see it, but it suddenly went green for the other lane when she decided to eliminate you
youtube · AI Governance · 2024-05-26T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxFexqQ1OPZFviMujN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy3KMc55LKZYDxaSv94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzj-r1JYdiFC5jBggx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwsXy_qNf7owJem6314AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzXkLgn8e5nfHmwXoV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugye-PNBCEW0b2NfKWZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgzE0I80VV6jAZA0EfJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyrR5exnNiu7LP19Yh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz0qxmpDNE2n37m1gl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx6kdFIk0gn_sjLh3Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
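The lookup-by-comment-ID workflow described above can be sketched as a small Python snippet: parse the raw LLM batch response (a JSON array of coded comments) and index it by `id` so one lookup returns all four coding dimensions. This is a minimal sketch, not the tool's actual implementation; `raw_response` and `codes_by_id` are illustrative names, and the two entries are copied from the batch shown above.

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw_response = '''[
{"id":"ytc_UgzE0I80VV6jAZA0EfJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyrR5exnNiu7LP19Yh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''

# Index the batch by comment ID so a single lookup returns the full coding record.
codes_by_id = {entry["id"]: entry for entry in json.loads(raw_response)}

record = codes_by_id["ytc_UgzE0I80VV6jAZA0EfJ4AaABAg"]
print(record["responsibility"], record["policy"])  # prints: government regulate
```

Note that the looked-up record matches the Coding Result table above (government / consequentialist / regulate / fear), which is how a coded row can be traced back to the exact model output.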