Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Stephen hawking had predicted this before he passed. He said we would either coexist and bond with extremely intelligent AI, or we would have a great conflict, with AI being so smart it can override our planet. There was another advanced planet just like ours around 10 years ago that had so much knowledge that it corrupt itself because everyone killed each other. They sent a message to other planets that was just deciphered not to long ago. That’s what’ll happen to us if we don’t change our crude behaviour. War is horrible.
Source: youtube · AI Moral Status · 2018-04-22T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgwyJYVMCdutHLgTqVZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz7HLtT-Zqk7kzFKK54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyi6aRh997AWNgSDmB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwRhQ1B6M96pjN5Ov94AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwQMRyR0Zo74GnNJtx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw5JhBtq6ouh1sz-vZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwm95gI3C-1NP0yCTJ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgweJtmEaGnyHnMc1R14AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy-7NBg6N4SPvHN_AV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx_Lk2HfXU8I2qp3m54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]
```
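The raw model output is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of how such a response can be parsed and indexed for lookup by comment ID (the `lookup` helper and the two-row sample are illustrative, not part of the tool itself):

```python
import json

# Raw model output: a JSON array of coding objects, as shown above
# (truncated here to two entries for illustration).
raw = """[
  {"id": "ytc_UgwyJYVMCdutHLgTqVZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugz7HLtT-Zqk7kzFKK54AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id):
    """Return the coding dict for a comment ID, or None if it was not coded."""
    return codings.get(comment_id)

print(lookup("ytc_UgwyJYVMCdutHLgTqVZ4AaABAg")["emotion"])  # fear
print(lookup("ytc_missing"))                                 # None
```

In practice the model's reply may include extra text around the array, so a production parser would typically extract the bracketed span before calling `json.loads`.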