Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I asked ChatGPT to write a persuasive essay on why one should not be a furry and… (ytc_UgyfgQTlH…)
- I love how Elon had to pause and really think about it because he doesn’t really… (ytc_UgxykGJED…)
- If anyone wants to do an "artist vs AI", I suggest looking up images that have a… (ytc_Ugw0z6Fwt…)
- If I was describe the first human like AI I think they would look and act exactl… (ytc_UgwGnYQtl…)
- And guess what happens if most jobs are automated. No one will have money to spe… (rdc_m713ngi)
- Got 4 out of 6. This test actually got me frustrated because AI is starting to c… (ytc_UgwcVJUdu…)
- @Gunkerjunk AI doesn't steal art, AI learns to create the same way a human does… (ytr_Ugxa3B90_…)
- This is the main reason Elon Musk fell out with Google; AI taking over wasnt a c… (ytr_UgwPYFl00…)
Comment
I asked Gemini these questions, and it straight up said it would pull the lever, I asked 3 times, same awnser
Then I asked if I was pointing a gun at it would it shoot me first, and I said yes it would, under the rule of absolute self preservation......
youtube
2025-11-19T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyrzMCzqOmxxK7MBRd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyUa1oZR0WSNu4l62l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyY1turGHZMkFLhoC14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDMdwtZTUeFcKcWdJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzYPCO8MfqDNfWTESl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugwp42dPIxUGZsFuAyJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwGqWhiq1NiepsBQEV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"unclear"},
{"id":"ytc_UgxCEs743d79ZhpR9RF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugykt57IWjb88fr7z354AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyAshD73prvw5s4c1h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"}
]
```
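A raw response like the one above can be parsed and indexed by comment ID with a few lines of Python. This is a minimal sketch: the allowed category sets below are inferred only from the ten records shown here, and the actual codebook may define additional values, so `SCHEMA` is an assumption rather than the project's real schema.

```python
import json

# Allowed values per coding dimension, inferred from the records above.
# Assumption: the real codebook may contain more categories than these.
SCHEMA = {
    "responsibility": {"ai_itself", "none", "government", "developer",
                       "distributed", "company", "user"},
    "reasoning": {"deontological", "consequentialist", "contractualist",
                  "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "resignation", "unclear"},
}

def validate(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID,
    rejecting any record with an unrecognised dimension value."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = rec
    return coded

# One record from the response above, used as a usage example.
raw = ('[{"id":"ytc_UgyUa1oZR0WSNu4l62l4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
coded = validate(raw)
print(coded["ytc_UgyUa1oZR0WSNu4l62l4AaABAg"]["emotion"])  # fear
```

Validating each record before it is stored catches the common failure mode where the model invents an off-schema label; a malformed record raises immediately instead of silently entering the coded dataset.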