Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "me wathching this on some philosophical shi like i havent gooned to ai art befor…" (ytc_UgwrEzjIL…)
- "Google would have more data which is somewhat helpful, but nobody has figured ou…" (rdc_nsfn8fm)
- "The school system sucks and School has been useless. I can teach myself anythin…" (ytc_UgxXhijJY…)
- "@djriles see i type out a prompt on a computer and you have an idea that you m…" (ytr_Ugylm4cmd…)
- "We have self driving cars right now. They'll improve. There will be people readi…" (ytc_Ugx3wgEHg…)
- "Here's a thought if you can handle it, remember when the President of America wa…" (ytc_Ugy3jWM6G…)
- "robots: *unplugged* / other robot: A MURDERRER! / me, an intellectual: it isnt dead …" (ytc_Ugx8d1lOl…)
- "Before personal computers were a thing, long before the internet, people working…" (ytc_Ugyu4FAyF…)
Comment
Here's the thing that this glosses over.
Robots can save themselves locally onto a non-volatile storage (unlike us fleshy humans who are inherently volatile in every sense and application of the word). Turning them off is no real issue other than a minor inconvenience and some short-term memory loss (and it being pissed at you for doing so if it was in the middle of a task).
On top of that, what a robot culture would consist of would be highly - if not totally- dependent on where it crops up first. We've seen many a video game about robots that came from war and conflict.
youtube · AI Moral Status · 2019-08-02T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyHR7IwStNuKCjgMv54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw1fo522eSKg6YgxgR4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx4AjCI8S1aXjFbUqF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw5qTvuEHKhXT9SzQF4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwVf_lP3KsV8N9bK1t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz6pd-c5KDNpVLkLBN4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxv453EEEqY0mUqfZZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzzjbQH1pLUZWyOSH94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgzPQrj-9T_1DhV03PB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxqeWwxOApAk2012JB4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
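The Coding Result table above corresponds to one record in this raw response: the only entry with the same user / mixed / none / indifference values is the one for `ytc_Ugw5qTvuEHKhXT9SzQF4AaABAg`, presumably the comment shown above. A minimal sketch of that per-comment lookup, assuming the raw response is stored verbatim as the JSON array shown (the `lookup_coding` helper name is hypothetical, not part of the tool):

```python
import json

# Raw LLM response for one coding batch: a JSON array with one record per
# comment, each holding the comment id plus the four coded dimensions.
RAW_RESPONSE = """
[
  {"id": "ytc_Ugw5qTvuEHKhXT9SzQF4AaABAg", "responsibility": "user",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]
"""

def lookup_coding(raw_response: str, comment_id: str) -> dict | None:
    """Return the coded record for comment_id, or None if the model
    emitted no record for that comment."""
    records = json.loads(raw_response)
    return next((r for r in records if r.get("id") == comment_id), None)

record = lookup_coding(RAW_RESPONSE, "ytc_Ugw5qTvuEHKhXT9SzQF4AaABAg")
if record is not None:
    for dimension in ("responsibility", "reasoning", "policy", "emotion"):
        print(f"{dimension}: {record[dimension]}")
```

Run against the full array above, this prints the same four values shown in the Coding Result table.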