Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
We can try as much as we want, but natural selection, uh... finds a way.
It's n…
ytc_Ugyvx6c6c…
This should apply to AI in video game development, so that people who give a shi…
ytc_UgyLNThNM…
What's truly amazing is we've created tons of movies that talked about the dange…
ytc_UgwyyQO0l…
um... Meow? *gulps* >///< Hi Bernie Sanders... I hope you're doing well... I'll…
ytc_Ugy1TH65Z…
I gave my chat gpt a personality and got past most explicit content filters. It’…
ytc_Ugyu7f7v4…
why the fuck would we program a robot to feel pain? is that not ultimate sadism?…
ytc_UggZWxHLT…
Does anyone really think that humans will not be worked to death simply because …
ytc_Ugzy7wyrI…
@JanineMKartist Well, enough people did, so now they got a completely 100% legal…
ytr_UgxymjOw4…
Comment

> Critics of AI cars, are as bad as Tesla critics. 30,000 people are killed in car accidents, in the US, caused by human drivers, every year. There are bound to be some unavoidable accidents. The whole autonomy project, should not be derailed by this episode. My thoughts are with the family of the lady involved in the fatal accident...

Source: youtube · 2018-03-20T23:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxEbZkNEmzcctb5IHV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz5dBIVBv9sS55rQt54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyD1b6mLwQFFskEhzd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzpHpTcsqDjgsCglxN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyyf3r8R8WFb0kWMI54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyLHl6QnDH7gUHMYQZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyZTlHTx-N6FH_z0wx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgybvMqVjkXvyeTbtil4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"unclear"},
  {"id":"ytc_UgzU4fmvaUuEqxbTXvV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwtgRIN8eJqhAbd1Gp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
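The raw response is a JSON array with one object per coded comment, each carrying the four coding dimensions shown in the result table. As a minimal sketch of how such a batch could be parsed and validated (assuming the dimension vocabularies are exactly those seen in this sample; the real codebook may define additional values):

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# on this page (assumption: the full codebook may include more categories).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company", "government", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"outrage", "approval", "resignation", "unclear"},
}

def parse_batch(raw: str) -> list:
    """Parse a raw LLM batch response and validate every coded comment."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs here start with "ytc_" (top-level) or "ytr_" (reply).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim!r} value {row.get(dim)!r}")
    return rows

# One entry from the sample response above, used as a smoke test.
raw = ('[{"id":"ytc_UgxEbZkNEmzcctb5IHV4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]')
coded = parse_batch(raw)
print(coded[0]["emotion"])  # outrage
```

A check like this catches the common failure mode of LLM coders: a syntactically valid response that drifts outside the codebook (e.g. an invented label), so bad batches can be flagged for re-coding instead of silently polluting the dataset.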