Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
rdc_jheqtci: "I’ve used it for little code stuff like you. I’ll give it a script to debug. It …"
ytc_UgzsJrrZb…: "A hypothetical: What happens if studies will show that AI differentiate civilian…"
ytc_UgyKlavrZ…: "Most people lean to ai (mostly teens with ocs who can't draw) so they can create…"
ytc_UgwutohRR…: (translated from Hindi) "Hey idiot, that was a different technology; this one is different. He's comparing just anything. Ai i…"
ytc_UgwC33HQm…: "Number one, it ought to be mandated so that AI content must be clearly identifia…"
ytc_UgxsaxQkQ…: "AI is already directly linked to every person who interacts with it. These netwo…"
ytc_Ugw8W5Mrm…: "AI is not a problem. People being negligent about their data being collected/sto…"
ytr_Ugw0-Z-qK…: "No issues, no one will be there to use it. Ai doesn't need it. Humans are useles…"
Comment
Let's give two robots guns, one controlled by humans the other by AI
Which one is going to shoot someone first?
Our obsession with getting AI to behave is sarcastic
youtube
AI Moral Status
2023-08-20T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
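The coded dimensions can be checked against a closed set of categories before being stored. A minimal validation sketch follows; the category sets below are only those values that appear on this page (e.g. `developer`, `consequentialist`, `regulate`, `fear`), so the real codebook may contain more, and `validate_coding` is a hypothetical helper, not part of the tool.

```python
# Assumed category sets per coding dimension, inferred from the results
# shown on this page; the real codebook may include additional values.
SCHEMA = {
    "responsibility": {"developer", "government", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "mixed", "indifference", "resignation"},
}

def validate_coding(coding: dict) -> list:
    """Return (dimension, value) pairs that fall outside the assumed schema."""
    return [
        (dim, coding.get(dim))
        for dim, allowed in SCHEMA.items()
        if coding.get(dim) not in allowed
    ]

# The coding result from the table above passes validation.
example = {
    "responsibility": "developer",
    "reasoning": "consequentialist",
    "policy": "regulate",
    "emotion": "fear",
}
print(validate_coding(example))  # → []
```

A check like this catches the common failure mode of LLM coders: a plausible but off-schema label (e.g. `"anger"` instead of `"outrage"`) that would otherwise silently fragment the category counts.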
Raw LLM Response
[
{"id":"ytc_Ugy4utEftaASPCMyopJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyEuWdCV6IpPLdM62V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzDhuCwaGEVw1HQgxh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwnK35OlHQfyvZALEJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzPuhn74jWvLASHyqp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugy0X7Wgl_23mdJber14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxyYNEBP7kcyXGvO8B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwQ6AmEK_HS6WtfegV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_Ugw9Acwu6LYm6OeNdxV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugyr3Rbzj09U3svvRLZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
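Since the raw response is a JSON array keyed by comment ID, the "look up by comment ID" step can be sketched as parsing the array and indexing it by `id`. This is a minimal illustration, not the tool's actual implementation; `index_by_id` is a hypothetical helper, and the payload below is excerpted from the response above.

```python
import json

# Raw LLM response: a JSON array of coding objects, one per comment ID
# (two rows excerpted from the response shown above).
raw_response = '''
[
  {"id": "ytc_UgzDhuCwaGEVw1HQgxh4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy4utEftaASPCMyopJ4AaABAg",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "none", "emotion": "approval"}
]
'''

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM coding response and index its rows by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codings = index_by_id(raw_response)
print(codings["ytc_UgzDhuCwaGEVw1HQgxh4AaABAg"]["policy"])  # → regulate
```

If the model ever returns malformed JSON, `json.loads` raises `json.JSONDecodeError`, which is a natural place to flag the batch for re-coding rather than indexing a partial result.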