Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I value the idea over the process. AI art is fundamentally a human idea being ex…" (ytc_UgxwpAuYq…)
- "Given how poorly AI is at predicting what I want to watch on streaming, I would …" (ytc_UgymYQUJw…)
- "I don't even use meticulous because of chatgpt but even then I use it more…" (ytc_UgwgVS8jL…)
- "21:37 ok who's gonna tell em I'll say at least this much bing was designed to b…" (ytc_UgyNzOI5N…)
- "A human teenager can learn to drive in approx 20 hrs. Tesla has spent, so far, …" (ytc_UgyQ5BiG7…)
- "So the current administration is working on a mass surveillance program, and Rob…" (rdc_o7su0bn)
- "My greatest fear is for an authoritarian country like China or Russia to develop…" (ytc_UgzOfsDX8…)
- "“prices” looks like i see another clanker sore to nuke, anyways number one i bet…" (ytr_UgztZ8BYg…)
Comment

> How bout we don't make robots at all, i mean if we are going to give them rights and call them our artificial persons then we are just digging graves for ourselves, first economy will be entirely robot dominated , then we will give them weapons to fight wars for us and kill other humans, then we'll just give them governance over justice system because robots will judge people fairly and are generally regarded as un-corruptible, anyone else hold any issues with the whole concept of general purpose robots?

youtube · AI Moral Status · 2018-12-25T16:3… · ♥ 9
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzT7d2rq2mQd4v3p714AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzi-EVRJiOpMcgMFwh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxmUD2JM8M4bWbh2S14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzfB_N4YdYVljtU5jt4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwk_3H-Y5fawBKw9Mt4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy-bLLsUYZvv0W8GRl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyAV0ewUfGE0aHLJiF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_UgxFmESqag1KgqEhO-V4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzuZtDlqBsWu1QDkGN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxpHKvWsFjUna6H0EV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```
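The raw response is a JSON array with one object per comment in the batch, keyed by `id`. A minimal sketch of the "look up by comment ID" step, assuming only the field names visible in the response above (the two sample entries are copied verbatim from it):

```python
import json

# Two entries copied from the raw LLM response shown above.
raw = """
[
  {"id": "ytc_UgzuZtDlqBsWu1QDkGN4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxFmESqag1KgqEhO-V4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
"""

# Index the batch by comment ID for constant-time lookup.
by_id = {row["id"]: row for row in json.loads(raw)}

# The coding for the selected comment matches the table above.
coding = by_id["ytc_UgzuZtDlqBsWu1QDkGN4AaABAg"]
print(coding["policy"], coding["emotion"])  # ban fear
```

Note the dictionary comprehension silently keeps only the last entry if the model ever repeats an ID; a real pipeline would likely want to detect that case.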