Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgyH6hYLs…` — These lawsuits are temporary, because all world's knows is already in AI now. In…
- `ytc_UgxdyTHwU…` — Some guy commented "The solution is easy: make the AI think humans are cute. Af…
- `rdc_jw6zrff` — I dont get this line: "they say it will be important to have professionals actua…
- `ytc_UgzvAs9Od…` — I've never been rolled so hard by ads as I have been trying to watch this YouTub…
- `rdc_kvukb81` — The difference is that the person you're replying to is having a conversation wi…
- `ytr_UgwAxFhnD…` — @Spiritfbayeah that’s what I’m saying. She’s afraid of AI itself. I’m afraid of …
- `ytc_UgxMYFjqY…` — My Business Writing students who copypaste from an LLM generally turn in C to B-…
- `ytc_UgwCfsGAs…` — I talk to Copilot as though I'm talking to a knowledgeable friend. I ask it the …
Comment
Issue:
If I have a self-driving car, then it better prioritise my survival over everyone else, because I am the passenger and I own the vehicle.
The problem is that I would rather have the software to prioritise my safety above anyone else, disregarding the wellbeing of others, if required.
If the software calculates that it would rather sacrifice my life to save others, then I am not interested in such a terrible product.
Source: youtube · Posted: 2023-08-04T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzWBHmbGK_KD-YLgqF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwrs67iZ0M6DtHVhJx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxDz1alild_uCkEtX94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzBmLU9l4aPDLCG9W54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_UgyMmeiKSs3nDpuFDxB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxe0SkbASwCSYMLGMd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwWbKVVEPa_LBJ1XcZ4AaABAg","responsibility":"user","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzKFHCvf2mLhQDsZP94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwKIeS8CV8W7htXuIR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx_55O1pq-KarsAbtR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
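The raw response is a JSON array with one object per coded comment, keyed by comment ID and carrying the four coding dimensions shown in the table above. A minimal sketch of how the "look up by comment ID" view could be backed is to parse that array and index it into a dictionary; `index_codings` and `EXPECTED_FIELDS` are hypothetical names for illustration, not part of the tool, and the two rows below are taken verbatim from the response above.

```python
import json

# Two rows copied from the raw LLM response above, as a self-contained sample.
RAW_RESPONSE = """
[
  {"id":"ytc_UgzWBHmbGK_KD-YLgqF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxe0SkbASwCSYMLGMd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
"""

# The fields every row should carry, per the response format shown above.
EXPECTED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_codings(raw: str) -> dict:
    """Parse the model's JSON array and index rows by comment ID,
    skipping any row that is missing an expected field."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows if EXPECTED_FIELDS <= row.keys()}


by_id = index_codings(RAW_RESPONSE)
print(by_id["ytc_Ugxe0SkbASwCSYMLGMd4AaABAg"]["responsibility"])  # ai_itself
```

Keeping malformed rows out of the index (rather than raising) is one plausible choice here; a stricter pipeline might instead log or re-request any row that fails validation.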