Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgyF9UfxH…`: “Scoring in the top 1-2%? You mean A.I is scoring in the top 1-2%. Humans are get…”
- `ytc_UgyNysLMj…`: “A few things, I want to start a fight, copyright protects the owner from theft o…”
- `ytc_UgxJuBeYE…`: “White collar and knowledge workers could be quietly fired already today and repl…”
- `ytc_Ugzse-JOe…`: “A highly intelligent AI with sentient thoughts talking about AI regulation. What…”
- `ytc_UgyawgOTD…`: “just one question, who buys this product, what AI worker generates, when all hum…”
- `ytc_UgwTiaQl_…`: “Why do people talk about AI as if it were a tool that invented and created itsel…”
- `ytc_UgyPX83Jn…`: “AI doesn't need downtime, doesn't get tired, can work tirelessly, in this time w…”
- `ytc_UgzS6-Lrp…`: “I heard a story 3 years ago of an A.I. robot killing a Chinese scientist by a fi…”
Comment

> I’m sure they’re trying to give the robot and it’s creator feedback….. from what I see so far, he should turn her off and save her the pain.

youtube · AI Moral Status · 2023-05-28T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgwN71DHOt03PPg9MiF4AaABAg.9qGFBFjK0Wb9qPtzoL9dY6","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyUTyct8r0VOXebv854AaABAg.9qGF1KaE8iU9qHvd3AOlW4","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyUTyct8r0VOXebv854AaABAg.9qGF1KaE8iU9qJEC_bhZUG","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyPZDw5mlR6n0a1FMB4AaABAg.9qGDMTdJUwj9qK9Ffxtu6j","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugx2hSdJH1wPAyONQ694AaABAg.9qG74dolY1R9qNKpuZUydl","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_Ugx2hSdJH1wPAyONQ694AaABAg.9qG74dolY1R9qNhkustd5z","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugzpn0PzrVIuW_Sn3ZJ4AaABAg.9qG68lIvyCw9qIur_iAivG","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugzpn0PzrVIuW_Sn3ZJ4AaABAg.9qG68lIvyCw9qJc2MLtFPG","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugyz08XpuA-YlTo1SKF4AaABAg.9qG32DAeM-n9qG7jezjXHP","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytr_UgxP_eOHii7a0IoVYfh4AaABAg.9qFwk8Ny6sD9rTs1fEJWde","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
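A batch response like the one above maps each comment ID to its coded dimensions. As a minimal sketch (the `RAW_RESPONSE` excerpt and the `index_by_id` helper below are illustrative, not part of the tool), the lookup-by-ID step can be implemented by parsing the JSON array and indexing the records:

```python
import json

# Excerpt of a raw batch response, in the same shape as the output above:
# each record carries an "id" plus the four coded dimensions.
RAW_RESPONSE = """
[
 {"id":"ytr_Ugyz08XpuA-YlTo1SKF4AaABAg.9qG32DAeM-n9qG7jezjXHP",
  "responsibility":"developer","reasoning":"consequentialist",
  "policy":"liability","emotion":"resignation"},
 {"id":"ytr_UgxP_eOHii7a0IoVYfh4AaABAg.9qFwk8Ny6sD9rTs1fEJWde",
  "responsibility":"ai_itself","reasoning":"consequentialist",
  "policy":"unclear","emotion":"fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM batch response and index its records by comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

codes = index_by_id(RAW_RESPONSE)
rec = codes["ytr_Ugyz08XpuA-YlTo1SKF4AaABAg.9qG32DAeM-n9qG7jezjXHP"]
print(rec["responsibility"], rec["emotion"])  # developer resignation
```

The first record is the one summarized in the Coding Result table above (developer / consequentialist / liability / resignation).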