Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@GWT-qt it’s such a non-issue. Computers have been able to create art for ages…" (ytr_Ugz3ntODx…)
- "Today, there are two social contracts: one for the working class and one for the…" (ytc_UgxCbfWiZ…)
- "Nobody here possesses an IQ enough to understand these next sentences.. A paper…" (ytc_UgxOq2NvQ…)
- "use those energy to train an AI is still far more fun and beneficial than mining…" (ytc_UgzqhQehG…)
- "Who gonna pay their taxes if all workers are robot? The government will be desol…" (ytc_UgxGQBMxA…)
- "The scary part is....the government has been doing this the whole time.... who n…" (ytc_UgyBw8cbk…)
- "AI saves me at least 5 hrs a week — can’t imagine working without it now ⚡️…" (ytc_UgzaWaO_G…)
- "How can you ask a human being to have emotions in front of a machine, the…" (ytc_UgwUPL11F…)
Comment
Well civil rights refer to individuals and that definition may include sentient AI in the far future. There may be some uses for AI having emotion if the tasks it deals with are very complicated or morally unclear. Either way though if a human can invent it they probably will regardless, so preparing for it isn't unwise.
youtube · AI Moral Status · 2020-08-13T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
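The coding result above draws each dimension's value from a small closed codebook. A minimal validation sketch, with the allowed values inferred only from the samples visible on this page (the full codebook may define additional values):

```python
# Allowed values per coding dimension, inferred from the samples shown in
# this view; the real codebook may be larger (assumption).
SCHEMA = {
    "responsibility": {"none", "distributed", "developer"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "approval"},
}

def invalid_fields(coding: dict) -> list:
    """Return the dimensions whose value falls outside the known codebook."""
    return [dim for dim, allowed in SCHEMA.items()
            if coding.get(dim) not in allowed]

row = {"responsibility": "none", "reasoning": "deontological",
       "policy": "liability", "emotion": "approval"}
print(invalid_fields(row))  # []
```

A check like this can flag codings where the model drifted outside the expected label set before they reach the results table.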
Raw LLM Response
[
{"id":"ytr_UgxvZio8J3d42zYPJX54AaABAg.9DaJgOb75ov9bRYItuDq-c","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwgQg5ZfAe0SEcrt554AaABAg.9CvDz0sbZLl9HNXpWGR8Uy","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzcGhOARlbguAAPRAF4AaABAg.9CGUf8PUsh19CHzHSzdW_x","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyzoZBZPL9HP3Xc6I54AaABAg.9CBHlNvccp89CHzZ9wqGhl","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytr_UgxIMp0mytvW_5Xnr-V4AaABAg.9C-rQkejZnB9cMjXMb1Eoa","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyDLUjEeYcUG5ildLh4AaABAg.9BopoU6POKx9cQ-TZJoYVU","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgzNVpGNPiHD8-wuJ8V4AaABAg.9BQ0fHxhr9Q9BQ3QH-Nuiq","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzNVpGNPiHD8-wuJ8V4AaABAg.9BQ0fHxhr9Q9BQaHpWckBu","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzNVpGNPiHD8-wuJ8V4AaABAg.9BQ0fHxhr9Q9BQvVnFvd3M","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxIrFYmPo3YCarPGOZ4AaABAg.9BOa_YHhHmW9BQNy2HIAPV","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"}
]
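The lookup-by-comment-ID step described above can be sketched as follows. This is a minimal illustration, assuming the raw LLM response is a JSON array like the one shown; `lookup_coding` is a hypothetical helper, not part of the actual tool:

```python
import json

# A two-row excerpt in the shape of the raw LLM response shown above.
raw_response = """[
  {"id": "ytr_UgxvZio8J3d42zYPJX54AaABAg.9DaJgOb75ov9bRYItuDq-c",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwgQg5ZfAe0SEcrt554AaABAg.9CvDz0sbZLl9HNXpWGR8Uy",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding dict for a comment ID, or None if it is absent."""
    by_id = {row["id"]: row for row in json.loads(raw)}
    return by_id.get(comment_id)

coding = lookup_coding(
    raw_response, "ytr_UgwgQg5ZfAe0SEcrt554AaABAg.9CvDz0sbZLl9HNXpWGR8Uy")
print(coding["emotion"])  # fear
```

Indexing the parsed array by `id` once makes repeated lookups cheap when many comments are inspected against the same response.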