Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I mean if we get a super intelligence that makes people obsolete, and the people…" — ytc_Ugx-IY1h8…
- "Bolt-on AI sucks every time they try it every few years. (Real AI used correctly…" — ytc_UgwGH3to9…
- "@killer4stardust Yet the question is indeed about AI in general. I quote: '…" — ytr_UgyA4zMqo…
- "As a leftist, I have learned to embrace automation as an inevitability, but also…" — rdc_j3xm4mn
- "The risk of extinction comes from the way these corporations and individuals use…" — ytc_Ugz6p7IQx…
- "Robots robotizing society, who would have seen it coming? The answer: all the ca…" — ytc_Ugy4FxkZP…
- "Ever think that it's in big tech's interest to play up the capabilities of AI so…" — ytc_UgyRECBTU…
- "He's actually making a great case to invest in ai even though I know that's not …" — ytc_UgyhfDDh4…
Comment
Hinton is coming at it all from a purely materialistic and naturalistic point of view. A robot cannot hate or love, as these concepts are meaningless to it, so this consciousness that is created is not real, it's just an imitation of life.
youtube · AI Moral Status · 2025-06-07T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzD9nhLxlrHoGCU8Zx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwCJGzZl3JrCeLXDKt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwgVwgesAM005ZG3iZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy55H6aaTel_tXuPpV4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy9EfWtH8M2jq2pzld4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzUSN3Fr37QUSFm8Zp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz1aDqDmASLrAvsf6R4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxJ9V2OBtQEbauWukZ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxTYqN6AmQVv5wEFbR4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzxHz8FP2FuALKqOZd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"}
]
```
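Before a batch like the one above is ingested, each record can be checked against the coding scheme. A minimal sketch in Python, assuming the allowed values per dimension are exactly those seen in the table and samples on this page (the full codebook may define additional categories, and the `validate_coded_batch` helper is hypothetical, not part of the tool):

```python
import json

# Allowed values per dimension, inferred from the samples above;
# the real codebook may include more categories (an assumption).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference",
                "approval", "mixed", "unclear"},
}

def validate_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Drop anything that is not a dict keyed by a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Every dimension must be present with an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Records that fail the check (a missing `id`, or an out-of-vocabulary value such as a misspelled emotion) are silently dropped here; a production version might instead log them for re-coding.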