Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
i think the future debate will be settled in rights given to sapient AI machines, some perhaps customized to help them integrate comfortably with humans (such as locomotion rights even for immobile units), and perhaps even finding a way to define independant sapience. additionally, most industrial positions would probably be done mostly by low level AI, basically the stuff we have today that is smart, but not sapient. while i can imagine us wanting different AI around us, i dont see us going so far as to integrate it into every single appliance. if we wanted that, it would probably be better to have just one AI unit on the premises that would no doubt be sapient, with perhaps a more simple onboard assistant AI so that if one feels abused they can basically just leave.
| Source | Video | Posted | Likes |
|---|---|---|---|
| youtube | AI Moral Status | 2019-03-14T06:2… | ♥ 1 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwY3N-4WtXWXKe0kot4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyuow9cFQvRp_8V8N14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzZOLFdiGrukOiVk1B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw6tuFvGs9zXuY9OD14AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx4-sSdTVTDLR25DjZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxa6TQT8DlWXG16GJZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzO_sh5Lua2X1HyIVZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw2bDfZIEPM_btX7g54AaABAg","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugyx_DacNBYxzTzvC8Z4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxuDXc_869qvS3abJR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
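The raw response above is a JSON array with one object per comment, keyed by the comment's `id`. A minimal sketch of how such a response could be parsed and looked up by comment ID — the function name and the two-row excerpt used here are illustrative, not part of the actual pipeline:

```python
import json

# Excerpt of a raw coding response: a JSON array of per-comment codings.
raw_response = '''
[
  {"id": "ytc_UgwY3N-4WtXWXKe0kot4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw2bDfZIEPM_btX7g54AaABAg", "responsibility": "none",
   "reasoning": "contractualist", "policy": "liability", "emotion": "approval"}
]
'''

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw coding response and index its rows by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_Ugw2bDfZIEPM_btX7g54AaABAg"]
print(coding["reasoning"], coding["policy"], coding["emotion"])
# → contractualist liability approval
```

The second row matches the Coding Result table shown above for the displayed comment (contractualist reasoning, liability policy, approval emotion).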