Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Consciousness empowers you to transcend the rules - so the Terminator movies won’t happen cause the AI has no and won’t develop consciousness if the AI is just a super computational machines. But once the Terminator scene happens, it means the AI has ability to transcend rules and human beings is about to be replaced by them. I think the rudimentary rules human gives AI would be no hurting people but if this rule is transcended, Professor Penrose argument fails. So, it’s why more and more people warn to be very careful for AI development.
youtube
AI Moral Status
2025-09-17T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwbpiLGPRZb16SOiiV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxiUBIJQSszJ-ufOWF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwV8QiBlk5oWTHjFId4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyvmeO7VCkLXMmMjdJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwm0Jdn1MCUlyzjYIl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzujoTKwOndKB08rkx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyRvZBw9EwPxNo5y3t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwGAWZzHCcIeH9REM14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzQVp1LDdhO2JqXjLh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwMpWGj_L2dUD_tXLF4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
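A response in this shape can be parsed and indexed by comment ID before the per-dimension values are written to the coding table. The sketch below is a minimal illustration, not the tool's actual ingestion code; the required field names are taken from the response above, and the helper name `index_codings` is hypothetical.

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = '''[
 {"id":"ytc_UgyvmeO7VCkLXMmMjdJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgzQVp1LDdhO2JqXjLh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]'''

# Field names as they appear in the model output shown above.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse the model output and index codings by comment ID,
    skipping any record that is missing a required field."""
    records = json.loads(raw_json)
    return {r["id"]: r for r in records if REQUIRED_FIELDS.issubset(r)}

codings = index_codings(raw)
print(codings["ytc_UgyvmeO7VCkLXMmMjdJ4AaABAg"]["emotion"])  # fear
```

Indexing by ID makes the lookup-by-comment-ID view a single dictionary access, and dropping incomplete records up front keeps partially malformed model output from reaching the coding table.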