Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below.
Random samples
- 😭 fr. Like I was using character ai on a mcr danger days lore roleplay and all I… (ytc_UgwPcyA2e…)
- Actually..you know what...we artists should make different versions of this AI "… (ytc_Ugyz2td9H…)
- Smart guy that has lost the plot from his bias of mimicry and his attenuation to… (ytc_UgwNCBqY9…)
- Digital research ended up being not correct as it was completed by AI since all … (ytc_UgyyzyXPh…)
- Some issues with the time for other things. 'Other things like what exactly?' *Y… (ytc_Ugy_BaRzW…)
- You act like robot's in general will just become "sentient" at one point. Not al… (ytc_Ugg4D6Jf2…)
- he's probably pissed that google can't get a monopoly on AI to, and has to face … (rdc_m27yvj6)
- ALSO ON OTHER SEXTORTION VIDEOS YOU MAY SEE COMMENTS ABOUT TESTIMONIAL COMMENTS … (ytc_Ugz_MJf5K…)
Comment
It seems that the wise approach would be to not give AGI/SI the ability to directly manipulate matter in the our reality. No robots, drones, direct connections, factory/manufacturing control, self propagation technologies or self-governed/generative evolution. The old concept used in high security environments of having an “air-gap” between the AI and our infrastructure would severely hinder losing control to AI for a time. Our gravest danger is in relying on those, motivated by greed and fame, to self-regulate their progress instead of rushing full speed ahead over the precipice by giving SI too many resources and thus dooming our society.
youtube · AI Moral Status · 2025-04-27T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
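For reference, a coding result like the one above can be represented as a small record type holding the four coded dimensions plus the coding timestamp. This is only an illustrative sketch; the class and field names below are assumptions, not taken from the tool's actual code.

```python
from dataclasses import dataclass

# Hypothetical record mirroring the "Coding Result" table above;
# names and value examples are assumptions for illustration only.
@dataclass
class CodingResult:
    comment_id: str      # e.g. "ytc_Ugw8C__02AqdlowwDUp4AaABAg"
    responsibility: str  # e.g. "developer", "ai_itself", "none"
    reasoning: str       # e.g. "consequentialist", "deontological", "mixed", "unclear"
    policy: str          # e.g. "regulate", "none", "unclear"
    emotion: str         # e.g. "fear", "approval", "indifference", "mixed"
    coded_at: str        # ISO-8601 timestamp of when the coding was produced
```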
Raw LLM Response
[
{"id":"ytc_UgwlVBCMmb4O-BKCpuZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxjEqg3UwyhAtepWdF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgywNRNOUz0cnjev6E94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwvmW31laKm2ngQzD14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxTTNLkDBGt0rgndhV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxMnH5pTeRl6RJpQjx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw8C__02AqdlowwDUp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy21yCven-NPuB4wjN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzldAFsYNstEMrK6Yh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwv13S_u_8Qxv_f1FN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
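Because the raw response is a JSON array with one object per comment ID, looking up the coding for a single comment amounts to parsing the array and indexing it by `id`. The sketch below is illustrative only (not the tool's code), with the embedded sample response shortened to one entry copied from the output above.

```python
import json

# Minimal sketch: parse a raw LLM batch response and index codings by comment ID.
# The sample below is a one-entry excerpt of the response shown above.
raw_response = """
[
  {"id": "ytc_Ugw8C__02AqdlowwDUp4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]
"""

# Build an id -> coding mapping for direct lookup.
codings = {item["id"]: item for item in json.loads(raw_response)}

# Look up one coded comment by its ID (ID taken from the sample output above).
coding = codings.get("ytc_Ugw8C__02AqdlowwDUp4AaABAg")
if coding is not None:
    print(coding["responsibility"], coding["emotion"])  # -> developer fear
```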