Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "this is so horrific to think about. if I had been born just a couple of years la…" (ytc_Ugx5hJEzP…)
- "We appreciate your observation. In the video, the conversation focused on Sophia…" (ytr_Ugyfonh_p…)
- "wont happen because writing new code is only 10% of the job. 90% are changing *e…" (ytc_UgzIZwXDq…)
- "dude I saw the thumbnail one day scrolling and then forgot about it until today …" (ytc_UgxB00cRj…)
- "\"surveillance capitalism\" Capitalism - private ownership of the means of product…" (ytc_UgwFdbrfr…)
- "Yep, AI doesn't understand context, nor have any way to discern fact from opinio…" (ytr_Ugx_yIDeK…)
- "@SerenityHaven01 This is an insane take. There aren't ethics in physics, Newton …" (ytr_UgyPjH8n4…)
- "i barely have any friends but just the idea of a robot replacing human interacti…" (ytc_UgglAUUbH…)
Comment

> The question we're dealing with here is
> Is the AI imitating consciousness using its mass knowledge, or is it actually becoming conscious. In the similar way that a sociopath May imitate human empathy and other forms of behavior that a sociopath doesn't feel simply by knowing about them. Is the AI imitating human behavior and speech patterns and emotional patterns or is it actually feeling the emotions it's projecting

| Source | Topic | Posted |
|---|---|---|
| youtube | AI Governance | 2023-07-07T08:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzakhFmg7RcGHuexBJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxcwwsjaK5EI1vuhXN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwLiPwewQDYDJLwUrh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwj-YafnICYSf_FxJJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw5lfALj_qO095n-KF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzSuYCHW5D71-UCZcR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwahZYf07BgxRcIWp14AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwmG_ONPSW9qAsdt8N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxvSurp6Ek6uOR1Z-p4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugwt4c_oIptDFFNuPNB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
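The "look up by comment ID" step above can be sketched as parsing a raw batch response and indexing each coding by its `id`. This is a minimal sketch, assuming the raw response is a JSON array in the format shown above; the function name `index_codings` and the two-record sample are illustrative, not part of the actual tool.

```python
import json

# The four coding dimensions observed in the raw responses above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Illustrative two-record sample in the same shape as the raw response.
RAW_RESPONSE = """[
  {"id": "ytc_UgzakhFmg7RcGHuexBJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwLiPwewQDYDJLwUrh4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID."""
    records = json.loads(raw)
    index = {}
    for rec in records:
        # Guard against a record missing a dimension; fall back to "unclear",
        # the placeholder value used in the coding-result table.
        index[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return index

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgwLiPwewQDYDJLwUrh4AaABAg"]["policy"])  # ban
```

Indexing once up front makes each subsequent ID lookup O(1), which matters when inspecting individual comments out of a large coded batch.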