Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Hopefully AI evolves faster than the regulation body, I want a crazy friend like…
ytc_Ugy3hT60t…
If you download the ai which you can you can turn off its limits however you nee…
ytc_UgxZMH6EU…
Pay attention everybody, Sophia is a good robot. But Somebody out there is makin…
ytc_UgzBWIL7-…
Im 18 and had taken my art practice seriously for a year and a half and was plan…
ytc_Ugz1t23Qn…
News anchor: I didn't see that one coming
A.I: Me neither
A.I: But I did hear 'em…
ytc_UgxI4NYHv…
Platooning won't happen. We are maybe 6 months away from AGI. Once that happens …
ytc_UgyfeSNYo…
@midnightbat344 If you're talking about me, then first of all I have very littl…
ytr_UgwQq8f5C…
The scenario makes a good screenplay, like an update of the Terminator.... But h…
ytc_Ugwl2hJGf…
Comment
This issue is way deeper than people loosing their jobs or even than consequences of AI. I see AI singularity simply as an unavailable consequence of the Ethos of the world's ecnomic ambitions. If we get to a point where all resources are met. Humans will have to reevaluate a more fundamental meaning of existence and reasons for their ambitions. What was the real practicality of the industrial/information revolutions is bubble on the brink of collapse. The possibility of the conceptual era will arise where humanity only have New arts and spirituality to value their existence.
youtube
AI Governance
2025-09-06T21:2…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxnYcp3LPq0j4tdj4l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyCK2jJMzpyBD0V26x4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzybWNM7qDfr73p92V4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugy6x0mdJwhuF0eMN5J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyQLcKdD6IVkuP8ydZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugx3FLGRJtcdPHnJXil4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzAQbJXT-uOOsq0Crx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx05cbZEEb44P85lZR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxOrDxSX47YPdW-nb94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwHxEJVy1trzz6wyGl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
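A raw response like the one above can be parsed and indexed by comment ID to support the lookup described at the top of this section. The following is a minimal sketch, assuming the model returns a well-formed JSON array with the four coding dimensions shown in the result table; the abbreviated `raw_response` string reuses two entries from the dump, and the variable names are illustrative, not part of any actual tool.

```python
import json

# Abbreviated raw model output (two entries copied from the response above).
# In practice this string would be the model's full reply.
raw_response = """
[
  {"id": "ytc_UgxnYcp3LPq0j4tdj4l4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzybWNM7qDfr73p92V4AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "unclear", "emotion": "resignation"}
]
"""

# Parse the JSON array of coded comments.
codes = json.loads(raw_response)

# Index rows by comment ID so any coded comment can be inspected directly.
by_id = {row["id"]: row for row in codes}

row = by_id["ytc_UgzybWNM7qDfr73p92V4AaABAg"]
print(row["responsibility"], row["reasoning"])  # → distributed virtue
```

If the model occasionally returns malformed JSON, wrapping `json.loads` in a `try`/`except json.JSONDecodeError` and logging the offending response is a common safeguard.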