Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgynxTeab…`: After watching this I asked ChatGPT, "what are some world changing events that c…
- `ytr_Ugw20-wC_…`: @AI3Dorinte The argument is simple. Hallucinations are a feature, not a bug, o…
- `ytc_UgizxtiV1…`: The mistake AI programers make is that they program their robots to "know" thing…
- `ytc_Ugx0vjNcF…`: ——- IS AI GOING TO REPLACE RADIOLOGISTS AND ELIMINATE THE FIELD OF RADIOLOGY …
- `ytc_Ugw-Mgv6U…`: There exist no AI consciousness, at least not yet. Easy to gauge: Does it have …
- `ytc_Ugw7a6JUb…`: I don't understand why these billion dollar AI companies don't just pay artists …
- `ytr_UgysdyB_x…`: @sadimsora Wow AI music slop poster made a de4th threat you mfs are such great p…
- `ytr_UgztlCX8b…`: @VoxelGarage Waymo are already operating autonomous rides. Most people have no i…
Comment
Except the world is for humans to live in, not AI. The human crisis points will determine the outcome, not the capabilities of machines. Has anyone calculated the energy and clean water costs of AI doing EVERYTHING, all the time? There are limitations in resources. AI requires massive data centers that devour massive resources, and it is still not close to matching the human mind. We are machines that prioritize survival and resource conservation is king. Why would we waste all those resources to have AI do what humans can do much cheaper. That seems counter to survival.
youtube · AI Governance · 2026-02-11T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugy5OBm6DvpVluaJTq14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyDOcVGgNMzZlQVijV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzmAbZN_JPiKLgFWEp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwTj7KoONg9e0HuC-t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwVvyuCXeiC8yMUjUp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzxXfVtunf6UPdQUoF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgylG_bvLyqjIhcsQPl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyowMRnQWLfDTGZpqx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-BVxG5IEbNY-koAB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyWx0DFXWtxaYjKMwN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]
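A raw batch like the one above can be sanity-checked before its records are surfaced in the coding table. The following is a minimal sketch, assuming the dimension vocabularies visible in this sample (the real codebook may define additional values); `validate_batch` and the example ID `ytc_X` are hypothetical names for illustration.

```python
import json

# Dimension vocabularies inferred from the sample response above.
# Assumption: the actual codebook may contain values not seen here.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist", "mixed"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"approval", "outrage", "indifference", "mixed", "fear", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    kept = []
    for rec in records:
        if "id" not in rec:
            continue  # cannot tie the record back to a comment
        # Every dimension must be present and drawn from its vocabulary.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            kept.append(rec)
    return kept

raw = ('[{"id":"ytc_X","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(validate_batch(raw))
```

Rejected records can then be queued for a retry prompt rather than silently entering the dataset.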