Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click a row to inspect)

- "Looking today on lack of moral, lak of humanity in our world leaders, how they a…" (`ytc_Ugzz3FqgA…`)
- "AI's shouldn't be used in most situations involving humans because it hasn't yet…" (`ytc_Ugy5Z1Alw…`)
- "Socialism is the only equitable solution to the ever increasing probability of a…" (`ytc_UgwGWFRep…`)
- "We all know rich ppl want to wipe all working class. It's not AI. AI is just a t…" (`ytc_UgzgUpeHx…`)
- "I bet ai can run the fed better than Jerome Powell or whatever his name is even …" (`ytc_Ugxu7MAIC…`)
- "So if i have a tesla for lets say twenty years the ai brain in the car would hav…" (`ytc_UgxEwghKx…`)
- "I think the use of the word "never" in the title is a bit misleading, as it impl…" (`ytc_UgyyXEzaL…`)
- "I myself have had two specific chats, two instances with ChatGPT, i think in a p…" (`ytc_UgxeJH6q3…`)
Comment

> Solution to save humanity from AI: 1 law — only citizens can own AI/robots. Corps pay 50% wages per AI/robot. Each citizen can own 5 = earn 2.5× income. Corps cut costs, citizens win freedom, economy booms, gov gets 2.5× taxes, creators of AI profit more.

| Field | Value |
|---|---|
| Source | youtube |
| Video | AI Jobs |
| Posted | 2025-10-07T10:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwZqrxVndYP1x-ij8V4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgyHsOk2TCa8DjlXHRF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwjNS6y5ZyKT39iAxN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwXd37p9mSkx7tZmst4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugz1SsMwYhG-_uPxPf14AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzXSFE9HYaTVSMlgnZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx_jNpyz-AEFUHb-YF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyJAe5MlNRgcHd4RlZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz9H6qGHAgTNr2WVeh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzssBOsmc_EpCSLhid4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
```
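A raw response like the one above can be turned into a per-comment lookup table with a few lines of Python. The sketch below is an assumption about how such a response might be consumed, not the tool's actual code; the allowed category sets are inferred only from the values visible on this page, and the real codebook may define more.

```python
import json

# Category vocabularies inferred from the examples on this page
# (assumption: the real codebook may include additional values).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"approval", "indifference", "outrage", "fear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response (a JSON array of coded comments)
    into a {comment_id: codes} map, skipping any record whose id is
    missing or whose values fall outside the known vocabularies."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue  # cannot key a record without a comment ID
        codes = {dim: rec.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded

# Usage with the first record from the response shown above:
raw = ('[{"id":"ytc_UgwZqrxVndYP1x-ij8V4AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"indifference"}]')
result = parse_coding_response(raw)
print(result["ytc_UgwZqrxVndYP1x-ij8V4AaABAg"]["policy"])  # liability
```

Validating against the vocabularies before accepting a record is one way to catch malformed or hallucinated LLM output early rather than letting it propagate into the coded dataset.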