Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- "These AI's will be our destruction Skynet terminator's for real these engineers …" (ytc_Ugy4cvqo5…)
- "I’m still waiting for us to run outta oil or is that not a thing no more? How wi…" (ytc_UgxyK0tlo…)
- "Since you guys are doing ai again, could you do another suno video? That would b…" (ytc_UgwTYT5Pw…)
- "The video is an excellent suggestion, but in my view, the battle for collective …" (ytc_UgxhQnUIu…)
- "Tech company’s do not want AI to be recognized as sentient. They lose money bott…" (ytc_Ugx85shPG…)
- "I dabble in AI art and I think this is hilarious. I firmly believe that it shoul…" (ytc_UgwfY03WU…)
- "Agree with creative director, but brand is more direction to dodo. Unfortunatel…" (ytr_UgwPOFWEf…)
- "Worst mistake in the world....make robot to take over us and our jobs and contro…" (ytc_Ugy732IAv…)
Comment
No automation without compensation. AI cannot exist without society's data, and it significantly changes how our economy will work, once it automates human labor. So it must be collectively owned, and provide everyone a return on their data investment. That way our economy can continue flowing.
We need an AI Prosperity Dividend for All, and it would probably look like a UBI.
Source: youtube · Video: AI Moral Status · Posted: 2025-08-17T04:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugw4Og5tkqfLTT3uLpx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyBpE-AR0AvH7r9z-N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyirftcM8bjUUkC0_F4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgyCPf3G_BoN94SKXTZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgymtQs4KyzNRDv-64R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz9pmHEziNiTXJFH5x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzRUe8lihVbZKefpIt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyQZGO4rKEQC29Kl1V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwuWI89FEeHg0VTpfB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzccAU8-DTbOdCM10Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
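The raw response above is a JSON array in which each element carries the four coded dimensions for one comment. A minimal sketch of how such output could be parsed and checked, assuming the category vocabularies inferred from the visible samples (the real schema may allow more values, and the `validate` helper is hypothetical, not part of the tool):

```python
import json

# Allowed values per dimension, inferred from the samples shown above.
# This is an assumption; the actual codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "user", "distributed", "ai_itself", "company", "developer"},
    "reasoning": {"mixed", "consequentialist", "contractualist", "unclear", "virtue", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "resignation", "fear"},
}

def validate(raw: str) -> list:
    """Parse a raw LLM response and check each row against ALLOWED."""
    rows = json.loads(raw)
    for row in rows:
        # IDs in the samples start with ytc_ (comments) or ytr_ (replies).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError("unexpected id: %r" % row.get("id"))
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError("%s: %s=%r not allowed" % (row["id"], dim, row.get(dim)))
    return rows

# Example using one row copied from the response above.
rows = validate(
    '[{"id":"ytc_UgyirftcM8bjUUkC0_F4AaABAg",'
    '"responsibility":"distributed","reasoning":"contractualist",'
    '"policy":"liability","emotion":"approval"}]'
)
print(len(rows))  # 1
```

A check like this catches the common failure mode of LLM coders: a syntactically valid response that silently invents an off-codebook label.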