Raw LLM Responses
Inspect the exact model output behind any coded comment, or look one up directly by its comment ID.
Comment
"Autonomy over your point of view, and yourself as a human person." Speilburg summed it up there... that's the real problem. Art imitates life imitates art. People have always been replaced by automation, and that progress shouldn't be stunted by the loss of those jobs, but when we're talking about influencing and guiding human perspective, that's where the danger is... this is also why the social media algorithms are so dangerous too; they guide our viewpoints into echo chambers and limit our ability to critically think from differing and conflicting ideas. The death of creativity for human beings has already started there.
youtube
2023-10-08T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx3HQ_HRfnMIBk0cvJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzcjPuk4Ag3QWjPIJl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxirp4XYeKcnFZyKKF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwAQSr2PbIWERax72R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzyYI7ocsawnsMNmDF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxxzDjVjxofPK3sJGN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz9zZ3E0fp7TkVR0NJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz2jFHgOoiTftMC0ah4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzeX2zdfluheu5JhmN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw-NXukFr8MxPZwe8h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
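The raw response above is a plain JSON array, one object per coded comment, so looking a comment up by ID is just a matter of parsing the array and indexing it. A minimal sketch in Python — the two rows are copied verbatim from the response above; the variable names (`raw_response`, `codes_by_id`) are illustrative, not part of any actual tool API:

```python
import json

# Two rows copied from the raw LLM response shown above (trimmed for brevity).
raw_response = """
[
  {"id": "ytc_Ugx3HQ_HRfnMIBk0cvJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz9zZ3E0fp7TkVR0NJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the batch by comment ID so any coded comment can be fetched in O(1).
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coded dimensions by its ID.
row = codes_by_id["ytc_Ugz9zZ3E0fp7TkVR0NJ4AaABAg"]
print(row["responsibility"], row["policy"], row["emotion"])  # ai_itself regulate fear
```

The same pattern scales to the full export: load every batch's raw response, merge the per-ID dictionaries, and the "look up by comment ID" view is a single dictionary access.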