Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Its a bad paper. They believe the owners of the A.I. and governments will give us a basic univereal income or anything, that goes counter to human nature. We are greedy people. Were all going to be jobless and murdering eachother in the streets to eat on a daily basis.
There is a reason most scifi works of prominence arent utopian society's. The probability is just to low as all the power gets concentrated in the hands of fewer and fewer individuals.
youtube · AI Governance · 2025-08-02T09:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwyqQB5XyCEzHFtAMt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyrQpWmMRFrhNnBrBB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyOJQHGFBD594OOc3t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyK6TFIzC0yPPh_lIh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxHXGTjSbgUsGSWuHB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz8f5I6hel6Ic38fkx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyVaiDcbu1CsmYWlwh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzDy4WMaynrfhLIgwl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugwhf0I8djQ_B7uDfbd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwoVKvMCJNVYEQ-fCh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
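The raw response is a JSON array in which each object carries a comment ID plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of turning such a response into a per-comment lookup table, in Python with the standard `json` module (the variable names are illustrative, and the array is abridged to two of the entries above):

```python
import json

# Raw model output: a JSON array of coded comments (abridged from above).
raw_response = '''[
  {"id":"ytc_Ugwhf0I8djQ_B7uDfbd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwyqQB5XyCEzHFtAMt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]'''

# Index the coded dimensions by comment ID for lookup by ID.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Retrieve the codes for one comment, as shown in the Coding Result table.
entry = codes["ytc_Ugwhf0I8djQ_B7uDfbd4AaABAg"]
print(entry["responsibility"], entry["reasoning"], entry["emotion"])
# government deontological fear
```

Parsing the whole array at once also makes it easy to check that the model returned exactly one object per submitted comment before accepting the batch.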