Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “I don’t think this lawyer understands how AI works at all. You can become an exp…” (ytc_UgyNUTu_-…)
- “Sjamsucks clarified. I know what you are referring to now and I get it. But I do…” (ytr_UgztNF_84…)
- “Did you label the time sections on this video or did Youtube do that automatical…” (ytc_Ugz2O4DNY…)
- “The Israeli children on October the 7th killed, pillaged, and burned don’t come …” (ytc_UgzyPAJZv…)
- “Sabine, you lost me at hallucinations. it isn't a bug, is a feature. Humans crea…” (ytc_Ugxr7adwK…)
- “man will destroy man with AI/robots. the divide between the haves and have nots…” (ytc_UgzRg5SrG…)
- “@rogerbrown376 Yep. You got it. Even if they aren't recieving support from the A…” (ytr_UgxBYeWCC…)
- “Good video, a lot of inside baseball stuff. I'm not a programer, I just use a lo…” (ytc_UgxDBjvzq…)
Comment
When we approach the subject in terms of postmodern philosophy, of course, it seems possible to encounter a suspicious situation. For example, it can be argued whether a scientific achievement belongs to human or artificial intelligence. This is the work of philosophers. What is democratic and beneficial is to extend access to artificial intelligence to all people. Supervision and control will emerge as a social consequence. A lot can be said about this; but I am sure they will not be harmful. They will be very useful.
youtube · AI Governance · 2023-04-18T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
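The four coding dimensions in the table above can be modeled as a small record type. A minimal sketch, using only values that appear in this page; the `CodingResult` dataclass is illustrative, not the tool's actual schema:

```python
from dataclasses import dataclass

@dataclass
class CodingResult:
    """One coded comment: the four dimensions plus the coding timestamp."""
    responsibility: str  # e.g. "distributed", "government", "company"
    reasoning: str       # e.g. "contractualist", "consequentialist"
    policy: str          # e.g. "regulate", "ban", "liability", "none"
    emotion: str         # e.g. "approval", "fear", "outrage"
    coded_at: str        # ISO-8601 timestamp

# The result shown in the table above, as a record.
result = CodingResult(
    responsibility="distributed",
    reasoning="contractualist",
    policy="regulate",
    emotion="approval",
    coded_at="2026-04-26T23:09:12.988011",
)
print(result.policy)  # → regulate
```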
Raw LLM Response
```json
[
{"id":"ytc_UgzkwNGcmwP9qpL-j0x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgypOPMIJYiESP553E54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwVOMYTX8-sXSDWwo94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxRFqh0i3r2Q_k9HTl4AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzgTOuyIjpT-MFfUCN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyuLdqlCxaIWif-ngV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwpIkUZAX-gyxIgC2l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyr6akwCh-QxA86pLR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxWLoH9EhF9aWEXcQJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwchyYLGGh104PMPb54AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
```
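A raw response like the one above can be parsed and indexed by comment ID, which is the lookup the page supports. A minimal sketch, assuming the response is a well-formed JSON array; the `index_by_id` helper is hypothetical, and the two records are copied from the response above:

```python
import json

# Two records excerpted verbatim from the raw LLM response shown above.
RAW = '''[
{"id":"ytc_UgzkwNGcmwP9qpL-j0x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzgTOuyIjpT-MFfUCN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]'''

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and index its records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(RAW)
rec = codes["ytc_UgzgTOuyIjpT-MFfUCN4AaABAg"]
print(rec["policy"], rec["emotion"])  # → regulate approval
```

In a real pipeline the same index would let a "look up by comment ID" query return the coded dimensions without rescanning the full response.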