Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
The economist discusses how rapid AI-driven automation could displace large swat…
ytc_UgwHCYW0q…
And in the near future they will become an intimate companion, and all cool and …
ytc_UgyC9Fk1d…
One hammer blow 🔨 on your robot, and the problem is solved 😂. Who's going to repa…
ytr_Ugy7Qq5Qu…
@boglenight1551 yes and automation shouldn't be replacing people's jobs. Where e…
ytr_UgzLr3xpx…
what until they make SUPER robots. Wait, robots. Wars. Super robots that fight w…
rdc_ff258fo
Really pretty annoying watching the same people that create this stuff constantl…
ytc_UgyXW_x1b…
@brianmi40 It's overhyped because the general conversation these days is that AI…
ytr_UgxAcioiW…
Edgardo Peregrino No, they will not become self aware. In fact, that is highly u…
ytr_UgiQlhIkT…
Comment
You can clearly see the prof is a BBC watcher. Musk is the only one who pushes for AI safety. He cannot know that because he got his information from the BBC like he said, his bubble. The irony.
youtube
AI Governance
2025-06-21T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugwe6p494MSZUiKNeFp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyXcGe1ucGGXsRVGct4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxcVdoVb4SA9XtXZdV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw4HRTNK7LG8Z_guIZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxQJeg5xFxp-OJEI5p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyQTRHtHfaSrhO1z8V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyoKPVrHozCtQDY-xt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzokjAV7-XipTOGuV54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgzsnsuMY5cJHqOXw8N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyI2T7eXrZ-2rJlHvN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
```
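A raw response like the one above can be checked before the codes are stored. The sketch below parses the JSON array and flags any record whose value falls outside the per-dimension code sets; the allowed sets are inferred only from the values visible on this page, so the real codebook may contain additional codes.

```python
import json

# Allowed codes per dimension, inferred from the samples shown above;
# the full codebook may include values not visible here (assumption).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"unclear", "industry_self", "liability"},
    "emotion": {"fear", "indifference", "approval", "outrage", "resignation"},
}

def validate(raw: str) -> list:
    """Parse a raw LLM response and return (id, dimension, value)
    triples for every code outside the allowed sets."""
    problems = []
    for record in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            value = record.get(dim)
            if value not in allowed:
                problems.append((record.get("id"), dim, value))
    return problems

raw = ('[{"id":"ytc_Ugwe6p494MSZUiKNeFp4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
print(validate(raw))  # [] — every code falls inside the allowed sets
```

An empty list means the response is safe to ingest; a non-empty list pinpoints exactly which comment ID and dimension the model mis-coded.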