Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by its comment ID.
Random samples:

- "It's not a true self driving car. It's level 2 automation, basically it's adapti…" (rdc_dj6dtvn)
- "Ai bros problem in 2 years: Man, I can't teach my bot to art properly because th…" (ytc_UgzCGmVzg…)
- "@AlekseyMaksimovichPeshkov they gonna put a robot slave in everyone's house to …" (ytr_UgzqflEMq…)
- "Regarding the title, I feel like "real" news networks have been doing that for y…" (ytc_UgyUQtff2…)
- "You can already see small use cases of what will eventually completely obliterat…" (ytc_Ugz6quZsM…)
- "Silicon Valley and "The Church of Technological Progress", Elon Musk as their "M…" (ytc_UgzfKdBt0…)
- "write the script for a video essay in the style of the youtuber exurb1a on wha…" (ytc_UgzvicKY-…)
- "Full self driving doesn't account to supervised, that would be "driver supervise…" (ytr_Ugw4Zo4gl…)
Comment
I've been thinking about this. Research is one of the jobs it is supposed to eliminate, right? How could it? AI only deals with info already in computers. What about new information? New research in labratories? New inventions? New discoveries? Information is downloaded from higher Consciousnesses through human consciousnesses. There are always going to be people who have new information to share. Let's not get carried away! Think!
youtube · AI Governance · 2025-08-02T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
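Each coding assigns one value per dimension. A minimal sketch of how such a row could be validated, assuming a category vocabulary inferred only from the values visible in this log (the real codebook may include more categories):

```python
# Allowed values per dimension, inferred from the sample codings shown in
# this log; this is an assumption, not the project's actual codebook.
SCHEMA = {
    "responsibility": {"none", "distributed", "unclear", "ai_itself"},
    "reasoning": {"mixed", "deontological", "unclear", "consequentialist", "virtue"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "resignation", "approval"},
}

def validate(row: dict) -> list:
    """Return the names of dimensions whose value is missing or outside the schema."""
    return [dim for dim, allowed in SCHEMA.items() if row.get(dim) not in allowed]

# The coding from the table above passes; a bogus value is flagged.
row = {"responsibility": "none", "reasoning": "virtue",
       "policy": "none", "emotion": "approval"}
print(validate(row))  # []
```

Running every coded row through such a check before analysis catches model outputs that drift outside the expected categories.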
Raw LLM Response
```json
[
  {"id": "ytc_Ugx7RtouMZWmh4SRNX14AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxg6j0Vhfc_WConsiV4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyNqbhM1F6-E-3eCNt4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx3oOSBJaM85GNgkT54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyZaYrbQU2mzXl3gFR4AaABAg", "responsibility": "unclear", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxtJhtcRF51V61DND14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwsVgPH3vPfax4W4x14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyASv8BluvAq27cnut4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgypIuAHP7lcb-1bVOF4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyGVKWdaWj45_xblzp4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
```
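The look-up-by-comment-ID flow above can be sketched in a few lines: parse the raw response as JSON and index the rows by their `id` field. The helper name and the two-row sample payload are illustrative; the field names match the response format shown above.

```python
import json

# A shortened raw LLM response in the format shown above: a JSON array of
# coding objects, one per comment, each keyed by comment ID. (Sample rows
# copied from the response above; the helper itself is a hypothetical sketch.)
raw_response = '''
[
  {"id": "ytc_Ugx7RtouMZWmh4SRNX14AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyGVKWdaWj45_xblzp4AaABAg", "responsibility": "none",
   "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
'''

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM coding response and index its rows by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codings = index_by_id(raw_response)
print(codings["ytc_UgyGVKWdaWj45_xblzp4AaABAg"]["emotion"])  # approval
```

With the rows indexed this way, the coded dimensions for any sampled comment can be retrieved by the same IDs shown in the sample list.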