Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- @JD_2020 Interesting take, but you’re assuming a lot here. Knowing that YouTube’… (ytr_UgzSaqkb-…)
- @hughobrien4436 Computers leak data, they share it all the time to train the mod… (ytr_UgyiL8rvc…)
- As a fledgling artist at 28 years old, I already felt a lot of pressure trying t… (ytc_Ugx-oyyoD…)
- There is no way that the writers and actors can prevent companies from using AI… (ytc_UgxLEcG5v…)
- exactly, it looks out to hear key words and gives scripted responses to said key… (ytr_UgxEtihXE…)
- I feel he’s trying to relieve himself of guilt by saying all these things now. H… (ytc_UgxeRjmkj…)
- I don't doubt that these things do happen, but the way this story was written fe… (rdc_nrw02ez)
- "Start at how scocierty are so willing to hand over their critical thinking to A… (ytc_Ugw4ZlM6d…)
Comment
There is nothing to be done to control AI development NOT because no one is paying attention; but because the first nation to enjoy the military, economic, etc., advancement AI is certain to deliver could be rewarded with a dominant world competitive position or menacing opportunities. So, our fatal flaw?? -- having never found a way to live as a welcoming and loving worldwide community.
youtube · AI Governance · 2025-10-16T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxYZZUWf1e0BmiKVjB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugyj61IC9y4O1eajFIx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugw3j7ix_m4O6fjeX954AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy4-TMZuxbJYngQ8Mx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw58XvKpbBYlzWFchJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyGHqp3D-7GTb5h_Id4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwzjuhdXejZ9g-vYPJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxEeUhfSG0D_ImVweV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzohPviAeIyf6Vgdm54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxjhUMpviZsQdzeHKJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
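The raw response is a JSON array with one object per comment, keyed by `id` and carrying one field per coding dimension from the table above. A minimal sketch of the look-up-by-comment-ID step, assuming exactly that shape (the `lookup_coding` helper is hypothetical, not part of the tool):

```python
import json

# Coding dimensions shown in the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_coding(raw_response: str, comment_id: str):
    """Parse a raw LLM batch response and return the coding for one comment.

    Assumes the response is a JSON array of objects, each with an "id"
    field plus one key per coding dimension, as in the example above.
    Returns None if the comment is not in this batch.
    """
    for record in json.loads(raw_response):
        if record.get("id") == comment_id:
            # Fall back to "unclear" for any dimension the model omitted.
            return {dim: record.get(dim, "unclear") for dim in DIMENSIONS}
    return None

# Example with the record that matches the coded comment above:
raw = ('[{"id":"ytc_Ugw3j7ix_m4O6fjeX954AaABAg",'
       '"responsibility":"government","reasoning":"consequentialist",'
       '"policy":"none","emotion":"resignation"}]')
coding = lookup_coding(raw, "ytc_Ugw3j7ix_m4O6fjeX954AaABAg")
# coding == {"responsibility": "government", "reasoning": "consequentialist",
#            "policy": "none", "emotion": "resignation"}
```

Matching on the stable comment ID rather than array position keeps the lookup robust when the model drops or reorders items in a batch.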