Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "For this kind of task you would use an LLM as a classifier rather than as a gene…" (rdc_meo8xmg)
- "I am a major critic of AI and I think it does a lot of harm, but this case reall…" (ytc_Ugx7XYxYl…)
- "Social media sites and art software which automatically apply Nightshade technol…" (ytc_UgyQzD3UG…)
- "I love AI, and when it takes over, its gonna remember all of you talkin shit.…" (ytc_Ugx-if6i9…)
- "I am a pro-ponet of face recognition software because break in attempts like thi…" (ytc_UgyXwccYL…)
- "As long as AI is not anything like humans. We should be fine. Humans are the p…" (ytc_Ugx5JIlFQ…)
- "Young man I love the way you ended this…there is a place beyond this earth and i…" (ytc_UgzuJ1wr5…)
- "What is 3DS MAX but a simple crutch to cripple oneself. If you aren't physically…" (ytr_UgwqQHIx-…)
Comment

> Perhaps that's a way to control/quota power usage with the goal of limiting AI capability? Limit the size of data centers. Control that in the same manner as we control fissile material. Not even state actors should be allowed to exceed these limits. 🤷‍♂️

youtube · AI Governance · 2025-12-14T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
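The lookup-by-ID panel can be backed by a simple index over the batch response, keyed on each record's `id` field. A minimal sketch, assuming the record layout shown in the raw response on this page (the sample record is copied from it; function and variable names are illustrative):

```python
import json

# One record copied verbatim from the raw LLM response shown on this page.
RAW = '''[
  {"id": "ytc_UgxCWA83kPrOA5W5IH54AaABAg",
   "responsibility": "government", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "mixed"}
]'''

def build_index(raw: str) -> dict[str, dict]:
    """Map comment ID -> coded dimensions for O(1) lookup."""
    return {rec["id"]: rec for rec in json.loads(raw)}

index = build_index(RAW)
result = index["ytc_UgxCWA83kPrOA5W5IH54AaABAg"]
print(result["policy"])   # → regulate
print(result["emotion"])  # → mixed
```

The fetched values match the Coding Result table above, since both views are rendered from the same coded record.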
Raw LLM Response
```json
[
{"id":"ytc_Ugxa5ByEDwNrftNdBeN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxQLNRCoFVcAmcwETR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxCWA83kPrOA5W5IH54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzzKsXKnsMiK7UIN0Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzYlJEGQU9-dmWO-Od4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz8NR4lH3GDG3kBMJ54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxM5_wAiiecwXFLxQJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxA8DqJ0Z2yDIFK56x4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwta7Msvkq4AFYkXdt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzjNcgSLoAyAFC87XN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
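Because the model returns a JSON array, each batch can be parsed and checked before its codes are stored. A minimal validation sketch; the allowed values per dimension are inferred only from the records visible on this page (assumption: the full codebook may contain additional values):

```python
import json

# Allowed codes per dimension, inferred from the sample responses above.
SCHEMA = {
    "responsibility": {"government", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "mixed"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject malformed or off-schema records."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]')
print(len(validate(raw)))  # → 1
```

Failing fast here keeps off-schema codes (a common LLM failure mode) out of the coded dataset instead of surfacing later in analysis.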