Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Ai for replacement? Nah. Ai for Aid people assignment specific need for, not 100…" (`ytc_Ugx-e3ACS…`)
- "Ai is not anywhere near self aware in its current state. It's a word guessing …" (`ytc_UgxdP6QQu…`)
- "There are risks, but not as big as Yampolskiy is making out here. We are not goi…" (`ytc_Ugx612rMb…`)
- "My theory is that AI is not as sophisticated as everyone is trying to make us be…" (`ytc_UgztSgqqD…`)
- "Even AI stated that the AI industries request to Congress for regulation was jus…" (`ytc_UgwQOiyO3…`)
- "Im proud to live in Colorado, the first state who passed a law to regulate the u…" (`ytc_UgwFUk2xH…`)
- "maybe some prompts are better than others, sure, but there is no such thing as a…" (`ytr_UgzpKgaRv…`)
- "Needing to convince my friends to stop using ai and just talk to me instead when…" (`ytc_UgynqrhT9…`)
Comment

> I'm confused. People invented AI, and now they're worrying about them and their existence. We have full control of what AI will do to impact the world. It's not a joke how strong these robots are, but we could just stop making them, and then our non-existent problem will be easily fixed.

youtube · AI Governance · 2024-11-08T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[{"id":"ytc_Ugyq9Qzyj1sUeAIYsBN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxAFSVinb97o7r4TmF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgymunjQEqXTWAaELr94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgxNzcVgQNL_-tn4fKN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgwiBETzcN_ZGnsmwOh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgwREs1EBBFLKt6VdKB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgwmEQLWg9c-qwLYef14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgwqJLhRFcJ0qX0Ex5h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgybY24isA9BeaSbHrN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
 {"id":"ytc_UgzykIdR-VPX7ZmkVX54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}]
```
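The raw response is a JSON array of per-comment codings, one object per comment, each carrying the four coding dimensions. The lookup-by-ID step could be sketched like this (the function name and sample IDs are hypothetical; the field names come from the response shown above):

```python
import json

# The four coding dimensions, as they appear in the raw response schema.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_comment_id(raw_response: str) -> dict[str, dict]:
    """Parse a raw batch response (a JSON array of coded records)
    and index the valid records by their comment ID."""
    records = json.loads(raw_response)
    indexed = {}
    for record in records:
        # Skip malformed records missing an ID or any coding dimension.
        if "id" not in record or any(d not in record for d in DIMENSIONS):
            continue
        indexed[record["id"]] = record
    return indexed

# Hypothetical two-record response in the same shape as the one above.
raw = '''[
  {"id": "ytc_AAA", "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_BBB", "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"}
]'''

coded = index_by_comment_id(raw)
print(coded["ytc_AAA"]["policy"])  # → industry_self
```

Indexing once and looking up by ID keeps each "Look up by comment ID" query O(1), rather than rescanning the raw response for every inspection.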