Raw LLM Responses
Inspect the exact model output for any coded comment: look one up by its comment ID, or browse the random samples below.

Random samples
- ytc_UgzegRyjq…: "Take the seat up over the green sauce also Google AI straight up stupid as fuk…"
- rdc_o788aae: "Anthropic ToS always excluded the use of their service for things the government…"
- ytc_Ugwkzn30E…: "You guys are kind of silly yeah you got yourself driving car that anybody can ha…"
- ytc_Ugw5NC-Op…: "The guys never done manual labour - they aren't going to spend 100’s of thousand…"
- ytc_UgzyAHY7B…: "Humans doesn't need new 'partners'. We need tools that we can order to do what w…"
- ytc_UgzUBdmHq…: "If I loose my job due to AI I'm gonna give the AI companies the united healthcar…"
- ytc_UgwvCjvp2…: "I’m not convinced by the argument that AI doesn’t know what it’s doing. Frontier…"
- ytc_UgwZF60Oh…: "They are trying to force AI onto everyone but kids are an easy target and in so …"
Comment
Humanity is guaranteed to self-destruct eventually anyway. Probably sooner rather than later. Think about it; it is 2025, and we are still using war to solve border disputes, and we refuse to switch to green energy, even though it would create greater prosperity. The recent USA election has proven that democracy can't save us. AI is our only chance to avoid extinction.
Source: youtube · AI Responsibility · 2025-02-16T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwHHe0-Biubv204IfZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyN-SMq7xxNcIFTN-x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugysg9cmBMf9dgbWaUt4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw5n1NHSneSi6_76_14AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwIftpNBBTwxYAnrr14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzRhrkDidwwPL7IL6d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxX8AxD_RlDAcjZLFd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxqgvBpy0th4lAVG_h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwMiANo5VBxcQ89Anh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwhzZCypJ5_uONzHGN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
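The raw response is a JSON array with one record per comment, each carrying the four coding dimensions. A minimal Python sketch for sanity-checking such a batch before storing it; note the allowed values below are inferred only from the samples visible on this page and the full codebook may define more:

```python
import json

# Allowed values per dimension, inferred from the samples shown above.
# This is an assumption: the actual codebook may allow additional values.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "distributed", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"approval", "fear", "outrage", "mixed", "indifference"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and map each bad comment ID to its problems."""
    problems = {}
    for record in json.loads(raw):
        cid = record.get("id", "<missing id>")
        issues = []
        for dim, allowed in SCHEMA.items():
            value = record.get(dim)
            if value not in allowed:
                issues.append(f"{dim}={value!r} not in {sorted(allowed)}")
        if issues:
            problems[cid] = issues
    return problems

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}]'
print(validate_batch(raw))  # {} means the batch is clean
```

An empty result means every record parsed and every dimension held a known value; a lookup by comment ID (as above) is then just a matter of indexing the stored records by their `id` field.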