Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (select a row to inspect):

- "We got copilot at work like 6 months ago, it makes tons of mistakes and the issu…" (`ytc_UgyY26nVX…`)
- "He clearly is saying he's using AI, so I don't see a problem here why are people…" (`ytc_UgyqzNQB7…`)
- "@aldin20 i think architecture is a bit more complex with ai just because its les…" (`ytr_Ugy2mD3o3…`)
- "Humans are subsidising the Ai development.. Power and manufacturing are all done…" (`ytc_Ugxyx31xY…`)
- "@JUICYbluepanda2 i didn’t say any of that nor is what you said my opinion. I sa…" (`ytr_UgxTIhUKy…`)
- "Me personally as someone that draws and tapping into animation I have no problem…" (`ytc_UgxiUx6fy…`)
- "Sophia's talk on wisdom just confirms why I use AICarma to get my brand noticed …" (`ytc_Ugyz-Jf9R…`)
- "But AI is not set up to transform the paradigm of treating pathology after it ha…" (`ytc_UgxFBqfab…`)
Comment
I know it’s very sci-fi/dystopian but if these systems keep advancing and computers are able to learn how to fix themselves using other robots/computers etc; I feel it’s a very real possibility in, I don’t know, 200-500 years that these robots and systems could turn on humanity and turn the world into an AI/computer society and humans would have no way of stopping it and would ultimately be wiped out. Kinda like terminator except I don’t think it would look like that
Source: youtube · Posted: 2025-02-01T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
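Each coding assigns exactly one value per dimension. As a minimal sanity check, a coding can be validated against the value sets observed in this run. Note that these allowed sets are only inferred from the codings shown on this page; the full codebook may define additional categories.

```python
# Dimension values observed in this coding run (inferred from this page;
# the actual codebook may allow more categories).
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "developer", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "indifference", "resignation"},
}

def validate(coding: dict) -> list:
    """Return the dimensions whose value is missing or outside ALLOWED."""
    return [dim for dim, ok in ALLOWED.items() if coding.get(dim) not in ok]

# The coding result shown in the table above passes the check.
coding = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "ban", "emotion": "fear"}
print(validate(coding))  # []
```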
Raw LLM Response
```json
[
  {"id":"ytc_Ugyz8LKw2HwM5v870kV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwNZkpnlHywoCPOsLR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwvBUMeP8fX2d5ZGSV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxIoY0fBVsT6S0nm2B4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxsPbLTKk_2nSQM5lB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw8Aag95PwXCWD8qS14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz51lCHbNxgTorYh2x4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz01A2y_VHJGSqbA4t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugyb5vaK-ovzXxWgMU54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzG2N--D94wRjM5kht4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
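A response like the one above can be parsed and indexed so that the "look up by comment ID" view resolves a coding in one step. A minimal sketch in Python, assuming the model returns a plain JSON array as shown; the two example records are copied verbatim from the response above.

```python
import json

# Raw LLM response: a JSON array of per-comment codings, as in the batch above
# (two records copied from it for illustration).
raw_response = '''
[
  {"id": "ytc_UgwNZkpnlHywoCPOsLR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugyz8LKw2HwM5v870kV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

def index_codings(raw: str) -> dict:
    """Parse a raw model response and index the codings by comment ID."""
    codings = json.loads(raw)
    return {c["id"]: c for c in codings}

by_id = index_codings(raw_response)
coding = by_id["ytc_UgwNZkpnlHywoCPOsLR4AaABAg"]
print(coding["policy"], coding["emotion"])  # ban fear
```

Indexing by ID also makes it easy to detect when the model dropped or duplicated a comment in a batch, by comparing the key set against the IDs that were sent.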