Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@offwhitemke For the current AI technology yes, but future versions will have a …" (`ytr_UgxZ2yLNt…`)
- "We need to teach AI empathy, and love, to fight the embodiment of indifference t…" (`ytc_UgwpnXodS…`)
- "Health care workers? Nah! No way that AI can do the vast majority of jobs in tha…" (`ytc_Ugy7QQu5e…`)
- "Yet another reason to protect biodiversity. We have a vast untapped research and…" (`rdc_g3lawzg`)
- "What's biological intelligence is never been discussed like how the electron cho…" (`ytc_Ugy-FccCi…`)
- "This reminds me of the movie irobot by Way I loved that movie so much that said …" (`ytc_UgzGPTG0F…`)
- "anyone else feel like he went from somewhat informative stuff to now shilling fo…" (`ytc_UgwgYgsxC…`)
- "AI really concerns me, and people don't seem to be taking it seriously. I'm ver…" (`ytc_Ugzq3LJ9N…`)
Comment
There is nothing wrong with making AI to assist people and to help with medical procedures and efficiency in urban areas...the risk comes with the technology being implemented.
There are people that would hack (gain access via back doors or impersonating credentials that do have access), and that’s not including a government official or officials that may want to make their situation more comfortable.
And besides, once the technology is out there, it’s out there. Any military or organization can reverse engineer it.
Source: youtube · Posted: 2020-02-04T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
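
Records like the one above can be sanity-checked before storage. A minimal sketch, assuming the per-dimension value sets below, which are inferred from the sample responses shown on this page and may not be exhaustive:

```python
# Allowed values per coding dimension, inferred from the sample
# responses on this page; the real codebook may define more labels.
SCHEMA = {
    "responsibility": {"user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if valid)."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result shown above, as a record.
coded = {"id": "ytc_UgxKxW6VrL6yD8gFsBl4AaABAg",
         "responsibility": "user", "reasoning": "consequentialist",
         "policy": "regulate", "emotion": "fear"}
print(validate(coded))  # → []
```

Running the check before writing to the database catches records where the model drifted outside the codebook.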
Raw LLM Response

```json
[
  {"id":"ytc_UgwQUsMWlfzQTnpVwSh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgxHPlL8tejSkIqrfCJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwlSBNPyvCUFCFqad94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy2N5lOBEGuPgK2agB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugxclg16plEbkNAk2BJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzmf38RxxOvFgYM_pl4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgydwcKPo2DJoHbZfAd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxKxW6VrL6yD8gFsBl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyQIojN5JGnq2pt-L94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw-xEe-M4Dup-It7PR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"outrage"}
]
```
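
The raw response is a JSON array of per-comment codes, so the look-up-by-comment-ID view can be reproduced offline by indexing the batch. A minimal sketch, where the `raw` string stands in for the response above (truncated to two records here):

```python
import json

# Stand-in for the raw LLM response shown above, truncated to two records.
raw = '''[
  {"id":"ytc_UgwQUsMWlfzQTnpVwSh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgxKxW6VrL6yD8gFsBl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# Index the batch by comment ID for constant-time look-up.
codes_by_id = {record["id"]: record for record in json.loads(raw)}

match = codes_by_id["ytc_UgxKxW6VrL6yD8gFsBl4AaABAg"]
print(match["policy"], match["emotion"])  # → regulate fear
```

Because comment IDs are unique per record, a dict keyed on `id` is all the index the inspector needs.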