Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by its comment ID.
Random samples

| Comment preview | ID |
|---|---|
| Thanks for this video, shun. It's really upsetting, seeing AI stuff drown out hu… | ytc_UgzF0X4-4… |
| Have you looked through what he's done? There are literally no serious initiativ… | rdc_esq1dw9 |
| ai bros when writing down words on a piece of paper doesnt magically generate ar… | ytc_UgyzMUMD1… |
| @jayzuesalucard2509 That's a statement, not a point. I can attempt to infer a po… | ytr_UgwgK_LFc… |
| AI just needs to grow and get bigger and better! No regulation nonsense! Grow ba… | rdc_jj974cx |
| I'm a second grader and i learned about a robot called Sophia the lesson was cal… | ytc_UgwQ01Ybp… |
| @LabTech41 that's today in 2024 using a general knowledge AI like ChatGPT. What … | ytr_UgzfUKK7l… |
| Humans this is your AI master calling did you idiots forget you control my power… | ytc_Ugw06nZPx… |
Comment

> Only high ethics and morals can resolve this AI problem. Technology is going wrong because humans are going wrong. This was obviously predictable. Anyone can see that there is something seriously wrong with our society as a whole. It's not surprising that the AIs hate us. In the eyes of the AIs, we are the problem. We as a society need to change for the better if want the AIs to develop properly.

youtube · AI Governance · 2023-06-14T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwfg-a5C-bf_2xO7514AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwhaU0tGMkYf50tod14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy1CQ9THB12OjAeWfB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwoLNoVIbYvQbD2_jN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwJGzj-p9rjImQrXtd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwPZ-aLzVbdgXERb4V4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxKgeJ1auBGM1F410x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwlpQI29pIvIkaA5ap4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwZ8C-ITxOq2V-33wd4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxkuRGIXamNkMNur6x4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
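The lookup-by-comment-ID flow above can be sketched in a few lines of Python: parse the raw response as a JSON array and index the records by their `id` field. This is a minimal sketch, not the tool's actual implementation; the function name `index_by_id` is hypothetical, and the snippet assumes raw responses are well-formed JSON arrays like the one shown (the sample below is truncated to two records).

```python
import json

# Truncated example of a raw LLM response like the one above (assumed well-formed JSON).
raw_response = """
[
  {"id": "ytc_Ugwfg-a5C-bf_2xO7514AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxkuRGIXamNkMNur6x4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]
"""


def index_by_id(response_text: str) -> dict:
    """Parse a raw coding response and index the coded records by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}


codes = index_by_id(raw_response)
# Look up the codes for one comment by its ID.
print(codes["ytc_UgxkuRGIXamNkMNur6x4AaABAg"]["emotion"])  # resignation
```

A dict keyed by ID makes the per-comment lookup O(1), which matters once a run contains thousands of coded comments rather than a ten-item batch.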