Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I still believe that the only way to make an ai truly serve humanity is for huma…" (ytc_UgzwD6Wp3…)
- "@stektirade you'll never know if it has any autonomy or not, what if it's alread…" (ytr_UgybzcPms…)
- "I’m sure the person that replaced her also calls themself an ‘AI artist’; pathet…" (ytc_Ugye13Omc…)
- "Yeah, most \"helpers\" of the past have so far been merely incremental as opposed …" (ytr_Ugy46rnJ6…)
- "@LilView what is \"it\". How will we be able to know when we \"have it\". What does…" (ytr_UgzCa9KSN…)
- "3:12 AI has human perspective, because it is litterally built from human perspec…" (ytr_UgxkyUSf7…)
- "One thing I think AI should know is that we created it for the absent of our cre…" (ytc_UgxQo4FZy…)
- "They are ai because they had there 2024 and they went to the Alien plane in 2024…" (ytc_Ugzl7dH-W…)
Comment

> At the beginning of the Industrial Revolution there were high down-sides: slave-like working conditions, poisonous air and water, and blatant control by the monied interests. AI will have some pretty bad effects on society, but 50 years from now no one will even think twice about it, if in fact, they are able to think on their own at all.

youtube · AI Responsibility · 2025-09-30T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwTu-pAzH4BDAHBlT54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyiKpOan9nGOVqXIJB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy6ePsOCC2RdkOrM-14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzpEwg6_IIPTCXNzQh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxYvdUr4S_QY1gk9Vp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz2ZRNlC6MPvJ54y8l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwWlc37QqJFR1iiFQB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzSDHRgDoP_ADVZAVR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz6YnCg3Dut6T99lZ54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyY5wgU_FfS5ATgPi54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
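The raw response is a JSON array, one record per coded comment, so looking up a comment by its ID amounts to parsing the array and indexing it. A minimal sketch of that lookup, assuming only the field names visible in the response above (the `index_by_id` helper and the single-record `raw` string are hypothetical illustrations, not part of the tool):

```python
import json

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    and index the records by comment ID for direct lookup."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

# One record copied from the raw response shown above.
raw = ('[{"id":"ytc_UgwTu-pAzH4BDAHBlT54AaABAg",'
       '"responsibility":"none","reasoning":"consequentialist",'
       '"policy":"none","emotion":"approval"}]')

codes = index_by_id(raw)
print(codes["ytc_UgwTu-pAzH4BDAHBlT54AaABAg"]["emotion"])  # approval
```

Because the model returns the comment ID alongside each coded record, this index is also a quick way to check that every submitted comment actually came back coded.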