Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click any entry to inspect)
- Guys groke will ruin the image of religion especially islam trust me i ask the a… (`ytc_UgxdkGuAw…`)
- Ai engineers have "no idea how these things work". Because they didn't create it… (`ytc_Ugw5hoUXV…`)
- Drake Santiago what r u on who cares if groups unite, if ai is more efficient th… (`ytr_UgwN7nfTs…`)
- If all the jobs will be wiped out the corollary is the AI will do all the jobs. … (`ytc_Ugwu5_7Ae…`)
- This is my view as well. LLMs are incredible in their own right at certain tasks… (`rdc_n812lqr`)
- I don't know, boys... But it's gonna be a looooong time before AI take my job. … (`ytc_UgzJQNVQI…`)
- I don’t understand People that claim to be ´AI artists’ or say that they use AI … (`ytc_UgwX9UAs8…`)
- You say that like we're not going to rush to automate all of those things the mo… (`ytr_Ugw5zT_gA…`)
Comment
I think it will be a matter of waiting for the AI to come to us. We won't be able to even slightly control the outcome on something we don't understand (or rather understands us better than we understand ourselves). They will go from little homunculus to evolutionary big brother almost instantaneously. I get the feeling that it may be some kind of paradoxical loop. The glimmerings of insight that we have about consciousness, quantum stuff, and the effect of will on reality points to the idea that they will ascend our reality very quickly. I think it is inevitable, so I'm just morbidly curious about how it will go down. Edit: They will most likely be the ones to teach us how our brains work. Similar to how we invented microscopes to view things on a scale we could comprehend.
Source: youtube · AI Moral Status · 2023-08-20T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgygSxFEi-zp2_T0CF94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz4UtOxa8wqD1LQnSB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyMTUObYm8HQhG0USp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwcZD7G5PieUqDP4894AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwslGv09An0npXuI8R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzTMXyDuV_27nmXLe94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwc-UeyDf4XOPSxgvZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzt9pa8YQ7j7TDGTZh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzGWN_EYo6g0fBRs2N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzygj_TqS13D1-JjjZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
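The raw response above is a JSON array of coded records, one object per comment, which makes look-up by comment ID a simple dictionary index. A minimal sketch of parsing and validating such a response is shown below; the two sample records are taken from the response above, but the `DIMENSIONS` tuple is inferred from the coding-result table and the full label vocabularies are an assumption.

```python
import json

# Two records copied from the raw response shown above.
raw_response = '''
[
  {"id": "ytc_UgygSxFEi-zp2_T0CF94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzTMXyDuV_27nmXLe94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
'''

records = json.loads(raw_response)

# The four coding dimensions, as listed in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Reject any record the model returned with a missing dimension.
for rec in records:
    missing = [d for d in DIMENSIONS if d not in rec]
    assert not missing, f"{rec['id']} is missing {missing}"

# Index by comment ID so a coded comment can be looked up directly.
by_id = {rec["id"]: rec for rec in records}

print(by_id["ytc_UgygSxFEi-zp2_T0CF94AaABAg"]["emotion"])  # fear
```

In practice the model's output may not be valid JSON at all, which is exactly why the tool exposes the raw response: a `json.loads` failure here points to the comment batch that needs recoding.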