Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "i dont understand ngl i agree support real artists but dont just straight up say…" (`ytc_Ugy_IB2br…`)
- "Well...the whole idea behind his advice is that the students must learn to ask o…" (`ytc_UgyAPqkz9…`)
- "Sorry for you future A.I. I never believe them. You all are great. ALMIGHTY BEI…" (`ytc_Ugz_1CIk_…`)
- "Wasn't he working on openAI as a coder? So... maybe he found something malicious…" (`ytc_UgzW4lJC3…`)
- "It reminds me of how kids I tutored could not cite textual evidence or find dire…" (`ytr_Ugz0sVrjQ…`)
- "I don't hate the Ai rather I see Ai as an assistant friend but people who uses A…" (`ytc_Ugw9i-fDM…`)
- "Until a power shortage happens due to a storm and now all the equipment at the f…" (`ytr_UgwKbXoFW…`)
- "Ai bros are always caught beating around the bush - its lazy. as soon as we call…" (`ytc_UgxiZD_dj…`)
Comment
It took an entire universe and 8 billion years to get to the Earth. It took the Earth and 5.8 billion years to get humans. It took humanity and 700,000 years to get to AI. Assuming the roots of AI started in the 1950’s, it’s taken AI only about 75 years to start replacing humans.
youtube · AI Moral Status · 2026-04-05T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxJYEki34MndK0kVIl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzbqoDhIpr9G1oL6i54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyYDiF1GVialVMFPaN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw4meNGb2pgv2q6J3B4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy_-TfhKJVV4zCGM8d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyJtfuJpLKjEU0DUUZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzycjQSHE4DyjwLcPt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzNpKaNRb0GoQcAl4V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzkkDMVslNTkBB0kt14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzalt5UyU_12amqinF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"approval"}
]
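Downstream tooling can parse a raw response like the one above and index it by comment ID for lookup. A minimal Python sketch, using two entries copied from the response (the dimension names match the Coding Result table; the variable and function names are illustrative, not part of the tool):

```python
import json

# Raw coder output: a JSON array of per-comment codings
# (two entries copied from the response above).
raw_response = """
[
  {"id": "ytc_UgyJtfuJpLKjEU0DUUZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzkkDMVslNTkBB0kt14AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"}
]
"""

# Build a dict keyed by comment ID so a coded comment can be
# retrieved in O(1) time, as in the "look up by comment ID" view.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgyJtfuJpLKjEU0DUUZ4AaABAg"]
print(coding["emotion"])  # fear
```

The same index makes it easy to join codings back onto the original comment records by ID.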