Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
How about doing some actual number digging. Tesla has billions of road miles. St…
ytc_UgwBhpfrb…
We can all take action at our own scale. For example, you can talk about these i…
ytr_UgxRIebW4…
Stop lying. If you watch the full clip, theyre literally pitching the idea of a …
ytr_Ugy3nm1DK…
What’s stupid is in 2026 alone some almost 700 billion will be spent on AI and y…
ytc_UgxwXDFXw…
Ironic, John Hopkins has a history of night doctors and they also took Henrietta…
ytc_UgyJ4dLTx…
Ok but the part that makes me genuinely wonder: did Disney think the hard part w…
rdc_ocpfe9y
Wait! You have to press a button on a potentially broken phone to open the door?…
ytc_UgwAy1arB…
As an electrical and software engineer I hate this kind of misleading bullshit t…
ytc_UgzlpGHnp…
Comment
Interesting but not real. You see although there is a sort of brain behind (neural networks) these dont get trained on how toa ctually think but what they do is try to match as best they can the best combiantion of sentences based on a form of statistics that gets acquired from the weights of the neural networks. So teh AI of today is useful a form of advanced google if you like. It looks certainly very close to humans but it has limited memory even of the current discussion and has no long learning capacity. To learn you have to train it over thousands of examples and again and again. Saying that it is very useful and I believe it is a stepping stone towards generic AI.
youtube
AI Moral Status
2025-06-27T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzpHq7KAhNUKpIl7jl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwXRRFKCHOviB4Wc-14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw4LFfKVWc0C3QWwtZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxWneAZLD6NOfFVnh54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugyer2yv0WzAEFon9-x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxAcUsePInIgGgYVtZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyfSd_9-jBIX76L8a14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwxDFXHTtOcuKC4ylB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxHdkCOoctRmYAwkS54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwnNIjwPjJGIPcNxE94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
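The raw response above is a JSON array with one coding object per comment, which is what makes "look up by comment ID" possible. A minimal sketch of how such a response could be parsed and indexed by ID (the function name `index_codings` and the in-memory dict are illustrative assumptions, not the tool's actual implementation; the two sample rows are taken verbatim from the response above):

```python
import json

# Hypothetical raw LLM response: a JSON array of per-comment codings,
# using two rows copied from the response shown above.
raw_response = """
[
  {"id": "ytc_UgzpHq7KAhNUKpIl7jl4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgyfSd_9-jBIX76L8a14AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "fear"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw JSON response and index each coding by its comment ID."""
    rows = json.loads(raw)
    # Keep the four coding dimensions per comment; drop the redundant "id" key.
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codings = index_codings(raw_response)
print(codings["ytc_UgyfSd_9-jBIX76L8a14AaABAg"]["responsibility"])  # developer
```

An index like this is enough to render the per-comment "Coding Result" table: each dimension/value pair in the table is one key/value pair of the looked-up entry.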