Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgxdmL1of…: "Saying we need to develop AI so that we can fight against its misuse in the futu…"
- ytc_Ugz_jQzq2…: "I believe he's exaggerating to generate headlines. Current large language models…"
- ytc_UgwGAzkLq…: "To the folks who keep telling artists that AI will take over and artists should …"
- ytc_UgxhMoVe_…: "As someone who crochets, AI is so frustrating. No I don't care how "inspired" yo…"
- ytc_Ugx4nQxbQ…: "I said it years ago. We can never let Ai, robots, software, hardware have total …"
- rdc_m9grnyb: "I think people are mistaking the implications of the run locally aspect. Any org…"
- ytc_UgwiNXSoS…: "Comparing AI to man is like comparing Einstein to an amoeba. The intelligence ga…"
- ytc_UgwjPPuyA…: "The fact that we still kill each other in ridiculous things like warefare and ha…"
Comment

> a good number of top AI researchers are already of the opinion that LLMs are not simply "digital parrots" but do build a world model from which the derive actual understanding, at which point we have to look for a distinguishing factor between AI and our brains, because there really doesn't seem to be something all that much distinct.

Platform: youtube | Source: AI Moral Status | Posted: 2023-08-23T17:0… | ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzKlY65SXcKHhNEK9N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwagoUoJ-EBSEC3cvR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwKv_aqEvTReZIuXaZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzrTtzXAF1e9NjFOgx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwG4GbfxFGVgjcX5-F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxHgROZdDg89kDFoet4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxS4wegPYx9Pli_bFZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyFUj6-LR78pHUQcMF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz3tH-DOe8RzE4p-OZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzQQsDrBswUTBhsfNt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
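Since the raw response is a plain JSON array of per-comment codes, the look-up-by-ID view above can be reproduced in a few lines of Python. This is a minimal sketch, not the tool's actual code: the variable and function names are illustrative, and the sample uses two entries copied from the response above.

```python
import json

# Two rows copied from the raw LLM response above; in practice this string
# would be the full model output.
raw_response = """
[
  {"id": "ytc_UgzKlY65SXcKHhNEK9N4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxS4wegPYx9Pli_bFZ4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

# Index the array by comment ID so each coded comment can be fetched directly.
codes = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions (responsibility, reasoning, policy,
    emotion) for one comment ID."""
    return codes[comment_id]

result = lookup("ytc_UgxS4wegPYx9Pli_bFZ4AaABAg")
print(result["responsibility"], result["emotion"])  # user fear
```

Keying the rows by `id` also makes it easy to cross-check that every coded ID actually appears in the sampled comments, which is useful when validating batch output from the model.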