Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgwTknYs2… — "My best guess is that the cases pulled by ChatGPT were likely used in some TV sh…"
- ytc_UgxiGXDM6… — "I remember a comment in the movie, the matrix, where they said they had to inter…"
- ytc_Ugy_uns5w… — "You gave it parameters in which it responded as a hypothetical. I'm curious to k…"
- ytc_UgyAamgsf… — "Sir, you are telling us your problem of how ai is making a change, how much imp…"
- ytc_UgwiAdD6Y… — "Large Language Model is a term I find akin to Long Gun. Blah, Blah, Blah. . . . …"
- ytc_Ugx1EWguQ… — "As a programmer, no. Maybe fewer places that would accept lower level programmer…"
- ytc_UgwTJDCaC… — "So AI will go mining for different minerals, process them and build roads? Or wi…"
- ytc_UgwvmITzv… — "Krystal, when a company releases a new device, they can either say its 'novel' a…"
Comment
I think the question is not whether when AI will actually be conscious (because it's probably impossible to verify), but rather "when can humans no longer believe the AI is not conscious?" because the AI is ours to make and use, it is ours to decide when they think on our level. Similarly, slavery was abolished not because slavery stopped being bad (since it always has been), but rather because those who weren't experiencing slavery could no longer excuse it.
youtube · AI Moral Status · 2024-05-20T19:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwFM7MG7t5X17OiHtZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw9FGfNr3A6TvxxW694AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw_q2qIvn96v2hUqcV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx9YtSzZab_8isGAEp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyZpq5ZqmyLrq-s5gN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzPn85aV_VXhgSzD654AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxjHpSsjA-r2HM-D5N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyqHbKQeeGJfTz8kNV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzc-9KlZs7RibTgLBJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzfrIpYhPePWFdXnwl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}]
```
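The raw response above is a JSON array with one coding object per comment, which makes "look up by comment ID" a simple matter of indexing the parsed array. A minimal sketch in Python, assuming only that the model output is valid JSON in the shape shown (the two entries below are taken verbatim from the response above; everything else is illustrative):

```python
import json

# Raw model output: a JSON array with one coding object per comment
# (abbreviated here to two of the entries shown above).
raw_response = """[
  {"id": "ytc_UgwFM7MG7t5X17OiHtZ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgyqHbKQeeGJfTz8kNV4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "unclear", "emotion": "indifference"}
]"""

# Index the codings by comment ID so any coded comment can be
# inspected directly, as the page's lookup widget does.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgyqHbKQeeGJfTz8kNV4AaABAg"]
print(coding["reasoning"])  # contractualist
print(coding["emotion"])    # indifference
```

The dict comprehension assumes IDs are unique within a batch; if the model ever repeats an ID, the later entry silently wins, so a production pipeline would want to check for duplicates before indexing.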