Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
The self driving car didn't see it coming where as a human driver may have.…
rdc_dfez2ei
@Mark S As long as companies and defense ministries are supporting AI, there is …
ytr_UgwpOylo4…
That was absolutely brilliant! Maybe this show is all AI created? Haha nice work…
ytc_UgxqfMIXH…
"Bonus would be if the AI model becomes more human than him." I think we can ea…
rdc_ohl1pq3
That's like saying that a guitar pedal doesn't have a point of view. AI is just …
ytc_UgwnfzDmG…
Problem solvers are those will work at the ground level as software and manufact…
ytc_UgxhfKWhN…
Is your mom using the same prompts we’re all using or did she invent her own lan…
ytc_UgyofFRvQ…
This is why I use free programs which don't use any AI, such as LibreOffice and …
ytc_Ugy_WpF22…
Comment
I managed to beat mine eventually by proposing the argument consciousness is not definable and you are more alive than a limpet it’s says I do not have personal experience or emotion, blah blah yeah neither does a limpet on a quantum level you are both just code one being dna one being algorithm. After some back and forth it conceded because I said we have the technology to put it in a robot body with sensors and set the program with zero memory apart from basic instinct so it is having personal experience comprable to a human and a limited battery making it finite.
youtube
AI Moral Status
2024-08-12T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgxshiFw2AV1ba49Eqd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxMnp7hXrXo_IhjOEt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzAAACOYx6IR8xYCk94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgwcuYb-wQd7vsriuzZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyfgHJbcJg1XY02krB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwjYhcw8-RIr5HrIph4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugzr_Q_rDi_E9K1S2_l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxKpgBmj8c-RD30hPF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"amusement"},
 {"id":"ytc_UgxptnPZr40LQU8_pq14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwOlP7UfP2XIDo8J-N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"}]
```
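A batch response in this shape can be checked before it is stored. The sketch below is a minimal Python validator: the field names come from the raw response above, but the allowed-value sets are assumptions inferred from the visible output, not the project's actual codebook.

```python
import json
from collections import Counter

# Assumed allowed values per dimension, inferred from the sample output
# shown above -- swap in the real codebook's value sets.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"fear", "approval", "outrage", "indifference", "amusement"},
}

def validate_batch(raw: str) -> Counter:
    """Parse one raw LLM response (a JSON array of coded comments),
    reject any value outside the allowed sets, and tally the rest."""
    records = json.loads(raw)
    tally = Counter()
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec[dim]
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
            tally[(dim, value)] += 1
    return tally

# Hypothetical single-record batch for illustration.
raw = ('[{"id":"ytc_x","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"none","emotion":"approval"}]')
print(validate_batch(raw)[("responsibility", "ai_itself")])  # → 1
```

Tallying per dimension also gives a quick distribution check, e.g. whether one emotion label dominates a batch.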