Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Tesla needs to make a company similar to Waymo with FSD as good as this.…
ytc_Ugzbk6xd5…
Uh no actually in terms of resources consumption it is cheaper then farming whea…
ytr_UgyOuZaIC…
They say automating jobs will lead to growth in different fields, but companies …
ytc_UgxKjL406…
I'm having a hard time believing this was a real chat with AI. with all those "…
ytc_UgxY6Yd8c…
You won't lose your jobs to AI. WHO would the companies sell to if no one has mo…
ytc_Ugx6qdR0L…
I use AI, but mainly as a lazier mans search engine for articles. Or to make my …
ytc_UgygMvnQE…
So, I have dysgraphia which makes it pretty much impossible for me to draw curve…
ytc_UgyAzrL05…
Autonomous driving technologies are destined to improve and prevent collisions s…
ytc_UgxytBM0Y…
Comment
We went from back propagation to philosophy we skipped a few steps in between the debate lies in if we get AI to make up its own data (i.e synthetic data) in the absence of real world experimentations built into simulations, one still can't get past the fact, for completely new theories one still needs physical world experimentations, in the absence of which AI could generate an enormous amount of what we call in the human world crazy ideas. E.g. AI could think it created a black hole but in reality it might have created a giant fart in a box. The reality is that AI still is unable to consistently do some pretty basic human tasks. So there is a real disconnect in the extrapolations vs the actual on the ground progression.
youtube
AI Moral Status
2026-03-09T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgzXWW0M2y3hmFbP1MB4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwDT4XpxL_jo_dNMSN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzfl-Au565Lkc9lxgN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwGh8JonO5_vBzbH1p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx_MQ4yn-i8coAzn_d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYcj7I8-ac4tsQJEJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxIYMczHXjNDauY-9l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxAY5seDhBavNnrr2R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwuiHQvbaxvHSmEtVJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxyglF3hGwETtqBZal4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}]
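The raw response above is a JSON array with one object per coded comment. A minimal sketch of how lookup by comment ID could work over this shape (the array structure and field names are taken from the response above; the indexing approach is an assumption, not the tool's actual implementation):

```python
import json

# Two entries copied verbatim from the raw LLM response above
raw = '''[
  {"id": "ytc_UgzXWW0M2y3hmFbP1MB4AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwGh8JonO5_vBzbH1p4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]'''

# Index the coded dimensions by comment ID for constant-time lookup
codes = {item["id"]: item for item in json.loads(raw)}

code = codes["ytc_UgwGh8JonO5_vBzbH1p4AaABAg"]
print(code["responsibility"], code["policy"], code["emotion"])
# developer liability fear
```

Keying on `id` also makes it easy to detect duplicate or missing codes when joining the LLM output back to the original batch of comments.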