Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "I think that we should drop A.I because it will take over all mankind and render…" (ytc_UgyzZTGen…)
- "You’re comparing healthcare to IT jobs. Healthcare got slim margins for errors, …" (ytr_UgxQrw-eG…)
- "@elegantcourtierIt will probably be a race to advance AI to do senior level role…" (ytr_UgytAHZEs…)
- "Robot: I will destroy humans / Us: ahhh u think we are gonna scream bring the wat…" (ytc_Ugyifopyq…)
- "UBI Will not happen long term. It will only happen until the corporations and th…" (ytc_Ugx6h1wRo…)
- "It’s not a great turn. I wouldn’t consider the AI for a passing grade. The other…" (ytc_UgyEAhu4H…)
- "A lot of these thing is because people use general use gen-ai. I’ve found that s…" (ytc_UgxsgZKQE…)
- "klarna moved all their staff out to ai now they want them to return. / I'm open …" (ytc_UgxvQm41g…)
Comment

> No. First off, we don't need to understand human conscience for AI to become conscience. (As humans, we already understand very little.) And as far as AI "deducing", well AI already does deduce, it is induction, inducing, that would raise the AI game. And when AI gets better at inducing (which it is already good at), then we can start talking about "artificial conscience".

youtube · AI Responsibility · 2026-01-25T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_UgymrvWn3dt-3GmTNxd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy5fC5dGZUJA7YPYSd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugxyaox9oKP7fQvRXOl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwAOX4QkbmEOG9QrIJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw7qjTXYavP7LSTrJ54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyf5k6zbixe4p5VQbF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxNHug172RgiM0tqeN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz0ygvGNWGFdtw1Td14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwpUpJtf1J85M8r2sF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwO3Z9oTF58lfCO9Pp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
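A batch response in this shape can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON above, but the allowed value sets are assumptions inferred only from the codes visible in this sample and may not match the real codebook.

```python
import json

# Allowed codes per dimension. ASSUMPTION: these sets are inferred from the
# values visible in the sample response above; the actual codebook may
# define additional codes.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "distributed"},
    "reasoning": {"unclear", "consequentialist"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"indifference", "outrage", "fear", "mixed"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw batch-coding response, keeping only well-formed records.

    A record is kept when it has a comment ID and every dimension carries
    a code from the allowed set; everything else is silently dropped.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # a record without a comment ID cannot be looked up later
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(rec)
    return valid
```

With the validated records in hand, a lookup table keyed by comment ID (as the "Look up by comment ID" view above implies) is just `{rec["id"]: rec for rec in parse_batch(raw)}`.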