Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Just like the case was for literacy, it's difficult to make such small store own…" (ytc_Ugy4-f6ti…)
- "It is inconceivable how anyone could accept an AI driven machine putting an endo…" (ytc_Ugzhi1URu…)
- "It's surveillance capitalism because it's capitalist companies like Facebook, Go…" (ytr_UgyRJ3ist…)
- "Autonomous weapons ARE BANNED by Geneva Convention (or some other treaty signed …" (ytc_UgxQ52Ruy…)
- "The world might just be better off for it. It's not like we're doing anything to…" (ytr_UggxItRjp…)
- "Thankyou Steven!!!! Finally someone in the elite speaking about the risks and da…" (ytc_UgwoFk9KU…)
- "> point of progression where AI is capable of improving the design of its own…" (rdc_ktt3ljs)
- "That robot looks like that girl in the brit dating show where she had alopecia, …" (ytc_UgzHADfa_…)
Comment
This is interesting. An AI might very well have no desires and therefore not take any action.
The "taking over the world" part is usually about what could happen if the AI is given a task.
Universal paperclips is a great example of this. An AI is given the task to make paperclips, which it does. Some strategies include hypnotizing every human into buying paperclips, so that it gets money to buy more raw materials, and destroying mercury, again, for more raw materials.
youtube | AI Moral Status | 2023-08-21T05:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
  {"id":"ytr_UgxVsyyAAvCY45bh-AN4AaABAg.9tevz5lLQ6f9tf0mAK5nlj","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugynxjjbs5dzR2YxAOd4AaABAg.9terRjfYAcO9tg7V4gbVvb","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytr_UgxWO7pjoCcNbzlKI4t4AaABAg.9teqquaON8J9tfgAexvizE","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxWO7pjoCcNbzlKI4t4AaABAg.9teqquaON8J9tgGHiytZBB","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytr_UgxWO7pjoCcNbzlKI4t4AaABAg.9teqquaON8J9tgPjsRUpt0","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzmC20FGYI5Xqs31SB4AaABAg.9teq--gkWr99tgFf5BBO7t","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxeFyQlh9DyOh7a6B14AaABAg.9tekqrIyogRA9VCTqB_k2T","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxuXE6SoqVhL8x9ltV4AaABAg.9tebtebDZ0Y9tgs3jJODkw","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugz5f30YYziqxVnBwPZ4AaABAg.9teaiIsCx9Y9telClxAoql","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzkNqEKcvQ-Cb-wty14AaABAg.9teYdPQ4lM49tep_1gj2o2","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
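A raw response like the one above has to be parsed and validated before the per-comment codes reach the table view. The following is a minimal sketch of that step; the function name `parse_coding_response` and the allowed value sets are assumptions inferred only from the values visible in this sample, so the real codebook may differ.

```python
import json

# Hypothetical dimension vocabularies, inferred from the sample output above.
# The actual codebook may define additional values.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "fear", "mixed", "approval", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response (a JSON array of records) into
    {comment_id: codes}, skipping records with unknown dimension values."""
    out = {}
    for record in json.loads(raw):
        cid = record.get("id")
        codes = {dim: record.get(dim) for dim in ALLOWED}
        # Keep only records with an id and in-vocabulary values for every dimension.
        if cid and all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            out[cid] = codes
    return out

raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
parsed = parse_coding_response(raw)
print(parsed["ytr_example"]["responsibility"])  # ai_itself
```

Dropping out-of-vocabulary records rather than correcting them keeps the stored codings auditable: a record that fails validation can be re-queued for recoding instead of silently normalized.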