Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "You automatically get a like from me for all of your videos just for the intros…" (ytc_UgyiHGTaD…)
- "The government should require labels on all AI generated things - AI companies c…" (ytc_UgwsqzFbm…)
- "Sooooo we going to ignore she's a robot? Alsooooooooo, are the Tesla's people dr…" (ytc_Ugzg5vu_x…)
- "I asked Perplexity what percentage of the books in the library of Congress have …" (ytc_Ugyrd4DcP…)
- "WTF they have WiFi . . . Is it smart to allow them to access the internet? So …" (ytc_UgzxJJ701…)
- "I think we could turn things around and say we want the industries to pay us ins…" (ytc_UgzF2XIIA…)
- "I don't understand, literally no one wants AI, everyone is losing their jobs, no…" (ytc_UgxwDNdtF…)
- "AI have no real usage at this point, it's only an extension of the internet, tha…" (ytc_Ugx2jOBp6…)
Comment

> because AI would imitate our own minds in a way, therefore it could imitate the self-preservation aspect as well. it does depend on how we would go about creating AI, though.

Source: youtube · Video: AI Moral Status · Posted: 2023-08-21T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytr_UgxVsyyAAvCY45bh-AN4AaABAg.9tevz5lLQ6f9tf0mAK5nlj", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugynxjjbs5dzR2YxAOd4AaABAg.9terRjfYAcO9tg7V4gbVvb", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgxWO7pjoCcNbzlKI4t4AaABAg.9teqquaON8J9tfgAexvizE", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgxWO7pjoCcNbzlKI4t4AaABAg.9teqquaON8J9tgGHiytZBB", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytr_UgxWO7pjoCcNbzlKI4t4AaABAg.9teqquaON8J9tgPjsRUpt0", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_UgzmC20FGYI5Xqs31SB4AaABAg.9teq--gkWr99tgFf5BBO7t", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgxeFyQlh9DyOh7a6B14AaABAg.9tekqrIyogRA9VCTqB_k2T", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxuXE6SoqVhL8x9ltV4AaABAg.9tebtebDZ0Y9tgs3jJODkw", "responsibility": "unclear", "reasoning": "virtue", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugz5f30YYziqxVnBwPZ4AaABAg.9teaiIsCx9Y9telClxAoql", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzkNqEKcvQ-Cb-wty14AaABAg.9teYdPQ4lM49tep_1gj2o2", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "resignation"}
]
```
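Turning a raw response like the one above into per-comment coding rows is a small parsing step. The sketch below is a minimal illustration, assuming (as the sample shows) that the model returns a JSON array of objects keyed by comment ID with the four coding dimensions; the IDs and helper names here are hypothetical, not part of the tool's actual code.

```python
import json

# Hypothetical raw LLM response, mirroring the structure shown above
# (IDs shortened for illustration).
raw_response = """
[
  {"id": "ytr_example1", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_example2", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]
"""

# The four coding dimensions shown in the results table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse the model output and index coding results by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Keep only the expected dimensions; fall back to "unclear"
        # if the model omitted a field.
        coded[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

codes = index_codes(raw_response)
print(codes["ytr_example1"]["responsibility"])  # -> developer
```

Indexing by ID is what makes the "Look up by comment ID" view possible: each inspected comment maps straight to its coded dimensions.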