Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Biggest issue is time. You cant simply hop careers in a day or two; AI can in a …" (ytc_UgyYiHK-T…)
- "@BlueItulipsand that's why those people will keep using Ai, you're not creating…" (ytr_UgxRL6Rtq…)
- "Someone said it won’t last cause humans can multitask …😭😭😭 no buddy we make more…" (ytc_UgwnmD2yC…)
- "That’s great / Just tell me who do I have my attorney sue? / Waymo / Software dude / Fir…" (ytc_UgxgAH8VI…)
- "Tens of thousands are already jobless, thanks to AI. More to follow. But a few a…" (ytc_Ugy2xpXcr…)
- "you can tell this video is also deepfaked due to charlie having a black t-shirt…" (ytc_UgxRMdEB2…)
- "This is horseshit. Pure and utter scaremongering. There's no such thing as AI. A…" (ytc_UgwMyjVGm…)
- "Threatening a person with death would make them act irrationally. Most people wo…" (ytc_UgwxTFNjh…)
Comment: "@Wxzlf_Mxxn A lot has changed since a year ago, the easiest method right now has to be Msty, which is a GUI application for downloading and running LLMs so even noobs can use it"
Platform: youtube
Video: AI Moral Status
Posted: 2024-10-12T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytr_UgxVsyyAAvCY45bh-AN4AaABAg.9tevz5lLQ6f9tf0mAK5nlj", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugynxjjbs5dzR2YxAOd4AaABAg.9terRjfYAcO9tg7V4gbVvb", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgxWO7pjoCcNbzlKI4t4AaABAg.9teqquaON8J9tfgAexvizE", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgxWO7pjoCcNbzlKI4t4AaABAg.9teqquaON8J9tgGHiytZBB", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytr_UgxWO7pjoCcNbzlKI4t4AaABAg.9teqquaON8J9tgPjsRUpt0", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_UgzmC20FGYI5Xqs31SB4AaABAg.9teq--gkWr99tgFf5BBO7t", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgxeFyQlh9DyOh7a6B14AaABAg.9tekqrIyogRA9VCTqB_k2T", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxuXE6SoqVhL8x9ltV4AaABAg.9tebtebDZ0Y9tgs3jJODkw", "responsibility": "unclear", "reasoning": "virtue", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugz5f30YYziqxVnBwPZ4AaABAg.9teaiIsCx9Y9telClxAoql", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzkNqEKcvQ-Cb-wty14AaABAg.9teYdPQ4lM49tep_1gj2o2", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "resignation"}
]
```
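For downstream use, a raw response like the one above can be parsed and indexed by comment ID, with a sanity check on the coded values. A minimal Python sketch, assuming the categorical codes are limited to the values visible on this page (`index_by_id` and `ALLOWED` are illustrative names, not part of the tool):

```python
import json

# Allowed values per coding dimension. These sets are assumed from the
# values visible in this page's Coding Result table and raw responses;
# the actual codebook may define additional codes.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed", "resignation"},
}

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) and
    index the records by comment ID, skipping any with unexpected values."""
    out = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items()):
            out[rec["id"]] = rec
    return out

# One record copied from the raw response above.
raw = ('[{"id":"ytr_UgxeFyQlh9DyOh7a6B14AaABAg.9tekqrIyogRA9VCTqB_k2T",'
      '"responsibility":"none","reasoning":"unclear",'
      '"policy":"none","emotion":"approval"}]')

coded = index_by_id(raw)
print(coded["ytr_UgxeFyQlh9DyOh7a6B14AaABAg.9tekqrIyogRA9VCTqB_k2T"]["emotion"])  # approval
```

The resulting dictionary supports the same "look up by comment ID" workflow this page offers, and the value check guards against a model emitting codes outside the schema.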