Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples

- That's not right. A CEO is also a compliance or blaming layer. If you put an AI … (rdc_oh1jygz)
- Humans threaten to kill AI and AI threatens to reveal affairs of humans Humans … (ytc_UgxdsQBrB…)
- Imagine having to study years, just got out of uni, being in debt, just to have … (ytc_UgwfxyI01…)
- Fatty with the glasses is a moron- "AI is a force multiplier, not something tha… (ytc_UgyFpm3gT…)
- I feel like we are anthropomorphizing LLMs too much, and it is giving us mislead… (ytc_UgxWPwevF…)
- So AI not only take away jobs but at the same time AI will create new jobs and d… (ytc_UgxtdGSw8…)
- The most unrealistic part of this image isn't the AI-generated fingers; it's the… (rdc_ohz794u)
- ironically taping a banana to a wall takes more effort , decision making and bra… (ytc_Ugy5DOhcD…)
Comment
Well it depends. Is the memory guy just flexing for fun or educational purposes without getting any profit directly related to the reciting? Then no. Did the memory guy turn it into a side hustle, marketing themselves as a useful person using the fact that they can reproduce various different articles? I personally think it is teetering on the gray area at best. Adding to that, the hypothetical memory guy would most likely be paying NYT to get all the articles, since they are treated in the same way as any other human readers. So even if the guy's reciting all the articles as a side hustle, at least they're giving sth back to NYT. ChatGPT is giving absolutely nothing to NYT in return to learning stuff. So I'd say it's legally sketchy to say the least. Simply put: benefitting from the service without rewarding the service provider.
youtube · AI Responsibility · 2026-04-11T17:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
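Each coded record carries the four dimensions shown in the table. A minimal sketch of a sanity check for those values; the allowed-value sets below are inferred from the output visible on this page, not from an official codebook:

```python
# Hypothetical validator. The value sets are inferred from the coded
# records shown on this page, not from an official schema.
ALLOWED = {
    "responsibility": {"user", "developer", "company", "government",
                       "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"approval", "fear", "indifference", "resignation",
                "mixed", "outrage"},
}

def validate(record: dict) -> list[str]:
    """Return the names of any dimensions whose value is not allowed."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The record from the table above passes cleanly.
print(validate({"responsibility": "user", "reasoning": "deontological",
                "policy": "unclear", "emotion": "mixed"}))  # []
```

An empty list means the record conforms; any unexpected or missing value is reported by dimension name.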
Raw LLM Response

```json
[
  {"id":"ytr_UgzRl2hAPTP6N3j7rUJ4AaABAg.AVTPRuQthc5AVZBmPX3zjD","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytr_Ugz9Ny9rqUz3jnIz4iJ4AaABAg.AVTOrrYMrgQAVTZerAEzsf","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytr_Ugy_y21fPDc9Qu5nm_l4AaABAg.AVTNiNuVcK_AVVSNyWo926","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugy_y21fPDc9Qu5nm_l4AaABAg.AVTNiNuVcK_AVVXouqqFxu","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytr_UgwXgbLLahsei2A95i54AaABAg.AVTMoT1L31YAVTd1UjEkmn","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytr_UgwPbBLfXYA84QRk1GR4AaABAg.AVTJZ2uFtJ6AVTYA0icg2Z","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgyX11GP9HY36jb6o_p4AaABAg.AVTGUf7oBV9AVTxrGdkNmU","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_UgzW3KfXGZndxL26V514AaABAg.AVTAOB_zIlqAVVrka2FQA6","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytr_UgyLpQKUwMhVnnHXVQl4AaABAg.AVT7tYD3cnaAVUCZaDv3vL","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgyLpQKUwMhVnnHXVQl4AaABAg.AVT7tYD3cnaAVUDhH9-ogv","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"}
]
```
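Since the raw response is a JSON array of records keyed by comment ID, looking up the coding for a specific comment is a parse-and-index operation. A minimal sketch, using an excerpt of the response above:

```python
import json

# Excerpt of a raw LLM response as shown above: a JSON array of coded
# records, one per comment ID.
raw_response = """
[
  {"id": "ytr_UgzRl2hAPTP6N3j7rUJ4AaABAg.AVTPRuQthc5AVZBmPX3zjD",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "liability", "emotion": "approval"},
  {"id": "ytr_UgwPbBLfXYA84QRk1GR4AaABAg.AVTJZ2uFtJ6AVTYA0icg2Z",
   "responsibility": "user", "reasoning": "deontological",
   "policy": "unclear", "emotion": "mixed"}
]
"""

# Index the records by comment ID for O(1) lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

record = by_id["ytr_UgwPbBLfXYA84QRk1GR4AaABAg.AVTJZ2uFtJ6AVTYA0icg2Z"]
print(record["responsibility"], record["emotion"])  # user mixed
```

The same index also makes it easy to spot IDs that appear in the raw response but not in the coded dataset (or vice versa).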