Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “I feel like this is a good topic to talk about as a artist I am a lot more used …” (ytc_UgyrQtmJC…)
- “You don’t understand the gravity of this situation in about five years, anybody …” (ytc_UgxbOWZmM…)
- “So it's not AI / They offshore all the jobs to Malaysia and India / They lied and s…” (ytc_UgwyJ3Xcq…)
- “In Silicon Valley for the last 5+ years streets in many cities are “filled” with…” (ytr_UgyomYtHO…)
- “It's not about recognition; it's about what an AI system does with that recognit…” (ytr_Ugx5kkOCD…)
- “Look if everyone has no money... Money cannot circulate... That means NOBODY wi…” (ytr_UgyN4iQxR…)
- “I disagree with something here. I don't think AI is useless in software developm…” (ytc_UgzIaSnN4…)
- “If you intentionally eliminate half of entry level jobs. What do you think that’…” (ytc_UgzjPDxrD…)
Comment
Correction: chatbots don't hallucinate *sometimes*, chatbots hallucinate *always*. That the hallucinations sometimes more or less match reality does not change that. Chatgpt estimates what things should more or less look like after the prompt, but they're not *fact-based*, they just make stuff up based on rules inferred from their corpus.
Platform: youtube · Category: AI Responsibility · Timestamp: 2023-06-10T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugynye34bty3I0GGjmF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzqoliZUMZipyp6dY54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwy5QNuQd6LmTLqQod4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx1mtgAtiKaV7yDRkd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyFV8VoLd-xieQcqnl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw8tTvuxcr8tHD4oCp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy8GdpkifFW_Dftf614AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyTfQzLgNjPtF2Y4E94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzcXgLIgnFo2zc6JXF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxA3RMkuy-A1ZM5KXZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
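The raw response above is a JSON array of per-comment records, one per coded comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how a pipeline might parse and validate such a batch before accepting the codes. The `ALLOWED` vocabularies below are inferred only from the values visible in this sample; the real codebook may include categories not seen here.

```python
import json

# Allowed values per dimension, inferred from the visible samples only.
# The actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "outrage", "mixed", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records that have an id
    and an in-vocabulary value for every coding dimension."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip malformed entries rather than failing the batch
        if all(rec.get(dim) in vocab for dim, vocab in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical two-record batch: the second uses an off-vocabulary value.
raw = (
    '[{"id":"ytc_x","responsibility":"user","reasoning":"consequentialist",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"ytc_y","responsibility":"alien","reasoning":"unclear",'
    '"policy":"none","emotion":"mixed"}]'
)
print(len(validate_batch(raw)))  # the off-vocabulary record is dropped
```

Dropping malformed records (rather than raising) keeps one bad row from invalidating an otherwise usable batch; a stricter pipeline could instead collect rejects for re-prompting.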