Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
7:27 honestly, bots are kinda stupid and forgetful. This is not some kind of terribly ingenious and complex technology. It is impossible to maintain meaningful dialogues with them for long. They rarely understand what is being discussed at all. And if they do, they respond with the most vague and general phrases.
Well, in short, I think that “AI bots addiction” is a consequence. They are not able to masterfully manipulate people, forcing them to spend hours on talking with them. A person must really dislike something in real life to spend days and weeks on chatting with a bot that does not remember what was 3 messages ago.
youtube · AI Harm Incident · 2025-07-21T10:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugx7xq_WADQmhOtgeNZ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwivpwGzhVMttjkHdV4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxhKFpcHjbD0ZZBVyp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwc7XxRVhddo2PpyPN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyATxxKQBbXHxfCL4d4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxfJD1dHkKUpulLyyh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyW-tmIOwnzjx4fwzN4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzAnp_1lzjhAKCKmOJ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzgc-fhXmLbrv_bXMV4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz19aWhfCVX2pxO_E54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
```
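Since each raw response is a JSON array of objects keyed by comment ID, looking up the coding for a specific comment is a matter of parsing the array and indexing it by `id`. A minimal sketch (using two entries copied from the response above; the dict-based index is an illustrative choice, not part of the tool itself):

```python
import json

# Two rows copied verbatim from the raw LLM response above.
raw = '''[
  {"id": "ytc_Ugx7xq_WADQmhOtgeNZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwivpwGzhVMttjkHdV4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]'''

# Index the coded rows by comment ID for constant-time lookup.
coded = {row["id"]: row for row in json.loads(raw)}

row = coded["ytc_Ugx7xq_WADQmhOtgeNZ4AaABAg"]
print(row["responsibility"], row["emotion"])  # prints: ai_itself outrage
```

The same index makes it easy to cross-check a row against the "Coding Result" table for any inspected comment.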