Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgyUP1VgW…: "And they've just started to be trained on YouTube videos. I hope they'll drown i…"
- ytc_UgxkPUCtK…: "Mark it here first: The best sign Computers have become intelligent is when we …"
- ytc_UgyeeLmmH…: "Just as soon as AI takes our jobs we will no longer be able to afford to purchas…"
- ytr_Ugzcb3S5T…: "Manufacturing isn't as automated as you believe it is. Outside of automobile, pa…"
- ytc_Ugy3iBu7X…: "I had to laugh out loud when a trump administration official said when asked by …"
- ytc_UgwSVmF3J…: "Silly liberals, you will start to hate AI when they prove that 2+2 does not equa…"
- ytc_UgxKNHuOt…: "All the while AI is learning more about how humans think. How can you program em…"
- ytc_Ugz45GmtJ…: "No one ever discusses the possibility of an AI quickly solving the mystery/meani…"
Comment
It's important to understand that chat bots don't have a sense of "self". They're just inputs and outputs just like any other program. If a chat bot seems smart it's because it's generating something that someone has already said in some capacity. That's it. They're not intelligent they're only capable of mimicking intelligence. They're more like a mirror than anything else.
youtube · AI Harm Incident · 2025-11-25T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyBT9poAAMTZCikqcZ4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgzgUe6Zwi3KFYjq-4Z4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugwp6itWhN9NK_yJWU14AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxLTecsnkYpLpPn0rF4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgylpSPoKXckh-4WczZ4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgwzQV3gRkI-5pjk4Nh4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgzQwY7JjRucGYFY1bJ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban",       "emotion": "fear"},
  {"id": "ytc_UgzRnD2Me5GfRxcS0nR4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_Ugx5KvVz2ofbLUET79p4AaABAg", "responsibility": "user",        "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugx-sYy84YtMcKXJ_tN4AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "regulate",  "emotion": "outrage"}
]
```
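A minimal sketch of how this response shape could be consumed downstream: the model returns a JSON array with one object per comment, keyed on `id`, carrying the four coding dimensions from the table above (responsibility, reasoning, policy, emotion). The parsing code below is illustrative only, not the tool's actual implementation, and the two inline records are abbreviated copies of entries from the response above.

```python
import json
from collections import Counter

# Abbreviated copy of the raw model output: a JSON array of per-comment
# codes, one object per comment ID (two of the ten entries shown).
raw = '''[
  {"id": "ytc_UgyBT9poAAMTZCikqcZ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwp6itWhN9NK_yJWU14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]'''

codes = json.loads(raw)

# Index by comment ID, mirroring the "look up by comment ID" workflow.
by_id = {row["id"]: row for row in codes}
print(by_id["ytc_Ugwp6itWhN9NK_yJWU14AaABAg"]["emotion"])  # fear

# Tally a single dimension across the batch.
emotion_counts = Counter(row["emotion"] for row in codes)
print(emotion_counts)
```

Indexing by `id` makes each coded comment retrievable in constant time, which is what the comment-ID lookup above relies on.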