Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "If AI is used for augmentation, GREAT....if you're using it thinking you can rep…" (ytc_Ugxll5s4z…)
- "Nowhere in this video did it discuss the cost of this AI technology to the compa…" (ytc_UgwVRQDGc…)
- "How about program self driving cars NOT to tailgate. Ideally the distance betw…" (ytc_UghvD0lAX…)
- "Would love to see two AI and ML robots interactions, there conversations and beh…" (ytc_UgyA8TRFN…)
- "I saw a meme on r/ProgrammerHumor recently which describes my feelings toward th…" (rdc_lz5nzau)
- "Dear Host, These AI Communities as well as Current Medical Sciences even doesn'…" (ytc_Ugy_I1ahs…)
- "Yeah sure and Tesla stole the beeping sound coming from the cameras pissing them…" (ytr_Ugw5Z2i3d…)
- "As an actually good artist ai shouldn't be used for art, mainly because the enti…" (ytc_Ugz4OR03W…)
Comment

> The problem I see with AI is it will never say "I don't have enough data or training to answer your question accurately." It will just convincingly and authoritatively start making stuff up. And depending on the prompt, will answer different ways in an attempt to please the user.

youtube · AI Harm Incident · 2025-11-28T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwzF9zeZ06MzGAwk2l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxU-_9auTarJyXTjXh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw1bf7r9wcDPUdHz3l4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwslIcni6KMOtM3v3x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwdF9lGGKSwKB4v_Kh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxfiYEqf2_b_2QxdKx4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyuxSjCKl-aGtGGSfF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx3Pi4FxBsxJhhLs8Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwul8TcYLbk1-zVyV14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyhRNyKSG8tULfGYG14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
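A batch response like the one above can be parsed into per-comment codes and validated against the codebook before storage. The sketch below is a minimal example, assuming the allowed category values are those visible on this page (the real codebook may define more); the function name `parse_codes` is illustrative, not part of any pipeline shown here.

```python
import json

# Allowed values per dimension, inferred from the codes visible on this page.
# Assumption: the actual codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    dropping rows with a missing ID or out-of-codebook values."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # no comment ID to attach the codes to
        codes = {dim: row.get(dim) for dim in ALLOWED}
        # Keep the row only if every dimension holds a known category.
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded

raw = ('[{"id":"ytc_UgwzF9zeZ06MzGAwk2l4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"fear"}]')
print(parse_codes(raw)["ytc_UgwzF9zeZ06MzGAwk2l4AaABAg"]["emotion"])  # fear
```

Rejecting out-of-codebook rows rather than coercing them makes it easy to count how often the model drifts from the requested categories.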