Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
@iris4-j8r No problem, that mistake is often made. It's true that it can …
ytr_UgyVJvqag…
I think he’s being very clear. It’s just such a strange concept. What’s goin…
ytc_Ugz8PlCBz…
They're not wrong. Ai art is better than a banana on a wall. But the only reason…
ytc_UgwRyj_lP…
LLMs steal dude, did you miss the point of the video? Oh you did, great job.…
ytr_Ugx2Jcome…
Geez, now AI just needs sentience, understanding, actual memory, awareness of it…
ytc_UgxufYnb7…
In relation to whether right and wrong existed before humans did, here's one of …
rdc_dbw10dz
Screw AI, (to any ai art channels, yes you heard that right) AI art has done not…
ytc_UgzMfA9Cx…
Well yeah, AI is a very fancy autocomplete and it's ok. What's not ok - is that …
ytc_UgzNMhImp…
Comment
@AI_In_Context Don't forget that his "concern" surrounding A.I. was, at least according to him, the initial impetus for neuralink- the idea that you need a brain-machine interface that transcends language for an A.I. to truly _get_ what a human being wants and respond accordingly.
youtube
AI Moral Status
2025-10-30T20:0…
♥ 32
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
  {"id": "ytr_UgzBT387s47sOwcPs1Z4AaABAg.AOuzlSWvmj5AOwDFgx-cK1", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugwi7Qu-Q2vLXsVZsfV4AaABAg.AOuzXkdEEnJAOw2p3xmKUs", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_UgzZtvGMWRhoEpeBmQB4AaABAg.AOuz65l7sEnAOv1oJdpryn", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_UgzZtvGMWRhoEpeBmQB4AaABAg.AOuz65l7sEnAOv1scxVXZN", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgzZtvGMWRhoEpeBmQB4AaABAg.AOuz65l7sEnAOv2_Whq0Wm", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_UgzZtvGMWRhoEpeBmQB4AaABAg.AOuz65l7sEnAOv2j47kuHW", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgymJuLeNqo5aW-ONxh4AaABAg.AOuyo4FEdDqAOv-NSCiGSt", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgymJuLeNqo5aW-ONxh4AaABAg.AOuyo4FEdDqAPzviDnf765", "responsibility": "none", "reasoning": "mixed", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytr_UgzWno767nWZhBfYPcd4AaABAg.AOuyWeLhbX4AOvE5gV7R3r", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_Ugzbpe_VtRtLrfYT2q14AaABAg.AOuy9JwWL3RAOv4mi7BeIE", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
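The raw response is a JSON array with one coded record per comment, each keyed by its comment ID. A minimal sketch of the "look up by comment ID" step, assuming the response arrives as a JSON string (the function name `index_by_id` is illustrative; the two embedded records are copied from the response above):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = """[
{"id":"ytr_UgzBT387s47sOwcPs1Z4AaABAg.AOuzlSWvmj5AOwDFgx-cK1","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgymJuLeNqo5aW-ONxh4AaABAg.AOuyo4FEdDqAOv-NSCiGSt","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse the raw model output and index coded records by comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

codes = index_by_id(raw_response)
rec = codes["ytr_UgymJuLeNqo5aW-ONxh4AaABAg.AOuyo4FEdDqAOv-NSCiGSt"]
print(rec["responsibility"], rec["emotion"])  # distributed fear
```

Indexing by ID this way also makes it easy to join the model's codes back onto the original comment table, since the same IDs (`ytr_…`, `ytc_…`, `rdc_…`) appear in both.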