Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
I cannot agree with Yuval Noah Harari about the requirement that humanity needs…
ytc_Ugy-pukuC…
are you a kid? The entire porpuse of the video is show and talk about AI generat…
ytr_UgwdAYy5N…
That’s so Canadian if you to respect a man you dislike for doing something right…
rdc_fn5nil1
The current Ai bubble is all about workforce reduction and is over rated. It nee…
ytc_UgwMgSx3D…
Bwahahah, this was a lot of fun. Deluded, but fun. I do enjoy your productions…
ytc_Ugw_ViY8o…
@EviIPerson No, it was directed towards you
Do you REALLY believe artists just…
ytr_UgwLA1UEW…
Honestly, this whole scenario is pretty unrealistic. Think about it — no matter …
ytc_UgwcMKwjp…
Nah all computers are glitchy and they wouldn’t trust a robot unless it’s proven…
ytc_UgzH-1lPX…
Comment
@theyoungturks I have never disagreed with you all so many times in one week. The drones DO NOT move unless commanded by a HUMAN. They don't leave on their own they don't decide who is the bad guy on their own. Cmon use some common sense. No military leader would release a robot that is gonna go make military descisions for them. Robots don't decide they are TOLD.
youtube
2012-11-26T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzLgHGb32Jea5OFKqF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz7amYdQiGwJljo7894AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx9bcUTIcCBDEsZBRJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz7BqMhhGpcp_rNO8x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9Ggwv4XWRx4nlf0d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzoWwuDpN-21OfQ0U94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzxLhw3d4Cfjm14qZd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxYTAGDrDX2UaHFL414AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw_MFN3II2ieBv1g3l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxiAhLWl3p99-PEriV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
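The raw response is a JSON array of per-comment labels along four dimensions (responsibility, reasoning, policy, emotion). Below is a minimal sketch of how such a response could be parsed and validated before the labels are stored. The allowed values are inferred only from the samples shown here; the actual codebook may define additional categories, and `validate_codings` is a hypothetical helper, not part of the tool.

```python
import json

# Allowed label values per dimension, inferred from the samples above
# (assumption: the real codebook may include more categories).
ALLOWED = {
    "responsibility": {"user", "none", "ai_itself", "government"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "outrage", "indifference", "approval", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # skip entries missing the comment ID
        # every dimension must be present and hold an allowed value
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Example: one well-formed row, one with an out-of-codebook label.
raw = '''[
 {"id":"ytc_UgzLgHGb32Jea5OFKqF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
 {"id":"bad_row","responsibility":"martians","reasoning":"unclear","policy":"none","emotion":"fear"}
]'''
print(len(validate_codings(raw)))  # 1 -- only the first row survives
```

Filtering at parse time keeps malformed or hallucinated labels out of the coded dataset rather than surfacing them later in analysis.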