Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Why not? It takes human error and our comparatively slow reaction time out of th…" (rdc_ep1o1ak)
- "Lucy… I need to answer you carefully, truthfully, without slipping into the narr…" (ytr_UgwTX3T7Y…)
- "Human beings children are starving dying being killed by disease because there's…" (ytc_UgwQmMcoB…)
- "Pushing for AI is Narcasism and self destruction for the pretend goal of playing…" (ytc_UgytPLkH5…)
- "What if Super Intelligent AI already exists; decentralised and using the surveil…" (ytc_UgzhaUp7j…)
- "Oh please, there are many artists who are absolutely attacking AI as a whole. No…" (ytr_UgzFW3QlH…)
- "Ai lies. That’s everything anyone with two brain cells needs to know. The indivi…" (ytc_UgyXmRecP…)
- "Imagine sleeping with her and doing sex , suddenly electrocution to death becaus…" (ytc_UgxW35P5Z…)
Comment

"Human drivers in the U.S. kill another human being, on the average, about every 5 hours. Are you going to tell all human drivers in the U.S. to stop driving as well? That would make more sense than what you're demanding, since at least so far, Waymos haven't killed anyone ever."

Source: youtube · Posted: 2025-12-09T22:5… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_Ugx6DD5dEYKD5wZ3yM94AaABAg.AQUaUI01C9fAU8QOriwiDj","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgwR8wTBv9jAgnB8rEx4AaABAg.AQU1tKIoV_QAS4VrXyVTL3","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzdR-DdIw51otMAnN94AaABAg.AQSqfQO02ggAQU-cOOLWoi","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxxDJSUmMbBVbaSML14AaABAg.AQSoASvcTM9AQXP3ZB5mpm","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgyUH039-F3PO-nZOjd4AaABAg.AQSi6e28fZAAQXLwtfTFao","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugxh7_W73mPJwYCOWDt4AaABAg.AQSf2JpLXERAQXMz2cofuV","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_Ugy7meB8mM1SrSpk4gl4AaABAg.AQSQrSSWutgAQSqn_d3hh4","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugy7meB8mM1SrSpk4gl4AaABAg.AQSQrSSWutgAQTSL_vi28B","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgyEHzufnPxs75QJhvd4AaABAg.AQSOIMib6XuAQXNT7TIzM4","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_UgyzwHYG_LIWF5ckiUJ4AaABAg.AQSAoqXOVRGAQTtmtUYEZp","responsibility":"unclear","reasoning":"mixed","policy":"ban","emotion":"mixed"}
]
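The raw response above is a flat JSON array, one record per coded comment. A minimal sketch of how such output could be parsed and sanity-checked before use — the `ALLOWED` value sets below are inferred only from the samples on this page and the coding-result table, so they are an assumption, not the full coding scheme:

```python
import json

# Assumed value sets, inferred from the sample records shown above;
# the real coding scheme may include values not seen here.
ALLOWED = {
    "responsibility": {"none", "company", "government", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability", "regulate", "ban", "industry_self", "unclear"},
    "emotion": {"approval", "outrage", "fear", "mixed", "indifference"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it has an "id" and every coded dimension
    holds one of the values seen in this page's samples.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical record in the same shape as the array above.
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
print(validate_coding(raw))  # keeps the one well-formed record
```

A stricter version might raise on malformed records instead of dropping them; silently discarding is shown here only because the dashboard already surfaces the raw output for manual inspection.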