Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples (click any to inspect):

- rdc_nm3jg6y: "Well, let's put this straight. The really rich know something (or more likely pl…"
- ytc_UgxNkLprg…: "Ok, 1, Tom Cruise is broader across the shoulders and chest. 2 serious deep fake…"
- ytc_Ugwlephpo…: "I you can't recognize when a LLM "hallucinates" given an order, you shouldn't g…"
- ytr_Ugxa8futL…: "AI is still super recognizable though, and some people might consider you a chea…"
- ytc_UgyEOLOMD…: "as long as Ai is used for silly stuff, like memes and things like that, I don't …"
- ytc_UgwpSsLV-…: "Tesla Autopilot should never have been allowed on the road in the first place. I…"
- ytr_UgxcaFml_…: "@yellowkillSCi don’t agree with the csmps. The camps are evil. Ai is a feed back…"
- ytc_UgzhVoE_6…: "I want Fully Automated Luxury Space Communism. I do not want Fully Automated Dy…"
Comment

> Hi Sabine, did you watch the recent video by Cool Worlds: We need to talk about AI....? Here he discusses the use of ai in research and it´s implications. It´s eye opening and corresponds with my own ai use cases. Maybe a follow up video by you would be interesting to many.

Source: youtube · Posted: 2026-02-08T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
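A coding result like the table above can be sanity-checked programmatically. Below is a minimal sketch of such a check, using only the dimension values observed on this page (the full codebook may allow more values, so `OBSERVED_VALUES` is an assumption, not the real schema):

```python
# Values seen on this page, per dimension; assumed subset of the real codebook.
OBSERVED_VALUES = {
    "responsibility": {"none", "user", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear", "mixed"},
    "policy": {"none", "liability"},
    "emotion": {"approval", "indifference", "mixed", "resignation"},
}

def check_record(record: dict) -> list:
    """Return the dimensions whose value is outside the observed set."""
    return [dim for dim, allowed in OBSERVED_VALUES.items()
            if record.get(dim) not in allowed]

# The coding shown in the table above, as a record:
record = {"id": "ytc_UgxF8JBtSW_ZnnkT8Zh4AaABAg", "responsibility": "none",
          "reasoning": "unclear", "policy": "none", "emotion": "approval"}
print(check_record(record))  # prints: []  (all dimensions valid)
```

An empty list means every dimension carries a known value; a typo such as `"aproval"` would surface as `["emotion"]`.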
Raw LLM Response
```json
[
{"id":"ytc_UgxF8JBtSW_ZnnkT8Zh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyo708vngfXsRhg9ax4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwnIqWfXs-RnIENVaZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxOBpPx4uHHI1vidCt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxRQ8ZjaLEQYp4R8RR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxzeGQLinLJMO7c3a94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyArDcxhymD_mJVA4Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgypehuOg61jYmua30J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgxhcFgQXdkWFh7cxcZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx2bTwyX5xyN60rJRd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
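Since the raw response is a plain JSON array, looking up a coding by comment ID reduces to parsing it and building a dictionary. A minimal sketch (the two embedded records are copied from the response above; a real lookup would load the full array):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UgxF8JBtSW_ZnnkT8Zh4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgypehuOg61jYmua30J4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

codings = index_by_id(raw_response)
print(codings["ytc_UgypehuOg61jYmua30J4AaABAg"]["emotion"])  # prints: resignation
```

This assumes the model always emits a well-formed JSON array of flat records with an `id` field; in practice a `json.JSONDecodeError` handler would be needed for malformed outputs.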