Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I think the biggest tragedy of AI is that people will soon not know what's real of fake. Someone posted a pic of these two adorable birds and people were arguing if it was AI generated or not. That was just about birds! What about current events, politics, etc.? It won't happen all at once, but civilization could unravel because of AI. When we can't agree of fundamental truths, what does that leave us with?
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2024-09-26T20:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxIl_-3hBIBFkGfXSN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwlu7Xh6Zoj1AlHFit4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwFLkjFWDVrf1XxfDB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwRVtuphbt4guTvLrx4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxKEaMltW6dMwtEmvh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyy2zFIDv9-O12zcKF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzYdGLoFOgbdNM_1M94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwCuW3fwZm-TLpRkQd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyHKpyW7ZhwFBbmK5V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxoijAO6nMRHRs2Bbp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
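A response in this shape can be matched back to its comments by parsing the array and indexing it by comment ID. The sketch below is a minimal example assuming the format shown above (a JSON array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys); `parse_codes` is a hypothetical helper name, and the embedded row is a single entry copied from the response for illustration.

```python
import json

# A raw model response, per the format above: a JSON array with one
# object per coded comment. (Single entry copied here for illustration.)
raw_response = """[
  {"id": "ytc_Ugyy2zFIDv9-O12zcKF4AaABAg",
   "responsibility": "distributed",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "fear"}
]"""

# The four coding dimensions plus the comment ID.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(raw: str) -> dict:
    """Index coded rows by comment ID, skipping malformed entries."""
    coded = {}
    for row in json.loads(raw):
        if EXPECTED_KEYS.issubset(row):
            coded[row["id"]] = row
    return coded

codes = parse_codes(raw_response)
print(codes["ytc_Ugyy2zFIDv9-O12zcKF4AaABAg"]["policy"])  # regulate
```

Skipping rows that lack an expected key (rather than raising) keeps one malformed object from discarding an otherwise usable batch.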