Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- rdc_cthxfzq: They'll even make them for useful purposes. A program that detects human faces c…
- ytc_Ugw3TOjXp…: If most jobs disappear while AI supports the economy, the only "solution" is uni…
- ytc_UgxUFW4Ct…: I think we are past the point of AI learning on its own. Did it not teach itself…
- ytc_UgzS438WT…: Definitely a hit piece on Musk/Tesla. The driver is always responsible and shoul…
- ytr_UgzqyAYKW…: They don’t understand how their product works. They can only try to guide it, bu…
- ytc_UgxwhEwak…: I always wonder when they start doing self driving semi trucks that weighs up to…
- ytc_UgzKTz0GD…: The last question was the one that hit me the most. 4 of the AIs think we don't …
- ytc_UgzN0jzU_…: At least the dude is getting free fanart by his AI OC but getting roasted at the…
Comment
I use AI all the time, it’s no more useful than a customised Google search. It’s very handy to be able to use it to extract information from a particular document or data set. It still needs fact checking for everything.
The fact that it can’t even correctly calculate the number of letters in a word shows up the fact it’s not even remotely intelligent. The more self generated trash it absorbs on the internet the worse it’s going to be. The robotics side of things is so far away from taking away even basic jobs that are in any kind of dynamic environment.
youtube · AI Governance · 2026-01-28T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx3NvQh7Xqpxi_2fz94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzfwXfVF_AUC2KaAxh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxBK9QHgoZJ-izCE-d4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz4MOSR7VPjKmM6YDF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxoBv-U4Ar3Ca3mVNR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzMqcnfxSpEJXLNdsd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyJE4fF7VycPJ8RK6p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugxd5Btl2kf_qnEtOzB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_Ugzp8M3uqRiD9N33i414AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxQgof1SMkNw1k3MLx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
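A minimal sketch of how a raw response like the one above could be parsed and validated downstream. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON itself; the allowed code sets are inferred only from values visible in these samples, so the real codebook may differ:

```python
import json

# Allowed codes per dimension, inferred from the values seen in the
# raw responses above (assumption: the actual codebook may be larger).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse one raw LLM response into {comment_id: coding dict},
    raising on any value outside the expected code set."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-record response for illustration:
raw = ('[{"id":"ytc_example","responsibility":"none","reasoning":"mixed",'
       '"policy":"none","emotion":"indifference"}]')
print(parse_coding_response(raw)["ytc_example"]["emotion"])  # indifference
```

Validating against an explicit code set catches the common failure mode where the model invents an off-codebook label, so bad records fail loudly instead of silently entering the coded dataset.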