Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytc_UgzAh7OxQ…`: We don't want to answer to robots or ai...we want people ...so stop pushing it.…
- `ytr_UgxxVufKS…`: @A_Psych_Nurse > However, remember that as powerful as AI has become, it's cruci…
- `ytc_Ugwq6yeIe…`: Well that was uplifting. Having said that, I'm not convinced that AI is as scar…
- `ytr_UgyGcBuEI…`: Imagine a world beyond money — where automation handles most production, and wor…
- `ytc_UgwQF739O…`: Sounds like he was just an idiot in general. I used ChatGPT and my own common se…
- `ytc_UgwjBuPEC…`: AI is trained on humans. Humans are terrible. AI will be terrible as a result.…
- `ytr_UgwSP9bfs…`: "ai" is just glorified autocorrect at this point, and is based on content theft.…
- `ytc_UgzLvNiGt…`: Trust me we don’t need AI on this earth but we truly need more humanity…
Comment
3:36 - the problem is everything a LLM tells you is made up, it's just what is predicted to fit whatever you ask of it. It can't reason, it cant think, it can't confirm the validity of the output, it's just easier the "hallucinations" (a poor term as it implies a LLM does not hallucinate most of the time, or implies some form of sentience) to spot when it's wildly inaccurate or incorrect. Also AI != LLM.
youtube · AI Responsibility · 2025-12-17T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzAyB4UkzQDPD1dT5Z4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyJdQc7VpWLV7Obpax4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyWyASXvzut5TnPQrp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy0gvvEv-uNvolHgHZ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugwd7PHqixPGiU76k9t4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "frustration"},
  {"id": "ytc_UgyA-KhnhLpiJ1ZyC1h4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugz_D2Nnqh5G0siPE5B4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwFJbtsw0d_mVbzX5x4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwGvMj1O2A0X7sefx54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgxJo3hRkgUEbNPDovl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
```
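The raw model output is a JSON array of per-comment codes, one object per comment with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields. A minimal sketch of parsing such a response and indexing it for lookup by comment ID (the `raw_response` string below is abridged to two entries from the response shown above; variable names are illustrative, not part of the tool):

```python
import json

# Abridged raw LLM response: a JSON array of per-comment codes.
raw_response = '''[
  {"id": "ytc_UgzAyB4UkzQDPD1dT5Z4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyA-KhnhLpiJ1ZyC1h4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]'''

# Parse the array, then index the entries by comment ID
# so any coded comment can be inspected directly.
codes = json.loads(raw_response)
by_id = {entry["id"]: entry for entry in codes}

entry = by_id["ytc_UgyA-KhnhLpiJ1ZyC1h4AaABAg"]
print(entry["responsibility"], entry["emotion"])  # company outrage
```

Indexing by `id` mirrors the lookup-by-comment-ID workflow: each coded dimension is then one dictionary access away.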