Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a coding by comment ID.
Random samples
- i wonder what useful things ai will help mankind achieve in the future! "daves … (ytc_UgzZm94ol…)
- The advantage of AI is not it's intelligence. The advantage has always been it'… (ytc_UgxSscyUn…)
- I don't fear AI becoming sentient. I fear how well they are programmed to convi… (ytc_UgwLbfui8…)
- I would also point out, because they've read everything and they are good at rol… (ytc_Ugx4fI-uF…)
- "teach a robot to think for itself!" "Never mind it's thinking things we don't l… (ytc_UgzH1L1DT…)
- I love allot of AI like orange cat or the haggis foundation but it needs to be o… (ytc_Ugx-2snk9…)
- Only way will be to tax those companies using AI and go on with high universal i… (ytr_UgzkEIhwE…)
- I asked the LLM on microsoft 365 if it wanted to become human. It answered lets … (ytc_UgwBt-r4d…)
Comment
I agree with almost everything he said. Just one small passing remark irritated me. The idea that UBI in today’s world would be incredibly expensive and disincentivize work is a myth. Not only is it wrong, the effect according to most research is opposite. People find more joy in work when it doesn’t take so much space in your life, and the long term societal cost would be outweighed by the societal (and financial) benefits due to for example less crime, which is very expensive. I think as an economist who has spent a lot of time in AI research, he might not be so well versed in behavioural economics.
youtube · AI Jobs · 2025-08-10T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxlKDlmqZEepfbeCN54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwybppBdlCMDctHeYJ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwMANoluGf6Sv2gR114AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw5NeOvDXOogYHZIsZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz3ZHWOkTHtTEvefgZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyMQXAyjIRuCVO1Tzx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxIZ5KyZzTVB5sb-Ml4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzlNwrnmtTkDSao_FR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyuD1_IUz_giGWIOK54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw7mEUmMDuP8Zj0iN14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
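A response like the one above can be parsed and indexed by comment ID, which is how a per-comment lookup works against the raw output. A minimal sketch (the `raw` string is an excerpt of the response shown here; the variable names are illustrative, not part of any tool API):

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings.
raw = """[
  {"id":"ytc_UgxlKDlmqZEepfbeCN54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz3ZHWOkTHtTEvefgZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw)}

record = codings["ytc_Ugz3ZHWOkTHtTEvefgZ4AaABAg"]
print(record["emotion"])  # approval
```

If the same ID appears twice in a response (e.g. across retried batches), the dict comprehension keeps the last occurrence, so deduplication order matters in a real pipeline.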