Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The beginning of the program promised very much but then it focused too much to …" (ytc_Ugx40mE0R…)
- "Ray Bradbury warned us about censoring books and robot police dogs, and we've ju…" (rdc_jfyi65e)
- "i love how they just made a super cool art piece just to troll a ai artist…" (ytc_UgyilGPVH…)
- "Legally, anything created by AI can NOT be copyright. So it’s not even stealing,…" (ytc_UgxtWOHFT…)
- "I like the thought I heard once that we have just been another step in evolution…" (ytc_UgxdbfJG-…)
- "Meanwhile we're training AI to do our jobs and not solve our social problems. We…" (rdc_j6f3pwa)
- "All of these endless podcasts of mental masturbation about a technology that is …" (ytc_Ugwm_HK6Y…)
- "He does? If so, my stubborn assumption that he has no idea what he's talking abo…" (ytr_UgzX93Ejw…)
Comment

> Like all man's inventions there are positive and negative consequences.
> The advantages of machines and AI are improved efficiencies but with the growing human population on the planet, man will be made redundant which would cause all sorts of psychological problems. A possible consequence could be an AI human war.
> Humans through job losses would need to be supported by government social programs.

youtube · Cross-Cultural · 2025-10-08T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwJjcp_ZjchLKD9DZN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzCS0yQBSmETxumtDl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwcvHhjkCjPo_2XDIJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx_ne51Yc7thmzb3V54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx_eoktA9c7yDYSDRp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx6AY6EzM1qYU6N4Sl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxuw5Tv46JY0wY2llp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxkl7v4UXS-vxGCNHF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxOwuM5PXAONezZ1QJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz6q-pmPBWbA9gTXqd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
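The raw response above is a JSON array of per-comment coding objects, and the page looks up a coding by its comment ID. A minimal sketch of that lookup, assuming only the field names visible in the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the two embedded rows are copied from the response above for illustration:

```python
import json

# Raw LLM response in the format shown above: a JSON array of
# per-comment coding objects, each keyed by a comment ID.
raw_response = """[
  {"id":"ytc_UgwJjcp_ZjchLKD9DZN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwcvHhjkCjPo_2XDIJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def index_codings(raw: str) -> dict:
    """Parse a raw response and build an id -> coding lookup,
    skipping any row missing the ID or an expected dimension."""
    index = {}
    for row in json.loads(raw):
        if "id" in row and all(dim in row for dim in DIMENSIONS):
            index[row["id"]] = {dim: row[dim] for dim in DIMENSIONS}
    return index


codings = index_codings(raw_response)
print(codings["ytc_UgwcvHhjkCjPo_2XDIJ4AaABAg"]["emotion"])  # fear
```

Guarding on missing fields keeps a single malformed row from breaking the whole lookup, which matters when the array comes straight from model output.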