Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Yep it will be automated and be just as precise as a professional high paying su…" — `ytr_UgypEzpQ3…`
- "Calling LLM errors "hallucinations" just anthropomorphizes the machine. It's not…" — `ytc_UgxiqSmOa…`
- "@canimanamino2433 but taking a job away from someone who developed a passion fo…" — `ytr_UgyaO-YnO…`
- "Why are AI people stupid? Because they dislike thinking, let the computer think …" — `ytc_UgwAZIf2q…`
- "@videogamer596 You say that as if automated things are good for you. When the ma…" — `ytr_Ugw7ZiLj9…`
- "Ah yes, a commercial AI installed in the DOD with access to confidential input, …" — `rdc_ntceygk`
- "I learn that people this day are so impatient. They want everything now. Serve r…" — `ytc_UgwOhF99b…`
- "Autonomous cars are cool but they're not going to fix traffic. Yall need trains …" — `ytc_Ugy2muqpP…`
Comment

> I remember when smart boards were the new thing. I was teaching children who had never seen the Western alphabet and needed personal interaction, so I saw no usefulness at all and said so. Smart boards lasted about five minutes. I also see no use for AI on a broad scale. People, in general, need that personal interaction to problem solve. If we don't learn to work together, our country will fold.

youtube · AI Moral Status · 2025-06-05T12:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugxy6e4_DQwSRKNkXEZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy41P7lznErheFIH1R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxJq2kHAJ7larxaCLl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxP4MFEBh5WpHuauN54AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzjbzXW0rA_rkvFKfZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy9wyHhOBEhkg5xWrd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyivr_iTxb-J4OpTXN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwx-kSVSmJjcKyVaVp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzW67qR_Upp_BiRnCR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz2PaU39ov8KMk0VYt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
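The raw response above is a JSON array in which each element pairs a comment ID with the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response could be parsed and indexed for lookup by comment ID follows; the `index_codings` helper name is illustrative, not part of the tool, and the example reuses two entries from the response shown above.

```python
import json

# Two entries copied verbatim from the raw batch response above.
raw_response = """
[
  {"id": "ytc_Ugy41P7lznErheFIH1R4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugz2PaU39ov8KMk0VYt4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def index_codings(raw: str) -> dict[str, dict[str, str]]:
    """Parse a batch response and index the codings by comment ID.

    Raises ValueError when an entry is missing its ID or any coding
    dimension, so malformed model output fails loudly rather than
    being silently dropped.
    """
    out: dict[str, dict[str, str]] = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        if cid is None or any(d not in entry for d in DIMENSIONS):
            raise ValueError(f"malformed entry: {entry!r}")
        out[cid] = {d: entry[d] for d in DIMENSIONS}
    return out


codings = index_codings(raw_response)
print(codings["ytc_Ugz2PaU39ov8KMk0VYt4AaABAg"]["emotion"])  # outrage
```

Indexing by ID this way also makes it easy to join the model's codings back onto the original comment records, which is what the "Coding Result" view above displays for a single comment.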