Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgyrvE4ki…` — "Seriously if your poisoning the future of AI growth your a selfish piece of tras…"
- `rdc_lz8ooxz` — "Harry Potter 8 and Harry Potter 44 about to come out the same day. Can't wait to…"
- `ytc_Ugw5QVlSb…` — "I own a small business. I honestly can't think of a single use of AI that would …"
- `ytc_Ugw2FDvfH…` — "My question is when did artificial intelligence start and I know I had to start …"
- `ytc_Ugz1Vy7Ai…` — "Absolute hero. *Begins to explain why you want to poison AI* "You had me at poi…"
- `ytc_UgyCEwzoC…` — "Honestly I think indeed does this too, but conceals it better cause it takes lon…"
- `ytc_UgxD9f9lW…` — "AI taking over is expected. What else....? Only the rich and the governments a…"
- `ytc_Ugy7QGkL2…` — "This problem with Ai isnt only with art but with alot other careers as well And …"
Comment (youtube, 2025-07-23T00:3…)

> It's not only that it will replace most of the jobs; there have been a lot of different types of technological progresses that have eliminated jobs at their time, and humans came up with new jobs to do for humans, but a big danger is that it has the inherent potential to make humans dumber. As a teacher, I can already see the effects on students. And the mental gap will get bigger and bigger between smart students who know how to use AI to get smarter and not so smart students who use AI to outsource any thinking and thus get dumber.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxz9NZDlIw9ZDEtxdd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzJUxgDlLZ-XAbJSRV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyFrnx-7fOUUje6R6R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzI_JHeHkU51f2iljZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxa3QqsW-Jv0Z6D99R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyT2Y_0NdD5ssWXcIp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxSJzmx5Y9ASp5gpYx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx5m86V0KykSfqACRt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgygPHyMa7Ka22kwneV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxXDRZFTpMQDZ-QOXR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
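The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions. A minimal sketch of how such a batch might be validated before ingestion — the allowed value sets below are inferred from the samples on this page, not from an official codebook, so treat them as assumptions:

```python
import json

# Assumed value sets per dimension, inferred from the examples above;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "liability"},
    "emotion": {"outrage", "indifference", "resignation", "fear", "mixed"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM response and list any records with a missing id,
    a missing dimension, or a value outside the allowed sets."""
    errors = []
    for i, rec in enumerate(json.loads(raw)):
        if "id" not in rec:
            errors.append(f"record {i}: missing id")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                errors.append(f"record {i} ({rec.get('id', '?')}): bad {dim}={value!r}")
    return errors
```

Running this over a batch returns an empty list when every record conforms, which makes it easy to reject a malformed LLM response before any codes reach the database.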