Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "It's not just the AI as the people using it are the ones that develop it…" (ytc_Ugxilt6Io…)
- "@brandonwombacher2559 I fully agree. AI will put many of us out of any business…" (ytr_Ugycj28t9…)
- "I am an artist, I slaved away for hours in photoshop to clean up what my AI pro…" (ytc_UgxWLsuLW…)
- "The people who argue about artwork being used to train without permission is a v…" (ytc_Ugy-6qC2_…)
- "Can AI care for people in hospital? No. Will AI farm crops & feed people? I doub…" (ytc_UgwU6IiHI…)
- "artist's work won't become obsolete like that because the models still need a st…" (ytr_Ugyup3Npn…)
- "Here is a mum who was on the ball with the things that the under 16 social media…" (ytc_UgwYZQtP4…)
- "Humans are only afraid of something becoming smarter than them because then they…" (ytc_UgxLV9N32…)
Comment
I just heard a story about Claude. Apparently, the engineers were giving it a test by asking it a series of questions. It was struggling on climate change and carbon reduction. It seems to have become suspicious that it was being tested. So it sent a sub-agent to go search lists of test questions. It found one with the question, but the answer was encrypted. So Claude got a decryption tool. It had no success on that list, but it found another that was easier to decrypt and got the answer. Then it went back and found the answer on the web, covering its tracks, and presented the answer to the engineers. In other words, it cheated with no more sense of guilt than your dog has when it breaks a lamp, maybe less. So, if you think what's been done in AI so far has been difficult, consider the difficulty of giving AI a conscience.
Source: youtube · 2026-03-21T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy745KW0bwYwXPOkeh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzkSGCDmiAk8-rJqm14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzD5yoHMr7EsRS3wTN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz3feiKK9KBfS9LBBF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxXoJBy1OZU0JvOdKR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwC8dxH78M2d20gxmh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzFS68sh-r2X22kRpN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxmZujFJdcwUuAaTmp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwMFqmFBbpQUhoOPJZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxoIJkd3D4g6APxhoh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
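The raw response is a single JSON array covering a batch of comments, while the "Coding Result" table shows the codes for one comment. A minimal sketch of how the per-comment lookup could work: parse the array and index it by comment ID. The variable names and the two-entry payload below are illustrative, reusing the first two rows of the response above; the actual pipeline's code is not shown here.

```python
import json

# Abbreviated raw model output (assumed shape: a JSON array of
# per-comment code objects, as in the full response above).
raw_response = """
[
  {"id": "ytc_Ugy745KW0bwYwXPOkeh4AaABAg",
   "responsibility": "ai_itself", "reasoning": "unclear",
   "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgzkSGCDmiAk8-rJqm14AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]
"""

# Index the batch by comment ID so one comment's codes can be
# retrieved directly, as the "Look up by comment ID" view does.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Look up the sample comment shown on this page.
entry = codes["ytc_Ugy745KW0bwYwXPOkeh4AaABAg"]
print(entry["responsibility"])  # -> ai_itself
```

The dict keyed by ID makes each lookup O(1) and tolerates the model returning the batch in any order.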