Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "stop being a doomer. machine learning algorithms are way more retarded than you …" (ytc_UgwnhDf4v…)
- "I’m curious to see what will happen once AI has taken over everything to where h…" (ytc_UgzFPSXlF…)
- "Not much hope here. Go and read the book of Revelation, it tells you what will h…" (ytc_UgwdEsW1Y…)
- "Everything that sama does can be done better by ChatGPT -- especially generating…" (rdc_n3om0bu)
- "The Problem will come with the people owning and creating and maintaining the AI…" (ytc_UgyC68EWi…)
- "unfortunately to completely degrade an AI would mean you’d have to consistently …" (ytr_Ugxe8tYGb…)
- "It's like ask Ai to create a story for you and claim you are a good writer. What…" (ytc_UgxThzZgu…)
- "The difference between standard abstractions like game engines and AI is that ga…" (ytc_UgyY7n-N-…)
Comment
The AI didn't get defensive. It said that no case existed because it didn't check the Internet, therefore defaulting to its knowledge cut off date which is June 2024. When you pressed it on the story, you can see in your own screenshots that it actually checks the Internet for the latest info, which I didn't do during the initial pushback. There was no "sources" at the bottom.
Platform: youtube · Topic: AI Harm Incident · Posted: 2025-11-25T04:4… · Likes: 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzcHr1pHNVENZU9I914AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyo9RySMNHdRlQ6cWJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyzDKZUEoW-92rL2Tl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwDh-fbjZoxdsfN0hB4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwCsUsSqwZF6v4d1NV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyvx6c6cbEEmr5ozyZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzLzdjSUVPiglJg3jx4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyuoRWYVSvn6FHK0gp4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugx34hnR_39C9DhKc_N4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwH-OkTjgXs5wM8ziN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
```
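A raw response like the one above can be parsed and checked mechanically before the codes are stored. Below is a minimal sketch in Python; the `ALLOWED` codebook is an assumption reconstructed only from the values visible in this dump (the real codebook for this tool may define more codes per dimension):

```python
import json

# Assumed codebook: one set of allowed values per coding dimension,
# inferred from the codes that appear in this page's sample output.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"indifference", "resignation", "fear", "outrage", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded records) and
    reject any record whose dimension value is outside the codebook."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: invalid {dim}={rec.get(dim)!r}")
    return records
```

Indexing the validated records by `id` (e.g. `{r["id"]: r for r in records}`) then gives constant-time lookup of any single comment's coding.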