# Raw LLM Responses

Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by browsing random samples.

## Random samples
- ytc_UgzAwAk8q…: "ai wants permission omg here we are trying to create a brain that doesn't experi…"
- ytc_UgwKC8LEJ…: "If I had some super ai weapons that I could program to do anything I would kill …"
- ytc_UgxbwtAjE…: "I heard microsoft gave its ai a “digital lobotomy” because it was becoming too s…"
- ytc_UgwNmcILT…: "We need to write an AI kill switch algorithm - to prevent AI from taking over.…"
- ytc_UgxXNbcve…: "So everyone will be out of a job in a few years. Just like machines put everyone…"
- ytc_UgymafIFb…: "The only time au should be used in an art sense is when you’re just having fun w…"
- ytr_UgwqJN6ww…: "@enby_elphaba I disagree. Yes, he clearly needed help. But it's absolutely also …"
- ytc_Ugwm_K57R…: "the believer AI is an idiot it doesnt stand its ground to one religius belief th…"
## Comment

> Imagine two lines at a hospital. Line A, AI does the consultation, diagnosing and makes the treatment plan, isn't time gated and patients can give as much info as they like, while being asked questions (AI can work on multiple patients in separate rooms at the same time), and the doctor just signs the treatment plan. Line B, current system with full human interaction throughout, but costs twice as much. Which line do you think will have more people?

youtube · AI Harm Incident · 2025-07-23T14:4… · ♥ 8
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
```json
[{"id":"ytc_Ugz6xck6wgF8o--u2KR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzpvHKS20Y08m3nzeh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwzbh6jBXeYWyGjDSN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyIz92nh_SM2zqzyGl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzzMmkHpjkvPRx1pxF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxt88QohxL3rclmEkp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgynIoDvCfp6AJRysPh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyyUveUR12T27P8XEJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxp7QTqwuhRlzDE3TN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzbl8aCHhGXMbvrXzp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]
```