Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Your too good with the editing bro. So good. But I am scared of this. I can figh…" (ytc_UgxEEgX1j…)
- "@-jrb-well, he is a chief scientist. You are thinking I’m talking about LLMs, I’…" (ytr_UgwpML6in…)
- "Lol. This is literally comical material, but it's real life. Are we in the twili…" (ytc_UgycVXN1m…)
- "There has never been a case in history except for the North Sentinel Islanders, …" (ytc_UgzTUdNVT…)
- "putting it next to Hotel Transylvania made it so clear how dead that AI animatio…" (ytc_Ugzaaw4pN…)
- "I hope you never end up on a space mission with an AI piloting the spacecraft. Y…" (ytc_UgynpuYmk…)
- "Instead of tricks, I've found using proper AI writing tools like Humanlike Write…" (ytr_UgypPrgzE…)
- "It´s a string of code ... we haven´t properly mapped out our own consciousness o…" (ytc_UgybPcixe…)
Comment

> “Not to replace doctors with AI” my butt! I’m sure that wasn’t the direct purpose of this study right now but there will be an ongoing long-term effort to replace doctors with AI unfortunately. The more advanced AI gets the more pressure there will be. The AMA will keep the pressure at bay for awhile, they still have a lot of power

youtube · AI Harm Incident · 2024-07-16T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzUVR79bGJtQR310S94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz4qidbZLlgsWxqeah4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzWQWNUWy_zG27eb-54AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzL4fjUpekCYJyfuKl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxlwoQE_feXWGaACwh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwILJKHW69GzYFZbTB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyhDOwZ1QmyiXG5JgN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwxJoURbVjWWlJZ0nt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxKHbzWsSRAf9CWfZN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzaKuec7tvGL4lWGFx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
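The raw response is a JSON array with one record per coded comment, each carrying the four dimensions shown in the Coding Result table. A minimal sketch of how such a batch might be parsed and validated before being stored; the allowed value sets below are assumptions inferred from the values visible in this dump, not an authoritative codebook:

```python
import json

# Allowed values per dimension. These sets are an assumption
# reconstructed from the values seen in the sample responses,
# not the project's official coding schema.
SCHEMA = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record needs a comment ID and a legal value
        # for each of the four coding dimensions.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Example with one well-formed record (hypothetical ID):
raw = '[{"id":"ytc_example","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
print(len(validate_codes(raw)))  # → 1
```

Records that fail validation would typically be queued for re-coding rather than silently dropped; the sketch above only filters them.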