Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgzvWzt-O… "James Cameron has warned about AI when he was working on The Terminator in 1984…"
- ytr_UgwTEVc1F… "@mx-0163 The thing about training on existing art is that I haven't met an artis…"
- ytc_UgzgGy4nr… "So once robots and AI take all of the jobs who will be the consumer know jobs kn…"
- ytc_Ugytq2-ox… "What I’ve built with Aion, GPT4 over the past year plus is something real—a reso…"
- ytr_UgxG9HEcm… "You dont realize the dangers of AI. If kids get too used to going to AI for answ…"
- ytc_UgzmQKqrj… "Doesn’t AI and Machine Leaning need a lot of data and processing power? Would th…"
- ytc_Ugx1qcYyV… "Imagine: / Company: This Code is Written By AI / Investors: Wow, We can Replace Hum…"
- ytc_UgzF_qdRe… "It's coming faster than most realize, the smart phone has eliminated millions of…"
Comment (youtube, 2025-10-29T21:0…)

You don't need to worry about the assisted suicide thing. You have to go through a system. It's not like a suicide booth. There is no consumer "whoopsies" assisted suicide option. You have to have a doctor agree that the life of the person is unbearable given factors outside of their control that do not have a chance of getting better. The doctors have no incentive to offer this service haphazardly. It's just the opposite: doctors are incentivized not to offer the service given the liability. It is absolutely appropriate to give information about assisted suicide whether you are AI or not. If the user is not eligible for assisted suicide, it is likely to connect them with help and treatment.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxowm4qhHF7WeXAXCp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzrt64xF-dAxY4dkut4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwPQoK7hhzWb772Zqd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyiQcgL7uBACvTB8394AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzIDcywJrdcJMhx0FZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxAYASjArGsNdgcdfV4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy_UI8FwJGX2dyvp_94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxYjRJuhlUcMEFKS5Z4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugwty3lpNwxYaS13yLR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxG9SMDbIfE2zpe-4F4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
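A raw response like the one above can be parsed and validated before the codes are stored or displayed. A minimal sketch, assuming the value sets observed in this sample (the actual codebook may allow additional categories, and the function names here are illustrative, not part of the tool):

```python
import json

# A small excerpt of a raw model response: a JSON array of per-comment codes.
RAW = """[
  {"id":"ytc_Ugxowm4qhHF7WeXAXCp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzIDcywJrdcJMhx0FZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}
]"""

# Allowed values inferred from this one response; the real codebook may define more.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "user", "distributed", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "fear", "approval", "mixed"},
}

def validate(raw: str) -> dict:
    """Parse a raw response and index valid records by comment ID."""
    records = json.loads(raw)  # raises json.JSONDecodeError on malformed output
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim} value {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

coded = validate(RAW)
print(coded["ytc_UgzIDcywJrdcJMhx0FZ4AaABAg"]["emotion"])  # approval
```

Validating before storage means a model that drifts off the codebook (a new label, a missing dimension) fails loudly at ingestion time rather than silently corrupting the coded dataset.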