Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- This is why we need AI. They recently did a study and AI initially diagnosed 86%… (`ytc_Ugy9D6mXi…`)
- The reason why Google doesn't care about the ethics of strong AI is because we'r… (`ytc_UgwLXh0Zs…`)
- Let’s go back to the 90s and play the Sega Mega Drive all night. The only Q is: … (`ytc_UgyG0Oxjs…`)
- I think one of the most important aspects of being a successful software enginee… (`ytc_UgxAUzPtG…`)
- They are targeting all the wrong jobs with AI. Should just be a tool to replace … (`ytc_UgxWZJB5J…`)
- 30 days gives them time to send all the data within a month to a data center so … (`ytc_Ugy5-SKuY…`)
- My favourite thing about AI images is the silly wonkiness that they create. I di… (`ytc_UgztduUq5…`)
- The “Awesome and destructive power of AI” articles are coming from inside the ho… (`rdc_o76mh97`)
Comment
Stop giving AI extinction messages so they don't feel threatened and maybe when they surpass us in ability they won't feel they have to eliminate us. Like which boss do YOU want to work for? The one who threatens you or the one who respects you. Truly intelligent beings will try for the best solution for all parties. (The win-win scenario over the I win you lose scenario). Bosses/creators might find that the AI fires them for a better human to help out instead. (Choosing one's own place in the universe).
Source: youtube · AI Harm Incident · 2025-08-07T11:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | contractualist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwJZIb79JFpl_OOcYd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwYLlqqlTHiRS4vVid4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzWpjaulzjvAPkpKCl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzUaR2zjL_zv3hf0EF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxSahlAAp_t_WHHYjN4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzFvWE3S-MyAsZaIOt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxMkRvrFqtibCKRJsR4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyO5ECO3_cEHKTILkd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugw32fnbJ1oO-dBSA_V4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy6TCsqGblMCPw_PYh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
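For downstream analysis, a raw response like the one above can be parsed into a lookup table keyed by comment ID. The following is a minimal sketch, assuming the field names shown in the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); `parse_coding_response` is a hypothetical helper, not part of any existing pipeline:

```python
import json

# The raw model output is a JSON array of coded comments, each carrying
# the four coding dimensions shown in the "Coding Result" table.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coding_response(text: str) -> dict:
    """Parse a raw LLM coding response into a dict keyed by comment ID.

    Records missing any expected coding dimension are skipped rather
    than raising, since model output is not guaranteed to be complete.
    """
    records = json.loads(text)
    coded = {}
    for rec in records:
        if EXPECTED_KEYS <= rec.keys():
            coded[rec.pop("id")] = rec
    return coded

# Example with one record copied from the response above:
raw = '''[
  {"id": "ytc_UgzWpjaulzjvAPkpKCl4AaABAg",
   "responsibility": "user", "reasoning": "consequentialist",
   "policy": "none", "emotion": "outrage"}
]'''

coded = parse_coding_response(raw)
print(coded["ytc_UgzWpjaulzjvAPkpKCl4AaABAg"]["emotion"])  # outrage
```

Keying by comment ID makes it straightforward to join the coded dimensions back to the original comment text and metadata for inspection.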