Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Because there is no "correct" answer, this will inevitably delay the progress of coming to a decision.
All too often, discussing moral ambiguity results in inaction.
Fact is, driverless cars will reduce fatalities, especially once widespread adoption occurs.
In the early stages it will be hard because human-driven cars will continually test this theory with erratic lane-changes and stupid manoeuvres. But eventually these won't be needed because it will all happen more efficiently and safely.
However, what will surely happen is that this becomes a political issue whereby a prolonged public discourse will ensure nothing happens until people finally (and pragmatically) come to the conclusion that this will save lives. In the meantime, needless deaths will occur.
youtube · AI Harm Incident · 2016-12-08T04:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UggVWhydMBNr5ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgiGhdQ-7sGhoXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh3aUzD38kq-3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghBq3hwrn7qTngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugh04YN3cCn6EXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghnPOOlhnlLzngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugj9PhgoFGHd03gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghN3TdFvblhSHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugg7CDWscxEGK3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgigxzEjm8V2ZngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
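A batch response like the one above can be parsed and sanity-checked before the labels are merged back into the coded dataset. The sketch below is a minimal example: the allowed label sets are an assumption inferred only from the values visible in this dump (the real codebook may define more categories), and the records are an excerpt of the response shown above.

```python
import json

# Excerpt of the raw batch response shown above.
raw = '''[
 {"id":"ytc_UggVWhydMBNr5ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugg7CDWscxEGK3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]'''

# Assumed label sets, inferred from the values visible in this dump only.
ALLOWED = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"consequentialist", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "outrage", "resignation"},
}

def validate(records):
    """Split records into (valid, errors) by checking each coded
    dimension against its allowed label set."""
    valid, errors = [], []
    for rec in records:
        bad = [dim for dim, labels in ALLOWED.items() if rec.get(dim) not in labels]
        if bad:
            errors.append(f"{rec.get('id', '?')}: invalid {', '.join(bad)}")
        else:
            valid.append(rec)
    return valid, errors

valid, errors = validate(json.loads(raw))
print(len(valid), errors)  # → 2 []
```

Records that fail validation can then be re-queued for the model rather than silently written into the dataset with out-of-codebook labels.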