Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- `ytr_Ugw4nzYNI…`: "or from history ,whenever a more advanced civilisation meets a less advanced one…"
- `ytr_Ugw2GPreK…`: "But that’s the thing, the people who defend AI would also, without a doubt, use …"
- `ytr_UgwlLCYPI…`: "@MushuaThePotato Okay, but that doesn't make it any less ironic that people were…"
- `ytr_Ugys6cxYc…`: "Sophia, the AI robot featured in the video, primarily focuses on interacting thr…"
- `ytc_UgzsPEa4A…`: "Make ai "art" and calle yourself an "artist" is the same as putting freezed food…"
- `ytc_UgzYHw6M7…`: "Sora Ai meaning . STEALING ARTWORKS .DESTROYING ART COMMUNITY AND TALENTS! this…"
- `ytc_UgzsMSdgr…`: "Video trying to warn us about ai, but uses ai created clips to make the video. A…"
- `ytr_Ugw8URcwZ…`: "@41-Haiku 1) There's only one way how to do recursive self-improvement in LLMs …"
Comment
It’s something I considered as well, the media jumps on any mishap relating to self driving cars as if that justifies them being bad or dangerous, when in fact en masse the likelihood is that roads will get safer.
In small numbers the lives saved versus lost might be unnoticeable. Scale it up and suddenly you’ve got potentially a fraction of fatalities you once had from conventional automobiles. But we have to get over that initial distrust, sadly like everything it’ll probably be a while before that net benefit is realised.
youtube · 2023-07-31T11:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxukQjqRR6f6Xre5sF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxBiMBQ5lWOhK-4mQB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgxgySTwFUJ43HVWlvt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxJqYuRi0G1MGQMPix4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwMJz-IxeP6ueISXph4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzkSf9SF8XFo-Dn6xd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx_UNVwacIfOiICyyN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwutRFxldrAFzPqnBV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxbghNUGweGjTHowWd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxyGYXtuJJyi7iO8_l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
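A raw response like the one above should be validated before the codes are stored, since the LLM may emit malformed records or values outside the codebook. Below is a minimal sketch of such a check; the allowed value sets are inferred from this one sample (not from a published codebook), and the name `validate_codings` is illustrative:

```python
import json

# Value sets inferred from the sample response above; treat as assumptions,
# not the authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "government", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability", "ban", "regulate", "industry_self"},
    "emotion": {"outrage", "indifference", "fear", "approval", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record must be an object with a comment ID and one
        # in-vocabulary value per coding dimension.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
print(len(validate_codings(raw)))  # 1
```

Records that fail the check are silently dropped here; in practice one might instead log them and re-prompt the model for the affected comment IDs.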