Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "If you’re not cynical re AGI then you’re naive. Period. This guy pretending AGI …" — ytc_UgxNCudP3…
- "This is super vague and he clearly doesnt give any info on these for a reason. A…" — ytc_UgziAP2te…
- "This’s what exactly the Google AI researcher showed the danger of biases. Unfort…" — ytc_UgwBoXLqI…
- "This guy is a moron, I can't fathom how he can proclaim to be expert in AI bias …" — ytc_UgxSSGqxQ…
- "If the plice car wasnt black and had reflective markings like a fire truck it co…" — ytc_UgxmEgXfs…
- "Besides; Robots are programmed!!! It won't last; it never has, never will!! Espe…" — ytc_UgySOoA1z…
- "That's why I abandoned my previous job in teaching and started as an AI trainer …" — ytc_UgyGrFAX-…
- "I think artists should be asked for consent instead of an opt-out. I think the w…" — ytc_UgzWcqj7I…
Comment

> Unless the AI sees Terminator the movie... and concludes that's the best way to survive.... this isn't impossible by any means, what the actual discussion is about, is plausibility... how plausible will such a option be to the AI, and that we can't answer....

Source: youtube · Posted: 2024-12-14T20:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwWhxi1glqTXjY71WN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyR4As7NrcfwBhHzEJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwuEZF1cUUcFOdyotR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy02qhud0dSIz1eVS54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxDe_hvzclCFLJUKk94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzQm-gW4cZ8Ua1KYzV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwaIlKAnK77mix4fNB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz6dKfgnklwgQLzvw14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwbMylsc6J-kaD54Ad4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugw0Gz_jbyVCksaSNWZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
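As a minimal sketch of how a raw batch response like this might be consumed, the JSON array can be parsed and indexed by comment ID, so that any coded comment can be looked up directly. The record shown here and the variable names are illustrative, not part of the tool's actual implementation.

```python
import json

# Illustrative raw LLM batch response: a JSON array of coding records,
# one per comment ID, mirroring the format shown above.
raw_response = '''
[
  {"id": "ytc_UgyR4As7NrcfwBhHzEJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
'''

records = json.loads(raw_response)

# Index the records by comment ID for constant-time lookup.
by_id = {record["id"]: record for record in records}

# Look up the coding result for a specific comment.
coded = by_id["ytc_UgyR4As7NrcfwBhHzEJ4AaABAg"]
print(coded["responsibility"])  # ai_itself
```

In practice the parser would also validate that each record contains all four dimensions (responsibility, reasoning, policy, emotion) before storing it, since LLM output is not guaranteed to be well-formed.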