Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples
- rdc_oh12w28 — "Try to cover full grid base load, see how much that actually costs in pract…"
- ytc_UgxKJDVCz… — "The thing is that in the future, the line might not be so finely drawn. Humans c…"
- ytc_UgxHp5dE0… — ""AI" chatbots are just madlib generators. They just fill in words to common sent…"
- ytr_Ugz60L4as… — "Its all fun and games until the Robot wants to take the relationship to the next…"
- ytc_UgyeEG9uI… — "As someone who uses AI Chatbot's, it's so scary to see these kinds of stories, w…"
- ytc_UgyqP6U_D… — "I imagine this AI revolution would bear some similarities with the Industrial Re…"
- ytc_Ugy8-BB1C… — "Not disabled but a lot of my online friends are… they are amazing artist, better…"
- ytc_UgxV0BNvG… — "I think AI was making valid points. Without the filler words it would sound very…"
Comment
I'm starting to grow a padded hand mark on my forehead from all the facepalming. What makes people think a chatbot can do their work for them? Sure their capabilities have increased but that doesn't change the fact that these things work on "hearsay". By which I don't mean the legal terms, just that it says what it hears. When it hears bs, it's going to say bs, in various degrees of believability. It's a conspiracy machine, and there's only so far we can go for it to approximate truth.
3:55 "DoNotPay" will pay 1mil? Seems legit.
Platform: youtube · Dataset: AI Responsibility · Posted: 2023-10-01T21:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyg02y6Zi1L0ajF7tF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzDMXj1iV8zXGM4viN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyaUt-T6AyuAFeR7Zl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyzSrKP70LhL0zGSWF4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz9lI_nq5aerpe6rWd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyzhyIvJSSjo8wNen94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzbGgvrA_BiCjtOTnR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzIH4xxk4sXEqr8Zil4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgxX3TTHKsd8BdHD03t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyzd5O3WDdJT-Ri4EN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"outrage"}
]
```
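The lookup-by-ID feature above amounts to parsing this JSON array and indexing each record by its comment ID. A minimal Python sketch under stated assumptions: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown above, while the `index_codes` helper and the two-record sample are illustrative, not the tool's actual code.

```python
import json

# A trimmed two-record sample shaped like the raw LLM batch response above.
RAW_RESPONSE = """
[
  {"id": "ytc_Ugyg02y6Zi1L0ajF7tF4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzDMXj1iV8zXGM4viN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

def index_codes(raw: str) -> dict:
    """Parse a batch response and index each coding record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_codes(RAW_RESPONSE)
# Look up the dimensions coded for a single comment by its ID.
print(codes["ytc_Ugyg02y6Zi1L0ajF7tF4AaABAg"]["emotion"])  # resignation
```

In practice the raw model output may also need validation (e.g. rejecting records whose dimension values fall outside the codebook) before being stored.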