Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up its comment ID or by browsing random samples.
Random samples:

- "If we consider Churchill killed 1 million Indian, and 300,000 English soldiers o…" (ytc_UgxvvqmVx…)
- "You are the most eloquent challenger of Tesla Autopilot i heard so far. Interest…" (ytc_UgxahluPj…)
- "“AI” has to be the greatest misnomer of the century. These systems are not AI. T…" (ytc_UgxVYm1FV…)
- "i think a huge part why AI is so prevolent is the perception that you have to be…" (ytc_UgwKuUCr6…)
- "Wasn't sure about AI in hiring, but after using ShortlistIQ, I see how it can gi…" (ytc_UgwWg2KBz…)
- "Artificial intelligence is an outstanding tool, but considering how it is being …" (ytc_Ugz9cbre9…)
- "Can't wait to see what everything is going to cost in the coming future. We're a…" (ytc_Ugwkfed38…)
- "AI needs the human to lay out what needs to be done; and most often the AI is us…" (ytc_UgxGoR9EQ…)
Comment (youtube · AI Responsibility · 2024-01-02T04:4… · ♥ 2):

> Why should i forget AI apocalypse which is the actual real threat? This are rather benign threats on the short term while AI potentially annihilating us is obviously a massive threat on the long term (if that's most people still think) or pretty much still on the short term so why don't we focus on what REALLY MATTERS here instead of wasting time, energy and ressources thinking about absolutely irrelevant threats if we're all dead? And anyways AI will soon be able to also resolve environmental issues. Let's just hope they won't do it by getting rid of us as a pretty obvious starting point ;)
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
{"id":"ytc_UgwCeNR9rJ3ysa8wlIh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwwdhPtk4nm3GDxFB14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyb82hJX2xgmQUUTn14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxSGDZHXN6NMBRg7Id4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz5zqHMmXdBQGTL5fN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgycuOf9i_XtSupUIL54AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzI2Ks7uJu6k8Vd3KN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxfD6Ksy80UhAnyqCF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy9MpduZ2ZIlgIOA2F4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwD_wQwqSY-94v5r2N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"approval"}
]
```
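The raw response is a JSON array with one object per coded comment, so the "look up by comment ID" view can be backed by a simple dictionary keyed on `id`. A minimal sketch of that indexing step, assuming the model output has been captured as a string (truncated here to two of the ten records for brevity; variable names are illustrative, not from the actual tool):

```python
import json

# Raw model output, as in the response above (shortened to two records).
raw_response = """
[
  {"id": "ytc_UgwCeNR9rJ3ysa8wlIh4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgxSGDZHXN6NMBRg7Id4AaABAg", "responsibility": "unclear",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index each coded record by its comment ID for O(1) lookup.
codes_by_id = {record["id"]: record for record in json.loads(raw_response)}

# Fetch the four coded dimensions for one comment.
result = codes_by_id["ytc_UgxSGDZHXN6NMBRg7Id4AaABAg"]
print(result["policy"], result["emotion"])  # regulate fear
```

Keying on `id` also makes it easy to detect a common failure mode in batch coding, where the model drops or duplicates a record: comparing `len(codes_by_id)` against the number of comments sent in the batch flags any mismatch.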