Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- "This reminds me of the debate that happened around photography when it was getti…" (ytc_Ugy_tC9MH…)
- "My ideal transportation modes are biking and trains. (Also know how to ride a h…" (ytc_UgyrLo5pL…)
- "imagine if photographs had been banned because it used to threaten painters and …" (ytc_Ugzjg2GGP…)
- "AI isn't for you or for me. The ads are just trying to market the idea to us, bu…" (ytc_UgyuyUZCo…)
- "They are not computable notions owing to the inherent instability and unpredicta…" (ytc_Ugyge4iAr…)
- "A.I. ain't gonna take over and destroy the world by itself - the human-controlle…" (ytc_UgwZY5aBX…)
- "This is fake. Thats a human. No AI has yet passed the touring test. We are still…" (ytc_UgxoGn5cQ…)
- "I'm sorry... but they fail to realize that it's going to happen regardless of wh…" (rdc_cthzbiy)
Comment
I’m glad Bill is still working on new nuclear energy, but I don’t trust someone who stands to make a ton of money, to have a fair & balanced risk assessment on AI. AGI is more likely to _wipe out humanity_ than meaningfully help it. See Eliezer Yudkowsky’s interviews 🌍☄️
Source: youtube · Posted: 2023-05-15T16:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyhYnjmsmT5breAgQB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx8sy38DZhvKKbzK6J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxWGyHvSN6XMob1UHF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxlxL6Mzrz4KOrctKl4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwAEpDQjx_YyXBlD3V4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwQ7z0mqy9E7W-lqKB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyqdve7kmMbpB8QEQh4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxVzTxHeCrDHK50--J4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzyUGWiFWk0DofDYKt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy9cD8WX4LGbuDj4u94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```