Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- @nobodycares607 Lol just saying a company will like a highly skilled and experie… (ytr_Ugzrfpa4U…)
- One must ask oneself some questions before creating something potentially harm… (ytc_Ugxy58XTN…)
- He can't really one hundred percent prove that AI will kill us, but the trouble … (ytc_UgzyMjkxA…)
- I remember when they told us back in the 80’s that one day you’ll do everything … (ytc_UgwRjil6R…)
- Bro Im doing a graphic design degree and one of our assignments was to be judges… (ytc_UgwmA_mdd…)
- The US administration can’t even 14:15 sue someone safely or run ICE safely. So… (ytc_UgwAYkXTg…)
- It’s actually very simple. In every Tesla and autonomous driving car a camera m… (ytc_UgxZAxPbl…)
- All I have to say about a.i. is. “I’m gonna milk this cow the best way I know ho… (ytc_UgwZ4HN32…)
Comment
Cloud based storage for AI intelligence could go wrong If all the designers and programers don't have the same ethics or moral code. If one robot learns how to shoot a gun, they all would know to shoot a gun. It seems to me that there would have to be some division in the cloud to keep things like this from happening. AI's would have to be blocked from certain things in the cloud that would interfere with its main design.
Source: youtube · AI Moral Status · 2020-01-30T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
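
The four dimensions above can be captured in a small schema. The following is a minimal sketch in Python; the label sets are inferred only from the values visible on this page and in the raw batch below, so the actual codebook may define additional categories.

```python
from dataclasses import dataclass

# Label sets inferred from the values visible on this page;
# the real codebook may contain more categories.
RESPONSIBILITY = {"developer", "company", "user", "ai_itself", "none"}
REASONING = {"consequentialist", "deontological", "virtue", "unclear"}
POLICY = {"regulate", "liability", "none"}
EMOTION = {"fear", "outrage", "mixed", "indifference"}


@dataclass
class CodedComment:
    """One coded comment, as shown in the Coding Result table."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        # Raise if any dimension carries a label outside the known sets.
        for value, allowed in [
            (self.responsibility, RESPONSIBILITY),
            (self.reasoning, REASONING),
            (self.policy, POLICY),
            (self.emotion, EMOTION),
        ]:
            if value not in allowed:
                raise ValueError(f"unexpected label: {value!r}")
```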
Raw LLM Response
```json
[
  {"id":"ytc_Ugx5k852kRwcWvInNEB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwX8mZH1a6tRmTxUJd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzJ-46IJZec-FLcAvB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzVoisoG9YNNMrV2Od4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy6t143vGSY3ZFM6aZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyE-ieOvj4Q-V5dcoN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw3HpvWIbHRHN7L5rh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz029v_ogW7_1sN88h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy-KSRXLKxyXBva3KV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxJpVZxwMzlLAFsHjF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
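
Rendering the Coding Result table from a stored raw response only requires parsing the JSON array and filtering by comment ID. Here is a minimal sketch, assuming responses are saved verbatim as text; the file name and function name are illustrative, not part of the actual pipeline, and real responses may need extra handling for malformed JSON.

```python
import json


def lookup_coding(raw_response: str, comment_id: str) -> dict:
    """Return the coding record for one comment from a raw batch response."""
    batch = json.loads(raw_response)  # the model is expected to return a JSON array
    for record in batch:
        if record.get("id") == comment_id:
            return record
    raise KeyError(f"comment {comment_id!r} not found in this batch")


# Hypothetical usage with the batch shown above, using the ID whose labels
# match the Coding Result table (developer / consequentialist / regulate / fear):
with open("raw_llm_response.json") as fh:  # illustrative file name
    coding = lookup_coding(fh.read(), "ytc_UgzVoisoG9YNNMrV2Od4AaABAg")
print(coding["responsibility"], coding["policy"])
```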