Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or inspect one of the random samples below.
I think that I robot it’s the perfect reason of why we shouldn’t let robots lear…
ytc_UgyXUFRkY…
If anything the AI should be totally isolated from the entire internet and then …
ytc_UgzyViyK-…
These machines assimilate informations through recording . They can never be lik…
ytc_Ugw2hoVum…
If we were anywhere near creating a superintelligence I would pay more attention…
ytc_UgyA65uBp…
I'm using a YouTube algorithm for years, doing my best to train it how to pick t…
ytc_UgymJ54IU…
In two years this will be the only thing on YouTube that doesn’t assume you don’…
ytc_UgwLT8kgU…
Ai needs to go now someone created this shit someone can get rid of it. It needs…
ytc_UgzALScK3…
Emi M
A.) It wasn't an argument. It was a statement. You clearly didn't pass hi…
ytr_UgwcqL3mF…
Comment
They will regreat giving them self awareness AI is dangerous even the Acheints knew this its in the Epic Of Gilgamesh the gods are talking about creating a being for his companion. They say they would not make it where it can learn least they become self aware and relise they no longer need us so perhaps that why they had a war in there star system.
youtube
AI Responsibility
2024-05-06T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxBAFCgtz0n0NK3d554AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9m6zTCuM4NCgxuSp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy4jYrl3fx_xZgc1WB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw_OlJwEXRgfYgyUVp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxrQGo9VzL7XfyMwN54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzgWd82_1SSBrxB3Yp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzooEvl3W6cW7Q14i54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz3tyF6NRdyaeOVZJJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"unclear"},
{"id":"ytc_UgzgNNXtvPv-BmW9htp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgypvK4zl2UAXBFZ7Id4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
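A raw response like the one above can be parsed into per-comment codings with a short script. This is a minimal sketch, not part of the tool itself; the allowed value sets (`DIMENSIONS`) are inferred from the values visible in this dump rather than from a documented schema, and the function name `parse_codings` is hypothetical:

```python
import json

# Allowed values per coding dimension, inferred from this dump (assumption,
# not a documented schema).
DIMENSIONS = {
    "responsibility": {"none", "user", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed", "unclear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding objects)
    into a {comment_id: {dimension: value}} mapping, rejecting
    any value outside the expected category set."""
    out = {}
    for row in json.loads(raw):
        coding = {dim: row[dim] for dim in DIMENSIONS}
        for dim, value in coding.items():
            if value not in DIMENSIONS[dim]:
                raise ValueError(f"{row['id']}: unexpected {dim} value {value!r}")
        out[row["id"]] = coding
    return out
```

With such a mapping in hand, looking up the coding for a given comment ID is a plain dict access, which matches the "Look up by comment ID" workflow shown above.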