Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "This debate was disillusioning. These are, of course, extremely intelligent pers…" (ytc_UgymYtKkv…)
- "Wouldnt it be great if a.i could see the benefit of a clan based human race. Liv…" (ytc_UgzyTrDFe…)
- "Typical talking from people who knows nothing about the subject - this is the 1s…" (ytc_UgzbI66wD…)
- "I prefer Gemini in Claude. Claude is the best Gemini is what I prefer and is wha…" (ytc_UgyMv9wsT…)
- "For those interested in better understanding AI safety research, I would wholehe…" (ytc_Ugy6QvnlN…)
- "3:49 “…we don’t know are to make sure the system are aligned with our preference…" (ytc_UgwjrtwIa…)
- "Certainly a better approach than having students sitting on a chair for more tha…" (ytc_Ugzwq-OTZ…)
- "AI is slowing down. It doesn't have any new training data, and I'd say more than…" (ytr_Ugx63FoC3…)
Comment
The right question is: Why you don’t create a reference system to control AI not necessarily for how many R but to control that it won’t destroy humanity?????????
Or multiple systems of reference.
You think too many LLs are the problem????
Since when we design something so powerful without a reference system to pinpoint the danger???
youtube · AI Governance · 2024-08-02T12:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwhhvT7tdG9FWhaVbl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyl-pMdLohuMS87dZt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyMlxH_NZWgWMWNzRF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxftOLNA8R9DbmxJXZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzC1j6aqH2bDGvOW_94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxwdBLE_R-HqL_spPp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw7xfEzvQGqiyjXJzt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwp4xwz3JWF1F40Sg94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyaFIlptQN8Npp2FOd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyjuHTmRP7kjK4O_Ph4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
```
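As a sketch of how the "look up by comment ID" step might work, the raw batch response above (a JSON array with one record per coded comment) can be parsed and indexed by its `id` field. The function name and the abbreviated sample data are illustrative, not the tool's actual implementation:

```python
import json

# A raw LLM batch response in the format shown above (abbreviated to two
# of the records for illustration).
raw_response = """
[
  {"id": "ytc_Ugw7xfEzvQGqiyjXJzt4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwhhvT7tdG9FWhaVbl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "fear"}
]
"""

def index_by_comment_id(raw: str) -> dict[str, dict]:
    """Parse a raw LLM batch response and index the coded records by comment ID."""
    records = json.loads(raw)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugw7xfEzvQGqiyjXJzt4AaABAg"]["policy"])  # regulate
```

With the records keyed by ID, rendering a "Coding Result" card for any comment is a single dictionary lookup rather than a scan over the whole response.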