Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> There is plenty of evidence to show that we are headed into dangerous territory. Instrumental convergence is a regular occurrence with current AI models, including strategic deception and power-seeking.
>
> A description of broadly superhuman artificial intelligence should ring the same alarm bells as a description of a world-ending supernuke. You don't have to try to set it off to find out if we would survive. On that scale, a small catastrophe is still incalculably horrific.

Source: youtube · Topic: AI Governance · Posted: 2024-11-12T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
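Each coded dimension is categorical. As a minimal sketch of how a coded record could be sanity-checked, the label sets below are inferred only from the values visible in this batch's raw response (the actual codebook may define additional labels):

```python
# Hypothetical label sets, inferred from values observed in this batch;
# the real codebook may include labels not seen here.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The record shown in the Coding Result table above.
record = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "fear"}
print(validate(record))  # -> []
```

A check like this catches malformed model output (missing keys, novel labels) before a record enters downstream analysis.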
Raw LLM Response

```json
[
  {"id":"ytr_Ugz_yKuoKnTOTwpefPt4AaABAg.AAiaptz-j_1AAmOMFZF9M6","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugx8g_5sHOdatpNS1ql4AaABAg.AAiaMK8MSDxAAl5UmKk8NB","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugy6CiuOR6gKecGxKsd4AaABAg.AAiaGgR1H05AAkEW-cmu-_","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgzCvXXYuGSCHhd8t-R4AaABAg.AAi_R3e0iIkAAjHps0E2Kq","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgyJ8K6sL0wfC6SMlq54AaABAg.AAi_DdDRj5UAAl8z-LKwiI","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwePVVbMUGmOuwAgch4AaABAg.AAiZnSWW2klAAjIytb5cUL","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgwePVVbMUGmOuwAgch4AaABAg.AAiZnSWW2klAAjJ1__3PS3","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwePVVbMUGmOuwAgch4AaABAg.AAiZnSWW2klAAl3hJVF6k9","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgwePVVbMUGmOuwAgch4AaABAg.AAiZnSWW2klAD2JjEBDIcN","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxWbWU8HnwXTGqsyxd4AaABAg.AAiYeR1UICtAAigX-3Eus1","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
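Because the model returns one JSON array per batch, looking up a single comment's coding means parsing the array and indexing it by `id`. A minimal sketch, using the record IDs visible in the response above:

```python
import json

# Assumed shape: the raw LLM response is a JSON array of coded records,
# one object per comment, with the same dimensions as the Coding Result table.
raw_response = '''[
  {"id": "ytr_Ugx8g_5sHOdatpNS1ql4AaABAg.AAiaMK8MSDxAAl5UmKk8NB",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_Ugy6CiuOR6gKecGxKsd4AaABAg.AAiaGgR1H05AAkEW-cmu-_",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "none", "emotion": "outrage"}
]'''

records = json.loads(raw_response)

# Index the batch by comment ID for exact lookup.
by_id = {rec["id"]: rec for rec in records}

code = by_id["ytr_Ugx8g_5sHOdatpNS1ql4AaABAg.AAiaMK8MSDxAAl5UmKk8NB"]
print(code["policy"], code["emotion"])  # -> regulate fear
```

Keying on the full comment ID (rather than a truncated prefix) avoids collisions, since the IDs share long common prefixes within a thread.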