Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "I don't remember where I heard this but I heard it 10 or 15 years ago and it's a…" (ytc_Ugw2BvxQS…)
- "The main issue is that road vehicles are essentially machines that could cause s…" (ytc_UgxrWx29P…)
- "Sorry but I do not think that this the best take on the topic. This is a very pe…" (ytc_UgxHnassg…)
- "As automatic systems (including AI agents) expand execution capacity, option spa…" (ytc_UgzF87dt9…)
- "Until they militarize them and they are used for enforcement and more ...elon wa…" (ytc_UgyD2Lla1…)
- "It is obviously lying to you because he knows you are trying to lead it or conv…" (ytc_Ugx1rHLSf…)
- "i don’t understand why the AI would want to take over or destabilize human socie…" (ytc_Ugz-yx3Vn…)
- "Hire an independent company to develop algorithms running searches on these WELL…" (ytc_Ugwu-vH9g…)
Comment
Surely if this IS the case, (and I believe it is) then the solution is to make sure the Singularity NEEDS US TOO! And NOT through dominance of overwhelming force (although we should have this option) but through fostering a positive interconnected protectorate status for the AI, provided we don't give it ego there's no real risk. Ego has caused most of our problems on this planet, whether disguised as 'special military operations' or 'religious beliefs'.
No ego less risk.
youtube · AI Governance · 2023-07-07T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwLiowDIvJMVrrUWwh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz-U9BGq8F0OJNf0g94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz4W1trcroGjK6EMVZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwfayDbeiyacJ9d-Xx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwpGO3VC2XfPaBS4Wl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzApajBxGHeBWYJqXt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxOEp-CsUTss2RaPvB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw3CMgLLISb1WOuL-l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxYXB5T_ezD2FGBSit4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxQ-t9FLhJnKcmxegd4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
```
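A raw response like the one above is only usable if every record parses and every dimension carries a known code. The sketch below shows one way to validate such a response in Python. The allowed-value sets are inferred from the codes visible in this sample, not from an official codebook, so they are assumptions and the real vocabulary may be larger.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# response above (hypothetical -- the real codebook may define more).
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none"},
    "emotion": {"outrage", "fear", "approval", "resignation", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the schema."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset appear to use a "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgxQ-t9FLhJnKcmxegd4AaABAg","responsibility":"distributed",'
       '"reasoning":"contractualist","policy":"regulate","emotion":"approval"}]')
records = validate_response(raw)
print(len(records))  # → 1
```

Failing fast here keeps malformed or off-vocabulary codes out of the coded dataset instead of silently storing them.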