Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "Nice editing with AI but the Cyber truck or whatever f****** vehicle it is left …" (ytc_UgwuiQC23…)
- "The highest one was on 46% of the job. I reckon, the parts of these jobs that wi…" (ytc_UgzsrSdJj…)
- "Well I still think your wrong - AI will also be able to solve the mistakes. We d…" (ytc_Ugy6FwvbA…)
- "Guys, I'd like to point out a thing about the 'please regulate us' -argument bei…" (ytc_UgwKr-BEu…)
- "1. I thought the cover was actually cool 2. Time magazine is more a general popu…" (rdc_jmfj4bo)
- "can i buy a robot on amazon that will go work at amazon for me?…" (ytc_UgyOTZUQd…)
- "I used that Smyrim AI mod for a week or two at one point, and despite them being…" (ytc_UgwnXtGF9…)
- "My freaking Spanish teacher looked on Google for an image to use for a class pre…" (ytc_UgzFN9jkx…)
Comment

> Nuclear weapons were developed. When they had been used, everyone became terrified of the consequences of using them again. So far, they have not been used since. If an AI achieves dominance, it will only need to destroy the military capability of one belligerent high-tech party to force the rest of the world's nations or militaristic groups to give up their military capabilities under threat of being wiped out if they don't comply. Suddenly, vast resources formerly spent on military competition will become available for other purposes. Peace will be maintained by those who want it and under dire threat by those who do not.

youtube · AI Governance · 2024-01-06T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
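The four coding dimensions in the table can be captured as a small record type. A minimal sketch in Python, where the allowed value sets are only those observed on this page (the full codebook may define more), and the `Coding` class name is ours:

```python
from dataclasses import dataclass

# Value sets observed on this page; assumed incomplete relative to the codebook.
RESPONSIBILITY = {"none", "user", "developer", "government", "ai_itself", "distributed"}
REASONING = {"deontological", "consequentialist", "virtue"}
POLICY = {"none", "regulate", "liability"}
EMOTION = {"outrage", "approval", "mixed", "resignation", "fear"}

@dataclass(frozen=True)
class Coding:
    """One coded comment along the four dimensions."""
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def __post_init__(self) -> None:
        # Reject values outside the observed code sets.
        assert self.responsibility in RESPONSIBILITY
        assert self.reasoning in REASONING
        assert self.policy in POLICY
        assert self.emotion in EMOTION

# The coding shown in the table above:
c = Coding("ai_itself", "consequentialist", "regulate", "fear")
```

Validating at construction time keeps malformed model output from silently entering downstream tallies.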
Raw LLM Response
```json
[
  {"id":"ytc_Ugw4oIIvxQbtJzx5qbd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzJlCu7FC5X6oscbGF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgynHwExnmwBZwpdx5x4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyQLJ7oQi2pZYRx2M94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwUZImRdLl9EHTBXu54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgynEuMjfotC-kxWYoZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwf5W5cOKesVx-tY454AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxGTVw3t9dKJxOaXwd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxpuejhsJ25l1VaOzN4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxG0b8-cPsxrqWe-gd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
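The raw response is a JSON array keyed by comment ID, so the "look up by comment ID" view amounts to indexing the parsed array. A minimal sketch (the two entries are copied verbatim from the response above; the `code_lookup` helper name is ours):

```python
import json

# Two entries copied verbatim from the raw LLM response above.
RAW_RESPONSE = """
[
  {"id":"ytc_UgxGTVw3t9dKJxOaXwd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxG0b8-cPsxrqWe-gd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
"""

def code_lookup(raw: str) -> dict[str, dict]:
    """Index a batch coding response by comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

codings = code_lookup(RAW_RESPONSE)
record = codings["ytc_UgxGTVw3t9dKJxOaXwd4AaABAg"]
print(record["policy"], record["emotion"])  # the coding shown in the table above
```

Indexing once and looking up by ID avoids rescanning the array for every inspected comment.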