Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "honestly it probably would be pretty cool on letting machine make your own art, …" (ytc_UgyNdLQY1…)
- "If the computer can produce handwriting by analyzing what a person's brain does …" (rdc_f50k1ce)
- "If robots become conscious, does that mean engineers and electricians will becam…" (ytc_UgxtaiVlU…)
- "In my opinion one of the main issues we face is that what we are trying to do is…" (ytc_UgwBljTBF…)
- "I guess if America stops AI, the rest of the world, especially China will too, a…" (ytc_UgwVcGvxX…)
- "I basically started thanking and asking things to ChatGPT in a sooooo polite wa…" (ytc_UgzOljRom…)
- "You don't get it do you. To the wealthy, making money isn't about making money. …" (ytc_UgwhNgxXN…)
- "The thing companies don’t seem to get is, if everything is automated, soon they …" (ytc_Ugw-DS84i…)
Comment
My point is that you can't say a priori that a machine will never work as well as a human. Studies show that eyewitnesses to crimes are often wildly off base while the camera never lies. Faulty human vision and attentiveness is one reason why it is expected that self-driving cars and trucks will replace human driving.
Anyway, weapons development has a life of its own. If a significant weapon can be made, it pays to be the first to have it, or at least not be too far behind.
Source: youtube · 2012-11-24T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzLgHGb32Jea5OFKqF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz7amYdQiGwJljo7894AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx9bcUTIcCBDEsZBRJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz7BqMhhGpcp_rNO8x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9Ggwv4XWRx4nlf0d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzoWwuDpN-21OfQ0U94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzxLhw3d4Cfjm14qZd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxYTAGDrDX2UaHFL414AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw_MFN3II2ieBv1g3l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxiAhLWl3p99-PEriV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
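The raw response above is a JSON array of per-comment records, one object per coded comment. A minimal sketch of how such a batch could be parsed and checked is below; the allowed values per dimension are assumptions inferred only from the values visible on this page (the full codebook may define more categories), and the record IDs in the usage example are hypothetical.

```python
import json

# Allowed values per dimension, inferred from the samples on this page
# (an assumption; the actual codebook may permit additional categories).
SCHEMA = {
    "responsibility": {"none", "user", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "approval", "resignation"},
}

def validate_batch(raw_response: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw_response)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip malformed entries
        # Every dimension must be present and hold an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Hypothetical records: the second uses an unknown policy value and is dropped.
raw = '''[
 {"id":"ytc_example1","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
 {"id":"ytc_example2","responsibility":"none","reasoning":"unclear","policy":"bogus","emotion":"outrage"}
]'''
print(len(validate_batch(raw)))  # 1
```

A check like this guards against the model emitting labels outside the coding scheme before the results are written to the table shown above.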