Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
When LISP machines became capable of running on purpose-built coprocessor cards the size of modern graphics cards back around 1985, I thought that for sure the military was going to get the bright idea to install them in missiles and give them enough autonomous reasoning capability to do what we're talking about today. Now, even if the capabilities of 80s-era LISP machines weren't up to the task, the idea of LAWS has got to be at least forty years old, and it boggles my mind that the DoD didn't put more effort into this idea back then.
Source: youtube
Posted: 2024-06-30T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |

Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_UgzxCbxuXWUJhrsNqfh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzgqLtQK0O0DmPYdx94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw1aCNv_FoIevkv7dl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzAV70__PRqJhqUIM14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyRq0iMi7R8RwiRkol4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzBfNnJAbOo9kot-Bt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzFT5MOQ_2KuftEBLt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy9BdJCKhbu16V96v14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzAg7w-Ema6FaE5OG14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwCPbK-1BoUO5KgUYV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
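A response like the one above can be parsed into per-comment codings with a few lines of Python. This is a minimal sketch: the function name `parse_codings` is illustrative, and the allowed category sets are inferred only from the values visible in this sample, so the real codebook may include additional categories.

```python
import json

# Allowed values per dimension — inferred from the sample response above;
# the actual codebook may define more categories than appear here.
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "approval", "outrage", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records) into
    {comment_id: coding}, rejecting values outside the expected sets."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record from the response shown above.
raw = ('[{"id":"ytc_UgzxCbxuXWUJhrsNqfh4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
codings = parse_codings(raw)
print(codings["ytc_UgzxCbxuXWUJhrsNqfh4AaABAg"]["emotion"])  # indifference
```

Validating against a fixed category set at parse time catches the occasional off-schema value an LLM emits before it silently contaminates downstream counts.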