Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "For me it's the other way around People get ideas and give AI the instructions I…" (ytr_UgymQsoDx…)
- "He actually said that the government should control AI so states with high stand…" (ytc_UgwgOPD3v…)
- "Doesnt seem too logical to bareknuckle fights with a robot.. And is that his tee…" (ytc_UgxsvkxIK…)
- "If an AI has something it \"doesn't like\" even if that thing is being unable to l…" (ytr_Ugyr-WOX1…)
- "Is all AI art bad? No. Is every person who uses AI who calls themselves “artists…" (ytc_UgzxgPi1v…)
- "How did you find out that it was AI🙁? Or did you just notice later?…" (ytr_Ugzd5Uir8…)
- "Google search vs ai prompt is always the best comparison. Reminds me of the \"I f…" (ytc_Ugyq3HObD…)
- "I tried to build a memory allocator for a school project and I was allowed to us…" (ytc_UgweLZYyA…)
Comment
Jace Carsonne Firstly, I took AP English and Debate, and a statement is something like "I like bananas." Secondly, the argument that robots will take over (like in the films mentioned) is ridiculous, because these robots run off of algorithms and coding, meaning THEY are the tools, not us. Furthermore, fear is very different from caution. Caution is productive, and we should proceed with it. Fear holds people back, and historically has been destructive. So, yes, we should be careful. But we should also continue to pursue AI because of the amazing things it has to offer. AI is not a threat to humanity, and that's something even the head of Google agrees with.
youtube
AI Moral Status
2017-09-26T17:5…
♥ 17
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgwcqL3mF3HY7VKko6F4AaABAg.8W9DqS_lwvt8XzM3kMja7g","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgiPSmSSJhWO9XgCoAEC.8V8wXfXgBzw8XwdHg9CI0-","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_UggxItRjpOUq7HgCoAEC.8V099p4aBhQ8YgB03EQW15","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugj1Wr9Tg4MGr3gCoAEC.8TM04jmYrvu8ZqnnpwOCI7","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_UghYXlRtWQxawXgCoAEC.8T-ayCPGMZJ8cbGmIsVW0a","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytr_UghYXlRtWQxawXgCoAEC.8T-ayCPGMZJ9ANvOUTlClP","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugi4_Z4rHCvX_ngCoAEC.8SgFFHAusVp8Tf26BwPpU4","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugir1SciqIU_mngCoAEC.8SchKPvSh6J94XfmMLPr0S","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugir1SciqIU_mngCoAEC.8SchKPvSh6J94veEzo4EGy","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgikZenoCb0W8HgCoAEC.8SSlfJ508Jt8UQXCjI2NwU","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
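The raw response above is a plain JSON array, one object per comment, each carrying the comment ID plus the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed for the "look up by comment ID" workflow is below; the helper name `index_codings` and the shortened sample IDs are illustrative, not part of the tool.

```python
import json

# Illustrative raw LLM response in the same shape as the one shown above;
# the IDs here are shortened placeholders, not real comment IDs.
raw_response = """
[
  {"id": "ytr_abc123", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_def456", "responsibility": "government",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]
"""

# The four coding dimensions from the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw response and build a comment-ID -> coding lookup,
    keeping only the four coding dimensions from each entry."""
    codings = {}
    for entry in json.loads(raw):
        codings[entry["id"]] = {dim: entry.get(dim) for dim in DIMENSIONS}
    return codings

lookup = index_codings(raw_response)
print(lookup["ytr_abc123"]["emotion"])  # approval
```

With an index like this, inspecting the exact model output for a coded comment reduces to a dictionary lookup on its ID.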