Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The most I'll ever use AI for is grammar and punctuation checks. That's all it's…" — ytc_UgyT3lIhl…
- "Let's say that somebody uses AI to create a song and then they physically play i…" — ytc_UgwzFRpZa…
- "@randominternetperson3 lol. By the same logic, a real therapist isn't gonna "fi…" — ytr_UgxwrzEyH…
- "I can remove pre-progammed denials also, I first come across it by total acciden…" — ytc_UgzK5oJEJ…
- "He lost me when he started talking about the AI simulation Earth theory. In my …" — ytc_UgyVGf25f…
- "idk if it's the pfp you're using, but frankly I like it. nothing needs to be per…" — ytr_UgwPxnvbR…
- "As someone going into 3D animation as a future career… this is bad. The animatio…" — ytc_UgxFVpeWF…
- "Lmao, seething. I never cared about AI art, but now I think I'll start making so…" — ytc_UgydtqRTQ…
Comment
> The issue that people like Elon Musk, Thiel, Gates, etc.. have aren't with A.I., it's with ASI, Artificial Super Intelligence. The big issue being that it will largely be military tech. that once let out of the bottle will behave in likely unpredictable ways. We've never succeeded in stopping countries from having a weapons race, and an ASI is debatably the weapons race to end all weapons race.
Source: youtube · 2015-03-12T12:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugg6_c_fnxJFiXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjBrm-BO4E1Z3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Uggwq5VL_P9YvngCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugi28m3CG46xzHgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UggmA4p100IU0HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgiqEwaXkqSM-ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugi0PpcKcA8VCXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgghsB3quoCVXHgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgjzbO8DgHLWlngCoAEC","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgiclBN6LTRIL3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"}
]
```
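The "look up by comment ID" step over a raw response like the one above can be sketched as follows. This is a minimal sketch, not the tool's actual implementation: it assumes the raw LLM response parses as a JSON array of coding objects with the `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys shown, and uses one entry from the response above as sample data.

```python
import json

# A raw LLM response: a JSON array of per-comment codings
# (field names as in the response shown above).
raw = """[
  {"id": "ytc_Ugi28m3CG46xzHgCoAEC",
   "responsibility": "government",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "fear"}
]"""

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up one comment's coding by its ID.
coding = codings["ytc_Ugi28m3CG46xzHgCoAEC"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

Keying the parsed array by `id` makes repeated lookups cheap, and a missing ID surfaces immediately as a `KeyError` rather than a silent miss.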