Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I don't mind ai art for a quick Xbox live avatar or whatever, but goddamn try lo…" — ytc_Ugxx_Z98u…
- "Elon doesn't care if people die the same as he didn't care who he hurt with what…" — ytc_UgyVqfV4s…
- "You don't really know what you are talking about. Google "AI emergent propertie…" — ytr_Ugy1yE9ul…
- "Using AI to create images takes no creativity or effort; anyone using it can ask…" — ytr_Ugze54W5M…
- "I have an uzi holster attached to my door so I just unwind the window and shoot …" — rdc_ebwej33
- "Not s fair fight. Should've wore gloves that have magnets to hold the robots fir…" — ytc_UgyYmxLug…
- "i reallly appreciate this video. though i am a super hard hater of AI, lol... y…" — ytc_Ugyf2FMcd…
- "And learning how to create ACTUALLY good AI art that matches exactly what you wa…" — ytr_UgxDib2Nv…
Comment

> I applaud this man patience with you. Idk if you were just playing devil's advocate for the sake of conversation or if you really just don't get it at all but damn was this a rough listen at times 😅 your comparisons to fear mongering cars and silly stuff like that is irrelevant to a self teaching autonomous agent. Saying people wouldn't rush to trust AI is a lie ppl ran to let a tesla drive it around. The general public doesnt actually push back on AI at all. Humans have misused/misstepped with EVERY piece technology. We will here too. The only question is how much damage will we cause? You don't seem to understand that the problem is the exponential growth That comes when ai is released to the public.
>
> Edit: just got to the devils advocate part thank god 😅

Source: youtube · Posted: 2024-06-29T04:4… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzVMpoQTwl77oyyzK94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyVTzGqDVa6Gocdp_N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwTmXflsrZvOqsydQd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxqBv7kY4-LnkdKFu94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz2XUIFHC_UVxXR1GZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz8ctBEM7ir0D9WzlV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxXtSwI8t76z5xC7jJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugz7u0pBS5mp3_3BOZJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx8HzK8h1vc-HEe2Ul4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzAJQUv7UPQjENtRep4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
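The raw response is a batch: one JSON array covering many comments, so displaying the coding for a single comment means locating its row by ID. A minimal sketch of that lookup, assuming the array schema shown above (`coding_for` is a hypothetical helper, and the inline `raw` string is trimmed to one row for brevity):

```python
import json

# A one-row excerpt of the batch response shown above; a real response
# carries one object per comment in the batch.
raw = """[
  {"id": "ytc_UgyVTzGqDVa6Gocdp_N4AaABAg",
   "responsibility": "user", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "outrage"}
]"""

def coding_for(raw_response, comment_id):
    """Return the coding row for one comment ID, or None if the model
    skipped that comment in its batch output."""
    for row in json.loads(raw_response):
        if row["id"] == comment_id:
            return row
    return None

row = coding_for(raw, "ytc_UgyVTzGqDVa6Gocdp_N4AaABAg")
print(row["emotion"])  # → outrage
```

Returning `None` for a missing ID, rather than raising, makes it easy to flag comments the model silently dropped from a batch.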