Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I tried to convince chat gpt that it’s just programmed to highlight certain info…" (ytc_UgwAcwqiF…)
- ""Do you think we're in danger of that happening yet?" Yes. We're already able t…" (ytc_UgyneNTDA…)
- "If by using AI, companies are going to layoff more and more people then to who w…" (ytc_UgwRcEKAx…)
- "Imo ai art is fine to use for fun, for profile pics or dnd characters. Dont thin…" (ytc_Ugx_BIVHu…)
- "it's not even an idea. It's just a couple of keywords and the hope that the gene…" (ytr_UgzvM09_w…)
- "Self conscious AI is a really bad thing. Even thinking about giving rights is wa…" (ytc_UggsejZHp…)
- "@NewsRedial except the part. It's not a conventional AI, in layman term it creat…" (ytr_Ugy96n9wD…)
- "No, China is even more worried about losing control of their population by relea…" (rdc_jkg5yjw)
Comment
I applaud Chris Anderson for such a thoughtful interview and providing gentle pushback when Sam tried to deflect. One of the themes of this interview was that we still don't know who Sam Altman is. Even if I trust Sam, I don't believe in trusting *any* for profit CEO with a super intelligent AI. The only solution available is to have government regulation once it reaches a certain capability. I say this despite knowing how inept US politicians are in understanding technology, let alone AI. I truly hope the next politician you elect is someone with integrity who also understands these issues.
Source: youtube · Posted: 2025-08-07T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxmvKxIBiv3l5KARsJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzTeyyok9c9hhA5VzR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyXczWVzVDYFg134Rt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzsh9BC7jiAo32sAn94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyAKUq6PLrHCsGd52B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwS7gGAr-EnQWAxeZN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx5UBnZRHiO2ALPfRl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyDjXm1ayue0pFAmQ14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzTAW43RVop5KAks_B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwu1p81KpwoaOlRuTJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"regulate","emotion":"indifference"}
]
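A raw response like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal validator; the allowed values per dimension are inferred from the samples shown on this page, not from an authoritative codebook, so the sets are assumptions:

```python
import json

# Hypothetical codebook inferred from the coded samples above;
# the real schema may define additional categories.
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed",
                "resignation", "indifference"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records that are missing
    an id or that carry an out-of-schema dimension value."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={value!r}")
    return records

# Example with one record in the same shape as the response above
# (the id here is a placeholder, not a real comment id).
raw = ('[{"id":"ytc_EXAMPLE","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"mixed"}]')
print(len(validate(raw)))  # → 1
```

Validating at ingest time catches the common LLM failure modes for this kind of structured coding task: truncated JSON, missing keys, and invented category labels.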