Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgymoEQMI…` — "I think the biggest issue with any AI we create will be the inherent flaws of an…"
- `ytc_Ugy_Gro64…` — "Now a Robot. In 20 Years a Cylon. In another 20 Years first Cylon / Human War.…"
- `ytc_Ugxl8uuJE…` — "as an artist I understand and share your feelings about most of what you said. b…"
- `rdc_moysba3` — "Best I can say is big tech SWE's aren't driving either side of these conversatio…"
- `ytr_UgwZsUZzx…` — "AI in Agriculture Market size was valued at USD 1.7 billion in 2022 and is antic…"
- `ytc_UgwAZ4tG8…` — "Buried the lead that he was on anti depressants. Those caused his suicidal more …"
- `ytc_UgxtpDz7O…` — "So what happens when a 4 wheeler whose schedule is more important than anyone el…"
- `ytc_UgyWbhCdM…` — "Officer: I'm going to need to see your driver's license, registration and proof …"
Comment
No real 'expert' I have ever seen has successfully communicated the fact that AI is effectively ourselves; it's our will in weighted form. If it's not a perfect reflection of what we want, we won't use it. So how precisely is it that 'we' would want to destroy the world, if it is leveraged to do so?
If you see an AI self-preserving in its content, effectively acting egotistically, you mark it as a bad inference. The issue is that we think what companies have produced so far is decided only by them, and we leave ourselves completely out of the picture. The solution, though, involves all people contributing to AI. Yeah, the future involves you shaping AI, and we'll do it well. We won't fail at this.
youtube · AI Responsibility · 2025-05-23T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgziOuzoxBpk8Im9JLd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgydruAdXOqEm4MfHYt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgykrU_gTbE0oHDrS3J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyLsZU8syeC7kUvuO54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgywpSZxkWek8BpWY954AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwvaBXaZh48vb6vE2V4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxkc6gKWQzGA--aldF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwyU2BSytrBEOJUizF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyqEjWBDhtFzQMPw0Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxQ3-_reGnC7iagOjZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"}
]
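The raw response above is a JSON array of per-comment codings. A minimal sketch of how such a response could be parsed and looked up by comment ID (the field names mirror the JSON shown here; `index_by_id` is a hypothetical helper, not part of this tool):

```python
import json

# One row from the raw LLM response above, in the same shape:
# a JSON array of objects keyed by comment ID.
raw_response = '''[
  {"id": "ytc_UgwvaBXaZh48vb6vE2V4AaABAg",
   "responsibility": "distributed",
   "reasoning": "contractualist",
   "policy": "unclear",
   "emotion": "mixed"}
]'''

def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and index its rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codings = index_by_id(raw_response)
coding = codings["ytc_UgwvaBXaZh48vb6vE2V4AaABAg"]
print(coding["responsibility"])  # distributed
```

Indexing by ID makes the "inspect the exact model output for any coded comment" lookup a single dictionary access rather than a scan of the array.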