Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Would the worry then not be a conscious AI who has full control of their anxiety…" (ytc_Ugw1NbVtm…)
- "Passed one of those today. Heading west on I-10 east of Wilcox Arizona. There wa…" (ytc_UgzPBmJn6…)
- "In the past, automation helped PEOPLE do jobs. AI will do the jobs instead. Yo…" (ytc_Ugwaq4syn…)
- "when ai takes over this guy will be the number one target, chat gpt gotta be so …" (ytc_UgxvOPsRx…)
- "AI for detecting cancer cells: Good, love it <3 / AI for profit: BAD, ugly, and i…" (ytc_UgwZQjd1D…)
- "This is my periodic reminder that I'd love to see you in conversation with Jay G…" (ytc_UgxvTX3SV…)
- "And OF COURSE, the issue of the energy that will be required to run all this fab…" (ytc_UgzFXmJ_N…)
- "this person is 100% right. some time ago i also believed you can LEARN how to dr…" (ytc_Ugw29SECn…)
Comment
> The umbrella risk of any technological advancement can be reflected by the example of giving a great tool to a toddler. Most likely will harm themselves and/or others.
> Our society, culture and in general our way of being, thinking and existing is primitive, is a toddler compared to the capacity that these tools have.
> If we don't advance our structures they won't be able to utilise in a helpful way. Even more in a way that would allow us to harvest its true power and change/restore life on earth
> If you would give a phone to a neadhertal they would try to use it either to break or kill sth. We are the neadhertal and AI is the smartphone
Source: youtube · Video: AI Governance · Posted: 2025-06-25T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy1P_s64nxNuoQlO6N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzvAt9XKA8-kcQCe1d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxZVd91N5xtdPErOz14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz-U-9wKe-l4qHZQud4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgywDRPC6DBfiIdzho54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwfBFqQe2sV-q1kva94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw8Ps-fTu_wUQm45Tl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwtVXv97glMJNcRvWt4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzRU_E1nTltAUCqBz94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgxBEu_-7h0G9GXjwY94AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
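The raw response is a JSON array with one coding record per comment, each keyed by its comment ID and carrying the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and looked up by ID — the function and variable names here are illustrative, not the dashboard's actual code:

```python
import json

# A trimmed example of a raw batch response, using one record from above.
raw_response = """[
  {"id": "ytc_UgwfBFqQe2sV-q1kva94AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw batch response and key each coding record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgwfBFqQe2sV-q1kva94AaABAg"]["policy"])  # → regulate
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: one parse of the batch, then constant-time access to any record.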