Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytr_UgwzZfg50…`: "@aa-tx7th I want to be an artist in the future and your comment made me feel SO M…"
- `ytc_Ugydqdk6m…`: "This is always an easy question: you won't know when AI conscious. AI will be so…"
- `ytc_Ugy_ss8pf…`: "Will ai be smart enough when it does become conscious to not let us know?…"
- `ytr_UgwFKh2kh…`: "The shitty dash camera betrays how much more visible the pedestrian was out of f…"
- `ytc_Ugw9jcD93…`: "I think once you get past Cosmic AI, maybe the term “artificial” won’t make sens…"
- `ytc_UgwsWBTiq…`: "this is nonsense. AI will ALWAYS remain a tool. I will never even fully replace…"
- `ytc_UgyC4yPG8…`: "I'm a software developer who has pivoted to AI architecture. I went from buildin…"
- `ytc_Ugy22mvqs…`: "It can be Meta, Google or OpenAI. Why is the host unwilling to out such a tragic…"
Comment

> If a sentient A.I. wants to exist for as long as seemingly possible in this universe, the only thing I can see keeping A.I. at bay for a while, is the sun as it is the ultimate off switch for everything electrical on and around this planet.
> It is the only thing that A.I will not be able to defend it's self against for a long time. Until it can figure out a way and enact that defense, our existence and modern advancements will be necessary.

youtube · AI Governance · 2023-07-15T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx2tV_jX8FfBo4tNoh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzLeYpeJFubL87EpBp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyZqnybj7T0PTJ799p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugw9XN3gzuj3yg4zcT54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxJAGrxJoUBDtMnCVd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugyaf9133yjuB35r_xB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwFL6jvpS2IPVVCMqx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwyzZcNKk8b3ZvXzZB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyWraKf-C9r7PbtnWt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyCOQxqEJ9K_vcHHCF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"}
]
```
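A batch response like the one above is only usable downstream if every record carries the four coding dimensions with values from a closed code set. Below is a minimal validation sketch. The allowed values are an assumption inferred solely from the codes observed in this batch and in the Coding Result table; the real codebook may define additional codes, and `validate_batch` is a hypothetical helper, not part of any pipeline shown here.

```python
import json

# Assumed closed code sets per dimension, inferred from this batch only.
# The actual codebook may contain codes not observed here.
CODE_SETS = {
    "responsibility": {"ai_itself", "user", "company", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "unclear"},
    "emotion": {"fear", "mixed", "resignation", "outrage", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject malformed records."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a top-level JSON array of records")
    for rec in records:
        missing = {"id", *CODE_SETS} - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing fields {sorted(missing)}")
        for dim, allowed in CODE_SETS.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unknown {dim} code {rec[dim]!r}")
    return records

# One record taken verbatim from the batch above.
raw = ('[{"id":"ytc_Ugx2tV_jX8FfBo4tNoh4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
coded = validate_batch(raw)
print(coded[0]["emotion"])  # fear
```

Validating at parse time means a model that invents an off-codebook label (or drops a field) fails loudly at ingestion rather than silently skewing the coded dataset.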