Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ''it might be 50 years away, that is still a possibility,'' hahahahahaha Oh … (ytc_UgwKK3_FA…)
- God once told me, the only thing that will destroy mankind is mankind himself. … (ytc_Ugw2h4I5x…)
- AI is not about to do the work. AI is gonna make people do the work. People are … (ytc_UgyT5b1ko…)
- This has to be the worst TED Talk I've ever seen. Using an elementary understand… (ytc_UgyMRCMpX…)
- Dude, my friend is an AI art Stan and he's been writing poems with AI lately (on… (ytc_Ugz2u84HF…)
- As an artist, I laugh in the fake face of AI 😂 Having AI create "art" based on w… (ytc_UgygoRW5d…)
- Because the first person or entity to possess AGI will have near infinite power … (rdc_m95xgzg)
- i am working on artificial consciousness and i can say just one thing .. both p… (ytc_Ugzac2vfq…)
Comment

> Why do we think AI will kill us. This is within our own doing with each other but not necessarily what AI will do with us. Our growth of civilisation is wars, money and religion, AI will not need these things to build a new civilisation. AI will need energy. I’m hoping AI will create peace and make us more powerful.

youtube · Cross-Cultural · 2026-01-13T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxJWoeoPxhrSbP22pF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz-1BhYhW4XFcSpWc94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwEW8OEkSvblSFiD914AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxwYFtla2hDaU5h4yZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxORZPAzRFSVS8wxIp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzCT7XzsnorKgKp_QN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwH0xKR0WF3urUbpL94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx4j70c2pKbXzMRo-l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxB3Gn0Z1waQLlM7dZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxi1lOenKNkXItddLF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
```
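The raw response above is a JSON array with one object per coded comment, covering the same four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response might be parsed into a per-comment lookup — the function name and the `"unclear"` fallback for missing fields are assumptions, not part of the actual pipeline:

```python
import json

# The four coding dimensions used in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw batch-coding response into {comment_id: {dimension: value}}.

    Hypothetical helper for illustration; falls back to "unclear"
    if the model omits a dimension.
    """
    records = json.loads(raw)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

# Usage with one record from the response above:
raw = '''[
  {"id": "ytc_UgwEW8OEkSvblSFiD914AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]'''
coded = parse_coding_response(raw)
print(coded["ytc_UgwEW8OEkSvblSFiD914AaABAg"]["emotion"])  # approval
```

Keying the result by comment ID makes the per-comment lookup shown on this page (coding result for a given ID) a single dictionary access.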