Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- I'm gonna say D. A robot cannot develop. Human emotion. To answer this question… (ytc_UgxlYEa2L…)
- I'm not buying it. Even the best paid-for LLMs are absolute shite. They might co… (ytc_UgwAL7Yy0…)
- Imagine striving for decades to create a machine that can think like a human bei… (ytc_UgyQXL7Kh…)
- It seems the destiny of humankind is either "Idiocracy" or "Futurama". As far a… (ytc_Ugjo_2qmw…)
- I'm just now finding out that there could be deep fakes of me at homecoming 😨 wt… (ytc_UgzWfFoqZ…)
- As someone who commissions art. I would rather buy from a real human who had to … (ytc_Ugz5_xg6X…)
- This was an explanation of whether it shares 'truths' or merely things it feels … (ytc_UgxFi2AD3…)
- In my experience with all sorts of AI chatbots, the 'reasoning' feature is usele… (ytc_UgzuGab3d…)
Comment
I believe there is one thing that most people are missing, if everyone loses their job to AI the world shuts down. Businesses may have all AI and automation, but there will be nobody to buy their product, so this system would not last long at all. 99% unemployment equals economic collapse
youtube · AI Governance · 2025-09-12T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxgsa-aDi3R5NxLAEd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzfUEJXyVUuTSf00Ol4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy71SvJoYSVhiOty-R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwim4mwfbHcMedPKTx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyD2QqEg5Rn9OSTN4t4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzzceo7t__UoSHqXId4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwFJND1_r0PVJUeIKh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyYcBauVrdLgN4qvMt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzph7RJ7dTLp-6dt614AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzRRZhUPqeX0q2Vb5t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
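A raw response like the one above is a JSON array with one record per comment ID and the four coding dimensions. The following is a minimal sketch of how such a response might be parsed and validated before the values reach the coding-result view; the `parse_coding_response` function name is hypothetical, and the allowed value sets are assumptions inferred only from the values visible in this sample (the real codebook may define others).

```python
import json

# Allowed values per dimension, inferred from the sample output above
# (assumption: the actual codebook may allow additional values).
VOCAB = {
    "responsibility": {"company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "indifference",
                "resignation", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, dropping invalid records."""
    coded = {}
    for record in json.loads(raw):
        cid = record.get("id", "")
        if not cid.startswith("ytc_"):
            continue  # skip records without a recognisable comment ID
        # Keep the record only if every dimension has an allowed value.
        if all(record.get(dim) in vocab for dim, vocab in VOCAB.items()):
            coded[cid] = {dim: record[dim] for dim in VOCAB}
    return coded

# Hypothetical record for illustration:
raw = ('[{"id":"ytc_example1","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_coding_response(raw)["ytc_example1"]["policy"])  # regulate
```

Validating against a fixed vocabulary like this is what makes an "unclear" fallback value useful: the model can always emit a legal value instead of free text that would fail the check.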