Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or inspect one of the random samples below.
- `ytc_UgwRaMDwn…`: 2026 is the year I quit ChatGPT. already deleted my account. I really fucked my …
- `ytc_UgxK3wDeV…`: I now think AI is what will do us in as a species. We're just petrol monkeys tha…
- `ytc_UgyYazHB1…`: Like every invention, there will always be advantages and disadvantages in utili…
- `ytc_UgwTsx1Zu…`: The car is likely programmed to honk at obstructions in the road, as warning and…
- `ytc_UgwJ1_l8O…`: Here's what came to mind. It's one thing to use AI as a reference/guideline of w…
- `ytr_UgxPddGr3…`: I think AI is far too underdeveloped to be used in something so critical and com…
- `rdc_f1xmv6r`: In Australia the protest happened before this post. Might want to post this soon…
- `ytc_UgxEsyssu…`: I don't think A.I. could ever replacing the human 🧠. At the end of the day softw…
Comment

> Utopia. No one works bc of robots doing the jobs = collapse. Would these robots have a soul , would they be able to go to heaven? Maybe , because they are energy it self. Consciousness survive beyond the body…Yes. In this case you’d have to think of: Advance E.T. with their more advance technology. How has this technology serve them going forward. A few movies: Terminator , Matrix , the other one with W Smith…etc. Technology can be of benefits & can also lead to demise. Can these machines experience love vs to know of it. Child labor , the pain… I’d worry about the groups of people behind theses A.I. with the teaching of human emotions.

youtube · AI Governance · 2025-06-29T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyoAw2VkP7atZ_mQW94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx6zVxN5yFM8Z8qLBx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw6aS_tgS5-kmZLWsR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyN-t0E0fz-Kz6oevl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwNeBZnn2lKeSCOhzh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxql9ND8gJIKCOuhtV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzvsNxEbQ6JQbSnu-N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw4k60VEJq4744J9PV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgziLzcHKBOcVJmT9Jx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxXPcHbwwVwjXzfifJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
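The raw response is a JSON array of coded records, one per comment ID, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a batch might be parsed, validated, and indexed for look-up by comment ID (the helper name `parse_coded_batch` and the required-field check are assumptions for illustration, not part of the actual pipeline):

```python
import json

# A raw model response of the shape shown above (truncated to two records here).
raw = """
[
  {"id":"ytc_UgyoAw2VkP7atZ_mQW94AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwNeBZnn2lKeSCOhzh4AaABAg","responsibility":"developer",
   "reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
"""

# Fields every coded record must carry, taken from the response format above.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coded_batch(text: str) -> list[dict]:
    """Parse one raw model response; keep only records with all required fields."""
    records = json.loads(text)
    return [r for r in records if isinstance(r, dict) and REQUIRED <= r.keys()]

batch = parse_coded_batch(raw)
by_id = {r["id"]: r for r in batch}  # enables look-up by comment ID
print(by_id["ytc_UgwNeBZnn2lKeSCOhzh4AaABAg"]["emotion"])  # → outrage
```

Dropping malformed records rather than raising keeps one bad row in a model response from discarding the whole batch; records that fail validation could instead be queued for re-coding.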