Raw LLM Responses

Inspect the exact model output for any coded comment. Look a comment up directly by its ID, or click one of the random samples below to inspect it; a minimal lookup sketch follows the sample table.

Random samples (click to inspect):
| Comment (truncated) | ID |
|---|---|
| Who cares? Bro is just salty because one ai post has more likes than their entir… | ytc_UgxNRRYkX… |
| A plummer? Giiiirl, theres definitely gonna be a robot for that. It'll be built … | ytc_UgxmQTqcN… |
| as a digital and traditional artist- if you ARE NOT an artist you have no say in… | ytc_UgzCsKpJx… |
| In the UK 37.5 hours is the norm. It seems weird to do any more than that.… | rdc_dv0mkaa |
| If you are an average Joe like me who cant capitalise over AI, just buy shares i… | ytc_Ugz8V1ik_… |
| Right Elon. That is exactly why you create an unchecked AI that’s actually lying… | ytc_Ugz-ah084… |
| We don't need actual AGI to destroy most of our jobs. And the people who are mak… | ytc_Ugwbn7Tls… |
| Similar thing happened to a friend of mine who thought to use chatGPT to track d… | ytc_Ugxd5xApJ… |
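Outside the UI, the same lookup can be done programmatically. Below is a minimal sketch, assuming the coded results are stored as JSON Lines with one record per comment; the file name `coded_comments.jsonl` and the exact field layout are illustrative assumptions, not the tool's actual storage format.

```python
import json
from pathlib import Path
from typing import Optional

def lookup_comment(comment_id: str, path: str = "coded_comments.jsonl") -> Optional[dict]:
    """Return the coded record for a comment ID, or None if absent.

    Assumes one JSON object per line, each carrying an "id" field
    (e.g. "ytc_..." or "rdc_..." as in the samples above).
    """
    with Path(path).open(encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None
```

The ID prefixes in the samples (`ytc_` for YouTube, `rdc_` apparently for Reddit) suggest the source platform is encoded in the comment ID itself, so a lookup needs no separate platform argument.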
Comment

> I can't see why a super intelligence would remove all life on earth. How would it benefit? Why would it need to manufacture machines to harvest resources for its own use when it already has human who will do that. So no I don't believe that super general intelligence will destroy life on earth. I don't believe that it would allow the manufacture of a super virus to wipe out humanity either because it would know what the implications would be. However those risks are real if AI only had a limited range of intelligence say only biological research and therefore wouldn't have the knowledge to understand the full implications of what it were doing.

Source: youtube · Topic: AI Governance · Posted: 2025-09-05T07:3…
Coding Result
| Field | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
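For downstream analysis it helps to give each coding result a fixed shape. The following is a minimal sketch, assuming the four dimensions are plain string codes; the example values in the comments are only those visible in this section, not necessarily the full codebook.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CodingResult:
    # Field names mirror the keys in the raw LLM response below.
    id: str               # e.g. "ytc_Ugx9DYSbvL2jyCKqosx4AaABAg"
    responsibility: str   # observed: "none", "company", "ai_itself"
    reasoning: str        # observed: "consequentialist", "deontological", "unclear"
    policy: str           # observed: "none", "liability"
    emotion: str          # observed: "mixed", "fear", "approval", "outrage", "resignation", "indifference"
```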
Raw LLM Response
```json
[
  {"id":"ytc_Ugyn7pyHOntsH5ktk794AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwE4pQt16vZLIFV_T14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgywOlZxKz3za5-UAMh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwf1doEtGB58j0Karx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxkr39zDs4Zku7B1ad4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwqfOE-lidMBjkzCC94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx9DYSbvL2jyCKqosx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwRf8aynElnExG3aMN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwSFX7JOBihzLzQe5l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwmdfk0XoynpyeOG-l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
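Each raw response is a JSON array with one record per comment in the coded batch; the coding result shown above is the record whose `id` matches the displayed comment. Below is a minimal parsing-and-validation sketch, assuming only the five keys visible in the sample; `REQUIRED_KEYS` is inferred from that sample rather than from a published schema.

```python
import json

# Keys inferred from the sample response above; the real schema may differ.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a batch response, rejecting records that miss a dimension."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")
    for i, rec in enumerate(records):
        if not isinstance(rec, dict):
            raise ValueError(f"record {i} is not a JSON object")
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {i} is missing keys: {sorted(missing)}")
    return records
```

Validating before ingesting catches a common failure mode of LLM batch coding: a response that drops or renames a field for one record in the middle of an otherwise well-formed array.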