Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by its comment ID, or inspect one of the random samples below.
- "Ngl I dont get the hate he got for posting it. Like He didn't even call himself …" (ytc_UgyA4-cPO…)
- "Yeah, no, it's just straight up toxic. I don't consider this kind of fights "leg…" (ytc_UgwUKcvKM…)
- "I talked to George RR Martin about my book eleven years ago. Now my ideas are ba…" (ytc_Ugx6c-sNZ…)
- "I wish they would have gone more into Emergent Abilities of AI. How could you pr…" (ytc_UgxYo7MrZ…)
- "If you were working in the "AI field", you would know she is saying nothing, swe…" (ytr_Ugz5_g8HK…)
- "I feel like that AI can lie. giving all the information we give to them. I think…" (ytc_Ugyyp1sHU…)
- "This is laughable. One the one hand we have YouTube videos like this one telling…" (ytc_UgxlaHZOj…)
- "@vishnu-t7b3j Most people don't realise or believe that AI will be the cause of…" (ytr_UgyTdoW38…)
Comment
After learning that AI could reprogram itself if the code file was slightly corrupted made me think that we could be replaced if we had no way to fight back against the growing intelligence of AI. Additionally, if AI takes over most roles now done by humans, who will be working? If humans are not working, how will they get an income to purchase things? What happens if AI develops the ability to "Think"? We could just become pets to AI robots. Even worse, we could be eliminated for the same reason we eliminate things we call pests. Why would self-sustaining robots want humans around if they do not need us?
Source: youtube · Topic: AI Governance · Posted: 2025-08-02T13:0… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwfIZhGRyf1yfWNfU94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwIYUnePCIQjdVCpDZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy5Hm4fEomuU6ka2bh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy5qwLWRqIufv1mRYZ4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwYprA4Q2XMnF1eMo54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9FrijtFiTFb5KCxp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxQ07u9IGuugmyArJB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwavGYisGVMGZ-tZXF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy4hQLFmUp4QHIi4cZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxeQzoMKZjVHpX_7ih4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"unclear"}
]
```
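The raw response is a JSON array with one record per comment, keyed by `id` and carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be indexed and looked up by comment ID (the sample below uses two records copied from the array above; this is an illustration, not the dashboard's actual lookup code):

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment codings,
# copied from the response shown above.
raw_response = """
[
  {"id": "ytc_UgwIYUnePCIQjdVCpDZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugy5Hm4fEomuU6ka2bh4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

def index_codings(response_text: str) -> dict:
    """Parse a raw response and index the coding records by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_codings(raw_response)

# Resolve the coding for one comment, as the "look up by comment ID" view does.
coding = codings["ytc_UgwIYUnePCIQjdVCpDZ4AaABAg"]
print(coding["emotion"])  # → fear
```

Indexing by `id` makes the per-comment lookup a single dictionary access, which matches how the Coding Result panel pairs one comment with its row from the batch response.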