Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugxwk8zL5… — "I think fits the AI shulde to be free and the sec all the artist should to be co…"
- ytr_Ugx1fpoqP… — "Someone was saying Phub already has rules against advertising deepfake sites so …"
- ytc_UgwJhy75h… — "I am a artist and I hate AI I can't believe people think it can destroy us It li…"
- ytc_UgybL8Qm2… — "I wonder whether Perimatter was as concerned with Meta & other competing AI that…"
- ytc_UgwY3c9_a… — "There is always a limit of something and now we are going beyond the limit, huma…"
- ytc_UgzBJWwRy… — "Shame on her, boricua or borracha narcoterrorist, corrupta bartender ladrona etc…"
- ytc_Ugy1OGy3_… — "An A.I. expert stated that 50 percent of jobs will be gone by 2035. Ghost in th…"
- ytc_UgwAYqgLH… — "No one is asking who will make, maintain and fix the robots. AI will still need…"
Comment
Any form of Ai that we have in 2026 is working off human data bases, human inputs. Human logarithms. There will never be Ai at least in the next 10 years doing what all these Ai specialists predict they will be doing. They will all be working off human input logarithms and commands. There will be no age of ultron nonsense. The worst that it can go is hostile countries using Ai generated videos or hacking for nuclear codes. Ultron is never going to appear and say howdy. I dare you message me in the next 5 years if that happens
Source: youtube · Posted: 2026-03-10T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy6dAF9BK48TTdweAF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz6MpOt1ka2WKKAnSl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxmcMyls9A5bxTBzdt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxsJdUF7jKB7154jSV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxvwi87PnCLRCgD42F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyRs1FZY_U41Fk33SR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugz-jezX5UPCV9OL3xV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy07-HpimgAN1Vemnd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwP70oEKtQepzmsoI94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyUk8z3Eer3CEt3jVd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
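The raw response is a JSON array of per-comment codes across the four dimensions shown in the Coding Result table. A minimal sketch of parsing and validating such a response is below; the `ALLOWED` value sets are an assumption inferred from the samples on this page, not the project's actual codebook, and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from the samples above (assumption:
# the real codebook may permit additional values).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "unclear"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "indifference", "outrage", "approval",
                "resignation", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of per-comment codes)
    into a mapping from comment ID to its coded dimensions, rejecting any
    value outside the allowed sets."""
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{comment_id}: bad {dim!r} value {value!r}")
        coded[comment_id] = codes
    return coded

raw = ('[{"id":"ytc_Ugy6dAF9BK48TTdweAF4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
print(parse_coding_response(raw)["ytc_Ugy6dAF9BK48TTdweAF4AaABAg"]["emotion"])  # fear
```

Validating against closed value sets at parse time catches the common failure mode where the model invents a label outside the coding scheme, so malformed codes fail loudly instead of silently entering the dataset.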