Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples — click to inspect
- “He still win tbf, verification + 22m view? For a cheap ai video? That a money ri…” (ytc_UgwJ8X7Rm…)
- “This is all nonsense. They’re trying to get you people to believe. If there’s ev…” (ytc_UgxGs69Fe…)
- “You are missing the point of why it is dangerous. It's not the AI being evil, bu…” (ytr_Ugw6xOWQv…)
- “AI won’t be delivering packages outside of big cities, even then they would most…” (ytc_Ugxq_36dl…)
- “well i think it not bad have ai murder mashine but not art stealer those one we …” (ytc_Ugzo1uSG7…)
- “This is a very revealing take. All art (from painting to music to whatever) take…” (ytr_Ugy9Sf_y0…)
- “this is why I will NEVER accept driverless cars. Safer than human drivers? NOT…” (ytc_Ugz_DTmdq…)
- “We appreciate your observation! Sophia's design aims to combine human-like featu…” (ytr_Ugy1YOL7l…)
Comment
I think the biggest problem is that you have read so much of the BS and optimistic crap about AI that you think its actually intelligent. It's not. Language models are just emulators. They don't actually understand anything they literally just generate statistical answers from everything they have read on the internet... and they have ALREADY scrubbed the entire internet...
If they actually understood anything there would be no hallucinations / no kids committing suicide because of AI.
In the future maybe... but what they are doing now isn't even close now to AI and they are just hyping it to the point of insanity.
At best it's a tool to sift through large amounts of data.
Platform: youtube
Posted: 2026-04-26T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzVkF_NrLhkrEUVVVJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzLowtyS4K1gwcVkiR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxJFDUpw1XGigpJ7mN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxRXlHNABr7L1lv52p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyeLGPbuS6GBEUl8NR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzMBadBsMgyzTfMzXh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxlfB3eUxRSUIynTop4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxFI5kcBLGmjQUDv5N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugxfzkj5jcopo-Urf754AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzY4WLqaKcu4xBIIfJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
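A lookup like the one this page performs can be sketched in a few lines: parse the batch JSON returned by the model and index the rows by comment ID. This is a minimal sketch, assuming the response format shown above; the two IDs used below are copied from the sample response, and `raw_response` is a stand-in for however the stored output is actually retrieved.

```python
import json

# Stand-in for a stored batch response from the coding model,
# using two rows copied from the sample output above.
raw_response = """[
  {"id": "ytc_UgzY4WLqaKcu4xBIIfJ4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxFI5kcBLGmjQUDv5N4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

# Index the coded rows by comment ID for constant-time lookup.
coded = {row["id"]: row for row in json.loads(raw_response)}

# Look up one coded comment by its ID, as the "Look up by comment ID"
# box does, and read off its coded dimensions.
row = coded["ytc_UgzY4WLqaKcu4xBIIfJ4AaABAg"]
print(row["emotion"])  # outrage
```

Keying on `id` also makes it easy to detect comments the model skipped or duplicated in a batch: compare the dict's keys against the set of IDs that were sent.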