Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> The fact that Google has a policy against creating sentient AI suggests that they believe it's not only possible but likely

Source: youtube · Video: AI Moral Status · Posted: 2022-06-30T10:5… · ♥ 14
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T19:39:26.816318 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzmiwdVhcYM0buJeXJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwWMaXgYWnAy4PbuZ54AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy9ha9tNQLWPgU_FQ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwM7XcidVzjoAqMZvZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxVQUP9gOmJqHqhx254AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
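Since the raw LLM response is a JSON array keyed by comment ID, looking up the coding for a given comment is a matter of parsing the array and indexing by `id`. A minimal sketch, assuming the response format shown above (the two rows here are hypothetical placeholders, not real comment IDs):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, one object per
# comment, in the same shape as the batch output shown above.
raw = """[
  {"id": "ytc_example1", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_example2", "responsibility": "company",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]"""

# Index the codings by comment ID so any coded comment can be inspected.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up one comment's coding result by its ID.
coding = codings["ytc_example2"]
print(coding["responsibility"], coding["emotion"])  # company fear
```

In a real pipeline the `raw` string would come from the stored model output for a batch, and a missing ID (a comment the model silently dropped) would surface as a `KeyError`, which is worth catching and logging.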