Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples — click to inspect
- 13:20 "it could be awful" — No sir, it WILL be awful. There is no safe platform to… (ytc_Ugx37uxFs…)
- some one watched the "The Terminator"??? Like where the fuck do you think AI w… (rdc_kk3fd7f)
- @AsianDadEnergy I see you are, of course, correct. "Chain of thought" is an AI-… (ytr_UgyelLU9a…)
- The problem I see and with any emergent technology that threatens a skill, talen… (ytc_UgyOGlc9x…)
- Fine, maybe talent doesn't go there@foxesarefalling. But I still think it could… (ytr_Ugxv_z9dx…)
- @subculture_star I volunteer to improve ai art, all those marxist artists do is… (ytr_Ugzd7I2Qo…)
- But unlike Teslas in self-driving mode, they didn't crash and burst into flame, … (ytc_UgwH9nQqU…)
- The thing is that there are those in society who would gladly take us there, or … (rdc_ktu4qhi)
Comment (youtube · AI Governance · 2024-02-27T02:1…)

> Already see that in interactors and creators of Ameca. Being ok with being rude to a robot, on video, just because it's believed to not yet be sentient doesn't seem like a good idea to me...
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugw0-Ijj-uhoI_XU2kZ4AaABAg.9xi3KIOXgpIA0Ilfjfvfbe","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugw8xpbzs34lqwyUB4N4AaABAg.9xV31fclxyyASXvnz5UoYs","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgwwU1Vv8AbrPJmj44d4AaABAg.9xSuyb6sSxa9xxs-2VWMez","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwwU1Vv8AbrPJmj44d4AaABAg.9xSuyb6sSxa9yZQMKQmbZv","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgwwU1Vv8AbrPJmj44d4AaABAg.9xSuyb6sSxaA0IrzMEPBNz","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"disapproval"},
  {"id":"ytr_UgxjUJdy-cwgABssFyJ4AaABAg.9xHmp52KLllA0ad530XCjW","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxlZFt48vN0azs9C8l4AaABAg.9wuBs01TZNm9wuC-PKWdm4","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugz1EaM88bYiQJACaDl4AaABAg.9wu2NUyoV8k9wwWIYwAUTT","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgzoGjXyTSmb9neYwAZ4AaABAg.9wpsQX0eQNL9wrndBoxpuu","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgzoGjXyTSmb9neYwAZ4AaABAg.9wpsQX0eQNL9xhCJI3yOIt","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
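The ID-based lookup described above can be sketched in a few lines of Python — a minimal example, not the tool's actual implementation. It assumes the raw model response is a JSON array of coding records with an `id` field, as in the response shown; only two of the ten records are reproduced here for brevity.

```python
import json

# Raw LLM response: a JSON array of per-comment coding records.
# IDs and values are copied from the response shown above.
raw = '''
[
  {"id": "ytr_Ugw0-Ijj-uhoI_XU2kZ4AaABAg.9xi3KIOXgpIA0Ilfjfvfbe",
   "responsibility": "none", "reasoning": "mixed",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgzoGjXyTSmb9neYwAZ4AaABAg.9wpsQX0eQNL9wrndBoxpuu",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "industry_self", "emotion": "approval"}
]
'''

# Index the records by comment ID for constant-time lookup.
records = {rec["id"]: rec for rec in json.loads(raw)}

def lookup(comment_id):
    """Return the coding record for a comment ID, or None if it was not coded."""
    return records.get(comment_id)

coding = lookup("ytr_Ugw0-Ijj-uhoI_XU2kZ4AaABAg.9xi3KIOXgpIA0Ilfjfvfbe")
print(coding["emotion"])  # prints "resignation"
```

Building the dict once and calling `.get()` keeps each lookup O(1) and returns `None` for IDs that never appeared in a model response, rather than raising an error.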