Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
AI art is why me and a lot of artists quit publishing or even drawing in the fir…
ytc_UgyUR8502…
So in scenarios that users write lyrics and have a.i. create a song from their l…
ytc_UgzJdeuC4…
“might not look as perfect” there were no flaws, i think it’s better than the ai…
ytc_UgyA62lF7…
I'm not for driverless trucks, but click bait, you did not show you following a …
ytc_UgzOYSuIC…
mark would literally copy the database as database_copy1 and delete the original…
rdc_hj3f4jy
I think it is a little late, now we need to plan for containment. you know that …
ytc_UgwyvZFgD…
people get hit and die all the time, if it wasn't self driving no one would repo…
ytc_UgwVQR4Gt…
This understanding of gpt-5 being way more powerful than gpt-3 and 4 is not corr…
ytr_Ugy9JnZEG…
Comment
Aaron Bastani's first point is excellent. It is important to disconnect the two problems: the dubious, and at the very least very far off, risk of "super-intelligence", and the real current impacts of LLMs on the climate, the economy, and the gutting of many sectors to be replaced with models which will in the long run be devastating to the output quality of software and service industries.
youtube
2026-02-12T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxkuX2BXpiHTKsOtJZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx0xzOokFhOIbtl-OJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyxvOWwCmK3JhyzwkZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzBRkB8fH9Y6L0wZT14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxke1ex27baFrKJj7R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz7vzYxKKg85-9MdzN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwQApf9165F5XnEdPN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxgY-diDkBZ3GEtUCR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy2h2ajSjFopO3RBxx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzIrHin3cr4wqXSjqV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
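The raw response is a JSON array of per-comment codings keyed by comment ID. A minimal sketch of how such a payload could be parsed and validated before storage; the allowed values below are inferred from the samples shown here, and the real coding schema may include values not seen in this batch:

```python
import json

# Allowed values per dimension, inferred from the visible samples
# (hypothetical; the actual codebook may define more categories).
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none"},
    "emotion": {"approval", "outrage", "fear", "indifference"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: {dimension: value}},
    rejecting any value outside the expected sets."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        dims = {k: v for k, v in rec.items() if k != "id"}
        for dim, value in dims.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = dims
    return coded

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = parse_codings(raw)
print(coded["ytc_example"]["emotion"])  # fear
```

Validating against a fixed value set at ingest time is one way to catch coding drift (e.g. the model inventing a new emotion label) before it reaches the results table.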