Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples
I'd just trace tbh. BECAUSE ITS AI NO ONE WILL CARE IF YOU TRACE AI…
ytc_UgxZ5ffwk…
Yeah, but is that to scare u to not using it until he comes out with his own AI …
ytr_UgyJ5xE40…
Homework is worthless, I just did it on the bus. It's just to keep you busy.…
ytc_Ugyo7c-8i…
After wading around in the discourse about AI (on Reddit, mistake 1) just trying…
ytc_UgxcUElk_…
The truth is Perlmutter is a crook working with WB and other big corporations to…
ytc_UgzQvX1tp…
Anger and passion is the very thing that makes up a human soul, saying that the …
ytc_Ugxf3lCTe…
It's very clear that The Ai lacks something fundamentally human. The renders in …
ytc_Ugxvv7ckJ…
They should make driverless trucks illegal?? Why do this? There's no reason to …
ytc_UgzjJnfh-…
Comment
Avery eye opening interesting talk. Itbegins to sound like the greatest polluter of our planet is AI. I never gave a thought to the power generated that was required to respond to my questions that realy didnt need answering. Add together the millions all over the word keying into AI and I suspect the biggest polluter of our panet is not the diesel oils or gases, but computing plus its use of AI. We need a new exhaust system added to AI . Remember you pay your electricity bills and AI cant work without it. And can you imagine. We have gormless politicians forcing us to adopt to using eectric cars which use power in their computing systems their diesel models dont have. We should control computing and conserve energy by rationing computing time and go back to using our combustion engines 100%.
youtube · AI Responsibility · 2024-01-15T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
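A coded record like the one above can be sanity-checked against the category sets that appear in this batch. This is a minimal sketch: `CODEBOOK` lists only the values observed here, and the real codebook may define more categories.

```python
# Allowed values per dimension, as observed in this batch of codings
# (hypothetical constant; the full codebook may allow additional values).
CODEBOOK = {
    "responsibility": {"ai_itself", "user", "developer", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed"},
}

def validate(coding):
    """Return (dimension, value) pairs that fall outside the codebook."""
    return [(dim, coding.get(dim)) for dim in CODEBOOK
            if coding.get(dim) not in CODEBOOK[dim]]

# The coding shown in the table above:
example = {"responsibility": "user", "reasoning": "consequentialist",
           "policy": "regulate", "emotion": "mixed"}
print(validate(example))  # → [] (no violations)
```

An empty list means every dimension carries a known value; anything else flags a coding the model may have hallucinated.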
Raw LLM Response
[
{"id":"ytc_Ugz3A6ytKSz9G6DkOep4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzp5p8JxwS6W1mEiRV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwj7g0cSUP2cs9wAdF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyS27CgNA2aonADmHR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw7rk7Ek_VRyrUvMd94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwhvO6biC75dSTKgbV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzdTny0kxeCm2M7P4B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwODINtuoYxnSTgfqF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzzpOM-Qo0Fzi5HGbd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgydKJ2kryjY35qMFwF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
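Since the raw response is a JSON array keyed by comment ID, looking up the coding for a given comment is a small parse-and-index step. A minimal sketch, assuming the response parses cleanly; the array here reproduces only two of the entries above:

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings.
raw_response = """
[
 {"id":"ytc_Ugw7rk7Ek_VRyrUvMd94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
 {"id":"ytc_UgwhvO6biC75dSTKgbV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
"""

def index_by_id(response_text):
    """Parse a raw response and index the codings by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_id(raw_response)
coded = codings["ytc_Ugw7rk7Ek_VRyrUvMd94AaABAg"]
print(coded["responsibility"], coded["policy"])  # → user regulate
```

In practice the parse step would also need to handle malformed model output (e.g. wrap `json.loads` in a `try`/`except json.JSONDecodeError` and requeue the batch), since the model is not guaranteed to emit valid JSON.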