Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- ytc_UgwOd2nID…: "depression, suicidal thoughts and easy access to a gun is what killed this man. …"
- ytc_Ugyw4wFTN…: "I think if it wasn't for the destruction nuclear weapons has on the land much of…"
- rdc_jklwmb1: "Well you literally said you had “coucil” so im guessing it’s because of that and…"
- ytc_UgyleBsgD…: "It's a complicated argument. I've been on both sides and I've chosen AI in the e…"
- rdc_mtoawyx: ">Did you regret joining google? Nope. I think I had to learn what this kind …"
- ytc_UgzB8Hfwy…: "This is all very interesting. Now, please do a show on how AI is applied in hiri…"
- ytc_UgzCGEa3b…: "Crashes are mostly to do with finance. The underlying demand for goods and servi…"
- ytc_Ugy4D0CX6…: "❌ Argue with ChatGPT, built by transhumanists, by stealing and pirating copyrigh…"
Comment
3:10 That graphic is very misleading. While it is clear, that the training of ai models can be very computationally intensive, you can’t just claim that the training for one model leads to x amount of emmisions. Models like the GPT models or Dall-E sure did lead to a lot of emmisions, but there are several applications, where the training of the ai produces orders of magnitude less CO2. The size of the model and the duration of the training matter! Also you have to consider, that after the training, so during usage, the production of emmisions are way lower. Still there can be a lot of trainings, because most of the time there are be a lot of iterations of the ai models.
Source: youtube · Video: Viral AI Reaction · Posted: 2024-10-26T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxg4j8RInMM3RCU6Ll4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyvaBdEgJD50rN0UgF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzxbirGWlDL2LG0wUp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy0BCgsc4jPpWPRC3x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyyxHAOC9zu0GXATax4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw-lh8eplHS5sVLuKd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxmemNCwHRAbCnvpVF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzNKJT-PltS3_auM8B4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy5-WNSZyRvehNjOoN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxkDmABty6Y7s0AbPd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
```
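The coding result shown above is one row extracted from a batch response like this one. As a minimal sketch of how such a response could be parsed and checked, assuming each record carries exactly the five keys seen here (the allowed values below are inferred only from the codes that appear in this dump, not from the project's actual codebook):

```python
import json

# Allowed values per dimension, inferred from this dump (assumption:
# the real codebook may define more categories than appear here).
CODEBOOK = {
    "responsibility": {"none", "ai_itself", "developer", "company", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "indifference", "mixed", "outrage", "resignation"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM batch response (a JSON array of coded comments)
    into a dict keyed by comment ID, rejecting unknown code values."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in CODEBOOK}
    return coded

# Example with two records copied from the response above.
raw = '''[
  {"id":"ytc_Ugxg4j8RInMM3RCU6Ll4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzxbirGWlDL2LG0wUp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]'''
coded = parse_response(raw)
```

Keying the parsed records by comment ID makes the "look up a coded comment" inspection shown on this page a plain dictionary access, e.g. `coded["ytc_UgzxbirGWlDL2LG0wUp4AaABAg"]["emotion"]`.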