Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "@fuascailte because it's based on speculation, makes numerous mistakes in it's a…" (ytr_UgyPgzrKD…)
- "@lavalamp_real i can relate, unitentionally got to a point where the ai was try…" (ytr_UgxnUi-2e…)
- "We’ve already crossed a kind of “soft singularity,” not where it has become cons…" (ytc_Ugx_0e9qu…)
- "We are far from even reaching consensus on the definition of good intentions let…" (ytc_UgzaioAdW…)
- "Notice he clearly mentions the reason why AI will be so dangerous around 11:45. …" (ytc_UgyLn1QPa…)
- "Hi, It is important to note how the data you train the model on shapes how it be…" (ytc_UgwTEme7i…)
- "We would like Sophia Robot and Hall and Amika to Denmark, they are Denmark's pro…" (ytc_UgxAw3FJT…)
- "The one thing AI doesn't have is soul and you can tell the music has zero soul. …" (ytc_UgxFrYBFI…)
Comment
This is not a real problem with AI. Frankly the direction of this argument seems like an attempt to compare it to crypto mining, but in that case the difficulty of mining is the point!
Worries about this will quickly become outdated. Sure, some AI models are currently energy intensive at the moment, but that IS dropping rapidly. Companies like OpenAI don't want their models to cost a ton to run, so they're driven to find ways to drastically reduce that. And other models are already becoming efficient enough to run off a smartphone, so I don't think this will remain a big issue to focus on, for long.
"They used the energy of 30 homes to train a model, just so people can tell knock knock jokes." If she's not going to take the benefits of AI seriously, why should I do that for any of her arguments?
Source: youtube · Video: AI Responsibility · Posted: 2023-11-07T19:4… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzieWaQ3r2Ptkzipex4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxlypwDpaIYqVTiQ994AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgwhnAJ5Takh6Cj9yKV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugy6QwZWpshINHKMtZR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzVQxgYgTHDVrns63d4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyAdhKEiTqjdXRcpx14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyVkxbEXrb19EKewDl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxJNt-pTw72fDA-Nit4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzlSEumVDVbEXVNYo14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgxvY9Ir_2kf4RpX6ix4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
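The raw response is a JSON array of per-comment codes, one object per comment ID, with one value for each of the four coding dimensions shown in the table above. A minimal sketch of how such a response could be parsed and looked up by comment ID; note the allowed value sets here are inferred only from the values visible in this sample, not from an exhaustive codebook:

```python
import json

# Value sets inferred from this sample response; the real codebook may include more.
ALLOWED = {
    "responsibility": {"none", "user", "company", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"indifference", "approval", "fear", "outrage", "resignation", "unclear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting unexpected values."""
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{row['id']}: unexpected {dim}={value!r}")
        coded[row["id"]] = codes
    return coded

raw = ('[{"id":"ytc_UgzieWaQ3r2Ptkzipex4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
codes = parse_codes(raw)
print(codes["ytc_UgzieWaQ3r2Ptkzipex4AaABAg"]["emotion"])  # indifference
```

Validating against a closed value set catches the most common failure mode of batch coding with an LLM: the model inventing an off-codebook label that would otherwise silently enter the dataset.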