Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- ytc_UgxbUyDs1…: No, I will never allow anybody to control AI other than musk. Zuckerbeg, I broug…
- ytc_UgxJNf81-…: If only the regulatory powers be would mandate an "light" or (emitter that emit…
- ytc_UgzOweAse…: By then the dooms day will happen,when AI will control and killing humans ,will …
- rdc_na3mi51: You're misreading him. He's saying that companies should be hiring juniors for l…
- ytc_UgyzT_RIz…: I have a tradition of occasionally going and rewatching certain videos. And one …
- ytc_UgxhcsWkU…: Did you forget the deep fakes used by Modi and Indian gov during lockdown? And t…
- ytc_UgyUKgSrc…: AI will do a lot of harm before it can be harnessed properly to do anything subs…
- ytr_Ugy04J7PU…: @TheJmac82 I'm also a libertarian and I wanted hardware to train local LLM's to…
Comment

> The logistics of training really show us that we're nowhere **near** human-level intelligence.
>
> Humans train **at runtime** within a mass budget of 5 lbs, a volume budget of a quart and a half, and a power budget of (if I recall correctly) something like 25 watts. Interactive training is a vital part of human intelligence, and I've not heard of any AI that can do that yet, and a single toddler outperforms every AI in existence on power/performance ratio for training.

Source: youtube · AI Moral Status · 2025-10-30T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyQQvsM0pZ2Pcc6XyJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwTd3mmuuFbwFhDz2R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyU6LkjfEy6yE7vsLd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz7__ixBdzXWW0i3wl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxA85KcJuD5T0fNzWx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyLmMZaehelGsHIHO54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzp_SS048QJdJ2zu8h4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwhKU69kTjEZEGjyO54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz7kVAmMghtd4GIvqV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx5FQXrM4GiUFcw07V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
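Each raw response is a JSON array of per-comment records coding the four dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of parsing such a response and indexing it for lookup by comment ID — the `index_by_id` helper is hypothetical, not part of the actual pipeline, and the sample is truncated to three records for brevity:

```python
import json

# Truncated sample of a raw LLM response, as shown above.
raw_response = """
[
 {"id":"ytc_UgyQQvsM0pZ2Pcc6XyJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyU6LkjfEy6yE7vsLd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgxA85KcJuD5T0fNzWx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
"""

# The four coding dimensions seen in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw):
    """Parse the model's JSON array and key each coding record by comment ID."""
    records = json.loads(raw)
    out = {}
    for rec in records:
        # Default any missing dimension to "none", the table's null value.
        out[rec["id"]] = {dim: rec.get(dim, "none") for dim in DIMENSIONS}
    return out

codes = index_by_id(raw_response)
print(codes["ytc_UgxA85KcJuD5T0fNzWx4AaABAg"]["emotion"])  # indifference
```

With the records keyed by ID, the "Look up by comment ID" view reduces to a single dictionary access.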