Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
- @Sheree Hardin Even if they do, what point? His point is basically this: "When h… (`ytr_UgwwUh3zu…`)
- "Penrose sees the limits—but he doesn’t see what’s beyond them. Consciousness wo… (`ytc_UgzN2hpbT…`)
- @steven - that section around 1hour 30 min in, where you describe the types of r… (`ytc_UgyozvqfW…`)
- Big taxes on companies that use robots/AI to replace human jobs, use the tax pro… (`ytc_Ugz_dUIXb…`)
- Surely, a moralistic, ethical code should be installed from the very outset crea… (`ytc_UgwrakQna…`)
- Funnily enough everything in this video is just prediction. Right now AI is stil… (`ytr_Ugznzo4b1…`)
- What if you use a different AI or if you ri is basically just hiring someone els… (`ytc_UgwyD1j7p…`)
- Yeah … that’s what we DO … imagine you feed ai with “wrong stuff” …say OBSCURANT… (`ytc_UgwZ7hxaU…`)
Comment
Makes one wonder how many times AI will fail in the other direction, falsely tagging an actual threat as just some regular guy or sneaking some tiny detail into a police report that an attorney can use to get serious charges thrown out.
Jury: It says here the suspect is accused of beating an infant to death with two _other_ infants, effectively dual-wielding babies with the intent to kill, at 2:00 P.M. on January 4th, but the AI police report says he was out of town on the 4th. We have security camera footage from eight different angles of the suspect holding up a sign bearing the date, time and a detailed confession before hammering an infant into a bloody pulp with two entirely separate infants, but if the AI says he wasn't there then that sounds like reasonable doubt to me. I mean, this technology _is_ pretty cool.
Source: youtube · 2026-01-08T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwwVQJGS71x9S6q4jt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwhwfeTQ0E4nov2CPh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxDO8-dg8asl1YTfu54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwBWHgRRniIcakNOZZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyRxQwT-cKv1JzipLx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzefDgQKmeq0JIGOzl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyNY6hjSqOUloY1Q9V4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwzyWKGUQerFg-GJjd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxwTdbWPvKK6b86MmR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxJDZE_nOlmgqJ7KAB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
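The raw response above is a JSON array of coding records, one per comment, each keyed by comment ID with the four dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated before storing it, assuming a schema built only from the category values visible in this sample (the real codebook may allow more values):

```python
import json

# Allowed values observed in the sample response above; the actual
# codebook likely defines more categories (assumption).
SCHEMA = {
    "responsibility": {"ai_itself", "company", "government", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "mixed", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records that have an id
    and in-schema values for every dimension, so malformed model output
    never reaches the coded-comment store."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Hypothetical two-record response: the second uses an out-of-schema value.
sample = (
    '[{"id":"ytc_x","responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"none","emotion":"fear"},'
    '{"id":"ytc_y","responsibility":"martians","reasoning":"consequentialist",'
    '"policy":"none","emotion":"fear"}]'
)
print(len(validate_codings(sample)))  # 1: the out-of-schema record is dropped
```

Dropping (rather than repairing) invalid records keeps the lookup-by-ID view honest: any comment missing from the coded set is one the model failed to code cleanly.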