Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Why do you love capitalism so much? People do not need to find purpose through w…" (ytc_UgwjDGNk_…)
- "@helo12252 Factories/Cars ruin our planet far more then AI. You probably ruined …" (ytr_Ugx8iD5vQ…)
- "This is a risk of AI but so insignificant compared to larger concerns. Take AI d…" (ytc_UgyjJq-NC…)
- "I think very soon people will get bored and tired of reading AI-generated conten…" (ytc_UgzvxMTMc…)
- "Mr. Sanders, I am from a European country. Can you send me a gun and 5000 dollar…" (ytc_Ugxg1-CAw…)
- "The way AI-proponents talk about artists, and talk TO artists is absolutely vile…" (ytc_UgxBh_Ewx…)
- "As a robot programmer, this will never happen, an arm Robot like this will alway…" (ytc_UgxGvg65e…)
- "It’s hard to separate the very real and urgent concerns from the global politico…" (rdc_gtcvyrq)
Comment
Great series. One small correction on this one though (and hopefully I'm not repeating anyone's previous comment): Searle's Chinese Room thought experiment is not intended to show that robots cannot achieve Strong AI. It merely shows that syntax alone can't get the job done. Searle has said in interviews and lectures that there is no reason why an artificial brain wouldn't be able, theoretically, to manifest Strong AI, except for the fact that we don't know how to build one that uses semantics over syntax. The distinction between syntax and semantics is a central theme of Searle's and I won't go over it here. Suffice it to say that Searle believes that to have a Mind one needs semantics, be that in our brains, or in Harry's.
youtube
2016-08-09T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgjlGx8FR-EgZHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugi-CMHZ6z1IiHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugi7CjBupUbtHngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggnfU6yPgq2B3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggU9g4favmQ-3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgihvWXlqNA6T3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgjV46XtY-kr1ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggFxLep9Z31AXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggloAGB5WNOMngCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghR_DYsydJIdHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
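A minimal sketch of how a raw batch response like the one above might be parsed and validated before the codes are stored. The allowed values per dimension are inferred from the samples shown on this page; the actual codebook may define additional categories, and `parse_coding_response` is a hypothetical helper, not part of any pipeline shown here.

```python
import json

# Allowed values per coding dimension, inferred from the samples above
# (assumption: the real codebook may include more categories).
SCHEMA = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"mixed", "consequentialist", "deontological", "unclear"},
    "policy": {"unclear", "liability"},
    "emotion": {"indifference", "approval", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and validate each coded comment."""
    records = json.loads(raw)  # raises json.JSONDecodeError on malformed output
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing comment id: {rec}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records

# Validate the first record from the response above.
raw = ('[{"id":"ytc_UgjlGx8FR-EgZHgCoAEC","responsibility":"none",'
       '"reasoning":"mixed","policy":"unclear","emotion":"indifference"}]')
records = parse_coding_response(raw)
print(records[0]["emotion"])  # prints: indifference
```

Validating against a fixed value set catches the common failure mode where the model invents an off-schema label; such records can then be flagged for re-coding instead of silently polluting the results.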