Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "and you wonder why robots are smarter then you here the reason why once you crea…" (ytc_UgzhW7oCq…)
- "Self driving cars are not a reliable thing in the current road network. USA regu…" (ytc_UgwmTHJDi…)
- "What if robots like Sophia expand beyond normal cognition to think of "purpose" …" (ytc_UgzumTmB3…)
- "Is most all the AI you use for free, the data that is gathered is that sold?…" (ytc_UgxhiXOFX…)
- "AI won't wipe us out for one simple reason, we are so stupid and easily manipula…" (ytc_UgyEc4Q3t…)
- "It's not real, because the shadow of the robot on the right doesn't fall on left…" (ytc_UgzNvcMi5…)
- "Imagine if an AI data set contained 5 billion books right? Man, to have to use p…" (ytc_UgyJMudCv…)
- "@thewannabecritic7490 Im a dark skin american. People who look like me arent com…" (ytr_UgzNylQIj…)
Comment
I am not a brain surgeon. But! Correct me if I am wrong, a computer system using transistors that require electricity, using software we humans (with our limited understanding of how our brains actually work?) are going to outsmart us and thus control our outcome? Sorry for the run-on sentence. Just pull the damn plug! No power, no working transistors. Thus, no AI. I am fortunate I was taught by a grandfather from the greatest generation.
Source: youtube · 2024-06-17T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyY_8W2NHA3-iLHw3R4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwfuLTgVUoQaHVETpd4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgycSLpuMuZzZmCN7p94AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugypybs6otRb8oPRQV54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzZzYpW2Th7hoRYqIh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugw2LjoLgsnb2Lzizz14AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_Ugw-9tnkRZiyZm8hsSF4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugwgz--wnqVXh4vrpB14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugx2YKBxnZGTBPpuXhl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwx2Zz34hKV4v-oNXt4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"}
]
```
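A minimal sketch of how a downstream consumer might parse and sanity-check such a batch response before storing it. The allowed category values below are inferred only from the codes visible on this page, not from the actual codebook, and `validate_batch` is a hypothetical helper name:

```python
import json

# Allowed values per dimension, inferred from the coded samples shown
# above (assumption: the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "government", "developer", "ai_itself", "company", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check every record against the schema."""
    records = json.loads(raw)
    for rec in records:
        # IDs on this page start with ytc_ (top-level comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

# One record taken verbatim from the raw response above.
raw = ('[{"id":"ytc_UgzZzYpW2Th7hoRYqIh4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]')
coded = validate_batch(raw)
print(coded[0]["policy"])  # -> ban
```

Validating against a fixed value set catches the common failure mode of the model inventing an off-schema label, so bad records fail loudly instead of silently entering the coded dataset.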