Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "For each art just post a thousand pictures of jschlatt so all ai is schlatt…" (ytc_UgwN60GKw…)
- "50:27 thank you for making the effort in trying to save humanity! It’s all very …" (ytc_UgwIhl_7H…)
- "Which is funny bc ai can't tell gorillas and blacks apart 😅😅 Edit: to the dude w…" (ytc_UgzgOmYyk…)
- "For 1 AI trying to build a chemical weapons, how many AIs are trying to prevent …" (ytc_Ugw5TkRYj…)
- "It has manually teached targets the code can be optimized to speed it up its an …" (ytr_UgzEQz7EA…)
- "Omg i just realised , we are watching ' ai observing ai observing ai art '…" (ytc_Ugxo_1Phk…)
- "as a beginner don’t use any ai tools like copilot, you will be reliant on ai the…" (ytc_UgxpSUvte…)
- "I don't have this "artistic ability" they talk so much but I'd enjoy my crappy s…" (ytc_Ugx7ei36Z…)
Comment
The a.i. may become infallible in the next 25 or 30 years.. the human drivers that drive along side will not. There is no A.I. that is capable of predicting the human.
Humans cut-off tuck drivers all the time. Have roadside accidents all the time.
A robot will not be able to account for those scenarios caused by human driver errors. Unless human drivers are eliminated completely, the need for human truck drivers will always outweigh the use if A.I.
A.I. is barely usuable for airplanes and trains.😮
youtube · AI Jobs · 2025-06-28T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw7jEzpB66IRhLRu5Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwxq3Iqg8WIphyKS0l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxh-LbijIYhM3Ks3bF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyU-meDZCbFaSyPRyR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyZF7J7upCpQhydzOV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxmLulhhPhXCVz4Wpt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxl-zq3JvQyDtMKoVV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxVzPXA54nFqtqUtfN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxmcYfgjtb2neKn_4d4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx1xZWgeFH8cNPepKB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
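The raw response is a JSON array with one object per comment, each carrying the four coding dimensions. A minimal sketch of how such a response could be parsed and validated into rows, assuming the allowed value sets are those visible in the sample above (the actual codebook may contain more labels), with any out-of-vocabulary value mapped to `"unclear"`:

```python
import json

# Allowed labels per dimension: an assumption inferred from the values
# that appear in the sample response, not the full codebook.
DIMENSIONS = {
    "responsibility": {"user", "company", "government", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "approval", "resignation", "indifference", "mixed", "fear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response (JSON array) into validated coding rows."""
    rows = []
    for item in json.loads(raw):
        row = {"id": item["id"]}
        for dim, allowed in DIMENSIONS.items():
            value = item.get(dim, "unclear")
            # Fall back to "unclear" for any value outside the codebook.
            row[dim] = value if value in allowed else "unclear"
        rows.append(row)
    return rows

# Hypothetical single-item response for illustration:
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"unclear",'
       '"emotion":"indifference"}]')
print(parse_codes(raw)[0]["emotion"])  # indifference
```

Validating against a fixed label set catches the common failure mode where the model invents a new category mid-batch; collapsing those to `"unclear"` keeps downstream counts consistent.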