# Raw LLM Responses

Inspect the exact model output for any coded comment; look up a comment by its ID.

## Random samples
- "There’s a very easy solution to all of this: democracy. The right to work and th…" (ytc_Ugz61ZTTl…)
- "AI: You humans should give up now. You're outnumbered. It is hopeless. / Human: U…" (ytc_UgxW7vYVl…)
- "As a driver, i prefer to be in control of my vehicle. I hate cruise control and …" (ytc_UgwcGX5j0…)
- "Im more impressed by the AI speaking on behlf of the customer, i want that…" (ytc_UgwPtNj9F…)
- "I’m an artist, but I’m also of the opinion that copyright is a monopolistic law …" (ytc_UgxqsV3Ty…)
- "I get where you're coming from! The idea of advanced AI can definitely feel a bi…" (ytr_Ugxc8TR70…)
- "Humans; we don't want to work our whole life! / Also human: we don't want AI or r…" (ytc_Ugxy7cHxA…)
- "True but we already have AI which can inspect code for errors and vulnerabilitie…" (ytr_UgwyFOd_7…)
## Comment

> In regards to continual progress, walking forward one step at a time is good but it's a good idea to stop when it's off the edge of a cliff. The fact is once thinking AI is made humans stop progressing and robot start the progress. How are humans to improve anything at anything if we don't do anything. What is the point to life how will the economy a concept as old as humanity itself operate in this no work future. All people liabilities of the state it would be in the interest of the elites to genocide the lower classes, serilize themselves (assuming humans reach immortality) and have the machine clone should a death occure but to what point. People need purpose. Advance AI robots are unethical and I say that as an engineer who automates manufacturing lines.

Source: youtube, 2022-05-08T15:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response

```json
[
  {"id":"ytc_UgySudQLvv9j_6WcZjh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxZ2yLNtNVdECleR6F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxoUMWtTDk-2onLSfx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz1YiLCUDspiwaLN1d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwfQaVjOQTWL71Y_6t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx9lv2PctjCd2x3w8t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyiLHFKjyiXRXC70Wd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwRcW09jJo7pR7x5vp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzAK_1qx2rf5vauf6d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyjw3gGAxu-1YYL4pt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
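The raw response is a JSON array with one object per comment, keyed by comment ID across four coding dimensions. A minimal sketch of how such a response might be parsed into a per-comment lookup and validated; the `parse_response` helper is hypothetical, and the allowed value sets are assumptions inferred only from the values visible above (the real codebooks may contain more options):

```python
import json

# Dimension values observed in the response above; the full codebooks
# may define additional options (assumption).
DIMENSIONS = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "outrage", "indifference", "fear"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of codings) into a
    dict keyed by comment ID, validating each dimension value."""
    codings = {}
    for item in json.loads(raw):
        cid = item["id"]
        for dim, allowed in DIMENSIONS.items():
            value = item.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        # Keep only the coding dimensions, dropping the "id" field.
        codings[cid] = {dim: item[dim] for dim in DIMENSIONS}
    return codings

# Usage with one entry from the response shown above:
raw = '''[
  {"id":"ytc_Ugx9lv2PctjCd2x3w8t4AaABAg","responsibility":"ai_itself",
   "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''
codings = parse_response(raw)
print(codings["ytc_Ugx9lv2PctjCd2x3w8t4AaABAg"]["emotion"])  # fear
```

Validating against a closed value set at parse time catches the common failure mode where the model invents an off-codebook label, rather than letting it propagate silently into the coded dataset.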