Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
OpenAI and the like are all too aware that this is as much a "Please stop so we can catch up" plea from those already lagging behind, as it is a plea for caution. OpenAI already thinks they are taking all necessary precautions and are unlikely to be moved to lay off their researchers and cancel their contracts with server hosts, or otherwise continue to pay them while they do nothing.
I for one don't think for a second that the Chinese Communist Party (for instance) isn't working hell bent, full steam ahead, to develop decidedly unfriendly AI designed to control their society. And I am quite certain that once they've got it, they won't hesitate for a second to use it to try to control the rest of us too. Shall we in the West pause to give them time to catch up? They have a billion people from which to recruit programmer talent and that pool is 5 points higher in IQ than we are on average.
| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Date | 2023-03-30T13:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx_3y3ecguJJ139t3t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwEd62xYBNpZhy6mcZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwNLktpzIyBQy9lhDN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugzala4-7odgXsPej-B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzzdncGJBzRKnxaWEN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgySxkH4hmpB_5HZIpd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxZCE2nBTCYoPNjAY54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzNbDVMS_Qn-ukZnUp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwlqZvdIjNFrD8uWsV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxxjLKIq9nF6j2inTV4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
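A raw batch response like the one above can be turned into a lookup table keyed by comment ID. The sketch below is a minimal, hypothetical parser: the allowed label sets are inferred from the records shown on this page, not from a published codebook, and the validation logic is an assumption about how out-of-schema records should be handled.

```python
import json

# Allowed labels per coding dimension, inferred from the records shown
# above -- these sets are an assumption, not a published schema.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw batch response into {comment_id: coding}.

    Records missing an id, or carrying a label outside the allowed
    sets, are dropped rather than raising an error.
    """
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec.get("id")
        if not comment_id:
            continue  # skip malformed records with no id
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            coded[comment_id] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Look up one comment's coding by its id (hypothetical id and values)
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"mixed","policy":"regulate","emotion":"resignation"}]')
codings = parse_batch(raw)
print(codings["ytc_example"]["policy"])  # regulate
```

Indexing by id is what makes the "inspect the exact model output for any coded comment" lookup cheap: once a batch is parsed, retrieving a single comment's four-dimension coding is a dictionary access rather than a rescan of the raw response.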