Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Given that whole world works on this technology, and don't have to comply to your rules and standards, considering the power of this technology what is the use of most perfect agency, laws and regulations you find and make when pretty small company, not country could make pretty big and bad not to say evil ai ? How would you solve the problem which is not in your hands? This is much greater than invention of atomic bomb, this is like inventing atomic bomb that could be built by any company out of sea water in few hours, forget regulating that everybody would have one till tomorrow morning and by the time you make the draft for one law every country in the world would have whole bunch of them.
>
> Yes I do exaggerate a bit, speed is not really that high but I am also not to much far away from the truth? Wait for few years and tell me am I right, I hope not.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2023-05-18T21:5… |
| Likes | 1 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzsvmj0bbd-HbQfzTR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzZE1uKqAzRxoaNqc94AaABAg","responsibility":"company","reasoning":"contractualist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxjB8QlWoolkVgUAUx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwM63I__v3k2GDetTx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxMTMb52k1uWFhDbzJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgzLI0fTGhFfOVgDvgp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzy0g5FX72XQ_Y4mft4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy9SB6z8O0DRp3d9RZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxbgaiIYd5fkpN5OCh4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxrD0GnsfHJFN8P2T14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
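A batch response like the one above has to be parsed and validated before its records can be joined back to comments by ID. The sketch below shows one way this might be done. It is a minimal illustration, not the pipeline's actual code: the allowed value sets are inferred from the values visible on this page rather than from an official codebook, and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from this page's examples
# (assumption -- the real codebook may include other categories).
ALLOWED = {
    "responsibility": {"government", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index its records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the (assumed) allowed set, so malformed model output is
    caught before it reaches the database.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = rec
    return coded

# The record corresponding to the "Coding Result" table above:
raw = ('[{"id":"ytc_Ugzy0g5FX72XQ_Y4mft4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_Ugzy0g5FX72XQ_Y4mft4AaABAg"]["policy"])  # -> regulate
```

Indexing by ID is what makes the "look up by comment ID" view possible: each coded dimension can be displayed next to the exact raw model output it came from.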