Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
I suggest you interview Yuval Noah Harari. He is an Israeli historian, philosoph…
ytc_Ugyfhkjw8…
They don't understand the consequences of their actions. Probably because they …
ytr_Ugz_oYWad…
There will also be a massive prison population, they will use AI to get us all f…
ytc_UgysgVmus…
Only art made by humans can be copyrighted. Animals and Ai can’t be considered c…
ytc_Ugycq14UF…
Programmer myself here. I know ChatGPT can be cool and advanced especially for w…
ytc_Ugzlcplw-…
> While India is the point of transshipment, trade data **suggest that Malays…
rdc_lu8cgjq
You could edit this very easily and keep the QR code right with photopea
Edit: …
rdc_nb7gn53
I thought this already and I 100 percent agree they say AIs arent sentient but t…
ytc_Ugyi9ZyNA…
Comment
Artificial consciousness is definitely a good thought experiment, but at the same time we're not even remotely close to developing it. The important part of this video in any of the people watching this' lifetimes is how to regulate human trickers, the things that can't think or really do anything very advanced or unpredictable on its own, but that humans can manage and give malicious goals to trick other humans on mass.
AI WILL NOT SPONTANEOUSLY MANIPULATE YOU ANYTIME SOON, HUMAN MANIPULATORS WILL GET AN INFLUENCE ORDERS OF MAGNITUDE LARGER HOWEVER.
youtube
AI Moral Status
2023-08-28T08:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxdPCfTQEtFs0VGumB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz1ovmowfTgbiKNg7x4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwKMpdRJlmYzUGvK6x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwd0UItaUGtFVtXrAN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy9qnCD41M1kVjYVip4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwKTz7p-AgCPmeo9ax4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxiYJxf59RhYDCG3kh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz8N8-vD9nseG-fsc54AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzsY323M0LmKeo0TXR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzLswv741lyVgxpZxp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"}
]
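The raw response is a JSON array with one coding record per comment, each carrying the four dimensions shown in the result table above (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such output might be parsed and validated before it is stored, assuming value sets inferred only from the responses visible on this page (the actual codebook may allow more values):

```python
import json

# Allowed values per dimension, inferred from the raw responses shown
# above; this is an assumption, not the tool's official codebook.
ALLOWED = {
    "responsibility": {"user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "approval", "resignation", "indifference", "mixed"},
}

def parse_coding(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, rejecting
    records whose values fall outside the expected sets."""
    out = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        coding = {dim: rec.get(dim, "unclear") for dim in ALLOWED}
        for dim, value in coding.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        out[cid] = coding
    return out

# One record taken verbatim from the raw response above.
raw = (
    '[{"id":"ytc_Ugwd0UItaUGtFVtXrAN4AaABAg",'
    '"responsibility":"user","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"fear"}]'
)
coded = parse_coding(raw)
```

Validating against a closed value set at ingest time is what makes a "Coding Result" table like the one above safe to render: an off-codebook value from the model fails loudly instead of silently becoming a new category.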