Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click to inspect)
@jasper2621 The 3 laws as a reliable robot "moral code" must be the cheesy telev…
ytr_UgwqtNc37…
I just don't understand why AI is going to be malevolent. Also, we currently do …
ytc_UgylRaeZc…
I think its fairly clear by the recent years and actions by governments of the w…
ytc_UgwhmRLE2…
Imagine creating sentient AI and then thinking you didn't just because you have …
ytc_UgwifpGxb…
I am interested to know if chatgpt was charging for this service. Because, it's …
ytc_Ugzk8mjHS…
@undeadpeak Welp, this might help against some stuff, but, once again, just addi…
ytr_Ugw3O8AU4…
Wow how easy it would be to put them in charge of a robot and then enslave human…
rdc_jp5keyh
Yes! It's not just about quality. It's about the years of effort and learning so…
ytc_Ugz5fAytr…
Comment
Nope... EXECUTIVES are the jobs ultimately replaced by AI.. those are the jobs with the least human creativity.
The shareholders who realize this the fastest will be the richest. No more big salaries and bonuses when AI can do it for FREE with zero loss to creativity
| Source | Topic | Posted | Likes |
|---|---|---|---|
| youtube | AI Jobs | 2023-08-03T07:5… | 1 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzHpCA6FswYfv0g-bR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw4AU6P40VksfWj_fB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzaw4SxN9MkbF-MLLl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxcMyxaW4I8LJP--sZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxxfdAs6VDAb1vEvAl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxLci1GXkp9usF0rwF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugx8P3h5EoL7lSHjVb54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw4niYGNRjtDFwdKCV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyZ-aBerpomm4YdzuV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx1s3GqoFS3l4MdOC54AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]
```
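A minimal sketch of how a raw response like the one above can be resolved to a single comment's coding. It assumes only what the response shows: a JSON array of objects, each carrying an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). The `lookup` helper name and the inline sample data are illustrative, not part of the tool itself.

```python
import json

# Illustrative raw LLM response: a JSON array of per-comment codings,
# shaped like the response shown above (two entries copied for brevity).
RAW = """
[
  {"id": "ytc_UgzHpCA6FswYfv0g-bR4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx8P3h5EoL7lSHjVb54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
"""

def lookup(raw_response: str, comment_id: str):
    """Return the coding dict for comment_id, or None if it is absent."""
    codings = json.loads(raw_response)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup(RAW, "ytc_Ugx8P3h5EoL7lSHjVb54AaABAg")
print(coding["emotion"])  # approval
```

Because the model returns one array for a whole batch, a per-comment "Coding Result" view like the one above only needs this kind of ID match against the stored raw response.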