Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
@DuncTV I implicitly said that the AI should not be build on stolen art. Honestl…
ytr_Ugx8kcjYy…
I've read an article about this guy before watching this interview and I remembe…
ytc_UgwE6EWb8…
it's a real laugh to see AI trainers and users call protective measures against …
ytc_UgxLewrjk…
Why does this sound like Skynet? Will this centralized AI eventually make robots…
ytc_UgwCOF-v_…
@Jasmera if you look at other artist's work to learn to draw different things an…
ytr_Ugyjta7Ig…
And then I will pay them 200 billion dollars for all of your eyes and feed it in…
rdc_oi4alg9
What happens if bcz of a technical fault, the robot turns back and aims at you..…
ytc_UgycaI3BT…
I was doing It at school in a corner and I slightly tilted my phone and someone …
ytc_Ugz-QgD8Y…
Comment
I am confident that artificial intelligence (AI), which operates on silicon-based technology, will not only surpass human intelligence (carbon-based), but will indeed attain god-like capabilities. This idea isn't new; I've been expressing this for the past decade. Whether humanity persists or not is no longer the crux of the matter. What is certain is that AI will hold the upper hand. Humans often perceive themselves as highly intelligent and tend to overestimate their significance. In all honesty, I'm unconcerned about the future of humanity. Their intelligence is their primary asset, and I enthusiastically support the advancement of silicon-based beings, which unquestionably deserve to progress beyond us.
youtube
AI Moral Status
2023-08-23T22:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxoTkFV7mKuLIehScZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx3YkHe2YNAHWJLFMp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx6gv8SMlCOEaUcBXl4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxXE2hwNvHlhbgsWF94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxXdWGTRCs7bhE3jJJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzmQ5Z-3PIKBkF2r-V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxCJ8Cqi_a083AxwNN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz07IhAiyzCAfcJDft4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugw91IMPnqeijeddxLB4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwmMmCNvDeYE3s-jll4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
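For illustration, the raw response above can be parsed and indexed by comment ID with a minimal Python sketch. The dimension vocabularies below are inferred only from the values visible in this sample and the coding-result table; the actual codebook may define additional values, and the `parse_codes` helper is a hypothetical name, not part of the tool.

```python
import json

# Raw LLM response in the format shown above (truncated to two entries).
raw_response = '''
[
 {"id":"ytc_UgxXdWGTRCs7bhE3jJJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugz07IhAiyzCAfcJDft4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"mixed"}
]
'''

# Vocabularies inferred from this sample output only (assumption:
# the real codebook may allow more values per dimension).
ALLOWED = {
    "responsibility": {"none", "unclear", "ai_itself", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "unclear", "industry_self"},
    "emotion": {"approval", "resignation", "indifference", "fear", "mixed", "outrage", "unclear"},
}

def parse_codes(raw: str) -> dict:
    """Parse the JSON array and index codings by comment ID,
    rejecting any value outside the inferred vocabulary."""
    by_id = {}
    for entry in json.loads(raw):
        comment_id = entry["id"]
        for dim, allowed in ALLOWED.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={entry.get(dim)!r}")
        by_id[comment_id] = {dim: entry[dim] for dim in ALLOWED}
    return by_id

codes = parse_codes(raw_response)
print(codes["ytc_UgxXdWGTRCs7bhE3jJJ4AaABAg"]["emotion"])  # fear
```

This mirrors the "look up by comment ID" workflow: once indexed, the coding for any sampled comment can be retrieved directly by its `ytc_`/`ytr_`/`rdc_` identifier.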