Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Please explain the massive data systems that run AI. I do not understand how man… (ytc_Ugy05cByp…)
- The most dangerous thing a monkey and a robot can become is human. We are out ow… (ytc_UgzG4YDRX…)
- im a little late but 60-80% of the time i myself have no idea where I'm going wi… (ytc_Ugyb-rFCj…)
- Lol, oh my god cause no one knew that deep fakes existed before atrioc showed th… (ytc_Ugx-WU8RU…)
- great video! not just because you showed an *actual* effort, but because of the … (ytc_UgwQH6jFy…)
- sorry, but i dont think its okay to pull off these kinds of things, and for the … (ytc_Ugzrn8BoB…)
- Wow that removable face would these be brilliant for people with bad burns or pe… (ytc_UgxHBzVGB…)
- @jellybelly111 maybe it is a signal to stop associating some media files on the … (ytr_UgwmnkOXa…)
Comment
@bouttaletgo That seems impossible. The bar for consciousness appears to be much higher than the bar to understand the idea that humans would dislike a conscious AI.
youtube · AI Moral Status · 2023-08-31T00:0… · ♥ 31
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytr_Ugz9yeqfqWwOadueWjd4AaABAg.9to-wjlC-1i9uDSod71ZOF","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgyiCVdSVWngP8Ot8uR4AaABAg.9tne4FeFkBu9uxNv9idz17","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxRU3BsqC3AKVe_6ed4AaABAg.9tnbH8wqmOr9tnbW3-zZQZ","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgzbYYMUB1rBKcRj0fh4AaABAg.9tnWX_gLshb9tsGrhoV0tC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgwunzFPx9PCu9rzPzR4AaABAg.9tnC4SYblqj9u0Ny22svgw","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwunzFPx9PCu9rzPzR4AaABAg.9tnC4SYblqj9u38q1_74wH","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgwunzFPx9PCu9rzPzR4AaABAg.9tnC4SYblqj9u5wE7yB4NN","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgwunzFPx9PCu9rzPzR4AaABAg.9tnC4SYblqj9u5wITctQt5","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_Ugx-3lnb8zV6suMP0Al4AaABAg.9tn6RatUNti9u9O1EBo9j9","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxyAHIGawFuQ2EkpNt4AaABAg.9tmi07x8WJT9twmF9eCaiO","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
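The raw response is a JSON array with one object per comment, coded along four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of parsing and validating such a batch response before storing it, assuming the allowed value sets inferred from the visible samples (the real codebook may define more categories):

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# These sets are assumptions, not the full codebook.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed"},
}

def parse_coding_response(raw):
    """Parse a batch coding response into {comment_id: codes},
    rejecting unknown dimensions or out-of-vocabulary values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        codes = {k: v for k, v in rec.items() if k != "id"}
        for dim, value in codes.items():
            if dim not in ALLOWED:
                raise ValueError(f"{cid}: unknown dimension {dim!r}")
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: {dim}={value!r} not in codebook")
        coded[cid] = codes
    return coded

# Hypothetical single-record example in the same shape as the response above.
raw = ('[{"id":"ytr_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytr_example"]["emotion"])  # fear
```

Validating against a closed vocabulary like this catches the common failure mode where the model free-texts a value outside the coding scheme, so bad rows fail loudly instead of silently entering the results table.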