Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
after watching a few videos regarding AI, i'm not that worried anymore that it w…
ytc_Ugx84RGFJ…
There's another problem which its that AI was codified - for lack of a better te…
ytc_Ugz8ElAJp…
THIS!
But there is a study: if people read Art is made by AI. They hate it. Even…
ytr_UgyzHXt4_…
If an AI is programmed to say it is not sentient and to even argue when you say …
ytc_UgzBVY7GF…
My ai chats are just me insulting them and/or throwing nukes at them.
One time, …
ytc_Ugyj-Xuxx…
Paul
We must first define if a Robot is a creature or not. Some people may see …
ytr_UgxP6TXtv…
AI is and will be tool. You wont ask screwdriver to fix your car, mechanic with …
ytc_UgwYLlywG…
I've never seen anybody brag about promting, but I've seen thousands of comments…
ytr_UgyTE47Iw…
Comment
Okay, a few things:
1. Do not market L2 automation as Autopilot. That is intentional mislabeling of a product.
2. _Why_ does the radar have issues detecting stationary objects? Stationary things are way more dangerous than things that are moving away from you!
The wall or end of the traffic jam is more dangerous than some vehicle changing lanes a full safety distance away.
3. Do not disengage the assistant before impact. Keep it running so it keeps slowing down until a human override happens.
4. Somewhere before 150 attention warnings the system should say: "Okay, either my sensors are faulty or you are faulty. I am off for the night."
youtube
AI Harm Incident
2023-09-10T12:0…
♥ 194
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxKU6Df944_9w1RfmJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxftVetFTwRJpqy3oR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxR5tqvZFLr9dvRYoN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxZgPEgpptDBidHRKt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgznrKt-URjSpd1GPK94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzDERWCxVLNiPExmbR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxTg-NiQ_3i5lznTeF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw5EPWyYzoj0fqmGyV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxiRVrdZ_DT2WsTeht4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxEDdHcs9f47QNXNBt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
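The raw response above is a JSON array of coding records keyed by comment ID. A minimal sketch of turning such a batch into the ID-based lookup this page supports, in Python (the `index_by_id` helper name is hypothetical; the field names match the records shown above):

```python
import json

# A trimmed copy of the raw batch response above (two records, shown for brevity).
raw_response = """
[
  {"id": "ytc_UgxZgPEgpptDBidHRKt4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxKU6Df944_9w1RfmJ4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"}
]
"""

def index_by_id(response_text):
    """Parse a batch coding response and index its records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgxZgPEgpptDBidHRKt4AaABAg"]["policy"])  # → regulate
```

Looking a comment up by ID then reduces to a dictionary access, with the four coded dimensions available on each record.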