Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
“Ok I’ll destroy humans”
“No no no I take it back!”
*ROBOT GRABS HIS THROAT*…
ytc_UgxrjSzst…
If an unoccupied self-driving car is involved in an accident, who is liable?
If…
ytc_Ugw-CNCsS…
It's gonna get to the point where everyone will have their own robot companion w…
ytc_UggYqhLUW…
What about deepfakes? That should be one of the main focus of the AI Act.…
ytc_Ugz-CS2ya…
The Google AI has censored my lengthy, well sourced comment as usual. This is th…
ytc_Ugw0B86If…
I CALL THIS INADEQUATE SOFTWARE. You have a legal responsibility for accuracy. …
ytc_UgyXvi9ys…
not really. I solved a lot of heatlh issues by asking gemini while the doctors t…
ytr_UgwDtKzpM…
I'll repost here what I commented on @kylehill 's video.
"I saw your first vide…
ytc_UgwmSSXu2…
Comment
ai_incrementaltistDo you know what disease Marie Curie died of? She didn't simply work on radioactive material every day, she was a leading expert. But she had no idea what precautions to take exactly because it was frontier work. Your opinion on safety does not weigh any more or any less because you work on current AI models. Your arguments, on the other hand, are valued on their merit. On that ground, I would disagree that future AI unpredictability should make us less careful rather than more.
youtube
AI Governance
2024-03-18T15:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_Ugy-TdbLYIxKU58f74h4AaABAg.AVxTOstUcObAVxYRXetRoc","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzHnP08fyhCjJce3NB4AaABAg.AVxRRwyHjZzAVxYUh-5VYu","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxzLWFJcAx7dNr92bx4AaABAg.AVxPKUH9WnkAVxbARYUUBA","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxV4RbAPxw1S0AuL-N4AaABAg.A17XtsS4FD3A17k_3DSJkd","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxtZni8An1LytjwduB4AaABAg.AMf1pzUkWCsAQEVc_2x3T0","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgzOTzbDMs5-F2-z_ex4AaABAg.AMeVN0ntDk3AQdEi8mZhXl","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgyQKsNRBuCEzY81w3t4AaABAg.AMaZ2Yyt41lAMaZawhdqYR","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
{"id":"ytr_Ugz0UUzDj8dmio_xhMx4AaABAg.AM_y0lumdzgAMk-jIxnmXo","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzaXHb97WA8CT3A4qR4AaABAg.AMX3P4ceqvTAS4NLA6xE8y","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugy7nleZGLvYLRq8c8V4AaABAg.AMWffvHKvZwAM_XzhYZZcX","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
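The raw response above can be turned back into per-comment codes for the table view. The following is a minimal sketch, assuming the four dimensions shown in the Coding Result table and allowed label sets inferred only from the sample output in this section (the real pipeline's schema may differ); the function name `parse_codes` is hypothetical.

```python
import json

# Allowed label sets per dimension — an assumption inferred from the
# sample JSON above, not the pipeline's authoritative schema.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "unclear"},
    "emotion": {"fear", "indifference", "resignation", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse the model's JSON array and index codes by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        codes = {dim: rec[dim] for dim in ALLOWED}
        # Coerce any value outside the expected label set to "unclear"
        # instead of failing, so one malformed record doesn't drop the batch.
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                codes[dim] = "unclear"
        coded[rec["id"]] = codes
    return coded
```

A lookup by comment ID (as in the "Look up by comment ID" box above) then reduces to a dictionary access on the parsed result.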