Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or pick a random sample below.
Random samples — click to inspect

- "@heinrikgoettsche1595 If we referr to consciousness, then yes, but AI literall…" (ytr_UgzPaPBT-…)
- "At some point, probably not. It will be more and more part of other systems we r…" (ytr_Ugwn_hEUu…)
- "What f****😐🤯 not the jobs human / This is war human vs robot 🤖 f**** / Would how h…" (ytc_Ugzy9lDnD…)
- "I wasn’t sure if it was legit. I tried it and it seemed like for sure I spoke to…" (rdc_jhcxs7m)
- "Bravo for stup...ty of People STILL believing is ok to have this ROBOTS, AI, and…" (ytc_UgxZVXsc9…)
- "Looking at he world, i dont mind A.I destorying it. I hope just it does it quckl…" (ytc_UgyVhIdzq…)
- "It is over hyped. It is important to know what to ask of it and how to ask it, i…" (rdc_mleasn0)
- "This is so prophetic, the Bible speaks of the mark of the beast. AI is the tool …" (ytc_UgzrWXV6G…)
Comment

> as an artist and person who’s passionate for science, it’s scary to imagine the jobs I wish to have one day being taken away by ai. It’s good to have automation but at what point do we put up the roadblock? It’s going to lead to mindless consumption if left unchecked because we’re all about letting the most efficient thing do the job. I want to be engaged with my world, discover new things and create masterpieces but I can’t do that if these machines can do it better and more efficiently than I.

youtube · 2024-12-19T19:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
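The table above shows one comment's coding across four dimensions. The values seen on this page suggest each dimension draws from a small controlled vocabulary; below is a minimal validation sketch under that assumption. The value sets are inferred only from the samples shown here, not from a documented codebook, and the helper name `validate_coding` is illustrative, not part of the tool.

```python
# Allowed values per dimension, inferred ONLY from the samples on this
# page; the tool's actual codebook may include more values.
VOCAB = {
    "responsibility": {"company", "government", "user", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "mixed"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "approval", "outrage", "indifference",
                "resignation"},
}

def validate_coding(record: dict) -> list:
    """Return (dimension, value) pairs that fall outside the assumed
    vocabulary; an empty list means the record looks valid."""
    return [(dim, record.get(dim)) for dim in VOCAB
            if record.get(dim) not in VOCAB[dim]]

# The coding result from the table above passes validation.
row = {"responsibility": "company", "reasoning": "deontological",
       "policy": "regulate", "emotion": "fear"}
print(validate_coding(row))  # → []
```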
Raw LLM Response

```json
[
  {"id":"ytc_Ugx9Qba2jIsjS53ko5N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz_ymExDkM6ldx12IJ4AaABAg","responsibility":"government","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwPa7u0z7plkOyoO914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyi36BJsah9wPeNmIB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgypIKTBbhRW3o-PG2R4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzjWIfeRMBXLZi7KMV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw-JF_zkcP7hzhUpKd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwia7_tP5RXjKX1LEF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxd7O8S-QEwqu33pDR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyxQTLOBW2dIrb_6b54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
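A response like the one above, a JSON array of per-comment records, can be parsed and indexed so the coding for any single comment ID can be looked up. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON itself; the helper name `index_by_id` and the skip-malformed-records behavior are assumptions of this sketch, not the tool's actual implementation.

```python
import json

# Dimension fields expected in each record, taken from the raw response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records) and
    return a mapping from comment ID to its coded dimensions.

    Records missing an "id" or any dimension field are skipped rather
    than raising, since model output is not guaranteed well-formed.
    """
    coded = {}
    for rec in json.loads(raw_response):
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if not all(dim in rec for dim in DIMENSIONS):
            continue
        coded[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded

# One record from the response above.
raw = '''[
  {"id": "ytc_Ugxd7O8S-QEwqu33pDR4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]'''

coded = index_by_id(raw)
print(coded["ytc_Ugxd7O8S-QEwqu33pDR4AaABAg"]["emotion"])  # → fear
```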