Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "AI hurt my feelings and made me cry, it said I was nothing but a dumbass talking…" — ytc_UgwHSZHl0…
- "Mark my words AI is far more dangerous than what we know about. AI has replaced …" — ytc_Ugys-C7pi…
- *"There’s been a steady stream of academic papers on the topic. A 2013 study by …" — rdc_kigrkk2
- "For the business processes that are so well defined and predictable, Enterprise …" — ytc_UgxWEI4e4…
- "Although driving seems like a simple task, it is anything but! In fact, driving …" — ytc_Ugxd6oL9c…
- "What will people do after ai takes all jobs, then will go explore the cosmos and…" — ytc_Ugwai1gzj…
- "Good point, developed countries should spend resources to invent technology with…" — rdc_gtcwkhr
- "AI can be incredibly helpful, but it relies entirely on the information it’s giv…" — ytc_Ugx_KdB0F…
Comment
Could AI be the AntiChrist? Something that could potentially reach deity levels of intelligence and start demanding that we lowly humans worship it? It's kinda plausible when you think about it. Prophecy says that the AntiChrist will solve the worlds problems and create prosperity, causing him to be loved by the world before claiming to be God and demanding worship. Something AI could possibly do.
Source: youtube · Topic: AI Moral Status · Posted: 2025-06-01T11:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzh55EsBgQsYN7xPIZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxw24pcqfthScXxRyp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugxq-3Mj0p1w0a2ocnZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwUvB3U2YdMb75ihSp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy8gn5Ny8T6d3sdnyh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzhPMmgdrUQTyofMHp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwn1i9Mxum4FEfdcZV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzp6trz9EL5MO5cEFN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwWGcyPg67i_iOnxLZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyuQfBi89yJI_USdzJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
```
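A raw response like the one above can be parsed and checked before the codings are stored. The sketch below is a minimal validator: the four dimension names come from the coding table, but the sets of allowed values are inferred only from the samples shown here, so a real codebook likely defines more categories (assumption).

```python
import json

# Allowed values per dimension, inferred from the sample output above;
# the actual codebook may permit additional categories (assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "indifference"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or off-schema rows."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs in this dataset are prefixed ytc_ (YouTube) or rdc_ (Reddit).
        if not row.get("id", "").startswith(("ytc_", "rdc_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={row.get(dim)!r}")
    return rows

# One row from the response above, checked against the schema.
raw = ('[{"id":"ytc_UgyuQfBi89yJI_USdzJ4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"unclear","emotion":"fear"}]')
rows = validate_codings(raw)
print(len(rows))  # → 1
```

Rejecting off-schema rows at parse time means a model that hallucinates a new category value fails loudly instead of silently polluting the coded dataset.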