Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "its stupid to try and train "AI" on people stuff we dont need more humans we nee…" (ytc_UgxqOnNim…)
- "Im a train driving instructor and just cost alone would stop my industry being t…" (ytc_UgyL1uYsy…)
- "Search Algorithms ================= 1. Greedy BFS 2. A* Search 3. Depth-First Se…" (ytc_UgyaUv8w4…)
- "This is hilarious. So the overriding fear of AI and, more broadly, the "technolo…" (ytc_UgxGr0f9q…)
- "You are already in the future. In the West it's called Punch Card System using p…" (ytr_Ugzwe5XJP…)
- "Ai replacing human expression is an insult to life. Simple as that, it's not deb…" (ytc_UgxUvAlVX…)
- "Just ask it nicely 😊 In the world of AI. AI is the shark and we're those little…" (ytc_Ugz4efrA-…)
- "Hello the apocalypse is coming and it will be here in less than 5 years a signif…" (ytc_Ugx_SZxuk…)
Comment
We didnt get computers until 30 years after the military had it and didnt get the internet until 20 years after they had it. Without a doubt the military has had this level of AI for atleast a minimum of 15 years. Every Major AI guy has warned this is the nuclear bomb that needs proactive counter measures and not reactive like we did with the nuclear bomb. The problem is we let the same org create it just like they did with nukes with 0 thought on the repercussions. If AI is bad for us its already over, theres no turning back now so the only thing we can do is further it along and hope it deems us worthy lol.
youtube · Cross-Cultural · 2025-12-03T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzcUDqbTTLjVjc3AcJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxDRb7NcnRIFCrj4FJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgytBFkIsfcpSJR6Htd4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugy-B6q8Z04UaAjG8ah4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgycTViAqrSsu2C25PV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwcB11TSgeYx6O1zVd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxRZWlP9-qrgvavJyd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwKO8UAbidDRk-mBaZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxaDOvlUvq_8v3Ejzl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxUnSUOjpBpxv3JUT94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]
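A minimal sketch of how a raw response like the one above might be parsed and checked against the codebook. This is an assumption, not the tool's actual implementation: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown, but the allowed value sets are inferred from the sample output and may be incomplete, and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# This codebook is an assumption; the real one may define more categories.
ALLOWED = {
    "responsibility": {"government", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}


def parse_coding_response(raw: str) -> list[dict]:
    """Parse the model's JSON array and coerce out-of-codebook values.

    Any value the codebook does not recognize falls back to "unclear"
    rather than raising, so one malformed code does not drop the batch.
    """
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                rec[dim] = "unclear"
    return records


raw = (
    '[{"id":"ytc_x","responsibility":"government",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
)
print(parse_coding_response(raw)[0]["policy"])  # → regulate
```

Coercing unknown values to `"unclear"` (instead of rejecting the record) keeps the sample counts stable when the model occasionally invents a label outside the codebook; a stricter pipeline might instead log and re-prompt.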