Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "I left being a technology teacher because kids only wanted to experience AI. I w…" (ytc_Ugxg2KE00…)
- "We will be the ones who kill ourself because we wanted AI and I have read that A…" (ytc_UgyBUdiYP…)
- "Something that just puts me off about the flow of conversation in this video is …" (ytc_Ugy7rPhAM…)
- "Well i know where my Ai investment is going towards. Since they have a backbone…" (ytc_UgyyJrwrh…)
- "I once trained an AI model to remove backgrounds from images. It ended up always…" (ytc_UgyJxT6jm…)
- "With AI looming and the current economic outlook, the managers have the upper ha…" (ytc_UgwoTgclN…)
- "If I was selling tickets on the doom train, the 1st thing I’d learn is how to br…" (ytc_UgxiOYWkp…)
- "@TheWhyFiles I was discussing with Google's bard AI. We were discussing buildin…" (ytc_Ugxo2OhsC…)
Comment

> if you make AI it should not think for itself and not being able to take decisions on his own. any of this 2 and is game over skynet or ultron is more than sure capable of taking over with 0 problems

Source: youtube · Topic: AI Governance · Posted: 2025-07-16T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id": "ytc_Ugxe3i3-I84L1v6WkTt4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgyNe47VIeo0z32NiuJ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugw-5BCPmTL8DWyYNYp4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz8He1xAsHAo7lJgxp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyK13VTVE6tRJwxfJ54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyE5wX3cFfw6_X6IL94AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgyZlF8dluKheFFNkeh4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgyeJLaFMWyhNas_XaR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_Ugzz8WYjxBOPN_6YBIZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxWXmn27NBJ1sAWi5p4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
```
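Since the model returns one JSON array per batch, mapping a comment ID back to its coded dimensions is a simple parse-and-index step. Below is a minimal sketch of that lookup in Python; it uses two rows copied from the response above, and the `lookup` helper and its validation of the four dimensions (responsibility, reasoning, policy, emotion) are illustrative, not part of the tool itself.

```python
import json

# Two rows copied verbatim from the raw LLM response above, for illustration
raw = (
    '[{"id":"ytc_Ugxe3i3-I84L1v6WkTt4AaABAg","responsibility":"unclear",'
    '"reasoning":"unclear","policy":"unclear","emotion":"unclear"},'
    '{"id":"ytc_UgyNe47VIeo0z32NiuJ4AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]'
)

# The four coding dimensions shown in the Coding Result table
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Parse the batch response and index each coding by its comment ID
codings = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment, or raise KeyError."""
    row = codings[comment_id]
    # Guard against malformed model output: every dimension must be present
    missing = [d for d in DIMENSIONS if d not in row]
    if missing:
        raise ValueError(f"{comment_id} missing dimensions: {missing}")
    return {d: row[d] for d in DIMENSIONS}

print(lookup("ytc_UgyNe47VIeo0z32NiuJ4AaABAg"))
```

A real batch would parse the full ten-row array the same way; indexing by ID is what lets the inspector jump from any coded comment straight to the exact model output that produced its labels.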