
Originally posted in: Secular Sevens

This thread is inspired by another thread: view original post

Edited by GrandmasterNinja: 2/1/2013 12:46:35 PM

Why do people think schools should teach you everything?

I've noticed people want schools to cover sexuality* in the classroom. But why the hell do you want that? Should schools be the ones teaching kids to accept one another for who they are, or is that the parents' responsibility? We learn about sex one way or another, whether through sex-ed in school or discovering the internet, lol. Do you think this is a necessity, a value one really needs to be taught?

Why don't we teach kids how to live in the goddamn wilds, make things, cook food, and get fit? If a school were supposed to prepare you for everything in the world, we'd never leave it. A school exists so that a citizen of the country can be a productive one who raises the country's GDP.

*By sexuality I mean homo, hetero, bi, lesbian, trans. There is nothing wrong with sex-ed; I just don't think schools need to teach students about the above terms. Sex is sex no matter the sexual orientation.
