Closer Together, Further Apart: The Effect of Technology and the Internet on Parenting, Work, and Relationships. Jennifer Schneider


      “Hand” Writing

      It appears, too, that the skill-set of handwriting (the one that used to cause that permanent callus on your right or left middle finger) is also rapidly diminishing. Today, children from elementary school through college have a decreased need to communicate via any kind of handwritten document, particularly in cursive. According to a 2011 New York Times article, a university professor, on a whim, asked his undergrad students to raise their hands if they wrote anything in cursive as a way to communicate. None did.4 In that article an elementary school principal says, “Schools today are preparing our kids for the 21st century. Is cursive really a 21st century skill?” For that matter, is writing anything by hand important? In response to that question, several students blogged the following thoughts:

      •When kids get older, I guess they will have to use cursive to sign their names on documents, and that’s good ’cause personally that’s all I can write in cursive.

      •I was done with cursive by 4th grade, so I can still read and write it, albeit slowly, but I see no practical use for this skill.

      •A signature doesn’t even need to be in cursive, technically. If the purpose is to provide identification by way of your own handwriting, now I can do that with a digital fingerprint.

      Today, in many high schools, teachers are forced to print because the students they teach cannot read cursive. As we move away from “hand” writing, schools are now requiring proficiency in computer keyboarding by the time most students finish eighth grade. By 2015, all standardized American educational tests will be administered via computer. Cursive writing—in fact, any need for a pen or pencil—appears to be less and less relevant.5

      Similar changes are occurring worldwide. A recent British survey, for example, found that the average adult engages in actual handwriting (signatures aside) approximately once every forty-one days! One in three people surveyed said they had written nothing by hand for at least six months. The last thing they wrote, according to two-thirds of those surveyed, was a hastily scribbled note, a shopping list, or a reminder meant for their eyes only.6 And in China many adults are forgetting how to write basic Chinese characters because they type far more often than they write by hand. Many Chinese children are no longer even learning the characters. Calligraphy classes have been widely dropped in favor of science and math. Teachers themselves are less and less inclined to write, deferring instead to mouse-clicked lesson plans displayed on tablets or computers.7

      And let’s face it, who can honestly recall receiving anything personal or handwritten in the mail during the past several years other than a random birthday card or thank-you note? While many digital immigrants may find this disturbing, most digital natives are unlikely to bat an eye. The natives ask: Why would I want to write by hand, let alone in cursive, when typing is so much faster and easier to read? But to an older person, for whom a handwritten thank-you note is the epitome of courtesy, it may well appear that basic handwritten notes are going into the same pile as 8-track tapes.

      Gen Y versus the Board of Education

      New York University professor and Huffington Post blogger Mary Quigley wrote in a July 2012 blog post:

      In a journalism seminar last semester, a professor strolled around the class of 15 students as they discussed a reading. The prof noticed one student typing furiously on her laptop so she peered over her shoulder. “What are you doing?” the prof asked. “I am buying airline tickets,” replied the student. “No you’re not,” said the prof as she shut the laptop, leaving the student aghast.8

      As it turns out, this professor had previously won numerous awards for teaching excellence. Her classroom was not boring! Nonetheless, if one considers the neurobiological development of active young people today, whose brains experience constant rewards for multitasking, this teacher’s understanding of her students’ educational needs may soon be as obsolete as transistor radios! The simple fact is that digital natives are used to multitasking. They listen to a professor and take a note or two while simultaneously checking their email, updating their Facebook page, texting their friends, and, if the need arises, purchasing an airline ticket. Whether one’s productivity is increased or diminished by doing several things at once has yet to be studied in those individuals who’ve grown up doing just that. Nevertheless, it is clear that multitasking has become the accepted norm for digital natives. One further conclusion may be worth noting: the educational advantage once enjoyed by those who could sit and quietly focus on one teacher and one subject for several hours at a time may fast be disappearing. On the other hand, those individuals who struggle to focus on a single task (like those with attention deficit disorder) and need multiple, constant sources of stimulation to stay engaged may now have an evolutionary advantage. For those who work, learn, and engage primarily in the digital universe, the ability to superficially focus on multiple things simultaneously, without any one taking precedence—what we now view as a classroom problem—may well be an asset in the very near future.

      Of course, many digital immigrant (read: older) teachers are, for the most part, perfectly capable of texting, emailing, conducting online research, downloading music, socializing, and doing pretty much anything their students do online—be it with a laptop, a smartphone, or any other device. However, the concept of doing these things while an authority figure is speaking is completely foreign to them. They didn’t grow up in the land of perpetual connectivity. As Quigley writes, “Gen Y views digital connectivity as a basic right.... Office, school, home and elsewhere, Gen Y goes online whenever and however they want.”9 Gen Y’s predecessors don’t get it. In fact, they often judge this behavior as disrespectful and rude.

      Educator Marc Prensky hypothesizes that digital immigrant teachers may need to learn new ways to communicate fluently with digital natives in the natives’ own “language” to be more effective educators. Today, educators at virtually every level of learning have done so (or attempted to) by incorporating web lessons and other online activities into their teaching plans. In the fall of 2013, the Los Angeles Unified School District (the second largest in the United States) rolled out a program that will ultimately provide an Apple iPad, preloaded with grade-appropriate books and educational software, to every student in the district (now over 620,000 students) and every educator (over 45,000). No more hardbound textbooks and written lesson plans? We’re nearly there. Moreover, many instructors have embraced online education themselves. On LinkedIn, a popular discussion forum for professionals of every ilk (including teachers), there are, at the time of this writing, approximately 30,000 education-related discussion groups. The Technology in Education group alone has nearly 27,000 members.

      Furthermore, college courses taught entirely online have been available (on a small scale) for over a decade, while several universities now provide online graduate programs. And over the past few years, we have begun to experience a global tsunami in online education—from grade school through graduate school. For example, a dozen major universities recently joined in a venture sponsored by Coursera—a company founded by two Stanford University computer science professors—to create one hundred massive open online courses (MOOCs). These offerings are expected to attract millions of students worldwide in a wide range of subjects, including poetry, history, and medicine. For one prototype course in artificial intelligence (AI), offered by Stanford University in 2011, over 160,000 students from 190 countries signed up. Although only a small percentage of these online students actually completed the course, the overall level of interest is more than a little impressive. Writing about this educational experience, the New York Times gushed, “MOOCs are likely to be a game changer, opening higher education to hundreds of millions of people.”10 By November 2012, Coursera had expanded to include 33 university partners and had enrolled over 2 million students. Its most successful class, “How to Reason and Argue,” attracted over 180,000 students. At about the same time, Harvard and MIT announced they would each put up $30 million to launch edX, a nonprofit venture offering courses from Ivy League universities. Yet another MOOC provider, Udacity, had 475,000 users by late 2012.11

      Most professors likely find it mind-boggling and overwhelming, yet also perhaps extremely attractive, to be able to teach tens of thousands

