Breaking News, Recent Shows - posted on February 15, 2017

About five years ago, before I was doing this show full time, I took a job as an admissions advisor for an online for-profit college. The college offered business degrees, and students enrolled because online classes were more convenient for them.

A lot of the online students were single moms who wanted better credentials for better jobs, or workers who needed an associate's or bachelor's degree because their jobs required it, and the list went on and on.

There was also a well-known Grammy Award winning singer who I had the opportunity to talk with, who attended college online while she was on tour.

The advantage of online college was speed: what would normally take four years to accomplish would take about nine months. The school was accredited, and at the time I thought the job I had was kind of unique, because the idea of an online college instead of a bricks-and-mortar college was innovative. However, many of my friends disagreed; they thought what I was doing was a fad and a racket.

It paid well. The students that I enrolled all benefited from getting their degrees and so I saw no problem with it.

However, one of my friends told me he thought online degrees were a waste of time and that he would never hire anyone with an online marketing degree, because, in his words, online college wasn't a real college.

He wasn’t the only one.

Only a few years ago, a literature review by Cleveland State University found that human resource managers, executives and other gatekeepers had negative perceptions of online degrees. Some recruiters blamed the stigma on the last decade’s “diploma mills” – online, unaccredited programs known for offering a degree to anyone with a full wallet.

But these days, employers rarely question the quality of online programs, because even brick-and-mortar colleges are offering online classes and online degrees.

Now an online bachelor's degree can be an advantage in the eyes of an employer. In some cases, employers actually favor students who have juggled multiple commitments while earning their degrees; doing so demonstrates discipline and the ability to multitask.

There are several reasons why online bachelor's degrees are increasingly accepted, experts say, including the sheer number of students taking online courses. According to recent findings, about 7.1 million college students take at least one online course.

Despite the evolution of online education, some employers are still wary of online degrees, and the reason is simple: they still think of traditional classroom learning as the only real education.

Many believe that online education is cheaper than taking classes on campus. But in many cases, an online degree is actually more expensive than a degree from a brick-and-mortar institution, according to data from U.S. News and other sources.

When it comes to paying in-state tuition at a public school, for example, a U.S. News analysis of about 300 ranked programs shows that it’s more expensive on a per-credit basis to take an online undergraduate course than a comparable on-campus course.

The average per credit, in-state cost for an online bachelor’s program is $277, compared with $243 per credit at brick-and-mortar schools.

Online undergraduate education does cost less per credit, however, than traditional education at private schools and for out-of-state students at public institutions.
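Those per-credit figures can be put in perspective with a rough total-cost comparison. The sketch below uses the U.S. News averages cited above; the 120-credit degree length is an assumption for illustration, since typical bachelor's programs vary.

```python
# Rough in-state cost comparison using the per-credit averages cited above.
ONLINE_PER_CREDIT = 277   # average online bachelor's, per U.S. News
CAMPUS_PER_CREDIT = 243   # average brick-and-mortar, per U.S. News
CREDITS = 120             # assumed typical bachelor's degree length

online_total = ONLINE_PER_CREDIT * CREDITS
campus_total = CAMPUS_PER_CREDIT * CREDITS
difference = online_total - campus_total

print(f"Online total:  ${online_total:,}")   # $33,240
print(f"Campus total:  ${campus_total:,}")   # $29,160
print(f"Difference:    ${difference:,}")     # $4,080
```

Over a full degree, the $34-per-credit gap adds up to roughly $4,000 more for the online option under these assumptions.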

Back when I was an admissions advisor, I saw online education as the education of the future, and I was right. But now, beyond the idea of earning a college degree on your computer, there is a new trend that many parents of young kids may not be too happy with: online curricula for elementary and secondary schools, along with teachers who will be replaced by robots, or assisted by them.

Because kids spend so much of their valuable time on tablets, iPhones and video games, it is being proposed that a computer can be a more effective teacher than a human being.

In some schools now, the new routine is for a student to grab a new Chromebook, log on to the Reading Plus program, and spend roughly 20 minutes working at their own pace.

All the teachers do is walk around the room, checking that students are actually using the program and helping them if they run into problems. Most of the time, though, they don't.

It has been found that within 10 minutes, a two-year-old can master the use of an iPad to play games and watch animated cartoons. These children are being called "APP pupils" or the "APP generation."

They are the first generation to use handheld digital technology from birth.

According to a Nielsen study, more than 70 percent of children under 12 use tablets. A further survey by Common Sense Media, reported by the Christian Science Monitor, found that nearly 40 percent of children between the ages of 2 and 4 regularly have a mobile device in their hands. Researchers with the Boston University School of Medicine arrived at some very unsettling conclusions about early use of tablets and iPads.

They said children younger than 30 months cannot learn from television and videos as they do from real-life interactions, and that using a mobile device at that age for tasks that aren't educational can be "detrimental to the social-emotional development of the child."

Of particular concern, according to the recommendations, was how such technology could “interfere” with a child’s growing sense of empathy or problem-solving skills. Kids acquire that capacity by playing and interacting with peers and exploring their immediate surroundings. If these devices become the predominant method to calm and distract young children, will they be able to develop their own internal mechanisms of self-regulation?

Well, it appears those risks are now being brushed aside, and the computer is becoming the most popular teacher, replacing human exchange. Instead of an apple being brought to the teacher, an apple is the teacher.

Developments in education technology promise to assist teachers and school systems in supporting struggling students by providing individualized instruction. But at what cost?

How can teachers adapt to machines that are taking over for them?

The processes of teaching and learning are complex and innately human. Even in online education, there are webinars where live teachers can exchange ideas with students over Skype or video conferencing.

However, is it crucial for teachers to have relationships with students? Or can students learn individually with computers? Computers that don't pressure them, don't make judgments about their behavior, and don't send them to the principal's office or school counselor when they are deemed problematic.

Unlike a teacher, technology can appear more sympathetic and adaptive with the right artificial or virtual intelligence. The programs are now capable of adapting to a student's reading speed and comprehension, creating customized scrolling that highlights the text as the student's eyes scan it.

Reading Plus is emblematic of a growing trend toward personalized learning in public education: the idea that schools can better serve students by providing more customized instruction. The term personalized learning covers a vast array of approaches to education, but the common thread is that instruction proceeds at each student's own pace.

The role of teachers shifts dramatically with the adoption of these adaptive programs. Instead of a teacher striving to know a student on multiple levels — from understanding the nuances of his or her academic skills, to building positive relationships and crafting learning experiences based on more than numerical reading scores—educators are on the sidelines while a machine takes over.

Personalized learning often becomes inherently impersonal; it’s a sterile approach to messy, complex classroom processes. And there’s also big money at stake for education-technology companies and curriculum publishers who are taking advantage of pressure to increase academic achievement.

Firms, such as DreamBox and Knewton, have joined more established companies like Achieve3000 and Carnegie Learning in providing “intelligent tutors” for “adaptive instruction” or “personalized learning.”

Not surprisingly, these intelligent tutors have grown fastest in fields in which many problems have well-defined correct answers, such as mathematics and computer science. In domains where student performances are more nuanced, machine-learning algorithms have seen more modest success.

Geography teachers are also looking into virtual reality to give students virtual tours of important geographic sites.

Despite the excitement of being able to expose students to new places and experiences or to be able to meet each student’s unique learning needs in new ways, Muller acknowledged many challenges remain to making virtual reality work in the classroom.

The biggest hurdle schools will face is acquiring the technology.

While Google Cardboard headsets cost around $8-$15 per device, each user also needs a smartphone equipped with applications that run virtual reality videos. High-end devices can cost several hundred dollars and must be used with a compatible computer. All of this also requires internet access; internet speed and safety restrictions could be obstacles as well.

But even though it may not be cost-effective now, it will be standard in a few years. The question is: will virtual reality be like a game to students? Will they learn anything, and how will it affect their real-life experiences?

I mean, why go to New York if you can take a virtual tour? Why communicate when you can read and text? How will children learn social skills?

Some argue that students’ reliance on screen time is detracting from their ability to communicate verbally.

It's all individualized, silent work, with no discussion of what is read or its context.

For struggling readers and writers, it's understandable that teachers, schools, and systems are striving to do whatever it takes to improve literacy levels. But even if their reading skills marginally improve, it remains an open question whether struggling students are better off graduating from high school after remediation by personalized-learning software rather than through more dynamic learning experiences.

It is a future that educators cannot escape, and the question is whether the new learning processes amount to programming rather than learning. Will innovations in tech replace teachers? Will online college replace brick-and-mortar colleges?

Science fiction writers and blockbuster movies have been predicting a world run by robots for decades, and for most of us, the fantasy has stayed in the realm of fiction. But artificial intelligence has made rapid progress and robots are becoming more a part of everyday life than many people realize.

Those who study robots and their impact on life foresee a day not too far off when many jobs now held by people will be automated.

As artificial intelligence improves and slowly takes over aspects of daily life, the only way for people to continue to be useful is to “up-skill” — and that takes creativity.

Schools would change from education facilities into places where teachers are required to teach the things that make humans unique: relationships, flexibility, humanity, discriminating decision-making, resilience, innovation, adaptability, wisdom, ethics, curiosity, asking good questions, synthesizing and integrating information, and, of course, creating.

The robots and computers will teach reading, writing and arithmetic.

Most political leaders and education experts agree that the education system needs to adapt to the technological realities of the times and work to produce more creative thinkers.

Just throwing a Chromebook or a tablet into the curriculum does nothing to nurture creativity.
