College Alternatives, Part 2: Moving Out of the Educational Middle Ages

Column by Steve Seachman.

Exclusive to STR

Everyone knows about the skyrocketing cost of college. We should also look at the antiquated and ineffective teaching methods colleges rely on—and develop a better alternative.

According to most historians, the European Middle Ages were a period beginning in the 5th Century AD with the Fall of Rome and extending for the next eight or nine hundred years. For the first few “dark” centuries of this period, education and art collapsed, skilled trades largely vanished, science was non-existent, and the condition of life was both squalid and short.

What little remained in the field of education was provided via small, somewhat independent monasteries. Then, in the 11th and 12th Centuries, authorities in Italy and England were the first to expand and alter the educational system by wedding it to state privilege and attaching top-down controls, most prominently featured in the degree-granting process. This new creation, known as the university system, is remarkably similar to what Western culture has re-adopted during the last century.

The invention of the printing press (c. 1450) challenged the influence of the initial university system, as books became cheaper and more readily available over the next few centuries. So the need for a university “scholar” delivering lectures to a classroom tapered off considerably.

But the ruling authorities in the 19th and 20th Centuries were not finished with their tinkering in higher education. The politicians and their academic boosters could not resist the folly of European Statism and the promise of social engineering enveloped in the subsidized re-launch of the university model in America.

The inherent shortcomings of that business model will never be fixed from within because the system is unworkable to begin with.

History has shown us a better way. Technology offers a cheaper alternative. The subsidies are inherently corrupting to academic independence and will eventually run out. And the crushing debt of the university system is too ruinous.

Overdue for a Second Opinion

In case you think that students might learn something useful while percolating in a passive classroom set within a giant complex of aging buildings . . . perhaps you should get a second opinion on that manufactured assumption.

Last week, I explained how the well-crafted sales pitch of “You Need a Degree” profits a lot of debt-dealers, scholarship peddlers and college administrators who are eager to entice kids into making long-term financial commitments. Today, we’ll take a closer look at the quality side, along with alternatives to the “classroom only” approach.

The fact that some college professors passionately believe in their highly controlled system means little, except that many teachers are too deeply invested in their rank and privileges to objectively consider alternatives. In the end, anyone’s alleged motives don’t really matter much. Results matter more. And positive results tend to indicate good motives.

Whether you pursue liberal arts, business or the “hard” sciences (or all three), there is remarkably little that the non-challenging, hyper-legalistic and increasingly intolerant setting of modern college can do to broaden the mind and enrich the soul. I would argue that the opposite effect is more likely, based on our passive-explosive culture of college indoctrination and mass-conformity.

The Problem with Conformity

The initial problem with conformity—which has been nurtured by college-educated leaders and opinion makers in all walks of life—is that it creates a false sense of security, as in “everyone’s doing it, so it must be right.” Actually, the opposite is usually true: if everyone’s doing it, nobody is really thinking about it and it’s probably wrong. Excessive conformity also leads to intolerance and hostility (we’re way past that already), along with cynical rejection of any new ideas, no matter how badly the current system is falling apart.

If you’re not convinced that paralyzing conformity/intolerance/hostility/cynicism are a big deal in the U.S., consider Exhibit A: everything about college; Exhibit B: skyrocketing healthcare costs, with their safety blanket of insurance coercion; Exhibit C: the national debt; Exhibit D: the obesity crisis; Exhibit E: giving free junk food to millions of overweight people despite items B, C and D. This may be going out on a limb (by business community standards), but some folks would impute a fair share of “conformity” to explain the ongoing support for the social insecurity Ponzi, the failed war on certain drugs and the failed wars of global conquest.

The crazy thing is that some conformity-prone adults will read the above paragraph, delude themselves into believing that any slight deviation from protocol is “way too radical for me!” and then send their kids off to the anti-social training camps of passive vegetation, speech codes, thought-crime enforcement and political fanatics who openly call for the forced overhaul of modern civilization. An even crazier phenomenon is that college “radicals” and their cheering squad in the major media typically support or remain silent on 100% of the above destructive core-conformity while attempting to antagonize people with “identity politics” and other superficial distractions. The insanity has gotten that bad.

In short, conventional thinking on college (and lots more) doesn’t involve much thinking. It involves a deep fear of what might happen if we stop following the crowd. The submissive setting of college lectures (in-person or online) merely feeds that conformist mindset of narrowly scripted subjects and opinions, which always trends towards the lowest common denominator. Regardless of our flaws in the past, conformity was not nearly as big a problem a century ago as it is today, after the rise of widespread college education.

Moving Education Out of the Classroom

More specific to college teaching methods, the exhaustive offering of textbook formulas and boring lectures probably doesn’t enhance anyone’s “education” at all. And this conclusion is anything but “new.” It’s almost like the last three millennia of recorded history have been erased and overdubbed with Teletubbies and Wiggles reruns. Great stuff for toddlers learning primary colors and phonics. But a little rudimentary for young adults.

Around 3,000 years ago, the wise King Solomon gave us the saying “iron sharpens iron,” a phrase so valid and concise that it’s still used today. Less well known is the full sentence from which that Proverb is extracted: “As iron sharpens iron, so one man sharpens another.”

The focus here is on the dynamic interaction involved. Or “sparks” in the literal sense of two swords striking each other in a battle of ideas. Socrates, the Greek stonemason and soldier-turned-philosopher (c. 470-399 BC) latched onto a similar concept, using penetrating questions and sound reasoning via dialogue (not contrived and insincere “debate” spectacles) in the pursuit of virtue and wisdom. He never wrote a book, yet managed to influence civilization for centuries after his death.

The dull format of passive books and lectures—the methods of choice for pandering college professors as well as many subsidized leaders of institutionalized religion—merely “puffs up” the brain with empty knowledge. I mention the latter group because state-subsidized religion (often confused with independent churches or temples) was complicit in the original universities in the 11th and the 12th Centuries, all nine of the colonial-era colleges (Harvard, Yale, Princeton, etc.) and hundreds of scholastic institutions to this day. 

The university system has always involved a high level of political privilege, corporate indoctrination and top-down control over the entire process.

This is not meant to imply that every aspect of the university model is false and detrimental (the attention to broader arts and sciences can be beneficial, for instance). It’s just that privilege, indoctrination and anti-market controls have always been key ingredients to that system—now more than ever. Yet both insiders and critics consistently fail to acknowledge that.

Based on today’s culture of permanent outrage and knee-jerk protests from one side—almost always self-serving and self-destructive—met with reflexive mockery and appeasement from the other, it’s safe to say that few people are being “sharpened” in college classrooms or church pews, or from watching/listening to TV news or AM talk-radio. Personal growth from those impersonal methods is simply contrary to human nature.

The Limitations of Books and Lectures (and their online equivalents)

Books and lectures are fine for introductory purposes, technical references or for entertainment, particularly when compared to mind-numbing television. But they fall short when it comes to personally challenging anyone in any moral, ethical, spiritual or even “practical” sense.

Those weaknesses should be obvious in the case of textbooks—the bedrock of the university model. The moment any written document (in the absence of an engaging teacher) tarnishes someone’s cherished idols—such as their favorite politician, cultural custom or sacred object—or pushes them away from a harmful addiction . . . the reader puts it down, convinced the writer “doesn’t know what he’s talking about” or “is a self-righteous jerk who doesn’t understand me” or some other weak excuse, because the reader (or student) is in total control. That aspect alone is deadly to any effective teaching method.

By design, books are an entirely controlled setting for both the writer and the reader, versus any more dynamic and effective teaching style, which involves a more balanced setting of give and take. Books are more of a dump-and-run approach; that’s why they generally suck for learning purposes.

When it comes to lectures, for the last three or four generations we’ve replaced independent scholars (now making a resurgence on the Internet and a few face-to-face settings) with leashed experts who are usually on the payroll of some corporation, institution or government agency.

With books and lectures, the celebrity “experts” who dominate those formats have too much to lose to risk challenging any of the more dangerous modern orthodoxies that are tearing the nation apart, particularly in the world of education. In that restrictive setting, conformity and control masked with superficial glitz will always trump stimulating discussion or (perish the thought) creative problem solving.

If you want to learn via reading, articles on the Internet and a few independent magazines are usually much better at conveying useful information and analysis. But they have limitations too, since you can’t ask spontaneous questions that would benefit both the reader and the writer; there is rarely an attempt at “leading by example”; any crazy idea (when unchallenged) can sound good on paper; and most websites and magazines are ideologically factionalized and prone to pandering to the established preferences of the audience.

(It shouldn’t be surprising that decent analysis and “real news” can more often be found on the free-access, market-driven Internet, far outside the Good Ole Boys club of ultra-exclusive FCC licensing, urban newspaper monopolies and their collection of other subsidized platforms. Their constant bias towards central planning and social engineering is important to remember, since you will never hear any of this from the gossip/slander/advertising industry that specializes in state-sponsored stupidity.)

Just a century ago, before we were lured away from the Apprentice Model, the general concepts above would be considered Teaching 101. Today, it probably sounds like some radical new idea. (If Millennials want to think that dynamic interaction is a NEWS FLASH, TRENDING NOW!!!… that’s fine with me.)

But one-way books and lectures and their online equivalents—the most persuasive methods among a passive culture of induced conformity, and the methods most easily tampered with—have always been the tools of choice among creeping authoritarians and their paid support staff.

Now that more effective teaching methods have been pushed aside for a few generations, their easily mass-produced (yet somehow more expensive) alternatives have moved in to satisfy the “hunger for knowledge” that remains. In an attempt to feed that desire, Americans purchased over 680 million printed books in 2017 (actually more, since that figure only covers the 80 to 85% of sales that get reported). Total annual revenue for the book-and-lecture workshops of American colleges was $564 billion (Fall 2015 through Spring 2016 school year). Total government subsidies from combined federal, state and local levels on all classroom-based education (K through college) for FY2018 will be nearly $1.1 trillion.

Oh, yes. We love our books and lectures. And most detractors are too busy complaining (via books and lectures) to offer a better alternative.

Rounding to the nearest billion, I’m guessing that federal, state and local government spending on any type of dynamic two-way teaching or “mentoring” is approximately zero. Not that I’m pushing for outside interference; just gauging our priorities.

Open Classroom of Civilization: Dynamic Two-way Teaching

Since the days of Socrates and Jesus, not to equate those two excellent teachers, we’ve known that two-way dialog and direct application (i.e., mentors and apprenticeship) are better ways of learning. The fact that both of those challenging, never pandering, teachers left a legacy that is remembered about 2,400 years and 2,000 years later, respectively, is a testament to the quality and durability of their teaching. Not just the words they spoke, but their methods as well. (In comparison, most of the drivel pumped out of the modern college system is forgotten within hours of cramming for the final exam.)

Most forms of public education in Europe (beyond isolated monasteries) took an extended absence during the Dark Ages, from roughly the 6th to the 10th Century AD. During that period, extreme poverty and political oppression ruled the Western world.

Gradually climbing out of that pit of misery, dynamic hands-on teaching was central to the re-birth of civilization from the late Middle Ages (starting around the 11th Century) and enduring until the beginning of the 20th Century. From an educational standpoint, master craftsmen in guilds or businesses would train young apprentices, with the junior staff gaining both social and technical skills along with being provided room and board. The owners would benefit from inexpensive labor that was not “free” or “easy to exploit,” since a competitive marketplace and parental involvement work to minimize the latter.

Anyone interested in beneficial “liberal arts” could consult elders in their local community or in some cases independent scholars (something available now more than ever). Even the folks at Wikipedia inadvertently stumbled across this former opportunity, highlighting a 12th Century university charter that “guaranteed [sic] the right of a traveling scholar to unhindered passage in the interests of education” as nothing less than the foundation of “academic freedom.” (Then they stumble a bit. Wikipedia’s unequivocal praise of bureaucratic tenure “protections” that have led to rigid orthodoxies and insulation from consumer feedback is standard academic tripe. That self-congratulating view contradicts much available evidence and is probably based on a failure to recognize the anti-market nature of academia itself. This pro-tenure, anti-market preference is common throughout the university system, yet without comparison in the rest of society. Much of this disparity can be attributed to the echo chamber and collectivist mindset of the college bubble. When Statists see difficulty, their solution is usually more top-down conformity and control, with heavy doses of legalistic hypocrisy thrown in to keep any subversives in line. Market corrections and consumer choice are window dressing at best. This bias has led us to the nullification of any meaningful sense of academic freedom.)

In contrast to the subsidized and divisive university model—which reaps over $500 billion annually in handouts, grants and loans as detailed in Part 3 next week—the educational system of the Apprentice Model relied on mutual cooperation between students, teachers and consumers . . . as quaint as that may sound. Without subsidies, they had no other choice.

Dynamic, experienced and rational teachers never sought those privileges or accepted the debilitating restrictions attached. The Apprentice Model worked well for numerous professions (including engineers, doctors and lawyers) for centuries without subsidies, as noted in American Apprenticeship and Industrial Education page 17 and other modern references cited here. The Apprentice Model got crowded out by an induced surge of the university model in the 20th Century, precisely because of political subsidies and artificial pressures, not because of any inherent shortcomings of professional mentoring.

The pre-printing press business model of the university system—which made some sense when it was devised in the 11th and 12th Centuries, when hand-written manuscripts were expensive and rare—was largely irrelevant soon after America became liberated from British rule and its stifling culture of hereditary privilege. Although schools are loath to admit it, literacy rates in America were over 90% for whites (both men and women) by 1810. That was a time when the U.S. population was rapidly growing, the economy was advancing and education was almost entirely a private enterprise.

For a more recent academic approach, the National Training Laboratories of Bethel, Maine confirmed a similar benefit for interactive teaching methods via research in the early 1960s. As visualized in their famous Learning Pyramid, lectures and reading provide the least retention; group discussion, practice by doing, and teaching others (or immediate use of learning) provide the best retention.

Yet the worst methods of learning get all the attention from the distorted educational marketplace we’ve created.

Why Such an Imbalance?

Most professional jobs now require a minimum of 17 years (K through college) of classroom books and lectures. If you’ve “only” got 16.5 years of that stuff, you’ll be treated as a leper and your resume will be thrown in the trash. (Some innovative companies are changing that pattern, but they are currently an enlightened minority.)

From the businesses I’ve observed as a consultant or worked for as an employee, I’ve never seen or heard of ONE that provides even 0.1 years of anything that could vaguely pass for “mentoring.” Senior staff don’t have the time and aren’t given the positive incentives to encourage that. I base this on 25 years of working in the consulting business and visiting dozens of clients, with the last 10 years focusing closely on educational alternatives and positive incentives for mentoring (initially from a personal interest, but full-time for the last six months).

Putting any historical evidence aside and looking at it another way, no one in their right mind would go to a doctor whose resume said: I’ve read lots of books and sat through exciting lectures in school . . . but I’ve never actually practiced any of this stuff. Nevertheless, employers hire college grads with similar all-theory/no-skills credentials.

Educational reformer and New York City Teacher of the Year for 1989, 1990 and 1991, John Taylor Gatto, has done pioneering research (summarized in his book and videos) on how education has evolved in America. One of his many apt conclusions is:

“What’s gotten in the way of education in the United States is a theory of social engineering that says there is ONE RIGHT WAY to proceed with growing up.”

My question to employers and their Human Resources departments is simple: Who decided on this need for 17 years of one ineffective learning style vs. less than 0.1 years of other styles like mentoring?

My point is NOT that classroom books and lectures are totally worthless. I’m just saying that history and common sense suggest that adding in some direct mentoring can make the overall learning experience into a much better educational package. In other words, instead of a lecture vs. mentor imbalance of 17 to 0.1, why not maybe 15 to 2, or something similar? And it doesn’t necessarily have to add up to 17 years.

Building a Better Alternative

The outdated university model of books and lectures (and their online equivalents) is fine for introductory purposes and as reference manuals. Beyond that, direct application under the guidance of an experienced professional can have a far greater impact. Doctors figured that out long ago. They call it residency.

I’m calling it The Mentor Model. It’s basically an update of the Apprentice Model for professionals, with some added safeguards to protect and balance the interests of students, mentors and companies. One of the more important parts is to attach positive incentives for the professional mentors, instead of the failed mix of fear/guilt/charity we resort to now, as in “just do it, because it’s part of your job.” From my experience, that will never work. The basis here is to recognize that in the modern economy, potential mentors are usually non-owners. So “ownership-like” incentives are needed to keep things running smoothly.

The plan is for students, after one or two years of college, to go right into professional work. Since community colleges are very affordable and still do some teaching—skipping out on the “publish or perish” routine and other distractions—that’s a good way to get some core classes finished and establish a bit of independence after high school. It also helps with screening serious candidates.

The idea with The Mentor Model is to replace the last two years of college with a one or two year blend (depending on career choice) of work and study that is more productive than doing a bunch of classroom theoretical drills.

This also addresses the key question: Who pays for it? Answer: You do.

During that one or two year Transition Period, students (junior staff) get a lower pay rate than a full-time salary as a trade-off for receiving valuable mentoring and work skills from an experienced professional. During the Transition Period, the target is for a reasonable split of your time on academics vs. work. For an example 50/50 split, that would translate to 1,000 hours of online self-study and occasional field trips if appropriate (unpaid academics) and 1,000 hours of paid work per year.

In the case of consulting engineering (my background) or any job like IT or accounting that bills hourly fees, instead of charging out junior staff at $100 per hour as is typical for entry level work, the company would charge $60 per hour for this example. Right off the bat, the client is saving 40% on junior labor and the firm has an advantage for bidding new jobs. Working 1,000 hours per year at $60/hr yields $60K to split four ways between the company, the junior staff, the mentor and yours truly (who gets the smallest slice). For employees that don’t bill hourly to clients, the concept is similar with minor adjustments. Economic details beyond that are left for discussions with interested companies.
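To make the arithmetic above concrete, here is a minimal sketch of the Transition Period numbers. The $60/hr billing rate and 1,000 paid hours come straight from the example; the four-way percentages are my own assumption (the column only says the program operator gets the smallest slice), so treat them as placeholders rather than actual contract terms.

```python
# Back-of-the-envelope sketch of the Transition Period example.
# BILL_RATE and PAID_HOURS come from the column; the share
# percentages are hypothetical, chosen so the junior staff's
# effective wage lands in the stated $10-15/hr range.

BILL_RATE = 60      # $/hr charged to the client (vs. ~$100 typical entry-level)
PAID_HOURS = 1000   # paid work hours per year during the Transition Period

revenue = BILL_RATE * PAID_HOURS  # $60,000 billed per year

# Hypothetical four-way split: company, mentor, junior staff, program operator
shares = {"company": 0.35, "mentor": 0.25, "junior": 0.25, "operator": 0.15}
assert abs(sum(shares.values()) - 1.0) < 1e-9  # shares must cover all revenue

payout = {role: revenue * frac for role, frac in shares.items()}
junior_hourly = payout["junior"] / PAID_HOURS  # effective wage for junior staff

for role, amount in payout.items():
    print(f"{role}: ${amount:,.0f}/yr")
print(f"junior effective wage: ${junior_hourly:.2f}/hr")
```

With these assumed shares, the junior staff member earns $15,000 over 1,000 paid hours, i.e., $15/hr, consistent with the $10-15/hr range stated below; different shares shift that figure within the range.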

After one to two years in the Transition Period (as agreed in advance) junior staff would graduate to a full-time salary, having learned all the textbook/online theory necessary, gaining diverse and mentored work experience, while earning $10-15/hr pay and accruing no debt.

If a student wants to go back to school after some mentoring, that’s fine. Due to the financial practices of the modern education system, you’ll need to be aware of student loan and scholarship availability and debt repayment schedules if you intend to leave or re-enter the university model.

I should add: This is not an informal “internship” where students are often left doing a small list of mundane tasks with little or no supervision. If you’re thinking of participating in a summer internship, you may want to ask your prospective employer: Who will be my boss and how many hours per week of mentoring will I receive? If you get a blank stare or weak excuses, that should tell you something. The internships I’ve seen are better than nothing . . . but just barely.

In summary, college is over-priced and over-rated. Americans figured that out a long time ago but seem to have forgotten it lately. There are much better and more cost-effective ways to learn. It comes down to educating yourself on the options and deciding which method is the best fit for your career aspirations.


Steve Seachman is the owner of The Mentor Model, a new business designed to provide an alternative to the 4-year college degree for students seeking professional jobs.


Comment from Lawrence M. Ludlow:

Just a minor note on comparing today's universities with those in the Middle Ages, which is when they first appeared. On one hand, there was control by the church hierarchy over many university functions, but in one way, medieval universities were much more responsive to markets than today's state-funded horrors. If you read about debaters such as Peter Abelard and other 12th-century doctors at these schools, students would de-enroll en masse if a doctor was outflanked by a student on a question or a debating point. So there was some immediate market feedback and an immediate drop in "customers" if a professor was unable to hold his own.