Another Word For It: Patrick Durusau on Topic Maps and Semantic Diversity

October 17, 2017

Thinking Critically About “Fake News, Facts, and Alternative Facts” (Coursera)

Filed under: Critical Reading,Journalism,News — Patrick Durusau @ 3:17 pm

Fake News, Facts, and Alternative Facts by Will Potter, Josh Pasek, and Brian Weeks.

From “About this course:”

How can you distinguish credible information from “fake news”? Reliable information is at the heart of what makes an effective democracy, yet many people find it harder to differentiate good journalism from propaganda. Increasingly, inaccurate information is shared on Facebook and echoed by a growing number of explicitly partisan news outlets. This becomes more problematic because people have a tendency to accept agreeable messages over challenging claims, even if the former are less objectively credible. In this teach-out, we examine the processes that generate both accurate and inaccurate news stories, and that lead people to believe those stories. We then provide a series of tools that ordinary citizens can use to tell fact from fiction.

To honor the exhortations to “use critical thinking,” here are some critical thoughts on the course description for “Fake News, Facts, and Alternative Facts.”

How can you distinguish credible information from “fake news”?

The description starts with black and white, normative classifications: one good, “credible information,” and one bad, “fake news.” Unlike being alive or dead, information is rarely that clear cut. As Tom Petty recently proved, even being dead can be questionable.

You are being emotionally primed to choose “credible information,” as opposed to evaluating information to determine the degree, if any, to which it should be trusted or used.

Reliable information is at the heart of what makes an effective democracy,

A remarkable claim, often repeated, but I have never seen any empirical evidence for the proposition. In critical thinking terms, you would first have to define “reliable information” and “effective democracy.” Then, using those definitions, you would have to provide empirical evidence that democracy is ineffective in the absence of “reliable information” and effective in its presence.

It’s an easy claim to make, but in the context of a critical thinking course, isn’t more required than repeating popular cant?

I’ll grant many theories of democracy are predicated upon “reliable information,” but those theories also posit equal treatment of all citizens, another popular fiction.

yet many people find it harder to differentiate good journalism from propaganda.

As opposed to when? What is the baseline for when people could more easily “…differentiate good journalism from propaganda…?” Whenever you hear this claim made, press for the study with evidence to prove this point.

You do realize that anyone claiming such a position considers themselves capable of making those distinctions, and that you are very likely in the class of people who cannot. In traditional terminology, that’s called having a bias: in favor of their judgment as opposed to yours.

Increasingly, inaccurate information is shared on Facebook and echoed by a growing number of explicitly partisan news outlets.

You know the factual objections by this point: what documentation is there for an increase in “inaccurate information” (is that the same as false information?), and an increase over when? When was there less inaccurate information? Moreover, when were there fewer “explicitly partisan news outlets”?

By way of example, consider these statements about Jefferson during the presidential election in 1800:


In the election of 1800, ministers spread rumors that Jefferson held worship services at Monticello where he prayed to the “Goddess of Reason” and sacrificed dogs on an altar. Yale University president Timothy Dwight warned that if he became president, “we may see the Bible cast into a bonfire.” Alexander Hamilton asked the governor of New York to take a “legal and constitutional step” to stop the supposed atheist vice president from becoming head of state. Federalists who opposed him called him a “howling atheist,” a “manifest enemy to the religion of Christ,” a “hardened infidel,” and, for good measure, a “French infidel.” As Smith describes it, insults like these were issued forth from hundreds of pulpits in New England and the mid-Atlantic. When Jefferson won the election, many New England Federalists buried their Bibles in their gardens so the new administration would not confiscate and burn them.

It may just be me, but it sounds like there was “inaccurate information” and there were “explicitly partisan news outlets” available during the presidential election of 1800.

When anyone claims there is more “inaccurate information” or “explicitly partisan news outlets,” ask for percentage evidence against some base period.

Surely if they are devoted to “credible/reliable information,” they would not make such statements in the absence of facts to back them up. Yes?

This becomes more problematic because people have a tendency to accept agreeable messages over challenging claims, even if the former are less objectively credible.

People accepting messages they find agreeable is a statement of how people process information. See Thinking, Fast and Slow by Daniel Kahneman.

The claim goes off the rails with “…even if the former are less objectively credible.”

Where does “…less objectively credible” come from? It’s a nice sleight of hand, but never fall for anyone claiming an “objective” context. It doesn’t exist, hasn’t existed, and won’t ever exist.

You can make claims from the context of a community of people, scholars, experts, etc.; that is, every claim originates in shared values and a worldview. (See Stanley Fish if you are interested in the objectivity issue.)

As with all such claims, the authors have criteria for “objectively credible” that they want you to use in preference to criteria suggested by others.

There’s nothing wrong with advocating particular criteria for judging information; we can all do no more and no less. What I object to is cloaking those criteria in the fiction of being beyond any context, of being “objective.” Let us all put forth our criteria and contend on an equal footing for which should be preferred.

In this teach-out, we examine the processes that generate both accurate and inaccurate news stories, and that lead people to believe those stories. We then provide a series of tools that ordinary citizens can use to tell fact from fiction.

I can almost buy into “accurate” versus “inaccurate” news stories, but then I’m promised “tools” to enable me to “…tell fact from fiction.”

Hmmm, but “Who is this class for:” promises:

This course is aimed at anyone who wants to distinguish credible news from “Fake News” by learning to identify biases and become a critical information consumer.

I don’t read “…learning to identify biases…” as being the same thing as “…tools…to tell fact from fiction.”

The latter sounds more like someone telling me which is fact and which is fiction. Not the same as deciding on my own.

I’m enrolling in the course now and will have more comments along the way.

The crucial point here is that “critical thinking” should be universally applied, especially so to discussions of critical thinking.

January 27, 2017

The Critical Thinking Skills Cheatsheet [Infographic and Workbook]

Filed under: Critical Reading,Journalism,News,Reporting — Patrick Durusau @ 2:45 pm

The Critical Thinking Skills Cheatsheet [Infographic] by Lee Watanabe-Crockett.

From the post:

Critical thinking skills truly matter in learning. Why? Because they are life skills we use every day of our lives. Everything from our work to our recreational pursuits, and all that’s in between, employs these unique and valuable abilities. Consciously developing them takes thought-provoking discussion and equally thought-provoking questions to get it going. Begin right here with the Critical Thinking Skills Cheatsheet.

It’s a simple infographic offering questions that work to develop critical thinking on any given topic. Whenever your students discover or talk about new information, encourage them to use these questions for sparking debate and the sharing of opinions and insights among each other. Together they can work at building critical thinking skills in a collaborative and supportive atmosphere.
… (emphasis in original)

The infographic, also available as a color 11 x 17 PDF file, is too large to display here, but I can give you the flavor of it:

Who

… benefits from this?
… is this harmful to?
… makes decisions about this?
… is most directly affected?
… have you also heard discuss this?
… would be the best person to consult?
… will be the key people in this?
… deserves recognition for this?

What, Where, When, Why and How have similar expansions.

See also The Critical Thinking Workbook from Global Digital Citizen.

Specific domains may benefit from altered or additional prompts, but this is a great starting place!

You’re the fact-checker now [Wineburg/McGrew Trafficking In Myths]

Filed under: Bias,Critical Reading,Journalism,News — Patrick Durusau @ 11:08 am

You’re the fact-checker now

From the post:

No matter what media stream you depend on for news, you know that news has changed in the past few years. There’s a lot more of it, and it’s getting harder to tell what’s true, what’s biased, and what may be outright deceptive. While the bastions of journalism still employ editors and fact-checkers to screen information for you, if you’re getting your news and assessing information from less venerable sources, it’s up to you to determine what’s credible.

“We are talking about the basic duties of informed citizenship,” says Sam Wineburg, Margaret Jacks Professor of Education.

Wineburg and Sarah McGrew, a doctoral candidate in education, tested the ability of thousands of students ranging from middle school to college to evaluate the reliability of online news. What they found was discouraging: even social media-savvy students at elite universities were woefully unskilled at determining whether or not information came from reliable, unbiased sources.

Wineburg and McGrew arrived at the crisis of “biased” news decades, if not centuries, too late.

Manufacturing Consent: The Political Economy of the Mass Media by Edward S. Herman and Noam Chomsky, first published in 1988 and updated in 2002, traces the willing complicity of the press in any number of fictions that served the interests of the government and others.

There is a documentary by Mark Achbar and Peter Wintonick about Noam Chomsky and Manufacturing Consent, with a total run time of 2 hours, 40 minutes, and 24 seconds. I read the book but did not watch the video. If you prefer video:

https://www.youtube.com/watch?v=YHa6NflkW3Y

Herman and Chomsky don’t report some of the earlier examples of biased news.

Egyptian accounts of the Battle of Kadesh claim a decisive victory in 1274 or 1273 BCE over the Hittites, accounts long accepted as the literal truth. More recent research treats the Egyptian claims as akin to US claims to winning the war on terrorism.

Winning wars makes good press but no intelligent person takes such claims uncritically.

For the exact details, consider:

The Road to Kadesh: A Historical Interpretation of the Battle Reliefs of King Sety I at Karnak

and “The Battle of Kadesh: A Debate between the Egyptian and Hittite Perspectives.”

Or as another example of biased reporting, consider the text of You’re the fact-checker now.

From the post:

“Accurate information is an absolutely essential ingredient to civic health,” says Wineburg.

Ok, so what do you make of the lack of evidence for:

…it’s getting harder to tell what’s true, what’s biased, and what may be outright deceptive[?]

I grant there’s a common myth of a time when it was easier to tell “what’s true, what’s biased and what may be outright deceptive.” But the existence of a common myth doesn’t equate to factual truth.

An article exhorting readers to become fact-checkers that is premised on a myth, in Wineburg’s own words, has a “shaky foundation.”

Sources have always been biased, and some have been calculated to deceive, from those that reported a total Egyptian victory at Kadesh to the more recent examples documented by Herman and Chomsky.

Careful readers treat all sources as suspect, especially those not considered suspect by others.


Semi-careful readers may object that I have cited no evidence for:

…it’s getting harder to tell what’s true, what’s biased, and what may be outright deceptive.

being a myth.

“Myth” in this context is a rhetorical flourish to describe the lack of evidence presented by Wineburg and McGrew for that proposition.

Establishing such a claim, that students are now less able than before to discern trustworthy from untrustworthy sources, requires:

  1. A baseline of what is true, biased, deceptive for time period X.
  2. Test of students (or others) for discernment of truth/bias/deception in reports during period X.
  3. A baseline of what is true, biased, deceptive for time period Y.
  4. Proof the baselines for periods X and Y are in fact comparable.
  5. Proof the tests and their results are comparable for periods X and Y.
  6. Test of students (or others) for discernment of truth/bias/deception in reports during period Y.
  7. Evaluation of the difference (if any) between the results of tests for periods X and Y.

at a minimum. I have only captured the major steps that come to mind. No doubt readers can supply others that I have overlooked.

Absent such research, analysis, and proofs that can be replicated by others, Wineburg and McGrew are trafficking in common prejudice and nothing more.

Such trafficking is useful for funding purposes but it doesn’t advance the discussion of training readers in critical evaluation of sources.

January 13, 2017

Calling Bullshit in the Age of Big Data (Syllabus)

Filed under: Critical Reading,Journalism,News,Reporting,Research Methods — Patrick Durusau @ 7:33 pm

Calling Bullshit in the Age of Big Data by Carl T. Bergstrom and Jevin West.

From the about page:

The world is awash in bullshit. Politicians are unconstrained by facts. Science is conducted by press release. So-called higher education often rewards bullshit over analytic thought. Startup culture has elevated bullshit to high art. Advertisers wink conspiratorially and invite us to join them in seeing through all the bullshit, then take advantage of our lowered guard to bombard us with second-order bullshit. The majority of administrative activity, whether in private business or the public sphere, often seems to be little more than a sophisticated exercise in the combinatorial reassembly of bullshit.

We’re sick of it. It’s time to do something, and as educators, one constructive thing we know how to do is to teach people. So, the aim of this course is to help students navigate the bullshit-rich modern environment by identifying bullshit, seeing through it, and combatting it with effective analysis and argument.

What do we mean, exactly, by the term bullshit? As a first approximation, bullshit is language intended to persuade by impressing and overwhelming a reader or listener, with a blatant disregard for truth and logical coherence.

While bullshit may reach its apogee in the political sphere, this isn’t a course on political bullshit. Instead, we will focus on bullshit that comes clad in the trappings of scholarly discourse. Traditionally, such highbrow nonsense has come couched in big words and fancy rhetoric, but more and more we see it presented instead in the guise of big data and fancy algorithms — and these quantitative, statistical, and computational forms of bullshit are those that we will be addressing in the present course.

Of course an advertisement is trying to sell you something, but do you know whether the TED talk you watched last night is also bullshit — and if so, can you explain why? Can you see the problem with the latest New York Times or Washington Post article fawning over some startup’s big data analytics? Can you tell when a clinical trial reported in the New England Journal or JAMA is trustworthy, and when it is just a veiled press release for some big pharma company?

Our aim in this course is to teach you how to think critically about the data and models that constitute evidence in the social and natural sciences.

Learning Objectives

Our learning objectives are straightforward. After taking the course, you should be able to:

  • Remain vigilant for bullshit contaminating your information diet.
  • Recognize said bullshit whenever and wherever you encounter it.
  • Figure out for yourself precisely why a particular bit of bullshit is bullshit.
  • Provide a statistician or fellow scientist with a technical explanation of why a claim is bullshit.
  • Provide your crystals-and-homeopathy aunt or casually racist uncle with an accessible and persuasive explanation of why a claim is bullshit.

We will be astonished if these skills do not turn out to be among the most useful and most broadly applicable of those that you acquire during the course of your college education.

A great syllabus and impressive set of readings, although I must confess my disappointment that Is There a Text in This Class? The Authority of Interpretive Communities and Doing What Comes Naturally: Change, Rhetoric, and the Practice of Theory in Literary and Legal Studies, both by Stanley Fish, weren’t on the list.

Bergstrom and West are right about the usefulness of this “class” but I would use Fish and other literary critics to push your sensitivity to “bullshit” a little further than the readings indicate.

All communication is an attempt to persuade within a social context. If you share a context with a speaker, you are far more likely to recognize and approve of their use of “evidence” to make their case. If you don’t share such a context, say a person claiming a particular interpretation of the Bible due to divine revelation, their case doesn’t sound like it has any evidence at all.

It’s a subtle point but one known in the legal, literary and philosophical communities for a long time. That it’s new to scientists and/or data scientists speaks volumes about the lack of humanities education in science majors.

July 22, 2014

Commonplace Books at Harvard

Filed under: Books,Critical Reading,Knowledge Networks — Patrick Durusau @ 8:03 pm

Commonplace Books

From the webpage:

In the most general sense, a commonplace book contains a collection of significant or well-known passages that have been copied and organized in some way, often under topical or thematic headings, in order to serve as a memory aid or reference for the compiler. Commonplace books serve as a means of storing information, so that it may be retrieved and used by the compiler, often in his or her own work.

The commonplace book has its origins in antiquity in the idea of loci communes, or “common places,” under which ideas or arguments could be located in order to be used in different situations. The florilegium, or “gathering of flowers,” of the Middle Ages and early modern era, collected excerpts primarily on religious and theological themes. Commonplace books flourished during the Renaissance and early modern period: students and scholars were encouraged to keep commonplace books for study, and printed commonplace books offered models for organizing and arranging excerpts. In the 17th, 18th, and 19th centuries printed commonplace books, such as John Locke’s A New Method of Making Common-Place-Books (1706), continued to offer new models of arrangement. The practice of commonplacing continued to thrive in the modern era, as writers appropriated the form for compiling passages on various topics, including the law, science, alchemy, ballads, and theology. The manuscript commonplace books in this collection demonstrate varying degrees and diverse methods of organization, reflecting the idiosyncratic interests and practices of individual readers.

A great collection of selections from commonplace books!

I am rather “lite” on posts for the day because I tried to chase down John Locke’s publication of A New Method of Making Common-Place-Books in French, circa 1686/87.

Unfortunately, the scanned version of Bibliotheque Universelle et Historique I was using listed “volumes” when they were actually four (4) issues per year, and the issue containing Locke’s earlier publication is missing. A translation appears in John Locke, The Works of John Locke in Nine Volumes (London: Rivington, 1824, 12th ed.), Vol. 2, which gives this reference:

Translated out of the French from the second volume of Bibliotheque Universelle.

You can view an image of the work at: http://lf-oll.s3.amazonaws.com/titles/762/0128-02df_Bk.pdf on page 441.

Someone who could not read Roman numerals gave varying dates for the “volumes” of Bibliotheque Universelle et Historique, which didn’t improve my humor. I will try to find a complete scanned set tomorrow and chase down the earlier version of A New Method of Making Common-Place-Books. My concern is the graphic that appears in the translation and what appear to be examples at the end. I wanted to confirm that both appear in the original French version.

Enjoy!

PS: I know, this isn’t as “practical” as functional programming or writing Pig or CUDA code, but on the other hand, understanding where you are going is at least as important as getting there quickly. Yes?

July 21, 2014

Commonplace Books

Filed under: Books,Critical Reading,Knowledge Networks — Patrick Durusau @ 6:52 pm

Commonplace Books as a Source for Networked Knowledge and Combinatorial Creativity by Shane Parrish.

From the post:

“You know that I voluntarily communicated this method to you, as I have done to many others, to whom I believed it would not be unacceptable.”

There is an old saying that the truest form of poverty is “when if you have occasion for any thing, you can’t use it, because you know not where it is laid.”

The flood of information is nothing new.

“In fact,” the Harvard historian Ann Blair writes in her book Too Much to Know: Managing Scholarly Information Before the Modern Age, “many of our current ways of thinking about and handling information descend from patterns of thought and practices that extend back for centuries.” Her book explores “the history of one of the longest-running traditions of information management — the collection and arrangement of textual excerpts designed for consultation.” She calls them reference books.

Large collections of textual material, consisting typically of quotations, examples, or bibliographical references, were used in many times and places as a way of facilitating access to a mass of texts considered authoritative. Reference books have sometimes been mined for evidence about commonly held views on specific topics or the meanings of words, and some (encyclopedias especially) have been studied for the genre they formed.

[…]

No doubt we have access to and must cope with a much greater quantity of information than earlier generations on almost every issue, and we use technologies that are subject to frequent change and hence often new. Nonetheless, the basic methods we deploy are largely similar to those devised centuries ago in early reference books. Early compilations involved various combinations of four crucial operations: storing, sorting, selecting, and summarizing, which I think of as the four S’s of text management. We too store, sort, select, and summarize information, but now we rely not only on human memory, manuscript, and print, as in earlier centuries, but also on computer chips, search functions, data mining, and Wikipedia, along with other electronic techniques.

Knowing some of the background on the commonplace book will be helpful:

Commonplace books (or commonplaces) were a way to compile knowledge, usually by writing information into books. Such books were essentially scrapbooks filled with items of every kind: medical recipes, quotes, letters, poems, tables of weights and measures, proverbs, prayers, legal formulas. Commonplaces were used by readers, writers, students, and scholars as an aid for remembering useful concepts or facts they had learned. Each commonplace book was unique to its creator’s particular interests. They became significant in Early Modern Europe.

“Commonplace” is a translation of the Latin term locus communis (from Greek tópos koinós, see literary topos) which means “a theme or argument of general application”, such as a statement of proverbial wisdom. In this original sense, commonplace books were collections of such sayings, such as John Milton’s commonplace book. Scholars have expanded this usage to include any manuscript that collects material along a common theme by an individual.

Commonplace books are not diaries nor travelogues, with which they can be contrasted: English Enlightenment philosopher John Locke wrote the 1706 book A New Method of Making a Common Place Book, “in which techniques for entering proverbs, quotations, ideas, speeches were formulated. Locke gave specific advice on how to arrange material by subject and category, using such key topics as love, politics, or religion. Commonplace books, it must be stressed, are not journals, which are chronological and introspective.” By the early eighteenth century they had become an information management device in which a note-taker stored quotations, observations and definitions. They were even used by influential scientists. Carl Linnaeus, for instance, used commonplacing techniques to invent and arrange the nomenclature of his Systema Naturae (which is still used by scientists today).

[footnote links omitted]

Have you ever had a commonplace book?

Impressed enough by Shane’s post to think about keeping one. In hard copy.

Curious how you would replicate a commonplace book in software?

Or perhaps better, what aspects of a commonplace book can you capture in software and what aspects can’t be captured?
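
Shane’s post doesn’t propose any implementation, but as a thought experiment, here is a minimal sketch in Python of the mechanical side software can capture: storing excerpts and filing them under topical headings for later retrieval, roughly Blair’s storing, sorting, and selecting. All of the names (Entry, CommonplaceBook, under) are hypothetical choices of mine, not anything from the post. The parts that resist capture, the compiler’s judgment about what is worth copying and the serendipity of rereading, don’t reduce to code.

from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class Entry:
    """One excerpt copied into the book."""
    text: str
    source: str
    headings: list = field(default_factory=list)  # topical headings, e.g. "politics"

class CommonplaceBook:
    """Files excerpts under topical headings, Locke-style, for later retrieval."""

    def __init__(self):
        self.entries = []               # everything, in the order it was copied
        self.index = defaultdict(list)  # heading -> entries filed under it

    def add(self, entry):
        """Store an excerpt and file it under each of its headings."""
        self.entries.append(entry)
        for heading in entry.headings:
            self.index[heading.lower()].append(entry)

    def under(self, heading):
        """Select everything filed under a given heading."""
        return self.index.get(heading.lower(), [])

# Usage: file one excerpt under two headings, then pull it back by topic.
book = CommonplaceBook()
book.add(Entry(
    text="The truest form of poverty is when you know not where a thing is laid.",
    source="Shane Parrish, quoting an old saying",
    headings=["memory", "information management"],
))
for entry in book.under("memory"):
    print(entry.source, "->", entry.text)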

I first saw this in a tweet by Aaron Kirschenfeld.

May 26, 2013

The Sokal Hoax: At Whom Are We Laughing?

Filed under: Critical Reading,Literature,Science — Patrick Durusau @ 12:44 pm

The Sokal Hoax: At Whom Are We Laughing? by Mara Beller.

The philosophical pronouncements of Bohr, Born, Heisenberg and Pauli deserve some of the blame for the excesses of the postmodernist critique of science.

The hoax perpetrated by New York University theoretical physicist Alan Sokal in 1996 on the editors of the journal Social Text quickly became widely known and hotly debated. (See Physics Today January 1997, page 61, and March 1997, page 73.) “Transgressing the Boundaries – Toward a Transformative Hermeneutics of Quantum Gravity,” was the title of the parody he slipped past the unsuspecting editors. [1]

Many readers of Sokal’s article characterized it as an ingenious exposure of the decline of the intellectual standards in contemporary academia, and as a brilliant parody of the postmodern nonsense rampant among the cultural studies of science. Sokal’s paper is variously, so we read, “a hilarious compilation of pomo gibberish”, “an imitation of academic babble”, and even “a transformative hermeneutics of total bullshit”. [2] Many scientists reported having “great fun” and “a great laugh” reading Sokal’s article. Yet whom, exactly, are we laughing at?

As telling examples of the views Sokal satirized, one might quote some other statements. Consider the following extrapolation of Heisenberg’s uncertainty and Bohr’s complementarity into the political realm:

“The thesis ‘light consists of particles’ and the antithesis ‘light consists of waves’ fought with one another until they were united in the synthesis of quantum mechanics. …Only why not apply it to the thesis Liberalism (or Capitalism), the antithesis Communism, and expect a synthesis, instead of a complete and permanent victory for the antithesis? There seems to be some inconsistency. But the idea of complementarity goes deeper. In fact, this thesis and antithesis represent two psychological motives and economic forces, both justified in themselves, but, in their extremes, mutually exclusive. …there must exist a relation between the latitudes of freedom df and of regulation dr, of the type df dr=p. …But what is the ‘political constant’ p? I must leave this to a future quantum theory of human affairs.”

Before you burst out laughing at such “absurdities,” let me disclose the author: Max Born, one of the venerated founding fathers of quantum theory [3]. Born’s words were not written tongue in cheek; he soberly declared that “epistemological lessons [from physics] may help towards a deeper understanding of social and political relations”. Such was Born’s enthusiasm to infer from the scientific to the political realm, that he devoted a whole book to the subject, unequivocally titled Physics and Politics [3].
(…)

A helpful illustration that poor or confused writing, accepted on the basis of “authority,” is not limited to the humanities.

The weakness of postmodernism does not lie exclusively in:

While publicly abstaining from criticizing Bohr, many of his contemporaries did not share his peculiar insistence on the impossibility of devising new nonclassical concepts – an insistence that put rigid strictures on the freedom to theorize. It is on this issue that the silence of other physicists had the most far-reaching consequences. This silence created and sustained the illusion that one needed no technical knowledge of quantum mechanics to fully comprehend its revolutionary epistemological lessons. Many postmodernist critics of science have fallen prey to this strategy of argumentation and freely proclaimed that physics itself irrevocably banished the notion of objective reality.

The question of “objective reality” can be answered only within some universe of discourse, such as quantum mechanics for example.

There are no reports of “objective reality” or “subjective reality” that do not originate from some human speaker situated in a cultural, social, epistemological, etc., context.

Postmodernists, Stanley Fish comes to mind, should have made the strong epistemological move of saying that all reports, of whatever nature, from literature to quantum mechanics, are reports situated in a human context.

The rules for acceptable argument vary from one domain to another.

But there is no “out there” where anyone stands to judge between domains.

Should anyone lay claim to an “out there,” you should feel free to ask how they escaped the human condition of context.

And for what purpose do they claim an “out there?”

I suspect you will find they are trying to privilege some form of argumentation or to exclude other forms of argument.

That is a question of motive and not of some “out there.”

I first saw this at Pete Warden’s Five short links.

January 11, 2012

Algorithms exercise: Find mistakes in Wikipedia articles

Filed under: Algorithms,Critical Reading — Patrick Durusau @ 8:06 pm

Algorithms exercise: Find mistakes in Wikipedia articles by René Pickhardt.

From the post:

Today I started an experiment I created an excercise for coursework in algorithms and data structures that is very unusuale and many people have been criticle if this was a good idea. The idea behind the exercise is that studens should read wikipedia articles to topics related to lectures and find mistakes or suggest things that could be improoved. Thereby I hope that people will do something that many people in science don’t do often enough: Read something critically and carefully and question the things that you have learnt. (more discussions after the exercise)

This is great! Not only can students practice thinking critically, but there is also a forum to test their answers: other users of Wikipedia.

Read the article, do the exercises and see how your critical reading skills fare.
