Archive for the ‘Wikipedia’ Category

Walter Scott, the 88-year-old Guardian subeditor who was a walking Wikipedia – The Guardian

My great-uncle Walter Scott worked for the Guardian as a foreign affairs subeditor from 1916 until his retirement in 1963, aged 88 ("The changing art of the subeditor: 'You had to read the type upside down'", 2 August). He was due to retire in 1939, just as war broke out and younger men in the office enlisted.

A small team under the leadership of EA Montague vowed to continue printing and distributing the paper throughout the war, even if the country came under enemy occupation. On 9 July 1940, Walter received a memo saying that in the event of an invasion of Britain, he would be notified by a secret code. The message would read: "Private and confidential, not for publication: sortie 03.30 hours." If he received this, Walter was instructed to immediately notify all war correspondents around the world, as their lives would be endangered.

A particular friend of Walter's was James Bone (London editor of the Manchester Guardian, 1912-45). He frequently invited Walter for weekends at his country cottage, Abbots Holt, in Tilford, Surrey. At his retirement dinner, Bone made a speech paying tribute to the people he worked with, and said: "and the great Walter, who has carried so much of the London end on his shoulders. He has borne with me so patiently for so long and I fear I have not benefited as I should from his kindly admonitions." The last note from James to Walter, dated 6 September 1959, asks: "[Do you] like the Guardian's new name? It's the penalty that Manchester had to pay the world!"

Walter's services were needed as he was the go-to person for knowledge of foreign affairs. The management were concerned for his wellbeing and sent the following note: "KAS [the office manager] says that he is quite content so long as we tell Walter Scott in writing that he has complete freedom to stay away in bad weather or whenever he feels slightly below par."

Walter's command and memory of foreign affairs meant he was consulted long into his retirement; he was their Wikipedia.
Anita Scott
Farnham, Surrey



Quiznos Fast-Food Chain Partners With Bakkt To Accept Bitcoin For Sandwiches – Benzinga

Fast-food chain Quiznos partnered with Intercontinental Exchange Inc (NYSE: ICE)-owned cryptocurrency financial services platform Bakkt to start accepting Bitcoin (CRYPTO: BTC) for sandwiches at select Denver locations.

What Happened: According to a Wednesday CoinDesk report, Quiznos locations, including the one at the high-traffic Denver airport, will start accepting Bitcoin in mid-August.

Customers who download the Bakkt mobile application and pay with the coin will receive a $15 reward in Bitcoin.

Bakkt Chief Revenue Officer Sheela Zemlin said that the company would pay close attention to how this pilot program performs and consider expanding it "to additional Quiznos locations across the country."

The firm has been actively recruiting retail chains for some time, and recently announced that its users would also be able to pay with Bitcoin at more than a million Cantaloupe retail devices.

Mark Lohmann, president of Quiznos parent company REGO Restaurant Group, said that he appreciates that the partnership allows his firm to accept Bitcoin "directly at the point of sale." He believes that this is also an answer to "mobile and millennial consumer demand for alternative and cryptocurrency payment options."

Price Action: According to CoinMarketCap data, Bitcoin is now trading at $39,398 after gaining 3.09% in value over the last 24 hours. The current price is roughly 40% down from mid-April's high of over $64,000, but also 66% higher than late July's low of under $23,150.
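The round percentage figures above are easy to sanity-check. A minimal sketch, using only the prices quoted in this article (note that the article's 40% and 66% are rounded against a precise, unquoted high and low, so the computed moves land near, not exactly on, those figures):

```python
def percent_change(current, reference):
    """Percent difference of `current` relative to `reference`."""
    return (current - reference) / reference * 100

price = 39_398       # current price quoted in the article
april_high = 64_000  # "over $64,000" mid-April high
july_low = 23_150    # "under $23,150" late-July low

print(f"vs. mid-April high: {percent_change(price, april_high):.1f}%")  # -38.4%
print(f"vs. late-July low:  {percent_change(price, july_low):.1f}%")    # 70.2%
```

Because the quoted reference prices are themselves bounds ("over" / "under"), the true percentage moves against the actual high and low would differ slightly from these computed values.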

Photo: Salmonpepperrice via Wikipedia

© 2021 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.


The CDC Should Be More Like Wikipedia – The Atlantic

Much as his predecessors warned Americans against tobacco and opioid abuse, Surgeon General Vivek Murthy issued a formal advisory last Thursday that misinformation, such as the widespread propaganda now sowing doubts about coronavirus vaccines on social media, is an urgent threat to public health. It is, but the discussion soured quickly. After President Joe Biden said social-media platforms that turn users against vaccines are "killing people," an anonymous Facebook official told CNN that the White House is "looking for scapegoats for missing their vaccine goals." When Press Secretary Jen Psaki said the White House is flagging problematic posts for Facebook, conservatives and Twitter contrarians inferred that the government was telling the company to censor people. The journalist Glenn Greenwald described the effort as "fascism."

Greenwald and Facebook are minimizing a genuine problem: An "infodemic," involving the viral spread of misinformation as well as the mingling of facts with half-truths and falsehoods in a fractured media environment, has compounded the COVID-19 pandemic. But critics of Murthy's initiative and Biden's comments are right about one thing: The official health establishment has made the infodemic worse through its own inability to cope with conflicting scientific views. In the early days of the pandemic, experts at the World Health Organization, CDC Director Robert Redfield, National Institute of Allergy and Infectious Diseases Director Anthony Fauci, and then-Surgeon General Jerome Adams discouraged mask wearing and only belatedly reversed course; some of the same voices later pooh-poohed the notion that the coronavirus first began spreading after escaping from a Chinese research lab, a possibility now being taken far more seriously.


Anti-vaccination propagandists and social-media provocateurs alike have exploited these missteps to great effect; even those inclined to trust the government have lost some confidence in official pronouncements. If the Biden administration hopes to reverse that, it should ask itself: What could the CDC do differently if the lab-leak hypothesis first surfaced today?

What the United States needs if it hopes to combat misinformation is a better system for communicating with the publica system that keeps up with continuous changes in scientific knowledge; that incorporates expertise from people in a variety of fields, not just those anointed with official titles at well-known institutions; and that weaves dissenting perspectives into a larger narrative without overemphasizing them.

Fortunately, the internet has produced a model for this approach: Wikipedia. The crowdsourced reference site is the simplest, most succinct summary of the current state of knowledge on almost any subject you can imagine. If an agency such as the CDC launched a health-information site, and gave a community of hundreds or thousands of knowledgeable people the ability to edit it, the outcome would be far more complete and up-to-date than individual press releases. The same modeltapping distributed expertise rather than relying on institutional authoritycould be useful for other government agencies that find themselves confronting rumors.


The idea of making government websites more like Wikipedia may sound far-fetched, even comical. People of a certain age (people such as me) remember our teachers telling us, "Wikipedia is not a source!" And yet, over two decades, Wikipedia has flourished. Though perhaps still not citable for academic work, the site provides reliable, up-to-date information about millions of topics, backed by robust sourcing. And it meets the needs of the moment: the incorporation of a wide swath of voices; transparency about who is saying what; and a clear accounting, via the Talk page accompanying each entry, of every change to the consensus narrative.

An officially sanctioned but broadly sourced version of Wikipedia for health matters could also serve as a robust resource for Facebook, Twitter, YouTube, and other social-media companies to point their users to. Tech platforms are currently expected to counter misinformation by amplifying authoritative sources, but they are also aware that simply linking to the CDC's and WHO's official sites is not resonating with many audiences. When internet users show that they trust crowdsourced information more than any one agency's pronouncements, figuring out how to generate the best crowdsourced information possible is a matter of urgency.

As a researcher, I study misinformation, but I'm also concerned about threats to freedom of expression. Although health misinformation can cause significant harm to communities, heavy-handed content moderation, even when intended to limit that harm, exacerbates deep distrust and fears of censorship. Knowledge evolves. New facts should change people's minds. Sometimes, as with masks, the loudest calls to reconsider the prevailing consensus come from those outside of government.

Murthy's advisory recognizes this: "It is important to be careful and avoid conflating controversial or unorthodox claims with misinformation," he writes. "Transparency, humility, and a commitment to open scientific inquiry are critical." Forthrightly acknowledging that consensus does change and that, at key moments, the government does not yet know all the facts might help rebuild the public's trust; at a minimum, it might minimize the impact of the tedious "Gotcha!" tweets that present two seemingly conflicting headlines as evidence of wholesale expert, media, and government incompetence.


Wikipedia, with its army of 97,000 volunteers contributing to COVID-related pages, has already been forced to confront the challenges of the lab-leak hypothesis, an emblematic example of the difficulty of trying to fact-check online information when scientific consensus is in flux or has not yet formed. The Talk page linked to the Wikipedia entry on the origin of the coronavirus provides visibility into the roiling editing wars. Sock-puppet accounts descended, trying to nudge the coverage of the topic to reflect particular points of view. A separate page was created, dedicated specifically to the COVID-19 lab-leak hypothesis, but site administrators later deleted it, a decision that remains in dispute within the Wikipedia community. The Talk pages for some pandemic-related entries have been labeled with one of the site's standard warnings: "There have been attempts to recruit editors of specific viewpoints to this article. If you've come here in response to such recruitment, please review the relevant Wikipedia policy on recruitment of editors, as well as the neutral point of view policy. Disputes on Wikipedia are resolved by consensus, not by majority vote."

On June 17, the site's supreme court, the Arbitration Committee, decided to place COVID-19 pages under "discretionary sanctions," a rubric that involves a higher standard of administrator oversight and greater friction in the editing process, and that is already in place for other topics such as abortion, the Arab-Israeli conflict, and Falun Gong. But the point is that Wikipedia has developed a consistent framework to handle these turbulent topics. The site has clearly articulated guidelines to foster the incorporation of the most accurate information and provide visibility into exactly how the current version of any entry came about. These are significant achievements.

Maintaining and expanding the site requires countless hours of volunteer labor. Because laypeople may not be able to evaluate the significance of highly technical scientific findings, a Wikipedia-style communications model for government would require tapping a variety of reputable contributors, including people outside of government, as initial authors or editors, who would then invite others to join the effort, perhaps for a set term. The editorial conversations, the process of mediating consensus, would be viewable by everyone, so allegations of backroom dealing would not be credible.

Ultimately, Wikipedia remains a platform on which consensus develops in full public view. In fact, some other platforms, including YouTube, chose to point to Wikipedia quite prominently beginning in 2018, in efforts to direct people toward reliable information as they watched videos discussing various conspiracy theories, such as one about the 1969 moon landing. Wikipedia is regularly the top link in search results, suggesting that internet users rely on it even though they understand the limitations of a source written, and constantly rewritten, by pseudonymous volunteer authors. During the pandemic, platforms have struggled to decide which posts are misinformation and how to direct users to authoritative sources. A Wikipedia-style deployment of distributed expertise and transparent history is promising regardless of whether we're talking about how a novel coronavirus spreads, or what happened to some ballots in a dumpster, or what really transpired in the latest viral protest video.

Although Biden blamed Facebook and other social-media platforms for the spread of misinformation, Murthy's advisory offers useful advice to everyone in the media ecosystem. Limiting the spread, he declares, is "a moral and civic imperative" that will require "a whole-of-society effort." Physicians can use social media themselves, to counter bad information with good. Journalists can avoid publishing clickbait headlines and more carefully evaluate studies that have yet to be peer-reviewed. Tech platforms can redesign algorithms and product features to surface reliable information about health. And individual social-media users can think before they share things online.

The surgeon general's exhorting ordinary Americans to do their part in stopping viral misinformation is a remarkable acknowledgement that, in the modern information environment, the distribution of power has shifted. The unfortunate irony is that a surgeon general's advisory may not break through the noise, or may immediately become fodder in a roiling, unending online battle.

Public officials who hope to solve problems in this environment need to be willing to try new tacticsand not just on matters of health. Any message that agencies put before citizens will be richer if shaped by processes that account for the changed relationship between fact and opinion, between expertise and influence, and between the public and its leaders.


The science – Capillary leak syndrome – Johnson and Johnson vaccination warning – Wikipedia – The Weston Forum

Langen (dpa) – People who have previously experienced the extremely rare capillary leak syndrome should not receive the coronavirus vaccine from Johnson & Johnson.

This stems from the manufacturer's so-called Rote-Hand-Brief (a "red-hand letter," Germany's standard drug-safety notice to healthcare professionals), distributed by the Paul-Ehrlich-Institut (PEI). According to it, very rare cases of capillary leak syndrome, in some cases fatal, have been reported in the first few days after administration of this vaccine.

Thus, the vaccine is contraindicated in people who have previously experienced capillary leak syndrome. Capillary leak syndrome is an extremely rare but potentially life-threatening disease. According to the information, it is characterized, among other things, by severe episodes of edema (fluid retention), especially in the extremities, and by hypotension (low blood pressure). Immediate treatment is necessary if symptoms develop after a Johnson & Johnson vaccination.

According to the vaccination information sheet, people who have experienced capillary leak syndrome should also not be vaccinated with AstraZeneca's vaccine (Vaxzevria). Of the very rare cases that have occurred, it says: "Capillary leak syndrome appears in the first few days after vaccination and is characterized by rapidly progressing swelling of the arms and legs, sudden weight gain and a feeling of weakness. Immediate medical treatment is required."

dpa-infocom, dpa: 210719-99-436866 / 4



Wikipedia: The Most Reliable Source on the Internet? – PCMag

(Photo by Ali Balikci/Anadolu Agency via Getty Images)

Wikipedia is a fascinating corner of the web, a font of knowledge that leads to unexpected places. But as any teacher or professor will tell you, it's not a primary source. Use it as a jumping-off point, but scroll to the bottom and seek out original sources for the "truth."

Is that fair? Is Wikipedia indeed a repository for half-truths? It's a topic that Professor Amy Bruckman from the Georgia Institute of Technology's School of Interactive Computing has researched extensively and examines in her book Should You Believe Wikipedia?, out in 2022 from Cambridge University Press.

Her conclusions may surprise you. Ahead of a September keynote at IntelliSys 2021, we spoke to Professor Bruckman, a Harvard grad who holds a PhD from the MIT Media Lab, about how to test assumptions, and about the definition of truth and existence, in an era of misinformation.

Before we get to Wikipedia, your wider research focuses on the field of "social computing," which includes ethics, research, content creation and moderation, plus social movements. When did you first encounter web-based communities? [AB] Around 1990, I was a grad student at the MIT Media Lab and my friend Mike Travers showed me a model of MIT in a multi-user, text-based virtual world. He had programmed a bot of his advisor, Marvin Minsky. Virtual Marvin would automatically start off in his office in the Media Lab, walk across campus to a classroom, and deliver a lecture at the correct time on Tuesdays and Thursdays, reading a chapter of his book, Society of Mind. It was magic. I was hooked.

And was that when you built your first multiplayer real-time world? [AB] Yes, that was when I built MediaMOO, a multi-user text-based world designed to be a professional community for media researchers. Then my dissertation project was a virtual world for kids called MOOSE Crossing, where kids built the world together and learned object-oriented programming and practiced their creative writing.

Many people have fond memories of using MOOSE Crossing as kids. In fact, there was something on NPR about it last year. But these were early days in collaborative computing. What were you running MOOSE Crossing and MediaMOO on? [AB] Well, this was before the invention of the web, and we were using computers running the UNIX operating system. The internet wasn't yet a mass medium, but we could see that it would be, and the potential was exciting.

Which brings us to Wikipedia. Many of us consult it, slightly wary of its bias, depth, and accuracy. But, as you'll be sharing in your speech at IntelliSys, the content actually ends up being surprisingly reliable. How does that happen? [AB] The answer to "should you believe Wikipedia?" isn't simple. In my book I argue that the content of a popular Wikipedia page is actually the most reliable form of information ever created. Think about it: a peer-reviewed journal article is reviewed by three experts (who may or may not actually check every detail), and then is set in stone. The contents of a popular Wikipedia page might be reviewed by thousands of people. If something changes, it is updated. Those people have varying levels of expertise, but if they support their work with reliable citations, the results are solid. On the other hand, a less popular Wikipedia page might not be reliable at all.

Because few people access that page, or know/care enough about the subject to correct/challenge them? Which brings us to the big ideas behind what is truth, and how we reach it. [AB] In my book and my talk at IntelliSys, I try to teach everyone a bit of basic epistemology, and show how that helps us better understand the internet. I believe ideas like virtue epistemology can help us to improve the quality of the internet going forwards.

Okay, virtue epistemology is definitely a big idea. Give us a working definition, and how it applies to Wikipedia. [AB] Virtue epistemology suggests that knowledge is a collaborative achievement, and we all can work to achieve knowledge (justified, true belief) by aspiring to epistemic virtues: "curiosity, intellectual autonomy, intellectual humility, attentiveness, intellectual carefulness, intellectual thoroughness, open-mindedness, intellectual courage and intellectual tenacity." Being someone who is careful with knowledge is a lifelong quest, and trying to embody those virtues helps.

So if someone embodies those virtues, we expect them to be in pursuit of noble truth. But how do we know what is true? [AB] The real world exists, but is only knowable through our fallible senses. But that doesn't mean that reality is subjective. Am I sitting on a chair? You see it with your senses and I with mine, but we agree that there is something called a "chair," and I am sitting on one. The high degree of correlation between my subjective perceptions and your subjective perceptions is caused by the fact that the world exists: there's really a chair. The more people agree on something, the more we can be sure of it. And the more those people possess what we would call "reliable cognitive processes," the more we can be sure of it. So let's pick a harder example than my chair: Is human activity changing the climate? We know the answer is yes because a large number of people with reliable cognitive processes agree. Truth exists independent of the knower, but social consensus is our best way of figuring out what that truth is.

Most of us exist inside a bubble of similarly minded folks, which shores up our confirmation bias. Can you explain that concept with regard to Wikipedia too? [AB] I'm not actually a climate scientist. I know that human activity is changing the climate because I have chosen sources I trust. And I interact with a community of people (in person and online) who share my views. When everyone around me believes that human activity is changing the climate, it's easier for me to decide that it's worth extra money to buy a car with a hybrid engine. I live in a bubble of like-minded folks. That's good most of the time. I don't have to go get a degree in climatology before I go car shopping. But there are growing numbers of bubbles of people who share false beliefs, and reinforce those beliefs in one another. That's a problem for the internet in general.

What happens with false beliefs on Wikipedia? [AB] Maybe the biggest surprise of the internet to me is that false bubbles generally are corrected on Wikipedia. Even if you pick a controversial topic like climate change or vaccination, the Wikipedia page typically reflects mainstream scientific consensus. Something about Wikipedia is working better than the rest of the internet, and I think we can learn from it as a positive model.

Can metadata help? [AB] Metadata is critical to the future of the internet. We all need help deciding what to believe. It would help a lot if information came with a reliability rating. But there's nothing easy about creating those ratings. We need both a method of judging what is reliable, and a financial model to pay for the process of creating those labels.

If Wikipedia is a good example of mass peer review, then it can also incorporate testimony to establish a baseline of truth, right? For example, I was invited by the USC Shoah Foundation, which was founded by Steven Spielberg, to see their recording of Holocaust testimonies for future generations. But eyewitness accounts often don't pass into 'truth' or are considered too subjective on Wikipedia. Is that when we have to urge people to look to wider sources? [AB] You need an intermediate layer: interpretation of primary sources by a Holocaust scholar. That's the difference between a work of history and an encyclopedia. The job of a historian is to synthesize primary sources and form an interpretation. The job of an encyclopedia is to summarize work by historians and give you a list of links to go read if you want to learn more.

Wikipedia also asks us to educate ourselves, and then share that knowledge, as subject matter experts. Can you talk about your personal non-academic experience here? [AB] Editing Wikipedia can be a lot of fun. I used to help with the page on trash cans. The group of people working on the page had a long conversation about the words "bin" versus "can" and how the name for a waste receptacle varies around the world. There's a mini-golf course near my parents' house, and the trash can is shaped like a dolphin. I added a picture of it to the article, and the next time I visited I showed the owner that his trash can was famous. The photo was there for a decade or so. Someone has since taken it down, sadly. But I enjoy contributing to something meaningful. Sometimes even things less silly than trash cans.

Finally, and this is going pretty deep, is any of this true? That's where, as you've pointed out, metaphysics comes in, and where we get to look up at the sky and wonder 'am I dreaming this life?' Discuss. [AB] One thing you learn quickly when you hang out with epistemologists is that truth exists. How we agree what that truth is can be tricky. But the affordances of internet technology are a surprisingly good fit for how knowledge is constructed. I think we can learn a lot from success stories like Wikipedia. And maybe apply those lessons to the design of the rest of the internet.

