Wikipedia is the arbiter of truth on the internet. It supplies answers for the information snippets you see on your Google or Bing search results. It's the first stop for nearly everyone doing online research.

The reason people rely on Wikipedia, despite its imperfections, is that every claim is supposed to have citations. Any sentence that isn't backed up with a credible source risks being slapped with the dreaded "citation needed" label. Anyone can check out those citations to learn more about a subject, or verify that those sources actually say what a particular Wikipedia entry claims they do; that is, if you can find those sources.

It's easy enough when the sources are online. But many Wikipedia articles rely on good old-fashioned books. The entry on Martin Luther King Jr., for example, cites 66 different books. Until recently, if you wanted to verify that those books say what the article says they say, or if you just wanted to read the cited material, you'd need to track down a copy of the book.

You can, of course, still verify the information the traditional way, by tracking down a physical copy. But students working late into the night on term papers, or reporters on tight deadlines, might not have time to order a book on Amazon or wait for a library book to become available. In other cases, books might simply be hard to come by. The Wikipedia entry on the internment of Japanese-Americans during World War II, for example, cites hard-to-find titles, says Internet Archive director of partnerships Wendy Hanamura. But thanks to the Internet Archive's Digital Library of Japanese-American Incarceration, created with the Seattle-based organization Densho, many of those rare books are now available online.

The Internet Archive embarked on its effort to weave digital books into Wikipedia after the 2016 election. "No matter who you wanted to be president, I would say almost everyone would agree the whole process was a train wreck," Internet Archive founder Brewster Kahle said in a speech in San Francisco last week. From fake news and inauthentic social media campaigns waged by foreign nations to concerns about voting systems themselves being rigged, there were plenty of ways that technology and information systems failed the public. So Kahle convened a group of people to discuss how to improve the information ecosystem.

One issue that came up was the fragility of Wikipedia citations. Books and academic journals supply some of the best, most reliable information for Wikipedia editors, but those sources frequently are either unavailable online or behind paywalls. And even freely available internet content often disappears.

The Internet Archive was in a unique position to help solve this problem. The organization's Wayback Machine service has archived 387 billion webpages since 2001. It has also been digitizing physical books and other analog media, and has now scanned 3.8 million books. Wayback Machine director Mark Graham and company created the InternetArchiveBot, a tool that scans Wikipedia for broken links and automatically adds links to versions archived in the Wayback Machine. Because automatic editing tools require special permission to use, Graham has to work with the Wikipedia communities that manage versions of the encyclopedia in different languages.
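InternetArchiveBot's actual codebase is not shown here, but the core of the link-repair idea can be sketched against the Wayback Machine's public Availability API (`https://archive.org/wayback/available?url=<url>`), which returns JSON describing the closest archived snapshot of a page. The helper below is a minimal illustration of that substitution step, not the bot's real logic:

```python
from typing import Optional


def archived_url(api_response: dict) -> Optional[str]:
    """Given a JSON response from the Wayback Machine Availability API,
    return the closest archived snapshot's URL to substitute for a dead
    link, or None if the page was never captured."""
    snapshot = api_response.get("archived_snapshots", {}).get("closest")
    if snapshot and snapshot.get("available"):
        return snapshot["url"]
    return None


# Example response in the shape the Availability API returns:
response = {
    "url": "example.com",
    "archived_snapshots": {
        "closest": {
            "available": True,
            "url": "http://web.archive.org/web/20130919044612/http://example.com/",
            "timestamp": "20130919044612",
            "status": "200",
        }
    },
}

print(archived_url(response))
# -> http://web.archive.org/web/20130919044612/http://example.com/
```

A bot built this way would check each citation URL, and only when the live page is unreachable would it swap in the archived snapshot, so readers always land on a working copy.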