A 4,300-year-old city with a massive step pyramid, at least 230 feet (70 meters) high and spanning 59 acres (24 hectares) at its base, has been excavated in China, archaeologists reported in the August issue of the journal Antiquity.
The pyramid was decorated with eye symbols and “anthropomorphic,” or part-human, part-animal, faces. Those figures “may have endowed the stepped pyramid with special religious power and further strengthened the general visual impression on its large audience,” the archaeologists wrote in the article.
For five centuries, a city flourished around the pyramid. At one time, the city encompassed an area of 988 acres (400 hectares), making it one of the largest in the world, the archaeologists wrote. Today, the ruins of the city are called “Shimao,” but its name in ancient times is unknown.
The pyramid contains 11 steps, each of which was lined with stone. On the topmost step, there “were extensive palaces built of rammed earth, with wooden pillars and roofing tiles, a gigantic water reservoir, and domestic remains related to daily life,” the researchers wrote.
Yeah, geek stuff, but I love it. I used to be on the board of my local library, and we had treasures, too (our original collection was purchased for the library by Ben Franklin). Via Atlas Obscura:
Librarians take these kinds of questions very seriously, so when Atlas Obscura contacted some of our favorite libraries to ask about the oldest books in their collections, we were treated to a wealth of information about the treasures they hold.
The New York Public Library, for instance, has not only cuneiform tablets and ninth-century gospels, but also a Gutenberg Bible and a copy of The Bay Psalm Book, one of the oldest books printed in America. In addition to its own cuneiform tablets and Gutenberg Bible, the Library of Congress holds one of the oldest examples of printing in the world, passages from a Buddhist sutra, printed in A.D. 770, as well as a medieval manuscript from 1150, delightfully titled Exposicio Mistica Super Exod.
In the history of writing, bound books as we know them today arrive fairly late, so there are no actual “books” on this list. Instead, this is a wondrous collection of illuminated manuscripts, papyrus scrolls, and clay tablets. Some of these items you can even see in person, if you pay a visit.
I go out of my way to find harmless diversions during the age of Cheeto, and one of my new favorites is a series called Strange Evidence, from the Science Channel.
Basically, they document strange phenomena, and then get scientists to try to figure them out and recreate them. It does have that annoying fast-cut editing that you see on ghost shows, and that sinister music, but it’s really just plain old science.
My favorite episode was the one where an Egyptian tomb statue was caught on the Manchester Museum’s security camera turning around. (For what it’s worth, I figured it out. I’ve figured most of them out, but it’s still fun.)
The Russian attacks on the 2016 U.S. presidential election and the country’s continuing election-related hacking have happened across all three dimensions of cyberspace – physical, informational and cognitive. The first two are well-known: For years, hackers have exploited hardware and software flaws to gain unauthorized access to computers and networks – and stolen information they’ve found. The third dimension, however, is a newer target – and a more concerning one.
This three-dimensional view of cyberspace comes from my late mentor, Professor Dan Kuehl of the National Defense University, who expressed concern about traditional hacking activities and what they meant for national security. But he also foresaw the potential – now clear to the public at large – that those tools could be used to target people’s perceptions and thought processes, too. That’s what the Russians allegedly did, according to federal indictments issued in February and July, which lay out evidence that Russian civilians and military personnel used online tools to influence Americans’ political views – and, potentially, their votes. They may be setting up to do it again for the 2018 midterm elections.
Some observers suggest that using internet tools for espionage and as fuel for disinformation campaigns is a new form of “hybrid warfare.” Their idea is that the lines are blurring between the traditional kinetic warfare of bombs, missiles and guns, and the unconventional, stealthy warfare long practiced against foreigners’ “hearts and minds” by intelligence and special forces capabilities.
However, I believe this isn’t a new form of war at all: Rather, it is the same old strategies taking advantage of the latest available technologies. Just as online marketing companies use sponsored content and search engine manipulation to distribute biased information to the public, governments are using internet-based tools to pursue their agendas. In other words, they’re hacking a different kind of system through social engineering on a grand scale.
Old goals, new techniques
More than 2,400 years ago, the Chinese military strategist and philosopher Sun Tzu made it an axiom of war that it’s best to “subdue the enemy without fighting.” Using information – or disinformation, or propaganda – as a weapon can be one way to destabilize a population and disable the target country. In 1984, a former KGB agent who defected to the West described this as a long-term process and more or less predicted what’s happening in the U.S. now.
The Russians created false social media accounts to simulate political activists – such as @TEN_GOP, which purported to be associated with the Tennessee Republican Party. Just that one account attracted more than 100,000 followers. The goal was to distribute propaganda, such as captioned photos, posters or short animated graphics, purposely designed to enrage and engage these accounts’ followers. Those people would then pass the information along through their own personal social networks.
Starting from seeds planted by Russian fakers, including some who claimed to be U.S. citizens, those ideas grew and flourished through amplification by real people. Unfortunately, whether originating from Russia or elsewhere, fake information and conspiracy theories can form the basis for discussion at major partisan media outlets.
As ideas with niche online beginnings move into the traditional mass media landscape, they serve to keep controversies alive by sustaining divisive arguments on both sides. For instance, one Russian troll factory had its online personas host rallies both for and against each of the major candidates in the 2016 presidential election. Though the rallies never took place, the online buzz about them helped inflame divisions in society.
The trolls also set up Twitter accounts purportedly representing local news organizations – including defunct ones – to take advantage of Americans’ greater trust in local news sources than national ones. These accounts operated for several years: one for the Chicago Daily News, which closed in 1978, was created in May 2014 and collected 20,000 followers by passing along legitimate local news stories, likely seeking to win followers’ trust ahead of future disinformation campaigns. Though shut down before they could fulfill that end, these accounts cleverly aimed to exploit the fact that many Americans’ political views cloud their ability to separate fact from opinion in the news.
These sorts of activities are functions of traditional espionage: Foment discord and then sit back while the target population becomes distracted arguing among themselves.
Fighting digital disinformation is hard
Analyzing, let alone countering, this type of provocative behavior can be difficult. Russia isn’t alone, either: The U.S. tries to influence foreign audiences and global opinions, including through Voice of America online and radio services and intelligence services’ activities. And it’s not just governments that get involved. Companies, advocacy groups and others also can conduct disinformation campaigns.
Unfortunately, laws and regulations are ineffective remedies. Further, social media companies have been fairly slow to respond to this phenomenon. Twitter reportedly suspended more than 70 million fake accounts earlier this summer. That included nearly 50 social media accounts like the fake Chicago Daily News one.
Facebook, too, says it is working to reduce the spread of “fake news” on its platform. Yet both companies make their money from users’ activity on their sites – so they are conflicted, trying to stifle misleading content while also boosting users’ involvement.
Real defense happens in the brain
The best protection against threats to the cognitive dimension of cyberspace depends on users’ own actions and knowledge. Objectively educated, rational citizens should serve as the foundation of a strong democratic society. But that defense fails if people don’t have the skills – or worse, don’t use them – to think critically about what they’re seeing and examine claims of fact before accepting them as true.
American voters expect ongoing Russian interference in U.S. elections. In fact, it appears to have already begun. To help combat that influence, the U.S. Justice Department plans to alert the public when its investigations discover foreign espionage, hacking and disinformation relating to the upcoming 2018 midterm elections. And the National Security Agency has created a task force to counter Russian hacking of election systems and major political parties’ computer networks.
These efforts are a good start, but the real solution will begin when people start realizing they’re being subjected to this sort of cognitive attack and that it’s not all just a hoax.
Grappling with the question, “Are we alone?”
It’s all very interesting, but since quantum theory has opened our thinking to the possibility of so many alternate realities, why do physicists assume that the laws of physics truly are universal?
What if “reality” really is only what we imagine it to be?