As it becomes more socially acceptable to halt a conversation to look something up on a smartphone, I wonder what is happening to our memories. Memorizing everyday facts doesn't seem to get much respect these days, at least not compared to simply having an iPhone at hand.
Outsourcing our memories to machines is unlikely to stop anytime soon, but it will be interesting to see how professions besides librarianship change in the coming years. Would you trust a physician who relies on an electronic device to remember standard diagnoses and dosages, for example? What about a researcher who can't spell? Reference librarians now do little work that involves looking up routine facts; increasingly we have shifted to assisting with informational processes related to comprehension, analysis, and integration.
Even if computers can beat us in the memory arena, memory is still a measure of human intelligence. Unfortunately this is only obvious when we are offline and disconnected, which is a decreasing amount of the time. But what makes us smarter than the machines we have created? We tend to redefine what we mean by intelligence in order to feel smarter than the machines, but there is less and less that machines don't 'know.' Here I'm echoing thoughts from a recent New Yorker article by Adam Gopnik:
"We have been outsourcing our intelligence, and our humanity, to machines for centuries. They have long been faster, bigger, tougher, more deadly. Now they are much quicker at calculation and infinitely more adept at memory than we have ever been. And so now we decide that memory and calculation are not really part of mind. ... We place the communicative element of language above the propositional and argumentative element, not because it matters more but because it's all that's left to us." [my bold]
Or maybe this is all wrong, and it's more accurate to say that our memories have been technologically enhanced to compensate for the increased quantity and availability of digital information. Machines may assist us, but we will continue to rely on our analog brains for the information -- even dry, fact-based information -- that we use every day at work and home. Recalling it ourselves is still quicker, at least until we embed microchips in our heads. But memorizing information we don't access regularly, just for the sake of it, is less and less necessary. Fair enough?