What’s the big deal with Information Literacy?

We all love the convenience of looking for answers immediately and having “Google” in our pockets. My dad, in his eighties, said, “People don’t really need to know anything anymore because they can just google it.” What a wise statement from a pre-Internet veteran. We have digital assistants on our devices to help us remember important things and manage our time. We are free of the burden of having to remember small things and instead access and share information nonstop on a global scale.

A generation of citizens is emerging who have never known life without a networked mobile device and instant access to information. With that convenience comes the sacrifice both of time to reflect and of any guarantee of accuracy and quality in the hits we receive.

[image: Abraham Lincoln information literacy meme]

An information literacy colleague, Esther Grassian, advocates for information literacy and explains why it is a big deal. Information literacy is not simply an academic term; it is a way of understanding our current culture in a networked society.

A friend posted an insightful quote on Facebook which, I noticed, had been liked and shared by many, but without any attribution to its creator. I asked if she knew the source and she replied, “People share these things all the time now and nobody ever really knows where they came from.” I find it perplexing that this smart and tech-savvy young woman would simply shrug off intellectual property with a “Who knows? Who cares?” attitude.

In Beyond the Blogosphere, Aaron Barlow gives us the image of the Internet as a “book of sand” in which nobody knows the origin of ideas; they are washed out to sea and scattered along the beach.

If we really don’t care about information literacy, there will be a high price to pay in the future. Idiocracy might be a concept too difficult for the citizens of the future to comprehend. The fear of artificial intelligence evolving into consciousness pales in comparison to the prospect of human beings shrugging off any desire to acquire knowledge simply because easy access and quick apps have made it seem unnecessary.

Wait!! What? Everyone is “Elsewhere”: Conversation in the Digital Age (Part Two)

My review of Sherry Turkle’s Reclaiming Conversation continues with the concept of avoiding boredom or anxiety in our lives by “going elsewhere” on our phones. Those “boring bits of life” and the worries that creep into our minds can be escaped by scrolling our news feeds and connecting with our online networks. Turkle suggests we consider the value of contemplation during brief moments of boredom and anxiety because, for humans, this “thinking through” leads to problem-solving and creativity. A lull in a conversation gives us time to reflect on the people around us. But today, our mobile devices tug at us to go “elsewhere.”

And when we get to this other place on our devices, the activity is nonstop. It has become acceptable to backchannel through conferences, business meetings, events, and even mealtime. Interruption is now considered simply another mode of connectivity. Our brains love the stimulation of endless diversion, but we never feel we can really keep up. Turkle says, “Only half joking, people in their teens and twenties tell me that the most commonly heard phrase at dinner with their friends is ‘Wait, what?’ Everyone is always missing a beat, the time it takes to find an image or send a text” (p. 37).

Certainly, the constant companionship we carry in our pockets can be used for good and I remain hopeful that the future can bring positive uses for technology in our daily lives. Stay tuned for more of the warnings we may need to heed as presented in this book and the possible solutions we are urged to embrace before it is too late!

Part of digital citizenship (and information literacy) is giving ourselves a healthy information diet. Just as our bodies suffer consequences if we eat only tasty junk food and sweets, so too our minds are at risk in an age of constant digital intake and interruption.
[image: zombies and cell phones meme]

Photo from http://weknowmemes.com/2012/07/whats-the-point-of-being-afraid-of-the-zombie-apocalypse/

Search is Changing

How we search for information is becoming more and more personalized. The personal dashboards we create on our devices make access to information easy and convenient, but how many of us realize the personal responsibility we now have to choose wisely each day, each moment?

Yes, information literacy has become a critical personal skill. “Like us on Facebook. Follow us on Twitter.” We hear it all around us. The likes and follows grow like giant sea monsters in the ocean of information chaos. Sure, we all love Google, but do we understand that the responsibility is on each of us to evaluate what the algorithms have specifically selected for us? Do we really care?

As learners and citizens, we are all in this together. Come on, Google, give me the best answers for everyone. Not the best answers based on who you think I am. Of course, there are other search engines, but we all use google as a verb.
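To make that personalization concrete, here is a toy sketch (in Python) of how re-ranking results against a user profile might work. Everything in it, the topics, the weights, the results, is invented for illustration; real search engines use far richer signals than this.

```python
# A toy, hypothetical sketch of re-ranking search results with a user profile.
# Topics, weights, and results are invented; real engines use far richer signals.

def personalize(results, profile):
    """Re-rank results by boosting topics this user has shown interest in."""
    def score(result):
        base = result["relevance"]                 # query relevance, same for everyone
        boost = profile.get(result["topic"], 0.0)  # personal interest weight
        return base + boost
    return sorted(results, key=score, reverse=True)

results = [
    {"title": "Evaluating sources", "topic": "information literacy", "relevance": 0.8},
    {"title": "Best hiking boots", "topic": "shopping", "relevance": 0.8},
]

librarian = {"information literacy": 0.5}  # profile built from past clicks (invented)
shopper = {"shopping": 0.5}

print([r["title"] for r in personalize(results, librarian)])  # research skills rise to the top
print([r["title"] for r in personalize(results, shopper)])    # boots rise to the top
```

Two people typing the same query get the results in a different order, which is exactly why the burden of evaluation now falls on each of us.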

For those concerned about the future of information literacy, consider including information professionals in your personal learning network. You will need them. Thanks to Sheila Webber for sharing this slideshow by Phil Bradley.

War and Peace: Reflection on Literature in the Digital Age (Part 2)

The reward of reading War and Peace was much more than being able to say I got through it. The reward of reading is always the experience of deep thought. Having viewed both a movie version and a made-for-TV version, I had an idea of the overall story, but a reader not only has the creative privilege of designing the sets, the costumes, and the actors; the reader also gets to contemplate the philosophical concepts embedded throughout. And War and Peace certainly is chock full of philosophical meanderings. What does it mean to exist as a human? What is power, and what is free will? What are truth, beauty, and love? Did God design it all? This book is so much more than the story.

The story of War and Peace is basically a love story, which surprised me! Although there are vivid battle scenes from Napoleon’s 1812 invasion of Russia, descriptions of society a hundred years before Downton Abbey (and the production values I created in my mind were ever so lovely), and financial ruin during times of duress, the overarching theme is love. The characters desperately seek the meaning of life through encounters with power, money problems, death, and the contemplation of free will. Spoiler ALERT! Love, it turns out, is the ultimate meaning. The Beatles were right! Love is all you need.

[image: “Love is all you need”]

But it really isn’t that simple. Stating that love is the answer is like showing a picture of water and talking about how water is necessary for life. You have to drink it to experience it. You have to go through the journey for yourself. The experience of reading War and Peace is a far cry from the kind of reading we do on the web as we scroll frantically through tidbits of thought. The experience has become a part of me, unlike the countless photos, memes, and comments delivered every day through “disposable” media.

A book is a container of thought. Mental nourishment is as important as physical nourishment.  Your grandmother knew that you are what you eat. Now I am sounding like a librarian!  Everyone should read.

Well, that too isn’t really that simple. I agree with Nicholas Carr’s suggestion (the reason I read War and Peace, as stated in Part 1) that our brains are changing due to our constant connection to the Internet. I recognize it in my own life and in the lives of those around me: always leaving the present moment to check in online with our devices. Maybe I should summarize by saying, all you need is love, a good book, and an effort to unplug occasionally. War and Peace exemplified all three and, in Part 3, I just may share an example or two.

 

Information Infatuation: Big Data is Big Daddy

The concept of algorithms providing us all with instant information access is fairly familiar, since we all rely daily on Google. For the past few years, I have come across predictions about big data changing our lives from experts in the field of information technology and from sources such as The Horizon Report.

I just finished reading Big Data by Viktor Mayer-Schönberger and Kenneth Cukier, which provides an overview of the pros and cons of our “continuing infatuation with data” (Wall Street Journal). My biggest take-away is a phrase repeated throughout the book: we are no longer seeking causation as much as correlation. In other words, it doesn’t really matter why things are a certain way, just what they are! As an information professional, I am struggling with that statement and several other ideas in Big Data.
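To see what correlation-without-causation looks like in practice, here is a tiny sketch of my own (not from the book), using the standard Pearson formula and invented numbers: two series can be almost perfectly correlated while the calculation says nothing at all about why they move together.

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient: measures how two series move together."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented numbers: ice cream sales and sunburn cases rise together because of
# summer heat; neither causes the other, and the math never asks "why."
ice_cream_sales = [20, 35, 50, 70, 90]
sunburn_cases = [2, 4, 5, 8, 10]

print(round(pearson(ice_cream_sales, sunburn_cases), 3))  # ~0.996: strong correlation, zero causation
```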

[image: Big Data]

Benefits
Anyone who has ordered books through Amazon realizes that “Amazon knows what I like!” Recommendations for titles through Amazon can be amazing and enlightening. So too, other sales companies can access our preferences and bring just what we want to us, instead of making us search for things. On a snow day (teachers love those!), I spent some time searching Zappos.com for a pair of boots. Shortly after that, I noticed boots kept coming up in my web searches on many different computers. Google, Amazon, Zappos, and the other web companies are already utilizing big data and collecting our personal preferences. This can be a time-saving convenience.
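As a rough, hypothetical illustration of how those boots might “follow” a shopper around, here is a sketch of preference tracking: browsing events are tallied into a profile, and the most frequent category drives what gets shown next. The events and categories are invented; real ad networks do this with far more elaborate cross-site machinery.

```python
from collections import Counter

# Hypothetical browsing events collected across sites (all data invented).
events = ["boots", "boots", "news", "boots", "recipes"]

profile = Counter(events)                     # tally what this user looked at
top_interest, count = profile.most_common(1)[0]

print(f"Show more ads about: {top_interest} (seen {count} times)")  # the boots keep coming back
```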

Consequences & Disadvantages

The authors of Big Data are concerned that algorithms can predict behaviors. One example is the father who was shocked that a company was sending his teenage daughter coupons for diapers and baby products, only to find out that she was indeed expecting a baby! The company’s big data revealed the pregnancy before the girl’s father was told. Such predictions might also be shared with law enforcement agencies to give them a “heads up” about potential criminals. This raises questions about privacy and due process, because a person cannot fairly be arrested for a crime that has not yet been committed.

Big data is changing mathematical statistics and may bring about the “demise of the expert” (page 139). One human being’s intuition and a lifetime of wisdom cannot compete with algorithms run over millions of data points. In fact, the idea that correlation is more important than causation could bring, in the words of the authors, the end of theory (page 70). We live in an age that values convenience more than quality, and “settling for good enough” trumps our “obsessing over accuracy” (page 191). The authors believe that “big data is transforming many aspects of our lives and ways of thinking” (page 192).

As a librarian, I value personal privacy and respect for intellectual freedom. When I read about eBooks capturing massive amounts of data on readers’ preferences, such as how long they spend on a page and how they highlight or take notes in a book (page 114), I was appalled at how readily we give up our privacy as readers. I don’t think the average person realizes that big data has already taken much of our privacy away.
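For a sense of what that captured reading data might look like, here is a hypothetical sketch of the kind of event records an e-reading app could log. The field names and values are my own invention, not taken from any actual vendor or from the book.

```python
import json
import time

# Hypothetical reading-telemetry records an e-reader might log.
# Field names and values are invented for illustration only.
events = [
    {"reader": "reader_042", "book": "War and Peace", "page": 311,
     "seconds_on_page": 95, "action": "highlight", "timestamp": time.time()},
    {"reader": "reader_042", "book": "War and Peace", "page": 312,
     "seconds_on_page": 4, "action": "page_turn", "timestamp": time.time()},
]

print(json.dumps(events, indent=2))  # this is the level of detail that can leave the device
```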

Mayer-Schönberger and Cukier state, “If big data teaches us anything, it is that just acting better, making improvements–without deeper understanding–is often good enough.” This clashes with my educator’s philosophy and pedagogical paradigm of critical thinking and information literacy. “Good enough” is not the goal in education. The goal is excellence. Perhaps this book is slanted toward commercial business rather than education, but the changes big data will bring are on a global scale and will impact all fields.

Because my research has been in virtual worlds and because education is rapidly integrating technology at all age levels, I am concerned about big data’s role in the participatory global digital culture where my students now live. The authors of Big Data believe, “What we are able to collect and process will always be just a tiny fraction of the information that exists in the world. It can only be a simulacrum of reality, like the shadows on the wall of Plato’s cave.” Virtual worlds can and must provide a high-quality “simulacrum,” and settling for less is not something I am willing to accept. Weeding out the millions of data hits that are “not so desirable” is the challenge of information literacy each of us now faces.

Mayer-Schönberger, V., & Cukier, K. (2013). Big Data. New York: Mariner Books.

Nudging the 5-year-old and the 65-year-old

Today I helped a kindergartener and a 65-year-old do the same thing: find what they wanted to do on their iPads.

[image: iPad mini]

The five-year-old kindergartener wanted “Big Cat’s” microphone to record his voice (on a children’s app), and it wasn’t working.
The 65-year-old wanted to share a Facebook post with specific friends from the Facebook account on his iPad.

Both the 5-year-old and the 65-year-old sat beside me (one during school hours and one after school) with their iPads, on a day which was overwhelmingly spent solving iPad issues. Some of the issues I encountered were needed updates, restriction problems, apps not working, wrong wifi settings, locked-out passcodes, and numerous emails about iPads. Tech issues with iPads have overtaken my job as a school librarian. But that is not the point here.

I sat beside two individuals today who are 60 years apart in age. I had the same feeling as I sat with each of them: a sense of “please help” and a sense of “this is so important to me.”

Who am I to judge their information literacy needs?

Well, actually, I am an information literacy specialist. So, I suppose this is a topic I could write about for hours; however, information literacy is rapidly changing as we move into digital culture, whether at age 5 or 65.

Shall we play a game? Share a YouTube video? Shall we critically evaluate our information intake with meaningful purpose using best research practices?

Learning….the quest for life.

Caring for the individual….where they are in life.

I shall remember to do what my colleague, Kristin Fontichiaro, would call “nudging toward inquiry.”

Post-Modern Me

I keep running into the term post-modernism, and I think it applies to us all. Life as a 21st-century educator (building a PLN, participating in Web 2.0, and constantly striving toward best practices of learning in global participatory digital culture) is a fascinating yet paradoxical adventure. We now live in an era of metaliteracy, metadata, and perhaps “metalife.” We no longer plant, harvest, and cook our food, like The Little Red Hen, because we enjoy our modern conveniences. Yet we are busier than ever “growing” our networks and “creating/curating” our content.

 

What powerful tools we have to connect on a global scale! I have colleagues in Greece, Australia, Great Britain, and all over the globe. Some have actually met me in person and some have not. Does it matter? In a long-ago era (think prior to the Internet), it mattered. To meet someone meant to look into their eyes, to see the lines of age and experience or the wide-eyed innocence of youth. That meeting was the opportunity to get a sense of one’s physical presence. But today, perhaps the digital presence supersedes the physical one. As digital devices have become the top priority for communication, our metaselves have become “us.” I don’t mean to sound like a dark futurist or a stuffy academic philosopher. Maybe I am just a rambling librarian who wants to hang onto something physical like a book (and you can read tons of articles about why you would want to! Ebooks are never really owned, only licensed temporarily).

 

What is interesting is how we pick and choose our personal/professional learning networks (or, for those outside of education, our online communities) as though we are critically evaluating people as data. A century ago, the number of people we encountered, whether brilliant, annoying, or comical, was limited. Today, we have a flood of information and a flood of participants in our incoming stream of networked applications. Today, we can not only curate and critically evaluate information on topics; we can also curate the people who share the content.

 

I do think we need to remember one thing. People are more important than data. Behind these words, your words, your online curation, your tweets, and behind every keyboard is a person. A living person is more than algorithms of interests, more than big data. Maybe, if I take a break and breathe deeply, I will allow serendipity to occur and life to be simply lived. To be alive is miraculous, and the funny thing is… just when I think I am grasping the concept of post-modernism, I learn that post-modernism is over. We are entering post-post-modernism. That doesn’t scare me. I am getting used to not understanding life.
[image: Scoop.it]

The Once and Future Quest for Learning

The model for knowledge in the past was acquisition, mastery, and then mentoring or teaching others. Today, I admit I am acquiring and sharing faster than I can master, or even contemplate, the material I encounter on Twitter, Scoop.it, and other online curation spaces. As I find resources on topics of personal interest (mostly educational content related to information literacy), I gather and share but never have enough time to fully reflect on my learning.

In other words, I am teaching faster than I am learning.

I am turning into Merlin. (He lived his life backwards.)

[image: Merlin (illustration from the Middle Ages)]

My concept of linear time has changed in relation to my learning. Was it an illusion that once upon a time I could fully grasp a concept?

I hope we can all catch up to what we once knew someday.

Photo from Wikimedia

Prep for a Twitter Chat on Digital Literacy

When asked to host a live Twitter chat for the Texas Educators Chat @txeduchat on Dec. 1, 2013, from 8-9pm Central Time, I immediately thought of “Information Literacy in Participatory Digital Culture with a Focus on Youth and 21st Century Learning.” Of course, that topic was way too long for a tweet, so it was shortened to digital literacy (hashtag #digilit). Condensing our thoughts and words may appear easy (140 characters can be read fast), but it actually can be challenging. One of the most powerful online tools I have found for developing a professional learning network is Twitter, once you get the hang of it.

The word most difficult to cut from my initial long topic phrase was “participatory.” Social media, live online interaction, user-generated content, and content curation tools have revolutionized our information intake. Students are now expected to be both consumers and producers of information (prosumers, a term coined by Alvin Toffler). In order to participate actively in the construction of learning in digital culture, students are required to develop digital literacy skills, which are strongly focused on technology tools.

Condensing terms to hold the most meaning in the smallest space (think poetry) is not the only challenge of tweeting. We also have to consider nomenclature. Academics are sometimes criticized for using jargon that is difficult for people to understand. In digital culture, natural language, tagging, and folksonomies have risen in popularity over formal classification subject headings. Still, it is important to agree on terms that best describe both broad categories and specific things.
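A folksonomy is easy to picture as a simple data structure: users attach free-form tags to resources, and shared labels emerge from the bottom up rather than from a formal subject-heading scheme. The toy sketch below uses invented tags and resource names purely for illustration.

```python
from collections import defaultdict

# Toy folksonomy: users attach their own free-form tags to resources,
# instead of filing them under one formal subject heading. Tags are invented.
tag_index = defaultdict(set)  # tag -> resources carrying that tag

def tag(resource, label):
    tag_index[label.lower()].add(resource)

tag("Evaluating Websites lesson", "infolit")
tag("Evaluating Websites lesson", "information literacy")
tag("Evaluating Websites lesson", "digilit")
tag("#txeduchat archive", "digilit")

# The same resource becomes findable under several natural-language labels.
for label, resources in sorted(tag_index.items()):
    print(label, "->", sorted(resources))
```

The trade-off is obvious even in this toy: natural language makes tagging effortless, but the same lesson ends up scattered under “infolit,” “information literacy,” and “digilit” unless the community agrees on its terms.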

Information Literacy
In the days before the close of the Gutenberg Parenthesis (when the book was king of the information hierarchy), literacy meant reading and writing. Information literacy, a term coined by Paul Zurkowski, is recognized by numerous experts in the fields of information science and education, such as Mike Eisenberg, as the umbrella under which other literacies are categorized. As more of life is spent online, digital literacy has risen to the top of the list of multiple literacies, each with numerous related terms.

Useful Resources for Digital Literacy

Critical Evaluation of Websites (Kathy Schrock)
AASL Standards for the 21st Century Learner
Common Sense Media on Digital Literacy

Think Before You Speak or get #digitalvertigo

My mom used to remind all of her kids (and grandkids) that “everything you think does not need to come out of your mouth”.  You can keep some ideas and words to yourself.  If you don’t think first, you may regret it later.

A new way to phrase this idea might be, “Think before you post.” As social media sites urge everyone to share life in digital formats, many rush at the opportunity. The idea, like most ideas, is not new. Napoleon Hill, one of the first “motivational self-help” proponents of the modern personal success genre, was also one of the first to say, “Think twice before you speak.”

Here’s another book on the topic, one that cautions us (think Sherry Turkle and Nicholas Carr) about the pleasures of sharing our lives online.
Title: #digitalvertigo: How Today’s Online Social Revolution Is Dividing, Diminishing, and Disorienting Us
Author: Andrew Keen

[image: my copy of #digitalvertigo]

I like the clever use of a hashtag in the title.

Notice all the post-it notes sticking out of the side of my copy? On a side note, I really like physical books because I can put those tangible notes inside! Sure, I can highlight in an ebook, but going back to find my notes is not as “obvious” to me, and I end up forgetting I even have an electronic copy. (More on that another day.)

Many ideas for blog posts can be seen in the numerous post-its, but I will share only one, because I have learned that the chance of anyone reading a long blog post is nil.

Participation in social media has changed the way we live, think, and interact. Jonah Lehrer, a contributor to Wired magazine, states, “While the Web has enabled new forms of collective action, it has also enabled new kinds of collective stupidity.” He cautions that we are moving from “the smart group” to “the dumb herd” and reminds us that real insight means “thinking for oneself” (Keen, page 51).

Following the crowd has always been dangerous, but #digitalvertigo gives some real-world examples of how the phrase “think before you speak” is taking on new meaning in digital culture. We all have digital voices now. We all can speak and can be heard. The key word that we mustn’t forget is… THINK.