Presentations

Using video in online instruction (video)

This video offers an overview of the case for using video in online instruction. It is related to the post entitled “Using an external video resource in your course”.


 

Why use video in online instruction?

The following video provides an overview of the pedagogical case for using video in online instruction.


Need a transcript? Click here

It’s never been easier to record and publish videos. But this convenience brings a new set of challenges when we consider using video in online instruction.

For example, even though face-to-face instructors can hold a class’s attention for an hour or more, the same lecture presented as a video recording will likely hold viewers’ attention for about six minutes.

And even though video provides a strong sensory experience for viewers, videos alone do not cause learning. So what are the best ways to use video?

This guide will introduce you to some of the basics of using video in online instruction, including a section on how it helps to promote your presence in the course.

Along the way, this guide will show you a model for your own course development: mixing a variety of media – video, text, and images – all working together, each used to its fullest strength.

Introduction to Using Rich Media in Online Instruction

This narrated slideshow is intended for an audience of experienced higher education online instructors who are at the stage in their professional development where they can consider instructional strategies beyond the basics.

The presentation makes the case for using rich media as an integral part of designing and developing online courses – not because rich media is “cool”, but because it offers advantages for students in several ways: helping them understand content better when the mode of communication is better suited to the information, introducing professional communities of practice, and building experience with critical digital tools.


 
 

Creative Commons License
Introduction to Using Rich Media in Online Instruction by Steve Covello is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.
Based on a work at http://idmodule.com/?p=1192.

What is Possible in Teaching and Learning Online? Achieving Equivalence!

This narrated slideshow is intended for an audience of experienced F2F higher education instructors who are making the transition to teaching online or in a hybrid/blended format.

It is useful as a prelude to a broader program of professional development – especially in making the case that online learning is viable and rigorous when developed and facilitated properly. Feel free to share or embed this in your own presentations, with attribution.


Creative Commons License
What is Possible in Teaching and Learning Online? Achieving Equivalence! by Steve Covello is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.
Based on a work at http://idmodule.com/?p=1183.

A Proposal for a User-based Learning Analytics Data Collection System

The following is an outline text to accompany a presentation delivered at the Emerging Learning Design conference, June 1, 2012, at Montclair State University. It elaborates on certain areas of the presentation that were constrained or omitted due to time. Here is the PPT file:

Download “Emerging Learning Design: A User-based Analytics Proposal” ELD_04.pptx – Downloaded 786 times – 2 MB

Abstract:

The current use of analytics leaves unanswered questions about how online learner behavior is attributed to outcomes because data collection is systems-based – polling the LMS, not the user. For example, “clickometrics” cannot tell us how students cognitively engage with instructional activities to make sense of subject matter – only that there is a correlation between duration/frequency of engagement and grades. It cannot tell us whether student failure is related to navigational disorientation or whether a student’s conceptualization of subject matter is misaligned with instructional design or content. This presentation calls for a user-based system of learner analytics data collection. Dervin’s Sense-making Methodology proposes that user needs and information use can be reliably predicted using a problem-centered (or, objective-centered) approach. To benefit from this approach, learners must recount their experiences of how instructional content and activities empowered them to make sense of learning objectives and how to achieve them – a focus on “how,” rather than “what” students used to learn. This data may help course evaluation efforts in making direct (rather than inferential) conclusions about the use of information and instruction in correlation with the achievement of learning outcomes. We propose a user-based model for data collection within the LMS learning environment based on Dervin’s and others’ research in the area of User-based Design.

 

Presenter’s note: I have chosen to re-align the proposed system more toward “student success” than purely within instructional design. While student success may be inclusive of academic achievement, using a sense-making/sense-giving strategy as a catalyst for subject-related learning may produce problems in assessment.

——

Before we begin, I would like to call your attention to the Connectivism blog by George Siemens, who has written about the application of sensemaking in educational research.

Siemens also published the following blog post on April 24, 2012, entitled “Change MOOC: Sensemaking and Analytics”, created as part of his participation in the Change MOOC.

George Siemens
http://www.connectivism.ca
http://change.mooc.ca/index.html
“Sensemaking and Analytics”

Change MOOC is an ongoing open access course where guest researchers and practitioners contribute to the study and use of instructional technology. I highly recommend it.

While you may find some similarities between this presentation and what Siemens has published, I hope to advance the conversation further here with some additional perspectives.

I would like each of you to silently consider the answers to the following questions: How did you manage to get to this place, given the insanely complicated highway system here in New Jersey? What were the gaps in your knowledge and experience that muddled your way? How did you bridge those gaps? If you meet someone next week who says they’re going to drive here, what would you tell them? Keep those thoughts in the back of your mind.

—————–

What is this presentation about?

This presentation is about how collecting a different kind of data can better support student success.

Student success can be interpreted in many ways, such as course completion, retention, development of values, achieving academic goals, and so on. In any case, the field of learning analytics seeks causal factors that predict whether learners end up successful, at risk of failure, or somewhere in between.

How are these highly complex, abstract, and sometimes chaotic “spaces” conceptualized by learners? What are their gaps in comprehending them, and how do they bridge those gaps – or fail to bridge them?

One of my interests as a practitioner is in accounting for the learner’s reality of the online learning space: the ambiguities in a mediated communication environment, the meta-cognitive skills needed to succeed in a student-centered learning environment, how learners acculturate to a Social Constructivist approach to learning, and the logical assumptions inherent in the course design.

I propose this system as a method to know more about the learner’s perspective of online learning through eliciting a narrative of their sense-making efforts to comprehend it as a totality.

You are going to hear a few terms used here today: “Sense-Making”, “User-based Design”, and “Learning Analytics”. Some of these you may be familiar with, some perhaps less so.

Sense-Making, as a body of research, is defined variously, as you can see. In a nutshell, it is the constant cognitive process of human beings to construct realities, seek patterns, communicate, organize, comprehend, and negotiate order and chaos to solve specific problems.

“Sensemaking is finding a representation that organizes information to reduce the cost of an operation in an information task” (Russell et al. 1993: 272).
“[S]ensemaking is a motivated, continuous effort to understand connections . . . in order to anticipate their trajectories and act effectively” (Klein et al. 2006: 71).
“Sensemaking is about labeling and categorizing to stabilize the streaming of experience” (Weick et al. 2005: 411) and differs from decision making in its focus on “contextual rationality” (Weick 1993: 636).
Sensemaking involves individuals attempting to “negotiate strangeness” (Weick 1993: 645). Failures in these settings occur when “[f]rameworks and meanings [destroy] rather than [construct] one another” (Weick 1993: 645).

The above quotations were retrieved from George Siemens’s “Sensemaking and Analytics” post, April 24, 2012.

Dr. Brenda Dervin has spent decades developing the Sense-Making Methodology. In it, she describes the human experience as comprised of cognitive movement in time and space toward making sense of our environment, creating realities, actively seeking to achieve goals, being observant and communicative, and, most of all, being evaluative.

The value of information in this process is situation-specific. “Facts” are only as useful as the conditions they apply to; they are unmade by other conditions, stopping points, and “gaps”.

And life is full of “gaps” (Carter, 1980). We reach stopping points because of discontinuities in resources, time, space, people, society, or simply because our sense of a situation “ran out” – a cognitive gap.

Humans seek to bridge these gaps by co-orienting with the sense made by others in order to understand what insights it may provide. They do this through communication.

When people recount their experiences, it takes the form of a narrative. If you had to explain how you got here, for example, you could describe what issues arose at certain turning points, what your concerns or questions were at each turn, and then how your prior experience or someone’s assistance helped you to understand your situation better, or moved you forward to the next point. You could describe how each decision produced an outcome.

This narrative is useful because it can be used as a form of sense-giving to others in similar situations.

This research has influenced my instructional design approach for online learning by pushing me to think beyond “what” gaps learners bridged to be successful, toward “how” learners used what was available to bridge gaps. We look for “… [the] communicatings that make, reinforce, challenge, resist, alter, and reinvent human worlds” (Dervin, 2003, p. 141), in the face of the barriers, constraints, struggles, power, and authorities that stood in their way.

Additionally, this method values both successful and unsuccessful outcomes.

If we could capture more of the verbings of online learners, we could gain greater insight into the cognitive connections learners make to succeed within an online instructional system.

And this is the driving principle behind a user-based analytics concept.

—————————

This leads us to User-based Design. It pertains to the design of information systems, which we will define as a prescribed area of interest for the purpose of solving a problem – or, more simply put, a “problem space”.

An information system can be designed from a variety of epistemic positions. What do we mean by an “epistemic position”? It’s like a filmmaker shooting a movie from a certain character’s perspective – seeing a situation from that person’s reality. Changing the perspective to another character produces an entirely different interpretation of context and meaning. An information system designer can do the same thing.

The traditional method is what’s called systems-based development. This means that the design, organization, and information in the system is a reflection of a reality as perceived by system designers and developers. We see this all the time in the design of applications, information kiosks, e-commerce, and so on – like every time Microsoft updates MS Word and you can’t find the paragraph formatting feature anymore.

The implication of this method is that the developers know what users need under the conditions the developers believe users will encounter; it requires the user to understand the environment according to the developers’ sense of reality (M. S. Nilan and M. A. D’Eredita, 2008).

Note: WIRED Magazine recently published an article about the rationale for A/B Testing, which undermines the systems-based tradition.

An alternative systems design method is from a user-based perspective. An information system built from a user-based perspective focuses on the situations users encounter and how they construct their reality of it. Its design and organization is based on the users’ criteria for what information is useful, when it is useful, and how that information helps them in advancing their cognitive movement.

“User-based” refers to an epistemic position that validates the reality of human beings as they perceive it, as a foundation for the design of systems, e.g., a series of steps intended to solve a human problem/situation/context …”

It mandates that we, as systems developers and instructors, understand how others see the world, and create designs that facilitate the narrative of the users’ realities within the problem space.

A distinguishing difference in the user-based approach is its focus on the steps taken to solve the problem as a way to predict users’ needs – not on what we know about users individually. Again, think about your journey here. You are all unique in your personal characteristics and experiences, but you all sought to solve the same problem. Do you think each of you could contribute to improving how a GPS system could work for others? I bet you could!

What is the key rationale for a user-based system?

In any information system, there is an abundance of useless information, or “noise”. A user-based system seeks to improve the signal-to-noise ratio for the user by organizing information in ways that reflect users’ situation-based realities. Where do we get these patterns – this narrative?

Dervin’s Sense-Making Methodology provides us with the roadmap. We can elicit this kind of information from users in what’s called the micro-moment timeline interview. In short, the interview process seeks to elicit from an interviewee a recounting of the steps taken in a given situation that produced a certain outcome, whether a goal was achieved or not.

These steps are outlined, and then each is examined in closer detail to determine what factors affected the situation (a rough data sketch follows the list):

  • What were the factors that influenced taking that step?
  • What were the questions or concerns at that step?
  • What were your ideas or understandings?
  • What were your reactions?
  • How did you get an answer? How hard was it to get that answer?
  • Did the answer help or hinder you? How?
  • If you could have waved a magic wand, how would you have been helped best at this step?
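To make the shape of this data concrete, here is a minimal sketch of how a timeline-interview record might be modeled in Python. The class and field names are my own illustrative assumptions, not part of Dervin’s methodology.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TimelineStep:
    """One step in a micro-moment timeline interview (illustrative fields)."""
    description: str                                          # what happened at this step
    influences: List[str] = field(default_factory=list)      # factors behind taking the step
    questions: List[str] = field(default_factory=list)       # questions or concerns here
    understandings: List[str] = field(default_factory=list)  # ideas formed at this step
    answer: str = ""                                          # how an answer was obtained, if at all
    answer_difficulty: str = ""                               # e.g., "easy", "hard", "never answered"
    helped: bool = True                                       # did the answer help or hinder?
    magic_wand: str = ""                                      # the help the learner wished for

@dataclass
class TimelineInterview:
    """A learner's step-by-step recounting of one significant situation."""
    learner_id: str
    situation: str                                            # the incident being recounted
    outcome: str                                              # achieved or not; both are valued
    steps: List[TimelineStep] = field(default_factory=list)
```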

Given the myriad ways online learners can be confused, distracted, or hindered, having the ability to collect this kind of data would be useful. Which leads us to learning analytics.

——————-

Integrating Sense-making and User-based Design into a Learning Analytics approach:

The Society for Learning Analytics defines learning analytics as follows:

“Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” – The Society for Learning Analytics (SoLAR,  http://www.solaresearch.org/mission/about/)

Learning analytics can show us patterns in learner behavior as they work within the LMS. The way we currently draw data is mostly through “clickometrics”, or measurements of frequency, duration, and patterns of interactive connections. An at-risk student’s behavior patterns, shown by the data, imply disengagement or non-participation, which is correlated to high probability of dropout or failure.
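To make the contrast concrete, here is a minimal sketch of what a “clickometrics” aggregation boils down to. The event-log format is an assumption for illustration, not any particular LMS’s schema.

```python
from collections import defaultdict

# Hypothetical LMS event-log rows: (student_id, event_type, duration_seconds)
events = [
    ("s01", "view_page", 45),
    ("s01", "post_discussion", 310),
    ("s02", "view_page", 12),
]

# Frequency and total duration per student -- essentially all that a
# systems-based "clickometrics" approach can see.
frequency = defaultdict(int)
duration = defaultdict(int)
for student, _event, seconds in events:
    frequency[student] += 1
    duration[student] += seconds

# A low count or short duration implies disengagement, but it says nothing
# about how the student made (or failed to make) sense of the material.
for student in sorted(frequency):
    print(student, frequency[student], duration[student])
```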

An analyst will recognize this pattern and may suggest an intervention – likely a conversation between the student and the instructor or adviser along the lines of “What is your situation? What are your problems? How can we help you?” The result of this conversation may include remediation: student mentoring, formative improvement of the course, review of LMS best practices, or faculty PD. I think this is a good idea. We ought to know more about learners’ narratives.

A functional equivalent of a sense-making interview should be employed within the LMS to build a database of sense-making behavior in bridging “gaps”, which analysts can use to trace the process of engagement or disengagement.

We should be discovering how learners make sense of the learning environment, social learning constructs, and communication systems in order to succeed – and even when they don’t.

If we embrace these key principles – of sense-making and user-based systems design – then we should rethink our learning analytics strategy.

So what would a user-based learning analytics system look like?

Let’s imagine some possibilities.

The first possibility is that we think about the LMS as more than a repository or just a focal point for communication. As the primary point of intersection with the learner, it is the playing field where sense-making activity occurs – “Where do I start? Where do I go? What do I do? Who do I interact with? How do I respond?” If we could capture every “gap” moment in the learner’s cognitive movement as they comprehend the learning environment, we would have rich data expressing learners’ needs and uses within precise situations and conditions.

Why not slap a big “Gap” button in the LMS that students can click to tell us what they need to bridge their various gaps? Sounds easy! Well, we have a bit of a problem with that. Nicholas J. Belkin, a systems researcher, tells us,

“Individuals cannot easily express to information systems what they do not know.” (Belkin, 1980)

How can you ask someone what they need if they likely don’t know themselves? So now what?

Here’s another possibility. Again, Dervin’s theory states,

Humans seek to bridge gaps by co-orienting with the sense made by others in order to understand what insights it may provide. They do this through communication.

Let’s say we were able to collect a narrative of significant experiences from online learners in such a way that the next group of learners in that course would benefit from the sense made by those who had taken it before. What would you get?

You would begin to see patterns of step-taking at a variety of points in the course/unit/module where sense-making gaps occurred. You would see a collection of questions and concerns learners had at each step, and how the answers helped them to understand their situation better or move forward.

Some of you may be thinking right now, “This is unfair. Why should we give new students the benefit of the wisdom gained by previous students? The first students did all the work.”

That would be a valid objection if we were focusing strictly on learning. However, we are interested in the factors that contribute to “student success” – the meta-cognitive skills that regulate and optimize learning.

What do we ask learners, and how do we ask it?

Let’s start with this statement, based on a foundation interview question in Dervin’s SMM:

“Think back to the most significant incident – good or bad – in your learning experience in this [course, unit, module]. Describe to us the steps that led to that situation as if you were describing it as a movie. What happened first, second, third, etc.”

We will ask learners to think back to the most significant incidents in the course (or whatever unit of analysis) and recall each step that led to the significance of that situation.

  • What were the questions or concerns at that step?
  • What were your ideas or understandings?
  • Did you get an answer? If so, how hard was it to get that answer?
  • How did that answer help you?
  • If you could have waved a magic wand, how would you have been helped best at this step?


Now let’s throw in a twist. Instead of engaging the students with these questions as an effort to help us – the systems developers (and who cares about us?) – let’s engage them in an effort to help each other. Let’s rephrase our questions:

“Think back to the most significant incident – good or bad – in your learning experience in this [course, unit, module]. Describe the steps that led to that situation so that the next students in this course will know what happened. What happened first, second, third, etc., as if you were describing it as a movie.”

  • What were the questions or concerns you had at that step that others should know about?
  • What were your ideas or understandings?
  • Did you get an answer? If so, how hard was it to get that answer?
  • How did that answer help you?
  • If you could have waved a magic wand, how would you have been helped best at this step?

The key challenge in this line of questioning is to focus on the meta-cognitive skills, or the “success” skills, not on the kinds of questions that elicit answers like “I just got the answer from Wikipedia”. We don’t need to know the answers to “what” or “where” questions – we want to know the “how” actions that were taken (or not taken) that produced a “(un)bridged gap” outcome.

Let’s look at some of the outcomes we are referring to. These are the kinds of qualitative outcomes that reflect a helpful sense-giving encounter. These are the ingredients of success.

[Slide: Dr. Brenda Dervin’s portrayal of the kinds of help]

How do we collect this information?

We should consider the type of learner in this strategy. For underclass students, perhaps responses would be collected at specific intervals in every course by an adviser, instructor, or TA. For upperclass or adult students, perhaps an automated system would do. We could also incorporate incentives to respond to prompts within the LMS – a familiar element in social media.
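As a sketch of the automated variant, an event hook might issue the interview prompt when a learner completes a module. The hook and the `lms.send_prompt` call are assumptions for illustration; no real LMS API is specified here.

```python
PROMPT = (
    "Think back to the most significant incident - good or bad - in your "
    "learning experience in this module. Describe the steps that led to that "
    "situation so that the next students in this course will know what happened."
)

def on_module_complete(lms, learner_id: str, module_index: int) -> None:
    """Hypothetical hook fired when a learner finishes a module."""
    # Prompt on alternating modules so reflection stays close to the
    # experience without becoming an undue burden on students.
    if module_index % 2 == 1:
        lms.send_prompt(learner_id, PROMPT)  # assumed LMS API
```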

The most challenging part of this question, however, is cultural. Is this form of para-instructional activity perceived by learners as positive or invasive? Does the time it takes to participate outweigh the perceived benefits? If we model this kind of learning culture in other ways, we may find better cooperation. We see this kind of language in university mission statements all the time, so I believe we are philosophically consistent with it.

When do we collect this information from learners?

Our reflex here is to do this as part of course evaluation at the end of the term, which is a valid possibility. Another possibility is to close the gap between the time when learners form these understandings and the moment when we ask them about it. This suggests we could query students a couple of times a semester, or perhaps even after every module.

Is this an undue burden on students? It’s hard to compare what is appropriate in a learning environment against what is appropriate in person or in online social media. We all know someone who calls us on the phone for help a few times too many for comfort.

But then, we are the engineers of the online learning experience. We can establish whatever conventions are effective in achieving our institutions’ goals and missions, especially since our traditions in online learning are still formative.

How is the information we collect used for sense-giving?

If, as I suggested earlier, we view a course as a repeating cycle whose participants pay forward their sense-making, an analyst could review the learners’ narratives and determine which steps learners took in certain situations. The key information can then be expressed wherever it would be most useful, according to the learner’s narrative of experience.

How this actually appears in the LMS is open to imagination – perhaps something to the effect of the list below (a rough aggregation sketch follows it):

  • 55 students had the following questions and concerns at this point
  • 15 students found X strategy helped them to find better resources
  • 14 students found X strategy helped connect socially with other students
  • 12 students found X resource for improving their computer’s performance
  • 11 students found X connection improved their confidence
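Here is a rough sketch of the aggregation behind such a display, assuming an analyst has already coded each narrative into a course point and a kind of help. The record format is illustrative, not a prescribed schema.

```python
from collections import Counter

# Hypothetical coded narrative records: (course_point, kind_of_help),
# produced by an analyst reviewing learners' timeline interviews.
records = [
    ("unit2-discussion", "found better resources"),
    ("unit2-discussion", "connected socially with other students"),
    ("unit2-discussion", "found better resources"),
    ("unit3-project", "improved confidence"),
]

# Count how many learners reported each kind of help at each point in the
# course, ready to surface to the next cohort at that same point.
counts = Counter(records)
for (point, help_kind), n in counts.most_common():
    print(f"{n} students at {point}: {help_kind}")
```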

Again, we aren’t giving away the learning part of instruction. We’re providing a platform for meta-cognitive sense-giving from the perspective of other learners. And what’s even more valuable is that even narratives of unsuccessful efforts are useful to others.

There is precedent, too, for sense-giving behavior in popular virtual environments. We see it in user-based rating systems for advice, search-result relevance based on interest, and online communities that self-regulate the terms of communication between members (and those that don’t). It could be that “Generation Share” may be more open to contributing this kind of feedback than previous generations.

And because it is user-based, the more participants there are, the higher the resolution, and the higher the signal-to-noise ratio within the overall system. Learners will see themselves in the information: “Yes – that’s my stopping point too; that is my situation; that is my question; this is the kind of help I need. How were these impasses resolved?”

Summary:

Student Success, as a concept, is fraught with McLuhanesque fuzziness.

Online learners need to comprehend several complex information, communication, and social systems, some of which can be alien, ill-formed, ambiguous, chaotic, or lacking in the human punctuation that fosters sense-making.

We are proposing a combination of Sense-making Methodology, User-based Design, and Learning Analytics to produce a system that offers a functional equivalent of the way we experience sense-making and problem solving in everyday communication.

To achieve this, I challenge us to change the epistemic position of online learning from systems-based to user-based, and then to adapt our learning analytics philosophy and analysis methods to reflect this position. For example:

A Systems-based Design:

  • Information describes a developer’s ordered reality
  • Focuses on systems engagement
  • Transcends learner’s situations
  • Seeks patterns of engagement
  • Isolates learner’s experiences

A User-based Design:

  • Information is designed by humans according to their realities
  • Focuses on human co-orientation, sensemaking
  • Deeply rooted in learner’s situations
  • Seeks engagements that form meaning
  • Multiplies all experiences

My goal in this proposal is to advance the conversation about learning analytics so that it better resembles the core of our educational values. The locus of meaning is centered around people – not systems.

Many aspects of this presentation were perhaps oversimplified, but I hope I’ve inspired you to think about what we do in a new way. I look forward to your questions and comments. Thank you.

 

References and Resources:

Belkin, N. J. (1980). Anomalous states of knowledge as a basis for information retrieval. The Canadian Journal of Information Science, 5, 133-143.

Carter, R. F. (1980). Discontinuity and communication. Paper presented at the East-West Institute on communication theory from Eastern and Western perspectives, Honolulu, HI.

Dervin, B. (2003). Sense-Making’s journey from metatheory to methodology to method: An example using information seeking and use as research focus. In B. Dervin & L. Foreman-Wernet (with E. Lauterbach) (Eds.), Sense-Making Methodology reader: Selected writings of Brenda Dervin (pp. 133-164). Cresskill, NJ: Hampton Press.

Dervin, B., & Nilan, M. (1986). Information needs and uses. Annual Review of Information Science and Technology (ARIST), 21, 3-33. White Plains, NY: Knowledge Industry Publications.

Dervin, B. (2003). Chaos, order, and Sense-Making: A proposed theory for information design. In B. Dervin & L. Foreman-Wernet (with E. Lauterbach) (Eds.), Sense-Making Methodology reader: Selected writings of Brenda Dervin (pp. 325-340). Cresskill, NJ: Hampton Press.

Dervin, B. (1992). From the mind’s eye of the user: The Sense-Making qualitative-quantitative methodology. In J. D. Glazier & R. R. Powell (Eds.), Qualitative research in information management (pp. 61-84). Englewood, CO: Libraries Unlimited. Reprinted in B. Dervin & L. Foreman-Wernet (with E. Lauterbach) (Eds.). (2003). Sense-Making Methodology reader: Selected writings of Brenda Dervin (pp. 269-292). Cresskill, NJ: Hampton Press.

Dervin, B. (1991). Comparative theory reconceptualized: From entities and states to processes and dynamics. Communication Theory, 1(1), 59-69. Reprinted in B. Dervin & L. Foreman-Wernet (with E. Lauterbach) (Eds.). (2003). Sense-Making Methodology reader: Selected writings of Brenda Dervin (pp. 61-72). Cresskill, NJ: Hampton Press.

Dervin, B. (1989). Audience as listener and learner, teacher and confidante: The Sense-Making approach. In R. E. Rice & C. K. Atkin (Eds.), Public communication campaigns (2nd ed., pp. 67-86). Newbury Park, CA: Sage. Reprinted in B. Dervin & L. Foreman-Wernet (with E. Lauterbach) (Eds.). (2003). Sense-Making Methodology reader: Selected writings of Brenda Dervin (pp. 215-232). Cresskill, NJ: Hampton Press.

Dervin, B. (1983, May). An overview of Sense-Making research: Concepts, methods, and results to date. Paper presented at the annual meeting of the International Communication Association, Dallas, TX.

Dervin, B., & Frenette, M. (2001). Sense-Making Methodology: Communicating communicatively with campaign audiences. In R. E. Rice & C. K. Atkin (Eds.), Public communication campaigns (3rd ed., pp. 69-87). Thousand Oaks, CA: Sage. Reprinted in B. Dervin & L. Foreman-Wernet (with E. Lauterbach) (Eds.). (2003). Sense-Making Methodology reader: Selected writings of Brenda Dervin (pp. 233-251). Cresskill, NJ: Hampton Press.

Klein, G., Moon, B., & Hoffman, R. R. (2006). Making sense of sensemaking 1: Alternative perspectives. IEEE Intelligent Systems, 21(4), 70-73. doi:10.1109/MIS.2006.75

Nilan, M. (1992). Cognitive space: Using virtual reality for large information resource management problems. Journal of Communication, 42(4), 115-135.

Nilan, M., & D’Eredita, M. (2008). In the spirit of collaborating. Conference paper/presentation, Illinois Digital Environment for Access to Learning and Scholarship. Retrieved December 5, 2010, from http://hdl.handle.net/2142/15091

Nilan, M., & Fletcher, P. T. (1987). Information behaviors in the preparation of research proposals: A user study. Proceedings of the ASIS Annual Meeting, 186-192.

Russell, D. M., Stefik, M. J., Pirolli, P., & Card, S. K. (1993). The cost structure of sensemaking. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 269-276). New York: Association for Computing Machinery. doi:10.1145/169059.169209

Schwandt, D. R. (2005). When managers become philosophers: Integrating learning with sensemaking. Academy of Management Learning & Education, 4(2), 176-192.

The Society for Learning Analytics (SoLAR). (2012). Retrieved May 25, 2012, from http://www.solaresearch.org/mission/about

Weick, K. E. (1988). Enacted sensemaking in crisis situations. Journal of Management Studies, 25(4), 305-317.

Weick, K. E. (1993). The collapse of sensemaking in organizations: The Mann Gulch disaster. Administrative Science Quarterly, 38(4), 628-652.

Weick, K. E., Sutcliffe, K. M., & Obstfeld, D. (2005). Organizing and the process of sensemaking. Organization Science, 16(4), 409-421.
