AcAdv, AdvTech, Higher Education

Today’s #AcAdv Chat Topic: Data Analytics in Academic Advising #highered

A couple of weeks ago, I was fortunate to join the Open SUNY COTE Summit 2017. I will be sure to share more about the #COTEsummit learning in the coming weeks; however, the last session helped me frame TODAY’s (3/21) #AcAdv Chat I’ll be moderating from 12-1 pm CT: Data Analytics in #AcAdv

During the #COTEsummit Learning Analytics panel hosted by OLC, we dug into what information we know and how we use it to understand more about our learners. Many academic advising units/divisions often jump straight to the platform or process for how we analyze students to predict learner behavior:


But before advising leaders in higher ed jump on the big data bandwagon or decide to implement a technology platform to collect data, I think our support units need to identify what information and data we need to know to effectively support our learners. Let’s make decisions based on the data that is most helpful, instead of letting predictive analytics make decisions for us at our institutions. What often gets lost in this conversation and planning is the learning itself, which is what learning analytics should be about.

Learning analytics are about learning (Gašević, Dawson, & Siemens, 2015). Sometimes we forget this about learning analytics when the word data is tossed around at the “strategic-planning-task-force-retention-student-success-operation” meetings at our universities and colleges. Sure, learning analytics might be most relevant for instructors and faculty; however, learning data is also critical for those who support instructional design, scaffold student success, and provide academic advising/support in higher education.

Image c/o Giulia Forsythe

In thinking about academic advising and learner support, I have SO many questions about data and data analytics for this #AcAdv Chat topic… here are just a few:

  • How does your institution collect, store, and share data campus-wide?
  • What do you do as a staff or faculty member to interpret the data?
  • Are you able to interpret, read, and translate the information provided about your learners?
  • Are there real-time notifications that let students, staff, and faculty interpret academic progress? What does this look like at your campus?
  • Do your data sets on campus talk to one another? Is there much interaction between your student information system, learning management system, institutional portal, or institutional research data? Why or why not?
  • What challenges and/or issues have you thought about for how data is collected and/or reviewed for learner support?
  • Who or what office can you reach out to on campus for “data analysis” or digging into your learner data to interpret further to support the work you do?

What thoughts or questions do you have about this issue, higher ed? Won’t you join us for today’s #acadv chat conversation? Here’s how:

TWEETS from the #AcAdv Chat conversation on 03.21.17

Reference:

Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64-71.

Book Review, edusocmedia, Learning and Performance, Professional Development, Training & Development

#BookReview: The New Social Learning, 2nd Edition #NewSocialLearning

The first edition of this book, The New Social Learning, was published 5 years ago. I read it and have a copy on my bookshelf; however, we know that emerging and connected technologies have continued to flourish and influence our organizations. The social technology landscape has changed since 2010. There are a number of new platforms, additional functionalities and communication channels, an increase in utilization and adoption by our organizations, and a much greater acceptance of social media being applied for learning and development. Marcia Conner and Tony Bingham have recently published an updated version of this book with The New Social Learning: Connect. Collaborate. Work, 2nd Ed.* The latest edition provides a number of excellent case studies of how social media is being implemented in workplace learning, development, and performance.


Bingham and Conner (2015, p. 8) define social learning as the “joining with others to make sense of and create new ideas…[it] is augmented with social media tools that bridge distance and time, enabling people to easily interact across workplace, passion, curiosity, skill or need. It benefits from a diversity in types of intelligence and in the experiences of those learning.” What is really “new” about this type of social learning with emerging technologies is the impact these platforms and tools have on the experience. “Social tools leave a digital audit trail, documenting our journey – often an unfolding story – and provide a path for others to learn from” (Bingham & Conner, 2015, p. 9). Social media facilitates the empowerment of learning among your networked peers beyond the limitations of geography or time. I appreciate how the authors identify what is NOT the new social learning (e.g. informal learning, e-learning, MOOCs, just for knowledge workers, or a contrast to formal learning/education), and how this type of learning is meant to augment, not replace, training, knowledge management, and communication practices in our organizations. As technology has accelerated change in the workplace, Bingham and Conner (2015, pp. 18-19) see the opportunity to implement a new social learning strategy based on these changes in work:

  • The accelerated pace of change requires agility. Consider agile values for the workplace.
  • Our technologies go where we go without any boundaries. Not all can be controlled, contained, or developed from within an organization.
  • Our shifting workplace demographics change expectations with regard to generation, gender, and culture.
  • People desire personal connection to communicate, collaborate, and share.

Although the authors share a number of success stories about individuals and organizations engaged with social media to enhance learning, they also offer potential critiques and considerations for governance of social tools. By pairing applied examples and practice with social learning theory, this book identifies suggested approaches and considerations for implementing a new social learning program, as outlined by its table of contents (TOC):

  1. Reach Out and Connect – Introduction to the book topic and focus (download the TOC and part of Chapter 1 here: http://www.thenewsociallearning.com/)
  2. Embark on the Journey – Setting goals and planning for the “new social learning”
  3. Transition and Engage – Strategic steps for implementation of social media for learning
  4. Never Give Up – Reminders, challenges, suggestions, and issues to consider
  5. Analyze Insights and Returns – Suggested methods and areas to evaluate and measure
  6. In-Person Learning Reimagined – Opportunity to engage in F2F social learning from the springboard of social tools
  7. Appendix: Social Media Governance – Examples of a few corporate policies and guidelines to consider for your organization

Chapter 5 provides excellent considerations on how to analyze and understand stakeholders when considering a social (media) learning approach. The chapter outlines a lightweight analysis to help quantify social and digital tool adoption. As I tend to work with non-profits, K-12, higher education, and professional/trade associations, I modified the descriptions and questions from this section of Bingham and Conner’s (2015, pp. 206-252) book to focus the analysis on learning and development organizations:

  • Analysis 1 – Perspective: Do you have a sense of how people in your organization feel about the company/institution, each other, their clients, etc.? What if you could better map the perspective of your stakeholders? What is your priority with a new social learning approach? It will be critical to analyze patterns of attitudes, feelings, conversation tone, and individual voices in your organization by reviewing the unstructured data created by social and digital platforms.
  • Analysis 2 – Engagement: How important is it to have a large majority of your organization fully engaged in their work and/or learning? Are your stakeholders aware of the organization’s vision, mission, and purpose? What does it mean to have engaged educators and/or learners in your organization, with regards to online participation, generative production, and choices for collaboration?
  • Analysis 3 – Connectedness:  How do you want individuals in your organization to know each other or, at least, have a method by which they can get to know what skills and knowledge everyone brings to the table? Have you conducted an organizational network analysis yet? Do you have a method for sharing information, managing knowledge, and directing your organizational stakeholders to resources and/or other people?
  • Analysis 4 – Fiscal Fitness: Are you concerned that social (media) will be of little value to your organization? Are you afraid there is no way to measure the value many assure you is there with social media for learning? What is the ROI for social learning? Sometimes there might not be direct counts; however, benchmarking our own performance indicators will help with identifying new opportunities to balance the reward-risk ratio. Outcomes of social learning might be noticed in the side effects, i.e., increased employee morale, a decline in sick days, or a growth in collaborative team projects.
  • Analysis 5 – Impact: How do you know what you are doing is actually making an impact to your organization? How have social (media) tools improved or supported your own learning and development? Is there a change in behavior, opinions, attitudes, and experiences of your stakeholders? Do you notice an increase in productivity or improved learning outcomes?
  • Analysis 6 – Influence: Do you know how collaboration and communication change measures of authority and the effect this has on who is “seen” to provide real value? Influence can come from a position of authority; however, it might also be socially and informally created with our digital, networked tools. Involving all stakeholders and identifying impactful messaging from leadership will be critical for open communication. You might not realize how pluralistic ignorance can impede social change in your organization.
  • Analysis 7 – Attention: Do you know how your own stakeholders can dramatically multiply the value of their own and their colleagues’ knowledge? Are your stakeholders paying attention to key messages and less attention to distracting noise? What are the key trends and movements in your organization on these social channels? Do you have a pulse of the conversation and needs on these platforms? Believe it or not, there is life without email.
  • Analysis 8 – Capacity: How do you want to expand the social learning methods and platforms you use to understand and maintain the critical skills needed for your organization? How can you analyze and foster leadership, interests, knowledge, content, or geographic distribution, for your social learning approach?
  • Analysis 9 – Change: How can you best understand your organization’s culture and the impact social approaches will have on transforming learning and development? How will you conduct a learning culture audit that includes the assessment of social media platforms for learning? How will you communicate the transformation of your learning approach to the organization?
  • Analysis 10 – Fill the Holes:  How can you help others in your organization imagine a future and stimulate exploration of topics and ideas that might not fit into an existing structure? Can you conduct a personal network assessment to identify who in your organization might help to “fill in the missing holes” for your social learning approach? How might you analyze and review the real-time experience on your social media platforms?

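Analysis 3 above mentions conducting an organizational network analysis. As a minimal illustration (not from the book; the names and interactions are invented), the simplest version of such an analysis counts how many distinct colleagues each person interacts with on a social platform, i.e., degree centrality:

```python
from collections import defaultdict

# Hypothetical collaboration data for Analysis 3 (connectedness):
# pairs of people who interacted on an internal social platform.
# Names and edges are invented for illustration only.
edges = [
    ("ana", "ben"), ("ana", "cyd"), ("ben", "cyd"),
    ("dee", "ana"), ("dee", "ben"),
]

def degree_centrality(edges):
    """Count how many distinct colleagues each person interacts with."""
    neighbors = defaultdict(set)
    for a, b in edges:
        # Treat interactions as undirected: both people are connected.
        neighbors[a].add(b)
        neighbors[b].add(a)
    return {person: len(n) for person, n in neighbors.items()}

print(degree_centrality(edges))
# -> {'ana': 3, 'ben': 3, 'cyd': 2, 'dee': 2}
```

Even this toy sketch surfaces the question Analysis 3 asks: who in the organization is well connected, and who might need help finding the skills and knowledge others bring to the table.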
Reference:

Bingham, T., & Conner, M. (2015). The new social learning: Connect. Collaborate. Work. (2nd ed.). Alexandria, VA: ATD Press.

*Full disclosure: The @NewSocialLearn book was sent to me by @ATD Press to read and post a review on my blog. Thank you for the read – it was enjoyed. 

dalmooc

Do You Want to Learn About Learning Analytics? #dalmooc

Last week, I attended the UTA LINK Lab talk presented by Dragan Gasevic (@dgasevic) on learning analytics and research. This talk outlined the digital traces of learning that can be collected and measured in our various learning environments, and questioned how well we are doing some of these analytics within our institutions. Although we have a number of statistics, data, and information on our learners, how can we offer actionable insight, summative feedback, and information about learner progress? Our post-secondary institutions seem to want to deal only with the “R” word = Retention. Often institutions are looking to identify students at risk, provide information about learning success, and understand how to enhance learning – but how can we effectively use data when our metrics often focus only on single outcomes?


Photo c/o the #dalmooc edX Course Site

Instead, it is the process and context that our educational institutions need to identify when looking at learning analytics, that is, the need to understand and optimize learning (Butler & Winne, 1995). Whether we apply the community of inquiry framework’s cognitive presence, which includes triggering events, exploration, integration, and resolution (Garrison, Anderson, & Archer, 2001), or the COPES (Conditions, Operations, Products, Evaluation, & Standards) model (Winne, 1997), it is the meaningful data points for learning analytics that really need to be identified within our educational institutions. As @dgasevic said, “Learning analytics is about LEARNING!” Often we assume the data collected from our courses and our systems will provide us with the answers; however, if the data are not identified in a purposeful way, why bother? What we really need to consider is this: what does it mean to study and support the learning experience and not just the end results?

Here are a few areas of learning analytics and data evaluation that need to be considered:

  • learner agency and self-regulation
  • interaction effect – external and internal conditions
  • formal and informal learning communities
  • instructional intervention methods
  • multimodal learning
  • emerging technology impact, i.e. mobile, wearable tech, etc.

Here are questions our institutions need to consider when they want to examine learning analytics:

  • What data are we collecting? And why?
  • How does the learner information we know contribute to the PROCESS of learning?
  • Who should be part of this learning analytic research for learning?
  • How can we best present and interact with the data? Can this be more immediate?
  • How can we encourage and support multidisciplinary teams to study learning analytics at our institutions?
  • Are we being driven by questions of need, access, and availability for the learning data collection?
  • What ethical and privacy considerations should be considered when collecting data around learning?
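To make the first two questions concrete, here is a minimal, hypothetical sketch (the event names and log format are invented for illustration, not drawn from any particular LMS) of summarizing raw activity logs into per-learner counts. The point is that the aggregation itself is trivial; the analytic decision is choosing which events we believe reflect learning:

```python
from collections import Counter

# Hypothetical LMS event log: (learner_id, event_type) pairs.
events = [
    ("s1", "video_view"), ("s1", "quiz_submit"), ("s1", "forum_post"),
    ("s2", "video_view"), ("s2", "video_view"),
    ("s3", "forum_post"), ("s3", "quiz_submit"),
]

# Which event types count as evidence of learning is a design
# decision, not a given -- that is the "what and why" question.
meaningful = {"quiz_submit", "forum_post"}

def engagement_counts(events, meaningful):
    """Count only the events we have decided reflect learning."""
    counts = Counter()
    for learner, event_type in events:
        if event_type in meaningful:
            counts[learner] += 1
    return dict(counts)

print(engagement_counts(events, meaningful))
# -> {'s1': 2, 's3': 2}
```

Note that learner s2, who only watched videos, disappears from the summary entirely; changing the `meaningful` set changes who looks engaged, which is exactly why the purpose of the data collection has to be identified first.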

Interested in learning more about learning analytics and data in education? Check out the paper in press by Gasevic, Dawson, and Siemens (http://bit.ly/techtrends15), or better yet, join the 9-week Data Analytics & Learning MOOC that UTA and edX are hosting on this very topic starting Monday, October 20th: http://linkresearchlab.org/dalmooc/. You can also follow along with the conversation on Twitter at #dalmooc.

References

Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical synthesis. Review of Educational Research, 65(3), 245-281.

Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15(1), 7-23.

Gašević, D., Dawson, S., & Siemens, G. (in press). Let’s not forget: Learning analytics are about learning. TechTrends. http://bit.ly/techtrends15

Winne, P. H. (1997). Experimenting to bootstrap self-regulated learning. Journal of Educational Psychology, 89(3), 397.