Binary file added docs/_static/in_context_course.png
Binary file added docs/_static/in_context_graded_subsection.png
Binary file added docs/_static/in_context_problem.png
Binary file added docs/_static/in_context_sidebar.png
Binary file added docs/_static/in_context_video.png
64 changes: 64 additions & 0 deletions docs/reference/in_context_dashboards.rst
@@ -0,0 +1,64 @@
In-Context Dashboards
#####################

In-Context Dashboards are available starting in the Teak Open edX release. They are shown in the Analytics sidebar in Studio and present metrics relevant to the content being viewed. The following dashboards are included with Aspects.

.. image:: /_static/in_context_sidebar.png

Course Dashboard
****************

This dashboard contains three charts presenting course-level metrics, and is displayed when viewing in-context analytics for the course outline.

.. image:: /_static/in_context_course.png

Subsection Engagement:
======================
The first chart shows the number of learners who attempted at least one problem in each graded subsection throughout the course and the average subsection score for those learners.

Problem Engagement:
===================
The second chart shows the number of learners who attempted each problem in the course and the percentage of learners who answered each problem correctly on their first attempt. A high first-attempt success rate indicates that learners find the problem manageable (or even easy, if the percentage is very high), while a low rate suggests that the problem is very difficult or unclear to learners.

Video Engagement:
=================
The third chart shows the number of unique and repeat views for each video in the course.

Graded Subsection Performance
*****************************

This dashboard contains two charts presenting subsection-level metrics, and is displayed when viewing in-context analytics for a graded subsection from the course outline.

.. image:: /_static/in_context_graded_subsection.png

Graded Subsection Performance:
==============================
The first chart shows the number of respondents in each score range for the graded subsection as well as the average subsection score for all learners who attempted at least one problem in the subsection.

Final Response Results:
=======================
The second chart shows the total number of correct and incorrect responses for each problem in the subsection.

Problem
*******

This dashboard contains two tables presenting problem-level metrics, and is displayed when viewing in-context analytics for a problem block from the course outline or from a unit page.

.. image:: /_static/in_context_problem.png

Problem Results Table:
======================

The first table shows the percentage of all attempts that were correct and the percentage of first attempts that were correct.

Initial Responses:
==================

The second table shows a breakdown of how learners responded to the problem on their first attempt. The idea is to give course delivery teams a peek into learners' thought processes when they approach the problem for the first time.

Video
*****

The Unique vs. Repeat Views chart shows the number of unique and repeat views for a single video in the course across the duration of the video. Timestamp ranges with a large number of repeat views are worth reviewing, since heavy re-watching can indicate that that section of the video is unclear to learners.

.. image:: /_static/in_context_video.png
1 change: 1 addition & 0 deletions docs/reference/index.rst
@@ -9,3 +9,4 @@ Reference Material
learner_groups_dashboard
operator_reports
course_comparison_dashboard
in_context_dashboards
@@ -199,6 +199,14 @@ By default, Aspects enables plugin functionality in the LMS that embeds a define
- ``ASPECTS_LEARNER_GROUPS_HELP_MARKDOWN`` controls the content of the "Help" tab in the At-Risk Learners dashboard
- ``ASPECTS_OPERATOR_HELP_MARKDOWN`` controls the content of the "Help" tab in the Operator dashboard

In-context Metrics
------------------

Starting in the Teak Open edX release, Aspects provides in-context metrics in Studio. The following settings control this functionality.

- ``ASPECTS_ENABLE_STUDIO_IN_CONTEXT_METRICS`` - Enables or disables in-context metrics.
- ``ASPECTS_IN_CONTEXT_DASHBOARDS`` - A dictionary mapping block types to in-context dashboards. You can use this option to remove or replace the in-context dashboard for a block type. The key ``course`` defines the in-context dashboard for the course overview.
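
For example, to turn in-context metrics off entirely, you can change the boolean flag through the standard Tutor configuration workflow (a sketch; it assumes a typical Tutor deployment with the Aspects plugin enabled):

.. code-block:: shell

   # Disable the in-context Analytics sidebar in Studio
   tutor config save --set ASPECTS_ENABLE_STUDIO_IN_CONTEXT_METRICS=false

Depending on your deployment, you may also need to rebuild images and restart services for the change to take effect.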
@ArturGaspar did we miss adding ASPECTS_ENABLE_STUDIO_IN_CONTEXT_METRICS?

On a side note, have a look at this thread; we should also update the installation instructions to say which images to rebuild, and the same should be mentioned in the How to upgrade section. cc: @bmtcril



Ralph Accessibility
-------------------
1 change: 1 addition & 0 deletions docs/technical_documentation/how-tos/upgrade.rst
@@ -11,6 +11,7 @@ As for any upgrade you should take a backup snapshot of your environment before
- To prevent orphan Superset assets from being left behind, you should remove the existing Superset assets from your Tutor environment before saving the configuration: ``rm -rf env/plugins/aspects/build/aspects-superset/openedx-assets/assets``
- Save your tutor configuration: ``tutor config save``
- Build your Docker images: ``tutor images build openedx aspects aspects-superset --no-cache``
- If you use in-context metrics (available starting in the Teak Open edX release), also build the ``mfe`` image: ``tutor images build mfe --no-cache``
- Initialize Aspects to get the latest schema and reports, for a tutor local install: ``tutor local do init -l aspects``
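
Taken together, for a ``tutor local`` install that uses in-context metrics, the steps above amount to something like the following (a sketch; the paths and image list assume a default Tutor environment on Teak or later):

.. code-block:: shell

   # Remove the existing Superset assets to prevent orphans
   rm -rf "$(tutor config printroot)/env/plugins/aspects/build/aspects-superset/openedx-assets/assets"

   # Save the configuration and rebuild images, including mfe for in-context metrics
   tutor config save
   tutor images build openedx aspects aspects-superset mfe --no-cache

   # Initialize Aspects to get the latest schema and reports
   tutor local do init -l aspects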

In a case where the release has special instructions, such as when new xAPI transforms have been added and you may need to replay tracking logs, they will be included in the release announcement.