diff --git a/docs/_static/in_context_course.png b/docs/_static/in_context_course.png new file mode 100644 index 0000000..d06e826 Binary files /dev/null and b/docs/_static/in_context_course.png differ diff --git a/docs/_static/in_context_graded_subsection.png b/docs/_static/in_context_graded_subsection.png new file mode 100644 index 0000000..a501727 Binary files /dev/null and b/docs/_static/in_context_graded_subsection.png differ diff --git a/docs/_static/in_context_problem.png b/docs/_static/in_context_problem.png new file mode 100644 index 0000000..9a5e000 Binary files /dev/null and b/docs/_static/in_context_problem.png differ diff --git a/docs/_static/in_context_sidebar.png b/docs/_static/in_context_sidebar.png new file mode 100644 index 0000000..a14882c Binary files /dev/null and b/docs/_static/in_context_sidebar.png differ diff --git a/docs/_static/in_context_video.png b/docs/_static/in_context_video.png new file mode 100644 index 0000000..01e26ec Binary files /dev/null and b/docs/_static/in_context_video.png differ diff --git a/docs/reference/in_context_dashboards.rst b/docs/reference/in_context_dashboards.rst new file mode 100644 index 0000000..a1fcec0 --- /dev/null +++ b/docs/reference/in_context_dashboards.rst @@ -0,0 +1,64 @@ +In-Context Dashboards +##################### + +In-Context Dashboards are available starting in the Teak Open edX release. They are shown in the Analytics sidebar in Studio and present metrics relevant to the content being viewed. The following dashboards are included with Aspects. + +.. image:: /_static/in_context_sidebar.png + +Course Dashboard +**************** + +This dashboard contains three charts presenting course-level metrics, and is displayed when viewing in-context analytics for the course outline. + +.. 
image:: /_static/in_context_course.png + +Subsection Engagement: +====================== +The first chart shows the number of learners who attempted at least one problem in each graded subsection throughout the course and the average subsection score for those learners. + +Problem Engagement: +=================== +The second chart shows the number of learners who attempted each problem in the course and the percentage of learners who answered each problem correctly on their first attempt. A high first-attempt success rate suggests that learners find the problem manageable (even easy if the rate is very high), while a low rate suggests that the problem may be too difficult or unclear to learners. + +Video Engagement: +================= +The third chart shows the number of unique and repeat views for each video in the course. + +Graded Subsection Performance +***************************** + +This dashboard contains two charts presenting subsection-level metrics, and is displayed when viewing in-context analytics for a graded subsection from the course outline. + +.. image:: /_static/in_context_graded_subsection.png + +Graded Subsection Performance: +============================== +The first chart shows the number of respondents in each score range for the graded subsection as well as the average subsection score for all learners who attempted at least one problem in the subsection. + +Final Response Results: +======================= +The second chart shows the total number of correct and incorrect responses for each problem in the subsection. + +Problem +******* + +This dashboard contains two tables presenting problem-level metrics, and is displayed when viewing in-context analytics for a problem block from the course outline or from a unit page. + +.. 
image:: /_static/in_context_problem.png + +Problem Results Table: +====================== + +The first table shows the percentage of correct attempts and the percentage of correct first attempts. + +Initial Responses: +================== + +The second table shows a breakdown of how learners responded to the problem on their first attempt. The idea is to give course delivery teams a peek into learners’ thought processes when they approach the problem for the first time. + +Video +***** + +The Unique vs. Repeat Views chart shows the number of unique and repeat views for a single video in the course across the duration of the video. Timestamp ranges with a large number of repeat views should be reviewed, as they may indicate that this particular section of the video is unclear to learners. + +.. image:: /_static/in_context_video.png diff --git a/docs/reference/index.rst b/docs/reference/index.rst index 9cefe9c..440ef52 100644 --- a/docs/reference/index.rst +++ b/docs/reference/index.rst @@ -9,3 +9,4 @@ Reference Material learner_groups_dashboard operator_reports course_comparison_dashboard + in_context_dashboards diff --git a/docs/technical_documentation/how-tos/production_configuration.rst b/docs/technical_documentation/how-tos/production_configuration.rst index 780b1f4..7b40b72 100644 --- a/docs/technical_documentation/how-tos/production_configuration.rst +++ b/docs/technical_documentation/how-tos/production_configuration.rst @@ -199,6 +199,14 @@ By default, Aspects enables plugin functionality in the LMS that embeds a define - ``ASPECTS_LEARNER_GROUPS_HELP_MARKDOWN`` controls the content of the "Help" tab in the At-Risk Learners dashboard - ``ASPECTS_OPERATOR_HELP_MARKDOWN`` controls the content of the "Help" tab in the Operator dashboard +In-context Metrics +------------------ + +Starting in the Teak Open edX release, Aspects provides in-context metrics in Studio. The following settings control this functionality. 
+ 
+- ``ASPECTS_ENABLE_STUDIO_IN_CONTEXT_METRICS`` - Enables or disables in-context metrics. +- ``ASPECTS_IN_CONTEXT_DASHBOARDS`` - A dictionary mapping block types to in-context dashboards. You can use this option to remove or replace the in-context dashboard for a block type. The key ``course`` defines the in-context dashboard for the course overview. + Ralph Accessibility ------------------- diff --git a/docs/technical_documentation/how-tos/upgrade.rst b/docs/technical_documentation/how-tos/upgrade.rst index 524e682..0b8f37a 100644 --- a/docs/technical_documentation/how-tos/upgrade.rst +++ b/docs/technical_documentation/how-tos/upgrade.rst @@ -11,6 +11,7 @@ As for any upgrade you should take a backup snapshot of your environment before - To prevent orphan Superset assets from being left behind, you should remove the existing Superset assets from your Tutor environment before saving the configuration: ``rm -rf env/plugins/aspects/build/aspects-superset/openedx-assets/assets`` - Save your tutor configuration: ``tutor config save`` - Build your Docker images: ``tutor images build openedx aspects aspects-superset --no-cache`` +- If using in-context metrics (only available starting in the Teak Open edX release), also build the ``mfe`` image: ``tutor images build mfe --no-cache`` - Initialize Aspects to get the latest schema and reports, for a tutor local install: ``tutor local do init -l aspects`` In a case where the release has special instructions, such as when new xAPI transforms have been added and you may need to replay tracking logs, they will be included in the release announcement.
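As a minimal sketch of how an operator might apply the ``ASPECTS_ENABLE_STUDIO_IN_CONTEXT_METRICS`` setting documented above: Tutor settings are normally changed with ``tutor config save --set``, followed by an image rebuild. This assumes a standard Tutor-based Aspects install on Teak or later; the ``false`` value is illustrative, not a recommendation.

```shell
# Sketch only: toggle in-context metrics off, assuming a standard
# Tutor-based Aspects install (setting name from this change;
# the value shown is illustrative).
tutor config save --set ASPECTS_ENABLE_STUDIO_IN_CONTEXT_METRICS=false

# Rebuild the MFE image so Studio picks up the change,
# mirroring the upgrade step documented above.
tutor images build mfe --no-cache
```

After rebuilding, restart the platform (e.g. ``tutor local start -d``) so the updated MFE image is used.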