Learning Impact Blog

Monica Watts, IMS Director of K-12 Engagement

 

Let’s Talk About Student Data Privacy

 

There is no debate that the global pandemic accelerated the adoption of digital resources in K-12. With an increase in digital resources and access, the need for protecting student data privacy and security is even more urgent.

In March 2021, IMS launched the TrustEd Apps Dashboard to guide your teachers and staff toward secure data use and privacy. Over the last three years, our community has played an integral part in designing a rigorous rubric for vetting applications. IMS has already vetted more than 5,000 apps using this community-developed rubric, and as of this writing, over 130 products from 65+ suppliers have earned the TrustEd Apps Seal for Data Privacy Certification. All of these applications are easily found in the IMS Product Directory.

 

IMS TrustEd Apps Seal image

TrustEd Apps by the Numbers

→ 5,797 vetted apps in the IMS Product Directory
→ 68 suppliers have achieved the TrustEd Apps Seal
→ Over 35 in the pipeline to earn the Seal
→ 419 non-member/non-IMS certified

 

The TrustEd Apps Dashboard equips your teachers and staff with knowledge of which applications the district has marked as preferred, approved, or denied, along with detailed information on how each application meets the expectations of the rubric. The Dashboard addresses the challenge of vetting applications for data privacy and security, and its integration will launch through an LTI 1.3 certified supplier.
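As an illustration only (the actual Dashboard data model and API are not described in this post), the preferred/approved/denied status model a district maintains might be sketched like this, with all names invented for the example:

```python
from enum import Enum

class AppStatus(Enum):
    PREFERRED = "preferred"
    APPROVED = "approved"
    DENIED = "denied"

# Hypothetical district catalog mapping app names to their vetting status.
district_catalog = {
    "ExampleReader": AppStatus.PREFERRED,
    "QuizMaker": AppStatus.APPROVED,
    "AdTrackerApp": AppStatus.DENIED,
}

def can_use(app_name: str) -> bool:
    """A teacher may use an app only if the district has vetted it
    (it appears in the catalog) and has not denied it."""
    status = district_catalog.get(app_name)
    return status in (AppStatus.PREFERRED, AppStatus.APPROVED)
```

The key design point such a model captures is that an unvetted app is treated the same as a denied one: nothing reaches a classroom without first passing through the rubric.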

 

"We need to bring on vendors very quickly and make sure they work within our current ecosystem of technology solutions. IMS TrustEd Apps will help us cut through the backlog of having to vet and approve all of the apps coming in. Now, our academics department and our technology department have a partner to support and streamline our processes so we can get tools into the hands of our students and teachers quicker, while also ensuring their safety."
—Jeff McCoy, Associate Superintendent for Academics, Greenville County Schools

As we prepare to close the school year, now is the time to request access to the TrustEd Apps Dashboard for your teachers, available through your membership in IMS Global. We invite you to contact us about gaining access to this valuable tool. We also recently announced a new preferred partnership with CatchOn to bring TrustEd Apps privacy vetting data directly to teachers and administrators.

For more information on the TrustEd Apps process, please visit trustedapps.org today.

 

IMS Global CEO Rob Abel image

Rob Abel, Ed.D. | June 2021

 

"Just what you want to be, you will be in the end" —The Moody Blues

 

Holding Ourselves to a Higher Standard of Learning Impact

 

The 2021 winners of the Learning Impact Awards were recently announced. In this post, I'll give you some history on the awards and point you to a few of my favorites from among this year’s medal winners.

Learning Impact Awards 2021 winners page

Since early 2006, the term “Learning Impact” has been shorthand at IMS for improving the access, affordability, and quality of education. In 2007, IMS held the inaugural Learning Impact conference and Learning Impact Awards (LIAs) competition in Vancouver. The theme, Learning Impact, was a direct result of the 2006 decision by the IMS team and Board of Directors to embrace Learning Impact as the organization's primary measure of success.

In 2010, the IMS Board crafted the mission statement that captures the full scope of IMS activities and reiterates how impact, adoption, and standards work together to grow the edtech innovation ecosystem. We began using the image that goes with the mission statement early in 2006.

Building the EdTech Innovation Ecosystem image

The LIAs are uniquely IMS—nothing about the program was copied from any other source. IMS established a detailed rubric with eight categories of impact (including access, affordability, and quality) that a neutral panel of expert judges uses to select the winners. In the last few years, we have also added a public vote, which carries the same weight as a single judge, so there is no way to “stuff the ballot box.”

It is not easy to win a medal in the LIA competition. The typical “product pitch” does not even come close to what is required. Evidence of impact is collected and considered by the judges in the context of actual institutional use. Note that interoperability is just one of the eight criteria.

Generally, interoperability, and the use of standards in particular, relates to scalability. However, innovation does not require interoperability. Within the context of our full set of activities, the goal is that the LIAs help us see the innovations with impact and then create the standards that help those innovations be adopted across the ecosystem. This was especially true in the early years of the program, when it was rare to find an entry that fully embraced standards. Today, however, most entries leverage IMS standards, many at massive scale. Thus, through the evolution of the LIA winners, we have seen the growing impact of standards over the last 15 years. This is exactly the virtuous cycle of innovation, standards, and large-scale adoption that we had hoped to establish.

IMS also analyzes the finalists, and in most years, publishes a Learning Impact Report. The purpose of the report is to take stock of where things stand with respect to the innovation trend categories that have come to the fore through the awards process. The LIAs look for evidence that an innovation is “crossing the chasm” into mainstream market adoption.

As I've discussed, I think a more specific set of goals beyond access, affordability, and quality (such as equity, agency, and mastery) will help the education sector focus on key challenges that go more directly to the heart of the matter than the much-heralded mantra of “student success.” So while all of the 2021 medal winners are great, no one should be surprised that my personal favorites feature the equity, agency, and mastery themes:

Chicago Public Schools' Curriculum Equity Initiative
This project is a breakthrough in providing a scalable, culturally responsive, digital curriculum that can provide the foundation for customization, and thus equity, for a wide range of needs.

ECoach at the University of Michigan
It seems like every higher ed course should come with an ECoach, which motivates agency through digital support that meets students where they are.

Scaling an Equitable Access Program: VitalSource and University of California, Davis
One important aspect of equity in higher education is the trend toward ensuring that all students have access to all required resources in a digital format, a lesson from the pandemic that needs to carry over.

Class: Redefining the Virtual Classroom
While making the virtual learning experience substantially better may seem less important now that face-to-face is resuming, this winner defines an approach (potentially a new product category) that can bring the power of digital to the classroom (or hybrid or virtual) in ways that could improve the teacher’s ability to help all students.

Digital Graduation Predictor and Virtual Counselor
This project is a great example of how modern data architectures can be leveraged to get a better understanding of the progress of each student to help all succeed.

IMS Annual Report 2020 cover page

Learning Impact has been the “North Star” behind the 15+ years of IMS growth discussed in our recently released annual report. It is the collaboration of the IMS member organizations that makes it all work!

 


 

IMS Chief Architect Dr. Colin Smythe image

IMS TECH TALK

Contributed by Dr. Colin Smythe, IMS Chief Architect

 

The Student Learning Data Model: A New Way of Working with the IMS Specifications

Having published edtech interoperability specifications for over 20 years, we’ve learned a few things. As is standard (pun intended) across specification development organizations, these specs are published as sets of HTML documents. In most cases, the documents are long, and it is difficult for a reader to find the information of specific interest. New tools are needed to simplify reading specification documentation, so IMS has been developing such tools over the past 12 months. In November 2020, we announced the availability of the Student Learning Data Model, or SLDM for short. The SLDM provides a new way to access, explore, and visualize all of the information within the set of IMS specifications relevant to a student’s engagement and progress.

In many cases, an edtech system will make use of more than one IMS specification. An essential part of our development process is the integration between IMS specs: for example, LTI® links are a resource type in Common Cartridge®, meaning one specification references another. A reader therefore needs to easily access, explore, and visualize information across several sets of documentation. As the number of published specifications grows (over 100 in our history), as the complexity of each specification increases (an unavoidable reality as a specification evolves), and as the integrations become more comprehensive, tools such as the SLDM become essential.

The SLDM provides three tiers of information. The first two tiers are available to any registered user of the IMS website. Access to the third tier is only available to users from IMS member organizations.

The first tier collects the information into an eight-cell honeycomb of:

  • User and Organization
  • Enrollment & Attendance
  • Pathways to Competency
  • Instructional Resources
  • Assignment and Assessment
  • Learning Activities

Student Learning Data Model image

The second tier—accessed through the honeycomb—provides the list and details of the relevant data models from the IMS specifications. A user of the SLDM can now get all of the information on the data models in just two clicks, avoiding having to dive into the thousands of pages of specification documentation. The third tier—accessed either through the second-tier links or directly—gives even richer details for the data models and related information.

The SLDM is a curation of the relevant specification information. It is a focus or “lens” on the IMS specifications.

Applying the lens to the complete set of IMS specifications provides a way to bring the relevant information into one focus from the mass of spec material. And we plan to make other lenses. As part of realizing the SLDM, the IMS technical team created a new data dictionary or Common Data Model (CDM). We generated the CDM from the source models made in the IMS model-driven specification development process (I will reveal more about this process in later blogs), which guarantees consistency between the published IMS spec documentation and the Common Data Model. The information presented through the SLDM is drawn directly from the CDM.

The first release of the SLDM and CDM draws on data model definitions only. Many IMS specs, such as OneRoster®, also include a service definition (e.g., a REST API). A service definition describes how the data must be exchanged; the data model describes the syntax, semantics, and format of the data to be exchanged, but not how it is exchanged. Future releases of the CDM and SLDM will include the service definitions and the binding technology artifacts (e.g., OpenAPI files). The long-term aim is to make tools such as the SLDM and CDM the primary way of working with IMS specifications.
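To make the data-model-versus-service-definition distinction concrete, here is a minimal sketch; the `Enrollment` type and field names are invented for illustration and are not taken from any IMS specification:

```python
from dataclasses import dataclass, asdict
import json

# Data model: the syntax, semantics, and format of the data to be
# exchanged -- what an enrollment record *is*.
@dataclass
class Enrollment:
    user_id: str
    class_id: str
    role: str  # e.g., "student" or "teacher"

# Service definition territory: *how* the data is exchanged -- here
# sketched as the JSON payload a REST-style endpoint might return.
def get_enrollments_payload(enrollments):
    return json.dumps({"enrollments": [asdict(e) for e in enrollments]})
```

The first SLDM/CDM release covers only the first half of this picture (the dataclass-like definitions); payload and endpoint descriptions like the second half arrive with the service definitions in future releases.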

Access to the SLDM and CDM is through the IMS website. We chose GraphQL as the delivery technology when creating the CDM: a GraphQL server holds the entire Common Data Model and responds to queries from the IMS website. GraphQL provides a powerful combination of a flexible API and common semantics for the data being exchanged and stored. This is also a great learning opportunity for us: if IMS members request GraphQL-based bindings of an IMS specification in the future, we can produce them with a strong understanding of the technology's strengths and weaknesses.
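For readers unfamiliar with GraphQL, a client query against a server like the one described above might be shaped as follows. This is only a sketch: the field names (`dataModels`, `specification`, etc.) are assumptions for illustration, not the actual CDM schema published on the IMS website.

```python
import json

# A hypothetical GraphQL query asking a CDM-style server for a category
# of data models and the fields each one defines. All field names here
# are invented for illustration.
query = """
query DataModels($category: String!) {
  dataModels(category: $category) {
    name
    specification
    fields { name type }
  }
}
"""

# A GraphQL request is an ordinary JSON POST body: the query document
# plus its variables. This payload would be sent to the server endpoint.
payload = json.dumps({
    "query": query,
    "variables": {"category": "Enrollment"},
})
```

The appeal for a specification body is visible even in this sketch: the client names exactly the fields it wants, so one flexible endpoint can serve many different "lenses" on the same underlying model.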

So, what now?

The first step is to get IMS members and non-members using the Student Learning Data Model and IMS members using the Common Data Model.

Next, we need your feedback on what to improve and what new features we need to add. The early response has been encouraging, but we want more from you. Some of the specific feedback we need includes:

  • How can the user experience for the SLDM and Common Data Model be improved?

  • What type of synthesized information across the specifications is useful?

  • What type of information and visualization would be useful for our service-based specifications?

  • The SLDM is the first lens. What are other lenses of interest?

We look forward to hearing your feedback! Please email us at datamodel@imsglobal.org.

 
