VYPR
Low severity · NVD Advisory · Published Apr 27, 2023 · Updated Jan 30, 2025

Wasmtime has Undefined Behavior in Rust runtime functions

CVE-2023-30624

Description

Wasmtime is a standalone runtime for WebAssembly. Prior to versions 6.0.2, 7.0.1, and 8.0.1, Wasmtime's implementation of managing per-instance state, such as tables and memories, contains LLVM-level undefined behavior. This undefined behavior was found to cause runtime-level issues when compiled with LLVM 16, which optimizes away some writes that are critical for correctness. Vulnerable versions of Wasmtime compiled with Rust 1.70, which is currently in beta, or later are known to have incorrectly compiled functions. Versions of Wasmtime compiled with the current stable Rust release, 1.69, or earlier are not known at this time to have any issues, but could theoretically exhibit them.

The underlying problem is that Wasmtime's runtime state for an instance involves a Rust-defined structure called Instance, which has a trailing VMContext structure after it. This VMContext structure has a runtime-defined layout that is unique per module. This representation cannot be expressed with safe Rust, so unsafe code is required to maintain this state. The code doing this, however, has methods which take &self as an argument but modify data in the VMContext part of the allocation, meaning that pointers derived from &self are mutated. In Rust this is not allowed except in the presence of UnsafeCell. When lowered to LLVM, these functions have noalias readonly parameters, which makes it undefined behavior to write through pointers derived from them.
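The shape of the fix can be illustrated in miniature. The sketch below uses hypothetical names, not Wasmtime's actual types: a header struct with trailing context bytes, where the raw pointer into the trailing data is derived from &mut self rather than &self, so the compiler can no longer assume the memory behind the parameter is read-only.

```rust
use std::ptr;

// Miniature of the pattern at issue: an `Instance` header followed by
// trailing context data that must be mutated through raw pointers.
// Illustrative names only, not Wasmtime's real API.
struct Instance {
    magic: u32,
    vmctx: [u8; 16], // stand-in for the module-specific `VMContext` tail
}

impl Instance {
    // The corrected shape: the pointer is derived from `&mut self`, so the
    // parameter cannot be marked `noalias readonly` and writes survive.
    unsafe fn vmctx_plus_offset_mut<T>(&mut self, offset: usize) -> *mut T {
        ptr::addr_of_mut!(self.vmctx).cast::<u8>().add(offset).cast()
    }
}

fn main() {
    let mut inst = Instance { magic: 0x1234_5678, vmctx: [0; 16] };
    unsafe {
        // `write_unaligned` because the trailing bytes carry no alignment
        // guarantee for an arbitrary `T`.
        inst.vmctx_plus_offset_mut::<u32>(4).write_unaligned(42);
    }
    assert_eq!(u32::from_ne_bytes(inst.vmctx[4..8].try_into().unwrap()), 42);
    assert_eq!(inst.magic, 0x1234_5678);
    println!("ok");
}
```

A &self version of the same method would compile, but any write through the returned pointer would be exactly the undefined behavior described above.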

Wasmtime's internal representation and management of VMContext has been updated to use &mut self methods where appropriate. Additionally, verification tools for unsafe Rust code, such as cargo miri, are planned to be run on the main branch soon to catch any remaining Rust-level issues before future compiler versions can exploit them.

Precompiled binaries of Wasmtime available from GitHub releases were built with at most LLVM 15, so they are not known to be vulnerable. As mentioned above, however, updating is still recommended.

Wasmtime versions 6.0.2, 7.0.1, and 8.0.1 have been issued; they contain the patch necessary to work correctly on LLVM 16 and have no known UB on LLVM 15 and earlier. If Wasmtime is compiled with Rust 1.69 or prior, which use LLVM 15, there are no known issues. There is a theoretical possibility for the undefined behavior to be exploited, however, so it's recommended that users upgrade to a patched version of Wasmtime. Users on beta Rust (1.70 at this time) or nightly Rust (1.71 at this time) must update to a patched version to work correctly.
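As a concrete illustration (an assumed project setup, not part of the advisory), a crate tracking the 8.x line would bump its manifest to the patched release; crates on the 6.x or 7.x lines would pin 6.0.2 or 7.0.1 respectively:

```toml
# Cargo.toml — request at least the patched release on the 8.x line.
# Projects tracking 6.x or 7.x would use "6.0.2" or "7.0.1" instead.
[dependencies]
wasmtime = "8.0.1"
```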

Affected packages

Versions sourced from the GitHub Security Advisory.

Package                Affected versions    Patched versions
wasmtime (crates.io)   < 6.0.2              6.0.2
wasmtime (crates.io)   >= 7.0.0, < 7.0.1    7.0.1
wasmtime (crates.io)   >= 8.0.0, < 8.0.1    8.0.1

Affected products: 1

Patches: 1

0977952dcd9d

Merge pull request from GHSA-ch89-5g45-qwc7

https://github.com/bytecodealliance/wasmtime · Alex Crichton · Apr 27, 2023 · via GHSA
6 files changed · +127 −82
  • crates/environ/src/module.rs +15 −13 modified
    @@ -241,8 +241,9 @@ impl ModuleTranslation<'_> {
             }
             let mut idx = 0;
             let ok = self.module.memory_initialization.init_memory(
    +            &mut (),
                 InitMemory::CompileTime(&self.module),
    -            &mut |memory, init| {
    +            |(), memory, init| {
                     // Currently `Static` only applies to locally-defined memories,
                     // so if a data segment references an imported memory then
                     // transitioning to a `Static` memory initializer is not
    @@ -525,10 +526,11 @@ impl MemoryInitialization {
         /// question needs to be deferred to runtime, and at runtime this means
         /// that an invalid initializer has been found and a trap should be
         /// generated.
    -    pub fn init_memory(
    +    pub fn init_memory<T>(
             &self,
    -        state: InitMemory<'_>,
    -        write: &mut dyn FnMut(MemoryIndex, &StaticMemoryInitializer) -> bool,
    +        state: &mut T,
    +        init: InitMemory<'_, T>,
    +        mut write: impl FnMut(&mut T, MemoryIndex, &StaticMemoryInitializer) -> bool,
         ) -> bool {
             let initializers = match self {
                 // Fall through below to the segmented memory one-by-one
    @@ -543,7 +545,7 @@ impl MemoryInitialization {
                 MemoryInitialization::Static { map } => {
                     for (index, init) in map {
                         if let Some(init) = init {
    -                        let result = write(index, init);
    +                        let result = write(state, index, init);
                             if !result {
                                 return result;
                             }
    @@ -567,10 +569,10 @@ impl MemoryInitialization {
                 // (e.g. this is a task happening before instantiation at
                 // compile-time).
                 let base = match base {
    -                Some(index) => match &state {
    +                Some(index) => match &init {
                         InitMemory::Runtime {
                             get_global_as_u64, ..
    -                    } => get_global_as_u64(index),
    +                    } => get_global_as_u64(state, index),
                         InitMemory::CompileTime(_) => return false,
                     },
                     None => 0,
    @@ -585,12 +587,12 @@ impl MemoryInitialization {
                     None => return false,
                 };
     
    -            let cur_size_in_pages = match &state {
    +            let cur_size_in_pages = match &init {
                     InitMemory::CompileTime(module) => module.memory_plans[memory_index].memory.minimum,
                     InitMemory::Runtime {
                         memory_size_in_pages,
                         ..
    -                } => memory_size_in_pages(memory_index),
    +                } => memory_size_in_pages(state, memory_index),
                 };
     
                 // Note that this `minimum` can overflow if `minimum` is
    @@ -616,7 +618,7 @@ impl MemoryInitialization {
                     offset: start,
                     data: data.clone(),
                 };
    -            let result = write(memory_index, &init);
    +            let result = write(state, memory_index, &init);
                 if !result {
                     return result;
                 }
    @@ -628,7 +630,7 @@ impl MemoryInitialization {
     
     /// Argument to [`MemoryInitialization::init_memory`] indicating the current
     /// status of the instance.
    -pub enum InitMemory<'a> {
    +pub enum InitMemory<'a, T> {
         /// This evaluation of memory initializers is happening at compile time.
         /// This means that the current state of memories is whatever their initial
         /// state is, and additionally globals are not available if data segments
    @@ -640,10 +642,10 @@ pub enum InitMemory<'a> {
         /// instance's state.
         Runtime {
             /// Returns the size, in wasm pages, of the the memory specified.
    -        memory_size_in_pages: &'a dyn Fn(MemoryIndex) -> u64,
    +        memory_size_in_pages: &'a dyn Fn(&mut T, MemoryIndex) -> u64,
             /// Returns the value of the global, as a `u64`. Note that this may
             /// involve zero-extending a 32-bit global to a 64-bit number.
    -        get_global_as_u64: &'a dyn Fn(GlobalIndex) -> u64,
    +        get_global_as_u64: &'a dyn Fn(&mut T, GlobalIndex) -> u64,
         },
     }
     
    
  • crates/runtime/src/instance/allocator.rs +22 −30 modified
    @@ -200,16 +200,10 @@ pub unsafe trait InstanceAllocator {
         fn purge_module(&self, module: CompiledModuleId);
     }
     
    -fn get_table_init_start(init: &TableInitializer, instance: &Instance) -> Result<u32> {
    +fn get_table_init_start(init: &TableInitializer, instance: &mut Instance) -> Result<u32> {
         match init.base {
             Some(base) => {
    -            let val = unsafe {
    -                if let Some(def_index) = instance.module().defined_global_index(base) {
    -                    *instance.global(def_index).as_u32()
    -                } else {
    -                    *(*instance.imported_global(base).from).as_u32()
    -                }
    -            };
    +            let val = unsafe { *(*instance.defined_or_imported_global_ptr(base)).as_u32() };
     
                 init.offset
                     .checked_add(val)
    @@ -256,10 +250,11 @@ fn initialize_tables(instance: &mut Instance, module: &Module) -> Result<()> {
             TableInitialization::FuncTable { segments, .. }
             | TableInitialization::Segments { segments } => {
                 for segment in segments {
    +                let start = get_table_init_start(segment, instance)?;
                     instance.table_init_segment(
                         segment.table_index,
                         &segment.elements,
    -                    get_table_init_start(segment, instance)?,
    +                    start,
                         0,
                         segment.elements.len() as u32,
                     )?;
    @@ -270,22 +265,18 @@ fn initialize_tables(instance: &mut Instance, module: &Module) -> Result<()> {
         Ok(())
     }
     
    -fn get_memory_init_start(init: &MemoryInitializer, instance: &Instance) -> Result<u64> {
    +fn get_memory_init_start(init: &MemoryInitializer, instance: &mut Instance) -> Result<u64> {
         match init.base {
             Some(base) => {
                 let mem64 = instance.module().memory_plans[init.memory_index]
                     .memory
                     .memory64;
                 let val = unsafe {
    -                let global = if let Some(def_index) = instance.module().defined_global_index(base) {
    -                    instance.global(def_index)
    -                } else {
    -                    &*instance.imported_global(base).from
    -                };
    +                let global = instance.defined_or_imported_global_ptr(base);
                     if mem64 {
    -                    *global.as_u64()
    +                    *(*global).as_u64()
                     } else {
    -                    u64::from(*global.as_u32())
    +                    u64::from(*(*global).as_u32())
                     }
                 };
     
    @@ -297,7 +288,10 @@ fn get_memory_init_start(init: &MemoryInitializer, instance: &Instance) -> Resul
         }
     }
     
    -fn check_memory_init_bounds(instance: &Instance, initializers: &[MemoryInitializer]) -> Result<()> {
    +fn check_memory_init_bounds(
    +    instance: &mut Instance,
    +    initializers: &[MemoryInitializer],
    +) -> Result<()> {
         for init in initializers {
             let memory = instance.get_memory(init.memory_index);
             let start = get_memory_init_start(init, instance)?;
    @@ -319,21 +313,18 @@ fn check_memory_init_bounds(instance: &Instance, initializers: &[MemoryInitializ
     }
     
     fn initialize_memories(instance: &mut Instance, module: &Module) -> Result<()> {
    -    let memory_size_in_pages =
    -        &|memory| (instance.get_memory(memory).current_length() as u64) / u64::from(WASM_PAGE_SIZE);
    +    let memory_size_in_pages = &|instance: &mut Instance, memory| {
    +        (instance.get_memory(memory).current_length() as u64) / u64::from(WASM_PAGE_SIZE)
    +    };
     
         // Loads the `global` value and returns it as a `u64`, but sign-extends
         // 32-bit globals which can be used as the base for 32-bit memories.
    -    let get_global_as_u64 = &|global| unsafe {
    -        let def = if let Some(def_index) = instance.module().defined_global_index(global) {
    -            instance.global(def_index)
    -        } else {
    -            &*instance.imported_global(global).from
    -        };
    +    let get_global_as_u64 = &mut |instance: &mut Instance, global| unsafe {
    +        let def = instance.defined_or_imported_global_ptr(global);
             if module.globals[global].wasm_ty == WasmType::I64 {
    -            *def.as_u64()
    +            *(*def).as_u64()
             } else {
    -            u64::from(*def.as_u32())
    +            u64::from(*(*def).as_u32())
             }
         };
     
    @@ -346,11 +337,12 @@ fn initialize_memories(instance: &mut Instance, module: &Module) -> Result<()> {
         // so errors only happen if an out-of-bounds segment is found, in which case
         // a trap is returned.
         let ok = module.memory_initialization.init_memory(
    +        instance,
             InitMemory::Runtime {
                 memory_size_in_pages,
                 get_global_as_u64,
             },
    -        &mut |memory_index, init| {
    +        |instance, memory_index, init| {
                 // If this initializer applies to a defined memory but that memory
                 // doesn't need initialization, due to something like copy-on-write
                 // pre-initializing it via mmap magic, then this initializer can be
    @@ -383,7 +375,7 @@ fn initialize_memories(instance: &mut Instance, module: &Module) -> Result<()> {
     fn check_init_bounds(instance: &mut Instance, module: &Module) -> Result<()> {
         check_table_init_bounds(instance, module)?;
     
    -    match &instance.module().memory_initialization {
    +    match &module.memory_initialization {
             MemoryInitialization::Segmented(initializers) => {
                 check_memory_init_bounds(instance, initializers)?;
             }
    
  • crates/runtime/src/instance.rs +44 −36 modified
    @@ -147,8 +147,14 @@ impl Instance {
     
         /// Helper function to access various locations offset from our `*mut
         /// VMContext` object.
    -    unsafe fn vmctx_plus_offset<T>(&self, offset: u32) -> *mut T {
    -        (self.vmctx_ptr().cast::<u8>())
    +    unsafe fn vmctx_plus_offset<T>(&self, offset: u32) -> *const T {
    +        (std::ptr::addr_of!(self.vmctx).cast::<u8>())
    +            .add(usize::try_from(offset).unwrap())
    +            .cast()
    +    }
    +
    +    unsafe fn vmctx_plus_offset_mut<T>(&mut self, offset: u32) -> *mut T {
    +        (std::ptr::addr_of_mut!(self.vmctx).cast::<u8>())
                 .add(usize::try_from(offset).unwrap())
                 .cast()
         }
    @@ -183,20 +189,20 @@ impl Instance {
     
         /// Return the indexed `VMTableDefinition`.
         #[allow(dead_code)]
    -    fn table(&self, index: DefinedTableIndex) -> VMTableDefinition {
    +    fn table(&mut self, index: DefinedTableIndex) -> VMTableDefinition {
             unsafe { *self.table_ptr(index) }
         }
     
         /// Updates the value for a defined table to `VMTableDefinition`.
    -    fn set_table(&self, index: DefinedTableIndex, table: VMTableDefinition) {
    +    fn set_table(&mut self, index: DefinedTableIndex, table: VMTableDefinition) {
             unsafe {
                 *self.table_ptr(index) = table;
             }
         }
     
         /// Return the indexed `VMTableDefinition`.
    -    fn table_ptr(&self, index: DefinedTableIndex) -> *mut VMTableDefinition {
    -        unsafe { self.vmctx_plus_offset(self.offsets().vmctx_vmtable_definition(index)) }
    +    fn table_ptr(&mut self, index: DefinedTableIndex) -> *mut VMTableDefinition {
    +        unsafe { self.vmctx_plus_offset_mut(self.offsets().vmctx_vmtable_definition(index)) }
         }
     
         /// Get a locally defined or imported memory.
    @@ -238,21 +244,21 @@ impl Instance {
         }
     
         /// Return the indexed `VMGlobalDefinition`.
    -    fn global(&self, index: DefinedGlobalIndex) -> &VMGlobalDefinition {
    +    fn global(&mut self, index: DefinedGlobalIndex) -> &VMGlobalDefinition {
             unsafe { &*self.global_ptr(index) }
         }
     
         /// Return the indexed `VMGlobalDefinition`.
    -    fn global_ptr(&self, index: DefinedGlobalIndex) -> *mut VMGlobalDefinition {
    -        unsafe { self.vmctx_plus_offset(self.offsets().vmctx_vmglobal_definition(index)) }
    +    fn global_ptr(&mut self, index: DefinedGlobalIndex) -> *mut VMGlobalDefinition {
    +        unsafe { self.vmctx_plus_offset_mut(self.offsets().vmctx_vmglobal_definition(index)) }
         }
     
         /// Get a raw pointer to the global at the given index regardless whether it
         /// is defined locally or imported from another module.
         ///
         /// Panics if the index is out of bound or is the reserved value.
         pub(crate) fn defined_or_imported_global_ptr(
    -        &self,
    +        &mut self,
             index: GlobalIndex,
         ) -> *mut VMGlobalDefinition {
             if let Some(index) = self.module().defined_global_index(index) {
    @@ -263,18 +269,18 @@ impl Instance {
         }
     
         /// Return a pointer to the interrupts structure
    -    pub fn runtime_limits(&self) -> *mut *const VMRuntimeLimits {
    -        unsafe { self.vmctx_plus_offset(self.offsets().vmctx_runtime_limits()) }
    +    pub fn runtime_limits(&mut self) -> *mut *const VMRuntimeLimits {
    +        unsafe { self.vmctx_plus_offset_mut(self.offsets().vmctx_runtime_limits()) }
         }
     
         /// Return a pointer to the global epoch counter used by this instance.
    -    pub fn epoch_ptr(&self) -> *mut *const AtomicU64 {
    -        unsafe { self.vmctx_plus_offset(self.offsets().vmctx_epoch_ptr()) }
    +    pub fn epoch_ptr(&mut self) -> *mut *const AtomicU64 {
    +        unsafe { self.vmctx_plus_offset_mut(self.offsets().vmctx_epoch_ptr()) }
         }
     
         /// Return a pointer to the `VMExternRefActivationsTable`.
    -    pub fn externref_activations_table(&self) -> *mut *mut VMExternRefActivationsTable {
    -        unsafe { self.vmctx_plus_offset(self.offsets().vmctx_externref_activations_table()) }
    +    pub fn externref_activations_table(&mut self) -> *mut *mut VMExternRefActivationsTable {
    +        unsafe { self.vmctx_plus_offset_mut(self.offsets().vmctx_externref_activations_table()) }
         }
     
         /// Gets a pointer to this instance's `Store` which was originally
    @@ -297,7 +303,7 @@ impl Instance {
     
         pub unsafe fn set_store(&mut self, store: Option<*mut dyn Store>) {
             if let Some(store) = store {
    -            *self.vmctx_plus_offset(self.offsets().vmctx_store()) = store;
    +            *self.vmctx_plus_offset_mut(self.offsets().vmctx_store()) = store;
                 *self.runtime_limits() = (*store).vmruntime_limits();
                 *self.epoch_ptr() = (*store).epoch_ptr();
                 *self.externref_activations_table() = (*store).externref_activations_table().0;
    @@ -306,7 +312,7 @@ impl Instance {
                     mem::size_of::<*mut dyn Store>(),
                     mem::size_of::<[*mut (); 2]>()
                 );
    -            *self.vmctx_plus_offset::<[*mut (); 2]>(self.offsets().vmctx_store()) =
    +            *self.vmctx_plus_offset_mut::<[*mut (); 2]>(self.offsets().vmctx_store()) =
                     [ptr::null_mut(), ptr::null_mut()];
     
                 *self.runtime_limits() = ptr::null_mut();
    @@ -316,7 +322,7 @@ impl Instance {
         }
     
         pub(crate) unsafe fn set_callee(&mut self, callee: Option<NonNull<VMFunctionBody>>) {
    -        *self.vmctx_plus_offset(self.offsets().vmctx_callee()) =
    +        *self.vmctx_plus_offset_mut(self.offsets().vmctx_callee()) =
                 callee.map_or(ptr::null_mut(), |c| c.as_ptr());
         }
     
    @@ -402,7 +408,7 @@ impl Instance {
         }
     
         /// Return the table index for the given `VMTableDefinition`.
    -    unsafe fn table_index(&self, table: &VMTableDefinition) -> DefinedTableIndex {
    +    unsafe fn table_index(&mut self, table: &VMTableDefinition) -> DefinedTableIndex {
             let index = DefinedTableIndex::new(
                 usize::try_from(
                     (table as *const VMTableDefinition)
    @@ -515,7 +521,7 @@ impl Instance {
         ) {
             let type_index = unsafe {
                 let base: *const VMSharedSignatureIndex =
    -                *self.vmctx_plus_offset(self.offsets().vmctx_signature_ids_array());
    +                *self.vmctx_plus_offset_mut(self.offsets().vmctx_signature_ids_array());
                 *base.add(sig.index())
             };
     
    @@ -584,7 +590,7 @@ impl Instance {
                 let func = &self.module().functions[index];
                 let sig = func.signature;
                 let anyfunc: *mut VMCallerCheckedFuncRef = self
    -                .vmctx_plus_offset::<VMCallerCheckedFuncRef>(
    +                .vmctx_plus_offset_mut::<VMCallerCheckedFuncRef>(
                         self.offsets().vmctx_anyfunc(func.anyfunc),
                     );
                 self.construct_anyfunc(index, sig, anyfunc);
    @@ -923,40 +929,41 @@ impl Instance {
         ) {
             assert!(std::ptr::eq(module, self.module().as_ref()));
     
    -        *self.vmctx_plus_offset(offsets.vmctx_magic()) = VMCONTEXT_MAGIC;
    +        *self.vmctx_plus_offset_mut(offsets.vmctx_magic()) = VMCONTEXT_MAGIC;
             self.set_callee(None);
             self.set_store(store.as_raw());
     
             // Initialize shared signatures
             let signatures = self.runtime_info.signature_ids();
    -        *self.vmctx_plus_offset(offsets.vmctx_signature_ids_array()) = signatures.as_ptr();
    +        *self.vmctx_plus_offset_mut(offsets.vmctx_signature_ids_array()) = signatures.as_ptr();
     
             // Initialize the built-in functions
    -        *self.vmctx_plus_offset(offsets.vmctx_builtin_functions()) = &VMBuiltinFunctionsArray::INIT;
    +        *self.vmctx_plus_offset_mut(offsets.vmctx_builtin_functions()) =
    +            &VMBuiltinFunctionsArray::INIT;
     
             // Initialize the imports
             debug_assert_eq!(imports.functions.len(), module.num_imported_funcs);
             ptr::copy_nonoverlapping(
                 imports.functions.as_ptr(),
    -            self.vmctx_plus_offset(offsets.vmctx_imported_functions_begin()),
    +            self.vmctx_plus_offset_mut(offsets.vmctx_imported_functions_begin()),
                 imports.functions.len(),
             );
             debug_assert_eq!(imports.tables.len(), module.num_imported_tables);
             ptr::copy_nonoverlapping(
                 imports.tables.as_ptr(),
    -            self.vmctx_plus_offset(offsets.vmctx_imported_tables_begin()),
    +            self.vmctx_plus_offset_mut(offsets.vmctx_imported_tables_begin()),
                 imports.tables.len(),
             );
             debug_assert_eq!(imports.memories.len(), module.num_imported_memories);
             ptr::copy_nonoverlapping(
                 imports.memories.as_ptr(),
    -            self.vmctx_plus_offset(offsets.vmctx_imported_memories_begin()),
    +            self.vmctx_plus_offset_mut(offsets.vmctx_imported_memories_begin()),
                 imports.memories.len(),
             );
             debug_assert_eq!(imports.globals.len(), module.num_imported_globals);
             ptr::copy_nonoverlapping(
                 imports.globals.as_ptr(),
    -            self.vmctx_plus_offset(offsets.vmctx_imported_globals_begin()),
    +            self.vmctx_plus_offset_mut(offsets.vmctx_imported_globals_begin()),
                 imports.globals.len(),
             );
     
    @@ -967,7 +974,7 @@ impl Instance {
             // any state now.
     
             // Initialize the defined tables
    -        let mut ptr = self.vmctx_plus_offset(offsets.vmctx_tables_begin());
    +        let mut ptr = self.vmctx_plus_offset_mut(offsets.vmctx_tables_begin());
             for i in 0..module.table_plans.len() - module.num_imported_tables {
                 ptr::write(ptr, self.tables[DefinedTableIndex::new(i)].vmtable());
                 ptr = ptr.add(1);
    @@ -978,8 +985,8 @@ impl Instance {
             // time. Entries in `defined_memories` hold a pointer to a definition
             // (all memories) whereas the `owned_memories` hold the actual
             // definitions of memories owned (not shared) in the module.
    -        let mut ptr = self.vmctx_plus_offset(offsets.vmctx_memories_begin());
    -        let mut owned_ptr = self.vmctx_plus_offset(offsets.vmctx_owned_memories_begin());
    +        let mut ptr = self.vmctx_plus_offset_mut(offsets.vmctx_memories_begin());
    +        let mut owned_ptr = self.vmctx_plus_offset_mut(offsets.vmctx_owned_memories_begin());
             for i in 0..module.memory_plans.len() - module.num_imported_memories {
                 let defined_memory_index = DefinedMemoryIndex::new(i);
                 let memory_index = module.memory_index(defined_memory_index);
    @@ -1068,8 +1075,9 @@ impl Instance {
     impl Drop for Instance {
         fn drop(&mut self) {
             // Drop any defined globals
    -        for (idx, global) in self.module().globals.iter() {
    -            let idx = match self.module().defined_global_index(idx) {
    +        let module = self.module().clone();
    +        for (idx, global) in module.globals.iter() {
    +            let idx = match module.defined_global_index(idx) {
                     Some(idx) => idx,
                     None => continue,
                 };
    @@ -1182,8 +1190,8 @@ impl InstanceHandle {
         }
     
         /// Return the table index for the given `VMTableDefinition` in this instance.
    -    pub unsafe fn table_index(&self, table: &VMTableDefinition) -> DefinedTableIndex {
    -        self.instance().table_index(table)
    +    pub unsafe fn table_index(&mut self, table: &VMTableDefinition) -> DefinedTableIndex {
    +        self.instance_mut().table_index(table)
         }
     
         /// Get a table defined locally within this module.
    
  • crates/runtime/src/libcalls.rs +2 −2 modified
    @@ -412,7 +412,7 @@ unsafe fn activations_table_insert_with_gc(vmctx: *mut VMContext, externref: *mu
     // Perform a Wasm `global.get` for `externref` globals.
     unsafe fn externref_global_get(vmctx: *mut VMContext, index: u32) -> *mut u8 {
         let index = GlobalIndex::from_u32(index);
    -    let instance = (*vmctx).instance();
    +    let instance = (*vmctx).instance_mut();
         let global = instance.defined_or_imported_global_ptr(index);
         match (*global).as_externref().clone() {
             None => ptr::null_mut(),
    @@ -435,7 +435,7 @@ unsafe fn externref_global_set(vmctx: *mut VMContext, index: u32, externref: *mu
         };
     
         let index = GlobalIndex::from_u32(index);
    -    let instance = (*vmctx).instance();
    +    let instance = (*vmctx).instance_mut();
         let global = instance.defined_or_imported_global_ptr(index);
     
         // Swap the new `externref` value into the global before we drop the old
    
  • crates/runtime/src/traphandlers.rs +1 −1 modified
    @@ -219,7 +219,7 @@ pub unsafe fn catch_traps<'a, F>(
     where
         F: FnMut(*mut VMContext),
     {
    -    let limits = (*caller).instance().runtime_limits();
    +    let limits = (*caller).instance_mut().runtime_limits();
     
         let result = CallThreadState::new(signal_handler, capture_backtrace, *limits).with(|cx| {
             wasmtime_setjmp(
    
  • RELEASES.md +43 −0 modified
    @@ -10,6 +10,27 @@ Unreleased.
     
     --------------------------------------------------------------------------------
     
    +## 8.0.1
    +
    +Released 2023-04-27.
    +
    +### Changed
    +
    +* Breaking: Files opened using Wasmtime's implementation of WASI on Windows now
    +  cannot be deleted until the file handle is closed. This was already true for
    +  open directories. The change was necessary for the bug fix in
    +  [#6163](https://github.com/bytecodealliance/wasmtime/pull/6163).
    +
    +### Fixed
    +
    +* Fixed wasi-common's implementation of the `O_DIRECTORY` flag to match POSIX.
    +  [#6163](https://github.com/bytecodealliance/wasmtime/pull/6163)
    +
    +* Undefined Behavior in Rust runtime functions
    +  [GHSA-ch89-5g45-qwc7](https://github.com/bytecodealliance/wasmtime/security/advisories/GHSA-ch89-5g45-qwc7)
    +
    +--------------------------------------------------------------------------------
    +
     ## 8.0.0
     
     Released 2023-04-20
    @@ -110,6 +131,17 @@ Released 2023-04-20
     
     --------------------------------------------------------------------------------
     
    +## 7.0.1
    +
    +Released 2023-04-27.
    +
    +### Fixed
    +
    +* Undefined Behavior in Rust runtime functions
    +  [GHSA-ch89-5g45-qwc7](https://github.com/bytecodealliance/wasmtime/security/advisories/GHSA-ch89-5g45-qwc7)
    +
    +--------------------------------------------------------------------------------
    +
     ## 7.0.0
     
     Released 2023-03-20
    @@ -174,6 +206,17 @@ Released 2023-03-20
     
     --------------------------------------------------------------------------------
     
    +## 6.0.2
    +
    +Released 2023-04-27.
    +
    +### Fixed
    +
    +* Undefined Behavior in Rust runtime functions
    +  [GHSA-ch89-5g45-qwc7](https://github.com/bytecodealliance/wasmtime/security/advisories/GHSA-ch89-5g45-qwc7)
    +
    +--------------------------------------------------------------------------------
    +
     ## 6.0.1
     
     Released 2023-03-08.
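Beyond the &mut self changes, the module.rs hunks above also rework init_memory to thread an explicit &mut T state parameter through its callbacks, instead of letting several closures each capture &mut Instance (which Rust's borrow rules forbid). A minimal sketch of that pattern, with illustrative names rather than Wasmtime's real signatures:

```rust
// State-threading sketch: one `&mut T` is passed in once and handed to each
// callback explicitly, so two callbacks can both mutate the same state.
// Names are illustrative, not Wasmtime's actual API.
struct Instance {
    pages: u64,
}

enum InitMemory<'a, T> {
    CompileTime,
    Runtime {
        memory_size_in_pages: &'a dyn Fn(&mut T, usize) -> u64,
    },
}

fn init_memory<T>(
    state: &mut T,
    init: InitMemory<'_, T>,
    mut write: impl FnMut(&mut T, usize) -> bool,
) -> bool {
    // Both the size query and the write receive the shared state explicitly.
    let _size = match &init {
        InitMemory::CompileTime => 0,
        InitMemory::Runtime { memory_size_in_pages } => memory_size_in_pages(state, 0),
    };
    write(state, 0)
}

fn main() {
    let mut inst = Instance { pages: 3 };
    let ok = init_memory(
        &mut inst,
        InitMemory::Runtime {
            memory_size_in_pages: &|inst: &mut Instance, _idx| inst.pages,
        },
        |inst, _idx| {
            inst.pages += 1; // the write callback mutates the same state
            true
        },
    );
    assert!(ok);
    assert_eq!(inst.pages, 4);
    println!("ok");
}
```

The capturing-closure version of this would need two simultaneous mutable borrows of the instance; threading the state through explicitly keeps the borrow checker satisfied without unsafe aliasing.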
    


