alloc/boxed.rs
1//! The `Box<T>` type for heap allocation.
2//!
3//! [`Box<T>`], casually referred to as a 'box', provides the simplest form of
4//! heap allocation in Rust. Boxes provide ownership for this allocation, and
5//! drop their contents when they go out of scope. Boxes also ensure that they
6//! never allocate more than `isize::MAX` bytes.
7//!
8//! # Examples
9//!
10//! Move a value from the stack to the heap by creating a [`Box`]:
11//!
12//! ```
13//! let val: u8 = 5;
14//! let boxed: Box<u8> = Box::new(val);
15//! ```
16//!
17//! Move a value from a [`Box`] back to the stack by [dereferencing]:
18//!
19//! ```
20//! let boxed: Box<u8> = Box::new(5);
21//! let val: u8 = *boxed;
22//! ```
23//!
24//! Creating a recursive data structure:
25//!
26//! ```
27//! # #[allow(dead_code)]
28//! #[derive(Debug)]
29//! enum List<T> {
30//! Cons(T, Box<List<T>>),
31//! Nil,
32//! }
33//!
34//! let list: List<i32> = List::Cons(1, Box::new(List::Cons(2, Box::new(List::Nil))));
35//! println!("{list:?}");
36//! ```
37//!
38//! This will print `Cons(1, Cons(2, Nil))`.
39//!
40//! Recursive structures must be boxed, because if the definition of `Cons`
41//! looked like this:
42//!
43//! ```compile_fail,E0072
44//! # enum List<T> {
45//! Cons(T, List<T>),
46//! # }
47//! ```
48//!
49//! It wouldn't work. This is because the size of a `List` depends on how many
50//! elements are in the list, and so we don't know how much memory to allocate
51//! for a `Cons`. By introducing a [`Box<T>`], which has a defined size, we know how
52//! big `Cons` needs to be.
53//!
54//! # Memory layout
55//!
56//! For non-zero-sized values, a [`Box`] will use the [`Global`] allocator for its allocation. It is
57//! valid to convert both ways between a [`Box`] and a raw pointer allocated with the [`Global`]
58//! allocator, given that the [`Layout`] used with the allocator is correct for the type and the raw
59//! pointer points to a valid value of the right type. More precisely, a `value: *mut T` that has
60//! been allocated with the [`Global`] allocator with `Layout::for_value(&*value)` may be converted
61//! into a box using [`Box::<T>::from_raw(value)`]. Conversely, the memory backing a `value: *mut T`
62//! obtained from [`Box::<T>::into_raw`] may be deallocated using the [`Global`] allocator with
63//! [`Layout::for_value(&*value)`].
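//!
//! For example, the memory behind a pointer returned by [`Box::into_raw`] may be released
//! manually through the global allocator (a minimal sketch; converting back with
//! [`Box::from_raw`] and dropping the box is usually the more convenient route):
//!
//! ```
//! use std::alloc::{dealloc, Layout};
//!
//! let value: *mut u32 = Box::into_raw(Box::new(42));
//! unsafe {
//!     // The layout the global allocator used for the original `Box` allocation.
//!     let layout = Layout::for_value(&*value);
//!     // `u32` has no destructor to run, so deallocating is all that is needed here.
//!     dealloc(value.cast::<u8>(), layout);
//! }
//! ```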
64//!
65//! For zero-sized values, the `Box` pointer has to be non-null and sufficiently aligned. The
66//! recommended way to build a `Box` of a ZST if `Box::new` cannot be used is to use
67//! [`ptr::NonNull::dangling`].
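//!
//! For example (a minimal sketch):
//!
//! ```
//! use std::ptr::NonNull;
//!
//! // A zero-sized type needs no allocation; a well-aligned, non-null dangling
//! // pointer is all that a `Box` of it requires.
//! let zst: Box<()> = unsafe { Box::from_raw(NonNull::<()>::dangling().as_ptr()) };
//! // Dropping it is fine too: no memory is deallocated for a zero-sized value.
//! drop(zst);
//! ```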
68//!
69//! On top of these basic layout requirements, a `Box<T>` must point to a valid value of `T`.
70//!
71//! So long as `T: Sized`, a `Box<T>` is guaranteed to be represented
72//! as a single pointer and is also ABI-compatible with C pointers
73//! (i.e. the C type `T*`). This means that if you have `extern "C"`
74//! Rust functions that will be called from C, you can define those
75//! Rust functions using `Box<T>` types, and use `T*` as corresponding
76//! type on the C side. As an example, consider this C header which
77//! declares functions that create and destroy some kind of `Foo`
78//! value:
79//!
80//! ```c
81//! /* C header */
82//!
83//! /* Returns ownership to the caller */
84//! struct Foo* foo_new(void);
85//!
86//! /* Takes ownership from the caller; no-op when invoked with null */
87//! void foo_delete(struct Foo*);
88//! ```
89//!
90//! These two functions might be implemented in Rust as follows. Here, the
91//! `struct Foo*` type from C is translated to `Box<Foo>`, which captures
92//! the ownership constraints. Note also that the nullable argument to
93//! `foo_delete` is represented in Rust as `Option<Box<Foo>>`, since `Box<Foo>`
94//! cannot be null.
95//!
96//! ```
97//! #[repr(C)]
98//! pub struct Foo;
99//!
100//! #[unsafe(no_mangle)]
101//! pub extern "C" fn foo_new() -> Box<Foo> {
102//! Box::new(Foo)
103//! }
104//!
105//! #[unsafe(no_mangle)]
106//! pub extern "C" fn foo_delete(_: Option<Box<Foo>>) {}
107//! ```
108//!
109//! Even though `Box<T>` has the same representation and C ABI as a C pointer,
110//! this does not mean that you can convert an arbitrary `T*` into a `Box<T>`
111//! and expect things to work. `Box<T>` values will always be fully aligned,
112//! non-null pointers. Moreover, the destructor for `Box<T>` will attempt to
113//! free the value with the global allocator. In general, the best practice
114//! is to only use `Box<T>` for pointers that originated from the global
115//! allocator.
116//!
117//! **Important.** At least at present, you should avoid using
118//! `Box<T>` types for functions that are defined in C but invoked
119//! from Rust. In those cases, you should directly mirror the C types
120//! as closely as possible. Using types like `Box<T>` where the C
121//! definition is just using `T*` can lead to undefined behavior, as
122//! described in [rust-lang/unsafe-code-guidelines#198][ucg#198].
123//!
124//! # Considerations for unsafe code
125//!
126//! **Warning: This section is not normative and is subject to change, possibly
127//! being relaxed in the future! It is a simplified summary of the rules
128//! currently implemented in the compiler.**
129//!
130//! The aliasing rules for `Box<T>` are the same as for `&mut T`. `Box<T>`
131//! asserts uniqueness over its content. Using raw pointers derived from a box
132//! after that box has been mutated through, moved, or borrowed as `&mut T`
133//! is not allowed. For more guidance on working with boxes from unsafe code, see
134//! [rust-lang/unsafe-code-guidelines#326][ucg#326].
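//!
//! As a rough, non-normative sketch of that rule:
//!
//! ```
//! let mut b = Box::new(0u8);
//! let raw = &raw mut *b;
//! // Using `raw` is fine while the box itself is left untouched...
//! unsafe { raw.write(1) };
//! // ...but borrowing the box as `&mut T` reasserts its uniqueness,
//! let r = &mut *b;
//! *r = 2;
//! // so `raw` must not be used from this point on.
//! assert_eq!(*b, 2);
//! ```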
135//!
136//! # Editions
137//!
138//! A special case exists for the implementation of `IntoIterator` for arrays on the Rust 2021
139//! edition, as documented [here][array]. Unfortunately, it was later found that a similar
140//! workaround should be added for boxed slices, and this was applied in the 2024 edition.
141//!
142//! Specifically, `IntoIterator` is implemented for `Box<[T]>` on all editions, but specific calls
143//! to `into_iter()` for boxed slices will defer to the slice implementation on editions before
144//! 2024:
145//!
146//! ```rust,edition2021
147//! // Rust 2015, 2018, and 2021:
148//!
149//! # #![allow(boxed_slice_into_iter)] // override our `deny(warnings)`
150//! let boxed_slice: Box<[i32]> = vec![0; 3].into_boxed_slice();
151//!
152//! // This creates a slice iterator, producing references to each value.
153//! for item in boxed_slice.into_iter().enumerate() {
154//! let (i, x): (usize, &i32) = item;
155//! println!("boxed_slice[{i}] = {x}");
156//! }
157//!
158//! // The `boxed_slice_into_iter` lint suggests this change for future compatibility:
159//! for item in boxed_slice.iter().enumerate() {
160//! let (i, x): (usize, &i32) = item;
161//! println!("boxed_slice[{i}] = {x}");
162//! }
163//!
164//! // You can explicitly iterate a boxed slice by value using `IntoIterator::into_iter`
165//! for item in IntoIterator::into_iter(boxed_slice).enumerate() {
166//! let (i, x): (usize, i32) = item;
167//! println!("boxed_slice[{i}] = {x}");
168//! }
169//! ```
170//!
171//! As with the array implementation, this may be modified in the future to remove this override,
172//! and it's best to avoid relying on this edition-dependent behavior if you wish to preserve
173//! compatibility with future versions of the compiler.
174//!
175//! [ucg#198]: https://github.com/rust-lang/unsafe-code-guidelines/issues/198
176//! [ucg#326]: https://github.com/rust-lang/unsafe-code-guidelines/issues/326
177//! [dereferencing]: core::ops::Deref
178//! [`Box::<T>::from_raw(value)`]: Box::from_raw
179//! [`Global`]: crate::alloc::Global
180//! [`Layout`]: crate::alloc::Layout
181//! [`Layout::for_value(&*value)`]: crate::alloc::Layout::for_value
182//! [valid]: ptr#safety
183
184#![stable(feature = "rust1", since = "1.0.0")]
185
186use core::borrow::{Borrow, BorrowMut};
187#[cfg(not(no_global_oom_handling))]
188use core::clone::CloneToUninit;
189use core::cmp::Ordering;
190use core::error::{self, Error};
191use core::fmt;
192use core::future::Future;
193use core::hash::{Hash, Hasher};
194use core::marker::{Tuple, Unsize};
195use core::mem::{self, SizedTypeProperties};
196use core::ops::{
197 AsyncFn, AsyncFnMut, AsyncFnOnce, CoerceUnsized, Coroutine, CoroutineState, Deref, DerefMut,
198 DerefPure, DispatchFromDyn, LegacyReceiver,
199};
200use core::pin::{Pin, PinCoerceUnsized};
201use core::ptr::{self, NonNull, Unique};
202use core::task::{Context, Poll};
203
204#[cfg(not(no_global_oom_handling))]
205use crate::alloc::handle_alloc_error;
206use crate::alloc::{AllocError, Allocator, Global, Layout};
207use crate::raw_vec::RawVec;
208#[cfg(not(no_global_oom_handling))]
209use crate::str::from_boxed_utf8_unchecked;
210
211/// Conversion-related impls for `Box<_>` (`From`, `downcast`, etc.).
212mod convert;
213/// Iterator-related impls for `Box<_>`.
214mod iter;
215/// [`ThinBox`] implementation.
216mod thin;
217
218#[unstable(feature = "thin_box", issue = "92791")]
219pub use thin::ThinBox;
220
221/// A pointer type that uniquely owns a heap allocation of type `T`.
222///
223/// See the [module-level documentation](../../std/boxed/index.html) for more.
224#[lang = "owned_box"]
225#[fundamental]
226#[stable(feature = "rust1", since = "1.0.0")]
227#[rustc_insignificant_dtor]
228#[doc(search_unbox)]
229// The declaration of the `Box` struct must be kept in sync with the
230// compiler or ICEs will happen.
231pub struct Box<
232 T: ?Sized,
233 #[unstable(feature = "allocator_api", issue = "32838")] A: Allocator = Global,
234>(Unique<T>, A);
235
236/// Constructs a `Box<T>` by calling the `exchange_malloc` lang item and moving the argument into
237/// the newly allocated memory. This is an intrinsic to avoid unnecessary copies.
238///
239/// This is the surface syntax for `box <expr>` expressions.
240#[rustc_intrinsic]
241#[unstable(feature = "liballoc_internals", issue = "none")]
242pub fn box_new<T>(x: T) -> Box<T>;
243
244impl<T> Box<T> {
245 /// Allocates memory on the heap and then places `x` into it.
246 ///
247 /// This doesn't actually allocate if `T` is zero-sized.
248 ///
249 /// # Examples
250 ///
251 /// ```
252 /// let five = Box::new(5);
253 /// ```
254 #[cfg(not(no_global_oom_handling))]
255 #[inline(always)]
256 #[stable(feature = "rust1", since = "1.0.0")]
257 #[must_use]
258 #[rustc_diagnostic_item = "box_new"]
259 #[cfg_attr(miri, track_caller)] // even without panics, this helps for Miri backtraces
260 pub fn new(x: T) -> Self {
261 return box_new(x);
262 }
263
264 /// Constructs a new box with uninitialized contents.
265 ///
266 /// # Examples
267 ///
268 /// ```
269 /// let mut five = Box::<u32>::new_uninit();
270 /// // Deferred initialization:
271 /// five.write(5);
272 /// let five = unsafe { five.assume_init() };
273 ///
274 /// assert_eq!(*five, 5)
275 /// ```
276 #[cfg(not(no_global_oom_handling))]
277 #[stable(feature = "new_uninit", since = "1.82.0")]
278 #[must_use]
279 #[inline]
280 pub fn new_uninit() -> Box<mem::MaybeUninit<T>> {
281 Self::new_uninit_in(Global)
282 }
283
284 /// Constructs a new `Box` with uninitialized contents, with the memory
285 /// being filled with `0` bytes.
286 ///
287 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
288 /// of this method.
289 ///
290 /// # Examples
291 ///
292 /// ```
293 /// let zero = Box::<u32>::new_zeroed();
294 /// let zero = unsafe { zero.assume_init() };
295 ///
296 /// assert_eq!(*zero, 0)
297 /// ```
298 ///
299 /// [zeroed]: mem::MaybeUninit::zeroed
300 #[cfg(not(no_global_oom_handling))]
301 #[inline]
302 #[stable(feature = "new_zeroed_alloc", since = "CURRENT_RUSTC_VERSION")]
303 #[must_use]
304 pub fn new_zeroed() -> Box<mem::MaybeUninit<T>> {
305 Self::new_zeroed_in(Global)
306 }
307
308 /// Constructs a new `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
309 /// `x` will be pinned in memory and unable to be moved.
310 ///
311 /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin(x)`
312 /// does the same as <code>[Box::into_pin]\([Box::new]\(x))</code>. Consider using
313 /// [`into_pin`](Box::into_pin) if you already have a `Box<T>`, or if you want to
314 /// construct a (pinned) `Box` in a different way than with [`Box::new`].
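///
/// # Examples
///
/// A minimal sketch (`u32` is `Unpin`, so this only demonstrates the constructor):
///
/// ```
/// use std::pin::Pin;
///
/// let pinned: Pin<Box<u32>> = Box::pin(5);
/// assert_eq!(*pinned, 5);
/// ```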
315 #[cfg(not(no_global_oom_handling))]
316 #[stable(feature = "pin", since = "1.33.0")]
317 #[must_use]
318 #[inline(always)]
319 pub fn pin(x: T) -> Pin<Box<T>> {
320 Box::new(x).into()
321 }
322
323 /// Allocates memory on the heap and then places `x` into it,
324 /// returning an error if the allocation fails.
325 ///
326 /// This doesn't actually allocate if `T` is zero-sized.
327 ///
328 /// # Examples
329 ///
330 /// ```
331 /// #![feature(allocator_api)]
332 ///
333 /// let five = Box::try_new(5)?;
334 /// # Ok::<(), std::alloc::AllocError>(())
335 /// ```
336 #[unstable(feature = "allocator_api", issue = "32838")]
337 #[inline]
338 pub fn try_new(x: T) -> Result<Self, AllocError> {
339 Self::try_new_in(x, Global)
340 }
341
342 /// Constructs a new box with uninitialized contents on the heap,
343 /// returning an error if the allocation fails.
344 ///
345 /// # Examples
346 ///
347 /// ```
348 /// #![feature(allocator_api)]
349 ///
350 /// let mut five = Box::<u32>::try_new_uninit()?;
351 /// // Deferred initialization:
352 /// five.write(5);
353 /// let five = unsafe { five.assume_init() };
354 ///
355 /// assert_eq!(*five, 5);
356 /// # Ok::<(), std::alloc::AllocError>(())
357 /// ```
358 #[unstable(feature = "allocator_api", issue = "32838")]
359 #[inline]
360 pub fn try_new_uninit() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
361 Box::try_new_uninit_in(Global)
362 }
363
364 /// Constructs a new `Box` with uninitialized contents, with the memory
365 /// being filled with `0` bytes on the heap, returning an error if the allocation fails.
366 ///
367 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
368 /// of this method.
369 ///
370 /// # Examples
371 ///
372 /// ```
373 /// #![feature(allocator_api)]
374 ///
375 /// let zero = Box::<u32>::try_new_zeroed()?;
376 /// let zero = unsafe { zero.assume_init() };
377 ///
378 /// assert_eq!(*zero, 0);
379 /// # Ok::<(), std::alloc::AllocError>(())
380 /// ```
381 ///
382 /// [zeroed]: mem::MaybeUninit::zeroed
383 #[unstable(feature = "allocator_api", issue = "32838")]
384 #[inline]
385 pub fn try_new_zeroed() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
386 Box::try_new_zeroed_in(Global)
387 }
388}
389
390impl<T, A: Allocator> Box<T, A> {
391 /// Allocates memory in the given allocator then places `x` into it.
392 ///
393 /// This doesn't actually allocate if `T` is zero-sized.
394 ///
395 /// # Examples
396 ///
397 /// ```
398 /// #![feature(allocator_api)]
399 ///
400 /// use std::alloc::System;
401 ///
402 /// let five = Box::new_in(5, System);
403 /// ```
404 #[cfg(not(no_global_oom_handling))]
405 #[unstable(feature = "allocator_api", issue = "32838")]
406 #[must_use]
407 #[inline]
408 pub fn new_in(x: T, alloc: A) -> Self
409 where
410 A: Allocator,
411 {
412 let mut boxed = Self::new_uninit_in(alloc);
413 boxed.write(x);
414 unsafe { boxed.assume_init() }
415 }
416
417 /// Allocates memory in the given allocator then places `x` into it,
418 /// returning an error if the allocation fails.
419 ///
420 /// This doesn't actually allocate if `T` is zero-sized.
421 ///
422 /// # Examples
423 ///
424 /// ```
425 /// #![feature(allocator_api)]
426 ///
427 /// use std::alloc::System;
428 ///
429 /// let five = Box::try_new_in(5, System)?;
430 /// # Ok::<(), std::alloc::AllocError>(())
431 /// ```
432 #[unstable(feature = "allocator_api", issue = "32838")]
433 #[inline]
434 pub fn try_new_in(x: T, alloc: A) -> Result<Self, AllocError>
435 where
436 A: Allocator,
437 {
438 let mut boxed = Self::try_new_uninit_in(alloc)?;
439 boxed.write(x);
440 unsafe { Ok(boxed.assume_init()) }
441 }
442
443 /// Constructs a new box with uninitialized contents in the provided allocator.
444 ///
445 /// # Examples
446 ///
447 /// ```
448 /// #![feature(allocator_api)]
449 ///
450 /// use std::alloc::System;
451 ///
452 /// let mut five = Box::<u32, _>::new_uninit_in(System);
453 /// // Deferred initialization:
454 /// five.write(5);
455 /// let five = unsafe { five.assume_init() };
456 ///
457 /// assert_eq!(*five, 5)
458 /// ```
459 #[unstable(feature = "allocator_api", issue = "32838")]
460 #[cfg(not(no_global_oom_handling))]
461 #[must_use]
462 pub fn new_uninit_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
463 where
464 A: Allocator,
465 {
466 let layout = Layout::new::<mem::MaybeUninit<T>>();
467 // NOTE: Prefer `match` over `unwrap_or_else` since the closure is sometimes not inlineable,
468 // which would make the code size bigger.
469 match Box::try_new_uninit_in(alloc) {
470 Ok(m) => m,
471 Err(_) => handle_alloc_error(layout),
472 }
473 }
474
475 /// Constructs a new box with uninitialized contents in the provided allocator,
476 /// returning an error if the allocation fails.
477 ///
478 /// # Examples
479 ///
480 /// ```
481 /// #![feature(allocator_api)]
482 ///
483 /// use std::alloc::System;
484 ///
485 /// let mut five = Box::<u32, _>::try_new_uninit_in(System)?;
486 /// // Deferred initialization:
487 /// five.write(5);
488 /// let five = unsafe { five.assume_init() };
489 ///
490 /// assert_eq!(*five, 5);
491 /// # Ok::<(), std::alloc::AllocError>(())
492 /// ```
493 #[unstable(feature = "allocator_api", issue = "32838")]
494 pub fn try_new_uninit_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
495 where
496 A: Allocator,
497 {
498 let ptr = if T::IS_ZST {
499 NonNull::dangling()
500 } else {
501 let layout = Layout::new::<mem::MaybeUninit<T>>();
502 alloc.allocate(layout)?.cast()
503 };
504 unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
505 }
506
507 /// Constructs a new `Box` with uninitialized contents, with the memory
508 /// being filled with `0` bytes in the provided allocator.
509 ///
510 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
511 /// of this method.
512 ///
513 /// # Examples
514 ///
515 /// ```
516 /// #![feature(allocator_api)]
517 ///
518 /// use std::alloc::System;
519 ///
520 /// let zero = Box::<u32, _>::new_zeroed_in(System);
521 /// let zero = unsafe { zero.assume_init() };
522 ///
523 /// assert_eq!(*zero, 0)
524 /// ```
525 ///
526 /// [zeroed]: mem::MaybeUninit::zeroed
527 #[unstable(feature = "allocator_api", issue = "32838")]
528 #[cfg(not(no_global_oom_handling))]
529 #[must_use]
530 pub fn new_zeroed_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
531 where
532 A: Allocator,
533 {
534 let layout = Layout::new::<mem::MaybeUninit<T>>();
535 // NOTE: Prefer `match` over `unwrap_or_else` since the closure is sometimes not inlineable,
536 // which would make the code size bigger.
537 match Box::try_new_zeroed_in(alloc) {
538 Ok(m) => m,
539 Err(_) => handle_alloc_error(layout),
540 }
541 }
542
543 /// Constructs a new `Box` with uninitialized contents, with the memory
544 /// being filled with `0` bytes in the provided allocator,
545 /// returning an error if the allocation fails.
546 ///
547 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
548 /// of this method.
549 ///
550 /// # Examples
551 ///
552 /// ```
553 /// #![feature(allocator_api)]
554 ///
555 /// use std::alloc::System;
556 ///
557 /// let zero = Box::<u32, _>::try_new_zeroed_in(System)?;
558 /// let zero = unsafe { zero.assume_init() };
559 ///
560 /// assert_eq!(*zero, 0);
561 /// # Ok::<(), std::alloc::AllocError>(())
562 /// ```
563 ///
564 /// [zeroed]: mem::MaybeUninit::zeroed
565 #[unstable(feature = "allocator_api", issue = "32838")]
566 pub fn try_new_zeroed_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
567 where
568 A: Allocator,
569 {
570 let ptr = if T::IS_ZST {
571 NonNull::dangling()
572 } else {
573 let layout = Layout::new::<mem::MaybeUninit<T>>();
574 alloc.allocate_zeroed(layout)?.cast()
575 };
576 unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
577 }
578
579 /// Constructs a new `Pin<Box<T, A>>`. If `T` does not implement [`Unpin`], then
580 /// `x` will be pinned in memory and unable to be moved.
581 ///
582 /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin_in(x, alloc)`
583 /// does the same as <code>[Box::into_pin]\([Box::new_in]\(x, alloc))</code>. Consider using
584 /// [`into_pin`](Box::into_pin) if you already have a `Box<T, A>`, or if you want to
585 /// construct a (pinned) `Box` in a different way than with [`Box::new_in`].
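///
/// # Examples
///
/// A minimal sketch with the `System` allocator:
///
/// ```
/// #![feature(allocator_api)]
///
/// use std::alloc::System;
/// use std::pin::Pin;
///
/// let pinned: Pin<Box<u32, System>> = Box::pin_in(5, System);
/// assert_eq!(*pinned, 5);
/// ```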
586 #[cfg(not(no_global_oom_handling))]
587 #[unstable(feature = "allocator_api", issue = "32838")]
588 #[must_use]
589 #[inline(always)]
590 pub fn pin_in(x: T, alloc: A) -> Pin<Self>
591 where
592 A: 'static + Allocator,
593 {
594 Self::into_pin(Self::new_in(x, alloc))
595 }
596
597 /// Converts a `Box<T>` into a `Box<[T]>`.
598 ///
599 /// This conversion does not allocate on the heap and happens in place.
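///
/// # Examples
///
/// A minimal sketch:
///
/// ```
/// #![feature(box_into_boxed_slice)]
///
/// let b = Box::new(5);
/// let slice: Box<[i32]> = Box::into_boxed_slice(b);
/// assert_eq!(*slice, [5]);
/// ```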
600 #[unstable(feature = "box_into_boxed_slice", issue = "71582")]
601 pub fn into_boxed_slice(boxed: Self) -> Box<[T], A> {
602 let (raw, alloc) = Box::into_raw_with_allocator(boxed);
603 unsafe { Box::from_raw_in(raw as *mut [T; 1], alloc) }
604 }
605
606 /// Consumes the `Box`, returning the wrapped value.
607 ///
608 /// # Examples
609 ///
610 /// ```
611 /// #![feature(box_into_inner)]
612 ///
613 /// let c = Box::new(5);
614 ///
615 /// assert_eq!(Box::into_inner(c), 5);
616 /// ```
617 #[unstable(feature = "box_into_inner", issue = "80437")]
618 #[inline]
619 pub fn into_inner(boxed: Self) -> T {
620 *boxed
621 }
622}
623
624impl<T> Box<[T]> {
625 /// Constructs a new boxed slice with uninitialized contents.
626 ///
627 /// # Examples
628 ///
629 /// ```
630 /// let mut values = Box::<[u32]>::new_uninit_slice(3);
631 /// // Deferred initialization:
632 /// values[0].write(1);
633 /// values[1].write(2);
634 /// values[2].write(3);
635 /// let values = unsafe { values.assume_init() };
636 ///
637 /// assert_eq!(*values, [1, 2, 3])
638 /// ```
639 #[cfg(not(no_global_oom_handling))]
640 #[stable(feature = "new_uninit", since = "1.82.0")]
641 #[must_use]
642 pub fn new_uninit_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
643 unsafe { RawVec::with_capacity(len).into_box(len) }
644 }
645
646 /// Constructs a new boxed slice with uninitialized contents, with the memory
647 /// being filled with `0` bytes.
648 ///
649 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
650 /// of this method.
651 ///
652 /// # Examples
653 ///
654 /// ```
655 /// let values = Box::<[u32]>::new_zeroed_slice(3);
656 /// let values = unsafe { values.assume_init() };
657 ///
658 /// assert_eq!(*values, [0, 0, 0])
659 /// ```
660 ///
661 /// [zeroed]: mem::MaybeUninit::zeroed
662 #[cfg(not(no_global_oom_handling))]
663 #[stable(feature = "new_zeroed_alloc", since = "CURRENT_RUSTC_VERSION")]
664 #[must_use]
665 pub fn new_zeroed_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
666 unsafe { RawVec::with_capacity_zeroed(len).into_box(len) }
667 }
668
669 /// Constructs a new boxed slice with uninitialized contents. Returns an error if
670 /// the allocation fails.
671 ///
672 /// # Examples
673 ///
674 /// ```
675 /// #![feature(allocator_api)]
676 ///
677 /// let mut values = Box::<[u32]>::try_new_uninit_slice(3)?;
678 /// // Deferred initialization:
679 /// values[0].write(1);
680 /// values[1].write(2);
681 /// values[2].write(3);
682 /// let values = unsafe { values.assume_init() };
683 ///
684 /// assert_eq!(*values, [1, 2, 3]);
685 /// # Ok::<(), std::alloc::AllocError>(())
686 /// ```
687 #[unstable(feature = "allocator_api", issue = "32838")]
688 #[inline]
689 pub fn try_new_uninit_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
690 let ptr = if T::IS_ZST || len == 0 {
691 NonNull::dangling()
692 } else {
693 let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
694 Ok(l) => l,
695 Err(_) => return Err(AllocError),
696 };
697 Global.allocate(layout)?.cast()
698 };
699 unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
700 }
701
702 /// Constructs a new boxed slice with uninitialized contents, with the memory
703 /// being filled with `0` bytes. Returns an error if the allocation fails.
704 ///
705 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
706 /// of this method.
707 ///
708 /// # Examples
709 ///
710 /// ```
711 /// #![feature(allocator_api)]
712 ///
713 /// let values = Box::<[u32]>::try_new_zeroed_slice(3)?;
714 /// let values = unsafe { values.assume_init() };
715 ///
716 /// assert_eq!(*values, [0, 0, 0]);
717 /// # Ok::<(), std::alloc::AllocError>(())
718 /// ```
719 ///
720 /// [zeroed]: mem::MaybeUninit::zeroed
721 #[unstable(feature = "allocator_api", issue = "32838")]
722 #[inline]
723 pub fn try_new_zeroed_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
724 let ptr = if T::IS_ZST || len == 0 {
725 NonNull::dangling()
726 } else {
727 let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
728 Ok(l) => l,
729 Err(_) => return Err(AllocError),
730 };
731 Global.allocate_zeroed(layout)?.cast()
732 };
733 unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
734 }
735
736 /// Converts the boxed slice into a boxed array.
737 ///
738 /// This operation does not reallocate; the underlying array of the slice is simply reinterpreted as an array type.
739 ///
740 /// If `N` is not exactly equal to the length of `self`, then this method returns `None`.
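///
/// # Examples
///
/// A minimal sketch:
///
/// ```
/// #![feature(slice_as_array)]
///
/// let boxed_slice: Box<[i32]> = vec![1, 2, 3].into_boxed_slice();
/// let boxed_array: Box<[i32; 3]> = boxed_slice.into_array().expect("length matches");
/// assert_eq!(*boxed_array, [1, 2, 3]);
///
/// let too_short: Box<[i32]> = vec![1, 2].into_boxed_slice();
/// assert!(too_short.into_array::<3>().is_none());
/// ```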
741 #[unstable(feature = "slice_as_array", issue = "133508")]
742 #[inline]
743 #[must_use]
744 pub fn into_array<const N: usize>(self) -> Option<Box<[T; N]>> {
745 if self.len() == N {
746 let ptr = Self::into_raw(self) as *mut [T; N];
747
748 // SAFETY: The underlying array of a slice has the exact same layout as an actual array `[T; N]` if `N` is equal to the slice's length.
749 let me = unsafe { Box::from_raw(ptr) };
750 Some(me)
751 } else {
752 None
753 }
754 }
755}
756
757impl<T, A: Allocator> Box<[T], A> {
758 /// Constructs a new boxed slice with uninitialized contents in the provided allocator.
759 ///
760 /// # Examples
761 ///
762 /// ```
763 /// #![feature(allocator_api)]
764 ///
765 /// use std::alloc::System;
766 ///
767 /// let mut values = Box::<[u32], _>::new_uninit_slice_in(3, System);
768 /// // Deferred initialization:
769 /// values[0].write(1);
770 /// values[1].write(2);
771 /// values[2].write(3);
772 /// let values = unsafe { values.assume_init() };
773 ///
774 /// assert_eq!(*values, [1, 2, 3])
775 /// ```
776 #[cfg(not(no_global_oom_handling))]
777 #[unstable(feature = "allocator_api", issue = "32838")]
778 #[must_use]
779 pub fn new_uninit_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
780 unsafe { RawVec::with_capacity_in(len, alloc).into_box(len) }
781 }
782
783 /// Constructs a new boxed slice with uninitialized contents in the provided allocator,
784 /// with the memory being filled with `0` bytes.
785 ///
786 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
787 /// of this method.
788 ///
789 /// # Examples
790 ///
791 /// ```
792 /// #![feature(allocator_api)]
793 ///
794 /// use std::alloc::System;
795 ///
796 /// let values = Box::<[u32], _>::new_zeroed_slice_in(3, System);
797 /// let values = unsafe { values.assume_init() };
798 ///
799 /// assert_eq!(*values, [0, 0, 0])
800 /// ```
801 ///
802 /// [zeroed]: mem::MaybeUninit::zeroed
803 #[cfg(not(no_global_oom_handling))]
804 #[unstable(feature = "allocator_api", issue = "32838")]
805 #[must_use]
806 pub fn new_zeroed_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
807 unsafe { RawVec::with_capacity_zeroed_in(len, alloc).into_box(len) }
808 }
809
810 /// Constructs a new boxed slice with uninitialized contents in the provided allocator. Returns an error if
811 /// the allocation fails.
812 ///
813 /// # Examples
814 ///
815 /// ```
816 /// #![feature(allocator_api)]
817 ///
818 /// use std::alloc::System;
819 ///
820 /// let mut values = Box::<[u32], _>::try_new_uninit_slice_in(3, System)?;
821 /// // Deferred initialization:
822 /// values[0].write(1);
823 /// values[1].write(2);
824 /// values[2].write(3);
825 /// let values = unsafe { values.assume_init() };
826 ///
827 /// assert_eq!(*values, [1, 2, 3]);
828 /// # Ok::<(), std::alloc::AllocError>(())
829 /// ```
830 #[unstable(feature = "allocator_api", issue = "32838")]
831 #[inline]
832 pub fn try_new_uninit_slice_in(
833 len: usize,
834 alloc: A,
835 ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
836 let ptr = if T::IS_ZST || len == 0 {
837 NonNull::dangling()
838 } else {
839 let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
840 Ok(l) => l,
841 Err(_) => return Err(AllocError),
842 };
843 alloc.allocate(layout)?.cast()
844 };
845 unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
846 }
847
848 /// Constructs a new boxed slice with uninitialized contents in the provided allocator, with the memory
849 /// being filled with `0` bytes. Returns an error if the allocation fails.
850 ///
851 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
852 /// of this method.
853 ///
854 /// # Examples
855 ///
856 /// ```
857 /// #![feature(allocator_api)]
858 ///
859 /// use std::alloc::System;
860 ///
861 /// let values = Box::<[u32], _>::try_new_zeroed_slice_in(3, System)?;
862 /// let values = unsafe { values.assume_init() };
863 ///
864 /// assert_eq!(*values, [0, 0, 0]);
865 /// # Ok::<(), std::alloc::AllocError>(())
866 /// ```
867 ///
868 /// [zeroed]: mem::MaybeUninit::zeroed
869 #[unstable(feature = "allocator_api", issue = "32838")]
870 #[inline]
871 pub fn try_new_zeroed_slice_in(
872 len: usize,
873 alloc: A,
874 ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
875 let ptr = if T::IS_ZST || len == 0 {
876 NonNull::dangling()
877 } else {
878 let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
879 Ok(l) => l,
880 Err(_) => return Err(AllocError),
881 };
882 alloc.allocate_zeroed(layout)?.cast()
883 };
884 unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
885 }
886}
887
888impl<T, A: Allocator> Box<mem::MaybeUninit<T>, A> {
889 /// Converts to `Box<T, A>`.
890 ///
891 /// # Safety
892 ///
893 /// As with [`MaybeUninit::assume_init`],
894 /// it is up to the caller to guarantee that the value
895 /// really is in an initialized state.
896 /// Calling this when the content is not yet fully initialized
897 /// causes immediate undefined behavior.
898 ///
899 /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
900 ///
901 /// # Examples
902 ///
903 /// ```
904 /// let mut five = Box::<u32>::new_uninit();
905 /// // Deferred initialization:
906 /// five.write(5);
907 /// let five: Box<u32> = unsafe { five.assume_init() };
908 ///
909 /// assert_eq!(*five, 5)
910 /// ```
911 #[stable(feature = "new_uninit", since = "1.82.0")]
912 #[inline]
913 pub unsafe fn assume_init(self) -> Box<T, A> {
914 let (raw, alloc) = Box::into_raw_with_allocator(self);
915 unsafe { Box::from_raw_in(raw as *mut T, alloc) }
916 }
917
918 /// Writes the value and converts to `Box<T, A>`.
919 ///
920 /// This method converts the box similarly to [`Box::assume_init`] but
921 /// writes `value` into it before the conversion, thus guaranteeing safety.
922 /// In some scenarios, use of this method may improve performance because
923 /// the compiler may be able to optimize away the copy from the stack.
924 ///
925 /// # Examples
926 ///
927 /// ```
928 /// let big_box = Box::<[usize; 1024]>::new_uninit();
929 ///
930 /// let mut array = [0; 1024];
931 /// for (i, place) in array.iter_mut().enumerate() {
932 /// *place = i;
933 /// }
934 ///
935 /// // The optimizer may be able to elide this copy, so the previous code writes
936 /// // directly to the heap.
937 /// let big_box = Box::write(big_box, array);
938 ///
939 /// for (i, x) in big_box.iter().enumerate() {
940 /// assert_eq!(*x, i);
941 /// }
942 /// ```
943 #[stable(feature = "box_uninit_write", since = "1.87.0")]
944 #[inline]
945 pub fn write(mut boxed: Self, value: T) -> Box<T, A> {
946 unsafe {
947 (*boxed).write(value);
948 boxed.assume_init()
949 }
950 }
951}
952
953impl<T, A: Allocator> Box<[mem::MaybeUninit<T>], A> {
954 /// Converts to `Box<[T], A>`.
955 ///
956 /// # Safety
957 ///
958 /// As with [`MaybeUninit::assume_init`],
959 /// it is up to the caller to guarantee that the values
960 /// really are in an initialized state.
961 /// Calling this when the content is not yet fully initialized
962 /// causes immediate undefined behavior.
963 ///
964 /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
965 ///
966 /// # Examples
967 ///
968 /// ```
969 /// let mut values = Box::<[u32]>::new_uninit_slice(3);
970 /// // Deferred initialization:
971 /// values[0].write(1);
972 /// values[1].write(2);
973 /// values[2].write(3);
974 /// let values = unsafe { values.assume_init() };
975 ///
976 /// assert_eq!(*values, [1, 2, 3])
977 /// ```
978 #[stable(feature = "new_uninit", since = "1.82.0")]
979 #[inline]
980 pub unsafe fn assume_init(self) -> Box<[T], A> {
981 let (raw, alloc) = Box::into_raw_with_allocator(self);
982 unsafe { Box::from_raw_in(raw as *mut [T], alloc) }
983 }
984}
985
986impl<T: ?Sized> Box<T> {
987 /// Constructs a box from a raw pointer.
988 ///
989 /// After calling this function, the raw pointer is owned by the
990 /// resulting `Box`. Specifically, the `Box` destructor will call
991 /// the destructor of `T` and free the allocated memory. For this
992 /// to be safe, the memory must have been allocated in accordance
993 /// with the [memory layout] used by `Box`.
994 ///
995 /// # Safety
996 ///
997 /// This function is unsafe because improper use may lead to
998 /// memory problems. For example, a double-free may occur if the
999 /// function is called twice on the same raw pointer.
1000 ///
1001 /// The raw pointer must point to a block of memory allocated by the global allocator.
1002 ///
1003 /// The safety conditions are described in the [memory layout] section.
1004 ///
1005 /// # Examples
1006 ///
1007 /// Recreate a `Box` which was previously converted to a raw pointer
1008 /// using [`Box::into_raw`]:
1009 /// ```
1010 /// let x = Box::new(5);
1011 /// let ptr = Box::into_raw(x);
1012 /// let x = unsafe { Box::from_raw(ptr) };
1013 /// ```
1014 /// Manually create a `Box` from scratch by using the global allocator:
1015 /// ```
1016 /// use std::alloc::{alloc, Layout};
1017 ///
1018 /// unsafe {
1019 /// let ptr = alloc(Layout::new::<i32>()) as *mut i32;
1020 /// // In general .write is required to avoid attempting to destruct
1021 /// // the (uninitialized) previous contents of `ptr`, though for this
1022 /// // simple example `*ptr = 5` would have worked as well.
1023 /// ptr.write(5);
1024 /// let x = Box::from_raw(ptr);
1025 /// }
1026 /// ```
1027 ///
1028 /// [memory layout]: self#memory-layout
1029 #[stable(feature = "box_raw", since = "1.4.0")]
1030 #[inline]
1031 #[must_use = "call `drop(Box::from_raw(ptr))` if you intend to drop the `Box`"]
1032 pub unsafe fn from_raw(raw: *mut T) -> Self {
1033 unsafe { Self::from_raw_in(raw, Global) }
1034 }
1035
1036 /// Constructs a box from a `NonNull` pointer.
1037 ///
1038 /// After calling this function, the `NonNull` pointer is owned by
1039 /// the resulting `Box`. Specifically, the `Box` destructor will call
1040 /// the destructor of `T` and free the allocated memory. For this
1041 /// to be safe, the memory must have been allocated in accordance
1042 /// with the [memory layout] used by `Box`.
1043 ///
1044 /// # Safety
1045 ///
1046 /// This function is unsafe because improper use may lead to
1047 /// memory problems. For example, a double-free may occur if the
1048 /// function is called twice on the same `NonNull` pointer.
1049 ///
1050 /// The non-null pointer must point to a block of memory allocated by the global allocator.
1051 ///
1052 /// The safety conditions are described in the [memory layout] section.
1053 ///
1054 /// # Examples
1055 ///
1056 /// Recreate a `Box` which was previously converted to a `NonNull`
1057 /// pointer using [`Box::into_non_null`]:
1058 /// ```
1059 /// #![feature(box_vec_non_null)]
1060 ///
1061 /// let x = Box::new(5);
1062 /// let non_null = Box::into_non_null(x);
1063 /// let x = unsafe { Box::from_non_null(non_null) };
1064 /// ```
1065 /// Manually create a `Box` from scratch by using the global allocator:
1066 /// ```
1067 /// #![feature(box_vec_non_null)]
1068 ///
1069 /// use std::alloc::{alloc, Layout};
1070 /// use std::ptr::NonNull;
1071 ///
1072 /// unsafe {
1073 /// let non_null = NonNull::new(alloc(Layout::new::<i32>()).cast::<i32>())
1074 /// .expect("allocation failed");
1075 /// // In general .write is required to avoid attempting to destruct
1076 /// // the (uninitialized) previous contents of `non_null`.
1077 /// non_null.write(5);
1078 /// let x = Box::from_non_null(non_null);
1079 /// }
1080 /// ```
1081 ///
1082 /// [memory layout]: self#memory-layout
1083 #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1084 #[inline]
1085 #[must_use = "call `drop(Box::from_non_null(ptr))` if you intend to drop the `Box`"]
1086 pub unsafe fn from_non_null(ptr: NonNull<T>) -> Self {
1087 unsafe { Self::from_raw(ptr.as_ptr()) }
1088 }
1089
1090 /// Consumes the `Box`, returning a wrapped raw pointer.
1091 ///
1092 /// The pointer will be properly aligned and non-null.
1093 ///
1094 /// After calling this function, the caller is responsible for the
1095 /// memory previously managed by the `Box`. In particular, the
1096 /// caller should properly destroy `T` and release the memory, taking
1097 /// into account the [memory layout] used by `Box`. The easiest way to
1098 /// do this is to convert the raw pointer back into a `Box` with the
1099 /// [`Box::from_raw`] function, allowing the `Box` destructor to perform
1100 /// the cleanup.
1101 ///
1102 /// Note: this is an associated function, which means that you have
1103 /// to call it as `Box::into_raw(b)` instead of `b.into_raw()`. This
1104 /// is so that there is no conflict with a method on the inner type.
1105 ///
1106 /// # Examples
1107 /// Converting the raw pointer back into a `Box` with [`Box::from_raw`]
1108 /// for automatic cleanup:
1109 /// ```
1110 /// let x = Box::new(String::from("Hello"));
1111 /// let ptr = Box::into_raw(x);
1112 /// let x = unsafe { Box::from_raw(ptr) };
1113 /// ```
1114 /// Manual cleanup by explicitly running the destructor and deallocating
1115 /// the memory:
1116 /// ```
1117 /// use std::alloc::{dealloc, Layout};
1118 /// use std::ptr;
1119 ///
1120 /// let x = Box::new(String::from("Hello"));
1121 /// let ptr = Box::into_raw(x);
1122 /// unsafe {
1123 /// ptr::drop_in_place(ptr);
1124 /// dealloc(ptr as *mut u8, Layout::new::<String>());
1125 /// }
1126 /// ```
1127 /// Note: This is equivalent to the following:
1128 /// ```
1129 /// let x = Box::new(String::from("Hello"));
1130 /// let ptr = Box::into_raw(x);
1131 /// unsafe {
1132 /// drop(Box::from_raw(ptr));
1133 /// }
1134 /// ```
1135 ///
1136 /// [memory layout]: self#memory-layout
1137 #[must_use = "losing the pointer will leak memory"]
1138 #[stable(feature = "box_raw", since = "1.4.0")]
1139 #[inline]
1140 pub fn into_raw(b: Self) -> *mut T {
1141 // Avoid `into_raw_with_allocator` as that interacts poorly with Miri's Stacked Borrows.
1142 let mut b = mem::ManuallyDrop::new(b);
1143 // We go through the built-in deref for `Box`, which is crucial for Miri to recognize this
1144 // operation for its alias tracking.
1145 &raw mut **b
1146 }
1147
1148 /// Consumes the `Box`, returning a wrapped `NonNull` pointer.
1149 ///
1150 /// The pointer will be properly aligned.
1151 ///
1152 /// After calling this function, the caller is responsible for the
1153 /// memory previously managed by the `Box`. In particular, the
1154 /// caller should properly destroy `T` and release the memory, taking
1155 /// into account the [memory layout] used by `Box`. The easiest way to
1156 /// do this is to convert the `NonNull` pointer back into a `Box` with the
1157 /// [`Box::from_non_null`] function, allowing the `Box` destructor to
1158 /// perform the cleanup.
1159 ///
1160 /// Note: this is an associated function, which means that you have
1161 /// to call it as `Box::into_non_null(b)` instead of `b.into_non_null()`.
1162 /// This is so that there is no conflict with a method on the inner type.
1163 ///
1164 /// # Examples
1165 /// Converting the `NonNull` pointer back into a `Box` with [`Box::from_non_null`]
1166 /// for automatic cleanup:
1167 /// ```
1168 /// #![feature(box_vec_non_null)]
1169 ///
1170 /// let x = Box::new(String::from("Hello"));
1171 /// let non_null = Box::into_non_null(x);
1172 /// let x = unsafe { Box::from_non_null(non_null) };
1173 /// ```
1174 /// Manual cleanup by explicitly running the destructor and deallocating
1175 /// the memory:
1176 /// ```
1177 /// #![feature(box_vec_non_null)]
1178 ///
1179 /// use std::alloc::{dealloc, Layout};
1180 ///
1181 /// let x = Box::new(String::from("Hello"));
1182 /// let non_null = Box::into_non_null(x);
1183 /// unsafe {
1184 /// non_null.drop_in_place();
1185 /// dealloc(non_null.as_ptr().cast::<u8>(), Layout::new::<String>());
1186 /// }
1187 /// ```
1188 /// Note: This is equivalent to the following:
1189 /// ```
1190 /// #![feature(box_vec_non_null)]
1191 ///
1192 /// let x = Box::new(String::from("Hello"));
1193 /// let non_null = Box::into_non_null(x);
1194 /// unsafe {
1195 /// drop(Box::from_non_null(non_null));
1196 /// }
1197 /// ```
1198 ///
1199 /// [memory layout]: self#memory-layout
1200 #[must_use = "losing the pointer will leak memory"]
1201 #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1202 #[inline]
1203 pub fn into_non_null(b: Self) -> NonNull<T> {
1204 // SAFETY: `Box` is guaranteed to be non-null.
1205 unsafe { NonNull::new_unchecked(Self::into_raw(b)) }
1206 }
1207}
1208
1209impl<T: ?Sized, A: Allocator> Box<T, A> {
1210 /// Constructs a box from a raw pointer in the given allocator.
1211 ///
1212 /// After calling this function, the raw pointer is owned by the
1213 /// resulting `Box`. Specifically, the `Box` destructor will call
1214 /// the destructor of `T` and free the allocated memory. For this
1215 /// to be safe, the memory must have been allocated in accordance
1216 /// with the [memory layout] used by `Box`.
1217 ///
1218 /// # Safety
1219 ///
1220 /// This function is unsafe because improper use may lead to
1221 /// memory problems. For example, a double-free may occur if the
1222 /// function is called twice on the same raw pointer.
1223 ///
1224 /// The raw pointer must point to a block of memory allocated by `alloc`.
1225 ///
1226 /// # Examples
1227 ///
1228 /// Recreate a `Box` which was previously converted to a raw pointer
1229 /// using [`Box::into_raw_with_allocator`]:
1230 /// ```
1231 /// #![feature(allocator_api)]
1232 ///
1233 /// use std::alloc::System;
1234 ///
1235 /// let x = Box::new_in(5, System);
1236 /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1237 /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1238 /// ```
1239 /// Manually create a `Box` from scratch by using the system allocator:
1240 /// ```
1241 /// #![feature(allocator_api, slice_ptr_get)]
1242 ///
1243 /// use std::alloc::{Allocator, Layout, System};
1244 ///
1245 /// unsafe {
1246 /// let ptr = System.allocate(Layout::new::<i32>())?.as_mut_ptr() as *mut i32;
1247 /// // In general .write is required to avoid attempting to destruct
1248 /// // the (uninitialized) previous contents of `ptr`, though for this
1249 /// // simple example `*ptr = 5` would have worked as well.
1250 /// ptr.write(5);
1251 /// let x = Box::from_raw_in(ptr, System);
1252 /// }
1253 /// # Ok::<(), std::alloc::AllocError>(())
1254 /// ```
1255 ///
1256 /// [memory layout]: self#memory-layout
1257 #[unstable(feature = "allocator_api", issue = "32838")]
1258 #[inline]
1259 pub unsafe fn from_raw_in(raw: *mut T, alloc: A) -> Self {
1260 Box(unsafe { Unique::new_unchecked(raw) }, alloc)
1261 }
1262
1263 /// Constructs a box from a `NonNull` pointer in the given allocator.
1264 ///
1265 /// After calling this function, the `NonNull` pointer is owned by
1266 /// the resulting `Box`. Specifically, the `Box` destructor will call
1267 /// the destructor of `T` and free the allocated memory. For this
1268 /// to be safe, the memory must have been allocated in accordance
1269 /// with the [memory layout] used by `Box`.
1270 ///
1271 /// # Safety
1272 ///
1273 /// This function is unsafe because improper use may lead to
1274 /// memory problems. For example, a double-free may occur if the
1275 /// function is called twice on the same raw pointer.
1276 ///
1277 /// The non-null pointer must point to a block of memory allocated by `alloc`.
1278 ///
1279 /// # Examples
1280 ///
1281 /// Recreate a `Box` which was previously converted to a `NonNull` pointer
1282 /// using [`Box::into_non_null_with_allocator`]:
1283 /// ```
1284 /// #![feature(allocator_api, box_vec_non_null)]
1285 ///
1286 /// use std::alloc::System;
1287 ///
1288 /// let x = Box::new_in(5, System);
1289 /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1290 /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1291 /// ```
1292 /// Manually create a `Box` from scratch by using the system allocator:
1293 /// ```
1294 /// #![feature(allocator_api, box_vec_non_null, slice_ptr_get)]
1295 ///
1296 /// use std::alloc::{Allocator, Layout, System};
1297 ///
1298 /// unsafe {
1299 /// let non_null = System.allocate(Layout::new::<i32>())?.cast::<i32>();
1300 /// // In general .write is required to avoid attempting to destruct
1301 /// // the (uninitialized) previous contents of `non_null`.
1302 /// non_null.write(5);
1303 /// let x = Box::from_non_null_in(non_null, System);
1304 /// }
1305 /// # Ok::<(), std::alloc::AllocError>(())
1306 /// ```
1307 ///
1308 /// [memory layout]: self#memory-layout
1309 #[unstable(feature = "allocator_api", issue = "32838")]
1310 // #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1311 #[inline]
1312 pub unsafe fn from_non_null_in(raw: NonNull<T>, alloc: A) -> Self {
1313 // SAFETY: guaranteed by the caller.
1314 unsafe { Box::from_raw_in(raw.as_ptr(), alloc) }
1315 }
1316
1317 /// Consumes the `Box`, returning a wrapped raw pointer and the allocator.
1318 ///
1319 /// The pointer will be properly aligned and non-null.
1320 ///
1321 /// After calling this function, the caller is responsible for the
1322 /// memory previously managed by the `Box`. In particular, the
1323 /// caller should properly destroy `T` and release the memory, taking
1324 /// into account the [memory layout] used by `Box`. The easiest way to
1325 /// do this is to convert the raw pointer back into a `Box` with the
1326 /// [`Box::from_raw_in`] function, allowing the `Box` destructor to perform
1327 /// the cleanup.
1328 ///
1329 /// Note: this is an associated function, which means that you have
1330 /// to call it as `Box::into_raw_with_allocator(b)` instead of `b.into_raw_with_allocator()`. This
1331 /// is so that there is no conflict with a method on the inner type.
1332 ///
1333 /// # Examples
1334 /// Converting the raw pointer back into a `Box` with [`Box::from_raw_in`]
1335 /// for automatic cleanup:
1336 /// ```
1337 /// #![feature(allocator_api)]
1338 ///
1339 /// use std::alloc::System;
1340 ///
1341 /// let x = Box::new_in(String::from("Hello"), System);
1342 /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1343 /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1344 /// ```
1345 /// Manual cleanup by explicitly running the destructor and deallocating
1346 /// the memory:
1347 /// ```
1348 /// #![feature(allocator_api)]
1349 ///
1350 /// use std::alloc::{Allocator, Layout, System};
1351 /// use std::ptr::{self, NonNull};
1352 ///
1353 /// let x = Box::new_in(String::from("Hello"), System);
1354 /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1355 /// unsafe {
1356 /// ptr::drop_in_place(ptr);
1357 /// let non_null = NonNull::new_unchecked(ptr);
1358 /// alloc.deallocate(non_null.cast(), Layout::new::<String>());
1359 /// }
1360 /// ```
1361 ///
1362 /// [memory layout]: self#memory-layout
1363 #[must_use = "losing the pointer will leak memory"]
1364 #[unstable(feature = "allocator_api", issue = "32838")]
1365 #[inline]
1366 pub fn into_raw_with_allocator(b: Self) -> (*mut T, A) {
1367 let mut b = mem::ManuallyDrop::new(b);
1368 // We carefully get the raw pointer out in a way that Miri's aliasing model understands what
1369 // is happening: using the primitive "deref" of `Box`. In case `A` is *not* `Global`, we
1370 // want *no* aliasing requirements here!
1371 // In case `A` *is* `Global`, this does not quite have the right behavior; `into_raw`
1372 // works around that.
1373 let ptr = &raw mut **b;
1374 let alloc = unsafe { ptr::read(&b.1) };
1375 (ptr, alloc)
1376 }
1377
1378 /// Consumes the `Box`, returning a wrapped `NonNull` pointer and the allocator.
1379 ///
1380 /// The pointer will be properly aligned.
1381 ///
1382 /// After calling this function, the caller is responsible for the
1383 /// memory previously managed by the `Box`. In particular, the
1384 /// caller should properly destroy `T` and release the memory, taking
1385 /// into account the [memory layout] used by `Box`. The easiest way to
1386 /// do this is to convert the `NonNull` pointer back into a `Box` with the
1387 /// [`Box::from_non_null_in`] function, allowing the `Box` destructor to
1388 /// perform the cleanup.
1389 ///
1390 /// Note: this is an associated function, which means that you have
1391 /// to call it as `Box::into_non_null_with_allocator(b)` instead of
1392 /// `b.into_non_null_with_allocator()`. This is so that there is no
1393 /// conflict with a method on the inner type.
1394 ///
1395 /// # Examples
1396 /// Converting the `NonNull` pointer back into a `Box` with
1397 /// [`Box::from_non_null_in`] for automatic cleanup:
1398 /// ```
1399 /// #![feature(allocator_api, box_vec_non_null)]
1400 ///
1401 /// use std::alloc::System;
1402 ///
1403 /// let x = Box::new_in(String::from("Hello"), System);
1404 /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1405 /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1406 /// ```
1407 /// Manual cleanup by explicitly running the destructor and deallocating
1408 /// the memory:
1409 /// ```
1410 /// #![feature(allocator_api, box_vec_non_null)]
1411 ///
1412 /// use std::alloc::{Allocator, Layout, System};
1413 ///
1414 /// let x = Box::new_in(String::from("Hello"), System);
1415 /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1416 /// unsafe {
1417 /// non_null.drop_in_place();
1418 /// alloc.deallocate(non_null.cast::<u8>(), Layout::new::<String>());
1419 /// }
1420 /// ```
1421 ///
1422 /// [memory layout]: self#memory-layout
1423 #[must_use = "losing the pointer will leak memory"]
1424 #[unstable(feature = "allocator_api", issue = "32838")]
1425 // #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1426 #[inline]
1427 pub fn into_non_null_with_allocator(b: Self) -> (NonNull<T>, A) {
1428 let (ptr, alloc) = Box::into_raw_with_allocator(b);
1429 // SAFETY: `Box` is guaranteed to be non-null.
1430 unsafe { (NonNull::new_unchecked(ptr), alloc) }
1431 }
1432
1433 #[unstable(
1434 feature = "ptr_internals",
1435 issue = "none",
1436 reason = "use `Box::leak(b).into()` or `Unique::from(Box::leak(b))` instead"
1437 )]
1438 #[inline]
1439 #[doc(hidden)]
1440 pub fn into_unique(b: Self) -> (Unique<T>, A) {
1441 let (ptr, alloc) = Box::into_raw_with_allocator(b);
1442 unsafe { (Unique::from(&mut *ptr), alloc) }
1443 }
1444
1445 /// Returns a raw mutable pointer to the `Box`'s contents.
1446 ///
1447 /// The caller must ensure that the `Box` outlives the pointer this
1448 /// function returns, or else it will end up dangling.
1449 ///
1450 /// This method guarantees that for the purpose of the aliasing model, this method
1451 /// does not materialize a reference to the underlying memory, and thus the returned pointer
1452 /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
1453 /// Note that calling other methods that materialize references to the memory
1454 /// may still invalidate this pointer.
1455 /// See the example below for how this guarantee can be used.
1456 ///
1457 /// # Examples
1458 ///
1459 /// Due to the aliasing guarantee, the following code is legal:
1460 ///
1461 /// ```rust
1462 /// #![feature(box_as_ptr)]
1463 ///
1464 /// unsafe {
1465 /// let mut b = Box::new(0);
1466 /// let ptr1 = Box::as_mut_ptr(&mut b);
1467 /// ptr1.write(1);
1468 /// let ptr2 = Box::as_mut_ptr(&mut b);
1469 /// ptr2.write(2);
1470 /// // Notably, the write to `ptr2` did *not* invalidate `ptr1`:
1471 /// ptr1.write(3);
1472 /// }
1473 /// ```
1474 ///
1475 /// [`as_mut_ptr`]: Self::as_mut_ptr
1476 /// [`as_ptr`]: Self::as_ptr
1477 #[unstable(feature = "box_as_ptr", issue = "129090")]
1478 #[rustc_never_returns_null_ptr]
1479 #[rustc_as_ptr]
1480 #[inline]
1481 pub fn as_mut_ptr(b: &mut Self) -> *mut T {
1482 // This is a primitive deref, not going through `DerefMut`, and therefore not materializing
1483 // any references.
1484 &raw mut **b
1485 }
1486
1487 /// Returns a raw pointer to the `Box`'s contents.
1488 ///
1489 /// The caller must ensure that the `Box` outlives the pointer this
1490 /// function returns, or else it will end up dangling.
1491 ///
1492 /// The caller must also ensure that the memory the pointer (non-transitively) points to
1493 /// is never written to (except inside an `UnsafeCell`) using this pointer or any pointer
1494 /// derived from it. If you need to mutate the contents of the `Box`, use [`as_mut_ptr`].
1495 ///
1496 /// This method guarantees that for the purpose of the aliasing model, this method
1497 /// does not materialize a reference to the underlying memory, and thus the returned pointer
1498 /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
1499 /// Note that calling other methods that materialize mutable references to the memory,
1500 /// as well as writing to this memory, may still invalidate this pointer.
1501 /// See the example below for how this guarantee can be used.
1502 ///
1503 /// # Examples
1504 ///
1505 /// Due to the aliasing guarantee, the following code is legal:
1506 ///
1507 /// ```rust
1508 /// #![feature(box_as_ptr)]
1509 ///
1510 /// unsafe {
1511 /// let mut v = Box::new(0);
1512 /// let ptr1 = Box::as_ptr(&v);
1513 /// let ptr2 = Box::as_mut_ptr(&mut v);
1514 /// let _val = ptr2.read();
1515 /// // No write to this memory has happened yet, so `ptr1` is still valid.
1516 /// let _val = ptr1.read();
1517 /// // However, once we do a write...
1518 /// ptr2.write(1);
1519 /// // ... `ptr1` is no longer valid.
1520 /// // This would be UB: let _val = ptr1.read();
1521 /// }
1522 /// ```
1523 ///
1524 /// [`as_mut_ptr`]: Self::as_mut_ptr
1525 /// [`as_ptr`]: Self::as_ptr
1526 #[unstable(feature = "box_as_ptr", issue = "129090")]
1527 #[rustc_never_returns_null_ptr]
1528 #[rustc_as_ptr]
1529 #[inline]
1530 pub fn as_ptr(b: &Self) -> *const T {
1531 // This is a primitive deref, not going through `DerefMut`, and therefore not materializing
1532 // any references.
1533 &raw const **b
1534 }
1535
1536 /// Returns a reference to the underlying allocator.
1537 ///
1538 /// Note: this is an associated function, which means that you have
1539 /// to call it as `Box::allocator(&b)` instead of `b.allocator()`. This
1540 /// is so that there is no conflict with a method on the inner type.
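///
/// # Examples
///
/// A minimal sketch:
///
/// ```
/// #![feature(allocator_api)]
///
/// use std::alloc::System;
///
/// let b = Box::new_in(5, System);
/// // Note the associated-function call syntax.
/// let _alloc: &System = Box::allocator(&b);
/// ```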
1541 #[unstable(feature = "allocator_api", issue = "32838")]
1542 #[inline]
1543 pub fn allocator(b: &Self) -> &A {
1544 &b.1
1545 }
1546
1547 /// Consumes and leaks the `Box`, returning a mutable reference,
1548 /// `&'a mut T`.
1549 ///
1550 /// Note that the type `T` must outlive the chosen lifetime `'a`. If the type
1551 /// has only static references, or none at all, then this may be chosen to be
1552 /// `'static`.
1553 ///
1554 /// This function is mainly useful for data that lives for the remainder of
1555 /// the program's life. Dropping the returned reference will cause a memory
1556 /// leak. If this is not acceptable, the reference should first be wrapped
1557 /// with the [`Box::from_raw`] function producing a `Box`. This `Box` can
1558 /// then be dropped which will properly destroy `T` and release the
1559 /// allocated memory.
1560 ///
1561 /// Note: this is an associated function, which means that you have
1562 /// to call it as `Box::leak(b)` instead of `b.leak()`. This
1563 /// is so that there is no conflict with a method on the inner type.
1564 ///
1565 /// # Examples
1566 ///
1567 /// Simple usage:
1568 ///
1569 /// ```
1570 /// let x = Box::new(41);
1571 /// let static_ref: &'static mut usize = Box::leak(x);
1572 /// *static_ref += 1;
1573 /// assert_eq!(*static_ref, 42);
1574 /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
1575 /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
1576 /// # drop(unsafe { Box::from_raw(static_ref) });
1577 /// ```
1578 ///
1579 /// Unsized data:
1580 ///
1581 /// ```
1582 /// let x = vec![1, 2, 3].into_boxed_slice();
1583 /// let static_ref = Box::leak(x);
1584 /// static_ref[0] = 4;
1585 /// assert_eq!(*static_ref, [4, 2, 3]);
1586 /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
1587 /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
1588 /// # drop(unsafe { Box::from_raw(static_ref) });
1589 /// ```
1590 #[stable(feature = "box_leak", since = "1.26.0")]
1591 #[inline]
1592 pub fn leak<'a>(b: Self) -> &'a mut T
1593 where
1594 A: 'a,
1595 {
1596 let (ptr, alloc) = Box::into_raw_with_allocator(b);
1597 mem::forget(alloc);
1598 unsafe { &mut *ptr }
1599 }
1600
1601 /// Converts a `Box<T>` into a `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
1602 /// `*boxed` will be pinned in memory and unable to be moved.
1603 ///
1604 /// This conversion does not allocate on the heap and happens in place.
1605 ///
1606 /// This is also available via [`From`].
1607 ///
1608 /// Constructing and pinning a `Box` with <code>Box::into_pin([Box::new]\(x))</code>
1609 /// can also be written more concisely using <code>[Box::pin]\(x)</code>.
1610 /// This `into_pin` method is useful if you already have a `Box<T>`, or you are
1611 /// constructing a (pinned) `Box` in a different way than with [`Box::new`].
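    ///
    /// # Examples
    ///
    /// A minimal sketch of pinning an existing `Box` in place:
    ///
    /// ```
    /// use std::pin::Pin;
    ///
    /// let boxed = Box::new(5);
    /// // No new allocation happens here; the existing box is pinned as-is.
    /// let pinned: Pin<Box<i32>> = Box::into_pin(boxed);
    /// assert_eq!(*pinned, 5);
    /// ```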
1612 ///
1613 /// # Notes
1614 ///
1615 /// It's not recommended that crates add an impl like `From<Box<T>> for Pin<T>`,
1616 /// as it'll introduce an ambiguity when calling `Pin::from`.
1617 /// A demonstration of such a poor impl is shown below.
1618 ///
1619 /// ```compile_fail
1620 /// # use std::pin::Pin;
1621 /// struct Foo; // A type defined in this crate.
1622 /// impl From<Box<()>> for Pin<Foo> {
1623 /// fn from(_: Box<()>) -> Pin<Foo> {
1624 /// Pin::new(Foo)
1625 /// }
1626 /// }
1627 ///
1628 /// let foo = Box::new(());
1629 /// let bar = Pin::from(foo);
1630 /// ```
1631 #[stable(feature = "box_into_pin", since = "1.63.0")]
1632 pub fn into_pin(boxed: Self) -> Pin<Self>
1633 where
1634 A: 'static,
1635 {
1636 // It's not possible to move or replace the insides of a `Pin<Box<T>>`
1637 // when `T: !Unpin`, so it's safe to pin it directly without any
1638 // additional requirements.
1639 unsafe { Pin::new_unchecked(boxed) }
1640 }
1641}
1642
1643#[stable(feature = "rust1", since = "1.0.0")]
1644unsafe impl<#[may_dangle] T: ?Sized, A: Allocator> Drop for Box<T, A> {
1645 #[inline]
1646 fn drop(&mut self) {
1647 // the T in the Box is dropped by the compiler before the destructor is run
1648
1649 let ptr = self.0;
1650
1651 unsafe {
1652 let layout = Layout::for_value_raw(ptr.as_ptr());
1653 if layout.size() != 0 {
1654 self.1.deallocate(From::from(ptr.cast()), layout);
1655 }
1656 }
1657 }
1658}
1659
1660#[cfg(not(no_global_oom_handling))]
1661#[stable(feature = "rust1", since = "1.0.0")]
1662impl<T: Default> Default for Box<T> {
    /// Creates a `Box<T>` with the `Default` value for `T`.
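    ///
    /// # Examples
    ///
    /// A minimal sketch of typical usage:
    ///
    /// ```
    /// // `i32::default()` is `0`, boxed on the heap.
    /// let b: Box<i32> = Box::default();
    /// assert_eq!(*b, 0);
    /// ```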
1664 #[inline]
1665 fn default() -> Self {
1666 let mut x: Box<mem::MaybeUninit<T>> = Box::new_uninit();
1667 unsafe {
1668 // SAFETY: `x` is valid for writing and has the same layout as `T`.
1669 // If `T::default()` panics, dropping `x` will just deallocate the Box as `MaybeUninit<T>`
1670 // does not have a destructor.
1671 //
1672 // We use `ptr::write` as `MaybeUninit::write` creates
1673 // extra stack copies of `T` in debug mode.
1674 //
1675 // See https://github.com/rust-lang/rust/issues/136043 for more context.
1676 ptr::write(&raw mut *x as *mut T, T::default());
1677 // SAFETY: `x` was just initialized above.
1678 x.assume_init()
1679 }
1680 }
1681}
1682
1683#[cfg(not(no_global_oom_handling))]
1684#[stable(feature = "rust1", since = "1.0.0")]
1685impl<T> Default for Box<[T]> {
1686 /// Creates an empty `[T]` inside a `Box`.
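    ///
    /// # Examples
    ///
    /// A minimal sketch of typical usage; the default boxed slice is empty:
    ///
    /// ```
    /// let b: Box<[i32]> = Box::default();
    /// assert!(b.is_empty());
    /// ```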
1687 #[inline]
1688 fn default() -> Self {
1689 let ptr: Unique<[T]> = Unique::<[T; 0]>::dangling();
1690 Box(ptr, Global)
1691 }
1692}
1693
1694#[cfg(not(no_global_oom_handling))]
1695#[stable(feature = "default_box_extra", since = "1.17.0")]
1696impl Default for Box<str> {
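    /// Creates an empty `str` inside a `Box`.
    ///
    /// A minimal sketch of typical usage:
    ///
    /// ```
    /// let s: Box<str> = Box::default();
    /// assert!(s.is_empty());
    /// ```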
1697 #[inline]
1698 fn default() -> Self {
1699 // SAFETY: This is the same as `Unique::cast<U>` but with an unsized `U = str`.
1700 let ptr: Unique<str> = unsafe {
1701 let bytes: Unique<[u8]> = Unique::<[u8; 0]>::dangling();
1702 Unique::new_unchecked(bytes.as_ptr() as *mut str)
1703 };
1704 Box(ptr, Global)
1705 }
1706}
1707
1708#[cfg(not(no_global_oom_handling))]
1709#[stable(feature = "pin_default_impls", since = "CURRENT_RUSTC_VERSION")]
1710impl<T> Default for Pin<Box<T>>
1711where
1712 T: ?Sized,
1713 Box<T>: Default,
1714{
1715 #[inline]
1716 fn default() -> Self {
1717 Box::into_pin(Box::<T>::default())
1718 }
1719}
1720
1721#[cfg(not(no_global_oom_handling))]
1722#[stable(feature = "rust1", since = "1.0.0")]
1723impl<T: Clone, A: Allocator + Clone> Clone for Box<T, A> {
1724 /// Returns a new box with a `clone()` of this box's contents.
1725 ///
1726 /// # Examples
1727 ///
1728 /// ```
1729 /// let x = Box::new(5);
1730 /// let y = x.clone();
1731 ///
1732 /// // The value is the same
1733 /// assert_eq!(x, y);
1734 ///
1735 /// // But they are unique objects
1736 /// assert_ne!(&*x as *const i32, &*y as *const i32);
1737 /// ```
1738 #[inline]
1739 fn clone(&self) -> Self {
1740 // Pre-allocate memory to allow writing the cloned value directly.
1741 let mut boxed = Self::new_uninit_in(self.1.clone());
1742 unsafe {
1743 (**self).clone_to_uninit(boxed.as_mut_ptr().cast());
1744 boxed.assume_init()
1745 }
1746 }
1747
1748 /// Copies `source`'s contents into `self` without creating a new allocation.
1749 ///
1750 /// # Examples
1751 ///
1752 /// ```
1753 /// let x = Box::new(5);
1754 /// let mut y = Box::new(10);
1755 /// let yp: *const i32 = &*y;
1756 ///
1757 /// y.clone_from(&x);
1758 ///
1759 /// // The value is the same
1760 /// assert_eq!(x, y);
1761 ///
1762 /// // And no allocation occurred
1763 /// assert_eq!(yp, &*y);
1764 /// ```
1765 #[inline]
1766 fn clone_from(&mut self, source: &Self) {
1767 (**self).clone_from(&(**source));
1768 }
1769}
1770
1771#[cfg(not(no_global_oom_handling))]
1772#[stable(feature = "box_slice_clone", since = "1.3.0")]
1773impl<T: Clone, A: Allocator + Clone> Clone for Box<[T], A> {
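    /// Returns a new boxed slice containing a clone of this slice's contents.
    ///
    /// A minimal sketch of typical usage:
    ///
    /// ```
    /// let x: Box<[i32]> = Box::new([1, 2, 3]);
    /// let y = x.clone();
    ///
    /// // The values are the same, but the allocations are distinct.
    /// assert_eq!(x, y);
    /// assert_ne!(x.as_ptr(), y.as_ptr());
    /// ```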
1774 fn clone(&self) -> Self {
1775 let alloc = Box::allocator(self).clone();
1776 self.to_vec_in(alloc).into_boxed_slice()
1777 }
1778
1779 /// Copies `source`'s contents into `self` without creating a new allocation,
1780 /// so long as the two are of the same length.
1781 ///
1782 /// # Examples
1783 ///
1784 /// ```
1785 /// let x = Box::new([5, 6, 7]);
1786 /// let mut y = Box::new([8, 9, 10]);
1787 /// let yp: *const [i32] = &*y;
1788 ///
1789 /// y.clone_from(&x);
1790 ///
1791 /// // The value is the same
1792 /// assert_eq!(x, y);
1793 ///
1794 /// // And no allocation occurred
1795 /// assert_eq!(yp, &*y);
1796 /// ```
1797 fn clone_from(&mut self, source: &Self) {
1798 if self.len() == source.len() {
1799 self.clone_from_slice(&source);
1800 } else {
1801 *self = source.clone();
1802 }
1803 }
1804}
1805
1806#[cfg(not(no_global_oom_handling))]
1807#[stable(feature = "box_slice_clone", since = "1.3.0")]
1808impl Clone for Box<str> {
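    /// Returns a new boxed `str` containing a copy of this string's contents.
    ///
    /// A minimal sketch of typical usage:
    ///
    /// ```
    /// let s: Box<str> = "hello".into();
    /// let t = s.clone();
    ///
    /// assert_eq!(s, t);
    /// ```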
1809 fn clone(&self) -> Self {
1810 // this makes a copy of the data
1811 let buf: Box<[u8]> = self.as_bytes().into();
1812 unsafe { from_boxed_utf8_unchecked(buf) }
1813 }
1814}
1815
1816#[stable(feature = "rust1", since = "1.0.0")]
1817impl<T: ?Sized + PartialEq, A: Allocator> PartialEq for Box<T, A> {
1818 #[inline]
1819 fn eq(&self, other: &Self) -> bool {
1820 PartialEq::eq(&**self, &**other)
1821 }
1822 #[inline]
1823 fn ne(&self, other: &Self) -> bool {
1824 PartialEq::ne(&**self, &**other)
1825 }
1826}
1827
1828#[stable(feature = "rust1", since = "1.0.0")]
1829impl<T: ?Sized + PartialOrd, A: Allocator> PartialOrd for Box<T, A> {
1830 #[inline]
1831 fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
1832 PartialOrd::partial_cmp(&**self, &**other)
1833 }
1834 #[inline]
1835 fn lt(&self, other: &Self) -> bool {
1836 PartialOrd::lt(&**self, &**other)
1837 }
1838 #[inline]
1839 fn le(&self, other: &Self) -> bool {
1840 PartialOrd::le(&**self, &**other)
1841 }
1842 #[inline]
1843 fn ge(&self, other: &Self) -> bool {
1844 PartialOrd::ge(&**self, &**other)
1845 }
1846 #[inline]
1847 fn gt(&self, other: &Self) -> bool {
1848 PartialOrd::gt(&**self, &**other)
1849 }
1850}
1851
1852#[stable(feature = "rust1", since = "1.0.0")]
1853impl<T: ?Sized + Ord, A: Allocator> Ord for Box<T, A> {
1854 #[inline]
1855 fn cmp(&self, other: &Self) -> Ordering {
1856 Ord::cmp(&**self, &**other)
1857 }
1858}
1859
1860#[stable(feature = "rust1", since = "1.0.0")]
1861impl<T: ?Sized + Eq, A: Allocator> Eq for Box<T, A> {}
1862
1863#[stable(feature = "rust1", since = "1.0.0")]
1864impl<T: ?Sized + Hash, A: Allocator> Hash for Box<T, A> {
1865 fn hash<H: Hasher>(&self, state: &mut H) {
1866 (**self).hash(state);
1867 }
1868}
1869
1870#[stable(feature = "indirect_hasher_impl", since = "1.22.0")]
1871impl<T: ?Sized + Hasher, A: Allocator> Hasher for Box<T, A> {
1872 fn finish(&self) -> u64 {
1873 (**self).finish()
1874 }
1875 fn write(&mut self, bytes: &[u8]) {
1876 (**self).write(bytes)
1877 }
1878 fn write_u8(&mut self, i: u8) {
1879 (**self).write_u8(i)
1880 }
1881 fn write_u16(&mut self, i: u16) {
1882 (**self).write_u16(i)
1883 }
1884 fn write_u32(&mut self, i: u32) {
1885 (**self).write_u32(i)
1886 }
1887 fn write_u64(&mut self, i: u64) {
1888 (**self).write_u64(i)
1889 }
1890 fn write_u128(&mut self, i: u128) {
1891 (**self).write_u128(i)
1892 }
1893 fn write_usize(&mut self, i: usize) {
1894 (**self).write_usize(i)
1895 }
1896 fn write_i8(&mut self, i: i8) {
1897 (**self).write_i8(i)
1898 }
1899 fn write_i16(&mut self, i: i16) {
1900 (**self).write_i16(i)
1901 }
1902 fn write_i32(&mut self, i: i32) {
1903 (**self).write_i32(i)
1904 }
1905 fn write_i64(&mut self, i: i64) {
1906 (**self).write_i64(i)
1907 }
1908 fn write_i128(&mut self, i: i128) {
1909 (**self).write_i128(i)
1910 }
1911 fn write_isize(&mut self, i: isize) {
1912 (**self).write_isize(i)
1913 }
1914 fn write_length_prefix(&mut self, len: usize) {
1915 (**self).write_length_prefix(len)
1916 }
1917 fn write_str(&mut self, s: &str) {
1918 (**self).write_str(s)
1919 }
1920}
1921
1922#[stable(feature = "rust1", since = "1.0.0")]
1923impl<T: fmt::Display + ?Sized, A: Allocator> fmt::Display for Box<T, A> {
1924 fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
1925 fmt::Display::fmt(&**self, f)
1926 }
1927}
1928
1929#[stable(feature = "rust1", since = "1.0.0")]
1930impl<T: fmt::Debug + ?Sized, A: Allocator> fmt::Debug for Box<T, A> {
1931 fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
1932 fmt::Debug::fmt(&**self, f)
1933 }
1934}
1935
1936#[stable(feature = "rust1", since = "1.0.0")]
1937impl<T: ?Sized, A: Allocator> fmt::Pointer for Box<T, A> {
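    /// Formats the address of the heap allocation this `Box` points to.
    ///
    /// A minimal sketch of typical usage:
    ///
    /// ```
    /// let b = Box::new(5);
    /// let formatted = format!("{:p}", b);
    /// assert!(formatted.starts_with("0x"));
    /// ```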
1938 fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        // It's not possible to extract the inner Unique directly from the Box;
        // instead we cast it to a `*const T`, which aliases the Unique.
1941 let ptr: *const T = &**self;
1942 fmt::Pointer::fmt(&ptr, f)
1943 }
1944}
1945
1946#[stable(feature = "rust1", since = "1.0.0")]
1947impl<T: ?Sized, A: Allocator> Deref for Box<T, A> {
1948 type Target = T;
1949
1950 fn deref(&self) -> &T {
1951 &**self
1952 }
1953}
1954
1955#[stable(feature = "rust1", since = "1.0.0")]
1956impl<T: ?Sized, A: Allocator> DerefMut for Box<T, A> {
1957 fn deref_mut(&mut self) -> &mut T {
1958 &mut **self
1959 }
1960}
1961
1962#[unstable(feature = "deref_pure_trait", issue = "87121")]
1963unsafe impl<T: ?Sized, A: Allocator> DerefPure for Box<T, A> {}
1964
1965#[unstable(feature = "legacy_receiver_trait", issue = "none")]
1966impl<T: ?Sized, A: Allocator> LegacyReceiver for Box<T, A> {}
1967
1968#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
1969impl<Args: Tuple, F: FnOnce<Args> + ?Sized, A: Allocator> FnOnce<Args> for Box<F, A> {
1970 type Output = <F as FnOnce<Args>>::Output;
1971
1972 extern "rust-call" fn call_once(self, args: Args) -> Self::Output {
1973 <F as FnOnce<Args>>::call_once(*self, args)
1974 }
1975}
1976
1977#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
1978impl<Args: Tuple, F: FnMut<Args> + ?Sized, A: Allocator> FnMut<Args> for Box<F, A> {
1979 extern "rust-call" fn call_mut(&mut self, args: Args) -> Self::Output {
1980 <F as FnMut<Args>>::call_mut(self, args)
1981 }
1982}
1983
1984#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
1985impl<Args: Tuple, F: Fn<Args> + ?Sized, A: Allocator> Fn<Args> for Box<F, A> {
1986 extern "rust-call" fn call(&self, args: Args) -> Self::Output {
1987 <F as Fn<Args>>::call(self, args)
1988 }
1989}
1990
1991#[stable(feature = "async_closure", since = "1.85.0")]
1992impl<Args: Tuple, F: AsyncFnOnce<Args> + ?Sized, A: Allocator> AsyncFnOnce<Args> for Box<F, A> {
1993 type Output = F::Output;
1994 type CallOnceFuture = F::CallOnceFuture;
1995
1996 extern "rust-call" fn async_call_once(self, args: Args) -> Self::CallOnceFuture {
1997 F::async_call_once(*self, args)
1998 }
1999}
2000
2001#[stable(feature = "async_closure", since = "1.85.0")]
2002impl<Args: Tuple, F: AsyncFnMut<Args> + ?Sized, A: Allocator> AsyncFnMut<Args> for Box<F, A> {
2003 type CallRefFuture<'a>
2004 = F::CallRefFuture<'a>
2005 where
2006 Self: 'a;
2007
2008 extern "rust-call" fn async_call_mut(&mut self, args: Args) -> Self::CallRefFuture<'_> {
2009 F::async_call_mut(self, args)
2010 }
2011}
2012
2013#[stable(feature = "async_closure", since = "1.85.0")]
2014impl<Args: Tuple, F: AsyncFn<Args> + ?Sized, A: Allocator> AsyncFn<Args> for Box<F, A> {
2015 extern "rust-call" fn async_call(&self, args: Args) -> Self::CallRefFuture<'_> {
2016 F::async_call(self, args)
2017 }
2018}
2019
2020#[unstable(feature = "coerce_unsized", issue = "18598")]
2021impl<T: ?Sized + Unsize<U>, U: ?Sized, A: Allocator> CoerceUnsized<Box<U, A>> for Box<T, A> {}
2022
2023#[unstable(feature = "pin_coerce_unsized_trait", issue = "123430")]
2024unsafe impl<T: ?Sized, A: Allocator> PinCoerceUnsized for Box<T, A> {}
2025
2026// It is quite crucial that we only allow the `Global` allocator here.
2027// Handling arbitrary custom allocators (which can affect the `Box` layout heavily!)
2028// would need a lot of codegen and interpreter adjustments.
2029#[unstable(feature = "dispatch_from_dyn", issue = "none")]
2030impl<T: ?Sized + Unsize<U>, U: ?Sized> DispatchFromDyn<Box<U>> for Box<T, Global> {}
2031
2032#[stable(feature = "box_borrow", since = "1.1.0")]
2033impl<T: ?Sized, A: Allocator> Borrow<T> for Box<T, A> {
2034 fn borrow(&self) -> &T {
2035 &**self
2036 }
2037}
2038
2039#[stable(feature = "box_borrow", since = "1.1.0")]
2040impl<T: ?Sized, A: Allocator> BorrowMut<T> for Box<T, A> {
2041 fn borrow_mut(&mut self) -> &mut T {
2042 &mut **self
2043 }
2044}
2045
2046#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
2047impl<T: ?Sized, A: Allocator> AsRef<T> for Box<T, A> {
2048 fn as_ref(&self) -> &T {
2049 &**self
2050 }
2051}
2052
2053#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
2054impl<T: ?Sized, A: Allocator> AsMut<T> for Box<T, A> {
2055 fn as_mut(&mut self) -> &mut T {
2056 &mut **self
2057 }
2058}
2059
2060/* Nota bene
2061 *
2062 * We could have chosen not to add this impl, and instead have written a
 * function from Pin<Box<T>> to Pin<T>. Such a function would not be sound,
2064 * because Box<T> implements Unpin even when T does not, as a result of
2065 * this impl.
2066 *
2067 * We chose this API instead of the alternative for a few reasons:
2068 * - Logically, it is helpful to understand pinning in regard to the
2069 * memory region being pointed to. For this reason none of the
2070 * standard library pointer types support projecting through a pin
2071 * (Box<T> is the only pointer type in std for which this would be
2072 * safe.)
2073 * - It is in practice very useful to have Box<T> be unconditionally
2074 * Unpin because of trait objects, for which the structural auto
2075 * trait functionality does not apply (e.g., Box<dyn Foo> would
2076 * otherwise not be Unpin).
2077 *
2078 * Another type with the same semantics as Box but only a conditional
2079 * implementation of `Unpin` (where `T: Unpin`) would be valid/safe, and
2080 * could have a method to project a Pin<T> from it.
2081 */
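//
// As a small illustrative sketch (not from the original sources): it is this
// unconditional impl that lets the following compile, even though
// `dyn Future<Output = ()>` itself is `!Unpin`:
//
//     fn require_unpin<T: Unpin>(_: T) {}
//     fn check(fut: Box<dyn core::future::Future<Output = ()>>) {
//         require_unpin(fut);
//     }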
2082#[stable(feature = "pin", since = "1.33.0")]
2083impl<T: ?Sized, A: Allocator> Unpin for Box<T, A> {}
2084
2085#[unstable(feature = "coroutine_trait", issue = "43122")]
2086impl<G: ?Sized + Coroutine<R> + Unpin, R, A: Allocator> Coroutine<R> for Box<G, A> {
2087 type Yield = G::Yield;
2088 type Return = G::Return;
2089
2090 fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
2091 G::resume(Pin::new(&mut *self), arg)
2092 }
2093}
2094
2095#[unstable(feature = "coroutine_trait", issue = "43122")]
2096impl<G: ?Sized + Coroutine<R>, R, A: Allocator> Coroutine<R> for Pin<Box<G, A>>
2097where
2098 A: 'static,
2099{
2100 type Yield = G::Yield;
2101 type Return = G::Return;
2102
2103 fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
2104 G::resume((*self).as_mut(), arg)
2105 }
2106}
2107
2108#[stable(feature = "futures_api", since = "1.36.0")]
2109impl<F: ?Sized + Future + Unpin, A: Allocator> Future for Box<F, A> {
2110 type Output = F::Output;
2111
2112 fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output> {
2113 F::poll(Pin::new(&mut *self), cx)
2114 }
2115}
2116
2117#[stable(feature = "box_error", since = "1.8.0")]
2118impl<E: Error> Error for Box<E> {
2119 #[allow(deprecated)]
2120 fn cause(&self) -> Option<&dyn Error> {
2121 Error::cause(&**self)
2122 }
2123
2124 fn source(&self) -> Option<&(dyn Error + 'static)> {
2125 Error::source(&**self)
2126 }
2127
2128 fn provide<'b>(&'b self, request: &mut error::Request<'b>) {
2129 Error::provide(&**self, request);
2130 }
2131}