// alloc/boxed.rs
//! The `Box<T>` type for heap allocation.
//!
//! [`Box<T>`], casually referred to as a 'box', provides the simplest form of
//! heap allocation in Rust. Boxes provide ownership for this allocation, and
//! drop their contents when they go out of scope. Boxes also ensure that they
//! never allocate more than `isize::MAX` bytes.
//!
//! # Examples
//!
//! Move a value from the stack to the heap by creating a [`Box`]:
//!
//! ```
//! let val: u8 = 5;
//! let boxed: Box<u8> = Box::new(val);
//! ```
//!
//! Move a value from a [`Box`] back to the stack by [dereferencing]:
//!
//! ```
//! let boxed: Box<u8> = Box::new(5);
//! let val: u8 = *boxed;
//! ```
//!
//! Creating a recursive data structure:
//!
//! ```
//! # #[allow(dead_code)]
//! #[derive(Debug)]
//! enum List<T> {
//!     Cons(T, Box<List<T>>),
//!     Nil,
//! }
//!
//! let list: List<i32> = List::Cons(1, Box::new(List::Cons(2, Box::new(List::Nil))));
//! println!("{list:?}");
//! ```
//!
//! This will print `Cons(1, Cons(2, Nil))`.
//!
//! Recursive structures must be boxed, because if the definition of `Cons`
//! looked like this:
//!
//! ```compile_fail,E0072
//! # enum List<T> {
//! Cons(T, List<T>),
//! # }
//! ```
//!
//! It wouldn't work. This is because the size of a `List` depends on how many
//! elements are in the list, and so we don't know how much memory to allocate
//! for a `Cons`. By introducing a [`Box<T>`], which has a defined size, we know
//! how big `Cons` needs to be.
//! # Memory layout
//!
//! For non-zero-sized values, a [`Box`] will use the [`Global`] allocator for its allocation. It is
//! valid to convert both ways between a [`Box`] and a raw pointer allocated with the [`Global`]
//! allocator, given that the [`Layout`] used with the allocator is correct for the type and the raw
//! pointer points to a valid value of the right type. More precisely, a `value: *mut T` that has
//! been allocated with the [`Global`] allocator with `Layout::for_value(&*value)` may be converted
//! into a box using [`Box::<T>::from_raw(value)`]. Conversely, the memory backing a `value: *mut T`
//! obtained from [`Box::<T>::into_raw`] may be deallocated using the [`Global`] allocator with
//! [`Layout::for_value(&*value)`].
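//!
//! For example (a sketch of the conversions described above, freeing the
//! memory manually through the global allocator):
//!
//! ```
//! use std::alloc::{dealloc, Layout};
//!
//! let boxed = Box::new(42u32);
//! let value = Box::into_raw(boxed);
//! unsafe {
//!     let layout = Layout::for_value(&*value);
//!     value.drop_in_place(); // run the destructor (a no-op for `u32`)
//!     dealloc(value.cast::<u8>(), layout);
//! }
//! ```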
//!
//! For zero-sized values, the `Box` pointer has to be non-null and sufficiently aligned. The
//! recommended way to build a `Box` to a ZST if `Box::new` cannot be used is to use
//! [`ptr::NonNull::dangling`].
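//!
//! For example, a box to a zero-sized type can be built from a dangling
//! pointer (a sketch of the rule above):
//!
//! ```
//! use std::ptr::NonNull;
//!
//! let ptr: NonNull<()> = NonNull::dangling();
//! let _boxed: Box<()> = unsafe { Box::from_raw(ptr.as_ptr()) };
//! ```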
//!
//! On top of these basic layout requirements, a `Box<T>` must point to a valid value of `T`.
//!
//! So long as `T: Sized`, a `Box<T>` is guaranteed to be represented
//! as a single pointer and is also ABI-compatible with C pointers
//! (i.e. the C type `T*`). This means that if you have extern "C"
//! Rust functions that will be called from C, you can define those
//! Rust functions using `Box<T>` types, and use `T*` as the corresponding
//! type on the C side. As an example, consider this C header which
//! declares functions that create and destroy some kind of `Foo`
//! value:
//!
//! ```c
//! /* C header */
//!
//! /* Returns ownership to the caller */
//! struct Foo* foo_new(void);
//!
//! /* Takes ownership from the caller; no-op when invoked with null */
//! void foo_delete(struct Foo*);
//! ```
//!
//! These two functions might be implemented in Rust as follows. Here, the
//! `struct Foo*` type from C is translated to `Box<Foo>`, which captures
//! the ownership constraints. Note also that the nullable argument to
//! `foo_delete` is represented in Rust as `Option<Box<Foo>>`, since `Box<Foo>`
//! cannot be null.
//!
//! ```
//! #[repr(C)]
//! pub struct Foo;
//!
//! #[unsafe(no_mangle)]
//! pub extern "C" fn foo_new() -> Box<Foo> {
//!     Box::new(Foo)
//! }
//!
//! #[unsafe(no_mangle)]
//! pub extern "C" fn foo_delete(_: Option<Box<Foo>>) {}
//! ```
//!
//! Even though `Box<T>` has the same representation and C ABI as a C pointer,
//! this does not mean that you can convert an arbitrary `T*` into a `Box<T>`
//! and expect things to work. `Box<T>` values will always be fully aligned,
//! non-null pointers. Moreover, the destructor for `Box<T>` will attempt to
//! free the value with the global allocator. In general, the best practice
//! is to only use `Box<T>` for pointers that originated from the global
//! allocator.
//!
//! **Important.** At least at present, you should avoid using
//! `Box<T>` types for functions that are defined in C but invoked
//! from Rust. In those cases, you should directly mirror the C types
//! as closely as possible. Using types like `Box<T>` where the C
//! definition is just using `T*` can lead to undefined behavior, as
//! described in [rust-lang/unsafe-code-guidelines#198][ucg#198].
//!
//! # Considerations for unsafe code
//!
//! **Warning: This section is not normative and is subject to change, possibly
//! being relaxed in the future! It is a simplified summary of the rules
//! currently implemented in the compiler.**
//!
//! The aliasing rules for `Box<T>` are the same as for `&mut T`. `Box<T>`
//! asserts uniqueness over its content. Using raw pointers derived from a box
//! after that box has been mutated through, moved, or borrowed as `&mut T`
//! is not allowed. For more guidance on working with boxes from unsafe code, see
//! [rust-lang/unsafe-code-guidelines#326][ucg#326].
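//!
//! For example, under these rules a raw pointer derived from a box is
//! invalidated once the box is moved (a sketch; the commented-out line would
//! be undefined behavior):
//!
//! ```
//! let mut boxed = Box::new(1u8);
//! let ptr: *mut u8 = &mut *boxed;
//! unsafe { *ptr = 2 }; // ok: the box is not used between the two accesses
//! let moved = boxed; // moving the box invalidates `ptr`
//! // unsafe { *ptr = 3 }; // undefined behavior!
//! assert_eq!(*moved, 2);
//! ```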
//!
//! # Editions
//!
//! A special case exists for the implementation of `IntoIterator` for arrays on the Rust 2021
//! edition, as documented [here][array]. Unfortunately, it was later found that a similar
//! workaround should be added for boxed slices, and this was applied in the 2024 edition.
//!
//! Specifically, `IntoIterator` is implemented for `Box<[T]>` on all editions, but specific calls
//! to `into_iter()` for boxed slices will defer to the slice implementation on editions before
//! 2024:
//!
//! ```rust,edition2021
//! // Rust 2015, 2018, and 2021:
//!
//! # #![allow(boxed_slice_into_iter)] // override our `deny(warnings)`
//! let boxed_slice: Box<[i32]> = vec![0; 3].into_boxed_slice();
//!
//! // This creates a slice iterator, producing references to each value.
//! for item in boxed_slice.into_iter().enumerate() {
//!     let (i, x): (usize, &i32) = item;
//!     println!("boxed_slice[{i}] = {x}");
//! }
//!
//! // The `boxed_slice_into_iter` lint suggests this change for future compatibility:
//! for item in boxed_slice.iter().enumerate() {
//!     let (i, x): (usize, &i32) = item;
//!     println!("boxed_slice[{i}] = {x}");
//! }
//!
//! // You can explicitly iterate a boxed slice by value using `IntoIterator::into_iter`:
//! for item in IntoIterator::into_iter(boxed_slice).enumerate() {
//!     let (i, x): (usize, i32) = item;
//!     println!("boxed_slice[{i}] = {x}");
//! }
//! ```
//!
//! Similar to the array implementation, this may be modified in the future to remove this override,
//! and it's best to avoid relying on this edition-dependent behavior if you wish to preserve
//! compatibility with future versions of the compiler.
//!
//! [ucg#198]: https://github.com/rust-lang/unsafe-code-guidelines/issues/198
//! [ucg#326]: https://github.com/rust-lang/unsafe-code-guidelines/issues/326
//! [dereferencing]: core::ops::Deref
//! [`Box::<T>::from_raw(value)`]: Box::from_raw
//! [`Global`]: crate::alloc::Global
//! [`Layout`]: crate::alloc::Layout
//! [`Layout::for_value(&*value)`]: crate::alloc::Layout::for_value
//! [valid]: ptr#safety

#![stable(feature = "rust1", since = "1.0.0")]

use core::borrow::{Borrow, BorrowMut};
#[cfg(not(no_global_oom_handling))]
use core::clone::CloneToUninit;
use core::cmp::Ordering;
use core::error::{self, Error};
use core::fmt;
use core::future::Future;
use core::hash::{Hash, Hasher};
use core::marker::{Tuple, Unsize};
#[cfg(not(no_global_oom_handling))]
use core::mem::MaybeUninit;
use core::mem::{self, SizedTypeProperties};
use core::ops::{
    AsyncFn, AsyncFnMut, AsyncFnOnce, CoerceUnsized, Coroutine, CoroutineState, Deref, DerefMut,
    DerefPure, DispatchFromDyn, LegacyReceiver,
};
#[cfg(not(no_global_oom_handling))]
use core::ops::{Residual, Try};
use core::pin::{Pin, PinCoerceUnsized};
use core::ptr::{self, NonNull, Unique};
use core::task::{Context, Poll};

#[cfg(not(no_global_oom_handling))]
use crate::alloc::handle_alloc_error;
use crate::alloc::{AllocError, Allocator, Global, Layout};
use crate::raw_vec::RawVec;
#[cfg(not(no_global_oom_handling))]
use crate::str::from_boxed_utf8_unchecked;

/// Conversion related impls for `Box<_>` (`From`, `downcast`, etc)
mod convert;
/// Iterator related impls for `Box<_>`.
mod iter;
/// [`ThinBox`] implementation.
mod thin;

#[unstable(feature = "thin_box", issue = "92791")]
pub use thin::ThinBox;

/// A pointer type that uniquely owns a heap allocation of type `T`.
///
/// See the [module-level documentation](../../std/boxed/index.html) for more.
#[lang = "owned_box"]
#[fundamental]
#[stable(feature = "rust1", since = "1.0.0")]
#[rustc_insignificant_dtor]
#[doc(search_unbox)]
// The declaration of the `Box` struct must be kept in sync with the
// compiler or ICEs will happen.
pub struct Box<
    T: ?Sized,
    #[unstable(feature = "allocator_api", issue = "32838")] A: Allocator = Global,
>(Unique<T>, A);

/// Constructs a `Box<T>` by calling the `exchange_malloc` lang item and moving the argument into
/// the newly allocated memory. This is an intrinsic to avoid unnecessary copies.
///
/// This is the surface syntax for `box <expr>` expressions.
#[doc(hidden)]
#[rustc_intrinsic]
#[unstable(feature = "liballoc_internals", issue = "none")]
pub fn box_new<T>(x: T) -> Box<T>;

impl<T> Box<T> {
    /// Allocates memory on the heap and then places `x` into it.
    ///
    /// This doesn't actually allocate if `T` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// let five = Box::new(5);
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[inline(always)]
    #[stable(feature = "rust1", since = "1.0.0")]
    #[must_use]
    #[rustc_diagnostic_item = "box_new"]
    #[cfg_attr(miri, track_caller)] // even without panics, this helps for Miri backtraces
    pub fn new(x: T) -> Self {
        box_new(x)
    }

    /// Constructs a new box with uninitialized contents.
    ///
    /// # Examples
    ///
    /// ```
    /// let mut five = Box::<u32>::new_uninit();
    /// // Deferred initialization:
    /// five.write(5);
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5)
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "new_uninit", since = "1.82.0")]
    #[must_use]
    #[inline]
    pub fn new_uninit() -> Box<mem::MaybeUninit<T>> {
        Self::new_uninit_in(Global)
    }

    /// Constructs a new `Box` with uninitialized contents, with the memory
    /// being filled with `0` bytes.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// let zero = Box::<u32>::new_zeroed();
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0)
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[cfg(not(no_global_oom_handling))]
    #[inline]
    #[stable(feature = "new_zeroed_alloc", since = "1.92.0")]
    #[must_use]
    pub fn new_zeroed() -> Box<mem::MaybeUninit<T>> {
        Self::new_zeroed_in(Global)
    }

    /// Constructs a new `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
    /// `x` will be pinned in memory and unable to be moved.
    ///
    /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin(x)`
    /// does the same as <code>[Box::into_pin]\([Box::new]\(x))</code>. Consider using
    /// [`into_pin`](Box::into_pin) if you already have a `Box<T>`, or if you want to
    /// construct a (pinned) `Box` in a different way than with [`Box::new`].
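    ///
    /// # Examples
    ///
    /// A small sketch:
    ///
    /// ```
    /// let five = Box::pin(5);
    ///
    /// assert_eq!(*five, 5);
    /// ```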
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "pin", since = "1.33.0")]
    #[must_use]
    #[inline(always)]
    pub fn pin(x: T) -> Pin<Box<T>> {
        Box::new(x).into()
    }

    /// Allocates memory on the heap then places `x` into it,
    /// returning an error if the allocation fails.
    ///
    /// This doesn't actually allocate if `T` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// let five = Box::try_new(5)?;
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new(x: T) -> Result<Self, AllocError> {
        Self::try_new_in(x, Global)
    }

    /// Constructs a new box with uninitialized contents on the heap,
    /// returning an error if the allocation fails.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// let mut five = Box::<u32>::try_new_uninit()?;
    /// // Deferred initialization:
    /// five.write(5);
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_uninit() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
        Box::try_new_uninit_in(Global)
    }

    /// Constructs a new `Box` with uninitialized contents, with the memory
    /// being filled with `0` bytes on the heap,
    /// returning an error if the allocation fails.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// let zero = Box::<u32>::try_new_zeroed()?;
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_zeroed() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
        Box::try_new_zeroed_in(Global)
    }

    /// Maps the value in a box, reusing the allocation if possible.
    ///
    /// `f` is called on the value in the box, and the result is returned, also boxed.
    ///
    /// Note: this is an associated function, which means that you have
    /// to call it as `Box::map(b, f)` instead of `b.map(f)`. This
    /// is so that there is no conflict with a method on the inner type.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(smart_pointer_try_map)]
    ///
    /// let b = Box::new(7);
    /// let new = Box::map(b, |i| i + 7);
    /// assert_eq!(*new, 14);
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "smart_pointer_try_map", issue = "144419")]
    pub fn map<U>(this: Self, f: impl FnOnce(T) -> U) -> Box<U> {
        if size_of::<T>() == size_of::<U>() && align_of::<T>() == align_of::<U>() {
            let (value, allocation) = Box::take(this);
            Box::write(
                unsafe { mem::transmute::<Box<MaybeUninit<T>>, Box<MaybeUninit<U>>>(allocation) },
                f(value),
            )
        } else {
            Box::new(f(*this))
        }
    }

    /// Attempts to map the value in a box, reusing the allocation if possible.
    ///
    /// `f` is called on the value in the box, and if the operation succeeds, the result is
    /// returned, also boxed.
    ///
    /// Note: this is an associated function, which means that you have
    /// to call it as `Box::try_map(b, f)` instead of `b.try_map(f)`. This
    /// is so that there is no conflict with a method on the inner type.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(smart_pointer_try_map)]
    ///
    /// let b = Box::new(7);
    /// let new = Box::try_map(b, u32::try_from).unwrap();
    /// assert_eq!(*new, 7);
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "smart_pointer_try_map", issue = "144419")]
    pub fn try_map<R>(
        this: Self,
        f: impl FnOnce(T) -> R,
    ) -> <R::Residual as Residual<Box<R::Output>>>::TryType
    where
        R: Try,
        R::Residual: Residual<Box<R::Output>>,
    {
        if size_of::<T>() == size_of::<R::Output>() && align_of::<T>() == align_of::<R::Output>() {
            let (value, allocation) = Box::take(this);
            try {
                Box::write(
                    unsafe {
                        mem::transmute::<Box<MaybeUninit<T>>, Box<MaybeUninit<R::Output>>>(
                            allocation,
                        )
                    },
                    f(value)?,
                )
            }
        } else {
            try { Box::new(f(*this)?) }
        }
    }
}

impl<T, A: Allocator> Box<T, A> {
    /// Allocates memory in the given allocator then places `x` into it.
    ///
    /// This doesn't actually allocate if `T` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let five = Box::new_in(5, System);
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[must_use]
    #[inline]
    pub fn new_in(x: T, alloc: A) -> Self
    where
        A: Allocator,
    {
        let mut boxed = Self::new_uninit_in(alloc);
        boxed.write(x);
        unsafe { boxed.assume_init() }
    }

    /// Allocates memory in the given allocator then places `x` into it,
    /// returning an error if the allocation fails.
    ///
    /// This doesn't actually allocate if `T` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let five = Box::try_new_in(5, System)?;
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_in(x: T, alloc: A) -> Result<Self, AllocError>
    where
        A: Allocator,
    {
        let mut boxed = Self::try_new_uninit_in(alloc)?;
        boxed.write(x);
        unsafe { Ok(boxed.assume_init()) }
    }

    /// Constructs a new box with uninitialized contents in the provided allocator.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let mut five = Box::<u32, _>::new_uninit_in(System);
    /// // Deferred initialization:
    /// five.write(5);
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5)
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[cfg(not(no_global_oom_handling))]
    #[must_use]
    pub fn new_uninit_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
    where
        A: Allocator,
    {
        let layout = Layout::new::<mem::MaybeUninit<T>>();
        // NOTE: Prefer match over unwrap_or_else since closure sometimes not inlineable.
        // That would make code size bigger.
        match Box::try_new_uninit_in(alloc) {
            Ok(m) => m,
            Err(_) => handle_alloc_error(layout),
        }
    }

    /// Constructs a new box with uninitialized contents in the provided allocator,
    /// returning an error if the allocation fails.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let mut five = Box::<u32, _>::try_new_uninit_in(System)?;
    /// // Deferred initialization:
    /// five.write(5);
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    pub fn try_new_uninit_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
    where
        A: Allocator,
    {
        let ptr = if T::IS_ZST {
            NonNull::dangling()
        } else {
            let layout = Layout::new::<mem::MaybeUninit<T>>();
            alloc.allocate(layout)?.cast()
        };
        unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
    }

    /// Constructs a new `Box` with uninitialized contents, with the memory
    /// being filled with `0` bytes in the provided allocator.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let zero = Box::<u32, _>::new_zeroed_in(System);
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0)
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[cfg(not(no_global_oom_handling))]
    #[must_use]
    pub fn new_zeroed_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
    where
        A: Allocator,
    {
        let layout = Layout::new::<mem::MaybeUninit<T>>();
        // NOTE: Prefer match over unwrap_or_else since closure sometimes not inlineable.
        // That would make code size bigger.
        match Box::try_new_zeroed_in(alloc) {
            Ok(m) => m,
            Err(_) => handle_alloc_error(layout),
        }
    }

    /// Constructs a new `Box` with uninitialized contents, with the memory
    /// being filled with `0` bytes in the provided allocator,
    /// returning an error if the allocation fails.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let zero = Box::<u32, _>::try_new_zeroed_in(System)?;
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[unstable(feature = "allocator_api", issue = "32838")]
    pub fn try_new_zeroed_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
    where
        A: Allocator,
    {
        let ptr = if T::IS_ZST {
            NonNull::dangling()
        } else {
            let layout = Layout::new::<mem::MaybeUninit<T>>();
            alloc.allocate_zeroed(layout)?.cast()
        };
        unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
    }

    /// Constructs a new `Pin<Box<T, A>>`. If `T` does not implement [`Unpin`], then
    /// `x` will be pinned in memory and unable to be moved.
    ///
    /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin_in(x, alloc)`
    /// does the same as <code>[Box::into_pin]\([Box::new_in]\(x, alloc))</code>. Consider using
    /// [`into_pin`](Box::into_pin) if you already have a `Box<T, A>`, or if you want to
    /// construct a (pinned) `Box` in a different way than with [`Box::new_in`].
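    ///
    /// # Examples
    ///
    /// A small sketch using the [`System`](std::alloc::System) allocator:
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let five = Box::pin_in(5, System);
    ///
    /// assert_eq!(*five, 5);
    /// ```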
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[must_use]
    #[inline(always)]
    pub fn pin_in(x: T, alloc: A) -> Pin<Self>
    where
        A: 'static + Allocator,
    {
        Self::into_pin(Self::new_in(x, alloc))
    }

    /// Converts a `Box<T>` into a `Box<[T]>`.
    ///
    /// This conversion does not allocate on the heap and happens in place.
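    ///
    /// # Examples
    ///
    /// A small sketch:
    ///
    /// ```
    /// #![feature(box_into_boxed_slice)]
    ///
    /// let boxed: Box<u8> = Box::new(5);
    /// let boxed_slice: Box<[u8]> = Box::into_boxed_slice(boxed);
    /// assert_eq!(*boxed_slice, [5]);
    /// ```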
    #[unstable(feature = "box_into_boxed_slice", issue = "71582")]
    pub fn into_boxed_slice(boxed: Self) -> Box<[T], A> {
        let (raw, alloc) = Box::into_raw_with_allocator(boxed);
        unsafe { Box::from_raw_in(raw as *mut [T; 1], alloc) }
    }

    /// Consumes the `Box`, returning the wrapped value.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(box_into_inner)]
    ///
    /// let c = Box::new(5);
    ///
    /// assert_eq!(Box::into_inner(c), 5);
    /// ```
    #[unstable(feature = "box_into_inner", issue = "80437")]
    #[inline]
    pub fn into_inner(boxed: Self) -> T {
        *boxed
    }

    /// Consumes the `Box` without consuming its allocation, returning the wrapped value and a `Box`
    /// to the uninitialized memory where the wrapped value used to live.
    ///
    /// This can be used together with [`write`](Box::write) to reuse the allocation for multiple
    /// boxed values.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(box_take)]
    ///
    /// let c = Box::new(5);
    ///
    /// // Take the value out of the box.
    /// let (value, uninit) = Box::take(c);
    /// assert_eq!(value, 5);
    ///
    /// // Reuse the box for a second value.
    /// let c = Box::write(uninit, 6);
    /// assert_eq!(*c, 6);
    /// ```
    #[unstable(feature = "box_take", issue = "147212")]
    pub fn take(boxed: Self) -> (T, Box<mem::MaybeUninit<T>, A>) {
        unsafe {
            let (raw, alloc) = Box::into_raw_with_allocator(boxed);
            let value = raw.read();
            let uninit = Box::from_raw_in(raw.cast::<mem::MaybeUninit<T>>(), alloc);
            (value, uninit)
        }
    }
}

impl<T> Box<[T]> {
    /// Constructs a new boxed slice with uninitialized contents.
    ///
    /// # Examples
    ///
    /// ```
    /// let mut values = Box::<[u32]>::new_uninit_slice(3);
    /// // Deferred initialization:
    /// values[0].write(1);
    /// values[1].write(2);
    /// values[2].write(3);
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [1, 2, 3])
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "new_uninit", since = "1.82.0")]
    #[must_use]
    pub fn new_uninit_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
        unsafe { RawVec::with_capacity(len).into_box(len) }
    }

    /// Constructs a new boxed slice with uninitialized contents, with the memory
    /// being filled with `0` bytes.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// let values = Box::<[u32]>::new_zeroed_slice(3);
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [0, 0, 0])
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "new_zeroed_alloc", since = "1.92.0")]
    #[must_use]
    pub fn new_zeroed_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
        unsafe { RawVec::with_capacity_zeroed(len).into_box(len) }
    }

    /// Constructs a new boxed slice with uninitialized contents. Returns an error if
    /// the allocation fails.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// let mut values = Box::<[u32]>::try_new_uninit_slice(3)?;
    /// // Deferred initialization:
    /// values[0].write(1);
    /// values[1].write(2);
    /// values[2].write(3);
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [1, 2, 3]);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_uninit_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
        let ptr = if T::IS_ZST || len == 0 {
            NonNull::dangling()
        } else {
            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
                Ok(l) => l,
                Err(_) => return Err(AllocError),
            };
            Global.allocate(layout)?.cast()
        };
        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
    }

    /// Constructs a new boxed slice with uninitialized contents, with the memory
    /// being filled with `0` bytes. Returns an error if the allocation fails.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// let values = Box::<[u32]>::try_new_zeroed_slice(3)?;
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [0, 0, 0]);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_zeroed_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
        let ptr = if T::IS_ZST || len == 0 {
            NonNull::dangling()
        } else {
            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
                Ok(l) => l,
                Err(_) => return Err(AllocError),
            };
            Global.allocate_zeroed(layout)?.cast()
        };
        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
    }

    /// Converts the boxed slice into a boxed array.
    ///
    /// This operation does not reallocate; the underlying array of the slice is simply reinterpreted as an array type.
    ///
    /// If `N` is not exactly equal to the length of `self`, then this method returns `None`.
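    ///
    /// # Examples
    ///
    /// A small sketch:
    ///
    /// ```
    /// #![feature(slice_as_array)]
    ///
    /// let slice: Box<[i32]> = Box::new([1, 2, 3]);
    /// let array: Box<[i32; 3]> = slice.into_array().unwrap();
    /// assert_eq!(*array, [1, 2, 3]);
    ///
    /// // A length mismatch yields `None`:
    /// let slice: Box<[i32]> = Box::new([1, 2, 3]);
    /// assert!(slice.into_array::<4>().is_none());
    /// ```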
    #[unstable(feature = "slice_as_array", issue = "133508")]
    #[inline]
    #[must_use]
    pub fn into_array<const N: usize>(self) -> Option<Box<[T; N]>> {
        if self.len() == N {
            let ptr = Self::into_raw(self) as *mut [T; N];

            // SAFETY: The underlying array of a slice has the exact same layout as an actual array `[T; N]` if `N` is equal to the slice's length.
            let me = unsafe { Box::from_raw(ptr) };
            Some(me)
        } else {
            None
        }
    }
}
868
869impl<T, A: Allocator> Box<[T], A> {
870 /// Constructs a new boxed slice with uninitialized contents in the provided allocator.
871 ///
872 /// # Examples
873 ///
874 /// ```
875 /// #![feature(allocator_api)]
876 ///
877 /// use std::alloc::System;
878 ///
879 /// let mut values = Box::<[u32], _>::new_uninit_slice_in(3, System);
880 /// // Deferred initialization:
    /// values[0].write(1);
    /// values[1].write(2);
    /// values[2].write(3);
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [1, 2, 3])
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[must_use]
    pub fn new_uninit_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
        unsafe { RawVec::with_capacity_in(len, alloc).into_box(len) }
    }

    /// Constructs a new boxed slice with uninitialized contents in the provided allocator,
    /// with the memory being filled with `0` bytes.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let values = Box::<[u32], _>::new_zeroed_slice_in(3, System);
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [0, 0, 0])
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[must_use]
    pub fn new_zeroed_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
        unsafe { RawVec::with_capacity_zeroed_in(len, alloc).into_box(len) }
    }

    /// Constructs a new boxed slice with uninitialized contents in the provided allocator.
    /// Returns an error if the allocation fails.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let mut values = Box::<[u32], _>::try_new_uninit_slice_in(3, System)?;
    /// // Deferred initialization:
    /// values[0].write(1);
    /// values[1].write(2);
    /// values[2].write(3);
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [1, 2, 3]);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_uninit_slice_in(
        len: usize,
        alloc: A,
    ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
        let ptr = if T::IS_ZST || len == 0 {
            NonNull::dangling()
        } else {
            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
                Ok(l) => l,
                Err(_) => return Err(AllocError),
            };
            alloc.allocate(layout)?.cast()
        };
        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
    }

    /// Constructs a new boxed slice with uninitialized contents in the provided allocator,
    /// with the memory being filled with `0` bytes. Returns an error if the allocation fails.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let values = Box::<[u32], _>::try_new_zeroed_slice_in(3, System)?;
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [0, 0, 0]);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_zeroed_slice_in(
        len: usize,
        alloc: A,
    ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
        let ptr = if T::IS_ZST || len == 0 {
            NonNull::dangling()
        } else {
            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
                Ok(l) => l,
                Err(_) => return Err(AllocError),
            };
            alloc.allocate_zeroed(layout)?.cast()
        };
        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
    }
}

impl<T, A: Allocator> Box<mem::MaybeUninit<T>, A> {
    /// Converts to `Box<T, A>`.
    ///
    /// # Safety
    ///
    /// As with [`MaybeUninit::assume_init`],
    /// it is up to the caller to guarantee that the value
    /// really is in an initialized state.
    /// Calling this when the content is not yet fully initialized
    /// causes immediate undefined behavior.
    ///
    /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
    ///
    /// # Examples
    ///
    /// ```
    /// let mut five = Box::<u32>::new_uninit();
    /// // Deferred initialization:
    /// five.write(5);
    /// let five: Box<u32> = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5)
    /// ```
    #[stable(feature = "new_uninit", since = "1.82.0")]
    #[inline]
    pub unsafe fn assume_init(self) -> Box<T, A> {
        let (raw, alloc) = Box::into_raw_with_allocator(self);
        unsafe { Box::from_raw_in(raw as *mut T, alloc) }
    }

    /// Writes the value and converts to `Box<T, A>`.
    ///
    /// This method converts the box similarly to [`Box::assume_init`] but
    /// writes `value` into it before the conversion, thus guaranteeing safety.
    /// In some scenarios this method may improve performance because the
    /// compiler may be able to optimize away the copy from the stack.
    ///
    /// # Examples
    ///
    /// ```
    /// let big_box = Box::<[usize; 1024]>::new_uninit();
    ///
    /// let mut array = [0; 1024];
    /// for (i, place) in array.iter_mut().enumerate() {
    ///     *place = i;
    /// }
    ///
    /// // The optimizer may be able to elide this copy, so the previous code
    /// // writes to the heap directly.
    /// let big_box = Box::write(big_box, array);
    ///
    /// for (i, x) in big_box.iter().enumerate() {
    ///     assert_eq!(*x, i);
    /// }
    /// ```
    #[stable(feature = "box_uninit_write", since = "1.87.0")]
    #[inline]
    pub fn write(mut boxed: Self, value: T) -> Box<T, A> {
        unsafe {
            (*boxed).write(value);
            boxed.assume_init()
        }
    }
}

impl<T, A: Allocator> Box<[mem::MaybeUninit<T>], A> {
    /// Converts to `Box<[T], A>`.
    ///
    /// # Safety
    ///
    /// As with [`MaybeUninit::assume_init`],
    /// it is up to the caller to guarantee that the values
    /// really are in an initialized state.
    /// Calling this when the content is not yet fully initialized
    /// causes immediate undefined behavior.
    ///
    /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
    ///
    /// # Examples
    ///
    /// ```
    /// let mut values = Box::<[u32]>::new_uninit_slice(3);
    /// // Deferred initialization:
    /// values[0].write(1);
    /// values[1].write(2);
    /// values[2].write(3);
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [1, 2, 3])
    /// ```
    #[stable(feature = "new_uninit", since = "1.82.0")]
    #[inline]
    pub unsafe fn assume_init(self) -> Box<[T], A> {
        let (raw, alloc) = Box::into_raw_with_allocator(self);
        unsafe { Box::from_raw_in(raw as *mut [T], alloc) }
    }
}

impl<T: ?Sized> Box<T> {
    /// Constructs a box from a raw pointer.
    ///
    /// After calling this function, the raw pointer is owned by the
    /// resulting `Box`. Specifically, the `Box` destructor will call
    /// the destructor of `T` and free the allocated memory. For this
    /// to be safe, the memory must have been allocated in accordance
    /// with the [memory layout] used by `Box`.
    ///
    /// # Safety
    ///
    /// This function is unsafe because improper use may lead to
    /// memory problems. For example, a double-free may occur if the
    /// function is called twice on the same raw pointer.
    ///
    /// The raw pointer must point to a block of memory allocated by the global allocator.
    ///
    /// The safety conditions are described in the [memory layout] section.
    ///
    /// # Examples
    ///
    /// Recreate a `Box` which was previously converted to a raw pointer
    /// using [`Box::into_raw`]:
    /// ```
    /// let x = Box::new(5);
    /// let ptr = Box::into_raw(x);
    /// let x = unsafe { Box::from_raw(ptr) };
    /// ```
    /// Manually create a `Box` from scratch by using the global allocator:
    /// ```
    /// use std::alloc::{alloc, Layout};
    ///
    /// unsafe {
    ///     let ptr = alloc(Layout::new::<i32>()) as *mut i32;
    ///     // In general .write is required to avoid attempting to destruct
    ///     // the (uninitialized) previous contents of `ptr`, though for this
    ///     // simple example `*ptr = 5` would have worked as well.
    ///     ptr.write(5);
    ///     let x = Box::from_raw(ptr);
    /// }
    /// ```
    ///
    /// [memory layout]: self#memory-layout
    #[stable(feature = "box_raw", since = "1.4.0")]
    #[inline]
    #[must_use = "call `drop(Box::from_raw(ptr))` if you intend to drop the `Box`"]
    pub unsafe fn from_raw(raw: *mut T) -> Self {
        unsafe { Self::from_raw_in(raw, Global) }
    }

    /// Constructs a box from a `NonNull` pointer.
    ///
    /// After calling this function, the `NonNull` pointer is owned by
    /// the resulting `Box`. Specifically, the `Box` destructor will call
    /// the destructor of `T` and free the allocated memory. For this
    /// to be safe, the memory must have been allocated in accordance
    /// with the [memory layout] used by `Box`.
    ///
    /// # Safety
    ///
    /// This function is unsafe because improper use may lead to
    /// memory problems. For example, a double-free may occur if the
    /// function is called twice on the same `NonNull` pointer.
    ///
    /// The non-null pointer must point to a block of memory allocated by the global allocator.
    ///
    /// The safety conditions are described in the [memory layout] section.
    ///
    /// # Examples
    ///
    /// Recreate a `Box` which was previously converted to a `NonNull`
    /// pointer using [`Box::into_non_null`]:
    /// ```
    /// #![feature(box_vec_non_null)]
    ///
    /// let x = Box::new(5);
    /// let non_null = Box::into_non_null(x);
    /// let x = unsafe { Box::from_non_null(non_null) };
    /// ```
    /// Manually create a `Box` from scratch by using the global allocator:
    /// ```
    /// #![feature(box_vec_non_null)]
    ///
    /// use std::alloc::{alloc, Layout};
    /// use std::ptr::NonNull;
    ///
    /// unsafe {
    ///     let non_null = NonNull::new(alloc(Layout::new::<i32>()).cast::<i32>())
    ///         .expect("allocation failed");
    ///     // In general .write is required to avoid attempting to destruct
    ///     // the (uninitialized) previous contents of `non_null`.
    ///     non_null.write(5);
    ///     let x = Box::from_non_null(non_null);
    /// }
    /// ```
    ///
    /// [memory layout]: self#memory-layout
    #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
    #[inline]
    #[must_use = "call `drop(Box::from_non_null(ptr))` if you intend to drop the `Box`"]
    pub unsafe fn from_non_null(ptr: NonNull<T>) -> Self {
        unsafe { Self::from_raw(ptr.as_ptr()) }
    }

    /// Consumes the `Box`, returning a wrapped raw pointer.
    ///
    /// The pointer will be properly aligned and non-null.
    ///
    /// After calling this function, the caller is responsible for the
    /// memory previously managed by the `Box`. In particular, the
    /// caller should properly destroy `T` and release the memory, taking
    /// into account the [memory layout] used by `Box`. The easiest way to
    /// do this is to convert the raw pointer back into a `Box` with the
    /// [`Box::from_raw`] function, allowing the `Box` destructor to perform
    /// the cleanup.
    ///
    /// Note: this is an associated function, which means that you have
    /// to call it as `Box::into_raw(b)` instead of `b.into_raw()`. This
    /// is so that there is no conflict with a method on the inner type.
    ///
    /// # Examples
    /// Converting the raw pointer back into a `Box` with [`Box::from_raw`]
    /// for automatic cleanup:
    /// ```
    /// let x = Box::new(String::from("Hello"));
    /// let ptr = Box::into_raw(x);
    /// let x = unsafe { Box::from_raw(ptr) };
    /// ```
    /// Manual cleanup by explicitly running the destructor and deallocating
    /// the memory:
    /// ```
    /// use std::alloc::{dealloc, Layout};
    /// use std::ptr;
    ///
    /// let x = Box::new(String::from("Hello"));
    /// let ptr = Box::into_raw(x);
    /// unsafe {
    ///     ptr::drop_in_place(ptr);
    ///     dealloc(ptr as *mut u8, Layout::new::<String>());
    /// }
    /// ```
    /// Note: This is equivalent to the following:
    /// ```
    /// let x = Box::new(String::from("Hello"));
    /// let ptr = Box::into_raw(x);
    /// unsafe {
    ///     drop(Box::from_raw(ptr));
    /// }
    /// ```
    ///
    /// [memory layout]: self#memory-layout
    #[must_use = "losing the pointer will leak memory"]
    #[stable(feature = "box_raw", since = "1.4.0")]
    #[inline]
    pub fn into_raw(b: Self) -> *mut T {
        // Avoid `into_raw_with_allocator` as that interacts poorly with Miri's Stacked Borrows.
        let mut b = mem::ManuallyDrop::new(b);
        // We go through the built-in deref for `Box`, which is crucial for Miri to recognize this
        // operation for its alias tracking.
        &raw mut **b
    }

    /// Consumes the `Box`, returning a wrapped `NonNull` pointer.
    ///
    /// The pointer will be properly aligned.
    ///
    /// After calling this function, the caller is responsible for the
    /// memory previously managed by the `Box`. In particular, the
    /// caller should properly destroy `T` and release the memory, taking
    /// into account the [memory layout] used by `Box`. The easiest way to
    /// do this is to convert the `NonNull` pointer back into a `Box` with the
    /// [`Box::from_non_null`] function, allowing the `Box` destructor to
    /// perform the cleanup.
    ///
    /// Note: this is an associated function, which means that you have
    /// to call it as `Box::into_non_null(b)` instead of `b.into_non_null()`.
    /// This is so that there is no conflict with a method on the inner type.
    ///
    /// # Examples
    /// Converting the `NonNull` pointer back into a `Box` with [`Box::from_non_null`]
    /// for automatic cleanup:
    /// ```
    /// #![feature(box_vec_non_null)]
    ///
    /// let x = Box::new(String::from("Hello"));
    /// let non_null = Box::into_non_null(x);
    /// let x = unsafe { Box::from_non_null(non_null) };
    /// ```
    /// Manual cleanup by explicitly running the destructor and deallocating
    /// the memory:
    /// ```
    /// #![feature(box_vec_non_null)]
    ///
    /// use std::alloc::{dealloc, Layout};
    ///
    /// let x = Box::new(String::from("Hello"));
    /// let non_null = Box::into_non_null(x);
    /// unsafe {
    ///     non_null.drop_in_place();
    ///     dealloc(non_null.as_ptr().cast::<u8>(), Layout::new::<String>());
    /// }
    /// ```
    /// Note: This is equivalent to the following:
    /// ```
    /// #![feature(box_vec_non_null)]
    ///
    /// let x = Box::new(String::from("Hello"));
    /// let non_null = Box::into_non_null(x);
    /// unsafe {
    ///     drop(Box::from_non_null(non_null));
    /// }
    /// ```
    ///
    /// [memory layout]: self#memory-layout
    #[must_use = "losing the pointer will leak memory"]
    #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
    #[inline]
    pub fn into_non_null(b: Self) -> NonNull<T> {
        // SAFETY: `Box` is guaranteed to be non-null.
        unsafe { NonNull::new_unchecked(Self::into_raw(b)) }
    }
}

impl<T: ?Sized, A: Allocator> Box<T, A> {
    /// Constructs a box from a raw pointer in the given allocator.
    ///
    /// After calling this function, the raw pointer is owned by the
    /// resulting `Box`. Specifically, the `Box` destructor will call
    /// the destructor of `T` and free the allocated memory. For this
    /// to be safe, the memory must have been allocated in accordance
    /// with the [memory layout] used by `Box`.
    ///
    /// # Safety
    ///
    /// This function is unsafe because improper use may lead to
    /// memory problems. For example, a double-free may occur if the
    /// function is called twice on the same raw pointer.
    ///
    /// The raw pointer must point to a block of memory allocated by `alloc`.
    ///
    /// # Examples
    ///
    /// Recreate a `Box` which was previously converted to a raw pointer
    /// using [`Box::into_raw_with_allocator`]:
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let x = Box::new_in(5, System);
    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
    /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
    /// ```
    /// Manually create a `Box` from scratch by using the system allocator:
    /// ```
    /// #![feature(allocator_api, slice_ptr_get)]
    ///
    /// use std::alloc::{Allocator, Layout, System};
    ///
    /// unsafe {
    ///     let ptr = System.allocate(Layout::new::<i32>())?.as_mut_ptr() as *mut i32;
    ///     // In general .write is required to avoid attempting to destruct
    ///     // the (uninitialized) previous contents of `ptr`, though for this
    ///     // simple example `*ptr = 5` would have worked as well.
    ///     ptr.write(5);
    ///     let x = Box::from_raw_in(ptr, System);
    /// }
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    ///
    /// [memory layout]: self#memory-layout
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub unsafe fn from_raw_in(raw: *mut T, alloc: A) -> Self {
        Box(unsafe { Unique::new_unchecked(raw) }, alloc)
    }

    /// Constructs a box from a `NonNull` pointer in the given allocator.
    ///
    /// After calling this function, the `NonNull` pointer is owned by
    /// the resulting `Box`. Specifically, the `Box` destructor will call
    /// the destructor of `T` and free the allocated memory. For this
    /// to be safe, the memory must have been allocated in accordance
    /// with the [memory layout] used by `Box`.
    ///
    /// # Safety
    ///
    /// This function is unsafe because improper use may lead to
    /// memory problems. For example, a double-free may occur if the
    /// function is called twice on the same raw pointer.
    ///
    /// The non-null pointer must point to a block of memory allocated by `alloc`.
    ///
    /// # Examples
    ///
    /// Recreate a `Box` which was previously converted to a `NonNull` pointer
    /// using [`Box::into_non_null_with_allocator`]:
    /// ```
    /// #![feature(allocator_api, box_vec_non_null)]
    ///
    /// use std::alloc::System;
    ///
    /// let x = Box::new_in(5, System);
    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
    /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
    /// ```
    /// Manually create a `Box` from scratch by using the system allocator:
    /// ```
    /// #![feature(allocator_api, box_vec_non_null, slice_ptr_get)]
    ///
    /// use std::alloc::{Allocator, Layout, System};
    ///
    /// unsafe {
    ///     let non_null = System.allocate(Layout::new::<i32>())?.cast::<i32>();
    ///     // In general .write is required to avoid attempting to destruct
    ///     // the (uninitialized) previous contents of `non_null`.
    ///     non_null.write(5);
    ///     let x = Box::from_non_null_in(non_null, System);
    /// }
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    ///
    /// [memory layout]: self#memory-layout
    #[unstable(feature = "allocator_api", issue = "32838")]
    // #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
    #[inline]
    pub unsafe fn from_non_null_in(raw: NonNull<T>, alloc: A) -> Self {
        // SAFETY: guaranteed by the caller.
        unsafe { Box::from_raw_in(raw.as_ptr(), alloc) }
    }

    /// Consumes the `Box`, returning a wrapped raw pointer and the allocator.
    ///
    /// The pointer will be properly aligned and non-null.
    ///
    /// After calling this function, the caller is responsible for the
    /// memory previously managed by the `Box`. In particular, the
    /// caller should properly destroy `T` and release the memory, taking
    /// into account the [memory layout] used by `Box`. The easiest way to
    /// do this is to convert the raw pointer back into a `Box` with the
    /// [`Box::from_raw_in`] function, allowing the `Box` destructor to perform
    /// the cleanup.
    ///
    /// Note: this is an associated function, which means that you have
    /// to call it as `Box::into_raw_with_allocator(b)` instead of
    /// `b.into_raw_with_allocator()`. This is so that there is no
    /// conflict with a method on the inner type.
    ///
    /// # Examples
    /// Converting the raw pointer back into a `Box` with [`Box::from_raw_in`]
    /// for automatic cleanup:
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let x = Box::new_in(String::from("Hello"), System);
    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
    /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
    /// ```
    /// Manual cleanup by explicitly running the destructor and deallocating
    /// the memory:
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::{Allocator, Layout, System};
    /// use std::ptr::{self, NonNull};
    ///
    /// let x = Box::new_in(String::from("Hello"), System);
    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
    /// unsafe {
    ///     ptr::drop_in_place(ptr);
    ///     let non_null = NonNull::new_unchecked(ptr);
    ///     alloc.deallocate(non_null.cast(), Layout::new::<String>());
    /// }
    /// ```
    ///
    /// [memory layout]: self#memory-layout
    #[must_use = "losing the pointer will leak memory"]
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn into_raw_with_allocator(b: Self) -> (*mut T, A) {
        let mut b = mem::ManuallyDrop::new(b);
        // We carefully get the raw pointer out in a way that Miri's aliasing model understands what
        // is happening: using the primitive "deref" of `Box`. In case `A` is *not* `Global`, we
        // want *no* aliasing requirements here!
        // In case `A` *is* `Global`, this does not quite have the right behavior; `into_raw`
        // works around that.
        let ptr = &raw mut **b;
        let alloc = unsafe { ptr::read(&b.1) };
        (ptr, alloc)
    }

    /// Consumes the `Box`, returning a wrapped `NonNull` pointer and the allocator.
    ///
    /// The pointer will be properly aligned.
    ///
    /// After calling this function, the caller is responsible for the
    /// memory previously managed by the `Box`. In particular, the
    /// caller should properly destroy `T` and release the memory, taking
    /// into account the [memory layout] used by `Box`. The easiest way to
    /// do this is to convert the `NonNull` pointer back into a `Box` with the
    /// [`Box::from_non_null_in`] function, allowing the `Box` destructor to
    /// perform the cleanup.
    ///
    /// Note: this is an associated function, which means that you have
    /// to call it as `Box::into_non_null_with_allocator(b)` instead of
    /// `b.into_non_null_with_allocator()`. This is so that there is no
    /// conflict with a method on the inner type.
    ///
    /// # Examples
    /// Converting the `NonNull` pointer back into a `Box` with
    /// [`Box::from_non_null_in`] for automatic cleanup:
    /// ```
    /// #![feature(allocator_api, box_vec_non_null)]
    ///
    /// use std::alloc::System;
    ///
    /// let x = Box::new_in(String::from("Hello"), System);
    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
    /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
    /// ```
    /// Manual cleanup by explicitly running the destructor and deallocating
    /// the memory:
    /// ```
    /// #![feature(allocator_api, box_vec_non_null)]
    ///
    /// use std::alloc::{Allocator, Layout, System};
    ///
    /// let x = Box::new_in(String::from("Hello"), System);
    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
    /// unsafe {
    ///     non_null.drop_in_place();
    ///     alloc.deallocate(non_null.cast::<u8>(), Layout::new::<String>());
    /// }
    /// ```
    ///
    /// [memory layout]: self#memory-layout
    #[must_use = "losing the pointer will leak memory"]
    #[unstable(feature = "allocator_api", issue = "32838")]
    // #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
    #[inline]
    pub fn into_non_null_with_allocator(b: Self) -> (NonNull<T>, A) {
        let (ptr, alloc) = Box::into_raw_with_allocator(b);
        // SAFETY: `Box` is guaranteed to be non-null.
        unsafe { (NonNull::new_unchecked(ptr), alloc) }
    }

    #[unstable(
        feature = "ptr_internals",
        issue = "none",
        reason = "use `Box::leak(b).into()` or `Unique::from(Box::leak(b))` instead"
    )]
    #[inline]
    #[doc(hidden)]
    pub fn into_unique(b: Self) -> (Unique<T>, A) {
        let (ptr, alloc) = Box::into_raw_with_allocator(b);
        unsafe { (Unique::from(&mut *ptr), alloc) }
    }

    /// Returns a raw mutable pointer to the `Box`'s contents.
    ///
    /// The caller must ensure that the `Box` outlives the pointer this
    /// function returns, or else it will end up dangling.
    ///
    /// This method guarantees that, for the purpose of the aliasing model, it
    /// does not materialize a reference to the underlying memory, and thus the
    /// returned pointer will remain valid when mixed with other calls to
    /// [`as_ptr`] and [`as_mut_ptr`]. Note that calling other methods that
    /// materialize references to the memory may still invalidate this pointer.
    /// See the example below for how this guarantee can be used.
    ///
    /// # Examples
    ///
    /// Due to the aliasing guarantee, the following code is legal:
    ///
    /// ```rust
    /// #![feature(box_as_ptr)]
    ///
    /// unsafe {
    ///     let mut b = Box::new(0);
    ///     let ptr1 = Box::as_mut_ptr(&mut b);
    ///     ptr1.write(1);
    ///     let ptr2 = Box::as_mut_ptr(&mut b);
    ///     ptr2.write(2);
    ///     // Notably, the write to `ptr2` did *not* invalidate `ptr1`:
    ///     ptr1.write(3);
    /// }
    /// ```
    ///
    /// [`as_mut_ptr`]: Self::as_mut_ptr
    /// [`as_ptr`]: Self::as_ptr
    #[unstable(feature = "box_as_ptr", issue = "129090")]
    #[rustc_never_returns_null_ptr]
    #[rustc_as_ptr]
    #[inline]
    pub fn as_mut_ptr(b: &mut Self) -> *mut T {
        // This is a primitive deref, not going through `DerefMut`, and therefore not materializing
        // any references.
        &raw mut **b
    }

    /// Returns a raw pointer to the `Box`'s contents.
    ///
    /// The caller must ensure that the `Box` outlives the pointer this
    /// function returns, or else it will end up dangling.
    ///
    /// The caller must also ensure that the memory the pointer (non-transitively) points to
    /// is never written to (except inside an `UnsafeCell`) using this pointer or any pointer
    /// derived from it. If you need to mutate the contents of the `Box`, use [`as_mut_ptr`].
    ///
    /// This method guarantees that, for the purpose of the aliasing model, it
    /// does not materialize a reference to the underlying memory, and thus the
    /// returned pointer will remain valid when mixed with other calls to
    /// [`as_ptr`] and [`as_mut_ptr`]. Note that calling other methods that
    /// materialize mutable references to the memory, as well as writing to
    /// this memory, may still invalidate this pointer.
    /// See the example below for how this guarantee can be used.
    ///
    /// # Examples
    ///
    /// Due to the aliasing guarantee, the following code is legal:
    ///
    /// ```rust
    /// #![feature(box_as_ptr)]
    ///
    /// unsafe {
    ///     let mut v = Box::new(0);
    ///     let ptr1 = Box::as_ptr(&v);
    ///     let ptr2 = Box::as_mut_ptr(&mut v);
    ///     let _val = ptr2.read();
    ///     // No write to this memory has happened yet, so `ptr1` is still valid.
    ///     let _val = ptr1.read();
    ///     // However, once we do a write...
    ///     ptr2.write(1);
    ///     // ... `ptr1` is no longer valid.
    ///     // This would be UB: let _val = ptr1.read();
    /// }
    /// ```
    ///
    /// [`as_mut_ptr`]: Self::as_mut_ptr
    /// [`as_ptr`]: Self::as_ptr
    #[unstable(feature = "box_as_ptr", issue = "129090")]
    #[rustc_never_returns_null_ptr]
    #[rustc_as_ptr]
    #[inline]
    pub fn as_ptr(b: &Self) -> *const T {
        // This is a primitive deref, not going through `DerefMut`, and therefore not materializing
        // any references.
        &raw const **b
    }

    /// Returns a reference to the underlying allocator.
    ///
    /// Note: this is an associated function, which means that you have
    /// to call it as `Box::allocator(&b)` instead of `b.allocator()`. This
    /// is so that there is no conflict with a method on the inner type.
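    ///
    /// # Examples
    ///
    /// A minimal usage sketch (requires the unstable `allocator_api` feature):
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let b = Box::new_in(5, System);
    /// // Borrow the allocator the box was created with.
    /// let _alloc: &System = Box::allocator(&b);
    /// ```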
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn allocator(b: &Self) -> &A {
        &b.1
    }

    /// Consumes and leaks the `Box`, returning a mutable reference,
    /// `&'a mut T`.
    ///
    /// Note that the type `T` must outlive the chosen lifetime `'a`. If the type
    /// has only static references, or none at all, then this may be chosen to be
    /// `'static`.
    ///
    /// This function is mainly useful for data that lives for the remainder of
    /// the program's life. Dropping the returned reference will cause a memory
    /// leak. If this is not acceptable, the reference should first be wrapped
    /// with the [`Box::from_raw`] function, producing a `Box`. This `Box` can
    /// then be dropped, which will properly destroy `T` and release the
    /// allocated memory.
    ///
    /// Note: this is an associated function, which means that you have
    /// to call it as `Box::leak(b)` instead of `b.leak()`. This
    /// is so that there is no conflict with a method on the inner type.
    ///
    /// # Examples
    ///
    /// Simple usage:
    ///
    /// ```
    /// let x = Box::new(41);
    /// let static_ref: &'static mut usize = Box::leak(x);
    /// *static_ref += 1;
    /// assert_eq!(*static_ref, 42);
    /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
    /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
    /// # drop(unsafe { Box::from_raw(static_ref) });
    /// ```
    ///
    /// Unsized data:
    ///
    /// ```
    /// let x = vec![1, 2, 3].into_boxed_slice();
    /// let static_ref = Box::leak(x);
    /// static_ref[0] = 4;
    /// assert_eq!(*static_ref, [4, 2, 3]);
    /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
    /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
    /// # drop(unsafe { Box::from_raw(static_ref) });
    /// ```
    #[stable(feature = "box_leak", since = "1.26.0")]
    #[inline]
    pub fn leak<'a>(b: Self) -> &'a mut T
    where
        A: 'a,
    {
        let (ptr, alloc) = Box::into_raw_with_allocator(b);
        mem::forget(alloc);
        unsafe { &mut *ptr }
    }

    /// Converts a `Box<T>` into a `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
    /// `*boxed` will be pinned in memory and unable to be moved.
    ///
    /// This conversion does not allocate on the heap and happens in place.
    ///
    /// This is also available via [`From`].
    ///
    /// Constructing and pinning a `Box` with <code>Box::into_pin([Box::new]\(x))</code>
    /// can also be written more concisely using <code>[Box::pin]\(x)</code>.
    /// This `into_pin` method is useful if you already have a `Box<T>`, or you are
    /// constructing a (pinned) `Box` in a different way than with [`Box::new`].
    ///
    /// # Notes
    ///
    /// It's not recommended that crates add an impl like `From<Box<T>> for Pin<T>`,
    /// as it'll introduce an ambiguity when calling `Pin::from`.
    /// A demonstration of such a poor impl is shown below.
    ///
    /// ```compile_fail
    /// # use std::pin::Pin;
    /// struct Foo; // A type defined in this crate.
    /// impl From<Box<()>> for Pin<Foo> {
    ///     fn from(_: Box<()>) -> Pin<Foo> {
    ///         Pin::new(Foo)
    ///     }
    /// }
    ///
    /// let foo = Box::new(());
    /// let bar = Pin::from(foo);
    /// ```
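    ///
    /// # Examples
    ///
    /// A small usage sketch, pinning an existing `Box` in place:
    ///
    /// ```
    /// use std::pin::Pin;
    ///
    /// let boxed: Box<u8> = Box::new(5);
    /// let pinned: Pin<Box<u8>> = Box::into_pin(boxed);
    /// assert_eq!(*pinned, 5);
    /// ```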
    #[stable(feature = "box_into_pin", since = "1.63.0")]
    pub fn into_pin(boxed: Self) -> Pin<Self>
    where
        A: 'static,
    {
        // It's not possible to move or replace the insides of a `Pin<Box<T>>`
        // when `T: !Unpin`, so it's safe to pin it directly without any
        // additional requirements.
        unsafe { Pin::new_unchecked(boxed) }
    }
}

1755#[stable(feature = "rust1", since = "1.0.0")]
1756unsafe impl<#[may_dangle] T: ?Sized, A: Allocator> Drop for Box<T, A> {
1757 #[inline]
1758 fn drop(&mut self) {
1759 // the T in the Box is dropped by the compiler before the destructor is run
1760
1761 let ptr = self.0;
1762
1763 unsafe {
1764 let layout = Layout::for_value_raw(ptr.as_ptr());
1765 if layout.size() != 0 {
1766 self.1.deallocate(From::from(ptr.cast()), layout);
1767 }
1768 }
1769 }
1770}
1771
1772#[cfg(not(no_global_oom_handling))]
1773#[stable(feature = "rust1", since = "1.0.0")]
1774impl<T: Default> Default for Box<T> {
1775 /// Creates a `Box<T>`, with the `Default` value for `T`.
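    ///
    /// # Examples
    ///
    /// ```
    /// let x: Box<i32> = Box::default();
    /// assert_eq!(*x, 0);
    /// ```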
    #[inline]
    fn default() -> Self {
        let mut x: Box<mem::MaybeUninit<T>> = Box::new_uninit();
        unsafe {
            // SAFETY: `x` is valid for writing and has the same layout as `T`.
            // If `T::default()` panics, dropping `x` will just deallocate the Box as `MaybeUninit<T>`
            // does not have a destructor.
            //
            // We use `ptr::write` as `MaybeUninit::write` creates
            // extra stack copies of `T` in debug mode.
            //
            // See https://github.com/rust-lang/rust/issues/136043 for more context.
            ptr::write(&raw mut *x as *mut T, T::default());
            // SAFETY: `x` was just initialized above.
            x.assume_init()
        }
    }
}

#[cfg(not(no_global_oom_handling))]
#[stable(feature = "rust1", since = "1.0.0")]
impl<T> Default for Box<[T]> {
    /// Creates an empty `[T]` inside a `Box`.
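    ///
    /// # Examples
    ///
    /// ```
    /// let x: Box<[i32]> = Box::default();
    /// assert!(x.is_empty());
    /// ```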
    #[inline]
    fn default() -> Self {
        let ptr: Unique<[T]> = Unique::<[T; 0]>::dangling();
        Box(ptr, Global)
    }
}

#[cfg(not(no_global_oom_handling))]
#[stable(feature = "default_box_extra", since = "1.17.0")]
impl Default for Box<str> {
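    /// Creates an empty `str` inside a `Box`.
    ///
    /// # Examples
    ///
    /// ```
    /// let s: Box<str> = Box::default();
    /// assert!(s.is_empty());
    /// ```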
    #[inline]
    fn default() -> Self {
        // SAFETY: This is the same as `Unique::cast<U>` but with an unsized `U = str`.
        let ptr: Unique<str> = unsafe {
            let bytes: Unique<[u8]> = Unique::<[u8; 0]>::dangling();
            Unique::new_unchecked(bytes.as_ptr() as *mut str)
        };
        Box(ptr, Global)
    }
}

#[cfg(not(no_global_oom_handling))]
#[stable(feature = "pin_default_impls", since = "1.91.0")]
impl<T> Default for Pin<Box<T>>
where
    T: ?Sized,
    Box<T>: Default,
{
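    /// Creates a pinned `Box` holding the `Default` value for `T`.
    ///
    /// This is equivalent to <code>[Box::into_pin]\([Box::default]\())</code>.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::pin::Pin;
    ///
    /// let pinned: Pin<Box<u8>> = Pin::default();
    /// assert_eq!(*pinned, 0);
    /// ```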
    #[inline]
    fn default() -> Self {
        Box::into_pin(Box::<T>::default())
    }
}

#[cfg(not(no_global_oom_handling))]
#[stable(feature = "rust1", since = "1.0.0")]
impl<T: Clone, A: Allocator + Clone> Clone for Box<T, A> {
    /// Returns a new box with a `clone()` of this box's contents.
    ///
    /// # Examples
    ///
    /// ```
    /// let x = Box::new(5);
    /// let y = x.clone();
    ///
    /// // The value is the same
    /// assert_eq!(x, y);
    ///
    /// // But they are unique objects
    /// assert_ne!(&*x as *const i32, &*y as *const i32);
    /// ```
    #[inline]
    fn clone(&self) -> Self {
        // Pre-allocate memory to allow writing the cloned value directly.
        let mut boxed = Self::new_uninit_in(self.1.clone());
        unsafe {
            (**self).clone_to_uninit(boxed.as_mut_ptr().cast());
            boxed.assume_init()
        }
    }

    /// Copies `source`'s contents into `self` without creating a new allocation.
    ///
    /// # Examples
    ///
    /// ```
    /// let x = Box::new(5);
    /// let mut y = Box::new(10);
    /// let yp: *const i32 = &*y;
    ///
    /// y.clone_from(&x);
    ///
    /// // The value is the same
    /// assert_eq!(x, y);
    ///
    /// // And no allocation occurred
    /// assert_eq!(yp, &*y);
    /// ```
    #[inline]
    fn clone_from(&mut self, source: &Self) {
        (**self).clone_from(&(**source));
    }
}

#[cfg(not(no_global_oom_handling))]
#[stable(feature = "box_slice_clone", since = "1.3.0")]
impl<T: Clone, A: Allocator + Clone> Clone for Box<[T], A> {
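    /// Returns a new boxed slice containing a clone of each element.
    ///
    /// # Examples
    ///
    /// ```
    /// let x: Box<[i32]> = Box::new([1, 2, 3]);
    /// let y = x.clone();
    /// assert_eq!(x, y);
    /// ```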
    fn clone(&self) -> Self {
        let alloc = Box::allocator(self).clone();
        self.to_vec_in(alloc).into_boxed_slice()
    }

    /// Copies `source`'s contents into `self` without creating a new allocation,
    /// so long as the two are of the same length.
    ///
    /// # Examples
    ///
    /// ```
    /// let x = Box::new([5, 6, 7]);
    /// let mut y = Box::new([8, 9, 10]);
    /// let yp: *const [i32] = &*y;
    ///
    /// y.clone_from(&x);
    ///
    /// // The value is the same
    /// assert_eq!(x, y);
    ///
    /// // And no allocation occurred
    /// assert_eq!(yp, &*y);
    /// ```
    fn clone_from(&mut self, source: &Self) {
        if self.len() == source.len() {
            self.clone_from_slice(source);
        } else {
            *self = source.clone();
        }
    }
}

#[cfg(not(no_global_oom_handling))]
#[stable(feature = "box_slice_clone", since = "1.3.0")]
impl Clone for Box<str> {
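    /// Returns a new allocation containing a copy of the string slice.
    ///
    /// # Examples
    ///
    /// ```
    /// let s: Box<str> = "hello".into();
    /// let t = s.clone();
    /// assert_eq!(s, t);
    /// ```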
    fn clone(&self) -> Self {
        // This makes a copy of the string's bytes.
        let buf: Box<[u8]> = self.as_bytes().into();
        unsafe { from_boxed_utf8_unchecked(buf) }
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + PartialEq, A: Allocator> PartialEq for Box<T, A> {
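    /// Tests equality of the pointed-to values.
    ///
    /// # Examples
    ///
    /// ```
    /// assert_eq!(Box::new(5), Box::new(5));
    /// assert_ne!(Box::new(5), Box::new(6));
    /// ```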
    #[inline]
    fn eq(&self, other: &Self) -> bool {
        PartialEq::eq(&**self, &**other)
    }
    #[inline]
    fn ne(&self, other: &Self) -> bool {
        PartialEq::ne(&**self, &**other)
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + PartialOrd, A: Allocator> PartialOrd for Box<T, A> {
    #[inline]
    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
        PartialOrd::partial_cmp(&**self, &**other)
    }
    #[inline]
    fn lt(&self, other: &Self) -> bool {
        PartialOrd::lt(&**self, &**other)
    }
    #[inline]
    fn le(&self, other: &Self) -> bool {
        PartialOrd::le(&**self, &**other)
    }
    #[inline]
    fn ge(&self, other: &Self) -> bool {
        PartialOrd::ge(&**self, &**other)
    }
    #[inline]
    fn gt(&self, other: &Self) -> bool {
        PartialOrd::gt(&**self, &**other)
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + Ord, A: Allocator> Ord for Box<T, A> {
    #[inline]
    fn cmp(&self, other: &Self) -> Ordering {
        Ord::cmp(&**self, &**other)
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + Eq, A: Allocator> Eq for Box<T, A> {}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + Hash, A: Allocator> Hash for Box<T, A> {
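    /// Feeds the pointed-to value into the hasher, so a `Box<T>` hashes
    /// identically to the `T` it contains.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::collections::hash_map::DefaultHasher;
    /// use std::hash::{Hash, Hasher};
    ///
    /// let mut a = DefaultHasher::new();
    /// let mut b = DefaultHasher::new();
    /// 5i32.hash(&mut a);
    /// Box::new(5i32).hash(&mut b);
    /// assert_eq!(a.finish(), b.finish());
    /// ```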
    fn hash<H: Hasher>(&self, state: &mut H) {
        (**self).hash(state);
    }
}

#[stable(feature = "indirect_hasher_impl", since = "1.22.0")]
impl<T: ?Sized + Hasher, A: Allocator> Hasher for Box<T, A> {
    fn finish(&self) -> u64 {
        (**self).finish()
    }
    fn write(&mut self, bytes: &[u8]) {
        (**self).write(bytes)
    }
    fn write_u8(&mut self, i: u8) {
        (**self).write_u8(i)
    }
    fn write_u16(&mut self, i: u16) {
        (**self).write_u16(i)
    }
    fn write_u32(&mut self, i: u32) {
        (**self).write_u32(i)
    }
    fn write_u64(&mut self, i: u64) {
        (**self).write_u64(i)
    }
    fn write_u128(&mut self, i: u128) {
        (**self).write_u128(i)
    }
    fn write_usize(&mut self, i: usize) {
        (**self).write_usize(i)
    }
    fn write_i8(&mut self, i: i8) {
        (**self).write_i8(i)
    }
    fn write_i16(&mut self, i: i16) {
        (**self).write_i16(i)
    }
    fn write_i32(&mut self, i: i32) {
        (**self).write_i32(i)
    }
    fn write_i64(&mut self, i: i64) {
        (**self).write_i64(i)
    }
    fn write_i128(&mut self, i: i128) {
        (**self).write_i128(i)
    }
    fn write_isize(&mut self, i: isize) {
        (**self).write_isize(i)
    }
    fn write_length_prefix(&mut self, len: usize) {
        (**self).write_length_prefix(len)
    }
    fn write_str(&mut self, s: &str) {
        (**self).write_str(s)
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: fmt::Display + ?Sized, A: Allocator> fmt::Display for Box<T, A> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        fmt::Display::fmt(&**self, f)
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: fmt::Debug + ?Sized, A: Allocator> fmt::Debug for Box<T, A> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        fmt::Debug::fmt(&**self, f)
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized, A: Allocator> fmt::Pointer for Box<T, A> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        // It's not possible to extract the inner `Unique` directly from the
        // `Box`; instead we create a `*const` that aliases the `Unique`.
        let ptr: *const T = &**self;
        fmt::Pointer::fmt(&ptr, f)
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized, A: Allocator> Deref for Box<T, A> {
    type Target = T;

    fn deref(&self) -> &T {
        &**self
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized, A: Allocator> DerefMut for Box<T, A> {
    fn deref_mut(&mut self) -> &mut T {
        &mut **self
    }
}

#[unstable(feature = "deref_pure_trait", issue = "87121")]
unsafe impl<T: ?Sized, A: Allocator> DerefPure for Box<T, A> {}

#[unstable(feature = "legacy_receiver_trait", issue = "none")]
impl<T: ?Sized, A: Allocator> LegacyReceiver for Box<T, A> {}

#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
impl<Args: Tuple, F: FnOnce<Args> + ?Sized, A: Allocator> FnOnce<Args> for Box<F, A> {
    type Output = <F as FnOnce<Args>>::Output;

    extern "rust-call" fn call_once(self, args: Args) -> Self::Output {
        <F as FnOnce<Args>>::call_once(*self, args)
    }
}

#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
impl<Args: Tuple, F: FnMut<Args> + ?Sized, A: Allocator> FnMut<Args> for Box<F, A> {
    extern "rust-call" fn call_mut(&mut self, args: Args) -> Self::Output {
        <F as FnMut<Args>>::call_mut(self, args)
    }
}

#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
impl<Args: Tuple, F: Fn<Args> + ?Sized, A: Allocator> Fn<Args> for Box<F, A> {
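    /// Calls the contained closure by reference, so a boxed closure or
    /// function trait object can be called directly:
    ///
    /// ```
    /// let f: Box<dyn Fn(i32) -> i32> = Box::new(|x| x + 1);
    /// assert_eq!(f(1), 2);
    /// ```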
    extern "rust-call" fn call(&self, args: Args) -> Self::Output {
        <F as Fn<Args>>::call(self, args)
    }
}

#[stable(feature = "async_closure", since = "1.85.0")]
impl<Args: Tuple, F: AsyncFnOnce<Args> + ?Sized, A: Allocator> AsyncFnOnce<Args> for Box<F, A> {
    type Output = F::Output;
    type CallOnceFuture = F::CallOnceFuture;

    extern "rust-call" fn async_call_once(self, args: Args) -> Self::CallOnceFuture {
        F::async_call_once(*self, args)
    }
}

#[stable(feature = "async_closure", since = "1.85.0")]
impl<Args: Tuple, F: AsyncFnMut<Args> + ?Sized, A: Allocator> AsyncFnMut<Args> for Box<F, A> {
    type CallRefFuture<'a>
        = F::CallRefFuture<'a>
    where
        Self: 'a;

    extern "rust-call" fn async_call_mut(&mut self, args: Args) -> Self::CallRefFuture<'_> {
        F::async_call_mut(self, args)
    }
}

#[stable(feature = "async_closure", since = "1.85.0")]
impl<Args: Tuple, F: AsyncFn<Args> + ?Sized, A: Allocator> AsyncFn<Args> for Box<F, A> {
    extern "rust-call" fn async_call(&self, args: Args) -> Self::CallRefFuture<'_> {
        F::async_call(self, args)
    }
}

#[unstable(feature = "coerce_unsized", issue = "18598")]
impl<T: ?Sized + Unsize<U>, U: ?Sized, A: Allocator> CoerceUnsized<Box<U, A>> for Box<T, A> {}

#[unstable(feature = "pin_coerce_unsized_trait", issue = "123430")]
unsafe impl<T: ?Sized, A: Allocator> PinCoerceUnsized for Box<T, A> {}

// It is quite crucial that we only allow the `Global` allocator here.
// Handling arbitrary custom allocators (which can affect the `Box` layout heavily!)
// would need a lot of codegen and interpreter adjustments.
#[unstable(feature = "dispatch_from_dyn", issue = "none")]
impl<T: ?Sized + Unsize<U>, U: ?Sized> DispatchFromDyn<Box<U>> for Box<T, Global> {}

#[stable(feature = "box_borrow", since = "1.1.0")]
impl<T: ?Sized, A: Allocator> Borrow<T> for Box<T, A> {
    fn borrow(&self) -> &T {
        &**self
    }
}

#[stable(feature = "box_borrow", since = "1.1.0")]
impl<T: ?Sized, A: Allocator> BorrowMut<T> for Box<T, A> {
    fn borrow_mut(&mut self) -> &mut T {
        &mut **self
    }
}

#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
impl<T: ?Sized, A: Allocator> AsRef<T> for Box<T, A> {
    fn as_ref(&self) -> &T {
        &**self
    }
}

#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
impl<T: ?Sized, A: Allocator> AsMut<T> for Box<T, A> {
    fn as_mut(&mut self) -> &mut T {
        &mut **self
    }
}

/* Nota bene
 *
 * We could have chosen not to add this impl, and instead have written a
 * function of Pin<Box<T>> to Pin<T>. Such a function would not be sound,
 * because Box<T> implements Unpin even when T does not, as a result of
 * this impl.
 *
 * We chose this API instead of the alternative for a few reasons:
 *     - Logically, it is helpful to understand pinning in regard to the
 *       memory region being pointed to. For this reason none of the
 *       standard library pointer types support projecting through a pin
 *       (Box<T> is the only pointer type in std for which this would be
 *       safe.)
 *     - It is in practice very useful to have Box<T> be unconditionally
 *       Unpin because of trait objects, for which the structural auto
 *       trait functionality does not apply (e.g., Box<dyn Foo> would
 *       otherwise not be Unpin).
 *
 * Another type with the same semantics as Box but only a conditional
 * implementation of `Unpin` (where `T: Unpin`) would be valid/safe, and
 * could have a method to project a Pin<T> from it.
 */
#[stable(feature = "pin", since = "1.33.0")]
impl<T: ?Sized, A: Allocator> Unpin for Box<T, A> {}

#[unstable(feature = "coroutine_trait", issue = "43122")]
impl<G: ?Sized + Coroutine<R> + Unpin, R, A: Allocator> Coroutine<R> for Box<G, A> {
    type Yield = G::Yield;
    type Return = G::Return;

    fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
        G::resume(Pin::new(&mut *self), arg)
    }
}

#[unstable(feature = "coroutine_trait", issue = "43122")]
impl<G: ?Sized + Coroutine<R>, R, A: Allocator> Coroutine<R> for Pin<Box<G, A>>
where
    A: 'static,
{
    type Yield = G::Yield;
    type Return = G::Return;

    fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
        G::resume((*self).as_mut(), arg)
    }
}

#[stable(feature = "futures_api", since = "1.36.0")]
impl<F: ?Sized + Future + Unpin, A: Allocator> Future for Box<F, A> {
    type Output = F::Output;

    fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output> {
        F::poll(Pin::new(&mut *self), cx)
    }
}

#[stable(feature = "box_error", since = "1.8.0")]
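/// A boxed error delegates [`Error::source`] and the other trait methods to
/// the error it contains.
///
/// # Examples
///
/// The example below uses a hypothetical placeholder error type, `MyError`,
/// to demonstrate the delegation.
///
/// ```
/// use std::error::Error;
/// use std::fmt;
///
/// #[derive(Debug)]
/// struct MyError;
///
/// impl fmt::Display for MyError {
///     fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
///         write!(f, "my error")
///     }
/// }
///
/// impl Error for MyError {}
///
/// let boxed: Box<MyError> = Box::new(MyError);
/// // `source` is forwarded to the inner error, which has none.
/// assert!(boxed.source().is_none());
/// ```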
impl<E: Error> Error for Box<E> {
    #[allow(deprecated)]
    fn cause(&self) -> Option<&dyn Error> {
        Error::cause(&**self)
    }

    fn source(&self) -> Option<&(dyn Error + 'static)> {
        Error::source(&**self)
    }

    fn provide<'b>(&'b self, request: &mut error::Request<'b>) {
        Error::provide(&**self, request);
    }
}