Tags: zookage/spark

Tags

v4.0.1-zookage-0.3

[SPARK-53539][INFRA] Add `libwebp-dev` to recover `spark-rm/Dockerfile` building

### What changes were proposed in this pull request?

This PR aims to add `libwebp-dev` to recover `spark-rm/Dockerfile` building.
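
A minimal sketch of the kind of change described, assuming `libwebp-dev` is appended to the existing `apt-get install` list of native build dependencies in `spark-rm/Dockerfile` (the surrounding package list below is illustrative, not the exact one in the Dockerfile):

```
# Illustrative sketch only: install the native headers that R packages such as
# 'ragg' probe at configure time; the fix described here adds libwebp-dev so
# that the 'libwebpmux' pkg-config module becomes available.
RUN apt-get update && \
    apt-get install -y libfreetype6-dev libfontconfig1-dev libharfbuzz-dev \
      libfribidi-dev libpng-dev libtiff5-dev libjpeg-dev libwebp-dev && \
    rm -rf /var/lib/apt/lists/*
```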

### Why are the changes needed?

`Apache Spark` release docker image compilation has been broken for the last 7 days due to a SparkR package compilation failure.
- https://github.com/apache/spark/actions/workflows/release.yml
    - https://github.com/apache/spark/actions/runs/17425825244

```
#11 559.4 No package 'libwebpmux' found
...
#11 559.4 -------------------------- [ERROR MESSAGE] ---------------------------
#11 559.4 <stdin>:1:10: fatal error: ft2build.h: No such file or directory
#11 559.4 compilation terminated.
#11 559.4 --------------------------------------------------------------------
#11 559.4 ERROR: configuration failed for package 'ragg'
```
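
The messages are characteristic of `pkg-config` probes failing while the R `ragg` package is configured. A quick way to confirm, inside the build image, that the required modules are discoverable once the development packages are installed (the package-to-module mapping is assumed here):

```
$ pkg-config --modversion libwebpmux   # provided by libwebp-dev
$ pkg-config --modversion freetype2    # ft2build.h comes from libfreetype6-dev
```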

### Does this PR introduce _any_ user-facing change?

No, this is a fix for the Apache Spark release tool.

### How was this patch tested?

Manual build:

```
$ cd dev/create-release/spark-rm
$ docker build .
```

**BEFORE**

```
...
Dockerfile:83
--------------------
  82 |     # See more in SPARK-39959, roxygen2 < 7.2.1
  83 | >>> RUN Rscript -e "install.packages(c('devtools', 'knitr', 'markdown',  \
  84 | >>>     'rmarkdown', 'testthat', 'devtools', 'e1071', 'survival', 'arrow',  \
  85 | >>>     'ggplot2', 'mvtnorm', 'statmod', 'xml2'), repos='https://cloud.r-project.org/')" && \
  86 | >>>     Rscript -e "devtools::install_version('roxygen2', version='7.2.0', repos='https://cloud.r-project.org')" && \
  87 | >>>     Rscript -e "devtools::install_version('lintr', version='2.0.1', repos='https://cloud.r-project.org')" && \
  88 | >>>     Rscript -e "devtools::install_version('pkgdown', version='2.0.1', repos='https://cloud.r-project.org')" && \
  89 | >>>     Rscript -e "devtools::install_version('preferably', version='0.4', repos='https://cloud.r-project.org')"
  90 |
--------------------
ERROR: failed to build: failed to solve:
```

**AFTER**
```
...
 => [ 6/22] RUN add-apt-repository 'deb https://cloud.r-project.org/bin/linux/ubuntu jammy-cran40/'                                                             3.8s
 => [ 7/22] RUN Rscript -e "install.packages(c('devtools', 'knitr', 'markdown',      'rmarkdown', 'testthat', 'devtools', 'e1071', 'survival', 'arrow',       892.2s
 => [ 8/22] RUN add-apt-repository ppa:pypy/ppa                                                                                                                15.3s
...
```

After merging this PR, we can validate it via the daily release dry-run CI.

- https://github.com/apache/spark/actions/workflows/release.yml

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes apache#52290 from dongjoon-hyun/SPARK-53539.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>

v4.1.0-preview1

Preparing Spark release v4.1.0-preview1-rc1

v4.1.0-preview1-rc1

Preparing Spark release v4.1.0-preview1-rc1

v3.5.6

Preparing Spark release v3.5.6-rc1

v4.0.0

Preparing Spark release v4.0.0-rc7

v4.0.0-rc7

Preparing Spark release v4.0.0-rc7

v4.0.0-rc6

Preparing Spark release v4.0.0-rc6

v4.0.0-rc5

Preparing Spark release v4.0.0-rc5

v4.0.0-rc4

Preparing Spark release v4.0.0-rc4

v4.0.0-rc3

Preparing Spark release v4.0.0-rc3