Merged
Changes from all commits (33 commits, all by hillarymarler):
- 0be401f: Create urlcheck.yml (Aug 28, 2024)
- a400130: Update urlcheck.yml (Aug 28, 2024)
- 3505669: Update urlcheck.yml (Aug 28, 2024)
- 472e5fa: Update urlcheck.yml (Aug 28, 2024)
- 828594f: Update urlcheck.yml (Aug 28, 2024)
- 8ffef5c: Update urlcheck.yml (Aug 28, 2024)
- 84dce53: Update urlcheck.yml (Aug 28, 2024)
- dbbc471: Update urlcheck.yml (Aug 28, 2024)
- 28cd483: Update urlcheck.yml (Aug 28, 2024)
- ee91886: Update urlcheck.yml (Aug 28, 2024)
- c48bb34: Update urlcheck.yml (Aug 28, 2024)
- 88c6b5e: Update urlcheck.yml (Aug 28, 2024)
- bf012ac: Update urlcheck.yml (Aug 28, 2024)
- 7fcdec4: Check (Aug 28, 2024)
- 43af412: Update URL check (Aug 28, 2024)
- 99c8baf: URL check update (Aug 28, 2024)
- 89290ad: Update url-check.yml (Aug 28, 2024)
- 8999628: Update url-check.yml (Aug 28, 2024)
- 2456f91: Update url-check.yml (Aug 28, 2024)
- 2ffce91: Update url-check.yml (Aug 28, 2024)
- bf32497: Update url-check.yml (Aug 28, 2024)
- faf6c5b: Update url-check.yml (Aug 28, 2024)
- 9572221: Draft code (Aug 29, 2024)
- a2e9d0e: Merge branch 'develop' into hrm_test (Oct 28, 2024)
- b91356a: Update url-check.yml (Oct 28, 2024)
- a2f0e53: URL check updates (Oct 29, 2024)
- d37c080: Update test-URLChecker.R (Oct 29, 2024)
- 83cc5e7: Update test-URLChecker.R (Oct 29, 2024)
- 13e156f: Update broken url and add fix url check bug (Oct 29, 2024)
- c43faa9: Update test-URLChecker.R (Oct 29, 2024)
- 2381593: Update DESCRIPTION (Oct 29, 2024)
- 863c1c8: Update test-URLChecker.R (Oct 29, 2024)
- 29cd2ba: Review updates (Oct 31, 2024)
64 changes: 64 additions & 0 deletions R/Maintenance.R
@@ -232,3 +232,67 @@ FindSynonyms <- function() {
# # run to update spelling word list
# spelling::get_wordlist()
# spelling::update_wordlist()

# # Find Broken Links if test-URLChecker.R fails
# # Run the code below:
# # extract urls function
# extract_urls <- function(text) {
# stringr::str_extract_all(text, "http[s]?://[^\\s\\)\\]]+") %>% unlist()
# }
#
# # clean urls function
# clean_url <- function(url) {
# stringr::str_remove_all(url, "[\\\\.,\\\")]+$|[{}].*") %>%
# stringr::str_remove_all("[<>]")
# }
#
# # create lists of files to check
# other_files <- c(
# system.file("README.md", package = "EPATADA"),
# system.file("DESCRIPTION", package = "EPATADA"),
# system.file("NAMESPACE", package = "EPATADA")
# )
#
# vignettes <- list.files(system.file("vignettes", package = "EPATADA"), pattern = ".Rmd", full.names = TRUE)
#
# articles <- list.files(system.file("vignettes/articles", package = "EPATADA"), pattern = ".Rmd", full.names = TRUE)
#
# r_files <- list.files(system.file("R", package = "EPATADA"), pattern = ".R", full.names = TRUE)
#
# # combine file lists
# files <- append(other_files, vignettes) %>%
# append(articles) %>%
# append(r_files)
#
# # create list of urls
# urls <- purrr::map(files, ~ readLines(.x)) %>%
# unlist() %>%
# extract_urls() %>%
# clean_url() %>%
# unique() %>%
# # exclude URLs that cannot be checked reliably: the itecmembers.org link does not respond to any request method tried, and CRAN responds inconsistently, likely because it redirects to mirrors (HRM 10/28/2024)
# setdiff(c(
# "https://www.itecmembers.org/attains/"
# ))
#
# # retrieve http response headers from url list
# headers <- urls %>%
# purrr::map(~ tryCatch(curlGetHeaders(.x), error = function(e) NA))
#
# # extract response code from first line of header response
# response_code <- sapply(headers, "[[", 1)
#
# # create data frame of urls and response codes
# df <- data.frame(urls, response_code)
#
# # filter for any response codes that are not successful or redirect responses
# df_false <- df %>%
# dplyr::filter(!grepl("200", response_code) &
# !grepl("301", response_code) &
# !grepl("302", response_code))
#
# # Review the output of df_false.
# # More information about http response codes can be found here:
# # [Mozilla Developer HTTP response status codes](https://developer.mozilla.org/en-US/docs/Web/HTTP/Status)
# # Replace the broken links with functional ones or remove if no acceptable substitute is available.
# # Rerun code above to verify that df_false contains zero rows.
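
If a broken link is replaced, a quick spot-check of the new URL with the same curlGetHeaders() pattern used above can confirm it responds before re-running the full block. This is a minimal sketch only; the URL below is a placeholder, not a real project link.

```{r}
# Placeholder URL for illustration only; substitute the actual replacement link.
new_url <- "https://example.gov/replacement-page"

# Same pattern as the maintenance block: fetch headers, fall back to NA on error.
header <- tryCatch(curlGetHeaders(new_url), error = function(e) NA)

# The first element is the status line, e.g. "HTTP/1.1 200 OK";
# a 200, 301, or 302 here means the link would pass the check.
header[[1]]
```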
2 changes: 1 addition & 1 deletion R/Utilities.R
@@ -1405,7 +1405,7 @@ TADA_addPoints <- function(map, layerfilepath, layergroup, layername, bbox = NUL
if (is.na(lbbox[1])) {
return(map)
}
shapes <- c(2) # open triangle; for other options see http://www.statmethods.net/advgraphs/parameters.html
shapes <- c(2) # open triangle; for other options see https://www.geeksforgeeks.org/r-plot-pch-symbols-different-point-shapes-available-in-r/
iconFiles <- pchIcons(shapes, width = 20, height = 20, col = c("#CC7722"), lwd = 2)
map <- leaflet::addMarkers(
map,
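For quick reference alongside the new link, here is a small base-R sketch of a few common pch codes (the integer shape values that the pchIcons() helper above converts to marker icons); the code-to-shape mapping is standard base graphics behavior.

```{r}
# A handful of standard base-R pch codes:
#  0 = open square,   1 = open circle,    2 = open triangle (used above),
# 15 = filled square, 16 = filled circle, 17 = filled triangle
plot(1:6, rep(1, 6),
  pch = c(0, 1, 2, 15, 16, 17),
  cex = 2, axes = FALSE, xlab = "", ylab = ""
)
```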
6 changes: 3 additions & 3 deletions README.md
@@ -4,7 +4,7 @@

[![](https://github.com/USEPA/EPATADA/actions/workflows/R-CMD-check.yaml/badge.svg)](https://github.com/USEPA/EPATADA/actions/workflows/R-CMD-check.yaml)

Tools for Automated Data Analysis, or TADA, is being developed to help States, Tribes (i.e., Tribal Nations, Pueblos, Bands, Rancherias, Communities, Colonies, Towns, Indians, Villages), federal partners, and any other [Water Quality Portal (WQP)](https://www.waterqualitydata.us/) users (e.g. researchers) efficiently compile and evaluate WQP data collected from water quality monitoring sites. TADA is both a stand-alone R package, and a building block to support development of the [TADA R Shiny application](https://github.com/USEPA/EPATADAShiny). We encourage you to read this package's [LICENSE](https://usepa.github.io/EPATADA/LICENSE.html) and [README](https://usepa.github.io/EPATADA/index.html) files (you are here).
Tools for Automated Data Analysis, or TADA, is being developed to help States, Tribes (i.e., Tribal Nations, Pueblos, Bands, Rancherias, Communities, Colonies, Towns, Indians, Villages), federal partners, and any other [Water Quality Portal (WQP)](https://www.waterqualitydata.us/) users (e.g. researchers) efficiently compile and evaluate WQP data collected from water quality monitoring sites. TADA is both a stand-alone R package, and a building block to support development of the [TADA R Shiny application](https://github.com/USEPA/TADAShiny). We encourage you to read this package's [LICENSE](https://usepa.github.io/EPATADA/LICENSE.html) and [README](https://usepa.github.io/EPATADA/index.html) files (you are here).

- How to use TADA:

@@ -34,7 +34,7 @@ install.packages("remotes")
remotes::install_github("USEPA/EPATADA", ref = "develop", dependencies = TRUE, force = TRUE)
```

The TADA R Shiny application can be run [on the web](https://owshiny-dev.app.cloud.gov/tada-dev/) (R and R Studio install not required), or within R Studio. Run the following code within R Studio to install or update and run the most recent version of the [TADA R Shiny](https://github.com/USEPA/EPATADAShiny) application:
The TADA R Shiny application can be run [on the web](https://rconnect-public.epa.gov/TADAShiny/) (R and R Studio install not required), or within R Studio. Run the following code within R Studio to install or update and run the most recent version of the [TADA R Shiny](https://github.com/USEPA/TADAShiny) application:

```{r}
if(!"remotes"%in%installed.packages()){
@@ -73,7 +73,7 @@ Effective August 8, 2016, the [OMB Mandate: M-16-21; Federal Source Code Policy:

The EPA specific implementation of OMB Mandate M-16-21 is addressed in the [System Life Cycle Management Procedure](https://www.epa.gov/irmpoli8/policy-procedures-and-guidance-system-life-cycle-management-slcm). EPA has chosen to use GitHub as its version control system as well as its inventory of open-source code projects. EPA uses GitHub to inventory its custom-developed, open-source code and generate the necessary metadata file that is then posted to code.gov for broad reuse in compliance with OMB Mandate M-16-21.

If you have any questions or want to read more, check out the [EPA Open Source Project Repo](https://github.com/USEPA/open-source-projects) and [EPA's Interim Open Source Code Guidance](https://www.epa.gov/developers/open-source-software-and-epa-code-repository-requirements).
If you have any questions or want to read more, check out the [EPA Open Source Project Repo](https://www.epa.gov/developers/open-source-software-and-code-repositories) and [EPA's Interim Open Source Software Policy](https://www.epa.gov/sites/default/files/2018-02/documents/interim_oss_policy_final.pdf).

## License

68 changes: 68 additions & 0 deletions tests/testthat/test-URLChecker.R
@@ -0,0 +1,68 @@
test_that("URLs are not broken", {
# extract urls function
extract_urls <- function(text) {
stringr::str_extract_all(text, "http[s]?://[^\\s\\)\\]]+") %>% unlist()
}

# clean urls function
clean_url <- function(url) {
stringr::str_remove_all(url, "[\\\\.,\\\")]+$|[{}].*") %>%
stringr::str_remove_all("[<>]")
}

# create lists of files to check
other_files <- c(
system.file("README.md", package = "EPATADA"),
system.file("DESCRIPTION", package = "EPATADA"),
system.file("NAMESPACE", package = "EPATADA")
)

vignettes <- list.files(system.file("vignettes", package = "EPATADA"), pattern = ".Rmd", full.names = TRUE)

articles <- list.files(system.file("vignettes/articles", package = "EPATADA"), pattern = ".Rmd", full.names = TRUE)

r_files <- list.files(system.file("R", package = "EPATADA"), pattern = ".R", full.names = TRUE)

# combine file lists
files <- append(other_files, vignettes) %>%
append(articles) %>%
append(r_files)

# create list of urls
urls <- purrr::map(files, ~ readLines(.x)) %>%
unlist() %>%
extract_urls() %>%
clean_url() %>%
unique() %>%
# exclude URLs that cannot be checked reliably: the itecmembers.org link does not respond to any request method tried, and CRAN responds inconsistently, likely because it redirects to mirrors (HRM 10/28/2024)
setdiff(c(
"https://www.itecmembers.org/attains/"
))

# retrieve http response headers from url list
headers <- urls %>%
purrr::map(~ tryCatch(curlGetHeaders(.x), error = function(e) NA))

# extract response code from first line of header response
response_code <- sapply(headers, "[[", 1)

# create data frame of urls and response codes
df <- data.frame(urls, response_code)

# filter for any response codes that are not successful or redirect responses
df_false <- df %>%
dplyr::filter(!grepl("200", response_code) &
!grepl("301", response_code) &
!grepl("302", response_code))

# count number of failed responses
n <- nrow(df_false)

# print url and response code for failures
print(df_false)

# verify that there are zero urls with failing response codes
testthat::expect_equal(n, 0)
})
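
A note on running the new test locally: with the package checked out and the working directory at the repository root, one way to run just this check rather than the full suite might be the sketch below; exact testthat/devtools filtering options depend on your setup.

```{r}
# Run only the URL checker test file (assumes a local checkout of EPATADA).
testthat::test_file("tests/testthat/test-URLChecker.R")

# Or filter the full test suite down to this file:
devtools::test(filter = "URLChecker")
```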


8 changes: 4 additions & 4 deletions vignettes/CONTRIBUTING.Rmd
@@ -82,9 +82,9 @@ There are multiple ways to interact with GitHub using Git.
- Option 2: Interact with GitHub using the command line or a web
browser

- [Setting Up Git](https://docs.github.com/articles/set-up-git/)
- [Setting Up Git](https://docs.github.com/en/get-started/getting-started-with-git/set-up-git)

- [Git Basics](https://git-scm.com/book/ch1-3.html)
- [Git Basics](https://git-scm.com/book/en/v2/Git-Basics-Getting-a-Git-Repository)

- [Comprehensive Guide: Happy Git and GitHub for the
useR](https://happygitwithr.com/ "Great and comprehensive guide for Git with an R flair")
@@ -107,7 +107,7 @@ might be needed for Mac or Linux OS:

- Download both:

- [GitHub Desktop](https://desktop.github.com/)
- [GitHub Desktop](https://github.com/apps/desktop)

- [Git](https://git-scm.com/downloads)

@@ -149,7 +149,7 @@ install.packages(c("devtools", "rmarkdown"))
might be good first pickings for your first contribution to this
open-source project.
- Pull requests can be directly
[linked](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue)
[linked](https://docs.github.com/en/issues/tracking-your-work-with-issues/using-issues/linking-a-pull-request-to-an-issue)
to a specific issue. If linked, the Repository Administrators can
more easily review the pull request and issue at the same time once
a contributor submits the pull request. The issue can then be closed
4 changes: 2 additions & 2 deletions vignettes/TADAModule1.Rmd
@@ -165,7 +165,7 @@ Additional resources:
the console: ?TADA_DataRetrieval

- [Introduction to the dataRetrieval
package](https://CRAN.R-project.org/package=dataRetrieval)
package](https://cran.r-project.org/web/packages/dataRetrieval/index.html)

- [General Data Import from Water Quality
Portal](https://rdrr.io/cran/dataRetrieval/man/readWQPdata.html)
@@ -1308,7 +1308,7 @@ interface. The shiny application queries the WQP, contains maps and data
visualizations, flags suspect data results, handles censored data, and
more. You can launch it using the code below.

DRAFT [Module 1](https://owshiny-dev.app.cloud.gov/tada-dev/) is also
DRAFT [Module 1](https://rconnect-public.epa.gov/TADAShiny/) is also
currently hosted on the web with minimal server memory/storage
allocated.

2 changes: 1 addition & 1 deletion vignettes/TADAModule1_AdvancedTraining.Rmd
@@ -781,6 +781,6 @@ remotes::install_github("USEPA/TADAShiny", ref = "develop", dependencies = TRUE)
TADAShiny::run_app()
```

DRAFT [Module 1](https://owshiny-dev.app.cloud.gov/tada-dev/) is also
DRAFT [Module 1](https://rconnect-public.epa.gov/TADAShiny/) is also
currently hosted on the web with minimal server memory/storage
allocated.
2 changes: 1 addition & 1 deletion vignettes/TADAModule1_BeginnerTraining.Rmd
@@ -1147,7 +1147,7 @@ interface. The shiny application queries the WQP, contains maps and data
visualizations, flags suspect data results, handles censored data, and
more. You can launch it using the code below.

DRAFT [Module 1](https://owshiny-dev.app.cloud.gov/tada-dev/) is also
DRAFT [Module 1](https://rconnect-public.epa.gov/TADAShiny/) is also
currently hosted on the web with minimal server memory/storage
allocated.

2 changes: 1 addition & 1 deletion vignettes/WQXValidationService.Rmd
@@ -100,7 +100,7 @@ interquartile range) and a [probability density
function](https://en.wikipedia.org/wiki/Probability_density_function
"Probability density function") (pdf) of a Normal N(0,σ2) Population.
Attribution: Jhguch at en.wikipedia, [CC BY-SA
2.5](https://creativecommons.org/licenses/by-sa/2.5), via Wikimedia
2.5](https://creativecommons.org/licenses/by-sa/2.5/), via Wikimedia
Commons.](images/IQR.png)](https://commons.wikimedia.org/wiki/File:Boxplot_vs_PDF.svg)
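
As a side note on the caption's terms, the interquartile range of a Normal N(0, σ2) population follows directly from the normal quantile function; a small sketch (not part of the vignette code) is shown below.

```{r}
# For X ~ N(0, sigma^2), the IQR is Q3 - Q1 = (qnorm(0.75) - qnorm(0.25)) * sigma,
# which is roughly 1.349 * sigma.
sigma <- 1
c(
  Q1  = qnorm(0.25, mean = 0, sd = sigma),
  Q3  = qnorm(0.75, mean = 0, sd = sigma),
  IQR = qnorm(0.75, mean = 0, sd = sigma) - qnorm(0.25, mean = 0, sd = sigma)
)
```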

Additional tests only available in WQX Web and Node Submissions (for